Source: https://en.wikipedia.org/wiki/Thirty-seventh_government_of_Israel#cite_note-85
Thirty-seventh government of Israel

The thirty-seventh government of Israel is the current cabinet of Israel, formed on 29 December 2022, following the Knesset election the previous month. The coalition government currently consists of five parties — Likud, Shas, Otzma Yehudit, Religious Zionist Party and New Hope — and is led by Benjamin Netanyahu, who took office as the prime minister of Israel for the sixth time. The government is widely regarded as the most right-wing government in the country's history, and includes far-right politicians. Several of the government's policy proposals have led to controversies, both within Israel and abroad, with the government's attempts at reforming the judiciary leading to a wave of demonstrations across the country. Following the outbreak of the Gaza war, opposition leader Yair Lapid initiated discussions with Netanyahu on the formation of an emergency government. On 11 October 2023, National Unity MKs Benny Gantz, Gadi Eisenkot, Gideon Sa'ar, Hili Tropper, and Yifat Shasha-Biton joined the Security Cabinet of Israel to form an emergency national unity government. Their accession to the Security Cabinet and to the government (as ministers without portfolio) was approved by the Knesset the following day. Gantz, Netanyahu, and Defense Minister Yoav Gallant became part of the newly formed Israeli war cabinet, with Eisenkot and Ron Dermer serving as observers. National Unity left the government in June 2024. New Hope rejoined the government in September 2024. Otzma Yehudit announced on 19 January 2025 that it had withdrawn from the government, a withdrawal that took effect on 21 January, following the cabinet's acceptance of the three-phase Gaza war ceasefire proposal, though it rejoined two months later. United Torah Judaism left the government in July 2025 over dissatisfaction with the government's draft conscription law. Shas left the government several days later, though it remains part of the coalition.
Background

The right-wing bloc of parties led by Benjamin Netanyahu, known in Israel as the national camp, won 64 of the 120 seats in the elections for the Knesset, while the coalition led by the incumbent prime minister Yair Lapid won 51 seats. The new majority has been variously described as the most right-wing government in Israeli history, as well as Israel's most religious government. Shortly after the elections, Lapid conceded to Netanyahu and congratulated him, wishing him luck "for the sake of the Israeli people". On 15 November, the swearing-in ceremony for the newly elected members of the 25th Knesset was held during the opening session. The vote to appoint a new Speaker of the Knesset, which is usually conducted at the opening session, as well as the swearing in of cabinet members, was postponed, since ongoing coalition negotiations had not yet produced agreement on these positions.

Government formation

On 3 November 2022, Netanyahu told his aide Yariv Levin to begin informal coalition talks with allied parties, after 97% of the vote had been counted. The leader of the Shas party, Aryeh Deri, met with Yitzhak Goldknopf, the leader of United Torah Judaism and its Agudat Yisrael faction, on 4 November. The two parties agreed to cooperate as members of the next government. The Degel HaTorah faction of United Torah Judaism stated on 5 November that it would maintain its ideological stance of not seeking any ministerial posts, as per the instruction of its spiritual leader Rabbi Gershon Edelstein, but would seek other senior posts such as Knesset committee chairmanships and deputy ministerships. Netanyahu himself started holding talks on 6 November. He first met with Moshe Gafni, the leader of Degel HaTorah, and then with Goldknopf. Meanwhile, the Religious Zionist Party leader Bezalel Smotrich and the leader of its Otzma Yehudit faction, Itamar Ben-Gvir, pledged that neither would enter the coalition without the other faction.
Gafni later met with Smotrich for coalition talks. Smotrich then met with Netanyahu. On 7 November, Netanyahu met with Ben-Gvir, who demanded the Ministry of Public Security with expanded powers for himself and the Ministry of Education or of Transport and Road Safety for Yitzhak Wasserlauf. A major demand among all of Netanyahu's allies was that the Knesset be allowed to ignore the rulings of the Supreme Court. Netanyahu met with the Noam faction leader and its sole MK, Avi Maoz, on 8 November after he threatened to boycott the coalition. Maoz demanded complete control of the Western Wall by the Haredi rabbinate and the removal of what he considered anti-Zionist and anti-Jewish content in schoolbooks. President Isaac Herzog began consultations with the heads of all the political parties on 9 November after the election results were certified. During the consultations, he expressed his reservations about Ben-Gvir becoming a member of the next government. Shas met with Likud for coalition talks on 10 November. By 11 November, Netanyahu had secured recommendations from 64 MKs, which constituted a majority. He was given the mandate to form the thirty-seventh government of Israel by President Herzog on 13 November. Otzma Yehudit and Noam officially split from Religious Zionism on 20 November, as per a pre-election agreement. On 25 November, Otzma Yehudit and Likud signed a coalition agreement, under which Ben-Gvir would assume the newly created position of National Security Minister, whose powers would be more expansive than those of the Minister of Public Security, including oversight of the Israel Police and the Israel Border Police in the West Bank, as well as granting authorities the power to shoot thieves stealing from military bases. Yitzhak Wasserlauf was given the Ministry for the Development of the Negev and the Galilee with expanded powers to regulate new West Bank settlements, while separating it from the "Periphery" portfolio, which would be given to Shas.
The deal also included giving the Ministry of Heritage to Amihai Eliyahu, separating it from the "Jerusalem Affairs" portfolio; the chairmanship of the Knesset's Public Security Committee to Zvika Fogel and that of the Special Committee for the Israeli Citizens' Fund to Limor Son Har-Melech; the post of Deputy Economic Minister to Almog Cohen; the establishment of a national guard; and the expansion of the mobilization of reservists in the Border Police. Netanyahu and Maoz signed a coalition agreement on 27 November, under which the latter would become a deputy minister, would head an agency on Jewish identity in the Prime Minister's Office, and would also head Nativ, which processes aliyah from the former Soviet Union. The agency for Jewish identity would have authority over educational content taught outside the regular curriculum in schools, in addition to the department of the Ministry of Education overseeing external teaching and partnerships, which would bring nonofficial organisations permitted to teach and lecture at schools under its purview. Likud signed a coalition agreement with the Religious Zionist Party on 1 December. Under the deal, Smotrich would serve as the Minister of Finance in rotation with Aryeh Deri, and the party would receive the post of a minister within the Ministry of Defense with control over the departments administering settlement and open lands under the Coordinator of Government Activities in the Territories, in addition to another post of a deputy minister. The deal also included giving the post of Minister of Aliyah and Integration to Ofir Sofer, the newly created National Missions Ministry to Orit Strook, and the chairmanship of the Knesset's Constitution, Law and Justice Committee to Simcha Rothman. Likud and United Torah Judaism signed a coalition agreement on 6 December, allowing Netanyahu to request an extension to the deadline.
Under it, the party would receive the Ministry of Construction and Housing; the chairmanship of the Knesset Finance Committee, which would be given to Moshe Gafni; and the Ministry of Jerusalem and Tradition (which would replace the Ministry of Jerusalem Affairs and Heritage), in addition to several posts of deputy ministers and chairmanships of Knesset committees. Likud also signed a deal with Shas by 8 December, securing interim coalition agreements with all of its allies. Under the deal, Deri would first serve as the Minister of Interior and Health, before rotating posts with Smotrich after two years. The party would also receive the Religious Services and Welfare Ministries, as well as posts of deputy ministers in the Ministries of Education and Interior. The vote to replace then-incumbent Knesset speaker Mickey Levy was scheduled for 13 December, after Likud and its allies secured the necessary number of signatures for it. Yariv Levin of Likud was elected as interim speaker with 64 votes, while his opponents Merav Ben-Ari of Yesh Atid and Ayman Odeh of Hadash received 45 and five votes respectively. After the agreement with Shas, Netanyahu asked Herzog for a 14-day extension to finalise the roles his allied parties would play. On 9 December, Herzog extended the deadline to 21 December. On that date, Netanyahu informed Herzog that he had succeeded in forming a coalition, with the new government expected to be sworn in by 2 January 2023. The government was sworn in on 29 December 2022.

Timeline

Israeli law barred people convicted of crimes from serving in the government. An amendment to that law, known colloquially as the Deri Law, was made in late 2022 to allow those who had been convicted without prison time to serve. This allowed Deri to be appointed to the cabinet. Shas leader Aryeh Deri was appointed Minister of Health, Minister of the Interior, and Vice Prime Minister in December 2022.
He was fired in January 2023, following a Supreme Court decision that his appointment was unreasonable, since he had been convicted of fraud and had promised, through a plea deal, not to seek government roles. In March 2023, Defence Minister Yoav Gallant called on the government to delay legislation related to the judicial reform. Prime Minister Netanyahu announced that Gallant had been dismissed from his position, leading to the continuation of mass protests across the country (which had started in January in Tel Aviv). Gallant continued to serve as a minister, as he had not received formal notice of dismissal, and two weeks later it was announced that Netanyahu had reversed his decision. National Security Minister Itamar Ben-Gvir (Otzma Yehudit leader) and Minister of Justice Yariv Levin (Likud) both threatened to resign if the judicial reform was delayed. After the outbreak of the Gaza war, five members of the National Unity party joined the government as ministers without portfolio, with leader Benny Gantz being made a member of the new Israeli war cabinet (along with Netanyahu and Gallant). As the war progressed, Ben-Gvir threatened to leave the government if the war was ended. A month later, in mid-December, he again threatened to leave if the war did not maintain "full strength". Gideon Sa'ar stated on 16 March 2024 that his New Hope party would resign from the government and join the opposition if Prime Minister Benjamin Netanyahu did not appoint him to the Israeli war cabinet. Netanyahu did not do so, and Sa'ar's New Hope party left the government nine days later, reducing the size of the coalition from 76 MKs to 72. Ben-Gvir and Bezalel Smotrich, of the National Religious Party–Religious Zionism, indicated that they would withdraw their parties from the government if the January 2025 Gaza war ceasefire was adopted, which would bring down the government.
Ben-Gvir announced on 5 June that the members of his party would be allowed to vote as they wished, though his party resumed its support on 9 June. On 18 May, Gantz set an 8 June deadline for withdrawal from the coalition, which was delayed by a day following the 2024 Nuseirat rescue operation. Gantz and his party left the government on 9 June, leaving the government with 64 seats in the Knesset. Sa'ar and his New Hope party rejoined the Netanyahu government on 30 September, increasing the number of seats held by the government to 68. The High Court of Justice ruled on 28 March 2024 that yeshiva funds would no longer be available for students who are "eligible for enlistment", effectively allowing ultra-Orthodox Jews to be drafted into the IDF. Attorney General Gali Baharav-Miara indicated on 31 March that the conscription process must begin on 1 April. The court ruled on 25 June that the IDF must begin to draft yeshiva students. Likud announced on 7 July that it would not put forward any legislation after Shas and United Torah Judaism said that they would boycott the plenary session over the lack of legislation dealing with the Haredi draft. The ultra-Orthodox boycott continued for a second day, with UTJ briefly ending its boycott on 9 July to vote, unsuccessfully, in favor of a bill which would have weakened the Law of Return. Yuli Edelstein, who was replaced by Boaz Bismuth on the Foreign Affairs and Defense Committee in early August, published a draft version of the conscription law shortly before his ouster. Bismuth cancelled work on the draft law in September 2025, which Edelstein called "a shame." Bismuth released the official version of the draft law in late November 2025. It weakened penalties for draft evaders, with Edelstein saying it was "the exact opposite" of the bill which he had attempted to pass. Members of Otzma Yehudit resigned from the government on 19 January 2025 over the January 2025 Gaza war ceasefire; the resignations took effect on 21 January.
The members rejoined in March, following the "resumption" of the war in Gaza. Avi Maoz of the Noam party left the government in March 2025. On 4 June 2025, senior rabbis for United Torah Judaism Dov Lando and Moshe Hillel Hirsch instructed the party's MKs to pass a bill which would dissolve the Knesset. Yesh Atid, Yisrael Beytenu and The Democrats announced that they would "submit a bill" for dissolution on 11 June, with Yesh Atid tabling the bill on 4 June. There were also reports that Shas would vote in favor of Knesset dissolution amidst division within the governing coalition over Haredi conscription. This jeopardized the coalition's majority and would have triggered new elections if the bill had passed. The following day, Agudat Yisrael, one of the United Torah Judaism factions, confirmed that it would submit a bill to dissolve the Knesset. Asher Medina, a Shas spokesman, indicated on 9 June that the party would vote in favor of a preliminary bill to dissolve the Knesset. The rabbis of Degel HaTorah instructed the party's MKs on 12 June 2025 to oppose the dissolution of the Knesset, which was followed by Yuli Edelstein and the Shas and Degel HaTorah parties announcing that a deal had been reached, with "rabbinical leaders" telling their parties to delay the dissolution vote by a week. Shas and Degel HaTorah voted against the dissolution bill, which failed its preliminary reading in a vote of 61 against and 53 in favor. MKs Ya'akov Tessler and Moshe Roth of Agudat Yisrael voted in favor of dissolution. Another dissolution bill cannot be brought forward for six months. If the bill had passed its preliminary reading, in addition to three more readings, an election would have been held in approximately three months; The Jerusalem Post posited it would have been held in October.
Degel HaTorah announced on 14 July 2025 that it would leave the government because its members were dissatisfied with Yuli Edelstein's proposed draft bill regarding Haredi exemptions from the Israeli draft. Several hours later, Agudat Yisrael announced that it would also leave the government. Deputy Transportation Minister Uri Maklev; Moshe Gafni, the head of the Knesset Finance Committee; Ya'akov Asher, the head of the Knesset Interior and Environment Protection Committee; and Jerusalem Affairs Minister Meir Porush all submitted their resignations, which took effect 48 hours later. Sports Minister Ya'akov Tessler and Special Committee for Public Petitions chair Yitzhak Pindrus also submitted resignations. Yisrael Eichler submitted his resignation as head of the Knesset Labor and Welfare Committee the same day. The resignations left Netanyahu's government with 60 seats in the Knesset, as Avi Maoz, of the Noam party, had left the government in March 2025. Despite Edelstein's ouster in August, a spokesman for UTJ head Yitzhak Goldknopf remarked that it would not change the faction's withdrawal from the government. The religious council for Shas, called the Moetzet Chachmei HaTorah, instructed the party on 16 July to leave the government but stay in the coalition. The following day, various cabinet ministers submitted their resignations, including Interior Minister Moshe Arbel, Social Affairs Minister Ya'akov Margi and Religious Services Minister Michael Malchieli. Malchieli reportedly postponed his resignation so he could attend a 20 July meeting of the panel investigating whether Attorney General Gali Baharav-Miara should be dismissed. Deputy Minister of Agriculture Moshe Abutbul, Minister of Health Uriel Buso and Haim Biton, a minister in the Education Ministry, also submitted their resignation letters, while Arbel retracted his resignation letter.
The last cabinet member from the party to submit his resignation was Labor Minister Yoav Ben-Tzur. The ministers who resigned will return to the Knesset, replacing MKs Moshe Roth, Yitzhak Pindrus and Eliyahu Baruchi.

Members of government

Listed below are the current ministers in the government:

Principles and priorities

According to the agreements signed between Likud and each of its coalition partners, and the incoming government's published guideline principles, its stated priorities are to combat the cost of living, further centralize Orthodox control over the state religious services, pass judicial reforms which include legislation to reduce judicial controls on executive and legislative power, expand settlements in the West Bank, and consider an annexation of the West Bank. Before the vote of confidence in his new government in the Knesset, Netanyahu presented three top priorities for the new government: internal security and governance, halting the nuclear program of Iran, and the development of infrastructure, with a focus on further connecting the center of the country with its periphery.

Policies

The government's flagship program, centered on reforms to the judicial branch, drew widespread criticism. Critics said it would have negative effects on the separation of powers, the office of the Attorney General, the economy, public health, women and minorities, workers' rights, scientific research, the overall strength of Israel's democracy and its foreign relations. After weeks of public protests on Israel's streets, joined by a growing number of military reservists, Minister of Defense Yoav Gallant spoke against the reform on 25 March, calling for a halt of the legislative process "for the sake of Israel's security". The next day, Netanyahu announced that Gallant would be removed from his post, sparking another wave of protests across Israel and ultimately leading to Netanyahu agreeing to pause the legislation.
On 10 April, Netanyahu announced that Gallant would keep his post. On 27 March 2023, after the public protests and general strikes, Netanyahu announced a pause in the reform process to allow for dialogue with opposition parties. However, negotiations aimed at reaching a compromise collapsed in June, and the government resumed its plans to unilaterally pass parts of the legislation. On 24 July 2023, the Knesset passed a bill that curbs the power of the Supreme Court to declare government decisions unreasonable; on 1 January 2024, the Supreme Court struck the bill down. The Knesset passed a "watered-down" version of the judicial reform package in late March 2025 which "changes the composition" of the judicial selection committee. In December 2022, Minister of National Security Itamar Ben-Gvir sought to amend the law that regulates the operations of the Israel Police, such that the ministry would have more direct control of its forces and policies, including its investigative priorities. Attorney General Gali Baharav-Miara objected to the draft proposal, raising concerns that the law would enable the politicization of police work, and the draft was amended to partially address those concerns. Nevertheless, in March 2023 Deputy Attorney General Gil Limon stated that the Attorney General's fears had been realized, referring to several instances of ministerial involvement in the day-to-day work of the otherwise independent police force – statements that were repeated by the Attorney General herself two days later. Separately, Police Commissioner Kobi Shabtai instructed deputy commissioners to avoid direct communication with the minister, later stating that "the Israel Police will remain apolitical, and act only according to law". Following appeals by the Association for Civil Rights in Israel and the Movement for Quality Government in Israel, the High Court of Justice instructed Ben-Gvir "to refrain from giving operational directions to the police...
[especially] as regards to protests and demonstrations against the government." As talk of halting the judicial reform gathered momentum during March 2023, Minister of National Security Itamar Ben-Gvir threatened to resign if the legislation implementing the changes was suspended. To appease Ben-Gvir, Prime Minister Netanyahu announced that the government would promote the creation of a new National Guard, to be headed by Ben-Gvir. On 29 March, thousands of Israelis demonstrated in Tel Aviv, Haifa and Jerusalem against this decision. On 1 April, the New York Times quoted Gadeer Nicola, head of the Arab department at the Association for Civil Rights in Israel, as saying "If this thing passes, it will be an imminent danger to the rights of Arab citizens in this country. This will create two separate systems of applying the law. The regular police which will operate against Jewish citizens — and a militarized militia to deal only with Arab citizens." The same day, while speaking on Israel's Channel 13 about those whom he would like to see enlist in the National Guard, Ben-Gvir specifically mentioned La Familia, the far-right fan club of the Beitar Jerusalem soccer team. On 2 April, Israel's cabinet approved the establishment of a law enforcement body that would operate independently of the police, under Ben-Gvir's authority. According to the decision, the minister was to establish a committee chaired by the Director General of the Ministry of National Security, with representatives of the ministries of defense, justice and finance, as well as the police and the IDF, to outline the operations of the new organization. The committee's recommendations were to be submitted to the government for consideration. Addressing a conference on 4 April, Police Commissioner Kobi Shabtai said that he was not opposed to the establishment of a security body which would answer to the police, but "a separate body? Absolutely not."
The police chief said he had warned Ben-Gvir that the establishment of a security body separate from the police was "unnecessary, with extremely high costs that may harm citizens' personal security." During a press conference on 10 April, Prime Minister Netanyahu said, in what was seen by some news outlets as a concession to the protesters, that "This will not be anyone's militia, it will be a security body, orderly, professional, that will be subordinate to one of the [existing] security bodies." The committee established by the government recommended that the establishment of the National Guard be ordered immediately, with budgets allocated. The National Guard, to be commanded by a superintendent of the police, would not be subordinate to Ben-Gvir; it would be subordinate to the police commissioner and form part of the Israel Border Police. The Ministries of Defense and Finance opposed the conclusions, and the Israeli National Security Council called for further discussion of the matter. The coalition's efforts to expand the purview of Rabbinical courts; force some organizations, such as hospitals, to enforce certain religious practices; amend the Law Prohibiting Discrimination to allow gender segregation and discrimination on the grounds of religious belief; expand funding for religious causes; and put into law the exemption of yeshiva and kolel students from conscription have drawn criticism. According to a Haaretz op-ed of 7 March 2023, "the current coalition is interested... in modifying the public space so it suits the religious lifestyle. The legal coup is meant to castrate anyone who can prevent it, most of all the HCJ." Several banks and institutional investors, including the Israel Discount Bank and AIG, have committed to avoid investing in, or providing credit to, any organization that discriminates against others on grounds of religion, race, gender or sexual orientation.
A series of technology companies and investment firms, including Wiz, Intel Israel, Salesforce and Microsoft Israel Research and Development, have criticized the proposed changes to the Law Prohibiting Discrimination, with Wiz stating that it will require its suppliers to commit to preventing discrimination. Over sixty prominent law firms pledged that they would neither represent, nor do business with, discriminating individuals and organizations. Insight Partners, a major private equity fund operating in Israel, released a statement warning against intolerance and any attempt to harm personal liberties. Orit Lahav, chief executive of the women's rights organization Mavoi Satum ("Dead End"), said that "the Rabbinical courts are the most discriminatory institution in the State of Israel... Limiting the HCJ while expanding the jurisdiction of the Rabbinical courts would... cause significant harm to women." Anat Thon Ashkenazy, Director of the Center for Democratic Values and Institutions at the Israel Democracy Institute, said that "almost every part of the reform could harm women... the meaning of an override clause is that even if the court says that the law on gender segregation is illegitimate, is harmful, the Knesset could say 'Okay, we say otherwise'". She added that "there is a very broad institutional framework here, after which there will come legislation that harms women's rights and we will have no way of protecting or stopping it." During July 2023, 20 professional medical associations signed a position letter warning against the ramifications to public health that would result from the exclusion of women from the public sphere. They cited, among others, a rise in the prevalence of risk factors for cardiovascular disease, pregnancy-related ailments, psychological distress, and the risk of suicide.
On 30 July the Knesset passed an amendment to the penal law adding sexual offenses to those offenses whose penalty can be doubled if committed on grounds of "nationalistic terrorism, racism or hostility towards a certain community". According to MK Limor Son Har-Melech, the bill is meant to penalize any individual who "[intends to] harm a woman sexually based on her Jewishness". The law was criticized by MK Gilad Kariv as "populist, nationalistic, and dangerous towards the Arab citizens of Israel", and by MK Ahmad Tibi as a "race law", and was objected to by legal advisors at the Ministry of Justice and the Knesset Committee on National Security. Activist Orit Kamir wrote that "the amendment... is neither feminist, equal, nor progressive, but the opposite: it subordinates women's sexuality to the nationalistic, racist patriarchy. It hijacks the Law for Prevention of Sexual Harassment to serve a world view that tags women as sexual objects that personify the nation's honor." Yael Sherer, director of the Lobby to Combat Sexual Violence, criticized the law as being informed by dated ideas about sexual assault, and proposed that MKs "dedicate a session... to give victims of sexual assault an opportunity to come out of the darkness... instead of [submitting] declarative bills that change nothing and are not meant but for grabbing headlines". In Israel during 2022, 24 women "were murdered because they were women," an increase of 50% compared to 2021. A law permitting courts to order men subject to a restraining order following domestic violence offenses to wear electronic tags was drafted during the previous Knesset and had passed its first reading unanimously. On 22 March 2023, the Knesset voted to reject the bill. It had been urged to do so by National Security Minister Itamar Ben-Gvir, who said that the bill was unfair to men. Earlier in the week, Ben-Gvir had blocked the measure from advancing in the ministerial legislative committee.
The MKs voting against the bill included Prime Minister Netanyahu. The Association of Families of Murder Victims said that by rejecting the law, National Security Minister Itamar Ben-Gvir "brings joy to violent men and abandons the women threatened with murder… unsupervised restraining orders endanger women's lives even more. They give women the illusion of being protected, and then they are murdered." MK Pnina Tamano-Shata, chairwoman of the Knesset Committee on the Status of Women and Gender Equality, said that "the coalition proved today that it despises women's lives." The NGO Amutat Bat Melech, which assists Orthodox and ultra-Orthodox women who suffer from domestic violence, said: "Rejecting the electronic bracelet bill is disconnected from the terrible reality of seven femicides since the beginning of the year. This is an effective tool of the first degree that could have saved lives and reduced the threat to women suffering from domestic violence. This is a matter of life and death, whose whole purpose is to provide a solution to defend women." The agreement signed by the coalition parties includes the setting up of a committee to draft changes to the Law of Return. Israeli religious parties have long demanded that the "grandchild clause" of the Law of Return be cancelled. This clause grants citizenship to anyone with at least one Jewish grandparent, as long as they do not practice another religion. If the grandchild clause were removed from the Law of Return, around 3 million people who are currently eligible for aliyah would no longer be eligible.
The heads of the Jewish Agency, the Jewish Federations of North America, the World Zionist Organization and Keren Hayesod sent a joint letter to Prime Minister Netanyahu, expressing their "deep concern" about any changes to the Law of Return, adding that "Any change in the delicate and sensitive status quo on issues such as the Law of Return or conversion could threaten to unravel the ties between us and keep us away from each other." The Executive Council of Australian Jewry and the Zionist Federation of Australia issued a joint statement saying "We… view with deep concern… proposals in relation to religious pluralism and the law of return that risk damaging Israel's… relationship with Diaspora Jewry." On 19 March 2023, Israeli Finance Minister Bezalel Smotrich spoke in Paris at a memorial service for a Likud activist. The lectern at which Smotrich spoke was covered with a flag depicting the 'Greater Land of Israel,' encompassing the whole of Mandatory Palestine, as well as Trans-Jordan. During his speech, Smotrich said that "there's no such thing as Palestinians because there's no such thing as a Palestinian people." He added that the Palestinian people are a fictitious nation invented only to fight the Zionist movement, asking "Is there a Palestinian history or culture? There isn't any." The event received widespread media coverage. On 21 March, a spokesman for the US State Department sharply criticized Smotrich's comments. "The comments, which were delivered at a podium adorned with an inaccurate and provocative map, are offensive, they are deeply concerning, and, candidly, they're dangerous. The Palestinians have a rich history and culture, and the United States greatly values our partnership with the Palestinian people," he said. 
The Jordanian Foreign Ministry also voiced disapproval: "The Israeli Minister of Finance's use, during his participation in an event held yesterday in Paris, of a map of Israel that includes the borders of the Hashemite Kingdom of Jordan and the occupied Palestinian territories represents a reckless inflammatory act, and a violation of international norms and the Jordanian-Israeli peace treaty." Additionally, a map encompassing Mandatory Palestine and Trans-Jordan with a Jordanian flag on it was placed on a central lectern in the Jordanian Parliament. Jordan's parliament voted to expel the Israeli ambassador. Israel's Ministry of Foreign Affairs released a clarification relating to the matter, stating that "Israel is committed to the 1994 peace agreement with Jordan. There has been no change in the position of the State of Israel, which recognizes the territorial integrity of the Hashemite Kingdom of Jordan". Ahead of a Europe Day event due to take place on 9 May 2023, far-right National Security Minister Itamar Ben-Gvir was assigned as a representative of the government and a speaker at the event by the government secretariat, which deals with placing ministers at receptions on the occasion of the national days of the foreign embassies. The European Union requested that Ben-Gvir not attend, but the government did not make changes to the plan. On 8 May, the European delegation to Israel cancelled the reception, stating that: "The EU Delegation to Israel is looking forward to celebrating Europe Day on May 9, as it does every year. Regrettably, this year we have decided to cancel the diplomatic reception, as we do not want to offer a platform to someone whose views contradict the values the European Union stands for. However, the Europe Day cultural event for the Israeli public will be maintained to celebrate with our friends and partners in Israel the strong and constructive bilateral relationship". 
Israel's Opposition Leader Yair Lapid stated: "Sending Itamar Ben-Gvir to a gathering of EU ambassadors is a serious professional mistake. The government is embarrassing a large group of friendly countries, jeopardizing future votes in international institutions, and damaging our foreign relations. Last year, after a decade of efforts, we succeeded in signing an economic-political agreement with the European Union that will contribute to the Israeli economy and our foreign relations. Why risk it, and for what? Ben-Gvir is not a legitimate person in the international community (and not really in Israel either), and sometimes you have to be both wise and just and simply send someone else". On 23 February 2023, Defense Minister Gallant signed an agreement assigning governmental powers in the West Bank to a body to be headed by Minister Bezalel Smotrich, who will effectively become the governor of the West Bank, controlling almost all areas of life in the area, including planning, building and infrastructure. Israeli governments have hitherto been careful to keep the occupation as a military government. The temporary holding of power by an occupying military force, pending a negotiated settlement, is a principle of international law – an expression of the prohibition against obtaining sovereignty through conquest that was introduced in the wake of World War II. An editorial in Haaretz noted that the assignment of governmental powers in the West Bank to a civilian governor, alongside the plan to expand the dual justice system so that Israeli law will apply fully to settlers in the West Bank, constitutes de jure annexation of the West Bank. 
On 26 February 2023, following the 2023 Huwara shooting in which two Israelis were killed by an unidentified attacker, hundreds of Israeli settlers attacked the Palestinian town of Huwara and three nearby villages, setting alight hundreds of Palestinian homes (some with people in them), businesses, a school, and numerous vehicles, killing one Palestinian man and injuring 100 others. Bezalel Smotrich subsequently called on Twitter for Huwara to be "wiped out" by the Israeli government. Zvika Fogel MK, of the ultra-nationalist Otzma Yehudit, which forms part of the governing coalition, said that he "looks very favorably upon" the results of the rampage. Members of the coalition proposed an amendment to the Disengagement Law, which would allow Israelis to resettle settlements vacated during the 2005 Israeli disengagement from Gaza and the northern West Bank. The evacuated settlements were considered illegal under international law by most countries. The proposal was approved for voting by the Foreign Affairs and Defense Committee on 9 March 2023, while the committee was still waiting for briefing materials from the NSS, IDF, MFA and Shin Bet, and was passed on 21 March. The US requested clarification from Israeli ambassador Michael Herzog. A US State Department spokesman stated that "The U.S. strongly urges Israel to refrain from allowing the return of settlers to the area covered by the legislation, consistent with both former Prime Minister Sharon and the current Israeli Government's commitment to the United States," noting that the actions represent a clear violation of undertakings given by the Sharon government to the Bush administration in 2005 and by Netanyahu's far-right coalition to the Biden administration the previous week. 
Minister of Communications Shlomo Karhi had initially intended to cut the funding of the Israeli Public Broadcasting Corporation (also known by its blanket branding Kan) by 400 million shekels – roughly half of its total budget – close several departments, and privatize content creation. In response, the Director-General of the European Broadcasting Union, Noel Curran, sent two urgent letters to Netanyahu, expressing his concerns and calling on the Israeli government to "safeguard the independence of our Member KAN and ensure it is allowed to operate in a sustainable way, with funding that is both stable, adequate, fair, and transparent." On 25 January 2023, nine journalist organizations representing some of Kan's competitors issued a statement of concern, acknowledging the "important contribution of public broadcasting in creating a worthy, unbiased and non-prejudicial journalistic platform", and noting that "the existence of the [broadcasting] corporation as a substantial public broadcast organization strengthens media as a whole, adding to the competition in the market rather than weakening it." They also expressed their concern that the "real reason" for the proposal was actually "an attempt to silence voices from which... [the Minister] doesn't always draw satisfaction". The same day, hundreds of journalists, actors and filmmakers protested in Tel Aviv. The proposal was eventually put on hold. On 22 February 2023 it was reported that Prime Minister Netanyahu was attempting to appoint his close associate Yossi Shelley as the deputy to the National Statistician – a highly sensitive position in charge of providing accurate data for decision makers. The appointment of Shelley, who did not possess the required qualifications for the role, was withdrawn following publication. 
In its daily editorial, Haaretz tied this attempt to the judicial reform: "once they take control of the judiciary, law enforcement and public media, they wish to control the state's data base, the dry numerical data it uses to plan its future". Netanyahu also proposed Avi Simhon for the role, and eventually froze all appointments at the Israel Central Bureau of Statistics. Also on 22 February 2023, it was revealed that Yoav Kish, the Minister of Education, was promoting a draft government decision to change the National Library of Israel's board of directors in a way that would grant him more power over the institution. In response, the Hebrew University – which owned the library until 2008 – announced that if the draft were accepted, it would withdraw its collections from the library. The university's collections, which according to the university constitute some 80% of the library's collection, include the Agnon archive, the original manuscript of Hatikvah, and the Rothschild Haggadah, the oldest known Haggadah. A group of 300 authors and poets signed an open letter against the move, further noting their objection to the "political takeover" of public broadcasting, as well as to "any legislation that will castrate the judiciary and damage the democratic foundations of the state of Israel". Several days later, it was reported that a series of donors decided to withhold their donations to the library, totaling some 80 million shekels. On 3 March a petition against the move by 1,500 academics, including Israel Prize laureates, was sent to Kish. The proposal has been seen by some as retribution against Shai Nitzan, the former State Attorney and the library's current rector. On 5 March it was reported that the Legal Advisor to the Ministry of Finance, Asi Messing, was holding up the proposal. According to Messing, the proposal – which was being promoted as part of the Economic Arrangements Law – "was not reviewed... 
by the qualified personnel in the Ministry of Finance, does not align with any of the common goals of the economic plan, was not agreed to by myself and was not approved by the Attorney General." As of February 2023, the government has been debating several proposals that would significantly weaken the Ministry of Environmental Protection, including reducing the environmental regulation of planning and development and electricity production. One of the main proposals, the transfer of a 3 billion shekel fund meant to finance waste management plants from the Ministry of Environmental Protection to the Ministry of the Interior, was eventually withdrawn. The Minister of Environmental Protection, Idit Silman, has been criticized for meeting with climate change denialists, for wasteful and personally motivated travel at the ministry's expense, for politicizing the role, and for engaging in political activity on the ministry's time. The government has been noted for an unusually high number of dismissals and resignations of senior career civil servants, and for the frequent attempts to replace them with candidates with known political associations, who are often less competent. According to sources, Netanyahu and people close to him are seeking out civil servants who were appointed by the previous government, intent on replacing them with people loyal to him. Governmental nominees for various positions have been criticized for lack of expertise. In addition to the nominee to the position of Deputy National Statistician (see above), the Director General of the Ministry of Finance, Shlomi Heisler; the Director General of the Ministry of Justice, Itamar Donenfeld; and the Director General of the Ministry of Transport, Moshe Ben Zaken, have all been criticized for incompetence, lack of familiarity with their Ministries' subject matter, lack of interest in the job, or lack of experience in managing large organizations. 
It has been reported that in some ministries, senior officials were enacting slowdowns as a means of dealing with the new ministers and director generals. On 28 July the director general of the Ministry of Education resigned, citing the societal "rift" as his reason. Asaf Zalel, a retired Air Force Brigadier General, was appointed in January. When asked about attempts to appoint his personal friend and attorney to the board of directors of a state-owned company, Minister David Amsalem replied: "that is my job, due to my authority to appoint directors. I put forward people that I know and hold in esteem". Under Minister of Transport Miri Regev, the ministry has either dismissed or lost the heads of the National Public Transport Authority, Israel Airports Authority, National Road Safety Authority, Israel Railways, and several officials in Netivei Israel. The current chair of Netivei Israel is Likud member and Regev associate Yigal Amadi, and the legal counsel is Einav Abuhzira, daughter of a former Likud branch chair. Abuhzira was appointed instead of Elad Berdugo, nephew of Netanyahu surrogate Yaakov Bardugo, after he was disqualified for the role by the Israel Government Companies Authority. In July 2023 the Minister of Communications, Shlomo Karhi, and the minister in charge of the Israel Government Companies Authority, Dudi Amsalem, deposed the chair of the Israel Postal Company, Michael Vaknin. The chair, who was hired to lead the company's financial recovery after years of operational loss and towards privatization, had gained the support of officials at the Authority and at the Ministry of Finance; nevertheless, the ministers claimed that his performance was inadequate, and nominated in his place Yiftah Ron-Tal, who has known ties to Netanyahu and Smotrich. They also nominated four new directors, two of whom have known political associations, and a third who was a witness in Netanyahu's trial. 
The coalition is allowed to spend a portion of the state's budget on a discretionary basis, meant to coax member parties to reach an agreement on the budget. As of May 2023, the government was pushing an allocation of over 13 billion shekels over two years – almost seven times the amount allocated by the previous government. Most of the funds will be allocated for uses associated with the religious, orthodox and settler communities. The head of the Budget Department at the Ministry of Finance, Yoav Gardos, objected to the allocations, claiming they would exacerbate unemployment in the Orthodox community, which is projected to cost the economy a total of 6.7 trillion shekels in lost output by 2065. At the onset of the Gaza war and the declaration of a state of national emergency, Minister of Finance Bezalel Smotrich instructed government agencies to continue with the planned distribution of discretionary funds.

Corruption

During March 2023, the government was promoting an amendment to the Law on Public Service (Gifts) that would allow Netanyahu to receive donations to fund his legal defense. The amendment follows a decision by the High Court of Justice (HCJ) that forced Netanyahu to refund US$270,000 given to him and his wife by his late cousin, Nathan Mileikowsky, for their legal defense. This is in contrast to past statements by Minister of Justice Yariv Levin, who spoke against the possible conflict of interests that can result from such transactions. The bill was opposed by the Attorney General Gali Baharav-Miara, who stressed that it could "create a real opportunity for governmental corruption", and was eventually withdrawn at the end of March. As of March 2023, the coalition was promoting a bill that would prevent judicial review of ministerial appointments. 
The bill is intended to prevent the HCJ from reviewing the appointment of the twice-convicted chairman of Shas, Aryeh Deri (convicted of bribery, fraud, and breach of trust), to a ministerial position, after his previous appointment was annulled on grounds of unreasonableness. The bill follows on the heels of another amendment that relaxed the ban on the appointment of convicted criminals, so that Deri – who was handed a suspended sentence after his second conviction – could be appointed. The bill is opposed by the Attorney General, as well as by the Knesset Legal Adviser, Sagit Afik. Israeli law allows for declaring a Prime Minister (as well as several other high-ranking public officials) to be temporarily or permanently incapacitated, but does not specify the conditions which can lead to a declaration of incapacitation. In the case of the Prime Minister, the authority to do so is given to the Attorney General. In March 2023, the coalition advanced a bill that transfers this authority from the Attorney General to the government with the approval of the Knesset committee, and clarified that incapacitation can only result from medical or mental conditions. On 3 January 2024, the Supreme Court ruled by a majority of 6 out of 11 that the validity of the law will be postponed to the next Knesset because the bill in its immediate application is a personal law and is intended to serve a distinct personal purpose. Later, the court rejected a petition regarding the definition of Netanyahu as an incapacitated prime minister due to his ongoing trial and conflict of interests.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Internet#cite_note-162]
Internet

The Internet (or internet) is the global system of interconnected computer networks that uses the Internet protocol suite (TCP/IP) to communicate between networks and devices. It is a network of networks that comprises private, public, academic, business, and government networks of local to global scope, linked by electronic, wireless, and optical networking technologies. The Internet carries a vast range of information services and resources, such as the interlinked hypertext documents and applications of the World Wide Web (WWW), electronic mail, discussion groups, internet telephony, streaming media and file sharing. Most traditional communication media, including telephone, radio, television, paper mail, newspapers, and print publishing, have been transformed by the Internet, giving rise to new media such as email, online music, digital newspapers, news aggregators, and audio and video streaming websites. The Internet has enabled and accelerated new forms of personal interaction through instant messaging, Internet forums, and social networking services. Online shopping has also grown to occupy a significant market across industries, enabling firms to extend brick and mortar presences to serve larger markets. Business-to-business and financial services on the Internet affect supply chains across entire industries. The origins of the Internet date back to research that enabled the time-sharing of computer resources, the development of packet switching, and the design of computer networks for data communication. The set of communication protocols to enable internetworking on the Internet arose from research and development commissioned in the 1970s by the Defense Advanced Research Projects Agency (DARPA) of the United States Department of Defense in collaboration with universities and researchers across the United States and in the United Kingdom and France. 
The Internet has no single centralized governance in either technological implementation or policies for access and usage. Each constituent network sets its own policies. The overarching definitions of the two principal name spaces on the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). The technical underpinning and standardization of the core protocols is an activity of the non-profit Internet Engineering Task Force (IETF).

Terminology

The word internetted was used as early as 1849, meaning interconnected or interwoven. The word Internet was used in 1945 by the United States War Department in a radio operator's manual, and in 1974 as the shorthand form of Internetwork. Today, the term Internet most commonly refers to the global system of interconnected computer networks, though it may also refer to any group of smaller networks. The word Internet may be capitalized as a proper noun, although this is becoming less common. This reflects the tendency in English to capitalize new terms and move them to lowercase as they become familiar. The word is sometimes still capitalized to distinguish the global internet from smaller networks, though many publications, including the AP Stylebook since 2016, recommend the lowercase form in every case. In 2016, the Oxford English Dictionary found that, based on a study of around 2.5 billion printed and online sources, "Internet" was capitalized in 54% of cases. The terms Internet and World Wide Web are often used interchangeably; it is common to speak of "going on the Internet" when using a web browser to view web pages. However, the World Wide Web, or the Web, is only one of a large number of Internet services. It is the global collection of web pages, documents and other web resources linked by hyperlinks and URLs. 
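The relationship between the two namespaces is observable from any networked machine: a DNS name is resolved into addresses drawn from the IP address space. A minimal Python sketch using only the standard library (resolving the loopback name `localhost` so that no external connectivity is required; the helper function name is illustrative):

```python
import socket
import ipaddress

def resolve(hostname: str) -> list[str]:
    """Map a DNS name onto the IP address space -- the two
    namespaces whose definitions are coordinated by ICANN."""
    infos = socket.getaddrinfo(hostname, None)
    # getaddrinfo may return duplicates (one per socket type);
    # deduplicate while preserving order.
    seen, addrs = set(), []
    for *_, sockaddr in infos:
        ip = sockaddr[0]
        if ip not in seen:
            seen.add(ip)
            addrs.append(ip)
    return addrs

for ip in resolve("localhost"):
    # ipaddress tells us which of the two IP address families each
    # resolved address belongs to (IPv4 or IPv6).
    print(ip, "IPv%d" % ipaddress.ip_address(ip).version)
```

The same call against a public hostname exercises the global DNS hierarchy rather than the local resolver's built-in entries.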
History

In the 1960s, computer scientists began developing systems for time-sharing of computer resources. J. C. R. Licklider proposed the idea of a universal network while working at Bolt Beranek & Newman and, later, leading the Information Processing Techniques Office at the Advanced Research Projects Agency (ARPA) of the United States Department of Defense. Research into packet switching, one of the fundamental Internet technologies, started with the work of Paul Baran at RAND in the early 1960s and, independently, Donald Davies at the United Kingdom's National Physical Laboratory in 1965. After the Symposium on Operating Systems Principles in 1967, packet switching from the proposed NPL network was incorporated into the design of the ARPANET, an experimental resource sharing network proposed by ARPA. ARPANET development began with two network nodes which were interconnected between the University of California, Los Angeles and the Stanford Research Institute on 29 October 1969. The third site was at the University of California, Santa Barbara, followed by the University of Utah. By the end of 1971, 15 sites were connected to the young ARPANET. Thereafter, the ARPANET gradually developed into a decentralized communications network, connecting remote centers and military bases in the United States. Other user networks and research networks, such as the Merit Network and CYCLADES, were developed in the late 1960s and early 1970s. Early international collaborations for the ARPANET were rare. Connections were made in 1973 to Norway (NORSAR and, later, NDRE) and to Peter Kirstein's research group at University College London, which provided a gateway to British academic networks, the first internetwork for resource sharing. ARPA projects, the International Network Working Group and commercial initiatives led to the development of various protocols and standards by which multiple separate networks could become a single network, or a network of networks. 
In 1974, Vint Cerf at Stanford University and Bob Kahn at DARPA published a proposal for "A Protocol for Packet Network Intercommunication". Cerf and his graduate students used the term internet as a shorthand for internetwork in RFC 675. The Internet Experiment Notes and later RFCs repeated this use. The work of Louis Pouzin and Robert Metcalfe had important influences on the resulting TCP/IP design. National PTTs and commercial providers developed the X.25 standard and deployed it on public data networks. The ARPANET initially served as a backbone for the interconnection of regional academic and military networks in the United States to enable resource sharing. Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In 1982, the Internet Protocol Suite (TCP/IP) was standardized, which facilitated worldwide proliferation of interconnected networks. TCP/IP network access expanded again in 1986 when the National Science Foundation Network (NSFNet) provided access to supercomputer sites in the United States for researchers, first at speeds of 56 kbit/s and later at 1.5 Mbit/s and 45 Mbit/s. The NSFNet expanded into academic and research organizations in Europe, Australia, New Zealand and Japan in 1988–89. Although other network protocols such as UUCP and PTT public data networks had global reach well before this time, this marked the beginning of the Internet as an intercontinental network. Commercial Internet service providers emerged in 1989 in the United States and Australia. The ARPANET was decommissioned in 1990. The linking of commercial networks and enterprises by the early 1990s, as well as the advent of the World Wide Web, marked the beginning of the transition to the modern Internet. 
Steady advances in semiconductor technology and optical networking created new economic opportunities for commercial involvement in the expansion of the network in its core and for delivering services to the public. In mid-1989, MCI Mail and Compuserve established connections to the Internet, delivering email and public access products to the half million users of the Internet. Just months later, on 1 January 1990, PSInet launched an alternate Internet backbone for commercial use; one of the networks that added to the core of the commercial Internet of later years. In March 1990, the first high-speed T1 (1.5 Mbit/s) link between the NSFNET and Europe was installed between Cornell University and CERN, allowing much more robust communications than were possible with satellites. Later in 1990, Tim Berners-Lee began writing WorldWideWeb, the first web browser, after two years of lobbying CERN management. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the HyperText Transfer Protocol (HTTP) 0.9, the HyperText Markup Language (HTML), the first Web browser (which was also an HTML editor and could access Usenet newsgroups and FTP files), the first HTTP server software (later known as CERN httpd), the first web server, and the first Web pages that described the project itself. In 1991 the Commercial Internet eXchange was founded, allowing PSInet to communicate with the other commercial networks CERFnet and Alternet. Stanford Federal Credit Union was the first financial institution to offer online Internet banking services to all of its members in October 1994. In 1996, OP Financial Group, also a cooperative bank, became the second online bank in the world and the first in Europe. By 1995, the Internet was fully commercialized in the U.S. when the NSFNet was decommissioned, removing the last restrictions on use of the Internet to carry commercial traffic. 
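The HTTP 0.9 protocol mentioned above was strikingly simple: the client sent a single GET line over a TCP connection, and the server replied with the raw document and closed the connection, with no status line, headers, or content type. A minimal Python sketch of that exchange (the page content and the loopback server are illustrative, not from the source; `serve_once` and `http09_get` are made-up helper names):

```python
import socket
import threading

# A toy page for the server to return, standing in for an early
# hand-written HTML document.
PAGE = b"<html><body>Hello, 1990</body></html>"

def serve_once(server: socket.socket) -> None:
    """Accept one connection and answer it HTTP/0.9-style:
    raw document, no status line, no headers."""
    conn, _ = server.accept()
    with conn:
        request = conn.recv(1024).decode("ascii")
        if request.startswith("GET "):
            conn.sendall(PAGE)

def http09_get(host: str, port: int, path: str) -> bytes:
    """An HTTP/0.9 client: send one GET line, read until EOF."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(f"GET {path}\r\n".encode("ascii"))
        chunks = []
        while data := sock.recv(1024):
            chunks.append(data)
        return b"".join(chunks)

server = socket.socket()
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve_once, args=(server,)).start()
print(http09_get("127.0.0.1", port, "/").decode("ascii"))
```

Everything later versions of HTTP added (status codes, headers, persistent connections, content negotiation) layers on top of this same request/response pattern over TCP.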
As technology advanced and commercial opportunities fueled reciprocal growth, the volume of Internet traffic began to exhibit scaling characteristics similar to those of MOS transistors, exemplified by Moore's law, doubling every 18 months. This growth, formalized as Edholm's law, was catalyzed by advances in MOS technology, laser light wave systems, and noise performance. Since 1995, the Internet has tremendously impacted culture and commerce, including the rise of near-instant communication by email, instant messaging, telephony (Voice over Internet Protocol or VoIP), two-way interactive video calls, and the World Wide Web. Increasing amounts of data are transmitted at higher and higher speeds over fiber optic networks operating at 1 Gbit/s, 10 Gbit/s, or more. The Internet continues to grow, driven by ever-greater amounts of online information and knowledge, commerce, entertainment and social networking services. During the late 1990s, it was estimated that traffic on the public Internet grew by 100 percent per year, while the mean annual growth in the number of Internet users was thought to be between 20% and 50%. This growth is often attributed to the lack of central administration, which allows organic growth of the network, as well as the non-proprietary nature of the Internet protocols, which encourages vendor interoperability and prevents any one company from exerting too much control over the network. In November 2006, the Internet was included on USA Today's list of the New Seven Wonders. As of 31 March 2011, the estimated total number of Internet users was 2.095 billion (30% of world population). It is estimated that in 1993 the Internet carried only 1% of the information flowing through two-way telecommunication. By 2000 this figure had grown to 51%, and by 2007 more than 97% of all telecommunicated information was carried over the Internet. 
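The doubling periods and annual growth rates quoted above are interchangeable by simple exponent arithmetic, assuming clean exponential growth. A quick sketch (the helper name is illustrative):

```python
# Back-of-envelope arithmetic for the growth figures above,
# assuming exponential growth throughout.

def annual_factor(doubling_months: float) -> float:
    """Yearly multiplier implied by a given doubling period."""
    return 2 ** (12 / doubling_months)

# Traffic doubling every 18 months, the Moore's-law-like figure:
print(f"{annual_factor(18):.3f}")   # 1.587, i.e. roughly 59% growth per year

# The late-1990s estimate of 100% annual growth corresponds
# exactly to a 12-month doubling time:
print(f"{annual_factor(12):.0f}")   # 2
```

So "doubling every 18 months" and "growing by about 59% per year" describe the same curve, which is why sources quote either figure.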
Modern smartphones can access the Internet through cellular carrier networks, and internet usage by mobile and tablet devices exceeded desktop worldwide for the first time in October 2016. As of 2018, 80% of the world's population were covered by a 4G network. The International Telecommunication Union (ITU) estimated that, by the end of 2017, 48% of individual users regularly connect to the Internet, up from 34% in 2012. Mobile Internet connectivity has played an important role in expanding access in recent years, especially in Asia and the Pacific and in Africa. The number of unique mobile cellular subscriptions increased from 3.9 billion in 2012 to 4.8 billion in 2016, two-thirds of the world's population, with more than half of subscriptions located in Asia and the Pacific. The limits that users face on accessing information via mobile applications coincide with a broader process of fragmentation of the Internet. Fragmentation restricts access to media content and tends to affect the poorest users the most. One solution, zero-rating, is the practice of Internet service providers allowing users free connectivity to access specific content or applications without cost.

Social impact

The Internet has enabled new forms of social interaction, activities, and social associations, giving rise to the scholarly study of the sociology of the Internet. Between 2000 and 2009, the number of Internet users globally rose from 390 million to 1.9 billion. By 2010, 22% of the world's population had access to computers with 1 billion Google searches every day, 300 million Internet users reading blogs, and 2 billion videos viewed daily on YouTube. In 2014 the world's Internet users surpassed 3 billion or 44 percent of world population, but two-thirds came from the richest countries, with 78 percent of Europeans using the Internet, followed by 57 percent of the Americas. 
However, by 2018, Asia alone accounted for 51% of all Internet users, with 2.2 billion out of the 4.3 billion Internet users in the world. China's Internet users surpassed a major milestone in 2018, when the country's Internet regulatory authority, China Internet Network Information Centre, announced that China had 802 million users. China was followed by India, with some 700 million users, with the United States third with 275 million users. However, in terms of penetration, in 2022, China had a 70% penetration rate compared to India's 60% and the United States's 90%. In 2022, 54% of the world's Internet users were based in Asia, 14% in Europe, 7% in North America, 10% in Latin America and the Caribbean, 11% in Africa, 4% in the Middle East and 1% in Oceania. In 2019, Kuwait, Qatar, the Falkland Islands, Bermuda and Iceland had the highest Internet penetration by the number of users, with 93% or more of the population with access. As of 2022, it was estimated that 5.4 billion people use the Internet, more than two-thirds of the world's population. Early computer systems were limited to the characters in the American Standard Code for Information Interchange (ASCII), a subset of the Latin alphabet. After English (27%), the most requested languages on the World Wide Web are Chinese (25%), Spanish (8%), Japanese (5%), Portuguese and German (4% each), Arabic, French and Russian (3% each), and Korean (2%). Modern character encoding standards, such as Unicode, allow for development and communication in the world's widely used languages. However, some glitches such as mojibake (incorrect display of some languages' characters) still remain. 
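The mojibake mentioned above is easy to reproduce: text encoded in one character encoding and decoded with another comes out garbled wherever a character falls outside ASCII. A minimal Python illustration (the sample word is arbitrary):

```python
# Mojibake in miniature: UTF-8 bytes decoded with a mismatched
# legacy encoding (Latin-1) turn into garbage characters.

text = "café"                       # 'é' falls outside ASCII
utf8_bytes = text.encode("utf-8")   # b'caf\xc3\xa9': 'é' takes two bytes

garbled = utf8_bytes.decode("latin-1")
print(garbled)                      # cafÃ© -- each UTF-8 byte misread as one character

# Decoding with the correct encoding round-trips cleanly:
assert utf8_bytes.decode("utf-8") == text
```

The ASCII range is identical in both encodings, which is why only the non-ASCII character is mangled; standards like Unicode with UTF-8 avoid the problem by giving every character a single agreed encoding.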
Several neologisms exist that refer to Internet users: Netizen (as in "citizen of the net") refers to those actively involved in improving online communities, the Internet in general or surrounding political affairs and rights such as free speech; Internaut refers to operators or technically highly capable users of the Internet; and digital citizen refers to a person using the Internet in order to engage in society, politics, and government participation. The Internet allows greater flexibility in working hours and location, especially with the spread of unmetered high-speed connections. The Internet can be accessed almost anywhere by numerous means, including through mobile Internet devices. Mobile phones, datacards, handheld game consoles and cellular routers allow users to connect to the Internet wirelessly. Educational material at all levels from pre-school (e.g. CBeebies) to post-doctoral (e.g. scholarly literature through Google Scholar) is available on websites. The internet has facilitated the development of virtual universities and distance education, enabling both formal and informal education. The Internet allows researchers to conduct research remotely via virtual laboratories, with profound changes in reach and generalizability of findings as well as in communication between scientists and in the publication of results. By the late 2010s the Internet had been described as "the main source of scientific information" for the majority of the global North population. Wikis have also been used in the academic community for sharing and dissemination of information across institutional and international boundaries. In those settings, they have been found useful for collaboration on grant writing, strategic planning, departmental documentation, and committee work. The United States Patent and Trademark Office uses a wiki to allow the public to collaborate on finding prior art relevant to examination of pending patent applications. 
Queens, New York, has used a wiki to allow citizens to collaborate on the design and planning of a local park. The English Wikipedia has the largest user base among wikis on the World Wide Web and ranks in the top 10 among all sites in terms of traffic. The Internet has been a major outlet for leisure activity since its inception, with entertaining social experiments such as MUDs and MOOs being conducted on university servers, and humor-related Usenet groups receiving much traffic. Many Internet forums have sections devoted to games and funny videos. Another area of leisure activity on the Internet is multiplayer gaming. This form of recreation creates communities, where people of all ages and origins enjoy the fast-paced world of multiplayer games. These range from MMORPGs to first-person shooters, from role-playing video games to online gambling. While online gaming has been around since the 1970s, modern modes of online gaming began with subscription services such as GameSpy and MPlayer. Streaming media is the real-time delivery of digital media for immediate consumption or enjoyment by end users. Streaming companies (such as Netflix, Disney+, Amazon's Prime Video, Mubi, Hulu, and Apple TV+) now dominate the entertainment industry, eclipsing traditional broadcasters. Audio streamers such as Spotify and Apple Music also have significant market share in the audio entertainment market. Video sharing websites are also a major factor in the entertainment ecosystem. YouTube was founded on 15 February 2005 and is now the leading website for free streaming video, with more than two billion users. It uses a web player to stream and show video files. YouTube users watch hundreds of millions, and upload hundreds of thousands, of videos daily.
Other video sharing websites include Vimeo, Instagram and TikTok. Although many governments have attempted to restrict both Internet pornography and online gambling, this has generally failed to stop their widespread popularity. A number of advertising-funded ostensible video sharing websites known as "tube sites" have been created to host shared pornographic video content. Due to laws requiring the documentation of the origin of pornography, these websites now largely operate in conjunction with pornographic movie studios and their own independent creator networks, acting as de facto video streaming services. Major players in this field include the market leader Aylo, the operator of PornHub and numerous other branded sites, as well as other independent operators such as xHamster and Xvideos. As of 2023, Internet traffic to pornographic video sites rivalled that of mainstream video streaming and sharing services. Remote work is facilitated by tools such as groupware, virtual private networks, conference calling, videotelephony, and VoIP, so that work may be performed from any location, such as the worker's home. The spread of low-cost Internet access in developing countries has opened up new possibilities for peer-to-peer charities, which allow individuals to contribute small amounts to charitable projects for other individuals. Websites such as DonorsChoose and GlobalGiving allow small-scale donors to direct funds to individual projects of their choice. A popular twist on Internet-based philanthropy is the use of peer-to-peer lending for charitable purposes. Kiva pioneered this concept in 2005, offering the first web-based service to publish individual loan profiles for funding. The low cost and nearly instantaneous sharing of ideas, knowledge, and skills have made collaborative work dramatically easier, with the help of collaborative software, which allows groups to form easily, communicate cheaply, and share ideas.
An example of such collaboration is the free software movement, which has produced, among other things, Linux, Mozilla Firefox, and OpenOffice.org (later forked into LibreOffice). Content management systems allow collaborating teams to work on shared sets of documents simultaneously without accidentally destroying each other's work. The Internet also allows for cloud computing, virtual private networks, remote desktops, and remote work. The online disinhibition effect describes the tendency of many individuals to behave more stridently or offensively online than they would in person. A significant number of feminist women have been the target of various forms of harassment, ranging from insults and hate speech to, in extreme cases, rape and death threats, in response to posts they have made on social media. Social media companies have been criticized in the past for not doing enough to aid victims of online abuse. Children also face dangers online, such as cyberbullying and approaches by sexual predators, who sometimes pose as children themselves. Due to naivety, children may also post personal information about themselves online, which could put them or their families at risk unless warned not to do so. Many parents choose to enable Internet filtering or supervise their children's online activities in an attempt to protect their children from pornography or violent content on the Internet. The most popular social networking services commonly forbid users under the age of 13. However, these policies can be circumvented by registering an account with a false birth date, and a significant number of children aged under 13 join such sites. Social networking services for younger children, which claim to provide better levels of protection, also exist. Internet usage has been correlated with users' loneliness.
Lonely people tend to use the Internet as an outlet for their feelings and to share their stories with others, as in the "I am lonely will anyone speak to me" thread. Cyberslacking can become a drain on corporate resources; employees spend a significant amount of time surfing the Web while at work. Internet addiction disorder is excessive computer use that interferes with daily life. Nicholas G. Carr believes that Internet use has other effects on individuals, for instance improving scan-reading skills while interfering with the deep thinking that leads to true creativity. Electronic business encompasses business processes spanning the entire value chain: purchasing, supply chain management, marketing, sales, customer service, and business relationships. E-commerce seeks to add revenue streams by using the Internet to build and enhance relationships with clients and partners. According to International Data Corporation, worldwide e-commerce, combining global business-to-business and business-to-consumer transactions, equated to $16 trillion in 2013. A report by Oxford Economics added those two together to estimate the total size of the digital economy at $20.4 trillion, equivalent to roughly 13.8% of global sales. While much has been written about the economic advantages of Internet-enabled commerce, there is also evidence that some aspects of the Internet, such as maps and location-aware services, may serve to reinforce economic inequality and the digital divide. Electronic commerce may be responsible for consolidation and the decline of mom-and-pop, brick-and-mortar businesses, resulting in increases in income inequality. A 2013 Institute for Local Self-Reliance report states that brick-and-mortar retailers employ 47 people for every $10 million in sales, while Amazon employs only 14. Similarly, the 700-employee room rental start-up Airbnb was valued at $10 billion in 2014, about half as much as Hilton Worldwide, which employs 152,000 people.
At that time, Uber employed 1,000 full-time employees and was valued at $18.2 billion, about the same valuation as Avis Rent a Car and The Hertz Corporation combined, which together employed almost 60,000 people. Advertising on popular web pages can be lucrative, and e-commerce, the sale of products and services directly via the Web, continues to grow. Online advertising is a form of marketing and advertising which uses the Internet to deliver promotional marketing messages to consumers. It includes email marketing, search engine marketing (SEM), social media marketing, many types of display advertising (including web banner advertising), and mobile advertising. In 2011, Internet advertising revenues in the United States surpassed those of cable television and nearly exceeded those of broadcast television. Many common online advertising practices are controversial and increasingly subject to regulation. The Internet has achieved new relevance as a political tool. The presidential campaign of Howard Dean in 2004 in the United States was notable for its success in soliciting donations via the Internet. Many political groups use the Internet as a new method of organizing to carry out their missions, giving rise to Internet activism. Social media websites, such as Facebook and Twitter, helped people organize the Arab Spring by helping activists organize protests, communicate grievances, and disseminate information. Many have understood the Internet as an extension of the Habermasian notion of the public sphere, observing how network communication technologies provide something like a global civic forum. However, incidents of politically motivated Internet censorship have now been recorded in many countries, including western democracies. E-government is the use of technological communications devices, such as the Internet, to provide public services to citizens and other persons in a country or region.
E-government offers opportunities for more direct and convenient citizen access to government and for government provision of services directly to citizens. Cybersectarianism is a new organizational form that involves highly dispersed small groups of practitioners that may remain largely anonymous within the larger social context and operate in relative secrecy, while still linked remotely to a larger network of believers who share a set of practices and texts, and often a common devotion to a particular leader. Overseas supporters provide funding and support; domestic practitioners distribute tracts, participate in acts of resistance, and share information on the internal situation with outsiders. Collectively, members and practitioners of such sects construct viable virtual communities of faith, exchanging personal testimonies and engaging in collective study via email, online chat rooms, and web-based message boards. In particular, the British government has raised concerns about the prospect of young British Muslims being indoctrinated into Islamic extremism by material on the Internet, being persuaded to join terrorist groups such as the so-called "Islamic State", and then potentially committing acts of terrorism on returning to Britain after fighting in Syria or Iraq. Applications and services The Internet carries many applications and services, most prominently the World Wide Web, including social media, electronic mail, mobile applications, multiplayer online games, Internet telephony, file sharing, and streaming media services. The World Wide Web is a global collection of documents, images, multimedia, applications, and other resources, logically interrelated by hyperlinks and referenced with Uniform Resource Identifiers (URIs), which provide a global system of named references. URIs symbolically identify services, web servers, databases, and the documents and resources that they can provide.
HyperText Transfer Protocol (HTTP) is the main access protocol of the World Wide Web. Web services also use HTTP for communication between software systems, for information transfer, and for sharing and exchanging business data and logistics; it is one of many protocols that can be used for communication on the Internet. World Wide Web browser software, such as Microsoft Edge, Mozilla Firefox, Opera, Apple's Safari, and Google Chrome, enables users to navigate from one web page to another via the hyperlinks embedded in the documents. These documents may also contain computer data, including graphics, sounds, text, video, multimedia and interactive content. Client-side scripts can include animations, games, office applications and scientific demonstrations. Email is an important communications service available via the Internet. The concept of sending electronic text messages between parties, analogous to mailing letters or memos, predates the creation of the Internet. Internet telephony is a common communications service realized with the Internet. The name of the principal internetworking protocol, the Internet Protocol, lends its name to voice over Internet Protocol (VoIP). VoIP systems now dominate many markets, being as easy and convenient as a traditional telephone while offering substantial cost savings, especially over long distances. File sharing is the practice of transferring large amounts of data in the form of computer files across the Internet, for example via file servers. The load of bulk downloads to many users can be eased by the use of "mirror" servers or peer-to-peer networks. Access to a file may be controlled by user authentication, the transit of the file over the Internet may be obscured by encryption, and money may change hands for access to the file. The price can be paid by the remote charging of funds from, for example, a credit card whose details are also passed—usually fully encrypted—across the Internet.
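The request/response shape of HTTP is simple enough to show inline. A sketch in Python that builds an HTTP/1.1 request by hand and parses a canned response (the host and payload are hypothetical; no network access is involved):

```python
# What a browser sends to fetch a page over HTTP/1.1:
request = (
    "GET /index.html HTTP/1.1\r\n"
    "Host: example.com\r\n"      # Host header is mandatory in HTTP/1.1
    "Connection: close\r\n"
    "\r\n"                        # blank line terminates the header block
)

# A canned (hypothetical) server response, headers then body:
raw_response = (
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: text/html\r\n"
    "Content-Length: 13\r\n"
    "\r\n"
    "<p>Hello</p>\n"
)

# Split head from body at the first blank line, then parse the headers.
head, _, body = raw_response.partition("\r\n\r\n")
status_line, *header_lines = head.split("\r\n")
headers = dict(line.split(": ", 1) for line in header_lines)

print(status_line)                                   # HTTP/1.1 200 OK
print(headers["Content-Type"])                       # text/html
print(len(body) == int(headers["Content-Length"]))   # True
```

Real clients would of course use an HTTP library rather than string handling, but the wire format is exactly this plain-text exchange.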
The origin and authenticity of the file received may be checked by a digital signature. Governance The Internet is a global network that comprises many voluntarily interconnected autonomous networks. It operates without a central governing body. The technical underpinning and standardization of the core protocols (IPv4 and IPv6) is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise. While the hardware components in the Internet infrastructure can often be used to support other software systems, it is the design and the standardization process of the software that characterizes the Internet and provides the foundation for its scalability and success. The responsibility for the architectural design of the Internet software systems has been assumed by the IETF. The IETF conducts standard-setting work groups, open to any individual, about the various aspects of Internet architecture. The resulting contributions and standards are published as Request for Comments (RFC) documents on the IETF web site. The principal methods of networking that enable the Internet are contained in specially designated RFCs that constitute the Internet Standards. Other less rigorous documents are simply informative, experimental, or historical, or document the best current practices when implementing Internet technologies. To maintain interoperability, the principal name spaces of the Internet are administered by the Internet Corporation for Assigned Names and Numbers (ICANN). ICANN is governed by an international board of directors drawn from across the Internet technical, business, academic, and other non-commercial communities. The organization coordinates the assignment of unique identifiers for use on the Internet, including domain names, IP addresses, application port numbers in the transport protocols, and many other parameters. 
Globally unified name spaces are essential for maintaining the global reach of the Internet. This role of ICANN distinguishes it as perhaps the only central coordinating body for the global Internet. The National Telecommunications and Information Administration, an agency of the United States Department of Commerce, had final approval over changes to the DNS root zone until the IANA stewardship transition on 1 October 2016. Regional Internet registries (RIRs) were established for five regions of the world to assign IP address blocks and other Internet parameters to local registries, such as Internet service providers, from a designated pool of addresses set aside for each region. The Internet Society (ISOC) was founded in 1992 with a mission to "assure the open development, evolution and use of the Internet for the benefit of all people throughout the world". Its members include individuals as well as corporations, organizations, governments, and universities. Among other activities, ISOC provides an administrative home for a number of less formally organized groups that are involved in developing and managing the Internet, including the Internet Engineering Task Force (IETF), Internet Architecture Board (IAB), Internet Engineering Steering Group (IESG), Internet Research Task Force (IRTF), and Internet Research Steering Group (IRSG). On 16 November 2005, the United Nations-sponsored World Summit on the Information Society in Tunis established the Internet Governance Forum (IGF) to discuss Internet-related issues. Infrastructure The communications infrastructure of the Internet consists of its hardware components and a system of software layers that control various aspects of the architecture. As with any computer network, the Internet physically consists of routers, media (such as cabling and radio links), repeaters, and modems.
However, as an example of internetworking, many of the network nodes are not necessarily Internet equipment per se. Internet packets are carried by other full-fledged networking protocols, with the Internet acting as a homogeneous networking standard, running across heterogeneous hardware, with the packets guided to their destinations by IP routers. Internet service providers (ISPs) establish worldwide connectivity between individual networks at various levels of scope. At the top of the routing hierarchy are the tier 1 networks, large telecommunication companies that exchange traffic directly with each other via very high speed fiber-optic cables, governed by peering agreements. Tier 2 and lower-level networks buy Internet transit from other providers to reach at least some parties on the global Internet, though they may also engage in peering. End-users, who only access the Internet when needed to perform a function or obtain information, represent the bottom of the routing hierarchy. An ISP may use a single upstream provider for connectivity, or implement multihoming to achieve redundancy and load balancing. Internet exchange points are major traffic exchanges with physical connections to multiple ISPs. Large organizations, such as academic institutions, large enterprises, and governments, may perform the same function as ISPs, engaging in peering and purchasing transit on behalf of their internal networks. Research networks tend to interconnect with large subnetworks such as GEANT, GLORIAD, Internet2, and the UK's national research and education network, JANET. Common methods of Internet access by users include broadband over coaxial cable, fiber optics or copper wires, Wi-Fi, satellite, and cellular telephone technology. Grassroots efforts have led to wireless community networks.
Commercial Wi-Fi services that cover large areas are available in many cities, such as New York, London, Vienna, Toronto, San Francisco, Philadelphia, Chicago and Pittsburgh. Most servers that provide Internet services are today hosted in data centers, and content is often accessed through high-performance content delivery networks. Colocation centers often host private peering connections between their customers, Internet transit providers, cloud providers, meet-me rooms for connecting customers together, Internet exchange points, and landing points and terminal equipment for fiber-optic submarine communication cables, connecting the Internet. Internet Protocol Suite The Internet standards describe a framework known as the Internet protocol suite (also called TCP/IP, after its first two principal components). This is a suite of protocols that are ordered into a set of four conceptual layers by the scope of their operation, originally documented in RFC 1122 and RFC 1123. The most prominent component of the Internet model is the Internet Protocol. IP enables internetworking, essentially establishing the Internet itself. Two versions of the Internet Protocol exist, IPv4 and IPv6. Aside from the complex array of physical connections that make up its infrastructure, the Internet is facilitated by bi- or multi-lateral commercial contracts (e.g., peering agreements) and by technical specifications or protocols that describe the exchange of data over the network. For locating individual computers on the network, the Internet provides IP addresses. IP addresses are used by the Internet infrastructure to direct Internet packets to their destinations. They consist of fixed-length numbers, which are found within the packet. IP addresses are generally assigned to equipment either automatically via the Dynamic Host Configuration Protocol, or configured manually. The Domain Name System (DNS) converts user-entered domain names (e.g.
"en.wikipedia.org") into IP addresses. Internet Protocol version 4 (IPv4) defines an IP address as a 32-bit number. IPv4 is the initial version used on the first generation of the Internet and is still in dominant use. It was designed in 1981 to address up to ≈4.3 billion (2^32) hosts. However, the explosive growth of the Internet has led to IPv4 address exhaustion, which entered its final stage in 2011, when the global IPv4 address allocation pool was exhausted. Because of the growth of the Internet and the depletion of available IPv4 addresses, a new version of IP, IPv6, was developed in the mid-1990s, which provides vastly larger addressing capabilities and more efficient routing of Internet traffic. IPv6 uses 128 bits for the IP address and was standardized in 1998. IPv6 deployment has been ongoing since the mid-2000s and is currently in growing deployment around the world, since Internet address registries began to urge all resource managers to plan rapid adoption and conversion. By design, IPv6 is not directly interoperable with IPv4. Instead, it establishes a parallel version of the Internet not directly accessible with IPv4 software. Thus, translation facilities exist for internetworking, and some nodes have duplicate networking software for both networks. Essentially all modern computer operating systems support both versions of the Internet Protocol. Network infrastructure, however, has been lagging in this development. A subnet or subnetwork is a logical subdivision of an IP network. Computers that belong to a subnet are addressed with an identical most-significant bit-group in their IP addresses. This results in the logical division of an IP address into two fields, the network number or routing prefix and the rest field or host identifier.
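The 32-bit versus 128-bit difference between the two protocol versions can be seen directly with Python's standard library (the sample addresses are drawn from documentation ranges):

```python
import socket

# inet_pton packs a textual address into its binary wire format.
v4 = socket.inet_pton(socket.AF_INET, "198.51.100.7")   # IPv4 address
v6 = socket.inet_pton(socket.AF_INET6, "2001:db8::7")   # IPv6 address

print(len(v4) * 8)   # 32  -> an address space of 2**32 (~4.3 billion)
print(len(v6) * 8)   # 128 -> an address space of 2**128
```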
The rest field is an identifier for a specific host or network interface. The routing prefix may be expressed in Classless Inter-Domain Routing (CIDR) notation, written as the first address of a network, followed by a slash character (/), and ending with the bit-length of the prefix. For example, 198.51.100.0/24 is the prefix of the Internet Protocol version 4 network starting at the given address, having 24 bits allocated for the network prefix and the remaining 8 bits reserved for host addressing. Addresses in the range 198.51.100.0 to 198.51.100.255 belong to this network. The IPv6 address specification 2001:db8::/32 is a large address block with 2^96 addresses, having a 32-bit routing prefix. For IPv4, a network may also be characterized by its subnet mask or netmask, which is the bitmask that, when applied by a bitwise AND operation to any IP address in the network, yields the routing prefix. Subnet masks are also expressed in dot-decimal notation like an address. For example, 255.255.255.0 is the subnet mask for the prefix 198.51.100.0/24. Computers and routers use routing tables in their operating system to forward IP packets to reach a node on a different subnetwork. Routing tables are maintained by manual configuration or automatically by routing protocols. End-nodes typically use a default route that points toward an ISP providing transit, while ISP routers use the Border Gateway Protocol to establish the most efficient routing across the complex connections of the global Internet. The default gateway is the node that serves as the forwarding host (router) to other networks when no other route specification matches the destination IP address of a packet. Security Internet resources, hardware, and software components are the target of criminal or malicious attempts to gain unauthorized control to cause interruptions, commit fraud, engage in blackmail or access private information.
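The 198.51.100.0/24 arithmetic described above (CIDR prefix, netmask, bitwise AND) can be checked with Python's standard ipaddress module:

```python
import ipaddress

# The /24 example network from the text.
net = ipaddress.ip_network("198.51.100.0/24")
print(net.netmask)        # 255.255.255.0  (dot-decimal form of the /24 mask)
print(net.num_addresses)  # 256            (8 host bits -> 2**8 addresses)
print(ipaddress.ip_address("198.51.100.255") in net)  # True

# The routing prefix is what a bitwise AND with the netmask recovers:
host = int(ipaddress.ip_address("198.51.100.42"))
mask = int(net.netmask)
print(ipaddress.ip_address(host & mask))  # 198.51.100.0

# The IPv6 block 2001:db8::/32 leaves 128 - 32 = 96 host bits:
v6net = ipaddress.ip_network("2001:db8::/32")
print(v6net.num_addresses == 2**96)       # True
```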
Malware is malicious software used and distributed via the Internet. It includes computer viruses, which are copied with the help of humans; computer worms, which copy themselves automatically; software for denial-of-service attacks; ransomware; botnets; and spyware that reports on the activity and typing of users. Usually, these activities constitute cybercrime. Defense theorists have also speculated about the possibility of hackers waging cyber warfare with similar methods on a large scale. Malware poses serious problems to individuals and businesses on the Internet. According to Symantec's 2018 Internet Security Threat Report (ISTR), the number of malware variants rose to 669,947,865 in 2017, twice as many as in 2016. Cybercrime, which includes malware attacks as well as other crimes committed by computer, was predicted to cost the world economy US$6 trillion in 2021, and is increasing at a rate of 15% per year. Since 2021, malware has been designed to target computer systems that run critical infrastructure such as the electricity distribution network. Malware can be designed to evade antivirus software detection algorithms. The vast majority of computer surveillance involves the monitoring of data and traffic on the Internet. In the United States, for example, under the Communications Assistance For Law Enforcement Act, all phone calls and broadband Internet traffic (emails, web traffic, instant messaging, etc.) are required to be available for unimpeded real-time monitoring by Federal law enforcement agencies. Under the Act, all U.S.
telecommunications providers are required to install packet sniffing technology to allow Federal law enforcement and intelligence agencies to intercept all of their customers' broadband Internet and VoIP traffic.[d] The large amount of data gathered from packet capture requires surveillance software that filters and reports relevant information, such as the use of certain words or phrases, the access to certain types of web sites, or communicating via email or chat with certain parties. Agencies, such as the Information Awareness Office, NSA, GCHQ and the FBI, spend billions of dollars per year to develop, purchase, implement, and operate systems for interception and analysis of data. Similar systems are operated by Iranian secret police to identify and suppress dissidents. The required hardware and software were allegedly installed by German Siemens AG and Finnish Nokia. Some governments, such as those of Myanmar, Iran, North Korea, Mainland China, Saudi Arabia and the United Arab Emirates, restrict access to content on the Internet within their territories, especially to political and religious content, with domain name and keyword filters. In Norway, Denmark, Finland, and Sweden, major Internet service providers have voluntarily agreed to restrict access to sites listed by authorities. While this list of forbidden resources is supposed to contain only known child pornography sites, the content of the list is secret. Many countries, including the United States, have enacted laws against the possession or distribution of certain material, such as child pornography, via the Internet but do not mandate filter software. 
Many free or commercially available software programs, called content-control software, are available to users to block specific offensive content on individual computers or networks, in order to limit access by children to pornographic material or depictions of violence. Performance As the Internet is a heterogeneous network, its physical characteristics, including, for example, the data transfer rates of connections, vary widely. It exhibits emergent phenomena that depend on its large-scale organization. [Figure: global Internet traffic volume in petabytes per month, 1990–2015] The volume of Internet traffic is difficult to measure because no single point of measurement exists in the multi-tiered, non-hierarchical topology. Traffic data may be estimated from the aggregate volume through the peering points of the Tier 1 network providers, but traffic that stays local in large provider networks may not be accounted for. An Internet blackout or outage can be caused by local signaling interruptions. Disruptions of submarine communications cables may cause blackouts or slowdowns to large areas, such as in the 2008 submarine cable disruption. Less-developed countries are more vulnerable due to the small number of high-capacity links. Land cables are also vulnerable, as in 2011 when a woman digging for scrap metal severed most connectivity for the nation of Armenia. Internet blackouts affecting almost entire countries can be achieved by governments as a form of Internet censorship, as in the blockage of the Internet in Egypt, whereby approximately 93% of networks were without access in 2011 in an attempt to stop mobilization for anti-government protests.
Estimates of the Internet's electricity usage have been the subject of controversy: a 2014 peer-reviewed research paper found claims published in the literature during the preceding decade differing by a factor of 20,000, ranging from 0.0064 kilowatt-hours per gigabyte transferred (kWh/GB) to 136 kWh/GB. The researchers attributed these discrepancies mainly to the year of reference (i.e. whether efficiency gains over time had been taken into account) and to whether "end devices such as personal computers and servers are included" in the analysis. In 2011, academic researchers estimated the overall energy used by the Internet to be between 170 and 307 GW, less than two percent of the energy used by humanity. This estimate included the energy needed to build, operate, and periodically replace the estimated 750 million laptops, a billion smartphones and 100 million servers worldwide, as well as the energy that routers, cell towers, optical switches, Wi-Fi transmitters and cloud storage devices use when transmitting Internet traffic. According to a non-peer-reviewed study published in 2018 by The Shift Project (a French think tank funded by corporate sponsors), nearly 4% of global CO2 emissions could be attributed to global data transfer and the necessary infrastructure. The study also said that online video streaming alone accounted for 60% of this data transfer and therefore contributed over 300 million tons of CO2 emissions per year, and argued for new "digital sobriety" regulations restricting the use and size of video files.
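The "factor of 20,000" spread reported in that 2014 review follows directly from the two endpoint figures it cites:

```python
# Endpoint estimates of Internet energy intensity from the cited 2014 review.
low_kwh_per_gb = 0.0064    # lowest published estimate, kWh per GB transferred
high_kwh_per_gb = 136.0    # highest published estimate

spread = high_kwh_per_gb / low_kwh_per_gb
print(round(spread))       # 21250, i.e. roughly the factor of 20,000 cited
```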
========================================
[SOURCE: https://en.wikipedia.org/wiki/Jews#cite_note-Kunin2000-85] | [TOKENS: 15852]
Jews Jews (Hebrew: יְהוּדִים‎, ISO 259-2: Yehudim, Israeli pronunciation: [jehuˈdim]), or the Jewish people, are an ethnoreligious group and nation, originating from the Israelites of ancient Israel and Judah. They traditionally adhere to Judaism. Jewish ethnicity, religion, and community are highly interrelated, as Judaism is an ethnic religion, though many ethnic Jews do not practice it. Religious Jews regard converts to Judaism as members of the Jewish nation, pursuant to the long-standing conversion process. The Israelites emerged from the pre-existing Canaanite peoples to establish Israel and Judah in the Southern Levant during the Iron Age. Originally, the term Jews referred to the inhabitants of the kingdom of Judah, distinguished from the gentiles and the Samaritans. According to the Hebrew Bible, these inhabitants predominantly originated from the tribe of Judah, descendants of Judah, the fourth son of Jacob. The tribe of Benjamin was another significant demographic in Judah, and its members were considered Jews too. By the late 6th century BCE, Judaism had evolved from the Israelite religion, dubbed Yahwism (for Yahweh) by modern scholars, having a theology that religious Jews believe to be the expression of the Mosaic covenant between God and the Jewish people. After the Babylonian exile, Jews referred to followers of Judaism, descendants of the Israelites, citizens of Judea, or allies of the Judean state. Jewish migration within the Mediterranean region during the Hellenistic period, followed by population transfers caused by events like the Jewish–Roman wars, gave rise to the Jewish diaspora, consisting of diverse Jewish communities that maintained their sense of Jewish history, identity, and culture.
In the following millennia, Jewish diaspora communities coalesced into three major ethnic subdivisions according to where their ancestors settled: the Ashkenazim (Central and Eastern Europe), the Sephardim (Iberian Peninsula), and the Mizrahim (Middle East and North Africa). While these three major divisions account for most of the world's Jews, there are other smaller Jewish groups outside of the three. Prior to World War II, the global Jewish population reached a peak of 16.7 million, representing around 0.7% of the world's population at that time. During World War II, approximately six million Jews throughout Europe were systematically murdered by Nazi Germany in a genocide known as the Holocaust. Since then, the population has slowly risen again and, as of 2021, was estimated by the demographer Sergio Della Pergola at 15.2 million, less than 0.2% of the total world population.[b] Today, over 85% of Jews live in Israel or the United States. Israel, whose population is 73.9% Jewish, is the only country where Jews comprise more than 2.5% of the population. Jews have significantly influenced and contributed to human progress in many fields, both historically and in modern times, including science and technology, philosophy, ethics, literature, governance, business, art, music, comedy, theatre, cinema, architecture, food, medicine, and religion. Jews founded Christianity and had an indirect but profound influence on Islam. In these ways and others, Jews have played a significant role in the development of Western culture. Name and etymology The term "Jew" is derived from the Hebrew word יְהוּדִי Yehudi, with the plural יְהוּדִים Yehudim. Endonyms in other Jewish languages include the Ladino ג׳ודיו Djudio (plural ג׳ודיוס, Djudios) and the Yiddish ייִד Yid (plural ייִדן Yidn).
Though Genesis 29:35 and 49:8 connect "Judah" with the verb yada, meaning "praise", scholars generally agree that "Judah" most likely derives from the name of a Levantine geographic region dominated by gorges and ravines. The gradual ethnonymic shift from "Israelites" to "Jews", applied regardless of descent from Judah, is not contained in the Torah but is made explicit in the Book of Esther (4th century BCE) of the Tanakh. Some modern scholars disagree with the conflation, based on the works of Josephus, Philo and the Apostle Paul. The English word "Jew" derives from Middle English Gyw, Iewe, a loan from Old French giu, which evolved from the earlier juieu, in turn from judieu/iudieu, which through elision had dropped the letter "d" of the Medieval Latin Iudaeus; like the New Testament Greek term Ioudaios, the Latin meant both "Jew" and "Judean" / "of Judea". The Greek term was a loan from Aramaic *yahūdāy, corresponding to Hebrew יְהוּדִי Yehudi. Some scholars prefer translating Ioudaios as "Judean" in the Bible since it is more precise, denotes the community's origins and prevents readers from engaging in antisemitic eisegesis. Others disagree, believing that it erases the Jewish identity of Biblical characters such as Jesus. Daniel R. Schwartz distinguishes "Judean" from "Jew": for him, "Judean" refers to the inhabitants of Judea, which encompassed southern Palestine, while "Jew" refers to the descendants of Israelites who adhere to Judaism, converts included. Shaye J. D. Cohen, by contrast, argues that "Judean" is inclusive of believers in the Judean God and allies of the Judean state. Another scholar, Jodi Magness, wrote that the term Ioudaioi refers to a "people of Judahite/Judean ancestry who worshipped the God of Israel as their national deity and (at least nominally) lived according to his laws."
The etymological equivalent is in use in other languages, e.g., يَهُودِيّ yahūdī (sg.), al-yahūd (pl.), in Arabic, "Jude" in German, "judeu" in Portuguese, "Juif" (m.)/"Juive" (f.) in French, "jøde" in Danish and Norwegian, "judío/a" in Spanish, "jood" in Dutch, "żyd" in Polish etc., but derivations of the word "Hebrew" are also in use to describe a Jew, e.g., in Italian (Ebreo), in Persian ("Ebri/Ebrani" (Persian: عبری/عبرانی)) and Russian (Еврей, Yevrey). The German word "Jude" is pronounced [ˈjuːdə], and the corresponding adjective "jüdisch" [ˈjyːdɪʃ] (Jewish) is the origin of the word "Yiddish". According to The American Heritage Dictionary of the English Language, fourth edition (2000): "It is widely recognized that the attributive use of the noun Jew, in phrases such as Jew lawyer or Jew ethics, is both vulgar and highly offensive. In such contexts Jewish is the only acceptable possibility. Some people, however, have become so wary of this construction that they have extended the stigma to any use of Jew as a noun, a practice that carries risks of its own. In a sentence such as There are now several Jews on the council, which is unobjectionable, the substitution of a circumlocution like Jewish people or persons of Jewish background may in itself cause offense for seeming to imply that Jew has a negative connotation when used as a noun."
Identity Judaism shares some of the characteristics of a nation, an ethnicity, a religion, and a culture, making the definition of who is a Jew vary slightly depending on whether a religious or national approach to identity is used. Generally, in modern secular usage, Jews include three groups: people who were born to a Jewish family regardless of whether or not they follow the religion, those who have some Jewish ancestral background or lineage (sometimes including those who do not have strictly matrilineal descent), and people without any Jewish ancestral background or lineage who have formally converted to Judaism and therefore are followers of the religion. In the context of biblical and classical literature, Jews could refer to inhabitants of the Kingdom of Judah, or the broader Judean region, allies of the Judean state, or anyone that followed Judaism. Historical definitions of Jewish identity have traditionally been based on halakhic definitions of matrilineal descent, and halakhic conversions. These definitions of who is a Jew date back to the codification of the Oral Torah into the Babylonian Talmud, around 200 CE. Interpretations by Jewish sages of sections of the Tanakh – such as Deuteronomy 7:1–5, which forbade intermarriage between their Israelite ancestors and seven non-Israelite nations: "for that [i.e. giving your daughters to their sons or taking their daughters for your sons,] would turn away your children from following me, to serve other gods" – are used as a warning against intermarriage between Jews and gentiles. Leviticus 24:10 says that the son in a marriage between a Hebrew woman and an Egyptian man is "of the community of Israel." This is complemented by Ezra 10:2–3, where Israelites returning from Babylon vow to put aside their gentile wives and their children.
A popular theory is that the rape of Jewish women in captivity brought about the law of Jewish identity being inherited through the maternal line, although scholars challenge this theory citing the Talmudic establishment of the law from the pre-exile period. Another argument is that the rabbis changed the law of patrilineal descent to matrilineal descent due to the widespread rape of Jewish women by Roman soldiers. Since the anti-religious Haskalah movement of the late 18th and 19th centuries, halakhic interpretations of Jewish identity have been challenged. According to historian Shaye J. D. Cohen, the status of the offspring of mixed marriages was determined patrilineally in the Bible. He brings two likely explanations for the change in Mishnaic times: first, the Mishnah may have been applying the same logic to mixed marriages as it had applied to other mixtures (Kil'ayim). Thus, a mixed marriage is forbidden as is the union of a horse and a donkey, and in both unions the offspring are judged matrilineally. Second, the Tannaim may have been influenced by Roman law, which dictated that when a parent could not contract a legal marriage, offspring would follow the mother. Rabbi Rivon Krygier follows a similar reasoning, arguing that Jewish descent had formerly passed through patrilineal descent and that the law of matrilineal descent had its roots in the Roman legal system. Origins The prehistory and ethnogenesis of the Jews are closely intertwined with archaeology, biology, historical textual records, mythology, and religious literature. The ethnic origin of the Jews lies in the Israelites, a confederation of Iron Age Semitic-speaking tribes that inhabited a part of Canaan during the tribal and monarchic periods. Modern Jews are named after and also descended from the southern Israelite Kingdom of Judah. Gary A. Rendsburg links the early Canaanite nomadic pastoralist confederation to the Shasu known to the Egyptians around the 15th century BCE.
According to the Hebrew Bible narrative, Jewish history begins with the Biblical patriarchs such as Abraham, his son Isaac, Isaac's son Jacob, and the Biblical matriarchs Sarah, Rebecca, Leah, and Rachel, who lived in Canaan. The twelve sons of Jacob subsequently fathered the Twelve Tribes. Jacob and his family migrated to Ancient Egypt after being invited to live with Jacob's son Joseph by the Pharaoh himself. Jacob's descendants were later enslaved until the Exodus, led by Moses. Afterwards, the Israelites conquered Canaan under Moses' successor Joshua, and went through the period of the Biblical judges after the death of Joshua. Through the mediation of Samuel, the Israelites were subject to a king, Saul, who was succeeded by David and then Solomon, after whom the United Monarchy ended and was split into a separate Kingdom of Israel and a Kingdom of Judah. The Kingdom of Judah is described as comprising the tribes of Judah, Benjamin and, partially, Levi. They later assimilated remnants of other tribes who migrated there from the northern Kingdom of Israel. In the extra-biblical record, the Israelites become visible as a people between 1200 and 1000 BCE. There is well-accepted archaeological evidence referring to "Israel" in the Merneptah Stele, which dates to about 1200 BCE, and in the Mesha stele from 840 BCE. It is debated whether a period like that of the Biblical judges occurred and if there ever was a United Monarchy. There is further disagreement about the earliest existence of the Kingdoms of Israel and Judah and their extent and power. Historians agree that a Kingdom of Israel existed by c. 900 BCE, and there is a consensus that a Kingdom of Judah existed by c. 700 BCE at least; recent excavations in Khirbet Qeiyafa have provided strong evidence for dating the Kingdom of Judah to the 10th century BCE.
In 587 BCE, Nebuchadnezzar II, King of the Neo-Babylonian Empire, besieged Jerusalem, destroyed the First Temple and deported parts of the Judahite population. Scholars disagree regarding the extent to which the Bible should be accepted as a historical source for early Israelite history. Rendsburg states that there are two approximately equal groups of scholars who debate the historicity of the biblical narrative, the minimalists who largely reject it, and the maximalists who largely accept it, with the minimalists being the more vocal of the two. Some of the leading minimalists reframe the biblical account as constituting the Israelites' inspiring national myth narrative, suggesting that according to the modern archaeological and historical account, the Israelites and their culture did not overtake the region by force, but instead branched out of the Canaanite peoples and culture through the development of a distinct monolatristic—and later monotheistic—religion of Yahwism centered on Yahweh, one of the gods of the Canaanite pantheon. The growth of Yahweh-centric belief, along with a number of cultic practices, gradually gave rise to a distinct Israelite ethnic group, setting them apart from other Canaanites. According to Dever, modern archaeologists have largely discarded the search for evidence of the biblical narrative surrounding the patriarchs and the exodus. According to the maximalist position, the modern archaeological record independently points to a narrative which largely agrees with the biblical account. This narrative provides a testimony of the Israelites as a nomadic people known to the Egyptians as belonging to the Shasu. Over time these nomads left the desert and settled on the central mountain range of the land of Canaan, in simple semi-nomadic settlements in which pig bones are notably absent. This population gradually shifted from a tribal lifestyle to a monarchy. 
The archaeological record of the ninth century BCE provides evidence for two monarchies, one in the south under a dynasty founded by a figure named David with its capital in Jerusalem, and one in the north under a dynasty founded by a figure named Omri with its capital in Samaria. It also points to an early monarchic period in which these regions shared material culture and religion, suggesting a common origin. Archaeological finds also provide evidence for the later cooperation of these two kingdoms in their coalition against Aram, and for their destruction by the Assyrians and later by the Babylonians. Genetic studies on Jews show that most Jews worldwide bear a common genetic heritage which originates in the Middle East, and that they share certain genetic traits with gentile peoples of the Fertile Crescent. The genetic composition of different Jewish groups shows that Jews share a common gene pool dating back four millennia, as a marker of their common ancestral origin. Despite their long-term separation, Jewish communities maintained their unique commonalities, propensities, and sensibilities in culture, tradition, and language. History The earliest recorded evidence of a people by the name of Israel appears in the Merneptah Stele, which dates to around 1200 BCE. The majority of scholars agree that this text refers to the Israelites, a group that inhabited the central highlands of Canaan, where archaeological evidence shows that hundreds of small settlements were constructed between the 12th and 10th centuries BCE. The Israelites differentiated themselves from neighboring peoples through various distinct characteristics including religious practices, a prohibition on intermarriage, and an emphasis on genealogy and family history. In the 10th century BCE, two neighboring Israelite kingdoms—the northern Kingdom of Israel and the southern Kingdom of Judah—emerged.
Since their inception, they shared ethnic, cultural, linguistic and religious characteristics despite a complicated relationship. Israel, with its capital mostly in Samaria, was larger and wealthier, and soon developed into a regional power. In contrast, Judah, with its capital in Jerusalem, was less prosperous and covered a smaller, mostly mountainous territory. However, while in Israel the royal succession was often decided by a military coup d'état, resulting in several dynasty changes, political stability in Judah was much greater, as it was ruled by the House of David for the whole four centuries of its existence. Scholars also describe Biblical Jews as a 'proto-nation', in the modern nationalist sense, comparable to the classical Greeks, the Gauls and the British Celts. Around 720 BCE, the Kingdom of Israel was destroyed when it was conquered by the Neo-Assyrian Empire, which came to dominate the ancient Near East. Under the Assyrian resettlement policy, a significant portion of the northern Israelite population was exiled to Mesopotamia and replaced by immigrants from the same region. During the same period, and throughout the 7th century BCE, the Kingdom of Judah, now under Assyrian vassalage, experienced a period of prosperity and witnessed significant population growth. This prosperity continued until the Neo-Assyrian king Sennacherib devastated the region of Judah in response to a rebellion in the area, ultimately halting at Jerusalem. Later in the same century, the Assyrians were defeated by the rising Neo-Babylonian Empire, and Judah became its vassal. In 587 BCE, following a revolt in Judah, the Babylonian king Nebuchadnezzar II besieged and destroyed Jerusalem and the First Temple, putting an end to the kingdom. The majority of Jerusalem's residents, including the kingdom's elite, were exiled to Babylon. According to the Book of Ezra, the Persian Cyrus the Great ended the Babylonian exile in 538 BCE, the year after he captured Babylon.
The exile ended with the return under Zerubbabel the Prince (so called because he was a descendant of the royal line of David) and Joshua the Priest (a descendant of the line of the former High Priests of the Temple) and their construction of the Second Temple circa 521–516 BCE. As part of the Persian Empire, the former Kingdom of Judah became the province of Judah (Yehud Medinata), with a smaller territory and a reduced population. Judea was under control of the Achaemenids until the fall of their empire in c. 333 BCE to Alexander the Great. After several centuries under foreign imperial rule, the Maccabean Revolt against the Seleucid Empire resulted in an independent Hasmonean kingdom, under which the Jews once again enjoyed political independence for a period spanning from 110 to 63 BCE. Under Hasmonean rule the boundaries of their kingdom were expanded to include not only the land of the historical kingdom of Judah, but also the Galilee and Transjordan. In the beginning of this process the Idumeans, who had infiltrated southern Judea after the destruction of the First Temple, were converted en masse. In 63 BCE, Judea was conquered by the Romans. From 37 BCE to 6 CE, the Romans allowed the Jews to maintain some degree of independence by installing the Herodian dynasty as vassal kings. However, Judea eventually came directly under Roman control and was incorporated into the Roman Empire as the province of Judaea. The Jewish–Roman wars, a series of failed uprisings against Roman rule during the first and second centuries CE, had profound and devastating consequences for the Jewish population of Judaea. The First Jewish–Roman War (66–73/74 CE) culminated in the destruction of Jerusalem and the Second Temple, after which the significantly diminished Jewish population was stripped of political autonomy. 
A few generations later, the Bar Kokhba revolt (132–136 CE) erupted in response to Roman plans to rebuild Jerusalem as a Roman colony, and, possibly, to restrictions on circumcision. Its violent suppression by the Romans led to the near-total depopulation of Judea, and the demographic and cultural center of Jewish life shifted to Galilee. Jews were subsequently banned from residing in Jerusalem and the surrounding area, and the province of Judaea was renamed Syria Palaestina. These developments effectively ended Jewish efforts to restore political sovereignty in the region for nearly two millennia. Similar upheavals impacted the Jewish communities in the empire's eastern provinces during the Diaspora Revolt (115–117 CE), leading to the near-total destruction of Jewish diaspora communities in Libya, Cyprus and Egypt, including the highly influential community in Alexandria. The destruction of the Second Temple in 70 CE brought profound changes to Judaism. With the Temple's central place in Jewish worship gone, religious practices shifted towards prayer, Torah study (including Oral Torah), and communal gatherings in synagogues. Judaism also lost much of its sectarian nature. Two of the three main sects that flourished during the late Second Temple period, namely the Sadducees and Essenes, eventually disappeared, while Pharisaic beliefs became the foundational, liturgical, and ritualistic basis of Rabbinic Judaism, which has been the prevailing form of Judaism since late antiquity. The Jewish diaspora existed well before the destruction of the Second Temple in 70 CE and had been ongoing for centuries, with the dispersal driven by both forced expulsions and voluntary migrations.
In Mesopotamia, a testimony to the beginnings of the Jewish community can be found in Jehoiachin's ration tablets, listing provisions allotted to the exiled Judean king and his family by Nebuchadnezzar II; further evidence is provided by the Al-Yahudu tablets, dated to the 6th–5th centuries BCE and related to the exiles from Judea arriving after the destruction of the First Temple, though there is ample evidence for the presence of Jews in Babylonia even from 626 BCE. In Egypt, the documents from Elephantine reveal the trials of a community founded by a Persian Jewish garrison at two fortresses on the frontier during the 5th–4th centuries BCE, and according to Josephus the Jewish community in Alexandria existed since the founding of the city in the 4th century BCE by Alexander the Great. By 200 BCE, there were well-established Jewish communities both in Egypt and Mesopotamia ("Babylonia" in Jewish sources), and in the two centuries that followed, Jewish populations were also present in Asia Minor, Greece, Macedonia, Cyrene, and, beginning in the middle of the first century BCE, in the city of Rome. Later, in the first centuries CE, as a result of the Jewish–Roman wars, a large number of Jews were taken as captives, sold into slavery, or compelled to flee from the regions affected by the wars, contributing to the formation and expansion of Jewish communities across the Roman Empire as well as in Arabia and Mesopotamia. After the Bar Kokhba revolt, the Jewish population in Judaea—now significantly reduced—made efforts to recover from the revolt's devastating effects, but never fully regained its former strength. Between the second and fourth centuries CE, the region of Galilee emerged as the primary center of Jewish life in Syria Palaestina, experiencing both demographic growth and cultural development. It was during this period that two central rabbinic texts, the Mishnah and the Jerusalem Talmud, were composed.
The Romans recognized the patriarchs—rabbinic sages such as Judah ha-Nasi—as representatives of the Jewish people, granting them a certain degree of autonomy. However, as the Roman Empire gave way to the Christianized Byzantine Empire under Constantine, Jews began to face persecution by both the Church and the imperial authorities, and many emigrated to communities in the diaspora. By the fourth century CE, Jews are believed to have lost their demographic majority in Syria Palaestina. The long-established Jewish community of Mesopotamia, which had been living under Parthian and later Sasanian rule, beyond the confines of the Roman Empire, became an important center of Jewish study as Judea's Jewish population declined. Estimates often place the Babylonian Jewish community of the 3rd to 7th centuries at around one million, making it the largest Jewish diaspora community of that period. Under the political leadership of the exilarch, who was regarded as a royal heir of the House of David, this community had an autonomous status and served as a place of refuge for the Jews of Syria Palaestina. A number of significant Talmudic academies, such as the Nehardea, Pumbedita, and Sura academies, were established in Mesopotamia, and many important Amoraim were active there. The Babylonian Talmud, a centerpiece of Jewish religious law, was compiled in Babylonia in the 3rd to 6th centuries. Jewish diaspora communities are generally described as having coalesced into three major ethnic subdivisions according to where their ancestors settled: the Ashkenazim (initially in the Rhineland and France), the Sephardim (initially in the Iberian Peninsula), and the Mizrahim (Middle East and North Africa). Romaniote Jews, Tunisian Jews, Yemenite Jews, Egyptian Jews, Ethiopian Jews, Bukharan Jews, Mountain Jews, and other groups also predated the arrival of the Sephardic diaspora.
During the same period, Jewish communities in the Middle East thrived under Islamic rule, especially in cities like Baghdad, Cairo, and Damascus. In Babylonia, from the 7th to 11th centuries the Pumbedita and Sura academies led the Arab and, to an extent, the entire Jewish world. The deans and students of these academies defined the Geonic period in Jewish history. Following this period were the Rishonim, who lived from the 11th to 15th centuries. Like their European counterparts, Jews in the Middle East and North Africa also faced periods of persecution and discriminatory policies, with the Almohad Caliphate in North Africa and Iberia issuing forced conversion decrees, causing Jews such as Maimonides to seek safety in other regions. Despite experiencing repeated waves of persecution, Ashkenazi Jews in Western Europe worked in a variety of fields, making an impact on their communities' economies and societies. In Francia, for example, figures like Isaac Judaeus and Armentarius occupied prominent social and economic positions. Francia also witnessed the development of a sophisticated tradition of biblical commentary, as exemplified by Rashi and the tosafists. In 1144, the first documented blood libel occurred in Norwich, England, marking an escalation in the pattern of discrimination and violence that Jews had already been subjected to throughout medieval Europe. During the 12th and 13th centuries, Jews faced frequent antisemitic legislation, including laws prescribing distinctive dress, alongside segregation, repeated blood libels, pogroms, and massacres such as the Rhineland massacres (1096). The Jews of the Holy Roman Empire were designated Servi camerae regis ("servants of the imperial chamber") by Frederick II, a status that afforded limited protection while simultaneously entangling them in the political struggles between the emperor and the German principalities and cities.
Persecution intensified during the Black Death in the mid-14th century, when Jews were accused of poisoning wells and many communities were destroyed. These pressures, combined with major expulsions such as that from England in 1290, gradually pushed Ashkenazi Jewish populations eastward into Poland, Lithuania, and Russia. One of the largest Jewish communities of the Middle Ages was in the Iberian Peninsula, which for a time contained the largest Jewish population in Europe. Iberian Jewry endured discrimination under the Visigoths but saw its fortunes improve under Umayyad rule and later the Taifa kingdoms. During this period, the Jews of Muslim Spain entered a "Golden Age" marked by achievements in Hebrew poetry and literature, religious scholarship, grammar, medicine and science, with leading figures including Hasdai ibn Shaprut, Judah Halevi, Moses ibn Ezra and Solomon ibn Gabirol. Jews also rose to high office, most notably Samuel ibn Naghrillah, a scholar and poet who served as grand vizier and military commander of Granada. The Golden Age ended with the rise of the radical Almoravid and Almohad dynasties, whose persecutions drove many Jews from Iberia (including Maimonides), together with the advancing Reconquista. In 1391, widespread pogroms swept across Spain, leaving thousands dead and forcing mass conversions. The Spanish Inquisition was later established to pursue, torture and execute conversos who continued to practice Judaism in secret, while public disputations were staged to discredit Judaism. In 1492, after the Reconquista, Isabella I of Castile and Ferdinand II of Aragon decreed the expulsion of all Jews who refused conversion, sending an estimated 200,000 into exile in Portugal, Italy, North Africa, and the Ottoman Empire. In 1497, Portugal's Jews, about 30,000, were formally ordered expelled but instead were forcibly converted to retain their economic role. In 1498, some 3,500 Jews were expelled from Navarre. 
Many converts outwardly adopted Christianity while secretly preserving Jewish practices, becoming crypto-Jews (also known as marranos or anusim), who remained targets of the various Inquisitions for centuries. Following the expulsions from Spain and Portugal in the 1490s, Jewish exiles dispersed across the Mediterranean, Europe, and North Africa. Many settled in the Ottoman Empire—which, replacing the Iberian Peninsula, became home to the world's largest Jewish population—where new communities developed in Anatolia, the Balkans, and the Land of Israel. Cities such as Istanbul and Thessaloniki grew into major Jewish centers, while in 16th-century Safed a flourishing spiritual life took shape. There, Solomon Alkabetz, Moses Cordovero, and Isaac Luria developed influential new schools of Kabbalah, giving powerful impetus to Jewish mysticism, and Joseph Karo composed the Shulchan Aruch, which became a cornerstone of Jewish law. In the 17th century, Portuguese conversos who returned to Judaism and engaged in trade and banking helped establish Amsterdam as a prosperous Jewish center, while also forming communities in cities such as Antwerp and London. This period also witnessed waves of messianic fervor, most notably the rise of the Sabbatean movement in the 1660s, led by Sabbatai Zvi of İzmir, which reverberated throughout the Jewish world. In Eastern Europe, Poland–Lithuania became the principal center of Ashkenazi Jewry, eventually becoming home to the largest Jewish population in the world. Jewish life flourished there in the early modern era, supported by relative stability, economic opportunity, and strong communal institutions. The mid-17th century brought devastation with the Cossack uprisings in Ukraine, which reversed migration flows and sent refugees westward, yet Poland–Lithuania remained the demographic and cultural heartland of Ashkenazic Jewry.
Following the partitions of Poland, most of its Jews came under Russian rule and were confined to the "Pale of Settlement." The 18th century also witnessed new religious and intellectual currents. Hasidism, founded by Baal Shem Tov, emphasized mysticism and piety, while its opponents, the Misnagdim ("opponents") led by the Vilna Gaon, defended rabbinic scholarship and tradition. In Western Europe, during the 1760s and 1770s, the Haskalah (Jewish Enlightenment) emerged in German-speaking lands, where figures such as Moses Mendelssohn promoted secular learning, vernacular literacy, and integration into European society. Elsewhere, Jews began to be re-admitted to Western Europe, including England, where Menasseh ben Israel petitioned Oliver Cromwell for their return. In the Americas, Jews of Sephardic descent first arrived as conversos in Spanish and Portuguese colonies, where many faced trial by Inquisition tribunals for "judaizing." A more durable presence began in Dutch Brazil, where Jews openly practiced their religion and established the first synagogues in the New World, before the Portuguese reconquest forced their dispersal to Amsterdam, the Caribbean, and North America. Sephardic communities took root in Curaçao, Suriname, Jamaica, and Barbados, later joined by Ashkenazi migrants. In North America, Jews were present from the mid-17th century, with New Amsterdam hosting the first organized congregation in 1654. By the time of the American Revolution, small communities in New York, Newport, Philadelphia, Savannah, and Charleston played an active role in the struggle for independence. In the late 19th century, Jews in Western Europe gradually achieved legal emancipation, though social acceptance remained limited by persistent antisemitism and rising nationalism. In Eastern Europe, particularly within the Russian Empire's Pale of Settlement, Jews faced mounting legal restrictions and recurring pogroms. 
From this environment emerged Zionism, a national revival movement originating in Central and Eastern Europe that sought to re-establish a Jewish polity in the Land of Israel as a means of returning the Jewish people to their ancestral homeland and ending centuries of exile and persecution. This led to waves of Jewish migration to Ottoman-controlled Palestine. Theodor Herzl, who is considered the father of political Zionism, offered his vision of a future Jewish state in his 1896 book Der Judenstaat (The Jewish State); a year later, he presided over the First Zionist Congress. The antisemitism that afflicted Jewish communities in Europe also triggered a mass exodus of 2.8 million Jews to the United States between 1881 and 1924. Despite this, some Jews of Europe and the United States were able to make great achievements in various fields of science and culture. Among the most influential from this period are Albert Einstein in physics, Sigmund Freud in psychology, Franz Kafka in literature, and Irving Berlin in music. Many Nobel Prize winners at this time were Jewish, as is still the case. When Adolf Hitler and the Nazi Party came to power in Germany in 1933, the situation for Jews deteriorated rapidly as a direct result of Nazi policies. Many Jews fled from Europe to Mandatory Palestine, the United States, and the Soviet Union as a result of racial anti-Semitic laws, economic difficulties, and the fear of an impending war. World War II started in 1939, and by 1941 Nazi Germany occupied almost all of Europe. Following the German invasion of the Soviet Union in 1941, the Final Solution—an extensive, organized effort of unprecedented scope intended to annihilate the Jewish people—began, and resulted in the persecution and murder of Jews in Europe and North Africa. In Poland, three million Jews were murdered in gas chambers across all the camps combined, one million of them at the Auschwitz camp complex alone.
The Holocaust is the name given to this genocide, in which six million Jews in total were systematically murdered. Before and during the Holocaust, enormous numbers of Jews immigrated to Mandatory Palestine. In 1944, the Jewish insurgency in Mandatory Palestine began with the aim of gaining full independence from the United Kingdom. On 14 May 1948, upon the termination of the mandate, David Ben-Gurion declared the creation of the State of Israel, a Jewish and democratic state. Immediately afterwards, all neighboring Arab states invaded, and were resisted by the newly formed Israel Defense Forces. In 1949, the war ended and Israel started building its state and absorbing waves of Aliyah, granting citizenship to Jews all over the world via the Law of Return passed in 1950. However, both the Israeli–Palestinian conflict and the wider Arab–Israeli conflict continue to this day.
Culture
The Jewish people and the religion of Judaism are strongly interrelated. Converts to Judaism have a status within the Jewish people equal to those born into it. However, converts who do not go on to practice Judaism are likely to be viewed with skepticism. Mainstream Judaism does not proselytize, and conversion is considered a difficult task. A significant portion of conversions are undertaken by children of mixed marriages, or by would-be or current spouses of Jews. The Hebrew Bible, a religious interpretation of the traditions and early history of the Jews, established the first of the Abrahamic religions, which are now practiced by 54 percent of the world's population. Judaism guides its adherents in both practice and belief, and has been called not only a religion, but also a "way of life," which has made drawing a clear distinction between Judaism, Jewish culture, and Jewish identity rather difficult. 
Throughout history, in eras and places as diverse as the ancient Hellenic world, in Europe before and after The Age of Enlightenment (see Haskalah), in Islamic Spain and Portugal, in North Africa and the Middle East, India, China, or the contemporary United States and Israel, cultural phenomena have developed that are in some sense characteristically Jewish without being at all specifically religious. Some factors in this come from within Judaism, others from the interaction of Jews or specific communities of Jews with their surroundings, and still others from the inner social and cultural dynamics of the community, as opposed to from the religion itself. This phenomenon has led to considerably different Jewish cultures unique to their own communities. Hebrew is the liturgical language of Judaism (termed lashon ha-kodesh, "the holy tongue"), the language in which most of the Hebrew scriptures (Tanakh) were composed, and the daily speech of the Jewish people for centuries. By the 5th century BCE, Aramaic, a closely related tongue, joined Hebrew as the spoken language in Judea. By the 3rd century BCE, some Jews of the diaspora were speaking Greek. Others, such as in the Jewish communities of Asoristan, known to Jews as Babylonia, were speaking Hebrew and Aramaic, the languages of the Babylonian Talmud. Dialects of these same languages were also used by the Jews of Syria Palaestina at that time.[citation needed] For centuries, Jews worldwide have spoken the local or dominant languages of the regions they migrated to, often developing distinctive dialectal forms or branches that became independent languages. Yiddish is the Judaeo-German language developed by Ashkenazi Jews who migrated to Central Europe. Ladino is the Judaeo-Spanish language developed by Sephardic Jews who migrated to the Iberian Peninsula. 
Due to many factors, including the impact of the Holocaust on European Jewry, the Jewish exodus from Arab and Muslim countries, and widespread emigration from other Jewish communities around the world, ancient and distinct Jewish languages of several communities, including Judaeo-Georgian, Judaeo-Arabic, Judaeo-Berber, Krymchak, Judaeo-Malayalam and many others, have largely fallen out of use. For over sixteen centuries Hebrew was used almost exclusively as a liturgical language, and as the language in which most books had been written on Judaism, with a few speaking only Hebrew on the Sabbath. Hebrew was revived as a spoken language by Eliezer ben Yehuda, who arrived in Palestine in 1881. It had not been used as a mother tongue since Tannaic times. Modern Hebrew is designated as the "State language" of Israel. Despite efforts to revive Hebrew as the national language of the Jewish people, knowledge of the language is not commonly possessed by Jews worldwide and English has emerged as the lingua franca of the Jewish diaspora. Although many Jews once had sufficient knowledge of Hebrew to study the classic literature, and Jewish languages like Yiddish and Ladino were commonly used as recently as the early 20th century, most Jews lack such knowledge today and English has by and large superseded most Jewish vernaculars. The three most commonly spoken languages among Jews today are Hebrew, English, and Russian. Some Romance languages, particularly French and Spanish, are also widely used. Yiddish has been spoken by more Jews in history than any other language, but it is far less used today following the Holocaust and the adoption of Modern Hebrew by the Zionist movement and the State of Israel. In some places, the mother language of the Jewish community differs from that of the general population or the dominant group. For example, in Quebec, the Ashkenazic majority has adopted English, while the Sephardic minority uses French as its primary language. 
Similarly, South African Jews adopted English rather than Afrikaans. Due to both Czarist and Soviet policies, Russian has superseded Yiddish as the language of Russian Jews, but these policies have also affected neighboring communities. Today, Russian is the first language for many Jewish communities in a number of post-Soviet states, such as Ukraine and Uzbekistan,[better source needed] as well as for Ashkenazic Jews in Azerbaijan, Georgia, and Tajikistan. Although communities in North Africa today are small and dwindling, Jews there had shifted from a multilingual group to a monolingual one (or nearly so), speaking French in Algeria, Morocco, and the city of Tunis, while most North Africans continue to use Arabic or Berber as their mother tongue.[citation needed] There is no single governing body for the Jewish community, nor a single authority with responsibility for religious doctrine. Instead, a variety of secular and religious institutions at the local, national, and international levels lead various parts of the Jewish community on a variety of issues. Today, many countries have a Chief Rabbi who serves as a representative of that country's Jewry. Although many Hasidic Jews follow a certain hereditary Hasidic dynasty, there is no one commonly accepted leader of all Hasidic Jews. Many Jews believe that the Messiah will act as a unifying leader for Jews and the entire world. A number of modern scholars of nationalism support the existence of Jewish national identity in antiquity. One of them is David Goodblatt, who generally believes in the existence of nationalism before the modern period. In his view, the Bible, the parabiblical literature and Jewish national history provide the basis for a Jewish collective identity. Although many of the ancient Jews were illiterate (as were their neighbors), their national narrative was reinforced through public readings. The Hebrew language also constructed and preserved national identity. 
Although it was not widely spoken after the 5th century BCE, Goodblatt states: "the mere presence of the language in spoken or written form could invoke the concept of a Jewish national identity. Even if one knew no Hebrew or was illiterate, one could recognize that a group of signs was in Hebrew script. ... It was the language of the Israelite ancestors, the national literature, and the national religion. As such it was inseparable from the national identity. Indeed its mere presence in visual or aural medium could invoke that identity." Anthony D. Smith, a historical sociologist considered one of the founders of the field of nationalism studies, wrote that the Jews of the late Second Temple period provide "a closer approximation to the ideal type of the nation [...] than perhaps anywhere else in the ancient world." He adds that this observation "must make us wary of pronouncing too readily against the possibility of the nation, and even a form of religious nationalism, before the onset of modernity." Agreeing with Smith, Goodblatt suggests omitting the qualifier "religious" from Smith's definition of ancient Jewish nationalism, noting that, according to Smith, a religious component in national memories and culture is common even in the modern era. This view is echoed by political scientist Tom Garvin, who writes that "something strangely like modern nationalism is documented for many peoples in medieval times and in classical times as well," citing the ancient Jews as one of several "obvious examples", alongside the classical Greeks and the Gaulish and British Celts. Fergus Millar suggests that the sources of Jewish national identity and early Jewish nationalist movements in the first and second centuries CE included several key elements: the Bible as both a national history and legal source, the Hebrew language as a national language, a system of law, and social institutions such as schools, synagogues, and Sabbath worship. 
Adrian Hastings argued that the Jews are the "true proto-nation", which, through the model of ancient Israel found in the Hebrew Bible, provided the world with the original concept of nationhood that later influenced Christian nations. However, following Jerusalem's destruction in the first century CE, Jews ceased to be a political entity and did not resemble a traditional nation-state for almost two millennia. Despite this, they maintained their national identity through collective memory, religion and sacred texts, even without land or political power, and remained a nation rather than just an ethnic group, eventually leading to the rise of Zionism and the establishment of Israel. Steven Weitzman suggests that Jewish nationalist sentiment in antiquity was encouraged because under foreign rule (Persians, Greeks, Romans) Jews were able to claim that they were an ancient nation. This claim was based on the preservation and reverence of their scriptures, the Hebrew language, the Temple and priesthood, and other traditions of their ancestors. Doron Mendels further observes that the Hasmonean kingdom, one of the few examples of indigenous statehood at its time, significantly reinforced Jewish national consciousness. The memory of this period of independence contributed to the persistent efforts to revive Jewish sovereignty in Judea, leading to the major revolts against Roman rule in the 1st and 2nd centuries CE.
Demographics
Within the world's Jewish population there are distinct ethnic divisions, most of which are primarily the result of geographic branching from an originating Israelite population and subsequent independent evolution. An array of Jewish communities was established by Jewish settlers in various places around the Old World, often at great distances from one another, resulting in effective and often long-term isolation. 
During the millennia of the Jewish diaspora the communities would develop under the influence of their local environments: political, cultural, natural, and populational. Today, manifestations of these differences among the Jews can be observed in the Jewish cultural expressions of each community, including Jewish linguistic diversity, culinary preferences, liturgical practices, religious interpretations, as well as degrees and sources of genetic admixture. Jews are often identified as belonging to one of two major groups: the Ashkenazim and the Sephardim. Ashkenazim are so named in reference to their geographical origins (their ancestors' culture coalesced in the Rhineland, an area historically referred to by Jews as Ashkenaz). Similarly, Sephardim (Sefarad meaning "Spain" in Hebrew) are named in reference to their origins in Iberia. The diverse groups of Jews of the Middle East and North Africa are often collectively referred to as Sephardim together with Sephardim proper, for liturgical reasons having to do with their prayer rites. A common term for many of these non-Spanish Jews, who are sometimes still broadly grouped as Sephardim, is Mizrahim (lit. 'easterners' in Hebrew). Nevertheless, Mizrahim and Sephardim are usually ethnically distinct. Smaller groups include, but are not restricted to, Indian Jews such as the Bene Israel, Bnei Menashe, Cochin Jews, and Bene Ephraim; the Romaniotes of Greece; the Italian Jews ("Italkim" or "Bené Roma"); the Teimanim from Yemen; various African Jews, including most numerously the Beta Israel of Ethiopia; and Chinese Jews, most notably the Kaifeng Jews, as well as various other distinct but now almost extinct communities. The divisions between all these groups are approximate and their boundaries are not always clear. 
The Mizrahim, for example, are a heterogeneous collection of North African, Central Asian, Caucasian, and Middle Eastern Jewish communities that are no more closely related to each other than they are to any of the earlier mentioned Jewish groups. In modern usage, however, the Mizrahim are sometimes termed Sephardi due to similar styles of liturgy, despite their independent development from Sephardim proper. Thus, among Mizrahim there are Egyptian Jews, Iraqi Jews, Lebanese Jews, Kurdish Jews, Moroccan Jews, Libyan Jews, Syrian Jews, Bukharian Jews, Mountain Jews, Georgian Jews, Iranian Jews, Afghan Jews, and various others. The Teimanim from Yemen are sometimes included, although their style of liturgy is unique and the admixture found among them differs from that found in Mizrahim. In addition, a differentiation is made between the Sephardi migrants who established themselves in the Middle East and North Africa after the expulsion of the Jews from Spain and Portugal in the 1490s and the pre-existing Jewish communities in those regions. Ashkenazi Jews represent the bulk of modern Jewry, with at least 70 percent of Jews worldwide (and up to 90 percent prior to World War II and the Holocaust). As a result of their emigration from Europe, Ashkenazim also represent the overwhelming majority of Jews in the New World continents, in countries such as the United States, Canada, Argentina, Australia, and Brazil. In France, the immigration of Jews from Algeria (Sephardim) has led them to outnumber the Ashkenazim. Only in Israel is the Jewish population representative of all groups, a melting pot independent of each group's proportion within the overall world Jewish population. Y-DNA studies tend to imply a small number of founders in an old population whose members parted and followed different migration paths. In most Jewish populations, these male-line ancestors appear to have been mainly Middle Eastern. 
For example, Ashkenazi Jews share more common paternal lineages with other Jewish and Middle Eastern groups than with non-Jewish populations in areas where Jews lived in Eastern Europe, Germany, and the French Rhine Valley. This is consistent with Jewish traditions in placing most Jewish paternal origins in the region of the Middle East. Conversely, the maternal lineages of Jewish populations, studied by looking at mitochondrial DNA, are generally more heterogeneous. Scholars such as Harry Ostrer and Raphael Falk believe this indicates that many Jewish males found new mates from European and other communities in the places where they migrated in the diaspora after fleeing ancient Israel. In contrast, Behar has found evidence that about 40 percent of Ashkenazi Jews originate maternally from just four female founders, who were of Middle Eastern origin. The populations of Sephardi and Mizrahi Jewish communities "showed no evidence for a narrow founder effect." Subsequent studies carried out by Feder et al. confirmed the large portion of non-local maternal origin among Ashkenazi Jews. Reflecting on their findings related to the maternal origin of Ashkenazi Jews, the authors conclude "Clearly, the differences between Jews and non-Jews are far larger than those observed among the Jewish communities. Hence, differences between the Jewish communities can be overlooked when non-Jews are included in the comparisons." However, a 2025 genetic study on the Ashkenazi Jewish founder population supports the presence of a substantial Near Eastern component in the maternal lineages. Analyses of mitochondrial DNA (mtDNA) indicate that the core founder lineages, estimated at around 54, likely originated from the Near East, with these founder signatures appearing in multiple copies across the population. While later admixture introduced additional mtDNA lineages, these absorbed lineages are distinguishable from the original founders. 
The findings are consistent with genome-wide Identity-by-Descent and Lineage Extinction analyses, reinforcing the Near Eastern origin of the Ashkenazi maternal founders. A study showed that 7% of Ashkenazi Jews have the haplogroup G2c, which is found mainly in Pashtuns and, at lower frequencies, in all major Jewish groups, as well as in Palestinians, Syrians, and Lebanese. Studies of autosomal DNA, which look at the entire DNA mixture, have become increasingly important as the technology develops. They show that Jewish populations have tended to form relatively closely related groups in independent communities, with most in a community sharing significant ancestry in common. For Jewish populations of the diaspora, the genetic composition of Ashkenazi, Sephardic, and Mizrahi Jewish populations shows a predominant amount of shared Middle Eastern ancestry. According to Behar, the most parsimonious explanation for this shared Middle Eastern ancestry is that it is "consistent with the historical formulation of the Jewish people as descending from ancient Hebrew and Israelite residents of the Levant" and "the dispersion of the people of ancient Israel throughout the Old World". North African, Italian and other Jews of Iberian origin show variable frequencies of admixture with non-Jewish historical host populations among the maternal lines. In the case of Ashkenazi and Sephardi Jews (in particular Moroccan Jews), who are closely related, the source of non-Jewish admixture is mainly Southern European, while Mizrahi Jews show evidence of admixture with other Middle Eastern populations. Behar et al. have remarked on a close relationship between Ashkenazi Jews and modern Italians. A 2001 study found that Jews were more closely related to groups of the Fertile Crescent (Kurds, Turks, and Armenians) than to their Arab neighbors, whose genetic signature was found in geographic patterns reflective of Islamic conquests. 
The studies also show that Sephardic Bnei Anusim (descendants of the "anusim" who were forced to convert to Catholicism), who comprise up to 19.8 percent of the population of today's Iberia (Spain and Portugal) and at least 10 percent of the population of Ibero-America (Hispanic America and Brazil), have Sephardic Jewish ancestry within the last few centuries. The Bene Israel and Cochin Jews of India, the Beta Israel of Ethiopia, and a portion of the Lemba people of Southern Africa, despite more closely resembling the local populations of their native countries, have also been thought to have some more remote ancient Jewish ancestry. Views on the Lemba have changed, and genetic Y-DNA analyses in the 2000s have established a partially Middle Eastern origin for a portion of the male Lemba population but have been unable to narrow this down further. Although historically Jews have been found all over the world, in the decades since World War II and the establishment of Israel they have increasingly concentrated in a small number of countries. In 2021, Israel and the United States together accounted for over 85 percent of the global Jewish population, with approximately 45.3% and 39.6% of the world's Jews, respectively. More than half (51.2%) of world Jewry resides in just ten metropolitan areas. As of 2021, these ten areas were Tel Aviv, New York, Jerusalem, Haifa, Los Angeles, Miami, Philadelphia, Paris, Washington, and Chicago. The Tel Aviv metro area has the highest percentage of Jews among the total population (94.8%), followed by Haifa (73.1%), Jerusalem (72.3%), and Beersheba (60.4%), the balance being mostly Israeli Arabs. Outside Israel, the highest percentage of Jews in a metropolitan area was in New York (10.8%), followed by Miami (8.7%), Philadelphia (6.8%), San Francisco (5.1%), Washington (4.7%), Los Angeles (4.7%), Toronto (4.5%), and Baltimore (4.1%). 
As of 2010, there were nearly 14 million Jews around the world, roughly 0.2% of the world's population at the time. According to the 2007 estimates of The Jewish People Policy Planning Institute, the world's Jewish population was 13.2 million. This statistic incorporates both practicing Jews affiliated with synagogues and the Jewish community, and approximately 4.5 million unaffiliated and secular Jews.[citation needed] According to Sergio Della Pergola, a demographer of the Jewish population, in 2021 there were about 6.8 million Jews in Israel, 6 million in the United States, and 2.3 million in the rest of the world. Israel, the Jewish nation-state, is the only country in which Jews make up a majority of the citizens. Israel was established as an independent democratic and Jewish state on 14 May 1948. Of the 120 members of its parliament, the Knesset, 14 (as of 2016) were Arab citizens of Israel (not including the Druze), most representing Arab political parties. One of Israel's Supreme Court judges is also an Arab citizen of Israel. Between 1948 and 1958, the Jewish population rose from 800,000 to two million. Currently, Jews account for 75.4 percent of the Israeli population, or 6 million people. The early years of the State of Israel were marked by the mass immigration of Holocaust survivors and of Jews fleeing Arab lands. Israel also has a large population of Ethiopian Jews, many of whom were airlifted to Israel in the late 1980s and early 1990s. Between 1974 and 1979, some 227,258 immigrants arrived in Israel, about half of them from the Soviet Union. This period also saw an increase in immigration to Israel from Western Europe, Latin America, and North America. 
A trickle of immigrants from other communities has also arrived, including Indian Jews and others, as well as some descendants of Ashkenazi Holocaust survivors who had settled in countries such as the United States, Argentina, Australia, Chile, and South Africa. Some Jews have emigrated from Israel elsewhere because of economic problems or disillusionment with political conditions and the continuing Arab–Israeli conflict. Jewish Israeli emigrants are known as yordim. The waves of immigration to the United States and elsewhere at the turn of the 20th century, the founding of Zionism, and later events, including the pogroms in Imperial Russia (mostly within the Pale of Settlement in present-day Ukraine, Moldova, Belarus and eastern Poland), the massacre of European Jewry during the Holocaust, and the founding of the state of Israel, with the subsequent Jewish exodus from Arab lands, all resulted in substantial shifts in the population centers of world Jewry by the end of the 20th century. More than half of the Jews live in the Diaspora (see Population table). Currently, the largest Jewish community outside Israel, and either the largest or second-largest Jewish community in the world, is located in the United States, with 6 million to 7.5 million Jews by various estimates. Elsewhere in the Americas, there are also large Jewish populations in Canada (315,000), Argentina (180,000–300,000), and Brazil (196,000–600,000), and smaller populations in Mexico, Uruguay, Venezuela, Chile, Colombia and several other countries (see History of the Jews in Latin America). According to a 2010 Pew Research Center study, about 470,000 people of Jewish heritage live in Latin America and the Caribbean. 
Demographers disagree on whether the United States has a larger Jewish population than Israel, with many maintaining that Israel surpassed the United States in Jewish population during the 2000s, while others maintain that the United States still has the largest Jewish population in the world. Currently, a major national Jewish population survey is planned to ascertain whether or not Israel has overtaken the United States in Jewish population. Western Europe's largest Jewish community, and the third-largest Jewish community in the world, can be found in France, home to between 483,000 and 500,000 Jews, the majority of whom are immigrants or refugees from North African countries such as Algeria, Morocco, and Tunisia (or their descendants). The United Kingdom has a Jewish community of 292,000. In Eastern Europe, the exact figures are difficult to establish. The number of Jews in Russia varies widely according to whether a source uses census data (which requires a person to choose a single nationality among choices that include "Russian" and "Jewish") or eligibility for immigration to Israel (which requires that a person have one or more Jewish grandparents). According to the latter criteria, the heads of the Russian Jewish community assert that up to 1.5 million Russians are eligible for aliyah. In Germany, the 102,000 Jews registered with the Jewish community are a slowly declining population, despite the immigration of tens of thousands of Jews from the former Soviet Union since the fall of the Berlin Wall. Thousands of Israelis also live in Germany, either permanently or temporarily, for economic reasons. Prior to 1948, approximately 800,000 Jews were living in lands which now make up the Arab world (excluding Israel). Of these, just under two-thirds lived in the French-controlled Maghreb region, 15 to 20 percent in the Kingdom of Iraq, approximately 10 percent in the Kingdom of Egypt and approximately 7 percent in the Kingdom of Yemen. 
A further 200,000 lived in Pahlavi Iran and the Republic of Turkey. Today, around 26,000 Jews live in Muslim-majority countries, mainly in Turkey (14,200) and Iran (9,100), while Morocco (2,000), Tunisia (1,000), and the United Arab Emirates (500) host the largest communities in the Arab world. A small-scale exodus had begun in many countries in the early decades of the 20th century, although the only substantial aliyah came from Yemen and Syria. The exodus from Arab and Muslim countries took place primarily from 1948. The first large-scale exoduses took place in the late 1940s and early 1950s, primarily in Iraq, Yemen and Libya, with up to 90 percent of these communities leaving within a few years. The peak of the exodus from Egypt occurred in 1956. The exodus in the Maghreb countries peaked in the 1960s. Lebanon was the only Arab country to see a temporary increase in its Jewish population during this period, due to an influx of refugees from other Arab countries, although by the mid-1970s the Jewish community of Lebanon had also dwindled. In the aftermath of the exodus wave from Arab states, an additional migration of Iranian Jews peaked in the 1980s when around 80 percent of Iranian Jews left the country.[citation needed] Outside Europe, the Americas, the Middle East, and the rest of Asia, there are significant Jewish populations in Australia (112,500) and South Africa (70,000). There is also a 6,800-strong community in New Zealand. Since at least the time of the Ancient Greeks, a proportion of Jews have assimilated into the wider non-Jewish society around them, by either choice or force, ceasing to practice Judaism and losing their Jewish identity. Assimilation took place in all areas, and during all time periods, with some Jewish communities, for example the Kaifeng Jews of China, disappearing entirely. 
The advent of the Jewish Enlightenment of the 18th century (see Haskalah) and the subsequent emancipation of the Jewish populations of Europe and America in the 19th century accelerated this process, encouraging Jews to increasingly participate in, and become part of, secular society. The result has been a growing trend of assimilation, as Jews marry non-Jewish spouses and stop participating in the Jewish community. Rates of interreligious marriage vary widely: in the United States, the rate is just under 50 percent; in the United Kingdom, around 53 percent; in France, around 30 percent; and in Australia and Mexico, as low as 10 percent. In the United States, only about a third of children from intermarriages affiliate with Jewish religious practice. The result is that most countries in the Diaspora have steady or slightly declining religiously Jewish populations as Jews continue to assimilate into the countries in which they live.[citation needed] The Jewish people and Judaism have experienced various persecutions throughout their history. During Late Antiquity and the Early Middle Ages, the Roman Empire (in its later phases known as the Byzantine Empire) repeatedly repressed the Jewish population, first by ejecting them from their homelands during the pagan Roman era and later by officially establishing them as second-class citizens during the Christian Roman era. According to James Carroll, "Jews accounted for 10% of the total population of the Roman Empire. By that ratio, if other factors had not intervened, there would be 200 million Jews in the world today, instead of something like 13 million." Later, in medieval Western Europe, further persecutions of Jews by Christians occurred, notably during the Crusades—when Jews all over Germany were massacred—and in a series of expulsions from the Kingdom of England, Germany, and France. 
Then there occurred the largest expulsion of all, when Spain and Portugal, after the Reconquista (the Catholic reconquest of the Iberian Peninsula), expelled both unbaptized Sephardic Jews and the ruling Muslim Moors. In the Papal States, which existed until 1870, Jews were required to live only in specified neighborhoods called ghettos. Islam and Judaism have a complex relationship. Traditionally, Jews and Christians living in Muslim lands, known as dhimmis, were allowed to practice their religions and administer their internal affairs, but they were subject to certain conditions. They had to pay the jizya (a per capita tax imposed on free adult non-Muslim males) to the Islamic state. Dhimmis had an inferior status under Islamic rule. They had several social and legal disabilities, such as prohibitions against bearing arms or giving testimony in courts in cases involving Muslims. Many of the disabilities were highly symbolic. The one described by Bernard Lewis as "most degrading" was the requirement of distinctive clothing, not found in the Quran or hadith but invented in early medieval Baghdad; its enforcement was highly erratic. On the other hand, Jews rarely faced martyrdom, exile, or forced conversion, and they were mostly free in their choice of residence and profession. Notable exceptions include the massacre of Jews and forcible conversion of some Jews by the rulers of the Almohad dynasty in Al-Andalus in the 12th century, as well as in Islamic Persia, and the forced confinement of Moroccan Jews to walled quarters known as mellahs beginning in the 15th century and especially in the early 19th century. 
In modern times, it has become commonplace for standard antisemitic themes to be conflated with anti-Zionism in the publications and pronouncements of Islamic movements such as Hezbollah and Hamas, in the pronouncements of various agencies of the Islamic Republic of Iran, and even in the newspapers and other publications of the Turkish Refah Partisi.[better source needed] Throughout history, many rulers, empires and nations have oppressed their Jewish populations or sought to eliminate them entirely. Methods employed ranged from expulsion to outright genocide; within nations, often the threat of these extreme methods was sufficient to silence dissent. The history of antisemitism includes the First Crusade, which resulted in the massacre of Jews; the Spanish Inquisition (led by Tomás de Torquemada) and the Portuguese Inquisition, with their persecution and autos-da-fé against the New Christians and Marrano Jews; the Bohdan Chmielnicki Cossack massacres in Ukraine; the pogroms backed by the Russian tsars; as well as expulsions from Spain, Portugal, England, France, Germany, and other countries in which the Jews had settled. According to a 2008 study published in the American Journal of Human Genetics, 19.8 percent of the modern Iberian population has Sephardic Jewish ancestry, indicating that the number of conversos may have been much higher than originally thought. The persecution reached a peak in Nazi Germany's Final Solution, which led to the Holocaust and the slaughter of approximately 6 million Jews. Of the world's 16 million Jews in 1939, almost 40% were murdered in the Holocaust. The Holocaust—the state-led systematic persecution and genocide of European Jews (and certain communities of North African Jews in European-controlled North Africa) and other minority groups of Europe during World War II by Germany and its collaborators—remains the most notable modern-day persecution of Jews. The persecution and genocide were accomplished in stages. 
Legislation to remove the Jews from civil society was enacted years before the outbreak of World War II. Concentration camps were established in which inmates were used as slave labour until they died of exhaustion or disease. Where the Third Reich conquered new territory in Eastern Europe, specialized units called Einsatzgruppen murdered Jews and political opponents in mass shootings. Jews and Roma were crammed into ghettos before being transported hundreds of kilometres by freight train to extermination camps where, if they survived the journey, the majority of them were murdered in gas chambers. Virtually every arm of Germany's bureaucracy was involved in the logistics of the mass murder, turning the country into what one Holocaust scholar has called "a genocidal nation." Throughout Jewish history, Jews have repeatedly been directly or indirectly expelled from both their original homeland, the Land of Israel, and many of the areas in which they have settled. This experience as refugees has shaped Jewish identity and religious practice in many ways, and is thus a major element of Jewish history. In summary, the pogroms in Eastern Europe, the rise of modern antisemitism, the Holocaust, as well as the rise of Arab nationalism, all served to fuel the movements and migrations of huge segments of Jewry from land to land and continent to continent until they arrived back in large numbers at their original historical homeland in Israel. In the Bible, the patriarch Abraham is described as a migrant to the land of Canaan from Ur of the Chaldees. His descendants, the Children of Israel, undertook the Exodus (meaning "departure" or "exit" in Greek) from ancient Egypt, as described in the Book of Exodus. 
The first movement documented in the historical record occurred with the resettlement policy of the Neo-Assyrian Empire, which mandated the deportation of conquered peoples; it is estimated that some 4,500,000 among its captive populations suffered this dislocation over three centuries of Assyrian rule. With regard to Israel, Tiglath-Pileser III claims he deported 80% of the population of Lower Galilee, some 13,520 people. Some 27,000 Israelites, 20 to 25% of the population of the Kingdom of Israel, were deported by Sargon II, replaced by other deported populations, and sent into permanent exile by Assyria, initially to the Upper Mesopotamian provinces of the Assyrian Empire. Between 10,000 and 80,000 people from the Kingdom of Judah were similarly exiled by Babylonia, but these people were later returned to Judea by Cyrus the Great of the Persian Achaemenid Empire. Many Jews were exiled again by the Roman Empire. The 2,000-year dispersion of the Jewish diaspora began under the Roman Empire, as Jews were spread throughout the Roman world and, driven from land to land, settled wherever they could live freely enough to practice their religion. Over the course of the diaspora the center of Jewish life moved from Babylonia to the Iberian Peninsula to Poland to the United States and, as a result of Zionism, back to Israel. There were also many expulsions of Jews during the Middle Ages and Enlightenment in Europe: in 1290, 16,000 Jews were expelled from England (see the Statute of Jewry); in 1396, 100,000 from France; in 1421, thousands from Austria. Many of these Jews settled in East-Central Europe, especially Poland. With the Alhambra Decree of 1492, Spain's population of around 200,000 Sephardic Jews was expelled by the Spanish crown and Catholic Church, followed by expulsions in 1493 in Sicily (37,000 Jews) and in Portugal in 1496.
The expelled Jews fled mainly to the Ottoman Empire, the Netherlands, and North Africa, with others migrating to Southern Europe and the Middle East. During the 19th century, France's policies of equal citizenship regardless of religion led to the immigration of Jews (especially from Eastern and Central Europe). This contributed to the arrival of millions of Jews in the New World; over two million Eastern European Jews arrived in the United States from 1880 to 1925. In the latest phase of migrations, the Islamic Revolution in Iran caused many Iranian Jews to flee. Most found refuge in the US (particularly Los Angeles, California, and Long Island, New York) and in Israel. Smaller communities of Persian Jews exist in Canada and Western Europe. Similarly, when the Soviet Union collapsed, many of the Jews in the affected territory (who had been refuseniks) were suddenly allowed to leave, producing a wave of migration to Israel in the early 1990s. Israel is the only country whose Jewish population is consistently growing through natural population growth, although the Jewish populations of other countries, in Europe and North America, have recently increased through immigration. In the Diaspora, in almost every country the Jewish population in general is either declining or steady, but Orthodox and Haredi Jewish communities, whose members often shun birth control for religious reasons, have experienced rapid population growth. Orthodox and Conservative Judaism discourage proselytism of non-Jews, but many Jewish groups have tried to reach out to the assimilated Jewish communities of the Diaspora to help them reconnect with their Jewish roots. Additionally, while in principle Reform Judaism favours seeking new members for the faith, this position has not translated into active proselytism, instead taking the form of an effort to reach out to non-Jewish spouses of intermarried couples.
There is also a trend of Orthodox movements reaching out to secular Jews to give them a stronger Jewish identity, so that there is less chance of intermarriage. As a result of the efforts by these and other Jewish groups over the past 25 years, there has been a trend (known as the Baal teshuva movement) for secular Jews to become more religiously observant, though the demographic implications of the trend are unknown. Additionally, there is a growing rate of conversion by gentiles who choose to become Jews, known as Jews by Choice. Contributions Jewish individuals have played a significant role in the development and growth of Western culture, advancing many fields of thought, science and technology, both historically and in modern times, including through discrete trends in Jewish philosophy, Jewish ethics and Jewish literature, as well as specific trends in Jewish culture, including in Jewish art, Jewish music, Jewish humor, Jewish theatre, Jewish cuisine and Jewish medicine. Jews have established various Jewish political movements and religious movements and, through the authorship of the Hebrew Bible and parts of the New Testament, provided the foundation for Christianity and Islam. More than 20 percent of awarded Nobel Prizes have gone to individuals of Jewish descent. Philanthropic giving is a widespread core function among Jewish organizations.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Penrose_process] | [TOKENS: 840]
Penrose process The Penrose process (also called the Penrose mechanism) was theorised by Sir Roger Penrose as a means whereby energy can be extracted from a rotating black hole. The process takes advantage of the ergosphere – a region of spacetime surrounding the black hole in which frame-dragging is so strong that, from the point of view of an outside observer, any matter inside is forced to move in the direction of the rotation of the black hole. In the process, a working body falls (black thick line in the figure) into the ergosphere (gray region). At its lowest point (red dot) the body fires a propellant backwards; however, to a faraway observer both seem to continue to move forward due to frame-dragging (albeit at different speeds). The propellant, being slowed, falls (thin gray line) to the event horizon of the black hole (black disk). The remains of the body, being sped up, fly away (thin black line) with an excess of energy (which more than offsets the loss of the propellant and the energy used to shoot it). The maximum amount of energy gain possible for a single particle decay via the original (or classical) Penrose process is 20.7% of its mass in the case of an uncharged black hole (assuming the best case of maximal rotation of the black hole). The energy is taken from the rotation of the black hole, so there is a limit on how much energy can be extracted by the Penrose process and similar strategies (for an uncharged black hole, no more than 29% of its original mass; larger efficiencies are possible for charged rotating black holes). Details of the ergosphere The outer surface of the ergosphere is the surface at which light moving in the direction opposite to the rotation of the black hole remains at a fixed angular coordinate, according to an external observer. Since massive particles necessarily travel slower than light, massive particles inside will necessarily move along with the black hole's rotation.
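The 20.7% and 29% figures quoted above follow from standard closed-form results for an uncharged, maximally rotating (extremal) Kerr black hole: the irreducible mass is M_irr = M/√2, so at most 1 − 1/√2 of the mass-energy is extractable, and the best single-particle decay gains (√2 − 1)/2 of the particle's mass. A quick numerical check (illustrative only; units with G = c = 1):

```python
import math

# Extremal (a = M), uncharged Kerr black hole.
# Irreducible mass: M_irr = M / sqrt(2); only M - M_irr is extractable,
# so the fraction of the original mass available to Penrose-type
# processes is 1 - 1/sqrt(2).
total_extractable_fraction = 1 - 1 / math.sqrt(2)

# Maximum energy gain for a single particle decay in the classical
# Penrose process (the particle grazes the horizon and splits into
# forward- and backward-moving light packets): (sqrt(2) - 1) / 2.
single_decay_gain = (math.sqrt(2) - 1) / 2

print(f"total extractable: {total_extractable_fraction:.1%}")  # 29.3%
print(f"single decay gain: {single_decay_gain:.1%}")           # 20.7%
```

Both values match the rounded figures in the text; for charged rotating black holes the extractable fraction can exceed these bounds, as noted above.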
The inner boundary of the ergosphere is the event horizon, the spatial perimeter beyond which light cannot escape. Inside the ergosphere even light cannot keep up with the rotation of the black hole, as the trajectories of stationary (from the outside perspective) objects become space-like, rather than time-like (as for normal matter) or light-like. Mathematically, the dt² component of the metric changes its sign inside the ergosphere. That allows matter to have negative energy inside the ergosphere as long as it moves counter to the black hole's rotation fast enough (or, from the outside perspective, resists being dragged along to a sufficient degree). The Penrose mechanism exploits this by diving into the ergosphere, dumping an object that has been given negative energy, and returning with more energy than before. In this way, rotational energy is extracted from the black hole, and the black hole is spun down to a lower rotational speed. The maximum amount of energy (per mass of the thrown-in object) is extracted if the black hole is rotating at the maximal rate, the object just grazes the event horizon, and it decays into forwards- and backwards-moving packets of light (the first escapes the black hole, the second falls inside). In an adjunct process, a black hole can be spun up (its rotational speed increased) by sending in particles that do not split up, but instead give their entire angular momentum to the black hole. However, this is not a reverse of the Penrose process, as both increase the entropy of the black hole by throwing material into it. In popular culture Dominic Green's 2005 short story "The Clockwork Atom Bomb" involves the use of the Penrose process for military applications and power generation. A major plot point of the 2014 science fiction film Interstellar is the use of the Penrose process to slingshot the spacecraft Endurance towards Edmunds' planet by flying through the ergosphere of the fictional black hole Gargantua.
In the narrative of the 2024 cooperative horde shooter game Helldivers 2, the Penrose process is used to decrease the velocity of a singularity.
========================================
[SOURCE: https://en.wikipedia.org/wiki/United_States_Environmental_Protection_Agency] | [TOKENS: 8903]
United States Environmental Protection Agency The Environmental Protection Agency (EPA) is an independent agency of the United States government tasked with environmental protection matters. President Richard Nixon proposed the establishment of the EPA on July 9, 1970; it began operation on December 2, 1970, after Nixon signed an executive order. The order establishing the EPA was ratified by committee hearings in the House and Senate. The agency is led by its administrator, who is appointed by the president and approved by the Senate. Since January 29, 2025, the administrator has been Lee Zeldin. The EPA is not a Cabinet department, but the administrator is normally given cabinet rank. The EPA has its headquarters in Washington, D.C. There are regional offices for each of the agency's ten regions, as well as 27 laboratories around the country. The agency conducts environmental assessment, research, and education. It has the responsibility of maintaining and enforcing national standards under a variety of U.S. environmental laws, in consultation with state, tribal, and local governments. EPA enforcement powers include fines, sanctions, and other measures. It delegates some permitting, monitoring, and enforcement responsibility to U.S. states and the federally recognized tribes. The agency also works with industries and all levels of government in a wide variety of voluntary pollution prevention programs and energy conservation efforts. The agency's budgeted employee level in 2023 was 16,204.1 full-time equivalents (FTE). More than half of EPA's employees are engineers, scientists, and environmental protection specialists; other employees include legal, public affairs, financial, and information technology specialists. History Beginning in the late 1950s and through the 1960s, Congress reacted to increasing public concern about the impact that human activity could have on the environment. Senator James E.
Murray introduced a bill, the Resources and Conservation Act (RCA) of 1959, in the 86th Congress. The bill would have established a Council on Environmental Quality in the Executive Office of the President, declared a national environmental policy, and required the preparation of an annual environmental report. The conservation movement was weak at the time and the bill did not pass Congress. The 1962 publication of Silent Spring, a best-selling book by Rachel Carson, alerted the public about the detrimental effects on animals and humans of the indiscriminate use of pesticide chemicals. In the years following, Congress discussed possible solutions. In 1968, a joint House–Senate colloquium was convened by the chairmen of the Senate Committee on Interior and Insular Affairs, Senator Henry M. Jackson, and the House Committee on Science and Astronautics, Representative George P. Miller, to discuss the need for and means of implementing a national environmental policy. Congress enacted the National Environmental Policy Act of 1969 (NEPA) and the law was based on ideas that had been discussed in the 1959 and subsequent hearings. The Richard Nixon administration made the environment a policy priority in 1969–1971 and created two new agencies, the Council on Environmental Quality (CEQ) and EPA. Nixon signed NEPA into law on January 1, 1970. The law established the CEQ in the Executive Office of the President. NEPA required that a detailed statement of environmental impacts be prepared for all major federal actions significantly affecting the environment. The "detailed statement" would ultimately be referred to as an environmental impact statement (EIS). On July 9, 1970, Nixon proposed an executive reorganization that consolidated many environmental responsibilities of the federal government under one agency, a new Environmental Protection Agency. 
This proposal included merging pollution control programs from a number of departments, such as the combination of pesticide programs from the United States Department of Agriculture and the United States Department of the Interior. After conducting hearings during that summer, the House and Senate approved the proposal. The EPA was created 90 days before it had to operate, and officially opened its doors on December 2, 1970. The agency's first administrator, William Ruckelshaus, took the oath of office on December 4, 1970. EPA's primary predecessor was the former Environmental Health Divisions of the U.S. Public Health Service (PHS), and its creation caused one of a series of reorganizations of PHS that occurred during 1966–1973. From PHS, EPA absorbed the entire National Air Pollution Control Administration, as well as the Environmental Control Administration's Bureau of Solid Waste Management, Bureau of Water Hygiene, and part of its Bureau of Radiological Health. It also absorbed the Federal Water Quality Administration, which had previously been transferred from PHS to the Department of the Interior in 1966. A few functions from other agencies were also incorporated into EPA: the formerly independent Federal Radiation Council was merged into it; pesticides programs were transferred from the Department of the Interior, Food and Drug Administration, and Agricultural Research Service; and some functions were transferred from the Council on Environmental Quality and Atomic Energy Commission. Upon its creation, EPA inherited 84 sites spread across 26 states, of which 42 sites were laboratories. The EPA consolidated these laboratories into 22 sites. In its first year, the EPA had a budget of $1.4 billion and 5,800 employees. At its start, the EPA was primarily a technical assistance agency that set goals and standards.
Soon, new acts and amendments passed by Congress gave the agency its regulatory authority. A major expansion of the Clean Air Act was approved in December 1970. EPA staff recall that in the early days there was "an enormous sense of purpose and excitement" and the expectation that "there was this agency which was going to do something about a problem that clearly was on the minds of a lot of people in this country," leading to tens of thousands of resumes from those eager to participate in the mighty effort to clean up America's environment. When EPA first began operation, members of the private sector felt strongly that the environmental protection movement was a passing fad. Ruckelshaus stated that he felt pressure to show a public deeply skeptical about government's effectiveness that EPA could respond effectively to widespread concerns about pollution. The burning Cuyahoga River in Cleveland, Ohio, in 1969 led to a national outcry and criminal charges against major steel companies. The US Justice Department in late 1970 began pollution control litigation in cooperation with the new EPA. Congress enacted the Federal Water Pollution Control Act Amendments of 1972, better known as the Clean Water Act (CWA). The CWA established a national framework for addressing water quality, including mandatory pollution control standards, to be implemented by the agency in partnership with the states. Congress amended the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA) in 1972, requiring EPA to measure every pesticide's risks against its potential benefits. In 1973 President Nixon appointed Russell E. Train to be the next EPA administrator. In 1974 Congress passed the Safe Drinking Water Act, requiring EPA to develop mandatory federal standards for all public water systems, which serve 90% of the US population. The law required EPA to enforce the standards with the cooperation of state agencies.
In October 1976, Congress passed the Toxic Substances Control Act (TSCA) which, like FIFRA, related to the manufacture, labeling and usage of commercial products rather than pollution. This act gave the EPA the authority to gather information on chemicals and require producers to test them, gave it the ability to regulate chemical production and use (with specific mention of PCBs), and required the agency to create the National Inventory listing of chemicals. Congress also enacted the Resource Conservation and Recovery Act (RCRA) in 1976, significantly amending the Solid Waste Disposal Act of 1965. It tasked the EPA with setting national goals for waste disposal, conserving energy and natural resources, reducing waste, and ensuring environmentally sound management of waste. Accordingly, the agency developed regulations for solid and hazardous waste that were to be implemented in collaboration with states. President Jimmy Carter appointed Douglas M. Costle as EPA administrator in 1977. To manage the agency's expanding legal mandates and workload, by the end of 1979 the budget grew to $5.4 billion and the workforce size increased to 13,000. In 1980, following the discovery of many abandoned or mismanaged hazardous waste sites such as Love Canal, Congress passed the Comprehensive Environmental Response, Compensation, and Liability Act, nicknamed "Superfund." The new law authorized EPA to cast a wider net for parties responsible for sites contaminated by previous hazardous waste disposal and established a funding mechanism for assessment and cleanup. In a dramatic move to the right, President Ronald Reagan in 1981 appointed Anne Gorsuch as EPA administrator. Gorsuch based her administration of EPA on the New Federalism approach of downsizing federal agencies by delegating their functions and services to the individual states. She believed that EPA was over-regulating business and that the agency was too large and not cost-effective. 
During her 22 months as agency head, she cut the budget of the EPA by 22%, reduced the number of cases filed against polluters, relaxed Clean Air Act regulations, and facilitated the spraying of restricted-use pesticides. She cut the total number of agency employees, and hired staff from the industries they were supposed to be regulating. Environmentalists contended that her policies were designed to placate polluters, and accused her of trying to dismantle the agency. Assistant Administrator Rita Lavelle was fired by Reagan in February 1983 because of her mismanagement of the Superfund program. Gorsuch had increasing confrontations with Congress over Superfund and other programs, including her refusal to submit subpoenaed documents. Gorsuch was cited for contempt of Congress, and the White House directed EPA to submit the documents to Congress. Gorsuch and most of her senior staff resigned in March 1983. Reagan then appointed William Ruckelshaus as EPA administrator for a second term. As a condition for accepting his appointment, Ruckelshaus obtained autonomy from the White House in appointing his senior management team. He then appointed experienced, competent professionals to the top management positions, and worked to restore public confidence in the agency. Lee M. Thomas succeeded Ruckelshaus as administrator in 1985. In 1986 Congress passed the Emergency Planning and Community Right-to-Know Act, which authorized the EPA to gather data on toxic chemicals and share this information with the public. EPA also researched the implications of stratospheric ozone depletion. Under Administrator Thomas, EPA joined with several international organizations to perform a risk assessment of stratospheric ozone, which helped provide motivation for the Montreal Protocol, agreed to in August 1987. In 1988, during his first presidential campaign, George H. W. Bush was vocal about environmental issues. Following his election victory, he appointed William K.
Reilly, an environmentalist, as EPA administrator in 1989. Under Reilly's leadership, the EPA implemented voluntary programs and initiated the development of a "cluster rule" for multimedia regulation of the pulp and paper industry. At the time, there was increasing awareness that some environmental issues were regional or localized in nature, and were more appropriately addressed with sub-national approaches and solutions. This understanding was reflected in the 1990 amendments to the Clean Air Act and in new approaches by the agency, such as a greater emphasis on watershed-based approaches in Clean Water Act programs. In 1992 EPA and the Department of Energy launched the Energy Star program, a voluntary program that fosters energy efficiency. Carol Browner was appointed EPA administrator by President Bill Clinton and served from 1993 to 2001. Major projects during Browner's term included the following: Since the passage of the Superfund law in 1980, an excise tax had been levied on the chemical and petroleum industries to support the cleanup trust fund. Congressional authorization of the tax was due to expire in 1995. Although Browner and the Clinton Administration supported continuation of the tax, Congress declined to reauthorize it. Subsequently, the Superfund program was supported only by annual appropriations, greatly reducing the number of waste sites remediated in a given year. (In 2021 Congress reauthorized an excise tax on chemical manufacturers.) In 1994, President Bill Clinton signed Executive Order 12898, which required the federal government to address environmental justice. The order told all federal agencies to consider how their actions might affect minority and low-income communities, especially communities that face higher levels of pollution or other environmental issues. This order was closely tied to the EPA through the creation of the Office of Environmental Justice.
The purpose of this office was to help coordinate environmental justice work across different environmental agencies and support efforts to protect communities that were facing disproportionate environmental burdens. Major legislative updates during the Clinton Administration were the Food Quality Protection Act and the 1996 amendments to the Safe Drinking Water Act. President George W. Bush appointed Christine Todd Whitman as EPA administrator in 2001. Whitman was succeeded by Mike Leavitt in 2003 and Stephen L. Johnson in 2005. In March 2005 nine states (California, New York, New Jersey, New Hampshire, Massachusetts, Maine, Connecticut, New Mexico and Vermont) sued the EPA. The EPA's inspector general had determined that the EPA's regulation of mercury emissions did not follow the Clean Air Act, and that the regulations were influenced by top political appointees. The EPA had suppressed a study it commissioned by Harvard University which contradicted its position on mercury controls. The suit alleged that the EPA's rule exempting coal-fired power plants from "maximum available control technology" was illegal, and additionally charged that the EPA's system of cap-and-trade to lower average mercury levels would allow power plants to forego reducing mercury emissions, which they objected would lead to dangerous local hotspots of mercury contamination even if average levels declined. Several states also began to enact their own mercury emission regulations. Illinois's proposed rule would have reduced mercury emissions from power plants by an average of 90% by 2009. In 2008—by which point a total of fourteen states had joined the suit—the U.S. Court of Appeals for the District of Columbia ruled that the EPA regulations violated the Clean Air Act. In response, EPA announced plans to propose such standards to replace the vacated Clean Air Mercury Rule, and did so on March 16, 2011. 
In July 2005 there was a delay in the issuance of an EPA report showing that auto companies were using loopholes to produce less fuel-efficient cars. The report was supposed to be released the day before a controversial energy bill was passed and would have provided backup for those opposed to it, but the EPA delayed its release at the last minute. EPA initiated its voluntary WaterSense program in 2006 to encourage water efficiency through the use of a special label on consumer products. In 2007 the state of California sued the EPA for its refusal to allow California and 16 other states to raise fuel economy standards for new cars. EPA Administrator Stephen Johnson claimed that the EPA was working on its own standards, but the move has been widely considered an attempt to shield the auto industry from environmental regulation by setting lower standards at the federal level, which would then preempt state laws. California governor Arnold Schwarzenegger, along with governors from 13 other states, stated that the EPA's actions ignored federal law, and that existing California standards (adopted by many states in addition to California) were almost twice as effective as the proposed federal standards. It was reported that Johnson ignored his own staff in making this decision. In 2007 it was reported that EPA research was suppressed by career managers. Supervisors at EPA's National Center for Environmental Assessment required several paragraphs to be deleted from a peer-reviewed journal article about EPA's integrated risk information system, which led two co-authors to have their names removed from the publication, and the corresponding author, Ching-Hung Hsu, to leave EPA "because of the draconian restrictions placed on publishing". The 2007 report stated that EPA subjected employees who author scientific papers to prior restraint, even if those papers are written on personal time. 
In December 2007 EPA administrator Johnson approved a draft of a document that declared that climate change imperiled the public welfare—a decision that would trigger the first national mandatory global-warming regulations. Associate Deputy Administrator Jason Burnett e-mailed the draft to the White House. White House aides—who had long resisted mandatory regulations as a way to address climate change—knew the gist of what Johnson's finding would be, Burnett said. They also knew that once they opened the attachment, it would become a public record, making it controversial and difficult to rescind. So they did not open it; rather, they called Johnson and asked him to take back the draft. Johnson rescinded the draft; in July 2008, he issued a new version which did not state that global warming was a danger to public welfare. Burnett resigned in protest. In April 2008, the Union of Concerned Scientists said that more than half of the nearly 1,600 EPA staff scientists who responded online to a detailed questionnaire reported they had experienced incidents of political interference in their work. The survey included chemists, toxicologists, engineers, geologists and experts in other fields of science. About 40% of the scientists reported that the interference had been more prevalent in the last five years than in previous years. President Barack Obama appointed Lisa P. Jackson as EPA administrator in 2009. In 2010 it was reported that a $3 million mapping study on sea level rise was suppressed by EPA management during both the Bush and Obama administrations, and managers changed a key interagency report to reflect the removal of the maps. Between 2011 and 2012, some EPA employees reported difficulty in conducting and reporting the results of studies on hydraulic fracturing due to industry and governmental pressure, and were concerned about the censorship of environmental reports. President Obama appointed Gina McCarthy as EPA administrator in 2013.
In 2014, the EPA published its "Tier 3" standards for cars, trucks and other motor vehicles, which tightened air pollution emission requirements and lowered the sulfur content in gasoline. In 2015, the EPA discovered extensive violations by Volkswagen Group in its manufacture of Volkswagen and Audi diesel engine cars, for the 2009 through 2016 model years. Following notice of violations and potential criminal sanctions, Volkswagen later agreed to a legal settlement and paid billions of US dollars in criminal penalties, and was required to initiate a vehicle buyback program and modify the engines of the vehicles to reduce illegal air emissions. In August 2015, the EPA finalized the Clean Power Plan to regulate emissions from power plants, projecting a 15-year cut of 32%, or 789 million metric tons of carbon dioxide. In 2019 it was voided and replaced by the Affordable Clean Energy rule under the Trump administration, and in 2022 the Supreme Court ruled in West Virginia v. EPA that the agency had exceeded its authority in adopting it. In August 2015, the 2015 Gold King Mine waste water spill occurred when EPA contractors, examining the level of pollutants such as lead and arsenic in a Colorado mine, accidentally released over three million gallons of waste water into Cement Creek and the Animas River. In 2015, the International Agency for Research on Cancer (IARC), a branch of the World Health Organization, cited research linking glyphosate, an ingredient of the weed killer Roundup manufactured by the chemical company Monsanto, to non-Hodgkin's lymphoma. In March 2017, the presiding judge in litigation brought by people who claim to have developed glyphosate-related non-Hodgkin's lymphoma unsealed Monsanto emails and other documents related to the case, including email exchanges between the company and federal regulators.
According to The New York Times, the "records suggested that Monsanto had ghostwritten research that was later attributed to academics and indicated that a senior official at the Environmental Protection Agency had worked to quash a review of Roundup's main ingredient, glyphosate, that was to have been conducted by the United States Department of Health and Human Services." The records show that Monsanto was able to prepare "a public relations assault" on the finding after being alerted to the determination months in advance by Jess Rowland, then head of the EPA's cancer assessment review committee. Emails also showed that Rowland "had promised to beat back an effort by the Department of Health and Human Services to conduct its own review." On February 17, 2017, President Donald Trump appointed Scott Pruitt as EPA administrator. The Democratic Party viewed the appointment as controversial, as Pruitt had spent much of his career challenging environmental regulations and policies, had no previous experience in the environmental protection field, and had received financial support from the fossil fuel industry. In 2017, the Trump administration proposed cutting the EPA's budget by 31%, from $8.1 billion to $5.7 billion, and eliminating a quarter of the agency's jobs; the cut was not approved by Congress. Pruitt resigned from the position on July 5, 2018, citing "unrelenting attacks" amid ongoing ethics controversies. President Trump, who had promised to eliminate the EPA "in almost every form", leaving "only tidbits" intact, appointed Andrew R. Wheeler as EPA administrator in 2019. On July 17, 2019, EPA management prohibited the agency's Scientific Integrity Official, Francesca Grifo, from testifying at a House committee hearing. The EPA offered to send a different representative in Grifo's place and accused the committee of "dictating to the agency who they believe was qualified to speak."
The hearing was to discuss the importance of allowing federal scientists and other employees to speak freely about their research, when and to whom they choose, without fear of political consequences. In September 2019, the Trump administration attempted to revoke the waiver that allowed California to set more stringent standards for auto and truck emissions than the federal standards. President Joe Biden appointed Michael S. Regan as administrator in 2021; Regan began serving on March 11, 2021. In October 2021 the EPA announced its "PFAS Strategic Roadmap." PFASs are organofluorine chemical compounds referred to as "forever chemicals". The roadmap is a "whole-of-EPA" strategy under which the agency will consider the full life cycle of PFAS, including preventing PFAS from entering the environment, holding polluters accountable, and remediating contaminated sites. It also includes drinking water monitoring and risk assessment for PFOA and PFOS in biosolids (processed sewage sludge used as fertilizer). In December 2021 the EPA issued new greenhouse gas standards for passenger cars and light trucks; the standards, intended to reduce climate pollution and improve public health, became effective for the 2023 vehicle model year. In March 2022 the Biden administration again allowed California to set stricter auto emissions standards. In August 2022 the EPA was allotted approximately $53.216 billion in funding pursuant to the Inflation Reduction Act (IRA). The EPA listed 24 initiatives, the most notable among them being greenhouse gas reduction and monitoring, a superfund petroleum tax, replacing current heavy-duty vehicles with zero-emission vehicles, and a methane incentive program. On February 3, 2023, more than 100 train cars derailed in East Palestine, Ohio, around half of which contained chemicals such as butyl acrylate, vinyl chloride, and ethylhexyl acrylate.
Subsequently, the chemicals combusted, producing flames visible from miles around, and the fumes filled the air, with residents reporting animals falling ill and a burning sensation in their eyes and noses. The EPA monitored the situation, and experts recommended that local residents take part in the EPA's at-home air screening. On April 21, 2023, the White House implemented the Justice40 Initiative as part of its broader effort to improve equality in minority communities. The goal of Justice40 is to ensure that 40 percent of the benefits from major federal investments go to communities that have historically been overlooked. These investments include climate programs, clean energy projects, and efforts to reduce pollution. Justice40 changed the way federal agencies planned their programs and worked with each other, guiding how they identified disadvantaged communities and how they decided where money and support should go. For the EPA, this meant adjusting some of its grant programs and technical assistance so they clearly supported Justice40’s goals and showed that the benefits were reaching the communities that needed them most. In March 2024, the EPA published regulations for tailpipe emissions standards that accelerate the transition to electric vehicles (EVs). The standards require at least two-thirds of all new cars sold in the United States to be zero-emissions vehicles by 2032, in order to reduce air pollution and climate change. The agency projected that the regulations would cut emissions by 7 billion metric tons, or 56% of 2026 levels, by 2032. In April 2024, the EPA finalized new standards for power plant carbon emissions, projecting cuts of 65,000 tons by 2028 and 1.38 billion tons by 2047. The agency also issued final drinking water standards for six PFAS compounds. In December 2024, the EPA announced it had approved California's plan to end the sale of gasoline-only vehicles by 2035.
EPA Administrator Michael Regan granted California a waiver under the Clean Air Act to implement the plan, which had first been announced in 2020. It required that by 2035 at least 80% of new cars sold be electric and up to 20% plug-in hybrid models. California's rules were adopted by 11 other states, including New York, Massachusetts and Oregon. With the start of Donald Trump's second presidential term, Lee Zeldin began serving as administrator on January 29, 2025. On February 27, 2025, the EPA received a White House memo issued by Russell Vought directing it to prepare for mass layoffs. Hours earlier, Trump had said that there would be a 65 percent reduction in the agency's roughly 17,000 personnel, which was later corrected to a 65 percent cut to the overall agency budget. In early 2025, the EPA began cutting many of the programs that supported environmental justice work. These programs usually help communities that face a variety of environmental burdens, especially low-income neighborhoods and communities of color. Reports from The Guardian and The Washington Post explained that important funding for communities and partnerships with local governments was either paused, reduced, or sent to other parts of the agency. Newsweek also wrote about internal EPA memos that said the cuts were meant to focus the agency on its basic legal duties; in practice, this meant taking money away from programs that helped communities deal with environmental health problems. After these changes, many towns and cities struggled to continue projects related to pollution cleanup, public health, and environmental monitoring because the funding was no longer available. In March 2025 the EPA dropped a lawsuit against Denka, a chemical company, that had sought to reduce emissions of chloroprene at its plant in LaPlace, Louisiana. After the Supreme Court of the United States overturned an injunction against termination of government employees in AFGE v.
Trump, the EPA announced in July 2025 that it would eliminate its Office of Research and Development. The agency had previously denied such plans after a leaked memo describing them was reported on in March. Also in July 2025, the EPA released a proposed rule to repeal the 2009 endangerment finding that greenhouse gases pose a risk to human health, a basis for many of the EPA's regulations under the Clean Air Act. Zeldin asserted that the existing regulations were harming American citizens through increased costs, citing this as his rationale for eliminating the endangerment finding. By late 2025, the EPA had greatly reduced its environmental justice work. As part of the administration’s changes, the Office of Environmental Justice and External Civil Rights was shut down, and a large portion of the EJ staff were either reassigned or put on leave. Most EJ grants were also ended, which left many local projects without the funding they depended on. During this time, the EPA also removed EJScreen from its website, making it harder for communities to access pollution and health data they had previously relied on. Without federal support, many state and local EJ programs began struggling to continue their work. While it was possible to use data from third-party EJScreen mirrors, this information was often outdated or inaccurate, as the underlying data were no longer being collected. Overall, these changes marked a major shift in federal policy away from community environmental protection and equity-based enforcement. In January 2026, the EPA changed its methodology for calculating pollution limits: where it had previously factored in the monetary value of lives saved, it would henceforth base limits only on the projected cost to businesses. Organization The EPA is led by the administrator, appointed by the president subject to Senate confirmation.
The creation of 10 EPA regions was an initiative of President Richard Nixon (see Standard Federal Regions). Each EPA regional office is responsible within its states for implementing the agency's programs, except those programs that have been specifically delegated to states. Each regional office also implements programs on Indian Tribal lands, except those programs delegated to tribal authorities. Legal authority The Environmental Protection Agency can act only pursuant to statutes—the laws passed by Congress. Appropriations statutes authorize how much money the agency can spend each year to carry out the approved statutes. The agency has the power to issue regulations. A regulation interprets a statute, and EPA applies its regulations to various environmental situations and enforces the requirements. The agency must include a rationale for why a regulation is needed. (See Administrative Procedure Act.) Regulations can be challenged in federal courts, either district court or appellate court, depending on the particular statutory provision. EPA has principal implementation authority for the following federal environmental laws:[citation needed] There are additional laws where EPA has a contributing role or provides assistance to other agencies. Among these laws are:[citation needed] Programs EPA established its major programs pursuant to the primary missions originally articulated in the laws passed by Congress. Additional programs have been developed to interpret the primary missions, and some of the newer programs have been specifically authorized by Congress. Former administrator William Ruckelshaus observed in 2016 that a danger for EPA was that air, water, waste and other programs would be unconnected, placed in "silos", a problem that persists more than 50 years later, albeit less so than at the start.
The Office of Air and Radiation (OAR) describes itself as the official authority in charge of "developing national programs, policies, and regulations for controlling air pollution and radiation exposure." The OAR is responsible for enforcing the Clean Air Act, the Atomic Energy Act, the Waste Isolation Pilot Plant Land Withdrawal Act, and other applicable laws. The OAR oversees the Offices of Air Quality Planning and Standards, Atmospheric Protection, Transportation and Air Quality, and the Office of Radiation and Indoor Air. The Radiation Protection Program comprises seven project groups. In 2019 the Environmental Data & Governance Initiative, "a network of academics, developers, and non-profit professionals", published a report comparing EPA enforcement statistics over time. The number of civil cases filed by the EPA has gradually decreased, and in 2018 the criminal and civil penalties from EPA claims had dropped more than fourfold from their amounts in 2013, 2016, and 2017. In 2016 the EPA issued $6,307,833,117 in penalties for violations of agency requirements; in 2018 it issued $184,768,000. The EPA's inspections and evaluations also decreased steadily from 2015 to 2018. Enforcement activity has declined partly due to budget cuts within the agency. In April 2025, the Department of Justice abruptly ended a long-running environmental justice settlement in Lowndes County, Alabama. The settlement was supposed to help fix the county’s severe wastewater problems, which had harmed mostly Black and low-income residents for years. Federal officials said they ended the agreement because of a new executive order from the Trump administration that restricted diversity, equity, and inclusion programs. The Associated Press reported that the decision drew strong criticism from environmental and civil rights groups, who said ending the settlement would make the county’s already serious health problems even worse.
Critics cited by The Associated Press also saw this as part of a larger trend in 2025, in which federal agencies were becoming less active in protecting vulnerable communities. Many warned that ending the Lowndes County agreement showed a broader retreat from federal oversight at a time when many EJ communities were still dealing with pollution, failing infrastructure, and a lack of basic environmental protection. Controversies Congress enacted laws such as the Clean Air Act, the Resource Conservation and Recovery Act, and CERCLA with the intent of preventing and remedying environmental damage. Beginning in 2018 under Administrator Andrew Wheeler, the EPA revised some pollution standards, resulting in less overall regulation. Furthermore, the CAA's discretionary application has led to varied application of the law among states. In 1970, Louisiana deployed its Comprehensive Toxic Air Pollutant Emission Control Program to comply with federal law; the program does not require pollution monitoring equivalent to programs in other states. The EPA has been criticized for its lack of progress toward environmental justice. Administrator Christine Todd Whitman was criticized for her 2001 changes to President Bill Clinton's Executive Order 12898, which removed the requirements for government agencies to take poor and minority populations into special consideration when making changes to environmental legislation, thereby defeating the spirit of the executive order. The EPA’s environmental justice work has developed in stages. In the 1990s and early 2000s, the agency focused on defining environmental justice and creating offices and programs to address pollution burdens in communities that were being disproportionately affected.
In a March 2004 report, the agency's inspector general concluded that the EPA "has not developed a clear vision or a comprehensive strategic plan, and has not established values, goals, expectations, and performance measurements" for environmental justice in its daily operations. Another report in September 2006 found the agency still had failed to review the success of its programs, policies, and activities toward environmental justice. Studies have also found that poor and minority populations were underserved by the EPA's Superfund program, and that this situation was worsening. Under the Trump administration (2017–2021 and again starting in 2025), many environmental justice programs were reduced or rolled back. During the Biden administration (2021–2025), EJ work expanded, especially through programs like Justice40. In August 2022 the EPA was allotted approximately $42.8 billion in funding from the Inflation Reduction Act (IRA) toward what the EPA classifies as "Advancing Environmental Justice", and published the statement "Through the Inflation Reduction Act, EPA will improve the lives of millions of Americans by reducing pollution in neighborhoods where people live, work, play, and go to school; accelerating environmental justice efforts in communities overburdened by pollution for far too long; and tackling our biggest climate challenges while creating jobs and delivering energy security." In September 2022 the EPA announced the creation of a new Office of Environmental Justice and External Civil Rights (OEJECR), reporting directly to the EPA administrator, with an expanded budget and staff and broader responsibilities than under the previous organizational arrangement. As of December 2025, however, the new administration had dismantled OEJECR and was in the process of laying off its personnel.
In February 2025, the EPA took its EJScreen mapping tool and all related training materials off its public website. EJScreen had helped people view environmental and demographic data to identify communities facing higher pollution levels. The Society of Environmental Journalists reported that this removal was part of a wider rollback of environmental justice and diversity resources across the agency’s online platforms. While older copies of EJScreen can still be found through independent archives, taking it off the EPA website made it much harder for the public to access important environmental equity data. In 2025, the Trump administration made broad changes to the EPA that greatly reduced the agency’s ability to carry out environmental justice work. Many programs were paused or eliminated, and large numbers of EJ staff were removed from their positions, in what has been described as one of the most significant rollbacks of environmental justice capacity in the agency’s history. One of the most impactful changes involved the restructuring of the EPA's Office of Environmental Justice and External Civil Rights. The Washington Post reported that more than 450 employees working on environmental justice and diversity programs were told they would be fired or reassigned, effectively shutting down the office's operations. Additional staff removals followed soon after: Newsweek reported that at least 168 employees were placed on administrative leave after an executive order declared certain diversity, equity, and EJ programs “not related to the agency’s statutory duties.” Earlier that year, 139 employees had already been placed on leave after signing a letter during a federal workforce reduction effort. At the same time, the EPA’s research division faced major cuts.
According to The Guardian, these actions were part of a plan to defund the agency’s main research arm, with senior scientists calling the move “the obliteration of ORD.” Public broadcasters reported that by February 2025, many remaining EJ staff were on indefinite leave, and a senior employee described the office as “on life support.” These staffing changes were paired with the suspension or closure of several environmental justice programs. The Guardian reported that mass layoffs across research and environmental divisions eliminated hundreds of EJ-related positions. The administration said these cuts were meant to refocus the EPA on “core statutory duties,” but many argued they stripped support from communities facing long-term pollution burdens. Newsweek noted that new executive orders redefined the EPA’s mission to exclude EJ programs and community outreach grants, leaving historically overlooked communities without federal assistance for pollution remediation or environmental cleanup. Supporters of the changes argued that the shift returned the agency to its traditional role and reduced what they considered unnecessary spending. Experts and environmental groups warned in a Newsweek article that these rollbacks would have lasting consequences: removing staff, ending grant programs, and limiting access to tools like EJScreen would make it much harder for vulnerable communities to obtain environmental data or seek legal remedies for disproportionate pollution exposure. The Center for American Progress argued that industry lobbying and economic pressures likely played an important role in these rollbacks. Many large companies and trade groups oppose environmental justice tools because these tools make pollution patterns easy to see. When the public can clearly see where pollution is highest and which industries are responsible, there is often more pressure for stronger rules.
These rules can lead to higher costs, stricter oversight, or legal actions that companies want to avoid. Because of this, industries often push federal agencies and lawmakers to limit or remove programs that increase transparency or help communities demand protection. Weakening environmental justice programs reduces the amount of information available to the public and lowers the pressure on industries to follow stricter environmental standards. When staff positions are cut, grant programs end, or tools like EJScreen are removed, companies face fewer requirements to meet; this benefits industries by reducing their costs and their chances of facing enforcement, while making it harder for communities to learn about pollution in their area or to ask for government help. Critics argue that under an administration focused primarily on economic growth, many smaller-scale injustices go unnoticed. Supporters of the cuts argued that reducing environmental justice programs would save money and make the EPA more efficient, claiming the programs were unnecessary or too expensive. However, research suggests that most of the savings go to large industries rather than to the public. Communities affected by pollution often receive no benefit; instead, they lose important protections and have fewer ways to understand or respond to environmental risks. Many of these communities already struggle with limited resources, so the loss of environmental justice programs makes their situation even more difficult. In the latest Center for Effective Government analysis of the 15 federal agencies that receive the most Freedom of Information Act (FOIA) requests, published in 2015 (using 2012 and 2013 data, the most recent years then available), the EPA earned a D, scoring 67 out of a possible 100 points, which was not a satisfactory overall grade.
Pebble Mine is a copper and gold mining project in the Bristol Bay region of southwest Alaska, organized by Northern Dynasty Minerals. In 2014 the EPA released its assessment of the impacts that mining would have on Bristol Bay and its tributaries. Among other things, the assessment examined geological, topographic, ecological, hydrological, and economic data and determined that mining could negatively impact the salmon population. Because Bristol Bay and its watershed provide around 46% of the world's sockeye salmon, the EPA did not want to risk an ecological disaster. In July 2014, before Northern Dynasty Minerals had submitted its EIS, EPA's Region 10 office proposed restrictions pursuant to section 404(c) of the Clean Water Act that would effectively prohibit the project. Northern Dynasty Minerals protested this decision, and on July 18, 2014, in a published statement, Pebble Partnership CEO Tom Collier said that the project would continue its litigation against the EPA; noted that the EPA's action was under investigation by the EPA inspector general and by the House Committee on Oversight and Government Reform; and noted that two bills were pending in Congress seeking to clarify that the EPA did not have the authority to preemptively veto or otherwise restrict development projects prior to the onset of federal and state permitting. Collier's statement also said that the EPA's proposal was based on outdated mining scenarios that were not part of the project's approach. Multiple journalists and organizations have reported on the controversy, including the Natural Resources Defense Council in support of canceling the project and John Stossel in support of developing the mine. On January 30, 2023, the EPA vetoed the mine, though it remains a controversial topic.
Following the February 3, 2023, train derailment in East Palestine, Ohio, Ohio governor Mike DeWine and EPA administrator Michael Regan drank tap water in the town to show that it was safe. The derailment had caused a fire and the release of toxic chemicals into the air and water, leaving locals and environmental groups concerned about the quality of water in the area. Despite the EPA's assurance that the water was safe, some residents did not trust its quality and questioned its long-term effects. The incident also raised environmental justice concerns. Many residents felt that their community did not receive clear information or enough support during the cleanup, and some believed that a wealthier and more prominent town might have received faster action and more resources. Community members reported strong odors, health symptoms, and fear of long-term contamination, and they argued that their voices were not taken seriously. These concerns reflected broader environmental justice issues, in which communities with fewer resources often struggle to get the same level of protection, transparency, and attention from state and federal agencies.
========================================
[SOURCE: https://en.wikipedia.org/wiki/United_States#cite_note-243]
United States The United States of America (USA), also known as the United States (U.S.) or America, is a country primarily located in North America. It is a federal republic of 50 states and a federal capital district, Washington, D.C. The 48 contiguous states border Canada to the north and Mexico to the south, with the semi-exclave of Alaska in the northwest and the archipelago of Hawaii in the Pacific Ocean. The United States also asserts sovereignty over five major island territories and various uninhabited islands in Oceania and the Caribbean.[j] It is a megadiverse country, with the world's third-largest land area[c] and third-largest population, exceeding 341 million.[k] Paleo-Indians first migrated from North Asia to North America at least 15,000 years ago, and formed various civilizations. Spanish colonization established Spanish Florida in 1513, the first European colony in what is now the continental United States. British colonization followed with the 1607 settlement of Virginia, the first of the Thirteen Colonies. Enslavement of Africans was practiced in all colonies by 1770 and supplied most of the labor for the Southern Colonies' plantation economy. Clashes with the British Crown began as a civil protest over the illegality of taxation without representation in Parliament and the denial of other English rights. They evolved into the American Revolution, which led to the Declaration of Independence and a society based on universal rights. Victory in the 1775–1783 Revolutionary War brought international recognition of U.S. sovereignty and fueled westward expansion, further dispossessing native inhabitants. As more states were admitted, a North–South division over slavery led the Confederate States of America to declare secession and fight the Union in the 1861–1865 American Civil War. With the United States' victory and reunification, slavery was abolished nationally. By the late 19th century, the U.S.
economy outpaced the French, German and British economies combined. By 1900, the country had established itself as a great power, a status solidified after its involvement in World War I. Following Japan's attack on Pearl Harbor in 1941, the U.S. entered World War II. Its aftermath left the U.S. and the Soviet Union as rival superpowers, competing for ideological dominance and international influence during the Cold War. The Soviet Union's collapse in 1991 ended the Cold War, leaving the U.S. as the world's sole superpower. The U.S. federal government is a representative democracy with a president and a constitution that grants separation of powers under three branches: legislative, executive, and judicial. The United States Congress is a bicameral national legislature composed of the House of Representatives (a lower house based on population) and the Senate (an upper house based on equal representation for each state). Federalism grants substantial autonomy to the 50 states. In addition, 574 Native American tribes have sovereignty rights, and there are 326 Native American reservations. Since the 1850s, the Democratic and Republican parties have dominated American politics. American ideals and values are based on a democratic tradition inspired by the American Enlightenment movement. A developed country, the U.S. ranks high in economic competitiveness, innovation, and higher education. Accounting for over a quarter of nominal global GDP, its economy has been the world's largest since about 1890. It is the wealthiest country, with the highest disposable household income per capita among OECD members, though its wealth inequality is highly pronounced. Shaped by centuries of immigration, the culture of the U.S. is diverse and globally influential. Making up more than a third of global military spending, the country has one of the strongest armed forces and is a designated nuclear-weapon state. A member of numerous international organizations, the U.S.
plays a major role in global political, cultural, economic, and military affairs. Etymology Documented use of the phrase "United States of America" dates back to January 2, 1776. On that day, Stephen Moylan, a Continental Army aide to General George Washington, wrote a letter to Joseph Reed, Washington's aide-de-camp, seeking to go "with full and ample powers from the United States of America to Spain" to seek assistance in the Revolutionary War effort. The first known public usage is an anonymous essay published in the Williamsburg newspaper The Virginia Gazette on April 6, 1776. Sometime on or after June 11, 1776, Thomas Jefferson wrote "United States of America" in a rough draft of the Declaration of Independence, which was adopted by the Second Continental Congress on July 4, 1776. The term "United States" and its initialism "U.S.", used as nouns or as adjectives in English, are common short names for the country. The initialism "USA", a noun, is also common. "United States" and "U.S." are the established terms throughout the U.S. federal government, with prescribed rules.[l] "The States" is an established colloquial shortening of the name, used particularly from abroad; "stateside" is the corresponding adjective or adverb. "America" is the feminine form of the first word of Americus Vesputius, the Latinized name of Italian explorer Amerigo Vespucci (1454–1512);[m] it was first used as a place name by the German cartographers Martin Waldseemüller and Matthias Ringmann in 1507.[n] Vespucci first proposed that the West Indies discovered by Christopher Columbus in 1492 were part of a previously unknown landmass and not among the Indies at the eastern limit of Asia. In English, the term "America" usually does not refer to topics unrelated to the United States, despite the usage of "the Americas" to describe the totality of the continents of North and South America. 
History The first inhabitants of North America migrated from Siberia approximately 15,000 years ago, either across the Bering land bridge or along the now-submerged Ice Age coastline. Small isolated groups of hunter-gatherers are said to have migrated alongside herds of large herbivores far into Alaska, with ice-free corridors developing along the Pacific coast and valleys of North America in c. 16,500 – c. 13,500 BCE (c. 18,500 – c. 15,500 BP). The Clovis culture, which appeared around 11,000 BCE, is believed to be the first widespread culture in the Americas. Over time, Indigenous North American cultures grew increasingly sophisticated, and some, such as the Mississippian culture, developed agriculture, architecture, and complex societies. In the post-archaic period, the Mississippian cultures were located in the midwestern, eastern, and southern regions, and the Algonquian in the Great Lakes region and along the Eastern Seaboard, while the Hohokam culture and Ancestral Puebloans inhabited the Southwest. Native population estimates of what is now the United States before the arrival of European colonizers range from around 500,000 to nearly 10 million. Christopher Columbus began exploring the Caribbean for Spain in 1492, leading to Spanish-speaking settlements and missions from what are now Puerto Rico and Florida to New Mexico and California. The first Spanish colony in the present-day continental United States was Spanish Florida, chartered in 1513. After several settlements failed there due to starvation and disease, Spain's first permanent town, Saint Augustine, was founded in 1565. France established its own settlements in French Florida in 1562, but they were either abandoned (Charlesfort, 1578) or destroyed by Spanish raids (Fort Caroline, 1565). Permanent French settlements were founded much later along the Great Lakes (Fort Detroit, 1701), the Mississippi River (Saint Louis, 1764) and especially the Gulf of Mexico (New Orleans, 1718). 
Early European colonies also included the thriving Dutch colony of New Netherland (settled 1626, present-day New York) and the small Swedish colony of New Sweden (settled 1638 in what became Delaware). British colonization of the East Coast began with the Virginia Colony (1607) and the Plymouth Colony (Massachusetts, 1620). The Mayflower Compact in Massachusetts and the Fundamental Orders of Connecticut established precedents for local representative self-governance and constitutionalism that would develop throughout the American colonies. While European settlers in what is now the United States experienced conflicts with Native Americans, they also engaged in trade, exchanging European tools for food and animal pelts.[o] Relations ranged from close cooperation to warfare and massacres. The colonial authorities often pursued policies that forced Native Americans to adopt European lifestyles, including conversion to Christianity. Along the eastern seaboard, settlers trafficked Africans through the Atlantic slave trade, largely to provide manual labor on plantations. The original Thirteen Colonies[p] that would later found the United States were administered as possessions of the British Empire by Crown-appointed governors, though local governments held elections open to most white male property owners. The colonial population grew rapidly from Maine to Georgia, eclipsing Native American populations; by the 1770s, the natural increase of the population was such that only a small minority of Americans had been born overseas. The colonies' distance from Britain facilitated the entrenchment of self-governance, and the First Great Awakening, a series of Christian revivals, fueled colonial interest in guaranteed religious liberty. Following its victory in the French and Indian War, Britain began to assert greater control over local affairs in the Thirteen Colonies, resulting in growing political resistance.
One of the primary grievances of the colonists was the denial of their rights as Englishmen, particularly the right to representation in the British government that taxed them. To demonstrate their dissatisfaction and resolve, the First Continental Congress met in 1774 and passed the Continental Association, a colonial boycott of British goods enforced by local "committees of safety" that proved effective. The British attempt to then disarm the colonists resulted in the 1775 Battles of Lexington and Concord, igniting the American Revolutionary War. At the Second Continental Congress, the colonies appointed George Washington commander-in-chief of the Continental Army, and created a committee that named Thomas Jefferson to draft the Declaration of Independence. Two days after the Second Continental Congress passed the Lee Resolution to create an independent, sovereign nation, the Declaration was adopted on July 4, 1776. The political values of the American Revolution evolved from an armed rebellion demanding reform within an empire to a revolution that created a new social and governing system founded on the defense of liberty and the protection of inalienable natural rights; sovereignty of the people; republicanism over monarchy, aristocracy, and other hereditary political power; civic virtue; and an intolerance of political corruption. The Founding Fathers of the United States, who included Washington, Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, James Madison, Thomas Paine, and many others, were inspired by Classical, Renaissance, and Enlightenment philosophies and ideas. Though in practical effect since its drafting in 1777, the Articles of Confederation was ratified in 1781 and formally established a decentralized government that operated until 1789. After the British surrender at the siege of Yorktown in 1781, American sovereignty was internationally recognized by the Treaty of Paris (1783), through which the U.S. 
gained territory stretching west to the Mississippi River, north to present-day Canada, and south to Spanish Florida. The Northwest Ordinance (1787) established the precedent by which the country's territory would expand with the admission of new states, rather than the expansion of existing states. The U.S. Constitution was drafted at the 1787 Constitutional Convention to overcome the limitations of the Articles. It went into effect in 1789, creating a federal republic governed by three separate branches that together formed a system of checks and balances. George Washington was elected the country's first president under the Constitution, and the Bill of Rights was adopted in 1791 to allay skeptics' concerns about the power of the more centralized government. Washington's resignation as commander-in-chief after the Revolutionary War and his later refusal to run for a third term as president established a precedent for the supremacy of civil authority in the United States and the peaceful transfer of power. In the late 18th century, American settlers began to expand westward in larger numbers, many with a sense of manifest destiny. The Louisiana Purchase of 1803 from France nearly doubled the territory of the United States. Lingering issues with Britain remained, leading to the War of 1812, which was fought to a draw. Spain ceded Florida and its Gulf Coast territory in 1819. The Missouri Compromise of 1820, which admitted Missouri as a slave state and Maine as a free state, attempted to balance the desire of northern states to prevent the expansion of slavery into new territories with that of southern states to extend it there. Primarily, the compromise prohibited slavery in all other lands of the Louisiana Purchase north of the 36°30′ parallel. As Americans expanded further into territory inhabited by Native Americans, the federal government implemented policies of Indian removal or assimilation.
The most significant such legislation was the Indian Removal Act of 1830, a key policy of President Andrew Jackson. It resulted in the Trail of Tears (1830–1850), in which an estimated 60,000 Native Americans living east of the Mississippi River were forcibly removed and displaced to lands far to the west, causing 13,200 to 16,700 deaths along the forced march. Settler expansion as well as this influx of Indigenous peoples from the East resulted in the American Indian Wars west of the Mississippi. During the colonial period, slavery became legal in all the Thirteen colonies, but by 1770 it provided the main labor force in the large-scale, agriculture-dependent economies of the Southern Colonies from Maryland to Georgia. The practice began to be significantly questioned during the American Revolution, and spurred by an active abolitionist movement that had reemerged in the 1830s, states in the North enacted laws to prohibit slavery within their boundaries. At the same time, support for slavery had strengthened in Southern states, with widespread use of inventions such as the cotton gin (1793) having made slavery immensely profitable for Southern elites. The United States annexed the Republic of Texas in 1845, and the 1846 Oregon Treaty led to U.S. control of the present-day American Northwest. Dispute with Mexico over Texas led to the Mexican–American War (1846–1848). After the victory of the U.S., Mexico recognized U.S. sovereignty over Texas, New Mexico, and California in the 1848 Mexican Cession; the cession's lands also included the future states of Nevada, Colorado and Utah. The California gold rush of 1848–1849 spurred a huge migration of white settlers to the Pacific coast, leading to even more confrontations with Native populations. One of the most violent, the California genocide of thousands of Native inhabitants, lasted into the mid-1870s. Additional western territories and states were created. 
Throughout the 1850s, the sectional conflict regarding slavery was further inflamed by national legislation in the U.S. Congress and decisions of the Supreme Court. In Congress, the Fugitive Slave Act of 1850 mandated the forcible return to their owners in the South of slaves taking refuge in non-slave states, while the Kansas–Nebraska Act of 1854 effectively gutted the anti-slavery requirements of the Missouri Compromise. In its Dred Scott decision of 1857, the Supreme Court ruled against a slave brought into non-slave territory, simultaneously declaring the entire Missouri Compromise to be unconstitutional. These and other events exacerbated tensions between North and South that would culminate in the American Civil War (1861–1865). Beginning with South Carolina, 11 slave-state governments voted to secede from the United States in 1861, joining to create the Confederate States of America. All other state governments remained loyal to the Union.[q] War broke out in April 1861 after the Confederacy bombarded Fort Sumter. Following the Emancipation Proclamation on January 1, 1863, many freed slaves joined the Union army. The war began to turn in the Union's favor following the 1863 Siege of Vicksburg and Battle of Gettysburg, and the Confederates surrendered in 1865 after the Union's victory in the Battle of Appomattox Court House. Efforts toward reconstruction in the secessionist South had begun as early as 1862, but it was only after President Lincoln's assassination that the three Reconstruction Amendments to the Constitution were ratified to protect civil rights. The amendments codified nationally the abolition of slavery and involuntary servitude except as punishment for crimes, promised equal protection under the law for all persons, and prohibited discrimination on the basis of race or previous enslavement. As a result, African Americans took an active political role in ex-Confederate states in the decade following the Civil War. 
The former Confederate states were readmitted to the Union, beginning with Tennessee in 1866 and ending with Georgia in 1870. National infrastructure, including transcontinental telegraph and railroads, spurred growth in the American frontier. This was accelerated by the Homestead Acts, through which nearly 10 percent of the total land area of the United States was given away free to some 1.6 million homesteaders. From 1865 through 1917, an unprecedented stream of immigrants arrived in the United States, including 24.4 million from Europe. Most came through the Port of New York, as New York City and other large cities on the East Coast became home to large Jewish, Irish, and Italian populations. Many Northern Europeans as well as significant numbers of Germans and other Central Europeans moved to the Midwest. At the same time, about one million French Canadians migrated from Quebec to New England. During the Great Migration, millions of African Americans left the rural South for urban areas in the North. Alaska was purchased from Russia in 1867. The Compromise of 1877 is generally considered the end of the Reconstruction era, as it resolved the electoral crisis following the 1876 presidential election and led President Rutherford B. Hayes to reduce the role of federal troops in the South. Immediately, the Redeemers began evicting the Carpetbaggers and quickly regained local control of Southern politics in the name of white supremacy. African Americans endured a period of heightened, overt racism following Reconstruction, a time often considered the nadir of American race relations. A series of Supreme Court decisions, including Plessy v. Ferguson, emptied the Fourteenth and Fifteenth Amendments of their force, allowing Jim Crow laws in the South to remain unchecked, sundown towns in the Midwest, and segregation in communities across the country, which would be reinforced in part by the policy of redlining later adopted by the federal Home Owners' Loan Corporation. 
An explosion of technological advancement, accompanied by the exploitation of cheap immigrant labor, led to rapid economic expansion during the Gilded Age of the late 19th century. It continued into the early 20th century, when the United States' economy already outpaced those of Britain, France, and Germany combined. This fostered the amassing of power by a few prominent industrialists, largely by their formation of trusts and monopolies to prevent competition. Tycoons led the nation's expansion in the railroad, petroleum, and steel industries. The United States emerged as a pioneer of the automotive industry. These changes resulted in significant increases in economic inequality, slum conditions, and social unrest, creating the environment for labor unions and socialist movements to begin to flourish. This period eventually ended with the advent of the Progressive Era, which was characterized by significant economic and social reforms. Pro-American elements in Hawaii overthrew the Hawaiian monarchy; the islands were annexed in 1898. That same year, Puerto Rico, the Philippines, and Guam were ceded to the U.S. by Spain after the latter's defeat in the Spanish–American War. (The Philippines was granted full independence from the U.S. on July 4, 1946, following World War II. Puerto Rico and Guam have remained U.S. territories.) American Samoa was acquired by the United States in 1900 after the Second Samoan Civil War. The U.S. Virgin Islands were purchased from Denmark in 1917. The United States entered World War I alongside the Allies in 1917, helping to turn the tide against the Central Powers. In 1920, a constitutional amendment granted nationwide women's suffrage. During the 1920s and 1930s, radio for mass communication and early television transformed communications nationwide. The Wall Street Crash of 1929 triggered the Great Depression, to which President Franklin D.
Roosevelt responded with the New Deal plan of "reform, recovery and relief", a series of unprecedented and sweeping recovery programs and employment relief projects combined with financial reforms and regulations. Initially neutral during World War II, the U.S. began supplying war materiel to the Allies of World War II in March 1941 and entered the war in December after Japan's attack on Pearl Harbor. Agreeing to a "Europe first" policy, the U.S. concentrated its wartime efforts on Japan's allies Italy and Germany until their final defeat in May 1945. The U.S. developed the first nuclear weapons and used them against the Japanese cities of Hiroshima and Nagasaki in August 1945, ending the war. The United States was one of the "Four Policemen" who met to plan the post-war world, alongside the United Kingdom, the Soviet Union, and China. The U.S. emerged relatively unscathed from the war, with even greater economic power and international political influence. The end of World War II in 1945 left the U.S. and the Soviet Union as superpowers, each with its own political, military, and economic sphere of influence. Geopolitical tensions between the two superpowers soon led to the Cold War. The U.S. implemented a policy of containment intended to limit the Soviet Union's sphere of influence; engaged in regime change against governments perceived to be aligned with the Soviets; and prevailed in the Space Race, which culminated with the first crewed Moon landing in 1969. Domestically, the U.S. experienced economic growth, urbanization, and population growth following World War II. The civil rights movement emerged, with Martin Luther King Jr. becoming a prominent leader in the early 1960s. The Great Society plan of President Lyndon B. Johnson's administration resulted in groundbreaking and broad-reaching laws, policies and a constitutional amendment to counteract some of the worst effects of lingering institutional racism. The counterculture movement in the U.S. 
brought significant social changes, including the liberalization of attitudes toward recreational drug use and sexuality. It also encouraged open defiance of the military draft (leading to the end of conscription in 1973) and wide opposition to U.S. intervention in Vietnam, with the U.S. totally withdrawing in 1975. A societal shift in the roles of women was significantly responsible for the large increase in female paid labor participation starting in the 1970s, and by 1985 the majority of American women aged 16 and older were employed. The Fall of Communism and the dissolution of the Soviet Union from 1989 to 1991 marked the end of the Cold War and left the United States as the world's sole superpower. This cemented the United States' global influence, reinforcing the concept of the "American Century" as the U.S. dominated international political, cultural, economic, and military affairs. The 1990s saw the longest recorded economic expansion in American history, a dramatic decline in U.S. crime rates, and advances in technology. Throughout this decade, technological innovations such as the World Wide Web, the evolution of the Pentium microprocessor in accordance with Moore's law, rechargeable lithium-ion batteries, the first gene therapy trial, and cloning either emerged in the U.S. or were improved upon there. The Human Genome Project was formally launched in 1990, while Nasdaq became the first stock market in the United States to trade online in 1998. In the Gulf War of 1991, an American-led international coalition of states expelled an Iraqi invasion force that had occupied neighboring Kuwait. The September 11 attacks on the United States in 2001 by the pan-Islamist militant organization al-Qaeda led to the war on terror and subsequent military interventions in Afghanistan and in Iraq. The U.S. housing bubble culminated in 2007 with the Great Recession, the largest economic contraction since the Great Depression. 
In the 2010s and early 2020s, the United States has experienced increased political polarization and democratic backsliding. The country's polarization was violently reflected in the January 2021 Capitol attack, when a mob of insurrectionists entered the U.S. Capitol and sought to prevent the peaceful transfer of power in an attempted self-coup d'état. Geography The United States is the world's third-largest country by total area behind Russia and Canada.[c] The 48 contiguous states and the District of Columbia have a combined area of 3,119,885 square miles (8,080,470 km2). In 2021, the United States had 8% of the Earth's permanent meadows and pastures and 10% of its cropland. Starting in the east, the coastal plain of the Atlantic seaboard gives way to inland forests and rolling hills in the Piedmont plateau region. The Appalachian Mountains and the Adirondack Massif separate the East Coast from the Great Lakes and the grasslands of the Midwest. The Mississippi River System, the world's fourth-longest river system, runs predominantly north–south through the center of the country. The flat and fertile prairie of the Great Plains stretches to the west, interrupted by a highland region in the southeast. The Rocky Mountains, west of the Great Plains, extend north to south across the country, peaking at over 14,000 feet (4,300 m) in Colorado. The supervolcano underlying Yellowstone National Park in the Rocky Mountains, the Yellowstone Caldera, is the continent's largest volcanic feature. Farther west are the rocky Great Basin and the Chihuahuan, Sonoran, and Mojave deserts. In the northwest corner of Arizona, carved by the Colorado River, is the Grand Canyon, a steep-sided canyon and popular tourist destination known for its overwhelming visual size and intricate, colorful landscape. The Cascade and Sierra Nevada mountain ranges run close to the Pacific coast. 
The lowest and highest points in the contiguous United States are in the State of California, about 84 miles (135 km) apart. At an elevation of 20,310 feet (6,190.5 m), Alaska's Denali (also called Mount McKinley) is the highest peak in the country and on the continent. Active volcanoes in the U.S. are common throughout Alaska's Alexander and Aleutian Islands. Located entirely outside North America, the archipelago of Hawaii consists of volcanic islands, physiographically and ethnologically part of the Polynesian subregion of Oceania. In addition to its total land area, the United States has one of the world's largest marine exclusive economic zones, spanning approximately 4.5 million square miles (11.7 million km2) of ocean. With its large size and geographic variety, the United States includes most climate types. East of the 100th meridian, the climate ranges from humid continental in the north to humid subtropical in the south. The western Great Plains are semi-arid. Many mountainous areas of the American West have an alpine climate. The climate is arid in the Southwest, Mediterranean in coastal California, and oceanic in coastal Oregon, Washington, and southern Alaska. Most of Alaska is subarctic or polar. Hawaii, the southern tip of Florida and U.S. territories in the Caribbean and Pacific are tropical. The United States receives more high-impact extreme weather incidents than any other country. States bordering the Gulf of Mexico are prone to hurricanes, and most of the world's tornadoes occur in the country, mainly in Tornado Alley. Due to climate change, extreme weather has become more frequent in the U.S. in the 21st century, with three times the number of reported heat waves compared to the 1960s. Since the 1990s, droughts in the American Southwest have become more persistent and more severe. The regions most attractive to the population are also among the most vulnerable. The U.S.
is one of 17 megadiverse countries containing large numbers of endemic species: about 17,000 species of vascular plants occur in the contiguous United States and Alaska, and over 1,800 species of flowering plants are found in Hawaii, few of which occur on the mainland. The United States is home to 428 mammal species, 784 birds, 311 reptiles, 295 amphibians, and around 91,000 insect species. There are 63 national parks, and hundreds of other federally managed monuments, forests, and wilderness areas, administered by the National Park Service and other agencies. About 28% of the country's land is publicly owned and federally managed, primarily in the Western States. Most of this land is protected, though some is leased for commercial use, and less than one percent is used for military purposes. Environmental issues in the United States include debates on non-renewable resources and nuclear energy, air and water pollution, biodiversity, logging and deforestation, and climate change. The U.S. Environmental Protection Agency (EPA) is the federal agency charged with addressing most environment-related issues. The idea of wilderness has shaped the management of public lands since the passage of the Wilderness Act in 1964. The Endangered Species Act of 1973 provides a way to protect threatened and endangered species and their habitats. The United States Fish and Wildlife Service implements and enforces the Act. In 2024, the U.S. ranked 35th among 180 countries in the Environmental Performance Index. Government and politics The United States is a federal republic of 50 states and a federal capital district, Washington, D.C. The U.S. asserts sovereignty over five unincorporated territories and several uninhabited island possessions. It is the world's oldest surviving federation, and its presidential system of federal government has been adopted, in whole or in part, by many newly independent states worldwide following their decolonization.
The Constitution of the United States serves as the country's supreme legal document. Most scholars describe the United States as a liberal democracy.[r] Composed of three branches, all headquartered in Washington, D.C., the federal government is the national government of the United States. The U.S. Constitution establishes a separation of powers intended to provide a system of checks and balances to prevent any of the three branches from becoming supreme. The three-branch system is known as the presidential system, in contrast to the parliamentary system where the executive is part of the legislative body. Many countries around the world adopted this aspect of the 1789 Constitution of the United States, especially in the postcolonial Americas. In the U.S. federal system, sovereign powers are shared between three levels of government specified in the Constitution: the federal government, the states, and Indian tribes. The U.S. also asserts sovereignty over five permanently inhabited territories: American Samoa, Guam, the Northern Mariana Islands, Puerto Rico, and the U.S. Virgin Islands. Residents of the 50 states are governed by their elected state government, under state constitutions compatible with the national constitution, and by elected local governments that are administrative divisions of a state. States are subdivided into counties or county equivalents, and (except for Hawaii) further divided into municipalities, each administered by elected representatives. The District of Columbia is a federal district containing the U.S. capital, Washington, D.C. The federal district is an administrative division of the federal government. Indian country is made up of 574 federally recognized tribes and 326 Indian reservations. They hold a government-to-government relationship with the U.S. federal government in Washington and are legally defined as domestic dependent nations with inherent tribal sovereignty rights. In addition to the five major territories, the U.S. 
also asserts sovereignty over the United States Minor Outlying Islands in the Pacific Ocean and the Caribbean. The seven undisputed islands without permanent populations are Baker Island, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Atoll, and Palmyra Atoll. U.S. sovereignty over the unpopulated Bajo Nuevo Bank, Navassa Island, Serranilla Bank, and Wake Island is disputed. The Constitution is silent on political parties. However, they developed independently in the 18th century with the Federalist and Anti-Federalist parties. Since then, the United States has operated as a de facto two-party system, though the parties have changed over time. Since the mid-19th century, the two main national parties have been the Democratic Party and the Republican Party. The former is perceived as relatively liberal in its political platform while the latter is perceived as relatively conservative in its platform. The United States has an established structure of foreign relations, with the world's second-largest diplomatic corps as of 2024. It is a permanent member of the United Nations Security Council and home to the United Nations headquarters. The United States is a member of the G7, G20, and OECD intergovernmental organizations. Almost all countries have embassies and many have consulates (official representatives) in the country. Likewise, nearly all countries maintain formal diplomatic relations with the United States, the exceptions being Iran, North Korea, and Bhutan. Though Taiwan does not have formal diplomatic relations with the U.S., it maintains close unofficial relations. The United States regularly supplies Taiwan with military equipment to deter potential Chinese aggression. U.S. geopolitical attention has also turned to the Indo-Pacific, where it joined the Quadrilateral Security Dialogue with Australia, India, and Japan.
The United States has a "Special Relationship" with the United Kingdom and strong ties with Canada, Australia, New Zealand, the Philippines, Japan, South Korea, Israel, and several European Union countries such as France, Italy, Germany, Spain, and Poland. The U.S. works closely with its NATO allies on military and national security issues, and with countries in the Americas through the Organization of American States and the United States–Mexico–Canada Agreement (USMCA). The U.S. exercises full international defense authority and responsibility for Micronesia, the Marshall Islands, and Palau through the Compact of Free Association. It has increasingly conducted strategic cooperation with India, while its ties with China have steadily deteriorated. Beginning in 2014, the U.S. became a key ally of Ukraine. After Donald Trump was elected U.S. president in 2024, he sought to negotiate an end to the Russo-Ukrainian War. He paused all military aid to Ukraine in March 2025, although the aid resumed later. Trump also ended U.S. intelligence sharing with the country, but this too was eventually restored. The president is the commander-in-chief of the United States Armed Forces and appoints its leaders, the secretary of defense and the Joint Chiefs of Staff. The Department of Defense, headquartered at the Pentagon near Washington, D.C., administers five of the six service branches, which are made up of the U.S. Army, Marine Corps, Navy, Air Force, and Space Force. The Coast Guard is administered by the Department of Homeland Security in peacetime and can be transferred to the Department of the Navy in wartime. The total strength of the military is about 1.3 million active-duty personnel, with an additional 400,000 in reserve. The United States spent $997 billion on its military in 2024, which is by far the largest amount of any country, making up 37% of global military spending and accounting for 3.4% of the country's GDP. The U.S.
possesses 42% of the world's nuclear weapons—the second-largest stockpile after that of Russia. The U.S. military is widely regarded as the most powerful and advanced in the world. The United States has the third-largest combined armed forces in the world, behind the Chinese People's Liberation Army and Indian Armed Forces. The U.S. military operates about 800 bases and facilities abroad, and maintains deployments of more than 100 active-duty personnel in 25 foreign countries. The United States has engaged in over 400 military interventions since its founding in 1776, with over half of these occurring between 1950 and 2019 and 25% occurring in the post-Cold War era. State defense forces (SDFs) are military units that operate under the sole authority of a state government. SDFs are authorized by state and federal law but are under the command of the state's governor. By contrast, the 54 U.S. National Guard organizations[t] fall under the dual control of state or territorial governments and the federal government; their units can also become federalized entities, but SDFs cannot be federalized. The National Guard personnel of a state or territory can be federalized by the president under the National Defense Act Amendments of 1933; this legislation created the Guard and provides for the integration of Army National Guard and Air National Guard units and personnel into the U.S. Army and (since 1947) the U.S. Air Force. The total number of National Guard members is about 430,000, while the estimated combined strength of SDFs is less than 10,000. There are about 18,000 police agencies in the United States, operating from the local to the national level. Law in the United States is mainly enforced by local police departments and sheriff departments in their municipal or county jurisdictions. The state police departments have authority in their respective state, and federal agencies such as the Federal Bureau of Investigation (FBI) and the U.S.
Marshals Service have national jurisdiction and specialized duties, such as protecting civil rights, national security, enforcing U.S. federal courts' rulings and federal laws, and interstate criminal activity. State courts conduct almost all civil and criminal trials, while federal courts adjudicate the much smaller number of civil and criminal cases that relate to federal law. There is no unified "criminal justice system" in the United States. The American prison system is largely heterogeneous, with thousands of relatively independent systems operating across federal, state, local, and tribal levels. In 2025, "these systems hold nearly 2 million people in 1,566 state prisons, 98 federal prisons, 3,116 local jails, 1,277 juvenile correctional facilities, 133 immigration detention facilities, and 80 Indian country jails, as well as in military prisons, civil commitment centers, state psychiatric hospitals, and prisons in the U.S. territories." Despite disparate systems of confinement, four main institutions dominate: federal prisons, state prisons, local jails, and juvenile correctional facilities. Federal prisons are run by the Federal Bureau of Prisons and hold pretrial detainees as well as people who have been convicted of federal crimes. State prisons, run by the department of corrections of each state, hold people sentenced and serving prison time (usually longer than one year) for felony offenses. Local jails are county or municipal facilities that incarcerate defendants prior to trial; they also hold those serving short sentences (typically under a year). Juvenile correctional facilities are operated by local or state governments and serve as longer-term placements for any minor adjudicated as delinquent and ordered by a judge to be confined.
In January 2023, the United States had the sixth-highest per capita incarceration rate in the world—531 people per 100,000 inhabitants—and the largest prison and jail population in the world, with more than 1.9 million people incarcerated. An analysis of the World Health Organization Mortality Database from 2010 showed U.S. homicide rates "were 7 times higher than in other high-income countries, driven by a gun homicide rate that was 25 times higher". Economy The U.S. has a highly developed mixed economy that has been the world's largest nominally since about 1890. Its 2024 gross domestic product (GDP)[e] of more than $29 trillion constituted over 25% of nominal global economic output, or 15% at purchasing power parity (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks first in the world by nominal GDP, second when adjusted for PPP, and ninth by PPP-adjusted GDP per capita. In February 2024, the total U.S. federal government debt was $34.4 trillion. Of the world's 500 largest companies by revenue, 138 were headquartered in the U.S. in 2025, the highest number of any country. The U.S. dollar is the currency most used in international transactions and the world's foremost reserve currency, backed by the country's dominant economy, its military, the petrodollar system, its large U.S. Treasuries market, and the linked eurodollar market. Several countries use it as their official currency, and in others it is the de facto currency. The U.S. has free trade agreements with several countries, including the USMCA partners Canada and Mexico. Although the United States has reached a post-industrial level of economic development and is often described as having a service economy, it remains a major industrial power; in 2024, the U.S. manufacturing sector was the world's second-largest by value output after China's.
New York City is the world's principal financial center, and its metropolitan area is the world's largest metropolitan economy. The New York Stock Exchange and Nasdaq, both located in New York City, are the world's two largest stock exchanges by market capitalization and trade volume. The United States is at the forefront of technological advancement and innovation in many economic fields, especially in artificial intelligence; electronics and computers; pharmaceuticals; and medical, aerospace and military equipment. The country's economy is fueled by abundant natural resources, a well-developed infrastructure, and high productivity. The largest trading partners of the United States are the European Union, Mexico, Canada, China, Japan, South Korea, the United Kingdom, Vietnam, India, and Taiwan. The United States is the world's largest importer and second-largest exporter.[u] It is by far the world's largest exporter of services. Americans have the highest average household and employee income among OECD member states, and had the fourth-highest median household income in 2023, up from sixth-highest in 2013. With personal consumption expenditures of over $18.5 trillion in 2023, the U.S. has a heavily consumer-driven economy and is the world's largest consumer market. The U.S. ranked first in the number of dollar billionaires and millionaires in 2023, with 735 billionaires and nearly 22 million millionaires. Wealth in the United States is highly concentrated; in 2011, the richest 10% of the adult population owned 72% of the country's household wealth, while the bottom 50% owned just 2%. U.S. wealth inequality has increased substantially since the late 1980s, and income inequality in the U.S. reached a record high in 2019. In 2024, the country had some of the highest wealth and income inequality levels among OECD countries. Since the 1970s, there has been a decoupling of U.S. wage gains from worker productivity.
In 2016, the top fifth of earners took home more than half of all income, giving the U.S. one of the widest income distributions among OECD countries. There were about 771,480 homeless persons in the U.S. in 2024. In 2022, 6.4 million children experienced food insecurity. Feeding America estimates that around one in five, or approximately 13 million, children experience hunger in the U.S. and do not know where or when they will get their next meal. Also in 2022, about 37.9 million people, or 11.5% of the U.S. population, were living in poverty. The United States has a smaller welfare state and redistributes less income through government action than most other high-income countries. It is the only advanced economy that does not guarantee its workers paid vacation nationally and one of a few countries in the world without federal paid family leave as a legal right. The United States has a higher percentage of low-income workers than almost any other developed country, largely because of a weak collective bargaining system and lack of government support for at-risk workers. The United States has been a leader in technological innovation since the late 19th century and scientific research since the mid-20th century. Methods for producing interchangeable parts and the establishment of a machine tool industry enabled the large-scale manufacturing of U.S. consumer products in the late 19th century. By the early 20th century, factory electrification, the introduction of the assembly line, and other labor-saving techniques created the system of mass production. In the 21st century, the United States continues to be one of the world's foremost scientific powers, though China has emerged as a major competitor in many fields. The U.S. has the highest research and development expenditures of any country and ranks ninth in R&D spending as a percentage of GDP. In 2022, the United States published the second-highest number of scientific papers of any country, after China.
In 2021, the U.S. ranked second (also after China) by the number of patent applications, and third by trademark and industrial design applications (after China and Germany), according to World Intellectual Property Indicators. In 2025 the United States ranked third (after Switzerland and Sweden) in the Global Innovation Index. The United States is considered to be a world leader in the development of artificial intelligence technology. In 2023, the United States was ranked the second most technologically advanced country in the world (after South Korea) by Global Finance magazine. The United States has maintained a space program since the late 1950s, beginning with the establishment of the National Aeronautics and Space Administration (NASA) in 1958. NASA's Apollo program (1961–1972) achieved the first crewed Moon landing with the 1969 Apollo 11 mission; it remains one of the agency's most significant milestones. Other major endeavors by NASA include the Space Shuttle program (1981–2011), the Voyager program (1972–present), the Hubble and James Webb space telescopes (launched in 1990 and 2021, respectively), and the multi-mission Mars Exploration Program (Spirit and Opportunity, Curiosity, and Perseverance). NASA is one of five agencies collaborating on the International Space Station (ISS); U.S. contributions to the ISS include several modules, including Destiny (2001), Harmony (2007), and Tranquility (2010), as well as ongoing logistical and operational support. The United States private sector dominates the global commercial spaceflight industry. Prominent American spaceflight contractors include Blue Origin, Boeing, Lockheed Martin, Northrop Grumman, and SpaceX. NASA programs such as the Commercial Crew Program, Commercial Resupply Services, Commercial Lunar Payload Services, and NextSTEP have facilitated growing private-sector involvement in American spaceflight. 
In 2023, the United States received approximately 84% of its energy from fossil fuels, and its largest source of energy was petroleum (38%), followed by natural gas (36%), renewable sources (9%), coal (9%), and nuclear power (9%). In 2022, the United States constituted about 4% of the world's population, but consumed around 16% of the world's energy. The U.S. ranks as the second-highest emitter of greenhouse gases behind China. The U.S. is the world's largest producer of nuclear power, generating around 30% of the world's nuclear electricity. It also has the highest number of nuclear power reactors of any country. As of 2024, the U.S. plans to triple its nuclear power capacity by 2050. The United States' 4 million miles (6.4 million kilometers) of road network, owned almost entirely by state and local governments, is the longest in the world. The extensive Interstate Highway System that connects all major U.S. cities is funded mostly by the federal government but maintained by state departments of transportation. The system is further extended by state highways and some private toll roads. In 2022, the U.S. was among the top ten countries in vehicle ownership per capita, with 850 vehicles per 1,000 people. A 2022 study found that 76% of U.S. commuters drive alone and 14% ride a bicycle, including bike owners and users of bike-sharing networks. About 11% use some form of public transportation. Public transportation in the United States is well developed in the largest urban areas, notably New York City, Washington, D.C., Boston, Philadelphia, Chicago, and San Francisco; otherwise, coverage is generally less extensive than in most other developed countries. The U.S. also has many relatively car-dependent localities. Long-distance intercity travel is provided primarily by airlines, but travel by rail is more common along the Northeast Corridor, the only high-speed rail line in the U.S. that meets international standards.
Amtrak, the country's government-sponsored national passenger rail company, has a relatively sparse network compared to that of Western European countries. Service is concentrated in the Northeast, California, the Midwest, the Pacific Northwest, and Virginia/Southeast. The United States has an extensive air transportation network. U.S. civilian airlines are all privately owned. The three largest airlines in the world, by total number of passengers carried, are U.S.-based; American Airlines became the global leader after its 2013 merger with US Airways. Sixteen of the world's 50 busiest airports are in the United States, including five of the top ten. The world's busiest airport by passenger volume is Hartsfield–Jackson Atlanta International in Atlanta, Georgia. In 2022, most of the 19,969 U.S. airports were owned and operated by local government authorities, while others were privately owned. Some 5,193 are designated as "public use", including for general aviation. The Transportation Security Administration (TSA) has provided security at most major airports since 2001. The country's rail transport network, the longest in the world at 182,412.3 mi (293,564.2 km), handles mostly freight (in contrast to more passenger-centered rail in Europe). Because they are often privately owned, U.S. railroads lag behind those of the rest of the world in electrification. The country's inland waterways are the world's fifth-longest, totaling 25,482 mi (41,009 km). They are used extensively for freight, recreation, and a small amount of passenger traffic. Of the world's 50 busiest container ports, four are located in the United States, with the busiest in the country being the Port of Los Angeles. Demographics The U.S. Census Bureau reported 331,449,281 residents on April 1, 2020,[v] making the United States the third-most-populous country in the world, after India and China.
The Census Bureau's official 2025 population estimate was 341,784,857, an increase of 3.1% since the 2020 census. According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S. population had a net gain of one person every 16 seconds, or about 5,400 people per day. In 2023, 51% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 34% had never been married. In 2023, the total fertility rate for the U.S. stood at 1.6 children per woman; in 2019, at 23%, the country had the world's highest rate of children living in single-parent households. Most Americans live in the suburbs of major metropolitan areas. The United States has a diverse population; 37 ancestry groups have more than one million members. White Americans with ancestry from Europe, the Middle East, or North Africa form the largest racial and ethnic group at 57.8% of the United States population. Hispanic and Latino Americans form the second-largest group and are 18.7% of the United States population. African Americans constitute the country's third-largest ancestry group and are 12.1% of the total U.S. population. Asian Americans are the country's fourth-largest group, making up 5.9% of the United States population. The country's 3.7 million Native Americans account for about 1%, and some 574 native tribes are recognized by the federal government. In 2024, the median age of the United States population was 39.1 years. While many languages and dialects are spoken in the United States, English is by far the most commonly spoken and written. English has long been the country's de facto official language, and in 2025 Executive Order 14224 declared it official. However, the U.S. has never had a de jure official language, as Congress has never passed a law designating English as official for all three federal branches. Some laws, such as U.S. naturalization requirements, nonetheless standardize English.
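The growth and per-day figures quoted above are straightforward arithmetic on the Census Bureau's published numbers; a quick sketch checking both (the variable names are illustrative, not from any source):

```python
# Sanity-check two derived figures from the paragraph above using only
# the Census Bureau numbers it cites; nothing here is new data.

census_2020 = 331_449_281    # April 1, 2020 census count
estimate_2025 = 341_784_857  # official 2025 population estimate

growth = (estimate_2025 - census_2020) / census_2020
print(f"{growth:.1%}")       # 3.1%, matching the stated increase

# Population Clock rate: one net person every 16 seconds.
seconds_per_day = 24 * 60 * 60
print(seconds_per_day / 16)  # 5400.0 -> "about 5,400 people per day"
```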
Twenty-eight states and the United States Virgin Islands have laws that designate English as the sole official language; 19 states and the District of Columbia have no official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English: Hawaii (Hawaiian), Alaska (twenty Native languages),[w] South Dakota (Sioux), American Samoa (Samoan), Puerto Rico (Spanish), Guam (Chamorro), and the Northern Mariana Islands (Carolinian and Chamorro). In total, 169 Native American languages are spoken in the United States. In Puerto Rico, Spanish is more widely spoken than English. According to the American Community Survey (2020), some 245.4 million people in the U.S. age five and older spoke only English at home. About 41.2 million spoke Spanish at home, making it the second most commonly used language. Other languages spoken at home by one million people or more include Chinese (3.40 million), Tagalog (1.71 million), Vietnamese (1.52 million), Arabic (1.39 million), French (1.18 million), Korean (1.07 million), and Russian (1.04 million). German, spoken by 1 million people at home in 2010, fell to 857,000 total speakers in 2020. America's immigrant population is by far the world's largest in absolute terms. In 2022, there were 87.7 million immigrants and U.S.-born children of immigrants in the United States, accounting for nearly 27% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 6% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants. In 2019, the top countries of origin for immigrants were Mexico (24% of immigrants), India (6%), China (5%), the Philippines (4.5%), and El Salvador (3%). In fiscal year 2022, over one million immigrants (most of whom entered through family reunification) were granted legal residence. 
The undocumented immigrant population in the U.S. reached a record high of 14 million in 2023. The First Amendment guarantees the free exercise of religion in the country and forbids Congress from passing laws respecting its establishment. Religious practice is widespread and among the most diverse in the world. The country has the world's largest Christian population, which includes the fourth-largest population of Catholics. Other notable faiths include Judaism, Buddhism, Hinduism, Islam, New Age, and Native American religions. Religious practice varies significantly by region. "Ceremonial deism" is common in American culture. The overwhelming majority of Americans believe in a higher power or spiritual force, engage in spiritual practices such as prayer, and consider themselves religious or spiritual. In the Southern United States' "Bible Belt", evangelical Protestantism plays a significant role culturally; New England and the Western United States tend to be more secular. Mormonism, a Restorationist movement founded in the U.S. in 1830, is the predominant religion in Utah and a major religion in Idaho. About 82% of Americans live in metropolitan areas, particularly in suburbs; about half of those reside in cities with populations over 50,000. In 2022, 333 incorporated municipalities had populations over 100,000, nine cities had more than one million residents, and four cities—New York City, Los Angeles, Chicago, and Houston—had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West. According to the Centers for Disease Control and Prevention (CDC), average U.S. life expectancy at birth reached 79.0 years in 2024, its highest recorded level. This was an increase of 0.6 years over 2023.
The CDC attributed the improvement to a significant fall in the number of fatal drug overdoses in the country, noting that "heart disease continues to be the leading cause of death in the United States, followed by cancer and unintentional injuries." In 2024, life expectancy at birth for American men rose to 76.5 years (+0.7 years compared to 2023), while life expectancy for women was 81.4 years (+0.3 years). Starting in 1998, life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been increasing ever since. The Commonwealth Fund reported in 2020 that the U.S. had the highest suicide rate among high-income countries. Approximately one-third of the U.S. adult population is obese and another third is overweight. The U.S. healthcare system far outspends that of any other country, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes when compared to peer countries for reasons that are debated. The United States is the only developed country without a system of universal healthcare, and a significant proportion of its population does not carry health insurance. Government-funded healthcare coverage for the poor (Medicaid) and for those age 65 and older (Medicare) is available to Americans who meet the programs' income or age qualifications. In 2010, then-President Obama signed the Patient Protection and Affordable Care Act into law.[x] Abortion in the United States is not federally protected, and is illegal or restricted in 17 states. American primary and secondary education, known in the U.S. as K–12 ("kindergarten through 12th grade"), is decentralized. School systems are operated by state, territorial, and sometimes municipal governments and regulated by the U.S. Department of Education.
In general, children are required to attend school or an approved homeschool from the age of five or six (kindergarten or first grade) until they are 18 years old. This often brings students through the 12th grade, the final year of a U.S. high school, but some states and territories allow them to leave school earlier, at age 16 or 17. The U.S. spends more on education per student than any other country, an average of $18,614 per year per public elementary and secondary school student in 2020–2021. Among Americans age 25 and older, 92.2% graduated from high school, 62.7% attended some college, 37.7% earned a bachelor's degree, and 14.2% earned a graduate degree. The U.S. literacy rate is near-universal. The U.S. has produced the most Nobel Prize winners of any country, with 411 (having won 413 awards). U.S. tertiary or higher education has earned a global reputation. Many of the world's top universities, as listed by various ranking organizations, are in the United States, including 19 of the top 25. American higher education is dominated by state university systems, although the country's many private universities and colleges enroll about 20% of all American students. Local community colleges generally offer open admissions, lower tuition, and coursework leading to a two-year associate degree or a non-degree certificate. In public expenditure on higher education, the U.S. spends more per student than the OECD average, and in combined public and private spending Americans spend more than any other nation. Colleges and universities directly funded by the federal government do not charge tuition and are limited to military personnel and government employees; they include the U.S. service academies, the Naval Postgraduate School, and the military staff colleges. Despite some student loan forgiveness programs, student loan debt increased by 102% between 2010 and 2020, and exceeded $1.7 trillion in 2022.
Culture and society The United States is home to a wide variety of ethnic groups, traditions, and customs. The country has been described as having the values of individualism and personal autonomy, as well as a strong work ethic and competitiveness. Voluntary altruism towards others also plays a major role; according to a 2016 study by the Charities Aid Foundation, Americans donated 1.44% of total GDP to charity—the highest rate in the world by a large margin. Americans have traditionally been characterized by a unifying political belief in an "American Creed" emphasizing consent of the governed, liberty, equality under the law, democracy, social equality, property rights, and a preference for limited government. The U.S. has acquired significant hard and soft power through its diplomatic influence, economic power, military alliances, and cultural exports such as American movies, music, video games, sports, and food. The influence that the United States exerts on other countries through soft power is referred to as Americanization. Nearly all present-day Americans or their ancestors came from Europe, Africa, or Asia (the "Old World") within the past five centuries. Mainstream American culture is a Western culture largely derived from the traditions of European immigrants with influences from many other sources, such as traditions brought by slaves from Africa. More recent immigration from Asia and especially Latin America has added to a cultural mix that has been described as both a homogenizing melting pot and a heterogeneous salad bowl, with immigrants contributing to, and often assimilating into, mainstream American culture. Under the First Amendment to the Constitution, the United States is considered to have the strongest protections of free speech of any country. Flag desecration, hate speech, blasphemy, and lese majesty are all forms of protected expression.
A 2016 Pew Research Center poll found that Americans were the most supportive of free expression of any polity measured. Additionally, they are the "most supportive of freedom of the press and the right to use the Internet without government censorship". The U.S. is a socially progressive country with permissive attitudes surrounding human sexuality. LGBTQ rights in the United States are among the most advanced by global standards. The American Dream, or the perception that Americans enjoy high levels of social mobility, plays a key role in attracting immigrants. Whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a classless society, scholars identify significant differences between the country's social classes, affecting socialization, language, and values. Americans tend to greatly value socioeconomic achievement, but being ordinary or average is promoted by some as a noble condition as well. The National Foundation on the Arts and the Humanities is an agency of the United States federal government that was established in 1965 to "develop and promote a broadly conceived national policy of support for the humanities and the arts in the United States, and for institutions which preserve the cultural heritage of the United States." It is composed of four sub-agencies. Colonial American authors were influenced by John Locke and other Enlightenment philosophers. The American Revolutionary Period (1765–1783) is notable for the political writings of Benjamin Franklin, Alexander Hamilton, Thomas Paine, and Thomas Jefferson. Shortly before and after the Revolutionary War, the newspaper rose to prominence, filling a demand for anti-British national literature. An early novel is William Hill Brown's The Power of Sympathy, published in 1789.
Writer and critic John Neal in the early- to mid-19th century helped advance America toward a unique literature and culture by criticizing predecessors such as Washington Irving for imitating their British counterparts, and by influencing writers such as Edgar Allan Poe, who took American poetry and short fiction in new directions. Ralph Waldo Emerson and Margaret Fuller pioneered the influential Transcendentalism movement; Henry David Thoreau, author of Walden, was influenced by this movement. The conflict surrounding abolitionism inspired writers, like Harriet Beecher Stowe, and authors of slave narratives, such as Frederick Douglass. Nathaniel Hawthorne's The Scarlet Letter (1850) explored the dark side of American history, as did Herman Melville's Moby-Dick (1851). Major American poets of the 19th century American Renaissance include Walt Whitman, Melville, and Emily Dickinson. Mark Twain was the first major American writer to be born in the West. Henry James achieved international recognition with novels like The Portrait of a Lady (1881). As literacy rates rose, periodicals published more stories centered around industrial workers, women, and the rural poor. Naturalism, regionalism, and realism were the major literary movements of the period. While modernism generally took on an international character, modernist authors working within the United States more often rooted their work in specific regions, peoples, and cultures. Following the Great Migration to northern cities, African-American and black West Indian authors of the Harlem Renaissance developed an independent tradition of literature that rebuked a history of inequality and celebrated black culture. An important cultural export during the Jazz Age, these writings were a key influence on Négritude, a philosophy emerging in the 1930s among francophone writers of the African diaspora. 
In the 1950s, an ideal of homogeneity led many authors to attempt to write the Great American Novel, while the Beat Generation rejected this conformity, using styles that elevated the impact of the spoken word over mechanics to describe drug use, sexuality, and the failings of society. Contemporary literature is more pluralistic than in previous eras, with the closest thing to a unifying feature being a trend toward self-conscious experiments with language. Twelve American laureates have won the Nobel Prize in Literature. Media in the United States is broadly uncensored, with the First Amendment providing significant protections, as reiterated in New York Times Co. v. United States. The four major broadcasters in the U.S. are the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), American Broadcasting Company (ABC), and Fox Broadcasting Company (Fox). The four major broadcast television networks are all commercial entities. The U.S. cable television system offers hundreds of channels catering to a variety of niches. In 2021, about 83% of Americans over age 12 listened to broadcast radio, while about 40% listened to podcasts. In the prior year, there were 15,460 licensed full-power radio stations in the U.S. according to the Federal Communications Commission (FCC). Much of the public radio broadcasting is supplied by National Public Radio (NPR), incorporated in February 1970 under the Public Broadcasting Act of 1967. U.S. newspapers with a global reach and reputation include The Wall Street Journal, The New York Times, The Washington Post, and USA Today. About 800 publications are produced in Spanish. With few exceptions, newspapers are privately owned, either by large chains such as Gannett or McClatchy, which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in an increasingly rare situation, by individuals or families. 
Major cities often have alternative newspapers to complement the mainstream daily papers, such as The Village Voice in New York City and LA Weekly in Los Angeles. The five most-visited websites in the world are Google, YouTube, Facebook, Instagram, and ChatGPT—all of them American-owned. Other popular platforms used include X (formerly Twitter) and Amazon. In 2025, the U.S. was the world's second-largest video game market by revenue (after China). In 2015, the U.S. video game industry consisted of 2,457 companies that supported around 220,000 jobs and generated $30.4 billion in revenue. There are 444 game publishers, developers, and hardware companies in California alone. According to the Game Developers Conference (GDC), the U.S. is the top location for video game development, with 58% of the world's game developers based there in 2025. The United States is well known for its theater. Mainstream theater in the United States derives from the old European theatrical tradition and has been heavily influenced by the British theater. By the middle of the 19th century, America had created new distinct dramatic forms in the Tom Shows, the showboat theater and the minstrel show. The central hub of the American theater scene is the Theater District in Manhattan, with its divisions of Broadway, off-Broadway, and off-off-Broadway. Many movie and television celebrities have gotten their big break working in New York productions. Outside New York City, many cities have professional regional or resident theater companies that produce their own seasons. The biggest-budget theatrical productions are musicals. U.S. theater has an active community theater culture. The Tony Awards recognize excellence in live Broadway theater and are presented at an annual ceremony in Manhattan. The awards are given for Broadway productions and performances. One is also given for regional theater.
Several discretionary non-competitive awards are given as well, including a Special Tony Award, the Tony Honors for Excellence in Theatre, and the Isabelle Stevenson Award. Folk art in colonial America grew out of artisanal craftsmanship in communities that allowed commonly trained people to individually express themselves. It was distinct from Europe's tradition of high art, which was less accessible and generally less relevant to early American settlers. Cultural movements in art and craftsmanship in colonial America generally lagged behind those of Western Europe. For example, the prevailing medieval style of woodworking and primitive sculpture became integral to early American folk art, despite the emergence of Renaissance styles in England in the late 16th and early 17th centuries. The new English styles would have been early enough to make a considerable impact on American folk art, but American styles and forms had already been firmly adopted. Not only did styles change slowly in early America, but there was a tendency for rural artisans there to continue their traditional forms longer than their urban counterparts did—and far longer than those in Western Europe. The Hudson River School was a mid-19th-century movement in the visual arts tradition of European naturalism. The 1913 Armory Show in New York City, an exhibition of European modernist art, shocked the public and transformed the U.S. art scene. American Realism and American Regionalism sought to reflect and give America new ways of looking at itself. Georgia O'Keeffe, Marsden Hartley, and others experimented with new and individualistic styles, which would become known as American modernism. Major artistic movements such as the abstract expressionism of Jackson Pollock and Willem de Kooning and the pop art of Andy Warhol and Roy Lichtenstein developed largely in the United States. 
Major photographers include Alfred Stieglitz, Edward Steichen, Dorothea Lange, Edward Weston, James Van Der Zee, Ansel Adams, and Gordon Parks. The tide of modernism and then postmodernism has brought global fame to American architects, including Frank Lloyd Wright, Philip Johnson, and Frank Gehry. The Metropolitan Museum of Art in Manhattan is the largest art museum in the United States and the fourth-largest in the world. American folk music encompasses numerous music genres, variously known as traditional music, traditional folk music, contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and sometimes trace back to such origins as the British Isles, mainland Europe, or Africa. The rhythmic and lyrical styles of African-American music in particular have influenced American music. Banjos were brought to America through the slave trade. Minstrel shows incorporating the instrument into their acts led to its increased popularity and widespread production in the 19th century. The electric guitar, invented in the 1930s and mass-produced by the 1940s, had an enormous influence on popular music, in particular due to the development of rock and roll. The synthesizer, turntablism, and electronic music were also largely developed in the U.S. Elements from folk idioms such as the blues and old-time music were adopted and transformed into popular genres with global audiences. Jazz grew from blues and ragtime in the early 20th century, developing from the innovations and recordings of composers such as W.C. Handy and Jelly Roll Morton. Louis Armstrong and Duke Ellington increased its popularity early in the 20th century. Country music developed in the 1920s, bluegrass and rhythm and blues in the 1940s, and rock and roll in the 1950s. In the 1960s, Bob Dylan emerged from the folk revival to become one of the country's most celebrated songwriters.
The musical forms of punk and hip hop both originated in the United States in the 1970s. The United States has the world's largest music market, with a total retail value of $15.9 billion in 2022. Most of the world's major record companies are based in the U.S.; they are represented by the Recording Industry Association of America (RIAA). Mid-20th-century American pop stars, such as Frank Sinatra and Elvis Presley, became global celebrities and best-selling music artists, as have artists of the late 20th century, such as Michael Jackson, Madonna, Whitney Houston, and Mariah Carey, and of the early 21st century, such as Eminem, Britney Spears, Lady Gaga, Katy Perry, Taylor Swift and Beyoncé. The United States has the world's largest apparel market by revenue. Apart from professional business attire, American fashion is eclectic and predominantly informal. Americans' diverse cultural roots are reflected in their clothing; however, sneakers, jeans, T-shirts, and baseball caps are emblematic of American styles. New York, with its Fashion Week, is considered to be one of the "Big Four" global fashion capitals, along with Paris, Milan, and London. A study found that proximity to Manhattan's Garment District has been synonymous with American fashion since the industry's emergence in the early 20th century. A number of well-known designer labels, among them Tommy Hilfiger, Ralph Lauren, Tom Ford and Calvin Klein, are headquartered in Manhattan. Some labels cater to niche markets, such as preteens. New York Fashion Week is one of the most influential fashion shows in the world, and is held twice each year in Manhattan; the annual Met Gala, also in Manhattan, has been called the fashion world's "biggest night". The U.S. film industry has a worldwide influence and following. Hollywood, a district in central Los Angeles, the nation's second-most populous city, is also metonymous for the American filmmaking industry.
The major film studios of the United States are the primary source of the most commercially successful movies in the world. Largely centered in the New York City region from its beginnings in the late 19th century through the first decades of the 20th century, the U.S. film industry has since been primarily based in and around Hollywood. Nonetheless, American film companies have been subject to the forces of globalization in the 21st century, and an increasing number of films are made elsewhere. The Academy Awards, popularly known as "the Oscars", have been held annually by the Academy of Motion Picture Arts and Sciences since 1929, and the Golden Globe Awards have been held annually since January 1944. The industry peaked in what is commonly referred to as the "Golden Age of Hollywood", from the early sound period until the early 1960s, with screen actors such as John Wayne and Marilyn Monroe becoming iconic figures. In the 1970s, "New Hollywood", or the "Hollywood Renaissance", was defined by grittier films influenced by French and Italian realist pictures of the post-war period. The 21st century has been marked by the rise of American streaming platforms, which came to rival traditional cinema. Early settlers were introduced by Native Americans to foods such as turkey, sweet potatoes, corn, squash, and maple syrup. Among the most enduring and pervasive examples are variations of the native dish succotash. Early settlers and later immigrants combined these with foods they were familiar with, such as wheat flour, beef, and milk, to create a distinctive American cuisine. New World crops, especially pumpkin, corn, potatoes, and turkey as the main course, are part of a shared national menu on Thanksgiving, when many Americans prepare or purchase traditional dishes to celebrate the occasion.
Characteristic American dishes such as apple pie, fried chicken, doughnuts, french fries, macaroni and cheese, ice cream, hamburgers, hot dogs, and American pizza derive from the recipes of various immigrant groups. Mexican dishes such as burritos and tacos preexisted the United States in areas later annexed from Mexico, and adaptations of Chinese cuisine as well as pasta dishes freely adapted from Italian sources are all widely consumed. American chefs have had a significant impact on society both domestically and internationally. In 1946, the Culinary Institute of America was founded by Katharine Angell and Frances Roth. This would become the United States' most prestigious culinary school, where many of the most talented American chefs would study prior to successful careers. The United States restaurant industry was projected at $899 billion in sales for 2020, and employed more than 15 million people, representing 10% of the nation's workforce directly. It is the country's second-largest private employer and the third-largest employer overall. The United States is home to over 220 Michelin star-rated restaurants, 70 of which are in New York City. Wine has been produced in what is now the United States since the 1500s, with the first widespread production beginning in what is now New Mexico in 1628. In the modern U.S., wine production is undertaken in all fifty states, with California producing 84 percent of all U.S. wine. With more than 1,100,000 acres (4,500 km2) under vine, the United States is the fourth-largest wine-producing country in the world, after Italy, Spain, and France. The classic American diner, a casual restaurant type originally intended for the working class, emerged during the 19th century from converted railroad dining cars made stationary. The diner soon evolved into purpose-built structures whose number expanded greatly in the 20th century. The American fast-food industry developed alongside the nation's car culture. 
American restaurants developed the drive-in format in the 1920s, which they began to replace with the drive-through format by the 1940s. American fast-food restaurant chains, such as McDonald's, Burger King, Chick-fil-A, Kentucky Fried Chicken, Dunkin' Donuts and many others, have numerous outlets around the world. The most popular spectator sports in the U.S. are American football, basketball, baseball, soccer, and ice hockey. Their premier leagues are, respectively, the National Football League, the National Basketball Association, Major League Baseball, Major League Soccer, and the National Hockey League. All these leagues enjoy wide-ranging domestic media coverage and, except for the MLS, all are considered the preeminent leagues in their respective sports in the world. While most major U.S. sports such as baseball and American football have evolved out of European practices, basketball, volleyball, skateboarding, and snowboarding are American inventions, many of which have become popular worldwide. Lacrosse and surfing arose from Native American and Native Hawaiian activities that predate European contact. The market for professional sports in the United States was approximately $69 billion in July 2013, roughly 50% larger than that of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States. Although American football does not have a substantial following in other nations, the NFL does have the highest average attendance (67,254) of any professional sports league in the world. In 2024, the NFL generated over $23 billion in revenue, making it the most valuable professional sports league in the United States and the world. Baseball has been regarded as the U.S. "national sport" since the late 19th century. The most-watched individual sports in the U.S. are golf and auto racing, particularly NASCAR and IndyCar.
On the collegiate level, earnings for the member institutions exceed $1 billion annually, and college football and basketball attract large audiences, as the NCAA March Madness tournament and the College Football Playoff are some of the most watched national sporting events. In the U.S., the intercollegiate sports level serves as the main feeder system for professional and Olympic sports, with significant exceptions such as Minor League Baseball. This differs greatly from practices in nearly all other countries, where publicly and privately funded sports organizations serve this function. Eight Olympic Games have taken place in the United States. The 1904 Summer Olympics in St. Louis, Missouri, were the first-ever Olympic Games held outside of Europe. The Olympic Games will be held in the U.S. for a ninth time when Los Angeles hosts the 2028 Summer Olympics. U.S. athletes have won a total of 2,968 medals (1,179 gold) at the Olympic Games, the most of any country. In other international competition, the United States is the home of a number of prestigious events, including the America's Cup, World Baseball Classic, the U.S. Open, and the Masters Tournament. The U.S. men's national soccer team has qualified for eleven World Cups, while the women's national team has won the FIFA Women's World Cup and Olympic soccer tournament four and five times, respectively. The 1999 FIFA Women's World Cup was hosted by the United States. Its final match was attended by 90,185, setting the world record for largest women's sporting event crowd at the time. The United States hosted the 1994 FIFA World Cup and will co-host, along with Canada and Mexico, the 2026 FIFA World Cup. This article incorporates text from a free content work, licensed under CC BY-SA IGO 3.0: World Food and Agriculture – Statistical Yearbook 2023, FAO.
United States The United States of America (USA), also known as the United States (U.S.) or America, is a country primarily located in North America. It is a federal republic of 50 states and a federal capital district, Washington, D.C. The 48 contiguous states border Canada to the north and Mexico to the south, with the semi-exclave of Alaska in the northwest and the archipelago of Hawaii in the Pacific Ocean. The United States also asserts sovereignty over five major island territories and various uninhabited islands in Oceania and the Caribbean.[j] It is a megadiverse country, with the world's third-largest land area[c] and third-largest population, exceeding 341 million.[k] Paleo-Indians first migrated from North Asia to North America at least 15,000 years ago, and formed various civilizations. Spanish colonization established Spanish Florida in 1513, the first European colony in what is now the continental United States. British colonization followed with the 1607 settlement of Virginia, the first of the Thirteen Colonies. Enslavement of Africans was practiced in all colonies by 1770 and supplied most of the labor for the Southern Colonies' plantation economy. Clashes with the British Crown began as a civil protest over the illegality of taxation without representation in Parliament and the denial of other English rights. They evolved into the American Revolution, which led to the Declaration of Independence and a society based on universal rights. Victory in the 1775–1783 Revolutionary War brought international recognition of U.S. sovereignty and fueled westward expansion, further dispossessing native inhabitants. As more states were admitted, a North–South division over slavery led the Confederate States of America to declare secession and fight the Union in the 1861–1865 American Civil War. With the United States' victory and reunification, slavery was abolished nationally. By the late 19th century, the U.S.
economy outpaced the French, German and British economies combined. As of 1900, the country had established itself as a great power, a status solidified after its involvement in World War I. Following Japan's attack on Pearl Harbor in 1941, the U.S. entered World War II. Its aftermath left the U.S. and the Soviet Union as rival superpowers, competing for ideological dominance and international influence during the Cold War. The Soviet Union's collapse in 1991 ended the Cold War, leaving the U.S. as the world's sole superpower. The U.S. federal government is a representative democracy with a president and a constitution that grants separation of powers under three branches: legislative, executive, and judicial. The United States Congress is a bicameral national legislature composed of the House of Representatives (a lower house based on population) and the Senate (an upper house based on equal representation for each state). Federalism grants substantial autonomy to the 50 states. In addition, 574 Native American tribes have sovereignty rights, and there are 326 Native American reservations. Since the 1850s, the Democratic and Republican parties have dominated American politics. American ideals and values are based on a democratic tradition inspired by the American Enlightenment movement. A developed country, the U.S. ranks high in economic competitiveness, innovation, and higher education. Accounting for over a quarter of nominal global GDP, its economy has been the world's largest since about 1890. It is the wealthiest country, with the highest disposable household income per capita among OECD members, though its wealth inequality is highly pronounced. Shaped by centuries of immigration, the culture of the U.S. is diverse and globally influential. Making up more than a third of global military spending, the country has one of the strongest armed forces and is a designated nuclear state. A member of numerous international organizations, the U.S. 
plays a major role in global political, cultural, economic, and military affairs. Etymology Documented use of the phrase "United States of America" dates back to January 2, 1776. On that day, Stephen Moylan, a Continental Army aide to General George Washington, wrote a letter to Joseph Reed, Washington's aide-de-camp, seeking to go "with full and ample powers from the United States of America to Spain" to seek assistance in the Revolutionary War effort. The first known public usage is an anonymous essay published in the Williamsburg newspaper The Virginia Gazette on April 6, 1776. Sometime on or after June 11, 1776, Thomas Jefferson wrote "United States of America" in a rough draft of the Declaration of Independence, which was adopted by the Second Continental Congress on July 4, 1776. The term "United States" and its initialism "U.S.", used as nouns or as adjectives in English, are common short names for the country. The initialism "USA", a noun, is also common. "United States" and "U.S." are the established terms throughout the U.S. federal government, with prescribed rules.[l] "The States" is an established colloquial shortening of the name, used particularly from abroad; "stateside" is the corresponding adjective or adverb. "America" is the feminine form of the first word of Americus Vesputius, the Latinized name of Italian explorer Amerigo Vespucci (1454–1512);[m] it was first used as a place name by the German cartographers Martin Waldseemüller and Matthias Ringmann in 1507.[n] Vespucci first proposed that the West Indies discovered by Christopher Columbus in 1492 were part of a previously unknown landmass and not among the Indies at the eastern limit of Asia. In English, the term "America" rarely refers to topics unrelated to the United States, even though "the Americas" describes the totality of the continents of North and South America.
History The first inhabitants of North America migrated from Siberia approximately 15,000 years ago, either across the Bering land bridge or along the now-submerged Ice Age coastline. Small isolated groups of hunter-gatherers are said to have migrated alongside herds of large herbivores far into Alaska, with ice-free corridors developing along the Pacific coast and valleys of North America in c. 16,500 – c. 13,500 BCE (c. 18,500 – c. 15,500 BP). The Clovis culture, which appeared around 11,000 BCE, is believed to be the first widespread culture in the Americas. Over time, Indigenous North American cultures grew increasingly sophisticated, and some, such as the Mississippian culture, developed agriculture, architecture, and complex societies. In the post-archaic period, the Mississippian cultures were located in the midwestern, eastern, and southern regions, and the Algonquian in the Great Lakes region and along the Eastern Seaboard, while the Hohokam culture and Ancestral Puebloans inhabited the Southwest. Native population estimates of what is now the United States before the arrival of European colonizers range from around 500,000 to nearly 10 million. Christopher Columbus began exploring the Caribbean for Spain in 1492, leading to Spanish-speaking settlements and missions from what are now Puerto Rico and Florida to New Mexico and California. The first Spanish colony in the present-day continental United States was Spanish Florida, chartered in 1513. After several settlements failed there due to starvation and disease, Spain's first permanent town, Saint Augustine, was founded in 1565. France established its own settlements in French Florida in 1562, but they were either abandoned (Charlesfort, 1578) or destroyed by Spanish raids (Fort Caroline, 1565). Permanent French settlements were founded much later along the Great Lakes (Fort Detroit, 1701), the Mississippi River (Saint Louis, 1764) and especially the Gulf of Mexico (New Orleans, 1718). 
Early European colonies also included the thriving Dutch colony of New Netherland (settled 1626, present-day New York) and the small Swedish colony of New Sweden (settled 1638 in what became Delaware). British colonization of the East Coast began with the Virginia Colony (1607) and the Plymouth Colony (Massachusetts, 1620). The Mayflower Compact in Massachusetts and the Fundamental Orders of Connecticut established precedents for local representative self-governance and constitutionalism that would develop throughout the American colonies. While European settlers in what is now the United States experienced conflicts with Native Americans, they also engaged in trade, exchanging European tools for food and animal pelts.[o] Relations ranged from close cooperation to warfare and massacres. The colonial authorities often pursued policies that forced Native Americans to adopt European lifestyles, including conversion to Christianity. Along the eastern seaboard, settlers trafficked Africans through the Atlantic slave trade, largely to provide manual labor on plantations. The original Thirteen Colonies[p] that would later found the United States were administered as possessions of the British Empire by Crown-appointed governors, though local governments held elections open to most white male property owners. The colonial population grew rapidly from Maine to Georgia, eclipsing Native American populations; by the 1770s, the natural increase of the population was such that only a small minority of Americans had been born overseas. The colonies' distance from Britain facilitated the entrenchment of self-governance, and the First Great Awakening, a series of Christian revivals, fueled colonial interest in guaranteed religious liberty. Following its victory in the French and Indian War, Britain began to assert greater control over local affairs in the Thirteen Colonies, resulting in growing political resistance.
One of the primary grievances of the colonists was the denial of their rights as Englishmen, particularly the right to representation in the British government that taxed them. To demonstrate their dissatisfaction and resolve, the First Continental Congress met in 1774 and passed the Continental Association, a colonial boycott of British goods enforced by local "committees of safety" that proved effective. The British attempt to then disarm the colonists resulted in the 1775 Battles of Lexington and Concord, igniting the American Revolutionary War. At the Second Continental Congress, the colonies appointed George Washington commander-in-chief of the Continental Army, and created a committee that named Thomas Jefferson to draft the Declaration of Independence. Two days after the Second Continental Congress passed the Lee Resolution to create an independent, sovereign nation, the Declaration was adopted on July 4, 1776. The political values of the American Revolution evolved from an armed rebellion demanding reform within an empire to a revolution that created a new social and governing system founded on the defense of liberty and the protection of inalienable natural rights; sovereignty of the people; republicanism over monarchy, aristocracy, and other hereditary political power; civic virtue; and an intolerance of political corruption. The Founding Fathers of the United States, who included Washington, Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, James Madison, Thomas Paine, and many others, were inspired by Classical, Renaissance, and Enlightenment philosophies and ideas. Though in practical effect since its drafting in 1777, the Articles of Confederation was ratified in 1781 and formally established a decentralized government that operated until 1789. After the British surrender at the siege of Yorktown in 1781, American sovereignty was internationally recognized by the Treaty of Paris (1783), through which the U.S. 
gained territory stretching west to the Mississippi River, north to present-day Canada, and south to Spanish Florida. The Northwest Ordinance (1787) established the precedent by which the country's territory would expand with the admission of new states, rather than the expansion of existing states. The U.S. Constitution was drafted at the 1787 Constitutional Convention to overcome the limitations of the Articles. It went into effect in 1789, creating a federal republic governed by three separate branches that together formed a system of checks and balances. George Washington was elected the country's first president under the Constitution, and the Bill of Rights was adopted in 1791 to allay skeptics' concerns about the power of the more centralized government. Washington's resignation as commander-in-chief after the Revolutionary War and his later refusal to run for a third term as the country's first president established a precedent for the supremacy of civil authority in the United States and the peaceful transfer of power. In the late 18th century, American settlers began to expand westward in larger numbers, many with a sense of manifest destiny. The Louisiana Purchase of 1803 from France nearly doubled the territory of the United States. Lingering issues with Britain led to the War of 1812, which was fought to a draw. Spain ceded Florida and its Gulf Coast territory in 1819. The Missouri Compromise of 1820, which admitted Missouri as a slave state and Maine as a free state, attempted to balance the desire of northern states to prevent the expansion of slavery into new territories with that of southern states to extend it there. Primarily, the compromise prohibited slavery in all other lands of the Louisiana Purchase north of the 36°30′ parallel. As Americans expanded further into territory inhabited by Native Americans, the federal government implemented policies of Indian removal or assimilation.
The most significant such legislation was the Indian Removal Act of 1830, a key policy of President Andrew Jackson. It resulted in the Trail of Tears (1830–1850), in which an estimated 60,000 Native Americans living east of the Mississippi River were forcibly removed and displaced to lands far to the west, causing 13,200 to 16,700 deaths along the forced march. Settler expansion as well as this influx of Indigenous peoples from the East resulted in the American Indian Wars west of the Mississippi. During the colonial period, slavery became legal in all of the Thirteen Colonies, but by 1770 it provided the main labor force in the large-scale, agriculture-dependent economies of the Southern Colonies from Maryland to Georgia. The practice began to be significantly questioned during the American Revolution, and spurred by an active abolitionist movement that had reemerged in the 1830s, states in the North enacted laws to prohibit slavery within their boundaries. At the same time, support for slavery had strengthened in Southern states, with widespread use of inventions such as the cotton gin (1793) having made slavery immensely profitable for Southern elites. The United States annexed the Republic of Texas in 1845, and the 1846 Oregon Treaty led to U.S. control of the present-day American Northwest. Dispute with Mexico over Texas led to the Mexican–American War (1846–1848). After the victory of the U.S., Mexico recognized U.S. sovereignty over Texas, New Mexico, and California in the 1848 Mexican Cession; the cession's lands also included the future states of Nevada, Colorado, and Utah. The California gold rush of 1848–1849 spurred a huge migration of white settlers to the Pacific coast, leading to even more confrontations with Native populations. One of the most violent, the California genocide of thousands of Native inhabitants, lasted into the mid-1870s. Additional western territories and states were created.
Throughout the 1850s, the sectional conflict regarding slavery was further inflamed by national legislation in the U.S. Congress and decisions of the Supreme Court. In Congress, the Fugitive Slave Act of 1850 mandated the forcible return to their owners in the South of slaves taking refuge in non-slave states, while the Kansas–Nebraska Act of 1854 effectively gutted the anti-slavery requirements of the Missouri Compromise. In its Dred Scott decision of 1857, the Supreme Court ruled against a slave brought into non-slave territory, simultaneously declaring the entire Missouri Compromise to be unconstitutional. These and other events exacerbated tensions between North and South that would culminate in the American Civil War (1861–1865). Beginning with South Carolina, 11 slave-state governments voted to secede from the United States in 1861, joining to create the Confederate States of America. All other state governments remained loyal to the Union.[q] War broke out in April 1861 after the Confederacy bombarded Fort Sumter. Following the Emancipation Proclamation on January 1, 1863, many freed slaves joined the Union army. The war began to turn in the Union's favor following the 1863 Siege of Vicksburg and Battle of Gettysburg, and the Confederates surrendered in 1865 after the Union's victory in the Battle of Appomattox Court House. Efforts toward reconstruction in the secessionist South had begun as early as 1862, but it was only after President Lincoln's assassination that the three Reconstruction Amendments to the Constitution were ratified to protect civil rights. The amendments codified nationally the abolition of slavery and involuntary servitude except as punishment for crimes, promised equal protection under the law for all persons, and prohibited discrimination on the basis of race or previous enslavement. As a result, African Americans took an active political role in ex-Confederate states in the decade following the Civil War. 
The former Confederate states were readmitted to the Union, beginning with Tennessee in 1866 and ending with Georgia in 1870. National infrastructure, including transcontinental telegraph and railroads, spurred growth in the American frontier. This was accelerated by the Homestead Acts, through which nearly 10 percent of the total land area of the United States was given away free to some 1.6 million homesteaders. From 1865 through 1917, an unprecedented stream of immigrants arrived in the United States, including 24.4 million from Europe. Most came through the Port of New York, as New York City and other large cities on the East Coast became home to large Jewish, Irish, and Italian populations. Many Northern Europeans as well as significant numbers of Germans and other Central Europeans moved to the Midwest. At the same time, about one million French Canadians migrated from Quebec to New England. During the Great Migration, millions of African Americans left the rural South for urban areas in the North. Alaska was purchased from Russia in 1867. The Compromise of 1877 is generally considered the end of the Reconstruction era, as it resolved the electoral crisis following the 1876 presidential election and led President Rutherford B. Hayes to reduce the role of federal troops in the South. Immediately, the Redeemers began evicting the Carpetbaggers and quickly regained local control of Southern politics in the name of white supremacy. African Americans endured a period of heightened, overt racism following Reconstruction, a time often considered the nadir of American race relations. A series of Supreme Court decisions, including Plessy v. Ferguson, emptied the Fourteenth and Fifteenth Amendments of their force, allowing Jim Crow laws to go unchecked in the South, sundown towns to persist in the Midwest, and segregation to take hold in communities across the country; these patterns were later reinforced in part by the policy of redlining adopted by the federal Home Owners' Loan Corporation.
An explosion of technological advancement, accompanied by the exploitation of cheap immigrant labor, led to rapid economic expansion during the Gilded Age of the late 19th century. It continued into the early 20th century, by which time the United States outpaced the economies of Britain, France, and Germany combined. This fostered the amassing of power by a few prominent industrialists, largely by their formation of trusts and monopolies to prevent competition. Tycoons led the nation's expansion in the railroad, petroleum, and steel industries. The United States emerged as a pioneer of the automotive industry. These changes resulted in significant increases in economic inequality, slum conditions, and social unrest, creating the environment for labor unions and socialist movements to begin to flourish. This period eventually ended with the advent of the Progressive Era, which was characterized by significant economic and social reforms. Pro-American elements in Hawaii overthrew the Hawaiian monarchy; the islands were annexed in 1898. That same year, Puerto Rico, the Philippines, and Guam were ceded to the U.S. by Spain after the latter's defeat in the Spanish–American War. (The Philippines was granted full independence from the U.S. on July 4, 1946, following World War II. Puerto Rico and Guam have remained U.S. territories.) American Samoa was acquired by the United States in 1900 after the Second Samoan Civil War. The U.S. Virgin Islands were purchased from Denmark in 1917. The United States entered World War I alongside the Allies in 1917, helping to turn the tide against the Central Powers. In 1920, a constitutional amendment granted nationwide women's suffrage. During the 1920s and 1930s, radio for mass communication and early television transformed communications nationwide. The Wall Street Crash of 1929 triggered the Great Depression, to which President Franklin D.
Roosevelt responded with the New Deal plan of "reform, recovery and relief", a series of unprecedented and sweeping recovery programs and employment relief projects combined with financial reforms and regulations. Initially neutral during World War II, the U.S. began supplying war materiel to the Allies of World War II in March 1941 and entered the war in December after Japan's attack on Pearl Harbor. Agreeing to a "Europe first" policy, the U.S. concentrated its wartime efforts on Japan's allies Italy and Germany until their final defeat in May 1945. The U.S. developed the first nuclear weapons and used them against the Japanese cities of Hiroshima and Nagasaki in August 1945, ending the war. The United States was one of the "Four Policemen" who met to plan the post-war world, alongside the United Kingdom, the Soviet Union, and China. The U.S. emerged relatively unscathed from the war, with even greater economic power and international political influence. The end of World War II in 1945 left the U.S. and the Soviet Union as superpowers, each with its own political, military, and economic sphere of influence. Geopolitical tensions between the two superpowers soon led to the Cold War. The U.S. implemented a policy of containment intended to limit the Soviet Union's sphere of influence; engaged in regime change against governments perceived to be aligned with the Soviets; and prevailed in the Space Race, which culminated with the first crewed Moon landing in 1969. Domestically, the U.S. experienced economic growth, urbanization, and population growth following World War II. The civil rights movement emerged, with Martin Luther King Jr. becoming a prominent leader in the early 1960s. The Great Society plan of President Lyndon B. Johnson's administration resulted in groundbreaking and broad-reaching laws, policies and a constitutional amendment to counteract some of the worst effects of lingering institutional racism. The counterculture movement in the U.S. 
brought significant social changes, including the liberalization of attitudes toward recreational drug use and sexuality. It also encouraged open defiance of the military draft (leading to the end of conscription in 1973) and wide opposition to U.S. intervention in Vietnam, with the U.S. completing its withdrawal in 1975. A societal shift in the roles of women was significantly responsible for the large increase in female paid labor participation starting in the 1970s, and by 1985 the majority of American women aged 16 and older were employed. The fall of communism and the dissolution of the Soviet Union from 1989 to 1991 marked the end of the Cold War and left the United States as the world's sole superpower. This cemented the United States' global influence, reinforcing the concept of the "American Century" as the U.S. dominated international political, cultural, economic, and military affairs. The 1990s saw the longest recorded economic expansion in American history, a dramatic decline in U.S. crime rates, and advances in technology. Throughout this decade, technological innovations such as the World Wide Web, the evolution of the Pentium microprocessor in accordance with Moore's law, rechargeable lithium-ion batteries, the first gene therapy trial, and cloning either emerged in the U.S. or were improved upon there. The Human Genome Project was formally launched in 1990, while Nasdaq became the first stock market in the United States to trade online in 1998. In the Gulf War of 1991, an American-led international coalition of states expelled an Iraqi invasion force that had occupied neighboring Kuwait. The September 11 attacks on the United States in 2001 by the pan-Islamist militant organization al-Qaeda led to the war on terror and subsequent military interventions in Afghanistan and in Iraq. The U.S. housing bubble culminated in 2007 with the Great Recession, the largest economic contraction since the Great Depression. 
In the 2010s and early 2020s, the United States has experienced increased political polarization and democratic backsliding. The country's polarization was violently reflected in the January 2021 Capitol attack, when a mob of insurrectionists entered the U.S. Capitol and sought to prevent the peaceful transfer of power in an attempted self-coup d'état. Geography The United States is the world's third-largest country by total area behind Russia and Canada.[c] The 48 contiguous states and the District of Columbia have a combined area of 3,119,885 square miles (8,080,470 km2). In 2021, the United States had 8% of the Earth's permanent meadows and pastures and 10% of its cropland. Starting in the east, the coastal plain of the Atlantic seaboard gives way to inland forests and rolling hills in the Piedmont plateau region. The Appalachian Mountains and the Adirondack Massif separate the East Coast from the Great Lakes and the grasslands of the Midwest. The Mississippi River System, the world's fourth-longest river system, runs predominantly north–south through the center of the country. The flat and fertile prairie of the Great Plains stretches to the west, interrupted by a highland region in the southeast. The Rocky Mountains, west of the Great Plains, extend north to south across the country, peaking at over 14,000 feet (4,300 m) in Colorado. The supervolcano underlying Yellowstone National Park in the Rocky Mountains, the Yellowstone Caldera, is the continent's largest volcanic feature. Farther west are the rocky Great Basin and the Chihuahuan, Sonoran, and Mojave deserts. In the northwest corner of Arizona, carved by the Colorado River, is the Grand Canyon, a steep-sided canyon and popular tourist destination known for its overwhelming visual size and intricate, colorful landscape. The Cascade and Sierra Nevada mountain ranges run close to the Pacific coast. 
The lowest and highest points in the contiguous United States are in the State of California, about 84 miles (135 km) apart. At an elevation of 20,310 feet (6,190.5 m), Alaska's Denali (also called Mount McKinley) is the highest peak in the country and on the continent. Active volcanoes in the U.S. are common throughout Alaska's Alexander and Aleutian Islands. Located entirely outside North America, the archipelago of Hawaii consists of volcanic islands, physiographically and ethnologically part of the Polynesian subregion of Oceania. In addition to its total land area, the United States has one of the world's largest marine exclusive economic zones spanning approximately 4.5 million square miles (11.7 million km2) of ocean. With its large size and geographic variety, the United States includes most climate types. East of the 100th meridian, the climate ranges from humid continental in the north to humid subtropical in the south. The western Great Plains are semi-arid. Many mountainous areas of the American West have an alpine climate. The climate is arid in the Southwest, Mediterranean in coastal California, and oceanic in coastal Oregon, Washington, and southern Alaska. Most of Alaska is subarctic or polar. Hawaii, the southern tip of Florida and U.S. territories in the Caribbean and Pacific are tropical. The United States receives more high-impact extreme weather incidents than any other country. States bordering the Gulf of Mexico are prone to hurricanes, and most of the world's tornadoes occur in the country, mainly in Tornado Alley. Due to climate change in the country, extreme weather has become more frequent in the U.S. in the 21st century, with three times the number of reported heat waves compared to the 1960s. Since the 1990s, droughts in the American Southwest have become more persistent and more severe. The regions most attractive to a growing population are often also the most vulnerable to such hazards. The U.S. 
is one of 17 megadiverse countries containing large numbers of endemic species: about 17,000 species of vascular plants occur in the contiguous United States and Alaska, and over 1,800 species of flowering plants are found in Hawaii, few of which occur on the mainland. The United States is home to 428 mammal species, 784 birds, 311 reptiles, 295 amphibians, and around 91,000 insect species. There are 63 national parks, and hundreds of other federally managed monuments, forests, and wilderness areas, administered by the National Park Service and other agencies. About 28% of the country's land is publicly owned and federally managed, primarily in the Western States. Most of this land is protected, though some is leased for commercial use, and less than one percent is used for military purposes. Environmental issues in the United States include debates on non-renewable resources and nuclear energy, air and water pollution, biodiversity, logging and deforestation, and climate change. The U.S. Environmental Protection Agency (EPA) is the federal agency charged with addressing most environmental-related issues. The idea of wilderness has shaped the management of public lands since 1964, with the Wilderness Act. The Endangered Species Act of 1973 provides a way to protect threatened and endangered species and their habitats. The United States Fish and Wildlife Service implements and enforces the Act. In 2024, the U.S. ranked 35th among 180 countries in the Environmental Performance Index. Government and politics The United States is a federal republic of 50 states and a federal capital district, Washington, D.C. The U.S. asserts sovereignty over five unincorporated territories and several uninhabited island possessions. It is the world's oldest surviving federation, and its presidential system of federal government has been adopted, in whole or in part, by many newly independent states worldwide following their decolonization. 
The Constitution of the United States serves as the country's supreme legal document. Most scholars describe the United States as a liberal democracy.[r] Composed of three branches, all headquartered in Washington, D.C., the federal government is the national government of the United States. The U.S. Constitution establishes a separation of powers intended to provide a system of checks and balances to prevent any of the three branches from becoming supreme. The three-branch system is known as the presidential system, in contrast to the parliamentary system where the executive is part of the legislative body. Many countries around the world adopted this aspect of the 1789 Constitution of the United States, especially in the postcolonial Americas. In the U.S. federal system, sovereign powers are shared between three levels of government specified in the Constitution: the federal government, the states, and Indian tribes. The U.S. also asserts sovereignty over five permanently inhabited territories: American Samoa, Guam, the Northern Mariana Islands, Puerto Rico, and the U.S. Virgin Islands. Residents of the 50 states are governed by their elected state government, under state constitutions compatible with the national constitution, and by elected local governments that are administrative divisions of a state. States are subdivided into counties or county equivalents, and (except for Hawaii) further divided into municipalities, each administered by elected representatives. The District of Columbia is a federal district containing the U.S. capital, Washington, D.C. The federal district is an administrative division of the federal government. Indian country is made up of 574 federally recognized tribes and 326 Indian reservations. They hold a government-to-government relationship with the U.S. federal government in Washington and are legally defined as domestic dependent nations with inherent tribal sovereignty rights. In addition to the five major territories, the U.S. 
also asserts sovereignty over the United States Minor Outlying Islands in the Pacific Ocean and the Caribbean. The seven undisputed islands without permanent populations are Baker Island, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Atoll, and Palmyra Atoll. U.S. sovereignty over the unpopulated Bajo Nuevo Bank, Navassa Island, Serranilla Bank, and Wake Island is disputed. The Constitution is silent on political parties. However, they developed independently in the 18th century with the Federalist and Anti-Federalist parties. Since then, the United States has operated as a de facto two-party system, though the parties have changed over time. Since the mid-19th century, the two main national parties have been the Democratic Party and the Republican Party. The former is perceived as relatively liberal in its political platform, while the latter is perceived as relatively conservative. The United States has an established structure of foreign relations, with the world's second-largest diplomatic corps as of 2024. It is a permanent member of the United Nations Security Council and home to the United Nations headquarters. The United States is a member of the G7, G20, and OECD intergovernmental organizations. Almost all countries have embassies and many have consulates (official representatives) in the country. Likewise, nearly all countries maintain formal diplomatic relations with the United States, the exceptions being Iran, North Korea, and Bhutan. Though Taiwan does not have formal diplomatic relations with the U.S., it maintains close unofficial relations. The United States regularly supplies Taiwan with military equipment to deter potential Chinese aggression. Its geopolitical attention also turned to the Indo-Pacific when the United States joined the Quadrilateral Security Dialogue with Australia, India, and Japan. 
The United States has a "Special Relationship" with the United Kingdom and strong ties with Canada, Australia, New Zealand, the Philippines, Japan, South Korea, Israel, and several European Union countries such as France, Italy, Germany, Spain, and Poland. The U.S. works closely with its NATO allies on military and national security issues, and with countries in the Americas through the Organization of American States and the United States–Mexico–Canada Agreement. The U.S. exercises full international defense authority and responsibility for Micronesia, the Marshall Islands, and Palau through the Compact of Free Association. It has increasingly conducted strategic cooperation with India, while its ties with China have steadily deteriorated. Beginning in 2014, the U.S. became a key ally of Ukraine. After Donald Trump was elected U.S. president in 2024, he sought to negotiate an end to the Russo-Ukrainian War. In March 2025, he paused all military aid to Ukraine and ended U.S. intelligence sharing with the country; both were later restored. The president is the commander-in-chief of the United States Armed Forces and appoints its leaders, the secretary of defense and the Joint Chiefs of Staff. The Department of Defense, headquartered at the Pentagon near Washington, D.C., administers five of the six service branches, which are made up of the U.S. Army, Marine Corps, Navy, Air Force, and Space Force. The Coast Guard is administered by the Department of Homeland Security in peacetime and can be transferred to the Department of the Navy in wartime. The total strength of the military is about 1.3 million active-duty personnel, with an additional 400,000 in reserve. The United States spent $997 billion on its military in 2024, which is by far the largest amount of any country, making up 37% of global military spending and accounting for 3.4% of the country's GDP. The U.S. 
possesses 42% of the world's nuclear weapons, the second-largest stockpile after Russia's. The U.S. military is widely regarded as the most powerful and advanced in the world. The United States has the third-largest combined armed forces in the world, behind the Chinese People's Liberation Army and Indian Armed Forces. The U.S. military operates about 800 bases and facilities abroad, and maintains deployments of more than 100 active-duty personnel in 25 foreign countries. The United States has engaged in over 400 military interventions since its founding in 1776, with over half of these occurring between 1950 and 2019 and 25% occurring in the post-Cold War era. State defense forces (SDFs) are military units that operate under the sole authority of a state government. SDFs are authorized by state and federal law but are under the command of the state's governor. By contrast, the 54 U.S. National Guard organizations[t] fall under the dual control of state or territorial governments and the federal government; their units can also become federalized entities, but SDFs cannot be federalized. The National Guard personnel of a state or territory can be federalized by the president under the National Defense Act Amendments of 1933; this legislation created the Guard and provides for the integration of Army National Guard and Air National Guard units and personnel into the U.S. Army and (since 1947) the U.S. Air Force. The total number of National Guard members is about 430,000, while the estimated combined strength of SDFs is less than 10,000. There are about 18,000 U.S. police agencies from local to national level in the United States. Law in the United States is mainly enforced by local police departments and sheriff departments in their municipal or county jurisdictions. The state police departments have authority in their respective state, and federal agencies such as the Federal Bureau of Investigation (FBI) and the U.S. 
Marshals Service have national jurisdiction and specialized duties, such as protecting civil rights, safeguarding national security, enforcing U.S. federal courts' rulings and federal laws, and investigating interstate criminal activity. State courts conduct almost all civil and criminal trials, while federal courts adjudicate the much smaller number of civil and criminal cases that relate to federal law. There is no unified "criminal justice system" in the United States. The American prison system is largely heterogeneous, with thousands of relatively independent systems operating across federal, state, local, and tribal levels. In 2025, "these systems hold nearly 2 million people in 1,566 state prisons, 98 federal prisons, 3,116 local jails, 1,277 juvenile correctional facilities, 133 immigration detention facilities, and 80 Indian country jails, as well as in military prisons, civil commitment centers, state psychiatric hospitals, and prisons in the U.S. territories." Despite disparate systems of confinement, four main institutions dominate: federal prisons, state prisons, local jails, and juvenile correctional facilities. Federal prisons are run by the Federal Bureau of Prisons and hold pretrial detainees as well as people who have been convicted of federal crimes. State prisons, run by the department of corrections of each state, hold people sentenced and serving prison time (usually longer than one year) for felony offenses. Local jails are county or municipal facilities that incarcerate defendants prior to trial; they also hold those serving short sentences (typically under a year). Juvenile correctional facilities are operated by local or state governments and serve as longer-term placements for any minor adjudicated as delinquent and ordered by a judge to be confined. 
In January 2023, the United States had the sixth-highest per capita incarceration rate in the world—531 people per 100,000 inhabitants—and the largest prison and jail population in the world, with more than 1.9 million people incarcerated. An analysis of the World Health Organization Mortality Database from 2010 showed U.S. homicide rates "were 7 times higher than in other high-income countries, driven by a gun homicide rate that was 25 times higher". Economy The U.S. has a highly developed mixed economy that has been the world's largest nominally since about 1890. Its 2024 gross domestic product (GDP)[e] of more than $29 trillion constituted over 25% of nominal global economic output, or 15% at purchasing power parity (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks first in the world by nominal GDP, second when adjusted for purchasing power parities (PPP), and ninth by PPP-adjusted GDP per capita. In February 2024, the total U.S. federal government debt was $34.4 trillion. Of the world's 500 largest companies by revenue, 138 were headquartered in the U.S. in 2025, the highest number of any country. The U.S. dollar is the currency most used in international transactions and the world's foremost reserve currency, backed by the country's dominant economy, its military, the petrodollar system, its large U.S. treasuries market, and its linked eurodollar. Several countries use it as their official currency, and in others it is the de facto currency. The U.S. has free trade agreements with several countries, including the USMCA. Although the United States has reached a post-industrial level of economic development and is often described as having a service economy, it remains a major industrial power; in 2024, the U.S. manufacturing sector was the world's second-largest by value output after China's. 
New York City is the world's principal financial center, and its metropolitan area is the world's largest metropolitan economy. The New York Stock Exchange and Nasdaq, both located in New York City, are the world's two largest stock exchanges by market capitalization and trade volume. The United States is at the forefront of technological advancement and innovation in many economic fields, especially in artificial intelligence; electronics and computers; pharmaceuticals; and medical, aerospace and military equipment. The country's economy is fueled by abundant natural resources, a well-developed infrastructure, and high productivity. The largest trading partners of the United States are the European Union, Mexico, Canada, China, Japan, South Korea, the United Kingdom, Vietnam, India, and Taiwan. The United States is the world's largest importer and second-largest exporter.[u] It is by far the world's largest exporter of services. Americans have the highest average household and employee income among OECD member states, and the fourth-highest median household income in 2023, up from sixth-highest in 2013. With personal consumption expenditures of over $18.5 trillion in 2023, the U.S. has a heavily consumer-driven economy and is the world's largest consumer market. The U.S. ranked first in the number of dollar billionaires and millionaires in 2023, with 735 billionaires and nearly 22 million millionaires. Wealth in the United States is highly concentrated; in 2011, the richest 10% of the adult population owned 72% of the country's household wealth, while the bottom 50% owned just 2%. U.S. wealth inequality increased substantially since the late 1980s, and income inequality in the U.S. reached a record high in 2019. In 2024, the country had some of the highest wealth and income inequality levels among OECD countries. Since the 1970s, there has been a decoupling of U.S. wage gains from worker productivity. 
In 2016, the top fifth of earners took home more than half of all income, giving the U.S. one of the widest income distributions among OECD countries. There were about 771,480 homeless persons in the U.S. in 2024. In 2022, 6.4 million children experienced food insecurity. Feeding America estimates that around one in five, or approximately 13 million, children experience hunger in the U.S. and do not know where or when they will get their next meal. Also in 2022, about 37.9 million people, or 11.5% of the U.S. population, were living in poverty. The United States has a smaller welfare state and redistributes less income through government action than most other high-income countries. It is the only advanced economy that does not guarantee its workers paid vacation nationally and one of a few countries in the world without federal paid family leave as a legal right. The United States has a higher percentage of low-income workers than almost any other developed country, largely because of a weak collective bargaining system and lack of government support for at-risk workers. The United States has been a leader in technological innovation since the late 19th century and scientific research since the mid-20th century. Methods for producing interchangeable parts and the establishment of a machine tool industry enabled the large-scale manufacturing of U.S. consumer products in the late 19th century. By the early 20th century, factory electrification, the introduction of the assembly line, and other labor-saving techniques created the system of mass production. In the 21st century, the United States continues to be one of the world's foremost scientific powers, though China has emerged as a major competitor in many fields. The U.S. has the highest research and development expenditures of any country and ranks ninth as a percentage of GDP. In 2022, the United States was (after China) the country with the second-highest number of published scientific papers. 
In 2021, the U.S. ranked second (also after China) by the number of patent applications, and third by trademark and industrial design applications (after China and Germany), according to World Intellectual Property Indicators. In 2025 the United States ranked third (after Switzerland and Sweden) in the Global Innovation Index. The United States is considered to be a world leader in the development of artificial intelligence technology. In 2023, the United States was ranked the second most technologically advanced country in the world (after South Korea) by Global Finance magazine. The United States has maintained a space program since the late 1950s, beginning with the establishment of the National Aeronautics and Space Administration (NASA) in 1958. NASA's Apollo program (1961–1972) achieved the first crewed Moon landing with the 1969 Apollo 11 mission; it remains one of the agency's most significant milestones. Other major endeavors by NASA include the Space Shuttle program (1981–2011), the Voyager program (1972–present), the Hubble and James Webb space telescopes (launched in 1990 and 2021, respectively), and the multi-mission Mars Exploration Program (Spirit and Opportunity, Curiosity, and Perseverance). NASA is one of five agencies collaborating on the International Space Station (ISS); U.S. contributions to the ISS include several modules, including Destiny (2001), Harmony (2007), and Tranquility (2010), as well as ongoing logistical and operational support. The United States private sector dominates the global commercial spaceflight industry. Prominent American spaceflight contractors include Blue Origin, Boeing, Lockheed Martin, Northrop Grumman, and SpaceX. NASA programs such as the Commercial Crew Program, Commercial Resupply Services, Commercial Lunar Payload Services, and NextSTEP have facilitated growing private-sector involvement in American spaceflight. 
In 2023, the United States received approximately 84% of its energy from fossil fuels, and its largest source of energy was petroleum (38%), followed by natural gas (36%), renewable sources (9%), coal (9%), and nuclear power (9%). In 2022, the United States constituted about 4% of the world's population, but consumed around 16% of the world's energy. The U.S. ranks as the second-highest emitter of greenhouse gases behind China. The U.S. is the world's largest producer of nuclear power, generating around 30% of the world's nuclear electricity. It also has the highest number of nuclear power reactors of any country. As of 2024, the U.S. plans to triple its nuclear power capacity by 2050. The United States' 4 million miles (6.4 million kilometers) of road network, owned almost entirely by state and local governments, is the longest in the world. The extensive Interstate Highway System that connects all major U.S. cities is funded mostly by the federal government but maintained by state departments of transportation. The system is further extended by state highways and some private toll roads. In 2022, the U.S. was among the top ten countries in vehicle ownership per capita (850 vehicles per 1,000 people). A 2022 study found that 76% of U.S. commuters drive alone and 14% ride a bicycle, including bike owners and users of bike-sharing networks. About 11% use some form of public transportation. Public transportation in the United States is well developed in the largest urban areas, notably New York City, Washington, D.C., Boston, Philadelphia, Chicago, and San Francisco; otherwise, coverage is generally less extensive than in most other developed countries. The U.S. also has many relatively car-dependent localities. Long-distance intercity travel is provided primarily by airlines, but travel by rail is more common along the Northeast Corridor, the only high-speed rail in the U.S. that meets international standards. 
Amtrak, the country's government-sponsored national passenger rail company, has a relatively sparse network compared to that of Western European countries. Service is concentrated in the Northeast, California, the Midwest, the Pacific Northwest, and Virginia/Southeast. The United States has an extensive air transportation network. U.S. civilian airlines are all privately owned. The three largest airlines in the world, by total number of passengers carried, are U.S.-based; American Airlines became the global leader after its 2013 merger with US Airways. Of the 50 busiest airports in the world, 16 are in the United States, as well as five of the top 10. The world's busiest airport by passenger volume is Hartsfield–Jackson Atlanta International in Atlanta, Georgia. In 2022, most of the 19,969 U.S. airports were owned and operated by local government authorities, and there are also some private airports. Some 5,193 are designated as "public use", including for general aviation. The Transportation Security Administration (TSA) has provided security at most major airports since 2001. The country's rail transport network, the longest in the world at 182,412.3 mi (293,564.2 km), handles mostly freight (in contrast to more passenger-centered rail in Europe). Because they are often privately owned operations, U.S. railroads lag behind those of the rest of the world in terms of electrification. The country's inland waterways are the world's fifth-longest, totaling 25,482 mi (41,009 km). They are used extensively for freight, recreation, and a small amount of passenger traffic. Of the world's 50 busiest container ports, four are located in the United States, with the busiest in the country being the Port of Los Angeles. Demographics The U.S. Census Bureau reported 331,449,281 residents on April 1, 2020,[v] making the United States the third-most-populous country in the world, after India and China. 
The Census Bureau's official 2025 population estimate was 341,784,857, an increase of 3.1% since the 2020 census. According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S. population had a net gain of one person every 16 seconds, or about 5,400 people per day. In 2023, 51% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 34% had never been married. In 2023, the total fertility rate for the U.S. stood at 1.6 children per woman, and, at 23%, it had the world's highest rate of children living in single-parent households in 2019. Most Americans live in the suburbs of major metropolitan areas. The United States has a diverse population; 37 ancestry groups have more than one million members. White Americans with ancestry from Europe, the Middle East, or North Africa form the largest racial and ethnic group at 57.8% of the United States population. Hispanic and Latino Americans form the second-largest group and are 18.7% of the United States population. African Americans constitute the country's third-largest ancestry group and are 12.1% of the total U.S. population. Asian Americans are the country's fourth-largest group, composing 5.9% of the United States population. The country's 3.7 million Native Americans account for about 1%, and some 574 native tribes are recognized by the federal government. In 2024, the median age of the United States population was 39.1 years. While many languages and dialects are spoken in the United States, English is by far the most commonly spoken and written. English has long been the country's de facto official language, and in 2025 Executive Order 14224 declared it the official language. However, the U.S. has never had a de jure official language, as Congress has never passed a law designating English as official for all three federal branches. Some laws, such as U.S. naturalization requirements, nonetheless standardize English. 
Twenty-eight states and the United States Virgin Islands have laws that designate English as the sole official language; 19 states and the District of Columbia have no official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English: Hawaii (Hawaiian), Alaska (twenty Native languages),[w] South Dakota (Sioux), American Samoa (Samoan), Puerto Rico (Spanish), Guam (Chamorro), and the Northern Mariana Islands (Carolinian and Chamorro). In total, 169 Native American languages are spoken in the United States. In Puerto Rico, Spanish is more widely spoken than English. According to the American Community Survey (2020), some 245.4 million people in the U.S. age five and older spoke only English at home. About 41.2 million spoke Spanish at home, making it the second most commonly used language. Other languages spoken at home by one million people or more include Chinese (3.40 million), Tagalog (1.71 million), Vietnamese (1.52 million), Arabic (1.39 million), French (1.18 million), Korean (1.07 million), and Russian (1.04 million). German, spoken by 1 million people at home in 2010, fell to 857,000 total speakers in 2020. America's immigrant population is by far the world's largest in absolute terms. In 2022, there were 87.7 million immigrants and U.S.-born children of immigrants in the United States, accounting for nearly 27% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 6% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants. In 2019, the top countries of origin for immigrants were Mexico (24% of immigrants), India (6%), China (5%), the Philippines (4.5%), and El Salvador (3%). In fiscal year 2022, over one million immigrants (most of whom entered through family reunification) were granted legal residence. 
The undocumented immigrant population in the U.S. reached a record high of 14 million in 2023. The First Amendment guarantees the free exercise of religion in the country and forbids Congress from passing laws respecting its establishment. Religious practice is widespread, among the most diverse in the world, and profoundly vibrant. The country has the world's largest Christian population, which includes the fourth-largest population of Catholics. Other notable faiths include Judaism, Buddhism, Hinduism, Islam, New Age, and Native American religions. Religious practice varies significantly by region. "Ceremonial deism" is common in American culture. The overwhelming majority of Americans believe in a higher power or spiritual force, engage in spiritual practices such as prayer, and consider themselves religious or spiritual. In the Southern United States' "Bible Belt", evangelical Protestantism plays a significant role culturally; New England and the Western United States tend to be more secular. Mormonism, a Restorationist movement founded in the U.S. in 1830, is the predominant religion in Utah and a major religion in Idaho. About 82% of Americans live in metropolitan areas, particularly in suburbs; about half of those reside in cities with populations over 50,000. In 2022, 333 incorporated municipalities had populations over 100,000, nine cities had more than one million residents, and four cities—New York City, Los Angeles, Chicago, and Houston—had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West. According to the Centers for Disease Control and Prevention (CDC), average U.S. life expectancy at birth reached 79.0 years in 2024, its highest recorded level. This was an increase of 0.6 years over 2023. 
The CDC attributed the improvement to a significant fall in the number of fatal drug overdoses in the country, noting that "heart disease continues to be the leading cause of death in the United States, followed by cancer and unintentional injuries." In 2024, life expectancy at birth for American men rose to 76.5 years (+0.7 years compared to 2023), while life expectancy for women was 81.4 years (+0.3 years). Starting in 1998, life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been increasing ever since. The Commonwealth Fund reported in 2020 that the U.S. had the highest suicide rate among high-income countries. Approximately one-third of the U.S. adult population is obese and another third is overweight. The U.S. healthcare system far outspends that of any other country, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes when compared to peer countries for reasons that are debated. The United States is the only developed country without a system of universal healthcare, and a significant proportion of its population does not carry health insurance. Government-funded healthcare coverage for the poor (Medicaid) and for those age 65 and older (Medicare) is available to Americans who meet the programs' income or age qualifications. In 2010, then-President Obama signed the Patient Protection and Affordable Care Act into law.[x] Abortion in the United States is not federally protected, and is illegal or restricted in 17 states. American primary and secondary education, known in the U.S. as K–12 ("kindergarten through 12th grade"), is decentralized. School systems are operated by state, territorial, and sometimes municipal governments and regulated by the U.S. Department of Education. 
In general, children are required to attend school or an approved homeschool from the age of five or six (kindergarten or first grade) until they are 18 years old. This often brings students through the 12th grade, the final year of a U.S. high school, but some states and territories allow them to leave school earlier, at age 16 or 17. The U.S. spends more on education per student than any other country, an average of $18,614 per year per public elementary and secondary school student in 2020–2021. Among Americans age 25 and older, 92.2% graduated from high school, 62.7% attended some college, 37.7% earned a bachelor's degree, and 14.2% earned a graduate degree. The U.S. literacy rate is near-universal. The U.S. has produced the most Nobel Prize winners of any country, with 411 (having won 413 awards). U.S. tertiary or higher education has earned a global reputation. Many of the world's top universities, as listed by various ranking organizations, are in the United States, including 19 of the top 25. American higher education is dominated by state university systems, although the country's many private universities and colleges enroll about 20% of all American students. Local community colleges generally offer open admissions, lower tuition, and coursework leading to a two-year associate degree or a non-degree certificate. As for public expenditures on higher education, the U.S. spends more per student than the OECD average, and Americans spend more than any other nation in combined public and private spending. Colleges and universities directly funded by the federal government do not charge tuition and are limited to military personnel and government employees, including the U.S. service academies, the Naval Postgraduate School, and military staff colleges. Despite student loan forgiveness programs being in place, student loan debt increased by 102% between 2010 and 2020, and exceeded $1.7 trillion in 2022. 
Culture and society The United States is home to a wide variety of ethnic groups, traditions, and customs. The country has been described as having the values of individualism and personal autonomy, as well as a strong work ethic and competitiveness. Voluntary altruism towards others also plays a major role; according to a 2016 study by the Charities Aid Foundation, Americans donated 1.44% of total GDP to charity—the highest rate in the world by a large margin. Americans have traditionally been characterized by a unifying political belief in an "American Creed" emphasizing consent of the governed, liberty, equality under the law, democracy, social equality, property rights, and a preference for limited government. The U.S. has acquired significant hard and soft power through its diplomatic influence, economic power, military alliances, and cultural exports such as American movies, music, video games, sports, and food. The influence that the United States exerts on other countries through soft power is referred to as Americanization. Nearly all present Americans or their ancestors came from Europe, Africa, or Asia (the "Old World") within the past five centuries. Mainstream American culture is a Western culture largely derived from the traditions of European immigrants with influences from many other sources, such as traditions brought by slaves from Africa. More recent immigration from Asia and especially Latin America has added to a cultural mix that has been described as a homogenizing melting pot, and a heterogeneous salad bowl, with immigrants contributing to, and often assimilating into, mainstream American culture. Under the First Amendment to the Constitution, the United States is considered to have the strongest protections of free speech of any country. Flag desecration, hate speech, blasphemy, and lese majesty are all forms of protected expression. 
A 2016 Pew Research Center poll found that Americans were the most supportive of free expression of any polity measured. Additionally, they are the "most supportive of freedom of the press and the right to use the Internet without government censorship". The U.S. is a socially progressive country with permissive attitudes surrounding human sexuality. LGBTQ rights in the United States are among the most advanced by global standards. The American Dream, or the perception that Americans enjoy high levels of social mobility, plays a key role in attracting immigrants. Whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a classless society, scholars identify significant differences between the country's social classes, affecting socialization, language, and values. Americans tend to greatly value socioeconomic achievement, but being ordinary or average is promoted by some as a noble condition as well. The National Foundation on the Arts and the Humanities is an agency of the United States federal government that was established in 1965 to "develop and promote a broadly conceived national policy of support for the humanities and the arts in the United States, and for institutions which preserve the cultural heritage of the United States." It is composed of four sub-agencies: the National Endowment for the Arts, the National Endowment for the Humanities, the Institute of Museum and Library Services, and the Federal Council on the Arts and the Humanities. Colonial American authors were influenced by John Locke and other Enlightenment philosophers. The American Revolutionary Period (1765–1783) is notable for the political writings of Benjamin Franklin, Alexander Hamilton, Thomas Paine, and Thomas Jefferson. Shortly before and after the Revolutionary War, the newspaper rose to prominence, filling a demand for anti-British national literature. An early novel is William Hill Brown's The Power of Sympathy, published in 1789. 
Writer and critic John Neal in the early- to mid-19th century helped advance America toward a unique literature and culture by criticizing predecessors such as Washington Irving for imitating their British counterparts, and by influencing writers such as Edgar Allan Poe, who took American poetry and short fiction in new directions. Ralph Waldo Emerson and Margaret Fuller pioneered the influential Transcendentalism movement; Henry David Thoreau, author of Walden, was influenced by this movement. The conflict surrounding abolitionism inspired writers, like Harriet Beecher Stowe, and authors of slave narratives, such as Frederick Douglass. Nathaniel Hawthorne's The Scarlet Letter (1850) explored the dark side of American history, as did Herman Melville's Moby-Dick (1851). Major American poets of the 19th century American Renaissance include Walt Whitman, Melville, and Emily Dickinson. Mark Twain was the first major American writer to be born in the West. Henry James achieved international recognition with novels like The Portrait of a Lady (1881). As literacy rates rose, periodicals published more stories centered around industrial workers, women, and the rural poor. Naturalism, regionalism, and realism were the major literary movements of the period. While modernism generally took on an international character, modernist authors working within the United States more often rooted their work in specific regions, peoples, and cultures. Following the Great Migration to northern cities, African-American and black West Indian authors of the Harlem Renaissance developed an independent tradition of literature that rebuked a history of inequality and celebrated black culture. An important cultural export during the Jazz Age, these writings were a key influence on Négritude, a philosophy emerging in the 1930s among francophone writers of the African diaspora. 
In the 1950s, an ideal of homogeneity led many authors to attempt to write the Great American Novel, while the Beat Generation rejected this conformity, using styles that elevated the impact of the spoken word over mechanics to describe drug use, sexuality, and the failings of society. Contemporary literature is more pluralistic than in previous eras, with the closest thing to a unifying feature being a trend toward self-conscious experiments with language. Twelve American laureates have won the Nobel Prize in Literature. Media in the United States is broadly uncensored, with the First Amendment providing significant protections, as reiterated in New York Times Co. v. United States. The four major broadcasters in the U.S. are the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), American Broadcasting Company (ABC), and Fox Broadcasting Company (Fox). The four major broadcast television networks are all commercial entities. The U.S. cable television system offers hundreds of channels catering to a variety of niches. In 2021, about 83% of Americans over age 12 listened to broadcast radio, while about 40% listened to podcasts. In the prior year, there were 15,460 licensed full-power radio stations in the U.S. according to the Federal Communications Commission (FCC). Much of the public radio broadcasting is supplied by National Public Radio (NPR), incorporated in February 1970 under the Public Broadcasting Act of 1967. U.S. newspapers with a global reach and reputation include The Wall Street Journal, The New York Times, The Washington Post, and USA Today. About 800 publications are produced in Spanish. With few exceptions, newspapers are privately owned, either by large chains such as Gannett or McClatchy, which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in an increasingly rare situation, by individuals or families. 
Major cities often have alternative newspapers to complement the mainstream daily papers, such as The Village Voice in New York City and LA Weekly in Los Angeles. The five most-visited websites in the world are Google, YouTube, Facebook, Instagram, and ChatGPT—all of them American-owned. Other popular platforms used include X (formerly Twitter) and Amazon. In 2025, the U.S. was the world's second-largest video game market by revenue (after China). In 2015, the U.S. video game industry consisted of 2,457 companies that employed around 220,000 people and generated $30.4 billion in revenue. There are 444 game publishers, developers, and hardware companies in California alone. According to the Game Developers Conference (GDC), the U.S. is the top location for video game development, with 58% of the world's game developers based there in 2025. The United States is well known for its theater. Mainstream theater in the United States derives from the old European theatrical tradition and has been heavily influenced by the British theater. By the middle of the 19th century, America had created new distinct dramatic forms in the Tom Shows, the showboat theater and the minstrel show. The central hub of the American theater scene is the Theater District in Manhattan, with its divisions of Broadway, off-Broadway, and off-off-Broadway. Many movie and television celebrities have gotten their big break working in New York productions. Outside New York City, many cities have professional regional or resident theater companies that produce their own seasons. The biggest-budget theatrical productions are musicals. U.S. theater has an active community theater culture. The Tony Awards recognize excellence in live Broadway theater and are presented at an annual ceremony in Manhattan. The awards are given for Broadway productions and performances. One is also given for regional theater. 
Several discretionary non-competitive awards are given as well, including a Special Tony Award, the Tony Honors for Excellence in Theatre, and the Isabelle Stevenson Award. Folk art in colonial America grew out of artisanal craftsmanship in communities that allowed commonly trained people to individually express themselves. It was distinct from Europe's tradition of high art, which was less accessible and generally less relevant to early American settlers. Cultural movements in art and craftsmanship in colonial America generally lagged behind those of Western Europe. For example, the prevailing medieval style of woodworking and primitive sculpture became integral to early American folk art, despite the emergence of Renaissance styles in England in the late 16th and early 17th centuries. The new English styles would have been early enough to make a considerable impact on American folk art, but American styles and forms had already been firmly adopted. Not only did styles change slowly in early America, but there was a tendency for rural artisans there to continue their traditional forms longer than their urban counterparts did—and far longer than those in Western Europe. The Hudson River School was a mid-19th-century movement in the visual arts tradition of European naturalism. The 1913 Armory Show in New York City, an exhibition of European modernist art, shocked the public and transformed the U.S. art scene. American Realism and American Regionalism sought to reflect and give America new ways of looking at itself. Georgia O'Keeffe, Marsden Hartley, and others experimented with new and individualistic styles, which would become known as American modernism. Major artistic movements such as the abstract expressionism of Jackson Pollock and Willem de Kooning and the pop art of Andy Warhol and Roy Lichtenstein developed largely in the United States. 
Major photographers include Alfred Stieglitz, Edward Steichen, Dorothea Lange, Edward Weston, James Van Der Zee, Ansel Adams, and Gordon Parks. The tide of modernism and then postmodernism has brought global fame to American architects, including Frank Lloyd Wright, Philip Johnson, and Frank Gehry. The Metropolitan Museum of Art in Manhattan is the largest art museum in the United States and the fourth-largest in the world. American folk music encompasses numerous music genres, variously known as traditional music, traditional folk music, contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and sometimes trace back to such origins as the British Isles, mainland Europe, or Africa. The rhythmic and lyrical styles of African-American music in particular have influenced American music. Banjos were brought to America through the slave trade. Minstrel shows incorporating the instrument into their acts led to its increased popularity and widespread production in the 19th century. The electric guitar, first invented in the 1930s, and mass-produced by the 1940s, had an enormous influence on popular music, in particular due to the development of rock and roll. The synthesizer, turntablism, and electronic music were also largely developed in the U.S. Elements from folk idioms such as the blues and old-time music were adopted and transformed into popular genres with global audiences. Jazz grew from blues and ragtime in the early 20th century, developing from the innovations and recordings of composers such as W.C. Handy and Jelly Roll Morton. Louis Armstrong and Duke Ellington increased its popularity early in the 20th century. Country music developed in the 1920s, bluegrass and rhythm and blues in the 1940s, and rock and roll in the 1950s. In the 1960s, Bob Dylan emerged from the folk revival to become one of the country's most celebrated songwriters. 
The musical forms of punk and hip hop both originated in the United States in the 1970s. The United States has the world's largest music market, with a total retail value of $15.9 billion in 2022. Most of the world's major record companies are based in the U.S.; they are represented by the Recording Industry Association of America (RIAA). Mid-20th-century American pop stars, such as Frank Sinatra and Elvis Presley, became global celebrities and best-selling music artists, as have artists of the late 20th century, such as Michael Jackson, Madonna, Whitney Houston, and Mariah Carey, and of the early 21st century, such as Eminem, Britney Spears, Lady Gaga, Katy Perry, Taylor Swift and Beyoncé. The United States has the world's largest apparel market by revenue. Apart from professional business attire, American fashion is eclectic and predominantly informal. Americans' diverse cultural roots are reflected in their clothing; however, sneakers, jeans, T-shirts, and baseball caps are emblematic of American styles. New York, with its Fashion Week, is considered to be one of the "Big Four" global fashion capitals, along with Paris, Milan, and London. A study demonstrated that general proximity to Manhattan's Garment District has been synonymous with American fashion since its inception in the early 20th century. A number of well-known designer labels, among them Tommy Hilfiger, Ralph Lauren, Tom Ford and Calvin Klein, are headquartered in Manhattan. Labels cater to niche markets, such as preteens. New York Fashion Week is one of the most influential fashion shows in the world, and is held twice each year in Manhattan; the annual Met Gala, also in Manhattan, has been called the fashion world's "biggest night". The U.S. film industry has a worldwide influence and following. Hollywood, a district in central Los Angeles, the nation's second-most populous city, is also metonymous for the American filmmaking industry. 
The major film studios of the United States are the primary source of the world's most commercially successful films. Largely centered in the New York City region from its beginnings in the late 19th century through the first decades of the 20th century, the U.S. film industry has since been primarily based in and around Hollywood. Nonetheless, American film companies have been subject to the forces of globalization in the 21st century, and an increasing number of films are made elsewhere. The Academy Awards, popularly known as "the Oscars", have been held annually by the Academy of Motion Picture Arts and Sciences since 1929, and the Golden Globe Awards have been held annually since January 1944. The industry peaked in what is commonly referred to as the "Golden Age of Hollywood", from the early sound period until the early 1960s, with screen actors such as John Wayne and Marilyn Monroe becoming iconic figures. In the 1970s, "New Hollywood", or the "Hollywood Renaissance", was defined by grittier films influenced by French and Italian realist pictures of the post-war period. The 21st century has been marked by the rise of American streaming platforms, which came to rival traditional cinema. Early settlers were introduced by Native Americans to foods such as turkey, sweet potatoes, corn, squash, and maple syrup. Among the most enduring and pervasive examples are variations of the native dish called succotash. Early settlers and later immigrants combined these with foods they were familiar with, such as wheat flour, beef, and milk, to create a distinctive American cuisine. New World foods, especially pumpkin, corn, and potatoes, along with turkey as the main course, are part of a shared national menu on Thanksgiving, when many Americans prepare or purchase traditional dishes to celebrate the occasion. 
Characteristic American dishes such as apple pie, fried chicken, doughnuts, french fries, macaroni and cheese, ice cream, hamburgers, hot dogs, and American pizza derive from the recipes of various immigrant groups. Mexican dishes such as burritos and tacos preexisted the United States in areas later annexed from Mexico, and adaptations of Chinese cuisine as well as pasta dishes freely adapted from Italian sources are all widely consumed. American chefs have had a significant impact on society both domestically and internationally. In 1946, the Culinary Institute of America was founded by Katharine Angell and Frances Roth. This would become the United States' most prestigious culinary school, where many of the most talented American chefs would study prior to successful careers. The United States restaurant industry was projected at $899 billion in sales for 2020, and employed more than 15 million people, representing 10% of the nation's workforce directly. It is the country's second-largest private employer and the third-largest employer overall. The United States is home to over 220 Michelin star-rated restaurants, 70 of which are in New York City. Wine has been produced in what is now the United States since the 1500s, with the first widespread production beginning in what is now New Mexico in 1628. In the modern U.S., wine production is undertaken in all fifty states, with California producing 84 percent of all U.S. wine. With more than 1,100,000 acres (4,500 km2) under vine, the United States is the fourth-largest wine-producing country in the world, after Italy, Spain, and France. The classic American diner, a casual restaurant type originally intended for the working class, emerged during the 19th century from converted railroad dining cars made stationary. The diner soon evolved into purpose-built structures whose number expanded greatly in the 20th century. The American fast-food industry developed alongside the nation's car culture. 
American restaurants developed the drive-in format in the 1920s, which they began to replace with the drive-through format by the 1940s. American fast-food restaurant chains, such as McDonald's, Burger King, Chick-fil-A, Kentucky Fried Chicken, Dunkin' Donuts and many others, have numerous outlets around the world. The most popular spectator sports in the U.S. are American football, basketball, baseball, soccer, and ice hockey. Their premier leagues are, respectively, the National Football League, the National Basketball Association, Major League Baseball, Major League Soccer, and the National Hockey League. All these leagues enjoy wide-ranging domestic media coverage and, except for the MLS, all are considered the preeminent leagues in their respective sports in the world. While most major U.S. sports such as baseball and American football have evolved out of European practices, basketball, volleyball, skateboarding, and snowboarding are American inventions, many of which have become popular worldwide. Lacrosse and surfing arose from Native American and Native Hawaiian activities that predate European contact. The market for professional sports in the United States was approximately $69 billion in July 2013, roughly 50% larger than that of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States. Although American football does not have a substantial following in other nations, the NFL does have the highest average attendance (67,254) of any professional sports league in the world. In 2024, the NFL generated over $23 billion in revenue, making it the most valuable professional sports league in the United States and the world. Baseball has been regarded as the U.S. "national sport" since the late 19th century. The most-watched individual sports in the U.S. are golf and auto racing, particularly NASCAR and IndyCar. 
On the collegiate level, earnings for the member institutions exceed $1 billion annually, and college football and basketball attract large audiences, as the NCAA March Madness tournament and the College Football Playoff are some of the most watched national sporting events. In the U.S., the intercollegiate sports level serves as the main feeder system for professional and Olympic sports, with significant exceptions such as Minor League Baseball. This differs greatly from practices in nearly all other countries, where publicly and privately funded sports organizations serve this function. Eight Olympic Games have taken place in the United States. The 1904 Summer Olympics in St. Louis, Missouri, were the first-ever Olympic Games held outside of Europe. The Olympic Games will be held in the U.S. for a ninth time when Los Angeles hosts the 2028 Summer Olympics. U.S. athletes have won a total of 2,968 medals (1,179 gold) at the Olympic Games, the most of any country. In other international competition, the United States is the home of a number of prestigious events, including the America's Cup, World Baseball Classic, the U.S. Open, and the Masters Tournament. The U.S. men's national soccer team has qualified for eleven World Cups, while the women's national team has won the FIFA Women's World Cup and Olympic soccer tournament four and five times, respectively. The 1999 FIFA Women's World Cup was hosted by the United States. Its final match was attended by 90,185, setting the world record for largest women's sporting event crowd at the time. The United States hosted the 1994 FIFA World Cup and will co-host, along with Canada and Mexico, the 2026 FIFA World Cup.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Social_network#cite_ref-76] | [TOKENS: 5247]
Social network A social network is a social structure consisting of a set of social actors (such as individuals or organizations), networks of dyadic ties, and other social interactions between actors. The social network perspective provides a set of methods for analyzing the structure of whole social entities along with a variety of theories explaining the patterns observed in these structures. The study of these structures uses social network analysis to identify local and global patterns, locate influential entities, and examine dynamics of networks. For instance, social network analysis has been used in studying the spread of misinformation on social media platforms or analyzing the influence of key figures in social networks. The study of social networks is an inherently interdisciplinary academic field which emerged from social psychology, sociology, statistics, and graph theory. Georg Simmel authored early structural theories in sociology emphasizing the dynamics of triads and "web of group affiliations". Jacob Moreno is credited with developing the first sociograms in the 1930s to study interpersonal relationships. These approaches were mathematically formalized in the 1950s and theories and methods of social networks became pervasive in the social and behavioral sciences by the 1980s. Social network analysis is now one of the major paradigms in contemporary sociology, and is also employed in a number of other social and formal sciences. Together with other complex networks, it forms part of the nascent field of network science. Overview The social network is a theoretical construct useful in the social sciences to study relationships between individuals, groups, organizations, or even entire societies (social units, see differentiation). 
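As a minimal illustration of the kind of analysis described above, the sketch below represents actors as nodes and dyadic ties as undirected edges, then uses degree centrality, one of the simplest measures, to locate the most connected actor. The actors, ties, and choice of measure here are illustrative assumptions, not drawn from the article.

```python
# Sketch: a tiny social network as an adjacency structure, with
# normalized degree centrality used to locate influential entities.
from collections import defaultdict

def degree_centrality(edges):
    """Normalized degree centrality: ties(v) / (n - 1) for each actor v."""
    adj = defaultdict(set)
    for a, b in edges:       # each pair is an undirected dyadic tie
        adj[a].add(b)
        adj[b].add(a)
    n = len(adj)             # number of actors observed in the ties
    return {v: len(nbrs) / (n - 1) for v, nbrs in adj.items()}

# Hypothetical ties among four actors
ties = [("Ann", "Bob"), ("Ann", "Carl"), ("Ann", "Dee"), ("Bob", "Carl")]
centrality = degree_centrality(ties)
most_central = max(centrality, key=centrality.get)  # Ann, tied to all others
```

Richer measures (betweenness, closeness, eigenvector centrality) follow the same pattern of computing a score per node from the tie structure rather than from attributes of the actors themselves, which is exactly the relational emphasis the overview describes.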
The term is used to describe a social structure determined by such interactions. The ties through which any given social unit connects represent the convergence of the various social contacts of that unit. This theoretical approach is, necessarily, relational. An axiom of the social network approach to understanding social interaction is that social phenomena should be primarily conceived and investigated through the properties of relations between and within units, instead of the properties of these units themselves. Thus, one common criticism of social network theory is that individual agency is often ignored, although this may not be the case in practice (see agent-based modeling). Precisely because many different types of relations, singular or in combination, form these network configurations, network analytics are useful to a broad range of research enterprises. In social science, these fields of study include, but are not limited to, anthropology, biology, communication studies, economics, geography, information science, organizational studies, social psychology, sociology, and sociolinguistics.

History

In the late 1890s, both Émile Durkheim and Ferdinand Tönnies foreshadowed the idea of social networks in their theories and research of social groups. Tönnies argued that social groups can exist either as personal and direct social ties that link individuals who share values and beliefs (Gemeinschaft, German, commonly translated as "community") or as impersonal, formal, and instrumental social links (Gesellschaft, German, commonly translated as "society"). Durkheim gave a non-individualistic explanation of social facts, arguing that social phenomena arise when interacting individuals constitute a reality that can no longer be accounted for in terms of the properties of individual actors.
Georg Simmel, writing at the turn of the twentieth century, pointed to the nature of networks and the effect of network size on interaction and examined the likelihood of interaction in loosely knit networks rather than groups. Major developments came in the 1930s from several groups in psychology, anthropology, and mathematics working independently. In psychology, Jacob L. Moreno began systematic recording and analysis of social interaction in small groups, especially classrooms and work groups (see sociometry). In anthropology, the foundation for social network theory is the theoretical and ethnographic work of Bronislaw Malinowski, Alfred Radcliffe-Brown, and Claude Lévi-Strauss. A group of social anthropologists associated with Max Gluckman and the Manchester School, including John A. Barnes, J. Clyde Mitchell and Elizabeth Bott Spillius, are often credited with performing some of the first fieldwork from which network analyses were performed, investigating community networks in southern Africa, India and the United Kingdom. Concomitantly, British anthropologist S. F. Nadel codified a theory of social structure that was influential in later network analysis. In sociology, the early (1930s) work of Talcott Parsons set the stage for taking a relational approach to understanding social structure. Later, drawing upon Parsons' theory, the work of sociologist Peter Blau provided a strong impetus for analyzing the relational ties of social units with his work on social exchange theory. By the 1970s, a growing number of scholars worked to combine the different tracks and traditions. One group consisted of sociologist Harrison White and his students at the Harvard University Department of Social Relations.
Also independently active in the Harvard Social Relations department at the time were Charles Tilly, who focused on networks in political and community sociology and social movements, and Stanley Milgram, who developed the "six degrees of separation" thesis. Mark Granovetter and Barry Wellman are among the former students of White who elaborated and championed the analysis of social networks. Beginning in the late 1990s, sociologists, political scientists, and physicists such as Duncan J. Watts, Albert-László Barabási, Peter Bearman, Nicholas A. Christakis, James H. Fowler, and others developed and applied new models and methods to emerging data about online social networks, as well as "digital traces" of face-to-face networks.

Levels of analysis

In general, social networks are self-organizing, emergent, and complex, such that a globally coherent pattern appears from the local interaction of the elements that make up the system. These patterns become more apparent as network size increases. However, a global network analysis of, for example, all interpersonal relationships in the world is not feasible and would likely contain so much information as to be uninformative. Practical limitations of computing power, ethics, and participant recruitment and payment also limit the scope of a social network analysis. The nuances of a local system may be lost in a large network analysis, hence the quality of information may be more important than its scale for understanding network properties. Thus, social networks are analyzed at the scale relevant to the researcher's theoretical question. Although levels of analysis are not necessarily mutually exclusive, there are three general levels into which networks may fall: micro-level, meso-level, and macro-level.
At the micro-level, social network research typically begins with an individual, snowballing as social relationships are traced, or may begin with a small group of individuals in a particular social context.

Dyadic level: A dyad is a social relationship between two individuals. Network research on dyads may concentrate on the structure of the relationship (e.g. multiplexity, strength), social equality, and tendencies toward reciprocity/mutuality.

Triadic level: Add one individual to a dyad, and you have a triad. Research at this level may concentrate on factors such as balance and transitivity, as well as social equality and tendencies toward reciprocity/mutuality. In the balance theory of Fritz Heider, the triad is the key to social dynamics. The discord in a rivalrous love triangle is an example of an unbalanced triad, likely to change to a balanced triad through a change in one of the relations. The dynamics of social friendships in society have been modeled by balancing triads. The study is carried forward with the theory of signed graphs.

Actor level: The smallest unit of analysis in a social network is an individual in their social setting, i.e., an "actor" or "ego". Ego network analysis focuses on network characteristics such as size, relationship strength, density, centrality, prestige, and roles such as isolates, liaisons, and bridges. Such analyses are most commonly used in psychology or social psychology, ethnographic kinship analysis, or other genealogical studies of relationships between individuals.

Subset level: Subset-level research problems begin at the micro-level, but may cross over into the meso-level of analysis. Subset-level research may focus on distance and reachability, cliques, cohesive subgroups, or other group actions or behavior.

In general, meso-level theories begin with a population size that falls between the micro- and macro-levels.
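Heider's balance rule can be checked mechanically: a triad is balanced exactly when the product of its three tie signs (+1 for friendship, -1 for hostility) is positive. A minimal sketch in Python; the encoding of signed ties as a dict over unordered pairs and the function names are illustrative assumptions, not from any standard library:

```python
from itertools import combinations

def balanced(triad_signs):
    """Heider's rule: a triad is balanced when the product of its
    three tie signs (+1 = friendship, -1 = hostility) is positive."""
    s1, s2, s3 = triad_signs
    return s1 * s2 * s3 > 0

def count_unbalanced(nodes, sign):
    """Count unbalanced triads in a complete signed graph.
    `sign` maps an unordered pair (a frozenset) to +1 or -1."""
    bad = 0
    for a, b, c in combinations(nodes, 3):
        signs = (sign[frozenset((a, b))],
                 sign[frozenset((b, c))],
                 sign[frozenset((a, c))])
        if not balanced(signs):
            bad += 1
    return bad

# A rivalrous love triangle: A likes B and C, but B and C dislike each other.
sign = {frozenset("AB"): +1, frozenset("AC"): +1, frozenset("BC"): -1}
print(count_unbalanced("ABC", sign))  # 1 unbalanced triad
```

Flipping any one of the three signs restores balance, which is the kind of relation change the text describes.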
However, meso-level may also refer to analyses that are specifically designed to reveal connections between micro- and macro-levels. Meso-level networks are low density and may exhibit causal processes distinct from interpersonal micro-level networks.

Organizations: Formal organizations are social groups that distribute tasks for a collective goal. Network research on organizations may focus on either intra-organizational or inter-organizational ties in terms of formal or informal relationships. Intra-organizational networks themselves often contain multiple levels of analysis, especially in larger organizations with multiple branches, franchises or semi-autonomous departments. In these cases, research is often conducted at a work group level and organization level, focusing on the interplay between the two structures. Experiments with networked groups online have documented ways to optimize group-level coordination through diverse interventions, including the addition of autonomous agents to the groups.

Randomly distributed networks: Exponential random graph models of social networks became state-of-the-art methods of social network analysis in the 1980s. This framework can represent social-structural effects commonly observed in many human social networks, including general degree-based structural effects as well as reciprocity and transitivity, and, at the node level, homophily and attribute-based activity and popularity effects, as derived from explicit hypotheses about dependencies among network ties. Parameters are given in terms of the prevalence of small subgraph configurations in the network and can be interpreted as describing the combinations of local social processes from which a given network emerges.
These probability models for networks on a given set of actors allow generalization beyond the restrictive dyadic independence assumption of micro-networks, allowing models to be built from theoretical structural foundations of social behavior.

Scale-free networks: A scale-free network is a network whose degree distribution follows a power law, at least asymptotically. In network theory, a scale-free ideal network is a random network with a degree distribution that unravels the size distribution of social groups. Specific characteristics of scale-free networks vary with the theories and analytical tools used to create them; in general, however, scale-free networks share some common characteristics. One notable characteristic is the relative commonness of vertices with a degree that greatly exceeds the average. The highest-degree nodes are often called "hubs", and may serve specific purposes in their networks, although this depends greatly on the social context. Another general characteristic of scale-free networks is the clustering coefficient distribution, which decreases as the node degree increases; this distribution also follows a power law. The Barabási model of network evolution is an example of a scale-free network.

Rather than tracing interpersonal interactions, macro-level analyses generally trace the outcomes of interactions, such as economic or other resource transfer interactions over a large population.

Large-scale networks: Large-scale network is a term somewhat synonymous with "macro-level". It is primarily used in the social and behavioral sciences and in economics. Originally, the term was used extensively in the computer sciences (see large-scale network mapping).
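The scale-free degree distribution discussed above is commonly illustrated with preferential attachment: each new node links to existing nodes with probability proportional to their degree, so well-connected nodes become hubs. A minimal sketch of the Barabási–Albert growth process (the seed of m isolated nodes and the repeated-endpoint sampling list are one common simplification, not the only variant):

```python
import random

def barabasi_albert(n, m, seed=42):
    """Grow a graph by preferential attachment: each new node links to
    m distinct existing nodes chosen with probability proportional to
    degree.  Sampling from `targets`, which holds one entry per edge
    endpoint, implements the degree-proportional choice."""
    rng = random.Random(seed)
    edges = []
    targets = list(range(m))  # start from m seed nodes (a common simplification)
    for new in range(m, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(targets))
        for old in chosen:
            edges.append((new, old))
        targets.extend(chosen)
        targets.extend([new] * m)
    return edges

edges = barabasi_albert(n=1000, m=2)
degree = {}
for u, v in edges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1
# Heavy tail: the best-connected "hub" far exceeds the average degree.
print(max(degree.values()), sum(degree.values()) / len(degree))
```

The printed gap between the maximum and mean degree is the "hubs greatly exceed the average" property the text describes.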
Complex networks: Most larger social networks display features of social complexity, which involves substantial non-trivial features of network topology, with patterns of complex connections between elements that are neither purely regular nor purely random (see complexity science, dynamical system and chaos theory), as do biological and technological networks. Such complex network features include a heavy tail in the degree distribution, a high clustering coefficient, assortativity or disassortativity among vertices, community structure (see stochastic block model), and hierarchical structure. In the case of agency-directed networks, these features also include reciprocity, triad significance profile (TSP, see network motif), and other features. In contrast, many of the mathematical models of networks that have been studied in the past, such as lattices and random graphs, do not show these features.

Theoretical links

Various theoretical frameworks have been imported for use in social network analysis. The most prominent of these are graph theory, balance theory, social comparison theory, and, more recently, the social identity approach. Few complete theories have been produced from social network analysis. Two that have are structural role theory and heterophily theory. The basis of heterophily theory was the finding in one study that more numerous weak ties can be important in seeking information and innovation, as cliques have a tendency to have more homogeneous opinions as well as share many common traits. This homophilic tendency was the reason for the members of the cliques to be attracted together in the first place. However, being similar, each member of the clique would also know more or less what the other members knew. To find new information or insights, members of the clique will have to look beyond the clique to its other friends and acquaintances. This is what Granovetter called "the strength of weak ties".
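The clustering coefficient listed among the complex-network features above has a simple node-level form: the fraction of a node's neighbour pairs that are themselves connected. A small illustrative computation; the adjacency-set encoding is an assumption of this sketch:

```python
from itertools import combinations

def local_clustering(adj, v):
    """Fraction of pairs of v's neighbours that are themselves linked:
    C(v) = (links among neighbours) / (possible neighbour pairs)."""
    neigh = adj[v]
    k = len(neigh)
    if k < 2:
        return 0.0
    links = sum(1 for a, b in combinations(neigh, 2) if b in adj[a])
    return links / (k * (k - 1) / 2)

# A small graph: a triangle a-b-c plus a pendant node d attached to a.
adj = {
    "a": {"b", "c", "d"},
    "b": {"a", "c"},
    "c": {"a", "b"},
    "d": {"a"},
}
print(local_clustering(adj, "a"))  # 1 of 3 neighbour pairs linked -> 0.333...
print(local_clustering(adj, "b"))  # its neighbours a and c are linked -> 1.0
```

Averaging this quantity over all nodes gives the network-level clustering coefficient; in scale-free networks, as noted above, it tends to fall as node degree rises.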
Structural holes

In the context of networks, social capital exists where people have an advantage because of their location in a network. Contacts in a network provide information, opportunities and perspectives that can be beneficial to the central player in the network. Most social structures tend to be characterized by dense clusters of strong connections. Information within these clusters tends to be rather homogeneous and redundant. Non-redundant information is most often obtained through contacts in different clusters. When two separate clusters possess non-redundant information, there is said to be a structural hole between them. Thus, a network that bridges structural holes will provide network benefits that are in some degree additive, rather than overlapping. An ideal network structure has a vine and cluster structure, providing access to many different clusters and structural holes. Networks rich in structural holes are a form of social capital in that they offer information benefits. The main player in a network that bridges structural holes is able to access information from diverse sources and clusters. For example, in business networks, this is beneficial to an individual's career because he is more likely to hear of job openings and opportunities if his network spans a wide range of contacts in different industries/sectors. This concept is similar to Mark Granovetter's theory of weak ties, which rests on the basis that having a broad range of contacts is most effective for job attainment. Structural holes have been widely applied in social network analysis, resulting in applications in a wide range of practical scenarios as well as machine learning-based social prediction.

Research clusters

Research has used network analysis to examine networks created when artists are exhibited together in museum exhibitions.
Such networks have been shown to affect an artist's recognition in history and historical narratives, even when controlling for individual accomplishments of the artist. Other work examines how network grouping of artists can affect an individual artist's auction performance. An artist's status has been shown to increase when associated with higher status networks, though this association has diminishing returns over an artist's career. In J.A. Barnes' day, a "community" referred to a specific geographic location and studies of community ties had to do with who talked, associated, traded, and attended church with whom. Today, however, there are extended "online" communities developed through telecommunications devices and social network services. Such devices and services require extensive and ongoing maintenance and analysis, often using network science methods. Community development studies, today, also make extensive use of such methods. Complex networks require methods specific to modelling and interpreting social complexity and complex adaptive systems, including techniques of dynamic network analysis. Mechanisms such as Dual-phase evolution explain how temporal changes in connectivity contribute to the formation of structure in social networks. The study of social networks is being used to examine the nature of interdependencies between actors and the ways in which these are related to outcomes of conflict and cooperation. Areas of study include cooperative behavior among participants in collective actions such as protests; promotion of peaceful behavior, social norms, and public goods within communities through networks of informal governance; the role of social networks in both intrastate conflict and interstate conflict; and social networking among politicians, constituents, and bureaucrats. In criminology and urban sociology, much attention has been paid to the social networks among criminal actors. 
For example, murders can be seen as a series of exchanges between gangs. Murders can be seen to diffuse outwards from a single source, because weaker gangs cannot afford to kill members of stronger gangs in retaliation, but must commit other violent acts to maintain their reputation for strength. Diffusion of ideas and innovations studies focus on the spread and use of ideas from one actor to another or from one culture to another. This line of research seeks to explain why some become "early adopters" of ideas and innovations, and links social network structure with facilitating or impeding the spread of an innovation. A case in point is the social diffusion of linguistic innovation such as neologisms. Experiments and large-scale field trials (e.g., by Nicholas Christakis and collaborators) have shown that cascades of desirable behaviors can be induced in social groups, in settings as diverse as Honduran villages, Indian slums, and the lab. Still other experiments have documented the experimental induction of social contagion of voting behavior, emotions, risk perception, and commercial products. In demography, the study of social networks has led to new sampling methods for estimating and reaching populations that are hard to enumerate (for example, homeless people or intravenous drug users). For example, respondent-driven sampling is a network-based sampling technique that relies on respondents to a survey recommending further respondents. The field of sociology focuses almost entirely on networks of outcomes of social interactions. More narrowly, economic sociology considers behavioral interactions of individuals and groups through social capital and social "markets". Sociologists, such as Mark Granovetter, have developed core principles about the interactions of social structure, information, ability to punish or reward, and trust that frequently recur in their analyses of political, economic and other institutions.
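The chain-referral idea behind respondent-driven sampling mentioned above can be sketched as a snowball sample: each recruit passes a fixed number of "coupons" to network neighbours, and the sample grows along referral chains. This minimal sketch omits the inclusion-probability weighting that full RDS estimators apply; all names are illustrative:

```python
import random
from collections import deque

def snowball_sample(adj, seeds, coupons=3, max_size=100, seed=0):
    """Chain-referral sampling sketch: starting from seed respondents,
    each sampled person refers up to `coupons` randomly chosen,
    not-yet-sampled network neighbours."""
    rng = random.Random(seed)
    sampled = set(seeds)
    queue = deque(seeds)
    while queue and len(sampled) < max_size:
        person = queue.popleft()
        contacts = [c for c in adj[person] if c not in sampled]
        for recruit in rng.sample(contacts, min(coupons, len(contacts))):
            sampled.add(recruit)
            queue.append(recruit)
    return sampled

# A ring of 12 people, each knowing their two neighbours.
adj = {i: [(i - 1) % 12, (i + 1) % 12] for i in range(12)}
print(sorted(snowball_sample(adj, seeds=[0], max_size=6)))
```

Because referrals follow network ties, the sample reaches members of a hidden population that no sampling frame lists, which is the demographic use case described in the text.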
Granovetter examines how social structures and social networks can affect economic outcomes like hiring, price, productivity and innovation, and describes sociologists' contributions to analyzing the impact of social structure and networks on the economy. Analysis of social networks is increasingly incorporated into health care analytics, not only in epidemiological studies but also in models of patient communication and education, disease prevention, mental health diagnosis and treatment, and in the study of health care organizations and systems. Human ecology is an interdisciplinary and transdisciplinary study of the relationship between humans and their natural, social, and built environments. The scientific philosophy of human ecology has a diffuse history with connections to geography, sociology, psychology, anthropology, zoology, and natural ecology. In the study of literary systems, network analysis has been applied by Anheier, Gerhards and Romo, De Nooy, Senekal, and Lotker to study various aspects of how literature functions. The basic premise is that polysystem theory, which has been around since the writings of Even-Zohar, can be integrated with network theory, and the relationships between different actors in the literary network, e.g. writers, critics, publishers, literary histories, etc., can be mapped using visualization from SNA. Researchers also study formal and informal organizational relationships, organizational communication, economics, economic sociology, and other resource transfers. Social networks have also been used to examine how organizations interact with each other, characterizing the many informal connections that link executives together, as well as associations and connections between individual employees at different organizations. Many organizational social network studies focus on teams.
Within team network studies, research assesses, for example, the predictors and outcomes of centrality and power, density and centralization of team instrumental and expressive ties, and the role of between-team networks. Intra-organizational networks have been found to affect organizational commitment, organizational identification, and interpersonal citizenship behaviour.

Social capital is a form of economic and cultural capital in which social networks are central, transactions are marked by reciprocity, trust, and cooperation, and market agents produce goods and services not mainly for themselves, but for a common good. Social capital is split into three dimensions: the structural, the relational and the cognitive dimension. The structural dimension describes how partners interact with each other and which specific partners meet in a social network. Also, the structural dimension of social capital indicates the level of ties among organizations. This dimension is highly connected to the relational dimension, which refers to trustworthiness, norms, expectations and identifications of the bonds between partners. The relational dimension explains the nature of these ties, which is mainly illustrated by the level of trust accorded to the network of organizations. The cognitive dimension analyses the extent to which organizations share common goals and objectives as a result of their ties and interactions. Social capital is a sociological concept about the value of social relations and the role of cooperation and confidence to achieve positive outcomes. The term refers to the value one can get from their social ties. For example, newly arrived immigrants can make use of their social ties to established migrants to acquire jobs they may otherwise have trouble getting (e.g., because of unfamiliarity with the local language). A positive relationship exists between social capital and the intensity of social network use.
In a dynamic framework, higher activity in a network feeds into higher social capital, which itself encourages more activity. Another research cluster focuses on brand image and promotional strategy effectiveness, taking into account the impact of customer participation on sales and brand image. This is gauged through techniques such as sentiment analysis, which rely on mathematical areas of study such as data mining and analytics. This area of research produces vast numbers of commercial applications, as the main goal of any study is to understand consumer behaviour and drive sales. In many organizations, members tend to focus their activities inside their own groups, which stifles creativity and restricts opportunities. A player whose network bridges structural holes has an advantage in detecting and developing rewarding opportunities. Such a player can mobilize social capital by acting as a "broker" of information between two clusters that otherwise would not have been in contact, thus providing access to new ideas, opinions and opportunities. British philosopher and political economist John Stuart Mill writes, "it is hardly possible to overrate the value of placing human beings in contact with persons dissimilar to themselves.... Such communication [is] one of the primary sources of progress." Thus, a player with a network rich in structural holes can add value to an organization through new ideas and opportunities. This, in turn, helps an individual's career development and advancement. A social capital broker also reaps control benefits of being the facilitator of information flow between contacts. Full communication with exploratory mindsets and information exchange generated by dynamically alternating positions in a social network promotes creative and deep thinking.
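The redundancy argument behind structural holes can be made concrete with Burt's "effective size" of an ego network. In Borgatti's simplification for unweighted ties, it is n - 2t/n, where n is the number of contacts and t the number of ties among them: interlinked contacts are redundant, while a network spanning structural holes keeps its effective size near n. A hedged sketch with illustrative data:

```python
def effective_size(alters, ties):
    """Borgatti's simplification of Burt's effective size for an
    unweighted ego network: n - 2t/n, where n is the number of alters
    and t the number of ties among them.  Redundant (interlinked)
    contacts shrink the effective size; bridging disconnected
    clusters keeps it close to n."""
    n = len(alters)
    t = sum(1 for a, b in ties if a in alters and b in alters)
    return n - 2 * t / n

alters = {"w", "x", "y", "z"}
# Ego bridges two clusters: w-x in one, y-z in the other.
print(effective_size(alters, [("w", "x"), ("y", "z")]))  # 4 - 2*2/4 = 3.0
# A fully interlocked clique of contacts is highly redundant.
clique_ties = [("w","x"),("w","y"),("w","z"),("x","y"),("x","z"),("y","z")]
print(effective_size(alters, clique_ties))  # 4 - 2*6/4 = 1.0
```

The broker described in the text is the ego in the first case: four contacts but only two redundant pairings, so most of the network's information reach is preserved.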
In the case of consulting firm Eden McCallum, the founders were able to advance their careers by bridging their connections with former big three consulting firm consultants and mid-size industry firms. By bridging structural holes and mobilizing social capital, players can advance their careers by executing new opportunities between contacts. There has been research that both substantiates and refutes the benefits of information brokerage. A study of high tech Chinese firms by Zhixing Xiao found that the control benefits of structural holes are "dissonant to the dominant firm-wide spirit of cooperation and the information benefits cannot materialize due to the communal sharing values" of such organizations. However, this study only analyzed Chinese firms, which tend to have strong communal sharing values. Information and control benefits of structural holes are still valuable in firms that are not quite as inclusive and cooperative on the firm-wide level. In 2004, Ronald Burt studied 673 managers who ran the supply chain for one of America's largest electronics companies. He found that managers who often discussed issues with other groups were better paid, received more positive job evaluations and were more likely to be promoted. Thus, bridging structural holes can be beneficial to an organization, and in turn, to an individual's career.

Computer networks combined with social networking software produce a new medium for social interaction. A relationship over a computerized social networking service can be characterized by content, direction, and strength. The content of a relation refers to the resource that is exchanged. In a computer-mediated communication context, social pairs exchange different kinds of information, including sending a data file or a computer program as well as providing emotional support or arranging a meeting.
With the rise of electronic commerce, information exchanged may also correspond to exchanges of money, goods or services in the "real" world. Social network analysis methods have become essential to examining these types of computer-mediated communication. In addition, the sheer size and the volatile nature of social media have given rise to new network metrics. A key concern with networks extracted from social media is the lack of robustness of network metrics given missing data. Under the pattern of homophily, ties are most likely to form between nodes that are most similar to each other; under neighbourhood segregation, individuals are most likely to inhabit the same regional areas as other individuals who are like them. Social networks can therefore be used as a tool to measure the degree of segregation or homophily in a population: they can be used both to simulate the process of homophily and to measure the level of exposure of different groups to each other within a current social network of individuals in a certain area.
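One standard way to quantify the homophily or segregation just described is Krackhardt and Stern's E-I index: external ties minus internal ties, divided by total ties, ranging from -1 (every tie stays within a group, maximal homophily) to +1 (every tie crosses group lines). A minimal sketch with made-up groups:

```python
def ei_index(edges, group):
    """Krackhardt-Stern E-I index: (external - internal) / total ties.
    -1 means every tie stays within a group (maximal homophily);
    +1 means every tie crosses group boundaries."""
    external = sum(1 for u, v in edges if group[u] != group[v])
    internal = len(edges) - external
    return (external - internal) / len(edges)

group = {"a": "red", "b": "red", "c": "blue", "d": "blue"}
print(ei_index([("a", "b"), ("c", "d")], group))              # -1.0: fully segregated
print(ei_index([("a", "c"), ("b", "d"), ("a", "b")], group))  # (2 - 1) / 3 = 0.33...
```

Computed per group or per attribute (ethnicity, neighbourhood, platform community), this gives the exposure-between-groups measure the paragraph refers to.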
========================================
[SOURCE: https://en.wikipedia.org/wiki/United_States#cite_note-242] | [TOKENS: 17273]
United States

The United States of America (USA), also known as the United States (U.S.) or America, is a country primarily located in North America. It is a federal republic of 50 states and a federal capital district, Washington, D.C. The 48 contiguous states border Canada to the north and Mexico to the south, with the semi-exclave of Alaska in the northwest and the archipelago of Hawaii in the Pacific Ocean. The United States also asserts sovereignty over five major island territories and various uninhabited islands in Oceania and the Caribbean.[j] It is a megadiverse country, with the world's third-largest land area[c] and third-largest population, exceeding 341 million.[k] Paleo-Indians first migrated from North Asia to North America at least 15,000 years ago, and formed various civilizations. Spanish colonization established Spanish Florida in 1513, the first European colony in what is now the continental United States. British colonization followed with the 1607 settlement of Virginia, the first of the Thirteen Colonies. Enslavement of Africans was practiced in all colonies by 1770 and supplied most of the labor for the Southern Colonies' plantation economy. Clashes with the British Crown began as a civil protest over the illegality of taxation without representation in Parliament and the denial of other English rights. They evolved into the American Revolution, which led to the Declaration of Independence and a society based on universal rights. Victory in the 1775–1783 Revolutionary War brought international recognition of U.S. sovereignty and fueled westward expansion, further dispossessing native inhabitants. As more states were admitted, a North–South division over slavery led the Confederate States of America to declare secession and fight the Union in the 1861–1865 American Civil War. With the United States' victory and reunification, slavery was abolished nationally. By the late 19th century, the U.S.
economy outpaced the French, German and British economies combined. As of 1900, the country had established itself as a great power, a status solidified after its involvement in World War I. Following Japan's attack on Pearl Harbor in 1941, the U.S. entered World War II. Its aftermath left the U.S. and the Soviet Union as rival superpowers, competing for ideological dominance and international influence during the Cold War. The Soviet Union's collapse in 1991 ended the Cold War, leaving the U.S. as the world's sole superpower. The U.S. federal government is a representative democracy with a president and a constitution that grants separation of powers under three branches: legislative, executive, and judicial. The United States Congress is a bicameral national legislature composed of the House of Representatives (a lower house based on population) and the Senate (an upper house based on equal representation for each state). Federalism grants substantial autonomy to the 50 states. In addition, 574 Native American tribes have sovereignty rights, and there are 326 Native American reservations. Since the 1850s, the Democratic and Republican parties have dominated American politics. American ideals and values are based on a democratic tradition inspired by the American Enlightenment movement. A developed country, the U.S. ranks high in economic competitiveness, innovation, and higher education. Accounting for over a quarter of nominal global GDP, its economy has been the world's largest since about 1890. It is the wealthiest country, with the highest disposable household income per capita among OECD members, though its wealth inequality is highly pronounced. Shaped by centuries of immigration, the culture of the U.S. is diverse and globally influential. Making up more than a third of global military spending, the country has one of the strongest armed forces and is a designated nuclear state. A member of numerous international organizations, the U.S. 
plays a major role in global political, cultural, economic, and military affairs.

Etymology

Documented use of the phrase "United States of America" dates back to January 2, 1776. On that day, Stephen Moylan, a Continental Army aide to General George Washington, wrote a letter to Joseph Reed, Washington's aide-de-camp, seeking to go "with full and ample powers from the United States of America to Spain" to seek assistance in the Revolutionary War effort. The first known public usage is an anonymous essay published in the Williamsburg newspaper The Virginia Gazette on April 6, 1776. Sometime on or after June 11, 1776, Thomas Jefferson wrote "United States of America" in a rough draft of the Declaration of Independence, which was adopted by the Second Continental Congress on July 4, 1776. The term "United States" and its initialism "U.S.", used as nouns or as adjectives in English, are common short names for the country. The initialism "USA", a noun, is also common. "United States" and "U.S." are the established terms throughout the U.S. federal government, with prescribed rules.[l] "The States" is an established colloquial shortening of the name, used particularly from abroad; "stateside" is the corresponding adjective or adverb. "America" is the feminine form of the first word of Americus Vesputius, the Latinized name of Italian explorer Amerigo Vespucci (1454–1512);[m] it was first used as a place name by the German cartographers Martin Waldseemüller and Matthias Ringmann in 1507.[n] Vespucci first proposed that the West Indies discovered by Christopher Columbus in 1492 were part of a previously unknown landmass and not among the Indies at the eastern limit of Asia. In English, the term "America" usually does not refer to topics unrelated to the United States, despite the usage of "the Americas" to describe the totality of the continents of North and South America.
History The first inhabitants of North America migrated from Siberia approximately 15,000 years ago, either across the Bering land bridge or along the now-submerged Ice Age coastline. Small isolated groups of hunter-gatherers are said to have migrated alongside herds of large herbivores far into Alaska, with ice-free corridors developing along the Pacific coast and valleys of North America in c. 16,500 – c. 13,500 BCE (c. 18,500 – c. 15,500 BP). The Clovis culture, which appeared around 11,000 BCE, is believed to be the first widespread culture in the Americas. Over time, Indigenous North American cultures grew increasingly sophisticated, and some, such as the Mississippian culture, developed agriculture, architecture, and complex societies. In the post-archaic period, the Mississippian cultures were located in the midwestern, eastern, and southern regions, and the Algonquian in the Great Lakes region and along the Eastern Seaboard, while the Hohokam culture and Ancestral Puebloans inhabited the Southwest. Native population estimates of what is now the United States before the arrival of European colonizers range from around 500,000 to nearly 10 million. Christopher Columbus began exploring the Caribbean for Spain in 1492, leading to Spanish-speaking settlements and missions from what are now Puerto Rico and Florida to New Mexico and California. The first Spanish colony in the present-day continental United States was Spanish Florida, chartered in 1513. After several settlements failed there due to starvation and disease, Spain's first permanent town, Saint Augustine, was founded in 1565. France established its own settlements in French Florida in 1562, but they were either abandoned (Charlesfort, 1563) or destroyed by Spanish raids (Fort Caroline, 1565). Permanent French settlements were founded much later along the Great Lakes (Fort Detroit, 1701), the Mississippi River (Saint Louis, 1764) and especially the Gulf of Mexico (New Orleans, 1718). 
Early European colonies also included the thriving Dutch colony of New Netherland (settled 1626, present-day New York) and the small Swedish colony of New Sweden (settled 1638 in what became Delaware). British colonization of the East Coast began with the Virginia Colony (1607) and the Plymouth Colony (Massachusetts, 1620). The Mayflower Compact in Massachusetts and the Fundamental Orders of Connecticut established precedents for local representative self-governance and constitutionalism that would develop throughout the American colonies. While European settlers in what is now the United States experienced conflicts with Native Americans, they also engaged in trade, exchanging European tools for food and animal pelts.[o] Relations ranged from close cooperation to warfare and massacres. The colonial authorities often pursued policies that forced Native Americans to adopt European lifestyles, including conversion to Christianity. Along the eastern seaboard, settlers trafficked Africans through the Atlantic slave trade, largely to provide manual labor on plantations. The original Thirteen Colonies[p] that would later found the United States were administered as possessions of the British Empire by Crown-appointed governors, though local governments held elections open to most white male property owners. The colonial population grew rapidly from Maine to Georgia, eclipsing Native American populations; by the 1770s, the natural increase of the population was such that only a small minority of Americans had been born overseas. The colonies' distance from Britain facilitated the entrenchment of self-governance, and the First Great Awakening, a series of Christian revivals, fueled colonial interest in guaranteed religious liberty. Following its victory in the French and Indian War, Britain began to assert greater control over local affairs in the Thirteen Colonies, resulting in growing political resistance. 
One of the primary grievances of the colonists was the denial of their rights as Englishmen, particularly the right to representation in the British government that taxed them. To demonstrate their dissatisfaction and resolve, the First Continental Congress met in 1774 and passed the Continental Association, a colonial boycott of British goods enforced by local "committees of safety" that proved effective. The British attempt to then disarm the colonists resulted in the 1775 Battles of Lexington and Concord, igniting the American Revolutionary War. At the Second Continental Congress, the colonies appointed George Washington commander-in-chief of the Continental Army, and created a committee that named Thomas Jefferson to draft the Declaration of Independence. Two days after the Second Continental Congress passed the Lee Resolution to create an independent, sovereign nation, the Declaration was adopted on July 4, 1776. The political values of the American Revolution evolved from an armed rebellion demanding reform within an empire to a revolution that created a new social and governing system founded on the defense of liberty and the protection of inalienable natural rights; sovereignty of the people; republicanism over monarchy, aristocracy, and other hereditary political power; civic virtue; and an intolerance of political corruption. The Founding Fathers of the United States, who included Washington, Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, James Madison, Thomas Paine, and many others, were inspired by Classical, Renaissance, and Enlightenment philosophies and ideas. Though in practical effect since its drafting in 1777, the Articles of Confederation was ratified in 1781 and formally established a decentralized government that operated until 1789. After the British surrender at the siege of Yorktown in 1781, American sovereignty was internationally recognized by the Treaty of Paris (1783), through which the U.S. 
gained territory stretching west to the Mississippi River, north to present-day Canada, and south to Spanish Florida. The Northwest Ordinance (1787) established the precedent by which the country's territory would expand with the admission of new states, rather than the expansion of existing states. The U.S. Constitution was drafted at the 1787 Constitutional Convention to overcome the limitations of the Articles. It went into effect in 1789, creating a federal republic governed by three separate branches that together formed a system of checks and balances. George Washington was elected the country's first president under the Constitution, and the Bill of Rights was adopted in 1791 to allay skeptics' concerns about the power of the more centralized government. Washington's resignation as commander-in-chief after the Revolutionary War and his later refusal to run for a third term established a precedent for the supremacy of civil authority in the United States and the peaceful transfer of power. In the late 18th century, American settlers began to expand westward in larger numbers, many with a sense of manifest destiny. The Louisiana Purchase of 1803 from France nearly doubled the territory of the United States. Lingering issues with Britain remained, leading to the War of 1812, which was fought to a draw. Spain ceded Florida and its Gulf Coast territory in 1819. The Missouri Compromise of 1820, which admitted Missouri as a slave state and Maine as a free state, attempted to balance the desire of northern states to prevent the expansion of slavery into new territories with that of southern states to extend it there. Primarily, the compromise prohibited slavery in all other lands of the Louisiana Purchase north of the 36°30′ parallel. As Americans expanded further into territory inhabited by Native Americans, the federal government implemented policies of Indian removal or assimilation. 
The most significant such legislation was the Indian Removal Act of 1830, a key policy of President Andrew Jackson. It resulted in the Trail of Tears (1830–1850), in which an estimated 60,000 Native Americans living east of the Mississippi River were forcibly removed and displaced to lands far to the west, causing 13,200 to 16,700 deaths along the forced march. Settler expansion as well as this influx of Indigenous peoples from the East resulted in the American Indian Wars west of the Mississippi. During the colonial period, slavery became legal in all of the Thirteen Colonies, but by 1770 it provided the main labor force in the large-scale, agriculture-dependent economies of the Southern Colonies from Maryland to Georgia. The practice began to be significantly questioned during the American Revolution, and spurred by an active abolitionist movement that had reemerged in the 1830s, states in the North enacted laws to prohibit slavery within their boundaries. At the same time, support for slavery had strengthened in Southern states, with widespread use of inventions such as the cotton gin (1793) having made slavery immensely profitable for Southern elites. The United States annexed the Republic of Texas in 1845, and the 1846 Oregon Treaty led to U.S. control of the present-day American Northwest. A dispute with Mexico over Texas led to the Mexican–American War (1846–1848). After the victory of the U.S., Mexico recognized U.S. sovereignty over Texas, New Mexico, and California in the 1848 Mexican Cession; the cession's lands also included the future states of Nevada, Colorado, and Utah. The California gold rush of 1848–1849 spurred a huge migration of white settlers to the Pacific coast, leading to even more confrontations with Native populations. One of the most violent, the California genocide of thousands of Native inhabitants, lasted into the mid-1870s. Additional western territories and states were created. 
Throughout the 1850s, the sectional conflict regarding slavery was further inflamed by national legislation in the U.S. Congress and decisions of the Supreme Court. In Congress, the Fugitive Slave Act of 1850 mandated the forcible return to their owners in the South of slaves taking refuge in non-slave states, while the Kansas–Nebraska Act of 1854 effectively gutted the anti-slavery requirements of the Missouri Compromise. In its Dred Scott decision of 1857, the Supreme Court ruled against a slave brought into non-slave territory, simultaneously declaring the entire Missouri Compromise to be unconstitutional. These and other events exacerbated tensions between North and South that would culminate in the American Civil War (1861–1865). Beginning with South Carolina, 11 slave-state governments voted to secede from the United States in 1861, joining to create the Confederate States of America. All other state governments remained loyal to the Union.[q] War broke out in April 1861 after the Confederacy bombarded Fort Sumter. Following the Emancipation Proclamation on January 1, 1863, many freed slaves joined the Union army. The war began to turn in the Union's favor following the 1863 Siege of Vicksburg and Battle of Gettysburg, and the Confederates surrendered in 1865 after the Union's victory in the Battle of Appomattox Court House. Efforts toward reconstruction in the secessionist South had begun as early as 1862, but it was only after President Lincoln's assassination that the three Reconstruction Amendments to the Constitution were ratified to protect civil rights. The amendments codified nationally the abolition of slavery and involuntary servitude except as punishment for crimes, promised equal protection under the law for all persons, and prohibited discrimination on the basis of race or previous enslavement. As a result, African Americans took an active political role in ex-Confederate states in the decade following the Civil War. 
The former Confederate states were readmitted to the Union, beginning with Tennessee in 1866 and ending with Georgia in 1870. National infrastructure, including transcontinental telegraph and railroads, spurred growth in the American frontier. This was accelerated by the Homestead Acts, through which nearly 10 percent of the total land area of the United States was given away free to some 1.6 million homesteaders. From 1865 through 1917, an unprecedented stream of immigrants arrived in the United States, including 24.4 million from Europe. Most came through the Port of New York, as New York City and other large cities on the East Coast became home to large Jewish, Irish, and Italian populations. Many Northern Europeans as well as significant numbers of Germans and other Central Europeans moved to the Midwest. At the same time, about one million French Canadians migrated from Quebec to New England. During the Great Migration, millions of African Americans left the rural South for urban areas in the North. Alaska was purchased from Russia in 1867. The Compromise of 1877 is generally considered the end of the Reconstruction era, as it resolved the electoral crisis following the 1876 presidential election and led President Rutherford B. Hayes to reduce the role of federal troops in the South. The Redeemers immediately began evicting the carpetbaggers and quickly regained local control of Southern politics in the name of white supremacy. African Americans endured a period of heightened, overt racism following Reconstruction, a time often considered the nadir of American race relations. A series of Supreme Court decisions, including Plessy v. Ferguson, emptied the Fourteenth and Fifteenth Amendments of their force, allowing Jim Crow laws in the South to remain unchecked, sundown towns in the Midwest, and segregation in communities across the country, which would be reinforced in part by the policy of redlining later adopted by the federal Home Owners' Loan Corporation. 
An explosion of technological advancement, accompanied by the exploitation of cheap immigrant labor, led to rapid economic expansion during the Gilded Age of the late 19th century. It continued into the early 20th century, by which time the United States outpaced the economies of Britain, France, and Germany combined. This fostered the amassing of power by a few prominent industrialists, largely by their formation of trusts and monopolies to prevent competition. Tycoons led the nation's expansion in the railroad, petroleum, and steel industries. The United States emerged as a pioneer of the automotive industry. These changes resulted in significant increases in economic inequality, slum conditions, and social unrest, creating the environment for labor unions and socialist movements to begin to flourish. This period eventually ended with the advent of the Progressive Era, which was characterized by significant economic and social reforms. Pro-American elements in Hawaii overthrew the Hawaiian monarchy; the islands were annexed in 1898. That same year, Puerto Rico, the Philippines, and Guam were ceded to the U.S. by Spain after the latter's defeat in the Spanish–American War. (The Philippines was granted full independence from the U.S. on July 4, 1946, following World War II. Puerto Rico and Guam have remained U.S. territories.) American Samoa was acquired by the United States in 1900 after the Second Samoan Civil War. The U.S. Virgin Islands were purchased from Denmark in 1917. The United States entered World War I alongside the Allies in 1917, helping to turn the tide against the Central Powers. In 1920, a constitutional amendment granted nationwide women's suffrage. During the 1920s and 1930s, radio for mass communication and early television transformed communications nationwide. The Wall Street Crash of 1929 triggered the Great Depression, to which President Franklin D. 
Roosevelt responded with the New Deal plan of "reform, recovery and relief", a series of unprecedented and sweeping recovery programs and employment relief projects combined with financial reforms and regulations. Initially neutral during World War II, the U.S. began supplying war materiel to the Allies of World War II in March 1941 and entered the war in December after Japan's attack on Pearl Harbor. Agreeing to a "Europe first" policy, the U.S. concentrated its wartime efforts on Japan's allies Italy and Germany until their final defeat in May 1945. The U.S. developed the first nuclear weapons and used them against the Japanese cities of Hiroshima and Nagasaki in August 1945, ending the war. The United States was one of the "Four Policemen" who met to plan the post-war world, alongside the United Kingdom, the Soviet Union, and China. The U.S. emerged relatively unscathed from the war, with even greater economic power and international political influence. The end of World War II in 1945 left the U.S. and the Soviet Union as superpowers, each with its own political, military, and economic sphere of influence. Geopolitical tensions between the two superpowers soon led to the Cold War. The U.S. implemented a policy of containment intended to limit the Soviet Union's sphere of influence; engaged in regime change against governments perceived to be aligned with the Soviets; and prevailed in the Space Race, which culminated with the first crewed Moon landing in 1969. Domestically, the U.S. experienced economic growth, urbanization, and population growth following World War II. The civil rights movement emerged, with Martin Luther King Jr. becoming a prominent leader in the early 1960s. The Great Society plan of President Lyndon B. Johnson's administration resulted in groundbreaking and broad-reaching laws, policies and a constitutional amendment to counteract some of the worst effects of lingering institutional racism. The counterculture movement in the U.S. 
brought significant social changes, including the liberalization of attitudes toward recreational drug use and sexuality. It also encouraged open defiance of the military draft (leading to the end of conscription in 1973) and wide opposition to U.S. intervention in Vietnam, with the U.S. withdrawing completely in 1975. A societal shift in the roles of women was significantly responsible for the large increase in female paid labor participation starting in the 1970s, and by 1985 the majority of American women aged 16 and older were employed. The fall of communism and the dissolution of the Soviet Union from 1989 to 1991 marked the end of the Cold War and left the United States as the world's sole superpower. This cemented the United States' global influence, reinforcing the concept of the "American Century" as the U.S. dominated international political, cultural, economic, and military affairs. The 1990s saw the longest recorded economic expansion in American history, a dramatic decline in U.S. crime rates, and advances in technology. Throughout this decade, technological innovations such as the World Wide Web, the evolution of the Pentium microprocessor in accordance with Moore's law, rechargeable lithium-ion batteries, the first gene therapy trial, and cloning either emerged in the U.S. or were improved upon there. The Human Genome Project was formally launched in 1990, while Nasdaq became the first stock market in the United States to trade online in 1998. In the Gulf War of 1991, an American-led international coalition of states expelled an Iraqi invasion force that had occupied neighboring Kuwait. The September 11 attacks on the United States in 2001 by the pan-Islamist militant organization al-Qaeda led to the war on terror and subsequent military interventions in Afghanistan and in Iraq. The U.S. housing bubble culminated in 2007 with the Great Recession, the largest economic contraction since the Great Depression. 
In the 2010s and early 2020s, the United States has experienced increased political polarization and democratic backsliding. The country's polarization was violently reflected in the January 2021 Capitol attack, when a mob of insurrectionists entered the U.S. Capitol and sought to prevent the peaceful transfer of power in an attempted self-coup d'état. Geography The United States is the world's third-largest country by total area behind Russia and Canada.[c] The 48 contiguous states and the District of Columbia have a combined area of 3,119,885 square miles (8,080,470 km2). In 2021, the United States had 8% of the Earth's permanent meadows and pastures and 10% of its cropland. Starting in the east, the coastal plain of the Atlantic seaboard gives way to inland forests and rolling hills in the Piedmont plateau region. The Appalachian Mountains and the Adirondack Massif separate the East Coast from the Great Lakes and the grasslands of the Midwest. The Mississippi River System, the world's fourth-longest river system, runs predominantly north–south through the center of the country. The flat and fertile prairie of the Great Plains stretches to the west, interrupted by a highland region in the southeast. The Rocky Mountains, west of the Great Plains, extend north to south across the country, peaking at over 14,000 feet (4,300 m) in Colorado. The supervolcano underlying Yellowstone National Park in the Rocky Mountains, the Yellowstone Caldera, is the continent's largest volcanic feature. Farther west are the rocky Great Basin and the Chihuahuan, Sonoran, and Mojave deserts. In the northwest corner of Arizona, carved by the Colorado River, is the Grand Canyon, a steep-sided canyon and popular tourist destination known for its overwhelming visual size and intricate, colorful landscape. The Cascade and Sierra Nevada mountain ranges run close to the Pacific coast. 
The lowest and highest points in the contiguous United States are in the state of California, about 84 miles (135 km) apart. At an elevation of 20,310 feet (6,190.5 m), Alaska's Denali (also called Mount McKinley) is the highest peak in the country and on the continent. Active volcanoes in the U.S. are common throughout Alaska's Alexander and Aleutian Islands. Located entirely outside North America, the archipelago of Hawaii consists of volcanic islands, physiographically and ethnologically part of the Polynesian subregion of Oceania. In addition to its total land area, the United States has one of the world's largest marine exclusive economic zones, spanning approximately 4.5 million square miles (11.7 million km2) of ocean. With its large size and geographic variety, the United States includes most climate types. East of the 100th meridian, the climate ranges from humid continental in the north to humid subtropical in the south. The western Great Plains are semi-arid. Many mountainous areas of the American West have an alpine climate. The climate is arid in the Southwest, Mediterranean in coastal California, and oceanic in coastal Oregon, Washington, and southern Alaska. Most of Alaska is subarctic or polar. Hawaii, the southern tip of Florida, and U.S. territories in the Caribbean and Pacific are tropical. The United States receives more high-impact extreme weather incidents than any other country. States bordering the Gulf of Mexico are prone to hurricanes, and most of the world's tornadoes occur in the country, mainly in Tornado Alley. Due to climate change in the country, extreme weather has become more frequent in the U.S. in the 21st century, with three times the number of reported heat waves compared to the 1960s. Since the 1990s, droughts in the American Southwest have become more persistent and more severe. The regions most attractive to the population are also among the most vulnerable. The U.S. 
is one of 17 megadiverse countries containing large numbers of endemic species: about 17,000 species of vascular plants occur in the contiguous United States and Alaska, and over 1,800 species of flowering plants are found in Hawaii, few of which occur on the mainland. The United States is home to 428 mammal species, 784 birds, 311 reptiles, 295 amphibians, and around 91,000 insect species. There are 63 national parks, and hundreds of other federally managed monuments, forests, and wilderness areas, administered by the National Park Service and other agencies. About 28% of the country's land is publicly owned and federally managed, primarily in the Western States. Most of this land is protected, though some is leased for commercial use, and less than one percent is used for military purposes. Environmental issues in the United States include debates on non-renewable resources and nuclear energy, air and water pollution, biodiversity, logging and deforestation, and climate change. The U.S. Environmental Protection Agency (EPA) is the federal agency charged with addressing most environment-related issues. The idea of wilderness has shaped the management of public lands since the passage of the Wilderness Act in 1964. The Endangered Species Act of 1973 provides a way to protect threatened and endangered species and their habitats. The United States Fish and Wildlife Service implements and enforces the Act. In 2024, the U.S. ranked 35th among 180 countries in the Environmental Performance Index. Government and politics The United States is a federal republic of 50 states and a federal capital district, Washington, D.C. The U.S. asserts sovereignty over five unincorporated territories and several uninhabited island possessions. It is the world's oldest surviving federation, and its presidential system of federal government has been adopted, in whole or in part, by many newly independent states worldwide following their decolonization. 
The Constitution of the United States serves as the country's supreme legal document. Most scholars describe the United States as a liberal democracy.[r] Composed of three branches, all headquartered in Washington, D.C., the federal government is the national government of the United States. The U.S. Constitution establishes a separation of powers intended to provide a system of checks and balances to prevent any of the three branches from becoming supreme. The three-branch system is known as the presidential system, in contrast to the parliamentary system where the executive is part of the legislative body. Many countries around the world adopted this aspect of the 1789 Constitution of the United States, especially in the postcolonial Americas. In the U.S. federal system, sovereign powers are shared between three levels of government specified in the Constitution: the federal government, the states, and Indian tribes. The U.S. also asserts sovereignty over five permanently inhabited territories: American Samoa, Guam, the Northern Mariana Islands, Puerto Rico, and the U.S. Virgin Islands. Residents of the 50 states are governed by their elected state government, under state constitutions compatible with the national constitution, and by elected local governments that are administrative divisions of a state. States are subdivided into counties or county equivalents, and (except for Hawaii) further divided into municipalities, each administered by elected representatives. The District of Columbia is a federal district containing the U.S. capital, Washington, D.C. The federal district is an administrative division of the federal government. Indian country is made up of 574 federally recognized tribes and 326 Indian reservations. They hold a government-to-government relationship with the U.S. federal government in Washington and are legally defined as domestic dependent nations with inherent tribal sovereignty rights. In addition to the five major territories, the U.S. 
also asserts sovereignty over the United States Minor Outlying Islands in the Pacific Ocean and the Caribbean. The seven undisputed islands without permanent populations are Baker Island, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Atoll, and Palmyra Atoll. U.S. sovereignty over the unpopulated Bajo Nuevo Bank, Navassa Island, Serranilla Bank, and Wake Island is disputed. The Constitution is silent on political parties. However, they developed independently in the 18th century with the Federalist and Anti-Federalist parties. Since then, the United States has operated as a de facto two-party system, though the parties have changed over time. Since the mid-19th century, the two main national parties have been the Democratic Party and the Republican Party. The former is perceived as relatively liberal in its political platform while the latter is perceived as relatively conservative in its platform. The United States has an established structure of foreign relations, with the world's second-largest diplomatic corps as of 2024. It is a permanent member of the United Nations Security Council and home to the United Nations headquarters. The United States is a member of the G7, G20, and OECD intergovernmental organizations. Almost all countries have embassies and many have consulates (official representatives) in the country. Likewise, nearly all countries host formal diplomatic missions with the United States, except Iran, North Korea, and Bhutan. Though Taiwan does not have formal diplomatic relations with the U.S., it maintains close unofficial relations. The United States regularly supplies Taiwan with military equipment to deter potential Chinese aggression. Its geopolitical attention also turned to the Indo-Pacific when the United States joined the Quadrilateral Security Dialogue with Australia, India, and Japan. 
The United States has a "Special Relationship" with the United Kingdom and strong ties with Canada, Australia, New Zealand, the Philippines, Japan, South Korea, Israel, and several European Union countries such as France, Italy, Germany, Spain, and Poland. The U.S. works closely with its NATO allies on military and national security issues, and with countries in the Americas through the Organization of American States and the United States–Mexico–Canada Agreement (USMCA). The U.S. exercises full international defense authority and responsibility for Micronesia, the Marshall Islands, and Palau through the Compact of Free Association. It has increasingly conducted strategic cooperation with India, while its ties with China have steadily deteriorated. Beginning in 2014, the U.S. became a key ally of Ukraine. After Donald Trump was elected U.S. president in 2024, he sought to negotiate an end to the Russo-Ukrainian War. He paused all military aid to Ukraine in March 2025, although the aid resumed later. Trump also ended U.S. intelligence sharing with the country, but this too was eventually restored. The president is the commander-in-chief of the United States Armed Forces and appoints its leaders, the secretary of defense and the Joint Chiefs of Staff. The Department of Defense, headquartered at the Pentagon near Washington, D.C., administers five of the six service branches, which are made up of the U.S. Army, Marine Corps, Navy, Air Force, and Space Force. The Coast Guard is administered by the Department of Homeland Security in peacetime and can be transferred to the Department of the Navy in wartime. The total strength of the military is about 1.3 million active-duty personnel, with an additional 400,000 in reserve. The United States spent $997 billion on its military in 2024, which is by far the largest amount of any country, making up 37% of global military spending and accounting for 3.4% of the country's GDP. The U.S. 
possesses 42% of the world's nuclear weapons—the second-largest stockpile after that of Russia. The U.S. military is widely regarded as the most powerful and advanced in the world. The United States has the third-largest combined armed forces in the world, behind the Chinese People's Liberation Army and Indian Armed Forces. The U.S. military operates about 800 bases and facilities abroad, and maintains deployments of more than 100 active-duty personnel in 25 foreign countries. The United States has engaged in over 400 military interventions since its founding in 1776, with over half of these occurring between 1950 and 2019 and 25% occurring in the post-Cold War era. State defense forces (SDFs) are military units that operate under the sole authority of a state government. SDFs are authorized by state and federal law but are under the command of the state's governor. By contrast, the 54 U.S. National Guard organizations[t] fall under the dual control of state or territorial governments and the federal government; their units can also become federalized entities, but SDFs cannot be federalized. The National Guard personnel of a state or territory can be federalized by the president under the National Defense Act Amendments of 1933; this legislation created the Guard and provides for the integration of Army National Guard and Air National Guard units and personnel into the U.S. Army and (since 1947) the U.S. Air Force. The total number of National Guard members is about 430,000, while the estimated combined strength of SDFs is less than 10,000. There are about 18,000 police agencies in the United States, from the local to the national level. Law in the United States is mainly enforced by local police departments and sheriff's departments in their municipal or county jurisdictions. The state police departments have authority in their respective state, and federal agencies such as the Federal Bureau of Investigation (FBI) and the U.S. 
Marshals Service have national jurisdiction and specialized duties, such as protecting civil rights and national security, enforcing U.S. federal courts' rulings and federal laws, and combating interstate criminal activity. State courts conduct almost all civil and criminal trials, while federal courts adjudicate the much smaller number of civil and criminal cases that relate to federal law. There is no unified "criminal justice system" in the United States. The American prison system is largely heterogeneous, with thousands of relatively independent systems operating across federal, state, local, and tribal levels. In 2025, "these systems hold nearly 2 million people in 1,566 state prisons, 98 federal prisons, 3,116 local jails, 1,277 juvenile correctional facilities, 133 immigration detention facilities, and 80 Indian country jails, as well as in military prisons, civil commitment centers, state psychiatric hospitals, and prisons in the U.S. territories." Despite disparate systems of confinement, four main institutions dominate: federal prisons, state prisons, local jails, and juvenile correctional facilities. Federal prisons are run by the Federal Bureau of Prisons and hold pretrial detainees as well as people who have been convicted of federal crimes. State prisons, run by the department of corrections of each state, hold people sentenced and serving prison time (usually longer than one year) for felony offenses. Local jails are county or municipal facilities that incarcerate defendants prior to trial; they also hold those serving short sentences (typically under a year). Juvenile correctional facilities are operated by local or state governments and serve as longer-term placements for any minor adjudicated as delinquent and ordered by a judge to be confined.
In January 2023, the United States had the sixth-highest per capita incarceration rate in the world—531 people per 100,000 inhabitants—and the largest prison and jail population in the world, with more than 1.9 million people incarcerated. An analysis of the World Health Organization Mortality Database from 2010 showed U.S. homicide rates "were 7 times higher than in other high-income countries, driven by a gun homicide rate that was 25 times higher".

Economy

The U.S. has a highly developed mixed economy that has been the world's largest nominally since about 1890. Its 2024 gross domestic product (GDP)[e] of more than $29 trillion constituted over 25% of nominal global economic output, or 15% at purchasing power parity (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks first in the world by nominal GDP, second when adjusted for purchasing power parity, and ninth by PPP-adjusted GDP per capita. In February 2024, the total U.S. federal government debt was $34.4 trillion. Of the world's 500 largest companies by revenue, 138 were headquartered in the U.S. in 2025, the highest number of any country. The U.S. dollar is the currency most used in international transactions and the world's foremost reserve currency, backed by the country's dominant economy, its military, the petrodollar system, its large U.S. treasuries market, and its linked eurodollar. Several countries use it as their official currency, and in others it is the de facto currency. The U.S. is party to several free trade agreements, including the USMCA. Although the United States has reached a post-industrial level of economic development and is often described as having a service economy, it remains a major industrial power; in 2024, the U.S. manufacturing sector was the world's second-largest by value output after China's.
New York City is the world's principal financial center, and its metropolitan area is the world's largest metropolitan economy. The New York Stock Exchange and Nasdaq, both located in New York City, are the world's two largest stock exchanges by market capitalization and trade volume. The United States is at the forefront of technological advancement and innovation in many economic fields, especially in artificial intelligence; electronics and computers; pharmaceuticals; and medical, aerospace and military equipment. The country's economy is fueled by abundant natural resources, a well-developed infrastructure, and high productivity. The largest trading partners of the United States are the European Union, Mexico, Canada, China, Japan, South Korea, the United Kingdom, Vietnam, India, and Taiwan. The United States is the world's largest importer and second-largest exporter.[u] It is by far the world's largest exporter of services. Americans have the highest average household and employee income among OECD member states, and had the fourth-highest median household income in 2023, up from sixth-highest in 2013. With personal consumption expenditures of over $18.5 trillion in 2023, the U.S. has a heavily consumer-driven economy and is the world's largest consumer market. The U.S. ranked first in the number of dollar billionaires and millionaires in 2023, with 735 billionaires and nearly 22 million millionaires. Wealth in the United States is highly concentrated; in 2011, the richest 10% of the adult population owned 72% of the country's household wealth, while the bottom 50% owned just 2%. U.S. wealth inequality has increased substantially since the late 1980s, and income inequality in the U.S. reached a record high in 2019. In 2024, the country had some of the highest wealth and income inequality levels among OECD countries. Since the 1970s, there has been a decoupling of U.S. wage gains from worker productivity.
In 2016, the top fifth of earners took home more than half of all income, giving the U.S. one of the widest income distributions among OECD countries. There were about 771,480 homeless persons in the U.S. in 2024. In 2022, 6.4 million children experienced food insecurity. Feeding America estimates that around one in five, or approximately 13 million, children experience hunger in the U.S. and do not know where or when they will get their next meal. Also in 2022, about 37.9 million people, or 11.5% of the U.S. population, were living in poverty. The United States has a smaller welfare state and redistributes less income through government action than most other high-income countries. It is the only advanced economy that does not guarantee its workers paid vacation nationally and one of a few countries in the world without federal paid family leave as a legal right. The United States has a higher percentage of low-income workers than almost any other developed country, largely because of a weak collective bargaining system and lack of government support for at-risk workers. The United States has been a leader in technological innovation since the late 19th century and scientific research since the mid-20th century. Methods for producing interchangeable parts and the establishment of a machine tool industry enabled the large-scale manufacturing of U.S. consumer products in the late 19th century. By the early 20th century, factory electrification, the introduction of the assembly line, and other labor-saving techniques created the system of mass production. In the 21st century, the United States continues to be one of the world's foremost scientific powers, though China has emerged as a major competitor in many fields. The U.S. has the highest research and development expenditures of any country and ranks ninth as a percentage of GDP. In 2022, the United States was (after China) the country with the second-highest number of published scientific papers. 
In 2021, the U.S. ranked second (also after China) by the number of patent applications, and third by trademark and industrial design applications (after China and Germany), according to World Intellectual Property Indicators. In 2025 the United States ranked third (after Switzerland and Sweden) in the Global Innovation Index. The United States is considered to be a world leader in the development of artificial intelligence technology. In 2023, the United States was ranked the second most technologically advanced country in the world (after South Korea) by Global Finance magazine. The United States has maintained a space program since the late 1950s, beginning with the establishment of the National Aeronautics and Space Administration (NASA) in 1958. NASA's Apollo program (1961–1972) achieved the first crewed Moon landing with the 1969 Apollo 11 mission; it remains one of the agency's most significant milestones. Other major endeavors by NASA include the Space Shuttle program (1981–2011), the Voyager program (1972–present), the Hubble and James Webb space telescopes (launched in 1990 and 2021, respectively), and the multi-mission Mars Exploration Program (Spirit and Opportunity, Curiosity, and Perseverance). NASA is one of five agencies collaborating on the International Space Station (ISS); U.S. contributions to the ISS include several modules, including Destiny (2001), Harmony (2007), and Tranquility (2010), as well as ongoing logistical and operational support. The United States private sector dominates the global commercial spaceflight industry. Prominent American spaceflight contractors include Blue Origin, Boeing, Lockheed Martin, Northrop Grumman, and SpaceX. NASA programs such as the Commercial Crew Program, Commercial Resupply Services, Commercial Lunar Payload Services, and NextSTEP have facilitated growing private-sector involvement in American spaceflight. 
In 2023, the United States received approximately 84% of its energy from fossil fuels; its largest source of energy was petroleum (38%), followed by natural gas (36%), renewable sources (9%), coal (9%), and nuclear power (9%). In 2022, the United States constituted about 4% of the world's population but consumed around 16% of the world's energy. The U.S. ranks as the second-highest emitter of greenhouse gases, behind China. The U.S. is the world's largest producer of nuclear power, generating around 30% of the world's nuclear electricity. It also has the highest number of nuclear power reactors of any country. As of 2024, the U.S. plans to triple its nuclear power capacity by 2050. The United States' 4 million miles (6.4 million kilometers) of road network, owned almost entirely by state and local governments, is the longest in the world. The extensive Interstate Highway System that connects all major U.S. cities is funded mostly by the federal government but maintained by state departments of transportation. The system is further extended by state highways and some private toll roads. In 2022, the U.S. was among the top ten countries in vehicle ownership per capita, with 850 vehicles per 1,000 people. A 2022 study found that 76% of U.S. commuters drive alone and 14% ride a bicycle, including bike owners and users of bike-sharing networks. About 11% use some form of public transportation. Public transportation in the United States is well developed in the largest urban areas, notably New York City, Washington, D.C., Boston, Philadelphia, Chicago, and San Francisco; otherwise, coverage is generally less extensive than in most other developed countries. The U.S. also has many relatively car-dependent localities. Long-distance intercity travel is provided primarily by airlines, but travel by rail is more common along the Northeast Corridor, the only high-speed rail in the U.S. that meets international standards.
Amtrak, the country's government-sponsored national passenger rail company, has a relatively sparse network compared to those of Western European countries. Service is concentrated in the Northeast, California, the Midwest, the Pacific Northwest, and Virginia/Southeast. The United States has an extensive air transportation network. U.S. civilian airlines are all privately owned. The three largest airlines in the world, by total number of passengers carried, are U.S.-based; American Airlines became the global leader after its 2013 merger with US Airways. Of the 50 busiest airports in the world, 16 are in the United States, as well as five of the top 10. The world's busiest airport by passenger volume is Hartsfield–Jackson Atlanta International in Atlanta, Georgia. In 2022, most of the 19,969 U.S. airports were owned and operated by local government authorities, and some were privately owned. Some 5,193 are designated as "public use", including for general aviation. The Transportation Security Administration (TSA) has provided security at most major airports since 2001. The country's rail transport network, the longest in the world at 182,412.3 mi (293,564.2 km), handles mostly freight (in contrast to more passenger-centered rail in Europe). Because U.S. railroads are often privately owned, they lag behind those of the rest of the world in electrification. The country's inland waterways are the world's fifth-longest, totaling 25,482 mi (41,009 km). They are used extensively for freight, recreation, and a small amount of passenger traffic. Of the world's 50 busiest container ports, four are located in the United States, with the busiest in the country being the Port of Los Angeles.

Demographics

The U.S. Census Bureau reported 331,449,281 residents on April 1, 2020,[v] making the United States the third-most-populous country in the world, after India and China.
The Census Bureau's official 2025 population estimate was 341,784,857, an increase of 3.1% since the 2020 census. According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S. population had a net gain of one person every 16 seconds, or about 5,400 people per day. In 2023, 51% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 34% had never been married. In 2023, the total fertility rate for the U.S. stood at 1.6 children per woman, and, at 23%, the country had the world's highest rate of children living in single-parent households in 2019. Most Americans live in the suburbs of major metropolitan areas. The United States has a diverse population; 37 ancestry groups have more than one million members. White Americans with ancestry from Europe, the Middle East, or North Africa form the largest racial and ethnic group, at 57.8% of the United States population. Hispanic and Latino Americans form the second-largest group, at 18.7% of the United States population. African Americans constitute the country's third-largest ancestry group, at 12.1% of the total U.S. population. Asian Americans are the country's fourth-largest group, composing 5.9% of the United States population. The country's 3.7 million Native Americans account for about 1%, and some 574 Native tribes are recognized by the federal government. In 2024, the median age of the United States population was 39.1 years. While many languages and dialects are spoken in the United States, English is by far the most commonly spoken and written. English is the de facto official language of the United States, and in 2025, Executive Order 14224 declared English official. However, the U.S. has never had a de jure official language, as Congress has never passed a law designating English as official for all three federal branches. Some laws, such as U.S. naturalization requirements, nonetheless standardize English.
Twenty-eight states and the United States Virgin Islands have laws that designate English as the sole official language; 19 states and the District of Columbia have no official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English: Hawaii (Hawaiian), Alaska (twenty Native languages),[w] South Dakota (Sioux), American Samoa (Samoan), Puerto Rico (Spanish), Guam (Chamorro), and the Northern Mariana Islands (Carolinian and Chamorro). In total, 169 Native American languages are spoken in the United States. In Puerto Rico, Spanish is more widely spoken than English. According to the American Community Survey (2020), some 245.4 million people in the U.S. age five and older spoke only English at home. About 41.2 million spoke Spanish at home, making it the second most commonly used language. Other languages spoken at home by one million people or more include Chinese (3.40 million), Tagalog (1.71 million), Vietnamese (1.52 million), Arabic (1.39 million), French (1.18 million), Korean (1.07 million), and Russian (1.04 million). German, spoken by 1 million people at home in 2010, fell to 857,000 total speakers in 2020. America's immigrant population is by far the world's largest in absolute terms. In 2022, there were 87.7 million immigrants and U.S.-born children of immigrants in the United States, accounting for nearly 27% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 6% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants. In 2019, the top countries of origin for immigrants were Mexico (24% of immigrants), India (6%), China (5%), the Philippines (4.5%), and El Salvador (3%). In fiscal year 2022, over one million immigrants (most of whom entered through family reunification) were granted legal residence. 
The undocumented immigrant population in the U.S. reached a record high of 14 million in 2023. The First Amendment guarantees the free exercise of religion in the country and forbids Congress from passing laws respecting its establishment. Religious practice is widespread, among the most diverse in the world, and profoundly vibrant. The country has the world's largest Christian population, which includes the fourth-largest population of Catholics. Other notable faiths include Judaism, Buddhism, Hinduism, Islam, New Age, and Native American religions. Religious practice varies significantly by region. "Ceremonial deism" is common in American culture. The overwhelming majority of Americans believe in a higher power or spiritual force, engage in spiritual practices such as prayer, and consider themselves religious or spiritual. In the Southern United States' "Bible Belt", evangelical Protestantism plays a significant role culturally; New England and the Western United States tend to be more secular. Mormonism, a Restorationist movement founded in the U.S. in 1830, is the predominant religion in Utah and a major religion in Idaho. About 82% of Americans live in metropolitan areas, particularly in suburbs; about half of those reside in cities with populations over 50,000. In 2022, 333 incorporated municipalities had populations over 100,000, nine cities had more than one million residents, and four cities—New York City, Los Angeles, Chicago, and Houston—had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West. According to the Centers for Disease Control and Prevention (CDC), average U.S. life expectancy at birth reached 79.0 years in 2024, its highest recorded level. This was an increase of 0.6 years over 2023.
The CDC attributed the improvement to a significant fall in the number of fatal drug overdoses in the country, noting that "heart disease continues to be the leading cause of death in the United States, followed by cancer and unintentional injuries." In 2024, life expectancy at birth for American men rose to 76.5 years (+0.7 years compared to 2023), while life expectancy for women was 81.4 years (+0.3 years). Starting in 1998, life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been increasing ever since. The Commonwealth Fund reported in 2020 that the U.S. had the highest suicide rate among high-income countries. Approximately one-third of the U.S. adult population is obese and another third is overweight. The U.S. healthcare system far outspends that of any other country, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes when compared to peer countries for reasons that are debated. The United States is the only developed country without a system of universal healthcare, and a significant proportion of its population does not carry health insurance. Government-funded healthcare coverage for the poor (Medicaid) and for those age 65 and older (Medicare) is available to Americans who meet the programs' income or age qualifications. In 2010, President Obama signed the Patient Protection and Affordable Care Act into law.[x] Abortion in the United States is not federally protected, and is illegal or restricted in 17 states. American primary and secondary education, known in the U.S. as K–12 ("kindergarten through 12th grade"), is decentralized. School systems are operated by state, territorial, and sometimes municipal governments and regulated by the U.S. Department of Education.
In general, children are required to attend school or an approved homeschool from the age of five or six (kindergarten or first grade) until they are 18 years old. This often brings students through the 12th grade, the final year of a U.S. high school, but some states and territories allow them to leave school earlier, at age 16 or 17. The U.S. spends more on education per student than any other country, an average of $18,614 per year per public elementary and secondary school student in 2020–2021. Among Americans age 25 and older, 92.2% graduated from high school, 62.7% attended some college, 37.7% earned a bachelor's degree, and 14.2% earned a graduate degree. The U.S. literacy rate is near-universal. The U.S. has produced the most Nobel Prize winners of any country, with 411 laureates (413 awards). U.S. tertiary or higher education has earned a global reputation. Many of the world's top universities, as listed by various ranking organizations, are in the United States, including 19 of the top 25. American higher education is dominated by state university systems, although the country's many private universities and colleges enroll about 20% of all American students. Local community colleges generally offer open admissions, lower tuition, and coursework leading to a two-year associate degree or a non-degree certificate. In public expenditures on higher education, the U.S. spends more per student than the OECD average, and Americans spend more than any other nation in combined public and private spending. Colleges and universities directly funded by the federal government do not charge tuition and are limited to military personnel and government employees; they include the U.S. service academies, the Naval Postgraduate School, and military staff colleges. Despite some student loan forgiveness programs, student loan debt increased by 102% between 2010 and 2020 and exceeded $1.7 trillion in 2022.
Culture and society

The United States is home to a wide variety of ethnic groups, traditions, and customs. The country has been described as having the values of individualism and personal autonomy, as well as a strong work ethic and competitiveness. Voluntary altruism towards others also plays a major role; according to a 2016 study by the Charities Aid Foundation, Americans donated 1.44% of total GDP to charity—the highest rate in the world by a large margin. Americans have traditionally been characterized by a unifying political belief in an "American Creed" emphasizing consent of the governed, liberty, equality under the law, democracy, social equality, property rights, and a preference for limited government. The U.S. has acquired significant hard and soft power through its diplomatic influence, economic power, military alliances, and cultural exports such as American movies, music, video games, sports, and food. The influence that the United States exerts on other countries through soft power is referred to as Americanization. Nearly all present Americans or their ancestors came from Europe, Africa, or Asia (the "Old World") within the past five centuries. Mainstream American culture is a Western culture largely derived from the traditions of European immigrants with influences from many other sources, such as traditions brought by slaves from Africa. More recent immigration from Asia and especially Latin America has added to a cultural mix that has been described as a homogenizing melting pot, and a heterogeneous salad bowl, with immigrants contributing to, and often assimilating into, mainstream American culture. Under the First Amendment to the Constitution, the United States is considered to have the strongest protections of free speech of any country. Flag desecration, hate speech, blasphemy, and lese majesty are all forms of protected expression.
A 2016 Pew Research Center poll found that Americans were the most supportive of free expression of any polity measured. Additionally, they are the "most supportive of freedom of the press and the right to use the Internet without government censorship". The U.S. is a socially progressive country with permissive attitudes surrounding human sexuality. LGBTQ rights in the United States are among the most advanced by global standards. The American Dream, or the perception that Americans enjoy high levels of social mobility, plays a key role in attracting immigrants. Whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a classless society, scholars identify significant differences between the country's social classes, affecting socialization, language, and values. Americans tend to greatly value socioeconomic achievement, but being ordinary or average is promoted by some as a noble condition as well. The National Foundation on the Arts and the Humanities is an agency of the United States federal government that was established in 1965 to "develop and promote a broadly conceived national policy of support for the humanities and the arts in the United States, and for institutions which preserve the cultural heritage of the United States." It is composed of four sub-agencies: the National Endowment for the Arts, the National Endowment for the Humanities, the Institute of Museum and Library Services, and the Federal Council on the Arts and the Humanities. Colonial American authors were influenced by John Locke and other Enlightenment philosophers. The American Revolutionary Period (1765–1783) is notable for the political writings of Benjamin Franklin, Alexander Hamilton, Thomas Paine, and Thomas Jefferson. Shortly before and after the Revolutionary War, the newspaper rose to prominence, filling a demand for anti-British national literature. An early novel is William Hill Brown's The Power of Sympathy, published in 1789.
In the early- to mid-19th century, writer and critic John Neal helped advance America toward a unique literature and culture by criticizing predecessors such as Washington Irving for imitating their British counterparts, and by influencing writers such as Edgar Allan Poe, who took American poetry and short fiction in new directions. Ralph Waldo Emerson and Margaret Fuller pioneered the influential Transcendentalism movement; Henry David Thoreau, author of Walden, was influenced by this movement. The conflict surrounding abolitionism inspired writers such as Harriet Beecher Stowe and authors of slave narratives such as Frederick Douglass. Nathaniel Hawthorne's The Scarlet Letter (1850) explored the dark side of American history, as did Herman Melville's Moby-Dick (1851). Major American poets of the 19th-century American Renaissance include Walt Whitman, Melville, and Emily Dickinson. Mark Twain was the first major American writer to be born in the West. Henry James achieved international recognition with novels like The Portrait of a Lady (1881). As literacy rates rose, periodicals published more stories centered around industrial workers, women, and the rural poor. Naturalism, regionalism, and realism were the major literary movements of the period. While modernism generally took on an international character, modernist authors working within the United States more often rooted their work in specific regions, peoples, and cultures. Following the Great Migration to northern cities, African-American and black West Indian authors of the Harlem Renaissance developed an independent tradition of literature that rebuked a history of inequality and celebrated black culture. An important cultural export during the Jazz Age, these writings were a key influence on Négritude, a philosophy emerging in the 1930s among francophone writers of the African diaspora.
In the 1950s, an ideal of homogeneity led many authors to attempt to write the Great American Novel, while the Beat Generation rejected this conformity, using styles that elevated the impact of the spoken word over mechanics to describe drug use, sexuality, and the failings of society. Contemporary literature is more pluralistic than in previous eras, with the closest thing to a unifying feature being a trend toward self-conscious experiments with language. Twelve American laureates have won the Nobel Prize in Literature. Media in the United States is broadly uncensored, with the First Amendment providing significant protections, as reiterated in New York Times Co. v. United States. The four major broadcasters in the U.S. are the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), American Broadcasting Company (ABC), and Fox Broadcasting Company (Fox). The four major broadcast television networks are all commercial entities. The U.S. cable television system offers hundreds of channels catering to a variety of niches. In 2021, about 83% of Americans over age 12 listened to broadcast radio, while about 40% listened to podcasts. In the prior year, there were 15,460 licensed full-power radio stations in the U.S. according to the Federal Communications Commission (FCC). Much of the public radio broadcasting is supplied by National Public Radio (NPR), incorporated in February 1970 under the Public Broadcasting Act of 1967. U.S. newspapers with a global reach and reputation include The Wall Street Journal, The New York Times, The Washington Post, and USA Today. About 800 publications are produced in Spanish. With few exceptions, newspapers are privately owned, either by large chains such as Gannett or McClatchy, which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in an increasingly rare situation, by individuals or families. 
Major cities often have alternative newspapers to complement the mainstream daily papers, such as The Village Voice in New York City and LA Weekly in Los Angeles. The five most-visited websites in the world are Google, YouTube, Facebook, Instagram, and ChatGPT—all of them American-owned. Other popular platforms include X (formerly Twitter) and Amazon. In 2025, the U.S. was the world's second-largest video game market by revenue (after China). In 2015, the U.S. video game industry consisted of 2,457 companies that supported around 220,000 jobs and generated $30.4 billion in revenue. There are 444 game publishers, developers, and hardware companies in California alone. According to the Game Developers Conference (GDC), the U.S. is the top location for video game development, with 58% of the world's game developers based there in 2025. The United States is well known for its theater. Mainstream theater in the United States derives from the old European theatrical tradition and has been heavily influenced by British theater. By the middle of the 19th century, America had created new distinct dramatic forms in the Tom Shows, the showboat theater, and the minstrel show. The central hub of the American theater scene is the Theater District in Manhattan, with its divisions of Broadway, off-Broadway, and off-off-Broadway. Many movie and television celebrities got their big break working in New York productions. Outside New York City, many cities have professional regional or resident theater companies that produce their own seasons. The biggest-budget theatrical productions are musicals. U.S. theater has an active community theater culture. The Tony Awards recognize excellence in live Broadway theater and are presented at an annual ceremony in Manhattan. The awards are given for Broadway productions and performances. One is also given for regional theater.
Several discretionary non-competitive awards are given as well, including a Special Tony Award, the Tony Honors for Excellence in Theatre, and the Isabelle Stevenson Award. Folk art in colonial America grew out of artisanal craftsmanship in communities that allowed commonly trained people to individually express themselves. It was distinct from Europe's tradition of high art, which was less accessible and generally less relevant to early American settlers. Cultural movements in art and craftsmanship in colonial America generally lagged behind those of Western Europe. For example, the prevailing medieval style of woodworking and primitive sculpture became integral to early American folk art, despite the emergence of Renaissance styles in England in the late 16th and early 17th centuries. The new English styles would have been early enough to make a considerable impact on American folk art, but American styles and forms had already been firmly adopted. Not only did styles change slowly in early America, but there was a tendency for rural artisans there to continue their traditional forms longer than their urban counterparts did—and far longer than those in Western Europe. The Hudson River School was a mid-19th-century movement in the visual arts tradition of European naturalism. The 1913 Armory Show in New York City, an exhibition of European modernist art, shocked the public and transformed the U.S. art scene. American Realism and American Regionalism sought to reflect and give America new ways of looking at itself. Georgia O'Keeffe, Marsden Hartley, and others experimented with new and individualistic styles, which would become known as American modernism. Major artistic movements such as the abstract expressionism of Jackson Pollock and Willem de Kooning and the pop art of Andy Warhol and Roy Lichtenstein developed largely in the United States. 
Major photographers include Alfred Stieglitz, Edward Steichen, Dorothea Lange, Edward Weston, James Van Der Zee, Ansel Adams, and Gordon Parks. The tide of modernism and then postmodernism has brought global fame to American architects, including Frank Lloyd Wright, Philip Johnson, and Frank Gehry. The Metropolitan Museum of Art in Manhattan is the largest art museum in the United States and the fourth-largest in the world. American folk music encompasses numerous music genres, variously known as traditional music, traditional folk music, contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and sometimes trace back to such origins as the British Isles, mainland Europe, or Africa. The rhythmic and lyrical styles of African-American music in particular have influenced American music. Banjos were brought to America through the slave trade. Minstrel shows incorporating the instrument into their acts led to its increased popularity and widespread production in the 19th century. The electric guitar, first invented in the 1930s, and mass-produced by the 1940s, had an enormous influence on popular music, in particular due to the development of rock and roll. The synthesizer, turntablism, and electronic music were also largely developed in the U.S. Elements from folk idioms such as the blues and old-time music were adopted and transformed into popular genres with global audiences. Jazz grew from blues and ragtime in the early 20th century, developing from the innovations and recordings of composers such as W.C. Handy and Jelly Roll Morton. Louis Armstrong and Duke Ellington increased its popularity early in the 20th century. Country music developed in the 1920s, bluegrass and rhythm and blues in the 1940s, and rock and roll in the 1950s. In the 1960s, Bob Dylan emerged from the folk revival to become one of the country's most celebrated songwriters. 
The musical forms of punk and hip hop both originated in the United States in the 1970s. The United States has the world's largest music market, with a total retail value of $15.9 billion in 2022. Most of the world's major record companies are based in the U.S.; they are represented by the Recording Industry Association of America (RIAA). Mid-20th-century American pop stars, such as Frank Sinatra and Elvis Presley, became global celebrities and best-selling music artists, as have artists of the late 20th century, such as Michael Jackson, Madonna, Whitney Houston, and Mariah Carey, and of the early 21st century, such as Eminem, Britney Spears, Lady Gaga, Katy Perry, Taylor Swift and Beyoncé. The United States has the world's largest apparel market by revenue. Apart from professional business attire, American fashion is eclectic and predominantly informal. Americans' diverse cultural roots are reflected in their clothing; however, sneakers, jeans, T-shirts, and baseball caps are emblematic of American styles. New York, with its Fashion Week, is considered to be one of the "Big Four" global fashion capitals, along with Paris, Milan, and London. A study demonstrated that general proximity to Manhattan's Garment District has been synonymous with American fashion since its inception in the early 20th century. A number of well-known designer labels, among them Tommy Hilfiger, Ralph Lauren, Tom Ford and Calvin Klein, are headquartered in Manhattan. Labels cater to niche markets, such as preteens. New York Fashion Week is one of the most influential fashion shows in the world, and is held twice each year in Manhattan; the annual Met Gala, also in Manhattan, has been called the fashion world's "biggest night". The U.S. film industry has a worldwide influence and following. Hollywood, a district in central Los Angeles, the nation's second-most populous city, is also metonymous for the American filmmaking industry. 
The major film studios of the United States are the primary source of the most commercially successful movies selling the most tickets in the world. Largely centered in the New York City region from its beginnings in the late 19th century through the first decades of the 20th century, the U.S. film industry has since been primarily based in and around Hollywood. Nonetheless, American film companies have been subject to the forces of globalization in the 21st century, and an increasing number of films are made elsewhere. The Academy Awards, popularly known as "the Oscars", have been held annually by the Academy of Motion Picture Arts and Sciences since 1929, and the Golden Globe Awards have been held annually since January 1944. The industry peaked in what is commonly referred to as the "Golden Age of Hollywood", from the early sound period until the early 1960s, with screen actors such as John Wayne and Marilyn Monroe becoming iconic figures. In the 1970s, "New Hollywood", or the "Hollywood Renaissance", was defined by grittier films influenced by French and Italian realist pictures of the post-war period. The 21st century has been marked by the rise of American streaming platforms, which came to rival traditional cinema. Early settlers were introduced by Native Americans to foods such as turkey, sweet potatoes, corn, squash, and maple syrup. Among the most enduring and pervasive examples are variations of the native dish called succotash. Early settlers and later immigrants combined these with foods they were familiar with, such as wheat flour, beef, and milk, to create a distinctive American cuisine. New World crops, especially pumpkin, corn, potatoes, and turkey as the main course, are part of a shared national menu on Thanksgiving, when many Americans prepare or purchase traditional dishes to celebrate the occasion. 
Characteristic American dishes such as apple pie, fried chicken, doughnuts, french fries, macaroni and cheese, ice cream, hamburgers, hot dogs, and American pizza derive from the recipes of various immigrant groups. Mexican dishes such as burritos and tacos preexisted the United States in areas later annexed from Mexico, and adaptations of Chinese cuisine as well as pasta dishes freely adapted from Italian sources are all widely consumed. American chefs have had a significant impact on society both domestically and internationally. In 1946, the Culinary Institute of America was founded by Katharine Angell and Frances Roth. This would become the United States' most prestigious culinary school, where many of the most talented American chefs would study prior to successful careers. The United States restaurant industry was projected at $899 billion in sales for 2020, and employed more than 15 million people, representing 10% of the nation's workforce directly. It is the country's second-largest private employer and the third-largest employer overall. The United States is home to over 220 Michelin star-rated restaurants, 70 of which are in New York City. Wine has been produced in what is now the United States since the 1500s, with the first widespread production beginning in what is now New Mexico in 1628. In the modern U.S., wine production is undertaken in all fifty states, with California producing 84 percent of all U.S. wine. With more than 1,100,000 acres (4,500 km2) under vine, the United States is the fourth-largest wine-producing country in the world, after Italy, Spain, and France. The classic American diner, a casual restaurant type originally intended for the working class, emerged during the 19th century from converted railroad dining cars made stationary. The diner soon evolved into purpose-built structures whose number expanded greatly in the 20th century. The American fast-food industry developed alongside the nation's car culture. 
American restaurants developed the drive-in format in the 1920s, which they began to replace with the drive-through format by the 1940s. American fast-food restaurant chains, such as McDonald's, Burger King, Chick-fil-A, Kentucky Fried Chicken, Dunkin' Donuts and many others, have numerous outlets around the world. The most popular spectator sports in the U.S. are American football, basketball, baseball, soccer, and ice hockey. Their premier leagues are, respectively, the National Football League, the National Basketball Association, Major League Baseball, Major League Soccer, and the National Hockey League. All these leagues enjoy wide-ranging domestic media coverage and, except for the MLS, all are considered the preeminent leagues in their respective sports in the world. While most major U.S. sports such as baseball and American football have evolved out of European practices, basketball, volleyball, skateboarding, and snowboarding are American inventions, many of which have become popular worldwide. Lacrosse and surfing arose from Native American and Native Hawaiian activities that predate European contact. The market for professional sports in the United States was approximately $69 billion in July 2013, roughly 50% larger than that of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States. Although American football does not have a substantial following in other nations, the NFL does have the highest average attendance (67,254) of any professional sports league in the world. In 2024, the NFL generated over $23 billion, making it the most valuable professional sports league in the United States and the world. Baseball has been regarded as the U.S. "national sport" since the late 19th century. The most-watched individual sports in the U.S. are golf and auto racing, particularly NASCAR and IndyCar. 
On the collegiate level, earnings for the member institutions exceed $1 billion annually, and college football and basketball attract large audiences, as the NCAA March Madness tournament and the College Football Playoff are some of the most watched national sporting events. In the U.S., the intercollegiate sports level serves as the main feeder system for professional and Olympic sports, with significant exceptions such as Minor League Baseball. This differs greatly from practices in nearly all other countries, where publicly and privately funded sports organizations serve this function. Eight Olympic Games have taken place in the United States. The 1904 Summer Olympics in St. Louis, Missouri, were the first-ever Olympic Games held outside of Europe. The Olympic Games will be held in the U.S. for a ninth time when Los Angeles hosts the 2028 Summer Olympics. U.S. athletes have won a total of 2,968 medals (1,179 gold) at the Olympic Games, the most of any country. In other international competition, the United States is the home of a number of prestigious events, including the America's Cup, World Baseball Classic, the U.S. Open, and the Masters Tournament. The U.S. men's national soccer team has qualified for eleven World Cups, while the women's national team has won the FIFA Women's World Cup and Olympic soccer tournament four and five times, respectively. The 1999 FIFA Women's World Cup was hosted by the United States. Its final match was attended by 90,185, setting the world record for largest women's sporting event crowd at the time. The United States hosted the 1994 FIFA World Cup and will co-host, along with Canada and Mexico, the 2026 FIFA World Cup.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Jews#cite_note-87] | [TOKENS: 15852]
Jews (Hebrew: יְהוּדִים‎, ISO 259-2: Yehudim, Israeli pronunciation: [jehuˈdim]), or the Jewish people, are an ethnoreligious group and nation, originating from the Israelites of ancient Israel and Judah. They traditionally adhere to Judaism. Jewish ethnicity, religion, and community are highly interrelated, as Judaism is an ethnic religion, though many ethnic Jews do not practice it. Religious Jews regard converts to Judaism as members of the Jewish nation, pursuant to the long-standing conversion process. The Israelites emerged from the pre-existing Canaanite peoples to establish Israel and Judah in the Southern Levant during the Iron Age. Originally, Jews referred to the inhabitants of the kingdom of Judah and were distinguished from the gentiles and the Samaritans. According to the Hebrew Bible, these inhabitants predominantly originate from the tribe of Judah, who were descendants of Judah, the fourth son of Jacob. The tribe of Benjamin were another significant demographic in Judah and were considered Jews too. By the late 6th century BCE, Judaism had evolved from the Israelite religion, dubbed Yahwism (for Yahweh) by modern scholars, having a theology that religious Jews believe to be the expression of the Mosaic covenant between God and the Jewish people. After the Babylonian exile, Jews referred to followers of Judaism, descendants of the Israelites, citizens of Judea, or allies of the Judean state. Jewish migration within the Mediterranean region during the Hellenistic period, followed by population transfers caused by events like the Jewish–Roman wars, gave rise to the Jewish diaspora, consisting of diverse Jewish communities that maintained their sense of Jewish history, identity, and culture. 
In the following millennia, Jewish diaspora communities coalesced into three major ethnic subdivisions according to where their ancestors settled: the Ashkenazim (Central and Eastern Europe), the Sephardim (Iberian Peninsula), and the Mizrahim (Middle East and North Africa). While these three major divisions account for most of the world's Jews, there are other smaller Jewish groups outside of the three. Prior to World War II, the global Jewish population reached a peak of 16.7 million, representing around 0.7% of the world's population at that time. During World War II, approximately six million Jews throughout Europe were systematically murdered by Nazi Germany in a genocide known as the Holocaust. Since then, the population has slowly risen again, and as of 2021 was estimated at 15.2 million by the demographer Sergio Della Pergola, or less than 0.2% of the total world population in 2012. Today, over 85% of Jews live in Israel or the United States. Israel, whose population is 73.9% Jewish, is the only country where Jews comprise more than 2.5% of the population. Jews have significantly influenced and contributed to the development and growth of human progress in many fields, both historically and in modern times, including in science and technology, philosophy, ethics, literature, governance, business, art, music, comedy, theatre, cinema, architecture, food, medicine, and religion. Jews founded Christianity and had an indirect but profound influence on Islam. In these ways and others, Jews have played a significant role in the development of Western culture. Name and etymology The term "Jew" is derived from the Hebrew word יְהוּדִי Yehudi, with the plural יְהוּדִים Yehudim. Endonyms in other Jewish languages include the Ladino ג׳ודיו Djudio (plural ג׳ודיוס, Djudios) and the Yiddish ייִד Yid (plural ייִדן Yidn). 
Though Genesis 29:35 and 49:8 connect "Judah" with the verb yada, meaning "praise", scholars generally agree that "Judah" most likely derives from the name of a Levantine geographic region dominated by gorges and ravines. The gradual ethnonymic shift from "Israelites" to "Jews", regardless of their descent from Judah, although not contained in the Torah, is made explicit in the Book of Esther (4th century BCE) of the Tanakh. Some modern scholars disagree with the conflation, based on the works of Josephus, Philo and Apostle Paul. The English word "Jew" is a derivation of Middle English Gyw, Iewe. The latter was loaned from the Old French giu, which itself evolved from the earlier juieu, which in turn derived from judieu/iudieu which through elision had dropped the letter "d" from the Medieval Latin Iudaeus, which, like the New Testament Greek term Ioudaios, meant both "Jew" and "Judean" / "of Judea". The Greek term was a loan from Aramaic *yahūdāy, corresponding to Hebrew יְהוּדִי Yehudi. Some scholars prefer translating Ioudaios as "Judean" in the Bible since it is more precise, denotes the community's origins and prevents readers from engaging in antisemitic eisegesis. Others disagree, believing that it erases the Jewish identity of Biblical characters such as Jesus. Daniel R. Schwartz distinguishes "Judean" and "Jew". Here, "Judean" refers to the inhabitants of Judea, which encompassed southern Palestine. Meanwhile, "Jew" refers to the descendants of Israelites that adhere to Judaism. Converts are included in the definition. But Shaye J.D. Cohen argues that "Judean" is inclusive of believers of the Judean God and allies of the Judean state. Another scholar, Jodi Magness, wrote the term Ioudaioi refers to a "people of Judahite/Judean ancestry who worshipped the God of Israel as their national deity and (at least nominally) lived according to his laws." 
The etymological equivalent is in use in other languages, e.g., يَهُودِيّ yahūdī (sg.), al-yahūd (pl.), in Arabic, "Jude" in German, "judeu" in Portuguese, "Juif" (m.)/"Juive" (f.) in French, "jøde" in Danish and Norwegian, "judío/a" in Spanish, "jood" in Dutch, "żyd" in Polish etc., but derivations of the word "Hebrew" are also in use to describe a Jew, e.g., in Italian (Ebreo), in Persian ("Ebri/Ebrani" (Persian: عبری/عبرانی)) and Russian (Еврей, Yevrey). The German word "Jude" is pronounced [ˈjuːdə]; the corresponding adjective "jüdisch" [ˈjyːdɪʃ] (Jewish) is the origin of the word "Yiddish". According to The American Heritage Dictionary of the English Language, fourth edition (2000): It is widely recognized that the attributive use of the noun Jew, in phrases such as Jew lawyer or Jew ethics, is both vulgar and highly offensive. In such contexts Jewish is the only acceptable possibility. Some people, however, have become so wary of this construction that they have extended the stigma to any use of Jew as a noun, a practice that carries risks of its own. In a sentence such as There are now several Jews on the council, which is unobjectionable, the substitution of a circumlocution like Jewish people or persons of Jewish background may in itself cause offense for seeming to imply that Jew has a negative connotation when used as a noun. 
Identity Judaism shares some of the characteristics of a nation, an ethnicity, a religion, and a culture, making the definition of who is a Jew vary slightly depending on whether a religious or national approach to identity is used. Generally, in modern secular usage, Jews include three groups: people who were born to a Jewish family regardless of whether or not they follow the religion, those who have some Jewish ancestral background or lineage (sometimes including those who do not have strictly matrilineal descent), and people without any Jewish ancestral background or lineage who have formally converted to Judaism and therefore are followers of the religion. In the context of biblical and classical literature, Jews could refer to inhabitants of the Kingdom of Judah, or the broader Judean region, allies of the Judean state, or anyone that followed Judaism. Historical definitions of Jewish identity have traditionally been based on halakhic definitions of matrilineal descent, and halakhic conversions. These definitions of who is a Jew date back to the codification of the Oral Torah into the Babylonian Talmud, around 200 CE. Interpretations by Jewish sages of sections of the Tanakh – such as Deuteronomy 7:1–5, which forbade intermarriage between their Israelite ancestors and seven non-Israelite nations: "for that [i.e. giving your daughters to their sons or taking their daughters for your sons,] would turn away your children from following me, to serve other gods" – are used as a warning against intermarriage between Jews and gentiles. Leviticus 24:10 says that the son in a marriage between a Hebrew woman and an Egyptian man is "of the community of Israel." This is complemented by Ezra 10:2–3, where Israelites returning from Babylon vow to put aside their gentile wives and their children. 
A popular theory is that the rape of Jewish women in captivity brought about the law of Jewish identity being inherited through the maternal line, although scholars challenge this theory citing the Talmudic establishment of the law from the pre-exile period. Another argument is that the rabbis changed the law of patrilineal descent to matrilineal descent due to the widespread rape of Jewish women by Roman soldiers. Since the anti-religious Haskalah movement of the late 18th and 19th centuries, halakhic interpretations of Jewish identity have been challenged. According to historian Shaye J. D. Cohen, the status of the offspring of mixed marriages was determined patrilineally in the Bible. He brings two likely explanations for the change in Mishnaic times: first, the Mishnah may have been applying the same logic to mixed marriages as it had applied to other mixtures (Kil'ayim). Thus, a mixed marriage is forbidden as is the union of a horse and a donkey, and in both unions the offspring are judged matrilineally. Second, the Tannaim may have been influenced by Roman law, which dictated that when a parent could not contract a legal marriage, offspring would follow the mother. Rabbi Rivon Krygier follows a similar reasoning, arguing that Jewish descent had formerly passed through the patrilineal descent and the law of matrilineal descent had its roots in the Roman legal system. Origins The prehistory and ethnogenesis of the Jews are closely intertwined with archaeology, biology, historical textual records, mythology, and religious literature. The ethnic origin of the Jews lie in the Israelites, a confederation of Iron Age Semitic-speaking tribes that inhabited a part of Canaan during the tribal and monarchic periods. Modern Jews are named after and also descended from the southern Israelite Kingdom of Judah. Gary A. Rendsburg links the early Canaanite nomadic pastoralists confederation to the Shasu known to the Egyptians around the 15th century BCE. 
According to the Hebrew Bible narrative, Jewish history begins with the Biblical patriarchs such as Abraham, his son Isaac, Isaac's son Jacob, and the Biblical matriarchs Sarah, Rebecca, Leah, and Rachel, who lived in Canaan. The twelve sons of Jacob subsequently gave birth to the Twelve Tribes. Jacob and his family migrated to Ancient Egypt after being invited to live with Jacob's son Joseph by the Pharaoh himself. Jacob's descendants were later enslaved until the Exodus, led by Moses. Afterwards, the Israelites conquered Canaan under Moses' successor Joshua, and went through the period of the Biblical judges after the death of Joshua. Through the mediation of Samuel, the Israelites were subject to a king, Saul, who was succeeded by David and then Solomon, after whom the United Monarchy ended and was split into a separate Kingdom of Israel and a Kingdom of Judah. The Kingdom of Judah is described as comprising the tribes of Judah, Benjamin and partially, Levi. They later assimilated remnants of other tribes who migrated there from the northern Kingdom of Israel. In the extra-biblical record, the Israelites become visible as a people between 1200 and 1000 BCE. There is well-accepted archaeological evidence referring to "Israel" in the Merneptah Stele, which dates to about 1200 BCE, and in the Mesha stele from 840 BCE. It is debated whether a period like that of the Biblical judges occurred and if there ever was a United Monarchy. There is further disagreement about the earliest existence of the Kingdoms of Israel and Judah and their extent and power. Historians agree that a Kingdom of Israel existed by c. 900 BCE; there is a consensus that a Kingdom of Judah existed by c. 700 BCE at least, and recent excavations in Khirbet Qeiyafa have provided strong evidence for dating the Kingdom of Judah to the 10th century BCE. 
In 587 BCE, Nebuchadnezzar II, King of the Neo-Babylonian Empire, besieged Jerusalem, destroyed the First Temple and deported parts of the Judahite population. Scholars disagree regarding the extent to which the Bible should be accepted as a historical source for early Israelite history. Rendsburg states that there are two approximately equal groups of scholars who debate the historicity of the biblical narrative, the minimalists who largely reject it, and the maximalists who largely accept it, with the minimalists being the more vocal of the two. Some of the leading minimalists reframe the biblical account as constituting the Israelites' inspiring national myth narrative, suggesting that according to the modern archaeological and historical account, the Israelites and their culture did not overtake the region by force, but instead branched out of the Canaanite peoples and culture through the development of a distinct monolatristic—and later monotheistic—religion of Yahwism centered on Yahweh, one of the gods of the Canaanite pantheon. The growth of Yahweh-centric belief, along with a number of cultic practices, gradually gave rise to a distinct Israelite ethnic group, setting them apart from other Canaanites. According to Dever, modern archaeologists have largely discarded the search for evidence of the biblical narrative surrounding the patriarchs and the exodus. According to the maximalist position, the modern archaeological record independently points to a narrative which largely agrees with the biblical account. This narrative provides a testimony of the Israelites as a nomadic people known to the Egyptians as belonging to the Shasu. Over time these nomads left the desert and settled on the central mountain range of the land of Canaan, in simple semi-nomadic settlements in which pig bones are notably absent. This population gradually shifted from a tribal lifestyle to a monarchy. 
The archaeological record of the ninth century BCE provides evidence for two monarchies: one in the south under a dynasty founded by a figure named David, with its capital in Jerusalem, and one in the north under a dynasty founded by a figure named Omri, with its capital in Samaria. It also points to an early monarchic period in which these regions shared material culture and religion, suggesting a common origin. Archaeological finds also provide evidence for the later cooperation of these two kingdoms in their coalition against Aram, and for their destructions by the Assyrians and later by the Babylonians. Genetic studies on Jews show that most Jews worldwide bear a common genetic heritage which originates in the Middle East, and that they share certain genetic traits with other Gentile peoples of the Fertile Crescent. The genetic composition of different Jewish groups shows that Jews share a common gene pool dating back four millennia, as a marker of their common ancestral origin. Despite their long-term separation, Jewish communities maintained their unique commonalities, propensities, and sensibilities in culture, tradition, and language. History The earliest recorded evidence of a people by the name of Israel appears in the Merneptah Stele, which dates to around 1200 BCE. The majority of scholars agree that this text refers to the Israelites, a group that inhabited the central highlands of Canaan, where archaeological evidence shows that hundreds of small settlements were constructed between the 12th and 10th centuries BCE. The Israelites differentiated themselves from neighboring peoples through various distinct characteristics including religious practices, prohibition on intermarriage, and an emphasis on genealogy and family history. In the 10th century BCE, two neighboring Israelite kingdoms—the northern Kingdom of Israel and the southern Kingdom of Judah—emerged. 
Since their inception, they shared ethnic, cultural, linguistic and religious characteristics despite a complicated relationship. Israel, with its capital mostly in Samaria, was larger and wealthier, and soon developed into a regional power. In contrast, Judah, with its capital in Jerusalem, was less prosperous and covered a smaller, mostly mountainous territory. However, while in Israel the royal succession was often decided by a military coup d'état, resulting in several dynasty changes, political stability in Judah was much greater, as it was ruled by the House of David for the whole four centuries of its existence. Scholars also describe Biblical Jews as a 'proto-nation', in the modern nationalist sense, comparable to classical Greeks, the Gauls and the British Celts. Around 720 BCE, the Kingdom of Israel was destroyed when it was conquered by the Neo-Assyrian Empire, which came to dominate the ancient Near East. Under the Assyrian resettlement policy, a significant portion of the northern Israelite population was exiled to Mesopotamia and replaced by immigrants from the same region. During the same period, and throughout the 7th century BCE, the Kingdom of Judah, now under Assyrian vassalage, experienced a period of prosperity and witnessed significant population growth. This prosperity continued until the Neo-Assyrian king Sennacherib devastated the region of Judah in response to a rebellion in the area, ultimately halting at Jerusalem. Later in the same century, the Assyrians were defeated by the rising Neo-Babylonian Empire, and Judah became its vassal. In 587 BCE, following a revolt in Judah, the Babylonian king Nebuchadnezzar II besieged and destroyed Jerusalem and the First Temple, putting an end to the kingdom. The majority of Jerusalem's residents, including the kingdom's elite, were exiled to Babylon. According to the Book of Ezra, the Persian Cyrus the Great ended the Babylonian exile in 538 BCE, the year after he captured Babylon. 
The exile ended with the return under Zerubbabel the Prince (so called because he was a descendant of the royal line of David) and Joshua the Priest (a descendant of the line of the former High Priests of the Temple) and their construction of the Second Temple circa 521–516 BCE. As part of the Persian Empire, the former Kingdom of Judah became the province of Judah (Yehud Medinata), with a smaller territory and a reduced population. Judea was under control of the Achaemenids until the fall of their empire in c. 333 BCE to Alexander the Great. After several centuries under foreign imperial rule, the Maccabean Revolt against the Seleucid Empire resulted in an independent Hasmonean kingdom, under which the Jews once again enjoyed political independence for a period spanning from 110 to 63 BCE. Under Hasmonean rule the boundaries of their kingdom were expanded to include not only the land of the historical kingdom of Judah, but also the Galilee and Transjordan. In the beginning of this process the Idumeans, who had infiltrated southern Judea after the destruction of the First Temple, were converted en masse. In 63 BCE, Judea was conquered by the Romans. From 37 BCE to 6 CE, the Romans allowed the Jews to maintain some degree of independence by installing the Herodian dynasty as vassal kings. However, Judea eventually came directly under Roman control and was incorporated into the Roman Empire as the province of Judaea. The Jewish–Roman wars, a series of failed uprisings against Roman rule during the first and second centuries CE, had profound and devastating consequences for the Jewish population of Judaea. The First Jewish–Roman War (66–73/74 CE) culminated in the destruction of Jerusalem and the Second Temple, after which the significantly diminished Jewish population was stripped of political autonomy. 
A few generations later, the Bar Kokhba revolt (132–136 CE) erupted in response to Roman plans to rebuild Jerusalem as a Roman colony, and, possibly, to restrictions on circumcision. Its violent suppression by the Romans led to the near-total depopulation of Judea, and the demographic and cultural center of Jewish life shifted to Galilee. Jews were subsequently banned from residing in Jerusalem and the surrounding area, and the province of Judaea was renamed Syria Palaestina. These developments effectively ended Jewish efforts to restore political sovereignty in the region for nearly two millennia. Similar upheavals impacted the Jewish communities in the empire's eastern provinces during the Diaspora Revolt (115–117 CE), leading to the near-total destruction of Jewish diaspora communities in Libya, Cyprus and Egypt, including the highly influential community in Alexandria. The destruction of the Second Temple in 70 CE brought profound changes to Judaism. With the Temple's central place in Jewish worship gone, religious practices shifted towards prayer, Torah study (including Oral Torah), and communal gatherings in synagogues. Judaism also lost much of its sectarian nature. Two of the three main sects that flourished during the late Second Temple period, namely the Sadducees and Essenes, eventually disappeared, while Pharisaic beliefs became the foundational, liturgical, and ritualistic basis of Rabbinic Judaism, which has been the prevailing form of Judaism since late antiquity. The Jewish diaspora existed well before the destruction of the Second Temple in 70 CE and had been ongoing for centuries, with the dispersal driven by both forced expulsions and voluntary migrations.
In Mesopotamia, a testimony to the beginnings of the Jewish community can be found in Jehoiachin's ration tablets, listing provisions allotted to the exiled Judean king and his family by Nebuchadnezzar II; further evidence is provided by the Al-Yahudu tablets, dated to the 6th–5th centuries BCE and related to the exiles from Judea arriving after the destruction of the First Temple, though there is ample evidence for the presence of Jews in Babylonia even from 626 BCE. In Egypt, the documents from Elephantine reveal the trials of a community founded by a Persian Jewish garrison at two fortresses on the frontier during the 5th–4th centuries BCE, and according to Josephus the Jewish community in Alexandria had existed since the founding of the city in the 4th century BCE by Alexander the Great. By 200 BCE, there were well-established Jewish communities both in Egypt and Mesopotamia ("Babylonia" in Jewish sources), and in the two centuries that followed, Jewish populations were also present in Asia Minor, Greece, Macedonia, Cyrene, and, beginning in the middle of the first century BCE, in the city of Rome. Later, in the first centuries CE, as a result of the Jewish–Roman wars, a large number of Jews were taken as captives, sold into slavery, or compelled to flee from the regions affected by the wars, contributing to the formation and expansion of Jewish communities across the Roman Empire as well as in Arabia and Mesopotamia. After the Bar Kokhba revolt, the Jewish population in Judaea—now significantly reduced—made efforts to recover from the revolt's devastating effects, but never fully regained its former strength. Between the second and fourth centuries CE, the region of Galilee emerged as the primary center of Jewish life in Syria Palaestina, experiencing both demographic growth and cultural development. It was during this period that two central rabbinic texts, the Mishnah and the Jerusalem Talmud, were composed.
The Romans recognized the patriarchs—rabbinic sages such as Judah ha-Nasi—as representatives of the Jewish people, granting them a certain degree of autonomy. However, as the Roman Empire gave way to the Christianized Byzantine Empire under Constantine, Jews began to face persecution by both the Church and imperial authorities, and many emigrated to communities in the diaspora. By the fourth century CE, Jews are believed to have lost their demographic majority in Syria Palaestina. The long-established Jewish community of Mesopotamia, which had been living under Parthian and later Sasanian rule, beyond the confines of the Roman Empire, became an important center of Jewish study as Judea's Jewish population declined. Estimates often place the Babylonian Jewish community of the 3rd to 7th centuries at around one million, making it the largest Jewish diaspora community of that period. Under the political leadership of the exilarch, who was regarded as a royal heir of the House of David, this community had an autonomous status and served as a place of refuge for the Jews of Syria Palaestina. A number of significant Talmudic academies, such as the Nehardea, Pumbedita, and Sura academies, were established in Mesopotamia, and many important Amoraim were active there. The Babylonian Talmud, a centerpiece of Jewish religious law, was compiled in Babylonia in the 3rd to 6th centuries. Jewish diaspora communities are generally described as having coalesced into three major ethnic subdivisions according to where their ancestors settled: the Ashkenazim (initially in the Rhineland and France), the Sephardim (initially in the Iberian Peninsula), and the Mizrahim (Middle East and North Africa). Romaniote Jews, Tunisian Jews, Yemenite Jews, Egyptian Jews, Ethiopian Jews, Bukharan Jews, Mountain Jews, and other groups also predated the arrival of the Sephardic diaspora.
During the same period, Jewish communities in the Middle East thrived under Islamic rule, especially in cities like Baghdad, Cairo, and Damascus. In Babylonia, from the 7th to 11th centuries the Pumbedita and Sura academies led Jewry in the Arab world and, to an extent, the Jewish world as a whole. The deans and students of these academies defined the Geonic period in Jewish history. Following this period were the Rishonim, who lived from the 11th to 15th centuries. Like their European counterparts, Jews in the Middle East and North Africa also faced periods of persecution and discriminatory policies, with the Almohad Caliphate in North Africa and Iberia issuing forced conversion decrees, causing Jews such as Maimonides to seek safety in other regions. Despite experiencing repeated waves of persecution, Ashkenazi Jews in Western Europe worked in a variety of fields, making an impact on their communities' economies and societies. In Francia, for example, figures like Isaac Judaeus and Armentarius occupied prominent social and economic positions. Francia also witnessed the development of a sophisticated tradition of biblical commentary, as exemplified by Rashi and the tosafists. In 1144, the first documented blood libel occurred in Norwich, England, marking an escalation in the pattern of discrimination and violence that Jews had already been subjected to throughout medieval Europe. During the 12th and 13th centuries, Jews faced frequent antisemitic legislation, including laws prescribing distinctive dress, alongside segregation, repeated blood libels, pogroms, and massacres such as the Rhineland massacres (1096). The Jews of the Holy Roman Empire were designated Servi camerae regis ("servants of the imperial chamber") by Frederick II, a status that afforded limited protection while simultaneously entangling them in the political struggles between the emperor and the German principalities and cities.
Persecution intensified during the Black Death in the mid-14th century, when Jews were accused of poisoning wells and many communities were destroyed. These pressures, combined with major expulsions such as that from England in 1290, gradually pushed Ashkenazi Jewish populations eastward into Poland, Lithuania, and Russia. One of the largest Jewish communities of the Middle Ages was in the Iberian Peninsula, which for a time contained the largest Jewish population in Europe. Iberian Jewry endured discrimination under the Visigoths but saw its fortunes improve under Umayyad rule and later the Taifa kingdoms. During this period, the Jews of Muslim Spain entered a "Golden Age" marked by achievements in Hebrew poetry and literature, religious scholarship, grammar, medicine and science, with leading figures including Hasdai ibn Shaprut, Judah Halevi, Moses ibn Ezra and Solomon ibn Gabirol. Jews also rose to high office, most notably Samuel ibn Naghrillah, a scholar and poet who served as grand vizier and military commander of Granada. The Golden Age ended with the rise of the radical Almoravid and Almohad dynasties, whose persecutions, together with the advancing Reconquista, drove many Jews (including Maimonides) from Iberia. In 1391, widespread pogroms swept across Spain, leaving thousands dead and forcing mass conversions. The Spanish Inquisition was later established to pursue, torture and execute conversos who continued to practice Judaism in secret, while public disputations were staged to discredit Judaism. In 1492, after the Reconquista, Isabella I of Castile and Ferdinand II of Aragon decreed the expulsion of all Jews who refused conversion, sending an estimated 200,000 into exile in Portugal, Italy, North Africa, and the Ottoman Empire. In 1497, Portugal's Jews, about 30,000, were formally ordered expelled but instead were forcibly converted to retain their economic role. In 1498, some 3,500 Jews were expelled from Navarre.
Many converts outwardly adopted Christianity while secretly preserving Jewish practices, becoming crypto-Jews (also known as marranos or anusim), who remained targets of the various Inquisitions for centuries. Following the expulsions from Spain and Portugal in the 1490s, Jewish exiles dispersed across the Mediterranean, Europe, and North Africa. Many settled in the Ottoman Empire—which, replacing the Iberian Peninsula, became home to the world's largest Jewish population—where new communities developed in Anatolia, the Balkans, and the Land of Israel. Cities such as Istanbul and Thessaloniki grew into major Jewish centers, while in 16th-century Safed a flourishing spiritual life took shape. There, Solomon Alkabetz, Moses Cordovero, and Isaac Luria developed influential new schools of Kabbalah, giving powerful impetus to Jewish mysticism, and Joseph Karo composed the Shulchan Aruch, which became a cornerstone of Jewish law. In the 17th century, Portuguese conversos who returned to Judaism and engaged in trade and banking helped establish Amsterdam as a prosperous Jewish center, while also forming communities in cities such as Antwerp and London. This period also witnessed waves of messianic fervor, most notably the rise of the Sabbatean movement in the 1660s, led by Sabbatai Zvi of İzmir, which reverberated throughout the Jewish world. In Eastern Europe, Poland–Lithuania became the principal center of Ashkenazi Jewry, eventually becoming home to the largest Jewish population in the world. Jewish life flourished there in the early modern era, supported by relative stability, economic opportunity, and strong communal institutions. The mid-17th century brought devastation with the Cossack uprisings in Ukraine, which reversed migration flows and sent refugees westward, yet Poland–Lithuania remained the demographic and cultural heartland of Ashkenazic Jewry.
Following the partitions of Poland, most of its Jews came under Russian rule and were confined to the "Pale of Settlement." The 18th century also witnessed new religious and intellectual currents. Hasidism, founded by Baal Shem Tov, emphasized mysticism and piety, while its opponents, the Misnagdim ("opponents") led by the Vilna Gaon, defended rabbinic scholarship and tradition. In Western Europe, during the 1760s and 1770s, the Haskalah (Jewish Enlightenment) emerged in German-speaking lands, where figures such as Moses Mendelssohn promoted secular learning, vernacular literacy, and integration into European society. Elsewhere, Jews began to be re-admitted to Western Europe, including England, where Menasseh ben Israel petitioned Oliver Cromwell for their return. In the Americas, Jews of Sephardic descent first arrived as conversos in Spanish and Portuguese colonies, where many faced trial by Inquisition tribunals for "judaizing." A more durable presence began in Dutch Brazil, where Jews openly practiced their religion and established the first synagogues in the New World, before the Portuguese reconquest forced their dispersal to Amsterdam, the Caribbean, and North America. Sephardic communities took root in Curaçao, Suriname, Jamaica, and Barbados, later joined by Ashkenazi migrants. In North America, Jews were present from the mid-17th century, with New Amsterdam hosting the first organized congregation in 1654. By the time of the American Revolution, small communities in New York, Newport, Philadelphia, Savannah, and Charleston played an active role in the struggle for independence. In the late 19th century, Jews in Western Europe gradually achieved legal emancipation, though social acceptance remained limited by persistent antisemitism and rising nationalism. In Eastern Europe, particularly within the Russian Empire's Pale of Settlement, Jews faced mounting legal restrictions and recurring pogroms. 
From this environment emerged Zionism, a national revival movement originating in Central and Eastern Europe that sought to re-establish a Jewish polity in the Land of Israel as a means of returning the Jewish people to their ancestral homeland and ending centuries of exile and persecution. This led to waves of Jewish migration to Ottoman-controlled Palestine. Theodor Herzl, who is considered the father of political Zionism, offered his vision of a future Jewish state in his 1896 book Der Judenstaat (The Jewish State); a year later, he presided over the First Zionist Congress. The antisemitism that afflicted Jewish communities in Europe also triggered a mass exodus of 2.8 million Jews to the United States between 1881 and 1924. Despite this, some Jews of Europe and the United States were able to make great achievements in various fields of science and culture. Among the most influential from this period are Albert Einstein in physics, Sigmund Freud in psychology, Franz Kafka in literature, and Irving Berlin in music. Many Nobel Prize winners at this time were Jewish, as is still the case. When Adolf Hitler and the Nazi Party came to power in Germany in 1933, the situation for Jews deteriorated rapidly as a direct result of Nazi policies. Many Jews fled from Europe to Mandatory Palestine, the United States, and the Soviet Union as a result of racial anti-Semitic laws, economic difficulties, and the fear of an impending war. World War II started in 1939, and by 1941 Nazi Germany had occupied almost all of Europe. Following the German invasion of the Soviet Union in 1941, the Final Solution—an extensive, organized effort with an unprecedented scope intended to annihilate the Jewish people—began, and resulted in the persecution and murder of Jews in Europe and North Africa. In Poland, three million Jews were murdered in gas chambers in all concentration camps combined, with one million at the Auschwitz camp complex alone.
The Holocaust is the name given to this genocide, in which six million Jews in total were systematically murdered. Before and during the Holocaust, enormous numbers of Jews immigrated to Mandatory Palestine. In 1944, the Jewish insurgency in Mandatory Palestine began with the aim of gaining full independence from the United Kingdom. On 14 May 1948, upon the termination of the mandate, David Ben-Gurion declared the creation of the State of Israel, a Jewish and democratic state. Immediately afterwards, all neighboring Arab states invaded, and were resisted by the newly formed Israel Defense Forces. In 1949, the war ended and Israel started building its state and absorbing waves of Aliyah, granting citizenship to Jews all over the world via the Law of Return passed in 1950. However, both the Israeli–Palestinian conflict and the wider Arab–Israeli conflict continue to this day. Culture The Jewish people and the religion of Judaism are strongly interrelated. Converts to Judaism have a status within the Jewish people equal to those born into it. However, converts who do not go on to practice Judaism are likely to be viewed with skepticism. Mainstream Judaism does not proselytize, and conversion is considered a difficult task. A significant portion of conversions are undertaken by children of mixed marriages, or would-be or current spouses of Jews. The Hebrew Bible, a religious interpretation of the traditions and early history of the Jews, established the first of the Abrahamic religions, which are now practiced by 54 percent of the world's population. Judaism guides its adherents in both practice and belief, and has been called not only a religion, but also a "way of life," which has made drawing a clear distinction between Judaism, Jewish culture, and Jewish identity rather difficult.
Throughout history, in eras and places as diverse as the ancient Hellenic world, in Europe before and after The Age of Enlightenment (see Haskalah), in Islamic Spain and Portugal, in North Africa and the Middle East, India, China, or the contemporary United States and Israel, cultural phenomena have developed that are in some sense characteristically Jewish without being at all specifically religious. Some factors in this come from within Judaism, others from the interaction of Jews or specific communities of Jews with their surroundings, and still others from the inner social and cultural dynamics of the community, as opposed to from the religion itself. This phenomenon has led to considerably different Jewish cultures unique to their own communities. Hebrew is the liturgical language of Judaism (termed lashon ha-kodesh, "the holy tongue"), the language in which most of the Hebrew scriptures (Tanakh) were composed, and the daily speech of the Jewish people for centuries. By the 5th century BCE, Aramaic, a closely related tongue, joined Hebrew as the spoken language in Judea. By the 3rd century BCE, some Jews of the diaspora were speaking Greek. Others, such as in the Jewish communities of Asoristan, known to Jews as Babylonia, were speaking Hebrew and Aramaic, the languages of the Babylonian Talmud. Dialects of these same languages were also used by the Jews of Syria Palaestina at that time. For centuries, Jews worldwide have spoken the local or dominant languages of the regions they migrated to, often developing distinctive dialectal forms or branches that became independent languages. Yiddish is the Judaeo-German language developed by Ashkenazi Jews who migrated to Central Europe. Ladino is the Judaeo-Spanish language developed by Sephardic Jews who migrated to the Iberian Peninsula.
Due to many factors, including the impact of the Holocaust on European Jewry, the Jewish exodus from Arab and Muslim countries, and widespread emigration from other Jewish communities around the world, ancient and distinct Jewish languages of several communities, including Judaeo-Georgian, Judaeo-Arabic, Judaeo-Berber, Krymchak, Judaeo-Malayalam and many others, have largely fallen out of use. For over sixteen centuries Hebrew was used almost exclusively as a liturgical language, and as the language in which most books on Judaism had been written, with a few speaking only Hebrew on the Sabbath. Hebrew was revived as a spoken language by Eliezer ben Yehuda, who arrived in Palestine in 1881. It had not been used as a mother tongue since Tannaitic times. Modern Hebrew is designated as the "State language" of Israel. Despite efforts to revive Hebrew as the national language of the Jewish people, knowledge of the language is not commonly possessed by Jews worldwide, and English has emerged as the lingua franca of the Jewish diaspora. Although many Jews once had sufficient knowledge of Hebrew to study the classic literature, and Jewish languages like Yiddish and Ladino were commonly used as recently as the early 20th century, most Jews lack such knowledge today and English has by and large superseded most Jewish vernaculars. The three most commonly spoken languages among Jews today are Hebrew, English, and Russian. Some Romance languages, particularly French and Spanish, are also widely used. Yiddish has been spoken by more Jews in history than any other language, but it is far less used today following the Holocaust and the adoption of Modern Hebrew by the Zionist movement and the State of Israel. In some places, the mother language of the Jewish community differs from that of the general population or the dominant group. For example, in Quebec, the Ashkenazic majority has adopted English, while the Sephardic minority uses French as its primary language.
Similarly, South African Jews adopted English rather than Afrikaans. Due to both Czarist and Soviet policies, Russian has superseded Yiddish as the language of Russian Jews, but these policies have also affected neighboring communities. Today, Russian is the first language for many Jewish communities in a number of Post-Soviet states, such as Ukraine and Uzbekistan, as well as for Ashkenazic Jews in Azerbaijan, Georgia, and Tajikistan. Although communities in North Africa today are small and dwindling, Jews there had shifted from a multilingual group to a monolingual one (or nearly so), speaking French in Algeria, Morocco, and the city of Tunis, while most North Africans continue to use Arabic or Berber as their mother tongue. There is no single governing body for the Jewish community, nor a single authority with responsibility for religious doctrine. Instead, a variety of secular and religious institutions at the local, national, and international levels lead various parts of the Jewish community on a variety of issues. Today, many countries have a Chief Rabbi who serves as a representative of that country's Jewry. Although many Hasidic Jews follow a certain hereditary Hasidic dynasty, there is no one commonly accepted leader of all Hasidic Jews. Many Jews believe that the Messiah will act as a unifying leader for Jews and the entire world. A number of modern scholars of nationalism support the existence of Jewish national identity in antiquity. One of them is David Goodblatt, who generally believes in the existence of nationalism before the modern period. In his view, the Bible, the parabiblical literature and the Jewish national history provide the base for a Jewish collective identity. Although many of the ancient Jews were illiterate (as were their neighbors), their national narrative was reinforced through public readings. The Hebrew language also constructed and preserved national identity.
Although it was not widely spoken after the 5th century BCE, Goodblatt states: the mere presence of the language in spoken or written form could invoke the concept of a Jewish national identity. Even if one knew no Hebrew or was illiterate, one could recognize that a group of signs was in Hebrew script. ... It was the language of the Israelite ancestors, the national literature, and the national religion. As such it was inseparable from the national identity. Indeed its mere presence in visual or aural medium could invoke that identity. Anthony D. Smith, an historical sociologist considered one of the founders of the field of nationalism studies, wrote that the Jews of the late Second Temple period provide "a closer approximation to the ideal type of the nation [...] than perhaps anywhere else in the ancient world." He adds that this observation "must make us wary of pronouncing too readily against the possibility of the nation, and even a form of religious nationalism, before the onset of modernity." Agreeing with Smith, Goodblatt suggests omitting the qualifier "religious" from Smith's definition of ancient Jewish nationalism, noting that, according to Smith, a religious component in national memories and culture is common even in the modern era. This view is echoed by political scientist Tom Garvin, who writes that "something strangely like modern nationalism is documented for many peoples in medieval times and in classical times as well," citing the ancient Jews as one of several "obvious examples", alongside the classical Greeks and the Gaulish and British Celts. Fergus Millar suggests that the sources of Jewish national identity and their early nationalist movements in the first and second centuries CE included several key elements: the Bible as both a national history and legal source, the Hebrew language as a national language, a system of law, and social institutions such as schools, synagogues, and Sabbath worship. 
Adrian Hastings argued that the Jews are the "true proto-nation", which, through the model of ancient Israel found in the Hebrew Bible, provided the world with the original concept of nationhood that later influenced Christian nations. However, following Jerusalem's destruction in the first century CE, Jews ceased to be a political entity and did not resemble a traditional nation-state for almost two millennia. Despite this, they maintained their national identity through collective memory, religion and sacred texts, even without land or political power, and remained a nation rather than just an ethnic group, eventually leading to the rise of Zionism and the establishment of Israel. Steven Weitzman suggests that Jewish nationalist sentiment in antiquity was encouraged because, under foreign rule (Persians, Greeks, Romans), Jews were able to claim that they were an ancient nation. This claim was based on the preservation and reverence of their scriptures, the Hebrew language, the Temple and priesthood, and other traditions of their ancestors. Doron Mendels further observes that the Hasmonean kingdom, one of the few examples of indigenous statehood in its time, significantly reinforced Jewish national consciousness. The memory of this period of independence contributed to the persistent efforts to revive Jewish sovereignty in Judea, leading to the major revolts against Roman rule in the 1st and 2nd centuries CE. Demographics Within the world's Jewish population there are distinct ethnic divisions, most of which are primarily the result of geographic branching from an originating Israelite population and subsequent independent evolution. An array of Jewish communities was established by Jewish settlers in various places around the Old World, often at great distances from one another, resulting in effective and often long-term isolation.
During the millennia of the Jewish diaspora the communities would develop under the influence of their local environments: political, cultural, natural, and populational. Today, manifestations of these differences among the Jews can be observed in the Jewish cultural expressions of each community, including Jewish linguistic diversity, culinary preferences, liturgical practices, religious interpretations, as well as degrees and sources of genetic admixture. Jews are often identified as belonging to one of two major groups: the Ashkenazim and the Sephardim. Ashkenazim are so named in reference to their geographical origins (their ancestors' culture coalesced in the Rhineland, an area historically referred to by Jews as Ashkenaz). Similarly, Sephardim (Sefarad meaning "Spain" in Hebrew) are named in reference to their origins in Iberia. The diverse groups of Jews of the Middle East and North Africa are often collectively referred to as Sephardim together with Sephardim proper for liturgical reasons having to do with their prayer rites. A common term for many of these non-Spanish Jews who are sometimes still broadly grouped as Sephardim is Mizrahim (lit. 'easterners' in Hebrew). Nevertheless, Mizrahim and Sephardim are usually ethnically distinct. Smaller groups include, but are not restricted to, Indian Jews such as the Bene Israel, Bnei Menashe, Cochin Jews, and Bene Ephraim; the Romaniotes of Greece; the Italian Jews ("Italkim" or "Bené Roma"); the Teimanim from Yemen; various African Jews, including most numerously the Beta Israel of Ethiopia; and Chinese Jews, most notably the Kaifeng Jews, as well as various other distinct but now almost extinct communities. The divisions between all these groups are approximate and their boundaries are not always clear.
The Mizrahim, for example, are a heterogeneous collection of North African, Central Asian, Caucasian, and Middle Eastern Jewish communities that are no more closely related to each other than they are to any of the previously mentioned Jewish groups. In modern usage, however, the Mizrahim are sometimes termed Sephardi due to similar styles of liturgy, despite independent development from Sephardim proper. Thus, among Mizrahim there are Egyptian Jews, Iraqi Jews, Lebanese Jews, Kurdish Jews, Moroccan Jews, Libyan Jews, Syrian Jews, Bukharian Jews, Mountain Jews, Georgian Jews, Iranian Jews, Afghan Jews, and various others. The Teimanim from Yemen are sometimes included, although their style of liturgy is unique and the admixture found among them differs from that found among Mizrahim. In addition, there is a differentiation made between Sephardi migrants who established themselves in the Middle East and North Africa after the expulsion of the Jews from Spain and Portugal in the 1490s and the pre-existing Jewish communities in those regions. Ashkenazi Jews represent the bulk of modern Jewry, with at least 70 percent of Jews worldwide (and up to 90 percent prior to World War II and the Holocaust). As a result of their emigration from Europe, Ashkenazim also represent the overwhelming majority of Jews in the New World continents, in countries such as the United States, Canada, Argentina, Australia, and Brazil. In France, the immigration of Jews from Algeria (Sephardim) has led them to outnumber the Ashkenazim. Only in Israel is the Jewish population representative of all groups, a melting pot independent of each group's proportion within the overall world Jewish population. Y DNA studies tend to imply a small number of founders in an old population whose members parted and followed different migration paths. In most Jewish populations, these male line ancestors appear to have been mainly Middle Eastern.
For example, Ashkenazi Jews share more common paternal lineages with other Jewish and Middle Eastern groups than with the non-Jewish populations of the areas where Jews lived in Eastern Europe, Germany, and the French Rhine Valley. This is consistent with Jewish traditions, which place most Jewish paternal origins in the Middle East. Conversely, the maternal lineages of Jewish populations, studied by looking at mitochondrial DNA, are generally more heterogeneous. Scholars such as Harry Ostrer and Raphael Falk believe this indicates that many Jewish males found new mates from European and other communities in the places where they migrated in the diaspora after fleeing ancient Israel. In contrast, Behar has found evidence that about 40 percent of Ashkenazi Jews originate maternally from just four female founders, who were of Middle Eastern origin. The populations of Sephardi and Mizrahi Jewish communities "showed no evidence for a narrow founder effect." Subsequent studies carried out by Feder et al. confirmed the large portion of non-local maternal origin among Ashkenazi Jews. Reflecting on their findings related to the maternal origin of Ashkenazi Jews, the authors conclude: "Clearly, the differences between Jews and non-Jews are far larger than those observed among the Jewish communities. Hence, differences between the Jewish communities can be overlooked when non-Jews are included in the comparisons." However, a 2025 genetic study on the Ashkenazi Jewish founder population supports the presence of a substantial Near Eastern component in the maternal lineages. Analyses of mitochondrial DNA (mtDNA) indicate that the core founder lineages, estimated at around 54, likely originated from the Near East, with these founder signatures appearing in multiple copies across the population. While later admixture introduced additional mtDNA lineages, these absorbed lineages are distinguishable from the original founders.
The findings are consistent with genome-wide identity-by-descent and lineage-extinction analyses, reinforcing the Near Eastern origin of the Ashkenazi maternal founders. A study showed that 7% of Ashkenazi Jews have the haplogroup G2c, which is found mainly among Pashtuns and, at lower frequencies, among all major Jewish groups, Palestinians, Syrians, and Lebanese. Studies of autosomal DNA, which look at the entire DNA mixture, have become increasingly important as the technology develops. They show that Jewish populations have tended to form relatively closely related groups in independent communities, with most members of a community sharing significant ancestry in common. For Jewish populations of the diaspora, the genetic composition of Ashkenazi, Sephardic, and Mizrahi Jewish populations shows a predominant amount of shared Middle Eastern ancestry. According to Behar, the most parsimonious explanation for this shared Middle Eastern ancestry is that it is "consistent with the historical formulation of the Jewish people as descending from ancient Hebrew and Israelite residents of the Levant" and "the dispersion of the people of ancient Israel throughout the Old World". Jews of North African, Italian, and Iberian origin show variable frequencies of admixture with their historical non-Jewish host populations along the maternal lines. In the case of Ashkenazi and Sephardi Jews (in particular Moroccan Jews), who are closely related, the source of non-Jewish admixture is mainly Southern European, while Mizrahi Jews show evidence of admixture with other Middle Eastern populations. Behar et al. have remarked on a close relationship between Ashkenazi Jews and modern Italians. A 2001 study found that Jews were more closely related to groups of the Fertile Crescent (Kurds, Turks, and Armenians) than to their Arab neighbors, whose genetic signature was found in geographic patterns reflective of the Islamic conquests.
The studies also show that the Sephardic Bnei Anusim (descendants of the "anusim" who were forced to convert to Catholicism), who comprise up to 19.8 percent of the population of today's Iberia (Spain and Portugal) and at least 10 percent of the population of Ibero-America (Hispanic America and Brazil), have Sephardic Jewish ancestry within the last few centuries. The Bene Israel and Cochin Jews of India, the Beta Israel of Ethiopia, and a portion of the Lemba people of Southern Africa, despite more closely resembling the local populations of their native countries, have also been thought to have some more remote ancient Jewish ancestry. Views on the Lemba have changed, and genetic Y-DNA analyses in the 2000s have established a partially Middle Eastern origin for a portion of the male Lemba population but have been unable to narrow this down further. Although historically Jews have been found all over the world, in the decades since World War II and the establishment of Israel, they have increasingly concentrated in a small number of countries. In 2021, Israel and the United States together accounted for about 85 percent of the global Jewish population, with approximately 45.3% and 39.6% of the world's Jews, respectively. More than half (51.2%) of world Jewry resides in just ten metropolitan areas. As of 2021, these ten areas were Tel Aviv, New York, Jerusalem, Haifa, Los Angeles, Miami, Philadelphia, Paris, Washington, and Chicago. The Tel Aviv metro area has the highest percentage of Jews among the total population (94.8%), followed by Haifa (73.1%), Jerusalem (72.3%), and Beersheba (60.4%), the balance mostly being Israeli Arabs. Outside Israel, the highest percentage of Jews in a metropolitan area was in New York (10.8%), followed by Miami (8.7%), Philadelphia (6.8%), San Francisco (5.1%), Washington (4.7%), Los Angeles (4.7%), Toronto (4.5%), and Baltimore (4.1%).
As of 2010, there were nearly 14 million Jews around the world, roughly 0.2% of the world's population at the time. According to the 2007 estimates of The Jewish People Policy Planning Institute, the world's Jewish population was 13.2 million. This statistic incorporates both practicing Jews affiliated with synagogues and the Jewish community, and approximately 4.5 million unaffiliated and secular Jews. According to Sergio Della Pergola, a demographer of the Jewish population, in 2021 there were about 6.8 million Jews in Israel, 6 million in the United States, and 2.3 million in the rest of the world. Israel, the Jewish nation-state, is the only country in which Jews make up a majority of the citizens. Israel was established as an independent democratic and Jewish state on 14 May 1948. As of 2016, 14 of the 120 members of its parliament, the Knesset, were Arab citizens of Israel (not including the Druze), most representing Arab political parties. One of Israel's Supreme Court judges is also an Arab citizen of Israel. Between 1948 and 1958, the Jewish population rose from 800,000 to two million. Currently, Jews account for 75.4 percent of the Israeli population, or 6 million people. The early years of the State of Israel were marked by the mass immigration of Holocaust survivors and of Jews fleeing Arab lands. Israel also has a large population of Ethiopian Jews, many of whom were airlifted to Israel in the late 1980s and early 1990s. Between 1974 and 1979, some 227,258 immigrants arrived in Israel, about half of them from the Soviet Union. This period also saw an increase in immigration to Israel from Western Europe, Latin America, and North America.
A trickle of immigrants from other communities has also arrived, including Indian Jews and others, as well as some descendants of Ashkenazi Holocaust survivors who had settled in countries such as the United States, Argentina, Australia, Chile, and South Africa. Some Jews have emigrated from Israel elsewhere, because of economic problems or disillusionment with political conditions and the continuing Arab–Israeli conflict. Jewish Israeli emigrants are known as yordim. The waves of immigration to the United States and elsewhere at the turn of the 19th century, the founding of Zionism and later events, including pogroms in Imperial Russia (mostly within the Pale of Settlement in present-day Ukraine, Moldova, Belarus and eastern Poland), the massacre of European Jewry during the Holocaust, and the founding of the state of Israel, with the subsequent Jewish exodus from Arab lands, all resulted in substantial shifts in the population centers of world Jewry by the end of the 20th century. More than half of the Jews live in the Diaspora (see Population table). Currently, the largest Jewish community outside Israel, and either the largest or second-largest Jewish community in the world, is located in the United States, with 6 million to 7.5 million Jews by various estimates. Elsewhere in the Americas, there are also large Jewish populations in Canada (315,000), Argentina (180,000–300,000), and Brazil (196,000–600,000), and smaller populations in Mexico, Uruguay, Venezuela, Chile, Colombia and several other countries (see History of the Jews in Latin America). According to a 2010 Pew Research Center study, about 470,000 people of Jewish heritage live in Latin America and the Caribbean. 
Demographers disagree on whether the United States has a larger Jewish population than Israel, with many maintaining that Israel surpassed the United States in Jewish population during the 2000s, while others maintain that the United States still has the largest Jewish population in the world. Currently, a major national Jewish population survey is planned to ascertain whether or not Israel has overtaken the United States in Jewish population. Western Europe's largest Jewish community, and the third-largest Jewish community in the world, can be found in France, home to between 483,000 and 500,000 Jews, the majority of whom are immigrants or refugees from North African countries such as Algeria, Morocco, and Tunisia (or their descendants). The United Kingdom has a Jewish community of 292,000. In Eastern Europe, the exact figures are difficult to establish. The number of Jews in Russia varies widely according to whether a source uses census data (which requires a person to choose a single nationality among choices that include "Russian" and "Jewish") or eligibility for immigration to Israel (which requires that a person have one or more Jewish grandparents). According to the latter criteria, the heads of the Russian Jewish community assert that up to 1.5 million Russians are eligible for aliyah. In Germany, the 102,000 Jews registered with the Jewish community are a slowly declining population, despite the immigration of tens of thousands of Jews from the former Soviet Union since the fall of the Berlin Wall. Thousands of Israelis also live in Germany, either permanently or temporarily, for economic reasons. Prior to 1948, approximately 800,000 Jews were living in lands which now make up the Arab world (excluding Israel). Of these, just under two-thirds lived in the French-controlled Maghreb region, 15 to 20 percent in the Kingdom of Iraq, approximately 10 percent in the Kingdom of Egypt and approximately 7 percent in the Kingdom of Yemen. 
A further 200,000 lived in Pahlavi Iran and the Republic of Turkey. Today, around 26,000 Jews live in Muslim-majority countries, mainly in Turkey (14,200) and Iran (9,100), while Morocco (2,000), Tunisia (1,000), and the United Arab Emirates (500) host the largest communities in the Arab world. A small-scale exodus had begun in many countries in the early decades of the 20th century, although the only substantial aliyah came from Yemen and Syria. The exodus from Arab and Muslim countries took place primarily from 1948. The first large-scale exoduses took place in the late 1940s and early 1950s, primarily from Iraq, Yemen, and Libya, with up to 90 percent of these communities leaving within a few years. The peak of the exodus from Egypt occurred in 1956. The exodus in the Maghreb countries peaked in the 1960s. Lebanon was the only Arab country to see a temporary increase in its Jewish population during this period, due to an influx of refugees from other Arab countries, although by the mid-1970s the Jewish community of Lebanon had also dwindled. In the aftermath of the exodus wave from Arab states, an additional migration of Iranian Jews peaked in the 1980s, when around 80 percent of Iranian Jews left the country. Outside Europe, the Americas, the Middle East, and the rest of Asia, there are significant Jewish populations in Australia (112,500) and South Africa (70,000). There is also a 6,800-strong community in New Zealand. Since at least the time of the Ancient Greeks, a proportion of Jews have assimilated into the wider non-Jewish society around them, by either choice or force, ceasing to practice Judaism and losing their Jewish identity. Assimilation took place in all areas and during all time periods, with some Jewish communities, for example the Kaifeng Jews of China, disappearing entirely.
The advent of the Jewish Enlightenment of the 18th century (see Haskalah) and the subsequent emancipation of the Jewish populations of Europe and America in the 19th century accelerated this process, encouraging Jews to increasingly participate in, and become part of, secular society. The result has been a growing trend of assimilation, as Jews marry non-Jewish spouses and stop participating in the Jewish community. Rates of interreligious marriage vary widely: in the United States, just under 50 percent; in the United Kingdom, around 53 percent; in France, around 30 percent; and in Australia and Mexico, as low as 10 percent. In the United States, only about a third of children from intermarriages affiliate with Jewish religious practice. The result is that most countries in the Diaspora have steady or slightly declining religiously Jewish populations as Jews continue to assimilate into the countries in which they live. The Jewish people and Judaism have experienced various persecutions throughout their history. During Late Antiquity and the Early Middle Ages, the Roman Empire (in its later phases known as the Byzantine Empire) repeatedly repressed the Jewish population, first by ejecting Jews from their homelands during the pagan Roman era and later by officially establishing them as second-class citizens during the Christian Roman era. According to James Carroll, "Jews accounted for 10% of the total population of the Roman Empire. By that ratio, if other factors had not intervened, there would be 200 million Jews in the world today, instead of something like 13 million." Later, in medieval Western Europe, further persecutions of Jews by Christians occurred, notably during the Crusades—when Jews all over Germany were massacred—and in a series of expulsions from the Kingdom of England, Germany, and France.
Then there occurred the largest expulsion of all, when Spain and Portugal, after the Reconquista (the Catholic reconquest of the Iberian Peninsula), expelled both unbaptized Sephardic Jews and the ruling Muslim Moors. In the Papal States, which existed until 1870, Jews were required to live only in specified neighborhoods called ghettos. Islam and Judaism have a complex relationship. Traditionally, Jews and Christians living in Muslim lands, known as dhimmis, were allowed to practice their religions and administer their internal affairs, but they were subject to certain conditions. They had to pay the jizya (a per capita tax imposed on free adult non-Muslim males) to the Islamic state. Dhimmis had an inferior status under Islamic rule. They had several social and legal disabilities, such as prohibitions against bearing arms or giving testimony in courts in cases involving Muslims. Many of the disabilities were highly symbolic. The one described by Bernard Lewis as "most degrading" was the requirement of distinctive clothing, not found in the Quran or hadith but invented in early medieval Baghdad; its enforcement was highly erratic. On the other hand, Jews rarely faced martyrdom, exile, or compulsion to change their religion, and they were mostly free in their choice of residence and profession. Notable exceptions include the massacre of Jews and the forcible conversion of some Jews by the rulers of the Almohad dynasty in Al-Andalus in the 12th century, as well as in Islamic Persia, and the forced confinement of Moroccan Jews to walled quarters known as mellahs beginning in the 15th century and especially in the early 19th century.
In modern times, it has become commonplace for standard antisemitic themes to be conflated with anti-Zionism in the publications and pronouncements of Islamic movements such as Hezbollah and Hamas, in the pronouncements of various agencies of the Islamic Republic of Iran, and even in the newspapers and other publications of the Turkish Refah Partisi. Throughout history, many rulers, empires, and nations have oppressed their Jewish populations or sought to eliminate them entirely. Methods employed ranged from expulsion to outright genocide; within nations, often the threat of these extreme methods was sufficient to silence dissent. The history of antisemitism includes the First Crusade, which resulted in the massacre of Jews; the Spanish Inquisition (led by Tomás de Torquemada) and the Portuguese Inquisition, with their persecution and autos-da-fé against the New Christians and Marrano Jews; the Bohdan Chmielnicki Cossack massacres in Ukraine; the pogroms backed by the Russian tsars; as well as expulsions from Spain, Portugal, England, France, Germany, and other countries in which the Jews had settled. According to a 2008 study published in the American Journal of Human Genetics, 19.8 percent of the modern Iberian population has Sephardic Jewish ancestry, indicating that the number of conversos may have been much higher than originally thought. The persecution reached a peak in Nazi Germany's Final Solution, which led to the Holocaust and the slaughter of approximately 6 million Jews. Of the world's 16 million Jews in 1939, almost 40% were murdered in the Holocaust. The Holocaust—the state-led systematic persecution and genocide of European Jews (and certain communities of North African Jews in European-controlled North Africa) and other minority groups of Europe during World War II by Germany and its collaborators—remains the most notable modern-day persecution of Jews. The persecution and genocide were accomplished in stages.
Legislation to remove the Jews from civil society was enacted years before the outbreak of World War II. Concentration camps were established in which inmates were used as slave labour until they died of exhaustion or disease. Where the Third Reich conquered new territory in Eastern Europe, specialized units called Einsatzgruppen murdered Jews and political opponents in mass shootings. Jews and Roma were crammed into ghettos before being transported hundreds of kilometres by freight train to extermination camps where, if they survived the journey, the majority of them were murdered in gas chambers. Virtually every arm of Germany's bureaucracy was involved in the logistics of the mass murder, turning the country into what one Holocaust scholar has called "a genocidal nation." Throughout Jewish history, Jews have repeatedly been directly or indirectly expelled from both their original homeland, the Land of Israel, and many of the areas in which they have settled. This experience as refugees has shaped Jewish identity and religious practice in many ways, and is thus a major element of Jewish history. In summary, the pogroms in Eastern Europe, the rise of modern antisemitism, the Holocaust, as well as the rise of Arab nationalism, all served to fuel the movements and migrations of huge segments of Jewry from land to land and continent to continent until they arrived back in large numbers at their original historical homeland in Israel. In the Bible, the patriarch Abraham is described as a migrant to the land of Canaan from Ur of the Chaldees. His descendants, the Children of Israel, undertook the Exodus (meaning "departure" or "exit" in Greek) from ancient Egypt, as described in the Book of Exodus. 
The first movement documented in the historical record occurred with the resettlement policy of the Neo-Assyrian Empire, which mandated the deportation of conquered peoples; it is estimated that some 4,500,000 people among its captive populations suffered this dislocation over three centuries of Assyrian rule. With regard to Israel, Tiglath-Pileser III claims he deported 80% of the population of Lower Galilee, some 13,520 people. Some 27,000 Israelites, 20 to 25% of the population of the Kingdom of Israel, were described as being deported by Sargon II and sent into permanent exile by Assyria, initially to the Upper Mesopotamian provinces of the Assyrian Empire, being replaced there by other deported populations. Between 10,000 and 80,000 people from the Kingdom of Judah were similarly exiled by Babylonia, but these people were later returned to Judea by Cyrus the Great of the Persian Achaemenid Empire. Many Jews were exiled again by the Roman Empire. The 2,000-year dispersion of the Jewish diaspora began under the Roman Empire, as Jews spread throughout the Roman world and, driven from land to land, settled wherever they could live freely enough to practice their religion. Over the course of the diaspora, the center of Jewish life moved from Babylonia to the Iberian Peninsula to Poland to the United States and, as a result of Zionism, back to Israel. There were also many expulsions of Jews during the Middle Ages and the Enlightenment in Europe: in 1290, 16,000 Jews were expelled from England (see the Statute of Jewry); in 1396, 100,000 from France; and in 1421, thousands from Austria. Many of these Jews settled in East-Central Europe, especially Poland. In 1492, following the Spanish Inquisition, the Spanish population of around 200,000 Sephardic Jews was expelled by the Spanish crown and the Catholic Church, followed by expulsions in 1493 from Sicily (37,000 Jews) and in 1496 from Portugal.
The expelled Jews fled mainly to the Ottoman Empire, the Netherlands, and North Africa, others migrating to Southern Europe and the Middle East. During the 19th century, France's policies of equal citizenship regardless of religion led to the immigration of Jews (especially from Eastern and Central Europe). This contributed to the arrival of millions of Jews in the New World. Over two million Eastern European Jews arrived in the United States from 1880 to 1925. In the latest phase of migrations, the Islamic Revolution of Iran caused many Iranian Jews to flee Iran. Most found refuge in the US (particularly Los Angeles, California, and Long Island, New York) and Israel. Smaller communities of Persian Jews exist in Canada and Western Europe. Similarly, when the Soviet Union collapsed, many of the Jews in the affected territory (who had been refuseniks) were suddenly allowed to leave. This produced a wave of migration to Israel in the early 1990s. Israel is the only country with a Jewish population that is consistently growing through natural population growth, although the Jewish populations of other countries, in Europe and North America, have recently increased through immigration. In the Diaspora, in almost every country the Jewish population in general is either declining or steady, but Orthodox and Haredi Jewish communities, whose members often shun birth control for religious reasons, have experienced rapid population growth. Orthodox and Conservative Judaism discourage proselytism to non-Jews, but many Jewish groups have tried to reach out to the assimilated Jewish communities of the Diaspora in order for them to reconnect to their Jewish roots. Additionally, while in principle Reform Judaism favours seeking new members for the faith, this position has not translated into active proselytism, instead taking the form of an effort to reach out to non-Jewish spouses of intermarried couples. 
There is also a trend of Orthodox movements reaching out to secular Jews in order to give them a stronger Jewish identity so there is less chance of intermarriage. As a result of the efforts by these and other Jewish groups over the past 25 years, there has been a trend (known as the Baal teshuva movement) for secular Jews to become more religiously observant, though the demographic implications of the trend are unknown. Additionally, there is a growing number of gentiles who choose to convert and become Jews by choice. Contributions Jewish individuals have played a significant role in the development and growth of Western culture, advancing many fields of thought, science, and technology, both historically and in modern times, including through discrete trends in Jewish philosophy, Jewish ethics, and Jewish literature, as well as specific trends in Jewish culture, including Jewish art, Jewish music, Jewish humor, Jewish theatre, Jewish cuisine, and Jewish medicine. Jews have established various Jewish political movements and religious movements, and, through the authorship of the Hebrew Bible and parts of the New Testament, provided the foundation for Christianity and Islam. More than 20 percent of awarded Nobel Prizes have gone to individuals of Jewish descent. Philanthropic giving is a widespread core function among Jewish organizations.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Rotational_energy] | [TOKENS: 471]
Contents Rotational energy Rotational energy or angular kinetic energy is kinetic energy due to the rotation of an object and is part of its total kinetic energy. Looking at rotational energy separately around an object's axis of rotation, the following dependence on the object's moment of inertia is observed: E_rotational = (1/2) I ω², where I is the moment of inertia and ω is the angular velocity. The mechanical work required for or applied during rotation is the torque times the rotation angle. The instantaneous power of an angularly accelerating body is the torque times the angular velocity. For free-floating (unattached) objects, the axis of rotation is commonly around the center of mass. Note the close relationship between the result for rotational energy and the energy held by linear (or translational) motion: E_translational = (1/2) m v². In the rotating system, the moment of inertia, I, takes the role of the mass, m, and the angular velocity, ω, takes the role of the linear velocity, v. The rotational energy of a rolling cylinder varies from one half of the translational energy (if it is solid) to the same as the translational energy (if it is hollow). An example is the calculation of the rotational kinetic energy of the Earth. As the Earth has a sidereal rotation period of 23.93 hours, it has an angular velocity of 7.29×10−5 rad·s−1. The Earth has a moment of inertia, I = 8.04×1037 kg·m2. Therefore, it has a rotational kinetic energy of 2.14×1029 J. Part of the Earth's rotational energy can also be tapped using tidal power. Friction from the two global tidal waves dissipates this rotational energy as heat, infinitesimally slowing down Earth's angular velocity ω.
Due to the conservation of angular momentum, this process transfers angular momentum to the Moon's orbital motion, increasing its distance from Earth and its orbital period (see tidal locking for a more detailed explanation of this process).
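The Earth calculation and the rolling-cylinder comparison above can be reproduced with a short numeric check; a minimal sketch (the period, moment of inertia, and expected values come from the text above; the function and variable names are illustrative):

```python
import math

def rotational_energy(moment_of_inertia, omega):
    """E_rot = (1/2) * I * omega^2, the angular analogue of (1/2) * m * v^2."""
    return 0.5 * moment_of_inertia * omega**2

# Earth: sidereal rotation period of 23.93 hours gives the angular velocity.
period_s = 23.93 * 3600.0
omega = 2.0 * math.pi / period_s       # ~7.29e-5 rad/s
I_earth = 8.04e37                      # kg*m^2, Earth's moment of inertia

E_earth = rotational_energy(I_earth, omega)
print(f"omega = {omega:.3e} rad/s")    # ~7.293e-05 rad/s
print(f"E_rot = {E_earth:.3e} J")      # ~2.138e+29 J

# Rolling cylinder, rolling without slipping (v = omega * r):
# solid cylinder,  I = (1/2) m r^2  ->  E_rot / E_trans = 1/2
# hollow cylinder, I = m r^2        ->  E_rot / E_trans = 1
m, r, v = 1.0, 1.0, 2.0
E_trans = 0.5 * m * v**2
print(rotational_energy(0.5 * m * r**2, v / r) / E_trans)  # 0.5
print(rotational_energy(m * r**2, v / r) / E_trans)        # 1.0
```

The ratio check makes the I ↔ m, ω ↔ v analogy concrete: substituting v = ωr turns the rotational formula into a fixed fraction of the translational one.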
========================================
[SOURCE: https://en.wikipedia.org/wiki/Timeline_of_hypertext_technology] | [TOKENS: 126]
Contents Timeline of hypertext technology This article presents a timeline of hypertext technology, including "hypermedia" and related human–computer interaction projects and developments from ancient times up to the present day. The term hypertext is credited to the author and philosopher Ted Nelson. See also Graphical user interface, Multimedia; also Paul Otlet and Henri La Fontaine's Mundaneum, a massively cross-referenced card index system established in 1910.
========================================
[SOURCE: https://en.wikipedia.org/wiki/United_States_Fish_and_Wildlife_Service] | [TOKENS: 2481]
Contents United States Fish and Wildlife Service The United States Fish and Wildlife Service (USFWS or FWS) is a U.S. federal government agency within the United States Department of the Interior which oversees the management of fish, wildlife, and natural habitats in the United States. The mission of the agency is: "working with others to conserve, protect, and enhance fish, wildlife, plants and their habitats for the continuing benefit of the American people". History The original ancestor of the agency was the United States Commission on Fish and Fisheries, more commonly referred to as the United States Fish Commission, created in 1871 by the United States Congress with the purpose of studying and recommending solutions to a noted decline in the stocks of food fish. Spencer Fullerton Baird was appointed to lead it as the first United States Commissioner of Fisheries. In 1903, the Fish Commission was reorganized as the United States Bureau of Fisheries and made part of the United States Department of Commerce and Labor. When the Department of Commerce and Labor was split into the United States Department of Commerce and the United States Department of Labor in 1913, the Bureau of Fisheries was made part of the Department of Commerce. Originally focused on fisheries science and fish culture, the Bureau of Fisheries also assumed other duties; in 1906, the U.S. Congress assigned it the responsibility for the enforcement of fishery and fur seal-hunting regulations in the District of Alaska, and in 1910 for the management and harvest of northern fur seals, foxes, and other fur-bearing animals in the Pribilof Islands, as well as for the care, education, and welfare of the Aleut communities in the islands. In 1939, the Bureau of Fisheries moved from the Department of Commerce to the Department of the Interior. 
The other ancestor of the agency began as the Section of Economic Ornithology, which was established within the United States Department of Agriculture in 1885 and became the Division of Economic Ornithology and Mammalogy in 1886. In 1896 it became the Division of Biological Survey. Clinton Hart Merriam headed the Division for 25 years and became a national figure for improving the scientific understanding of birds and mammals in the United States. By 1905, with funding scarce, the Survey included in its mission the eradication of wolves, coyotes, and other large predators. This garnered it the support of ranchers and western legislators, resulting, by 1914, in a $125,000 congressionally approved budget for destroying wolves, coyotes, and other animals injurious to agriculture and animal husbandry. Meanwhile, scientists like Joseph Grinnell and Charles C. Adams, a founder of the Ecological Society of America, were promoting a balance of nature. In 1924, at a conference organized by the American Society of Mammalogists, the debate generated a public split between those in the Survey, promoting eradication, and the mammalogists, who promoted some sort of accommodation. The Survey subsequently placed over 2 million poisoned bait stations across the West. The Survey then turned to the eradication of coyotes, coordinated through the 1931 Animal Damage Control Act. In 1934, the Division of Biological Survey was reorganized as the Bureau of Biological Survey, and Jay Norwood Darling was appointed its chief. The same year, Congress passed the Fish and Wildlife Coordination Act, one of the oldest federal environmental review statutes. Under Darling's guidance, the Bureau began an ongoing legacy of protecting vital natural habitat throughout the United States. In 1939, the Bureau of Biological Survey moved from the Department of Agriculture to the Department of the Interior.
On June 30, 1940, the Bureau of Fisheries and the Bureau of Biological Survey were combined to form the Department of the Interior's Fish and Wildlife Service. In 1956, the Fish and Wildlife Service was reorganized as the United States Fish and Wildlife Service, which remained part of the Department of the Interior, and its operations were divided into two bureaus, the Bureau of Sport Fisheries and Wildlife and the Bureau of Commercial Fisheries, with the latter inheriting the history and heritage of the old U.S. Fish Commission and U.S. Bureau of Fisheries. In 1970, the National Oceanic and Atmospheric Administration was formed within the Department of Commerce. That year, the Bureau of Commercial Fisheries merged with the saline water laboratories of the Bureau of Sport Fisheries and Wildlife, leading to the creation of today's National Marine Fisheries Service, which also acquired the former fleet of the United States Fish and Wildlife Service. The use of poisoned bait stations continued into the early 1970s. Although it resulted in the deaths of hundreds of thousands of coyotes, this method failed to significantly reduce their population. Instead, due to their remarkable adaptability and resilience, coyotes migrated into a wider range of habitats, even venturing into urban areas. Driven by growing environmental awareness in the late 1960s and early 1970s, Richard Nixon banned the predator poisons in use since the Second World War and signed the Endangered Species Act of 1973. In 1985, Ronald Reagan reversed the poison ban and transferred the responsibility for predator control to the Wildlife Services program under the Department of Agriculture. 
Habitat conservation The agency plays a vital role in wildlife and habitat conservation through various key functions, including protecting native species, managing migratory bird populations, restoring fisheries to support ecosystems, conserving vital habitats like wetlands, and overseeing wildlife efforts on military bases per the Sikes Act. The agency manages the National Wildlife Refuge System, guided by the National Wildlife Refuge System Administration Act of 1966, which consists of 570 National Wildlife Refuges, encompassing a full range of habitat types, including wetlands, prairies, coastal and marine areas, and temperate, tundra, and boreal forests spread across all 50 U.S. states. It also manages thousands of small wetlands and other areas covering over 150 million acres. The agency participates in the governance of six National Monuments. The agency shares responsibility for administering the Endangered Species Act of 1973 with the National Marine Fisheries Service. The latter is responsible for marine species, while the agency oversees freshwater fish and all other species. The two organizations jointly manage species that inhabit both marine and non-marine environments. To manage the listing process, a listing priority number is assigned to candidate species to reflect the relative urgency of listing them as threatened or endangered when an immediate listing is not feasible. The agency publishes the quarterly Endangered Species Bulletin. The National Conservation Training Center trains employees and partners in the accomplishment of the agency's mission. The vast majority of fish and wildlife habitats are on state or private land not controlled by the United States government. Therefore, the agency's Partners for Fish and Wildlife program works closely with private groups such as Partners in Flight, the National Wildlife Refuge Association, and the Landscape Conservation Cooperatives to promote voluntary habitat conservation and restoration. 
Migratory bird program The Migratory Bird Program aims to protect and conserve bird populations and habitats. It ensures ecological sustainability, enhances opportunities for birdwatching and other outdoor activities, and promotes awareness of the importance of migratory birds. To achieve these goals, the program utilizes resources such as the National Wetlands Inventory to map and monitor critical wetland habitats. The program conducts surveys, coordinates conservation partnerships, offers conservation grants, develops policies, manages conservation laws such as the Migratory Bird Conservation Act, educates children, and provides resources for engaging with nature and birds. The agency organizes the annual art competition for the Federal Duck Stamp, formally known as the Migratory Bird Hunting and Conservation Stamp, a collectable adhesive stamp required for waterfowl hunting. The stamp also grants access to National Wildlife Refuges without payment of an admission fee. Since 1934, sales of Federal Duck Stamps have generated over $1.2 billion, which has enabled the conservation of more than 6 million acres of wetland habitat. This makes the Duck Stamp one of the most successful wetland conservation revenue programs in history, with over 98% of the funds directly supporting the acquisition of wetlands and conservation easements for the National Wildlife Refuge System. Restoring fisheries The agency oversees the National Fish Hatchery System, which includes 71 fish hatcheries and 65 conservation offices. Originally created to reverse declines in lake and coastal fish stocks, the program subsequently expanded its mission to include the preservation of the genes of wild and hatchery-raised fish. The fish hatcheries contribute to the restoration of native fish populations, freshwater mussels, and amphibians, including populations of species listed under the Endangered Species Act, and provide fish to benefit Native Americans and nature reserves. 
The National Fish Passage Program provides financial and technical resources to projects that promote the free movement of fish and aquatic life. Common projects include dam removal and fishway construction. Between 1999 and 2023, the program worked with over 2,000 local partners to open 61,000 mi (98,000 km) of upstream habitat by removing or bypassing 3,400 aquatic barriers. The agency also plays a key role both in protecting sensitive marine coastal ecosystems under the Coastal Barrier Resources Act and in conducting evaluations of the impacts on fish and wildlife resulting from proposed water resources development projects, in accordance with the Fish and Wildlife Coordination Act. Law enforcement The Office of Law Enforcement enforces wildlife laws such as the Marine Mammal Protection Act, the Migratory Bird Treaty Act of 1918, and the Lacey Act of 1900. Refuge Law Enforcement officers safeguard National Wildlife Refuges, playing a vital role in preventing habitat destruction. The office also provides training to law enforcement officers and collaborates with tribal partners to conserve wildlife resources. The International Affairs Program coordinates national and global initiatives to protect, restore, and enhance wildlife and habitats, with a focus on species of international concern. It fulfills obligations under international treaties, such as CITES. The program collaborates with private citizens, local communities, government agencies, conservation organizations, and other stakeholders to implement treaties and laws and conserve species worldwide. The agency operates the Clark R. Bavin National Fish and Wildlife Forensic Laboratory, the only forensics laboratory in the world devoted to wildlife law enforcement. By treaty, it is also the official crime laboratory for CITES and Interpol. 
The laboratory identifies the species or subspecies of pieces, parts, or products of an animal, determines an animal's cause of death, helps wildlife officers determine whether a violation of law occurred in its death, and identifies and compares physical evidence to link suspects to a crime scene and an animal's death. Tribal relations Pursuant to the Eagle feather law and the Bald and Golden Eagle Protection Act, the agency administers the National Eagle Repository and the permit system for Native American religious use of eagle feathers. These exceptions often only apply to Native Americans who are registered with the federal government and enrolled with a federally recognized tribe. In the late 1990s and early 2000s, the agency began to incorporate the research of tribal scientists into conservation decisions. This came on the heels of Native American traditional ecological knowledge gaining acceptance in the scientific community as a reasonable and respectable way to gain knowledge of managing the natural world. Additionally, other natural resource agencies within the United States government, such as the United States Department of Agriculture, have taken steps to be more inclusive of tribes, native people, and tribal rights. This has marked a transition to a relationship of greater co-operation, rather than the tension between tribes and government agencies seen historically. Today, these agencies work closely with tribal governments to ensure the best conservation decisions are made and that tribes retain their sovereignty. In popular culture Tom Lehrer's satirical song Poisoning Pigeons in the Park gained notoriety in 1959 for its criticism of the agency's Animal Damage Control program's cruel practice of poisoning pigeons. The program was transferred to the Department of Agriculture in 1985 and renamed Wildlife Services. In the 2017 contemporary Western crime film Wind River, Jeremy Renner plays a U.S. 
Fish and Wildlife Service tracker who teams up with an FBI agent to solve a murder on the Wind River Indian Reservation.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_note-414]
Elon Musk Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026, Forbes estimates his net worth to be around US$852 billion. Born into a wealthy family in Pretoria, South Africa, Musk emigrated to Canada in 1989; he holds Canadian citizenship because his mother was born there. He received bachelor's degrees in 1997 from the University of Pennsylvania before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002. Musk also became an American citizen in 2002. In 2002, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research, but later left; growing discontent with the organization's direction and its leadership in the AI boom of the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, a Tesla pay package for Musk worth $1 trillion was approved, which he is to receive over 10 years if he meets specific goals. Musk was the largest donor in the 2024 U.S. presidential election, where he supported Donald Trump. 
After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published in 2025 and 2026 and became a topic of worldwide debate. Early life Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa. Musk therefore holds both South African and Canadian citizenship from birth. His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer, who partly owned a rental lodge at Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa. 
Despite both Elon and Errol previously stating that Errol was a part owner of a Zambian emerald mine, in 2023 Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party and has said that his children shared their father's dislike of apartheid. After his parents divorced in 1979, Elon, aged around 9, chose to live with his father because Errol Musk had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies" where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten so severely that he was hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?" Elon was an enthusiastic reader of books and has attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself how to program from the VIC-20 user manual. At age twelve, Elon sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, where he graduated. Musk was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification. 
Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, as well as to ease his path to immigration to the United States. While waiting for his application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill. In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997 – a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School. He reportedly hosted large, ticketed house parties to help pay for tuition, and wrote a business plan for an electronic book-scanning service similar to Google Books. In 1994, Musk held two internships in Silicon Valley: one at energy storage startup Pinnacle Research Institute, which investigated electrolytic supercapacitors for energy storage, and another at Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided to join the Internet boom of the 1990s, applying for a job at Netscape, but reportedly never received a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa transitioned to an H-1B. According to numerous former business associates and shareholders, Musk had said at the time that he was on a student visa. 
Business career In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors. They housed the venture at a small rented office in Palo Alto. Speaking to Rolling Stone, Musk denounced the notion that they started their company with funds borrowed from Errol Musk, but in a tweet he acknowledged that his father contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune, and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590,000,000 in 2025), and Musk received $22 million (equivalent to $43,000,000 in 2025) for his 7-percent share. In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and, in its initial months of operation, over 200,000 customers joined the service. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with online bank Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign. 
Due to resulting technological issues and the lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000. Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2,700,000,000 in 2025) in stock, of which Musk, the largest shareholder with 11.72% of shares, received $175.8 million (equivalent to $320,000,000 in 2025). In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value. In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth-chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune (equivalent to $180,000,000 in 2025), Musk founded SpaceX in May 2002 and became the company's CEO and chief engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, SpaceX was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2,400,000,000 in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, in 2015 SpaceX successfully landed the first stage of a Falcon 9 on a land platform. 
Later landings were achieved on autonomous spaceport drone ships, an ocean-based recovery platform. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, the Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million (equivalent to $865,000,000 in 2025) contract to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan. In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large constellation was deployed in May 2019. As of May 2025, over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 to be $10 billion (equivalent to $12,000,000,000 in 2025). During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440,000,000 in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response. Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement. 
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11,000,000 in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm. Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass production all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several commercially successful electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In November 2018, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over him tweeting that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in summer 2020, and it entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1,200,000,000,000 in 2025), the sixth company in U.S. history to do so. Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second largest provider of solar power systems in the United States. 
In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2,700,000,000 in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor. In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence (AI) by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions like spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials. Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trials—which have caused the deaths of some monkeys—have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. 
Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink. In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018. It used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021. Local officials have approved further expansions of the tunnel system. In early 2017, Musk expressed interest in buying Twitter and had questioned the platform's commitment to freedom of speech. By 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder. Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter. By the end of April, Musk had successfully concluded his bid for approximately $44 billion. This included approximately $12.5 billion in loans and $21 billion in equity financing. After attempting to back out of the deal, Musk completed the purchase on October 27, 2022. 
Immediately after the acquisition, Musk fired several top Twitter executives, including CEO Parag Agrawal, and became CEO himself. Under Musk, Twitter instituted monthly subscriptions for a "blue check" and laid off a significant portion of the company's staff. Musk reduced content moderation, and hate speech increased on the platform after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of the Hunter Biden laptop controversy in the lead-up to the 2020 presidential election. Musk promised to step down as CEO after a Twitter poll, and five months later he did so, transitioning to the roles of executive chairman and chief technology officer (CTO). After Musk stepped down as CEO, X continued to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence some of his critics, such as Twitch streamer Asmongold, who criticized him during one of his streams, by removing their accounts' blue checkmarks, which hinders visibility and is considered a form of shadow banning, or by suspending their accounts without justification. Other activities In August 2013, Musk announced plans for a version of a vactrain and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances. In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company, and initially gave $50 million. In 2018, Musk left the OpenAI board. 
Since 2018, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings like OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI. Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets and the consequent fossil fuel usage have received criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter and temporarily banned the accounts of journalists who posted stories regarding the incident, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept. In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content but framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies. Politics Musk is an outlier among business leaders, who typically avoid partisan political advocacy. Musk was a registered independent voter when he lived in California. 
Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 special election in Texas's 34th congressional district. In 2024, he started supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income; he also endorsed Kanye West's 2020 presidential campaign. In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign, and hosted DeSantis's campaign announcement on a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips. In October 2025, former vice president Kamala Harris commented that it was a mistake on the Democratic side not to invite Musk to a White House electric vehicle event organized in August 2021 featuring executives from General Motors, Ford and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space."
Fortune remarked that this was a nod to United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation, and guessed that the non-invitation affected Musk's perspective. Fortune noted that, at the time, Musk said, "Yeah, seems odd that Tesla wasn't invited." A month later, he criticized Biden as "not the friendliest administration." Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen and other capitalists actually flourished under Biden, but the tech leaders chose Trump for their common ground on cultural issues. By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying: "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a campaign rally, and promoted conspiracy theories and falsehoods about Democrats, election fraud and immigration in support of Trump. Musk was the largest individual donor of the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers. In 2023, Musk said he shunned the World Economic Forum because it was boring; the organization commented that it had not invited him since 2015. He has, however, participated in Dialog, an event dubbed "Tech Bilderberg" and organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain and the United Kingdom, particularly due to his position in the U.S. government as well as his ownership of X.
An NBC News analysis found he had boosted far-right political movements to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023. During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or a fascist Roman salute.[e] He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided on partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries. The concept of the Department of Government Efficiency (DOGE) emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, with Musk accepting the offer. In November and December 2024, Musk suggested that the organization could help to cut the U.S. federal budget, consolidate the number of federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role was never clearly defined.
In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE. A federal judge has ruled that Musk acted as the de facto leader of DOGE. Musk's role in the second Trump administration, particularly in response to DOGE, has attracted public backlash. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He has prioritized secrecy within the organization and has accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by this time, most of them of children. By November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when the 130-day limit for special government employees expired, with a White House official confirming that Musk's offboarding from the Trump administration was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025. After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit.
A feud began between Musk and Trump, with its most notable event being Musk alleging Trump had ties to sex offender Jeffrey Epstein on X (formerly Twitter) on June 5, 2025. Trump responded on Truth Social stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk, and added that Musk would face "very serious consequences" if he funded Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have received criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates up until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth tax, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration, and regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially the colonization of Mars, which he has repeatedly promoted as a way for humanity to become an interplanetary species and lower the risk of human extinction.
Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. While describing himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdowns restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and was described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024. Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was subsequently attacked as being responsible for spreading misinformation and amplifying the far-right. He has also voiced his support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S. 
Securities and Exchange Commission (SEC) for a tweet stating that funding had been secured for potentially taking Tesla private.[f] The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were fined $20 million each, and Musk was forced to step down for three years as Tesla chairman but was able to remain as CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the previous agreement details, including a list of topics about which Musk needed preclearance. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla stock price ("too high imo") violated the agreement. Freedom of Information Act (FOIA)-released records showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting regarding "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded. 
McCormick called the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger's syndrome in a TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ... but then that turned out to be wrong — [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated he uses doctor-prescribed ketamine for occasional depression and that he doses "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine and other drugs. Musk at first refused to comment on his alleged drug use, before responding that he had not tested positive for drugs, and that if drugs somehow improved his productivity, "I would definitely take them!". 
The New York Times' investigations revealed Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concerns from close associates who have become troubled by his public behavior as he became more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he has said have a "restoring effect" that helps his "mental calibration". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings. Musk has justified the boosting by claiming that all top accounts do it, so he has to as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its creator Ubisoft for "woke" content. Musk posted to X that "DEI kills art" and specified the inclusion of the historical figure Yasuke in the Assassin's Creed game as offensive; he also called the game "terrible". Ubisoft responded by saying that Musk's comments were "just feeding hatred" and that they were focused on producing a game, not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest that the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000.
In 2002, their first child Nevada Musk died of sudden infant death syndrome at the age of 10 weeks. After his death, the couple used in vitro fertilization (IVF) to continue their family; they had twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and have shared custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year. After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters that are not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the pregnancy, Musk confirmed reports that the couple were "semi-separated" in September 2021; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Elon Musk has taken X Æ A-Xii to multiple official events in Washington, D.C. during Trump's second term in office. 
In July 2022, The Wall Street Journal reported that Musk allegedly had an affair with Nicole Shanahan, the wife of Google co-founder Sergey Brin, in 2021, leading to their divorce the following year. Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported Musk bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born in 2024 via surrogacy and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, which media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St. Clair had filed for sole custody of her five-month-old son and for Musk to be recognised as the child's father. On March 31, 2025, Musk wrote that, while he was unsure if he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] Later reporting from The Wall Street Journal indicated that $1 million of these payments to St. Clair was structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island.
In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place after Epstein later cancelled the plans.[k] On Christmas Day in 2012, Musk emailed Epstein asking, "Do you have any parties planned? I've been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I'm looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable, to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein sent an email asking Musk if he had any plans for coming to New York for the opening of the United Nations General Assembly, where many "interesting people" would be coming to his house, to which Musk responded that "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein responded by stating, "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 if he introduced Epstein to Mark Zuckerberg, Musk responded: "I don't recall introducing Epstein to anyone, as I don't know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred.
Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l] Wealth Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. Musk was first listed on the Forbes Billionaires List in 2012; around 75% of his wealth was derived from Tesla stock in November 2020, although he describes himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021; $400 billion in December 2024; $500 billion in October 2025; $600 billion in mid-December 2025; $700 billion later that month; and $800 billion in February 2026. In November 2025, a Tesla pay package worth potentially $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Public image Although his ventures have been highly influential within their separate industries starting in the 2000s, Musk only became a public figure in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions, while also often making controversial statements, in contrast to other billionaires who prefer reclusiveness to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance described people's opinions of Musk as polarized due to his "part philosopher, part troll" persona on Twitter. He has drawn condemnation for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to matters like British survivors of grooming gangs.
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president" or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021. Musk was selected as Time's "Person of the Year" for 2021. Time's then editor-in-chief Edward Felsenthal wrote that "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too."
========================================
[SOURCE: https://en.wikipedia.org/wiki/Enron] | [TOKENS: 9134]
Enron Enron Corporation was an American energy, commodities, and services company based in Houston, Texas. It was led by Kenneth Lay and formed in 1985 via a merger between Houston Natural Gas and InterNorth, both relatively small regional companies at the time of the merger. Before its bankruptcy on December 2, 2001, Enron employed approximately 20,600 staff and was a major electricity, natural gas, communications, and pulp and paper company, with claimed revenues of nearly $101 billion during 2000. Fortune named Enron "America's Most Innovative Company" for six consecutive years. At the end of 2001, it was revealed that Enron's reported financial condition was sustained by an institutionalized, systematic, and creatively planned accounting fraud, known since as the Enron scandal. Enron became synonymous with willful, institutional fraud and systemic corruption. The scandal brought into question the accounting practices and activities of many corporations in the United States and was a factor in the enactment of the Sarbanes–Oxley Act of 2002. It affected the greater business world by causing, together with the even larger fraudulent bankruptcy of WorldCom, the dissolution of the Arthur Andersen accounting firm, which had been Enron and WorldCom's main auditor and a co-conspirator in the fraud for years. Enron filed for bankruptcy in the United States District Court for the Southern District of New York in late 2001 and selected Weil, Gotshal & Manges as its bankruptcy counsel. Enron emerged from bankruptcy in November 2004, under a court-approved plan of reorganization. A new board of directors changed its name to Enron Creditors Recovery Corp. and emphasized reorganizing and liquidating certain operations and assets of the pre-bankruptcy Enron. On September 7, 2006, Enron sold its last remaining subsidiary, Prisma Energy International, to Ashmore Energy International Ltd. (now AEI).
It is the largest bankruptcy due specifically to fraud in United States history. History One of Enron's primary predecessors was the Northern Natural Gas Company, which was formed in 1930, in Omaha, Nebraska, just a few months after Black Tuesday. The low cost of natural gas and the cheap supply of labor during the Great Depression helped to fuel the company's early growth, and it doubled in size by 1932. Over the next 50 years, Northern expanded even more as it acquired many energy companies. It was reorganized in 1979 as the main subsidiary of a holding company, InterNorth, a diversified energy and energy-related products firm. Although most of the acquisitions conducted were successful, some ended poorly. InterNorth competed with Cooper Industries unsuccessfully over a hostile takeover of Crouse-Hinds Company, an electrical products manufacturer. Cooper and InterNorth feuded in numerous suits during the takeover that were eventually settled after the transaction was completed. The subsidiary Northern Natural Gas operated the largest pipeline company in North America. By the 1980s, InterNorth became a major force for natural gas production, transmission, and marketing, as well as for natural gas liquids, and was an innovator in the plastics industry. In 1983, InterNorth merged with the Belco Petroleum Company, a Fortune 500 oil exploration and development company founded by Arthur Belfer. The Houston Natural Gas (HNG) corporation was initially formed from the Houston Oil Co. in 1925 to provide gas to customers in the Houston market through the building of gas pipelines. Under the leadership of CEO Robert Herring from 1967 to 1981, the company took advantage of the unregulated Texas natural gas market and the commodity surge in the early 1970s to become a dominant force in the energy industry. Toward the end of the 1970s, HNG's luck began to run out, with rising gas prices forcing clients to switch to oil.
In addition, with the passing of the Natural Gas Policy Act of 1978, the Texas market was less profitable and, as a result, HNG's profits fell. After Herring died in 1981, M.D. Matthews briefly took over as CEO in a 3-year stint with initial success, but ultimately, a big dip in earnings led to his exit. In 1984, Kenneth Lay succeeded Matthews and inherited the troubled conglomerate. With its conservative success, InterNorth became a target of corporate takeovers, the most prominent originating with Irwin Jacobs. InterNorth CEO Sam Segnar sought a friendly merger with HNG. In May 1985, InterNorth acquired HNG for $2.3 billion, 40% higher than the current market price, and on July 16, 1985, the two entities voted to merge. The combined assets of the two companies created the second largest gas pipeline system in the US at that time. InterNorth's north–south pipelines that served Iowa and Minnesota complemented HNG's Florida and California east–west pipelines well. The company was initially named HNG/InterNorth Inc., even though InterNorth was technically the parent. At the outset, Segnar was CEO, but he was soon fired by the board of directors, which named Lay to the post. Lay moved the headquarters back to Houston and set out to find a new name, spending more than $100,000 on focus groups and consultants in the process. Lippincott & Margulies, the advertising firm responsible for the InterNorth identity five years prior, suggested "Enteron". During a meeting with employees on February 14, 1986, Lay announced his interest in this name change, which would be put to a stockholder vote on April 10. Less than a month after this meeting, on March 7, 1986, a spokesman for HNG/InterNorth rescinded the planned Enteron proposal, as since its announcement the name had come under scrutiny for being the same as a medical term for the intestines. The same press release saw the introduction of the Enron name, which would be the new name voted on come April.
Enron still had some lingering problems left over from its merger: the company had to pay Jacobs, who was still a threat, over $350 million and reorganize itself. Lay sold off any parts of the company that he believed did not belong in Enron's long-term future. He consolidated all the gas pipeline efforts under the Enron Gas Pipeline Operating Company, and the company ramped up its electric power and natural gas efforts. In 1988 and 1989, it added power plants and cogeneration units to its portfolio. In 1989, Jeffrey Skilling, then a consultant at McKinsey & Company, came up with the idea of linking natural gas to consumers in more ways, effectively turning natural gas into a commodity. Enron adopted the idea and called it the "Gas Bank". The division's success prompted Skilling to join Enron as the head of the Gas Bank in 1991. Another major development was a pivot to overseas operations, beginning with a $56 million loan in 1989 from the Overseas Private Investment Corporation (OPIC) for a power plant in Argentina.

Throughout the 1990s, Enron made a few changes to its business plan that greatly improved the perceived profitability of the company. First, Enron invested heavily in overseas assets, specifically energy. Another major shift was the gradual transition of focus from a producer of energy to a company that acted more like an investment firm and sometimes a hedge fund, making profits off the margins of the products it traded. These products were traded through the Gas Bank concept, now called Enron Finance Corp. and headed by Skilling. With the success of the Gas Bank in trading natural gas, Skilling looked to expand the horizons of his division, Enron Capital & Trade, and hired Andrew Fastow in 1990 to help.

Starting in 1994, under the Energy Policy Act of 1992, Congress allowed states to deregulate their electricity utilities, opening them up for competition. California was one such state to do so.
Enron, seeing an opportunity with rising prices, was eager to jump into the market. In 1997, Enron acquired Portland General Electric (PGE). Although an Oregon utility, it had the potential to begin serving the massive California market since PGE was a regulated utility. The new Enron division, Enron Energy, ramped up its efforts by offering discounts to potential customers in California starting in 1998. Enron Energy also began to sell natural gas to customers in Ohio and wind power in Iowa. However, the company ended its retail endeavor in 1999 as it was revealed it was costing upwards of $100 million a year. As fiber optic technology progressed in the 1990s, multiple companies, including Enron, attempted to make money by "keeping the continuing network costs low", which was done by owning their own network. In 1997, FTV Communications LLC, a limited liability company formed by Enron subsidiary FirstPoint Communications, Inc., constructed a 1,380 miles (2,220 km) fiber optic network between Portland and Las Vegas. In 1998, Enron constructed a building in a rundown area of Las Vegas near E Sahara, right over the "backbone" of fiber optic cables providing service to technology companies nationwide. The location had the ability to send "the entire Library of Congress anywhere in the world within minutes" and could stream "video to the whole state of California". The location was also more protected from natural disasters than areas such as Los Angeles or the East Coast. According to Wall Street Daily, "Enron had a secret", it "wanted to trade bandwidth like it traded oil, gas, electricity, etc. It launched a secret plan to build an enormous amount of fiber optic transmission capacity in Las Vegas ... it was all part of Enron's plan to essentially own the internet." Enron sought to have all US internet service providers rely on their Nevada facility to supply bandwidth, which Enron would sell in a fashion similar to other commodities. 
In January 2000, Kenneth Lay and Jeffrey Skilling announced to analysts that they were going to open trading for their own "high-speed fiber-optic networks that form the backbone for Internet traffic". Investors quickly bought Enron stock following the announcement "as they did with most things Internet-related at the time", with stock prices rising from $40 per share in January 2000 to $70 per share in March, peaking at $90 in the summer of 2000. Enron executives obtained windfall gains from the rising stock prices, with a total of $924 million of stocks sold by high-level Enron employees between 2000 and 2001. The head of Enron Broadband Services, Kenneth Rice, sold 1 million shares himself, earning about $70 million in returns. As prices of existing fiber optic cables plummeted due to the vast oversupply of the system, with only 5% of the 40 million miles being active wires, Enron purchased the inactive "dark fibers", expecting to buy them at low cost and then make a profit as the need for more usage by internet providers increased, with Enron expecting to lease its acquired dark fibers in 20-year contracts to providers. However, Enron's accounting would use estimates to determine how much their dark fiber would be worth when "lit" and apply those estimates to their current income, adding exaggerated revenue to their accounts since transactions were not yet made and it was not known if the cables would ever be active. Enron's trading with other energy companies within the broadband market was its attempt to lure large telecommunications companies, such as Verizon Communications, into its broadband scheme to create its own new market. By the second quarter of 2001, Enron Broadband Services was reporting losses. On March 12, 2001, a proposed 20-year deal between Enron and Blockbuster Inc. 
to stream movies on demand over Enron's connections was canceled, with Enron shares dropping from $80 per share in mid-February 2001 to below $60 the week after the deal was killed. The branch of the company that Jeffrey Skilling "said would eventually add $40 billion to Enron's stock value" added only about $408 million in revenue for Enron in 2001, with the company's broadband arm closed shortly after its meager second-quarter earnings report in July 2001. Following the bankruptcy of Enron, telecommunications holdings were sold for "pennies on the dollar". In 2002, Rob Roy of Switch Communications purchased Enron's Nevada facility in an auction attended only by Roy. Enron's "fiber plans were so secretive that few people even knew about the auction." The facility was sold for only $930,000. Following the sale, Switch expanded to control "the biggest data center in the world". Enron, seeing stability after the merger, began to look overseas for new possible energy opportunities in 1991. Enron's first such opportunity was a natural gas power plant utilizing cogeneration that the company built near Middlesbrough, UK. The power plant was so large it could produce up to 3% of the United Kingdom's electricity demand with a capacity of over 1,875 megawatts. Seeing the success in England, the company developed and diversified its assets worldwide under the name of Enron International (EI), headed by former HNG executive Rebecca Mark. By 1994, EI's portfolio included assets in The Philippines, Australia, Guatemala, Germany, France, India, Argentina, the Caribbean, China, England, Colombia, Turkey, Bolivia, Brazil, Indonesia, Norway, Poland, and Japan. The division was producing a large share of earnings for Enron, contributing 25% of earnings in 1996. Mark and EI believed the water industry was the next market to be deregulated by authorities. Seeing the potential, they searched for ways to enter the market, similar to PGE. 
During this period of growth, Enron introduced a new corporate identity on January 14, 1997, and from that point adopted its distinctive tricolor E logo. The logo was one of the final projects of the graphic designer Paul Rand before his death in 1996, and debuted almost three months after his death.

In 1998, Enron International acquired Wessex Water for $2.88 billion. Wessex Water became the core asset of a new company, Azurix, which expanded to other water companies. After Azurix's promising IPO in June 1999, Enron "sucked out over $1 billion in cash while loading it up with debt", according to Bethany McLean and Peter Elkind, authors of The Smartest Guys in the Room: The Amazing Rise and Scandalous Fall of Enron. Additionally, British water regulators required Wessex to cut its rates by 12% starting in April 2000, and an upgrade of the utility's aging infrastructure was required, estimated at costing over a billion dollars. By the end of 2000, Azurix had an operating profit of less than $100 million and was $2 billion in debt. In August 2000, after Azurix stock took a plunge following its earnings report, Mark resigned from Azurix and Enron. Azurix assets, including Wessex, were eventually sold by Enron.

In 1990, Enron's chief operating officer Jeffrey Skilling hired Andrew Fastow, who was well acquainted with the burgeoning deregulated energy market that Skilling wanted to exploit. In 1993, Fastow began establishing numerous limited liability special-purpose entities, a common business practice in the energy industry. However, it also allowed Enron to transfer some of its liabilities off its books, allowing it to maintain a robust and generally increasing stock price and thus keep its critical investment-grade credit ratings.

Enron was originally involved in transmitting and distributing electricity and natural gas throughout the US.
The company developed, built, and operated power plants and pipelines while dealing with rules of law and other infrastructures worldwide.[citation needed] Enron owned a large network of natural gas pipelines, which stretched coast to coast and border to border, including Northern Natural Gas, Florida Gas Transmission, Transwestern Pipeline Company, and a partnership in Northern Border Pipeline from Canada.[citation needed] The states of California, New Hampshire, and Rhode Island had already passed power deregulation laws by July 1996, the time of Enron's proposal to acquire Portland General Electric. During 1998, Enron began operations in the water sector, creating the Azurix Corporation, which it part-floated on the New York Stock Exchange during June 1999. Azurix failed to become successful in the water utility market, and one of its major concessions, in Buenos Aires, was a large-scale money-loser.

Enron grew wealthy due largely to marketing, promoting power, and having a high stock price.[citation needed] Enron was named "America's Most Innovative Company" by Fortune for six consecutive years, from 1996 to 2001. It was on Fortune's "100 Best Companies to Work for in America" list during 2000, and had offices that were stunning in their opulence. Enron was hailed by many, including labor and the workforce, as an overall great company, praised for its large long-term pensions, benefits for its workers, and extremely effective management until the exposure of its corporate fraud. The first analyst to question the company's success story was Daniel Scotto, an energy market expert at BNP Paribas, who issued a note in August 2001 entitled "Enron: All Stressed Up and No Place to Go", which encouraged investors to sell Enron stock, although he only changed his recommendation on the stock from "buy" to "neutral".

As was later discovered, many of Enron's recorded assets and profits were inflated, wholly fraudulent, or nonexistent.
One example was in 1999, when Enron promised to repay Merrill Lynch's investment with interest in order to show a profit on its books. Debts and losses were put into entities formed offshore that were not included in the company's financial statements; other sophisticated and arcane financial transactions between Enron and related companies were used to eliminate unprofitable entities from the company's books.

The company's most valuable asset and the largest source of honest income, the 1930s-era Northern Natural Gas company, was eventually purchased by a group of Omaha investors who relocated its headquarters to their city; it is now a unit of Warren Buffett's Berkshire Hathaway Energy. NNG was pledged as collateral for a $2.5 billion capital infusion by Dynegy Corporation when Dynegy was planning to buy Enron. When Dynegy examined Enron's financial records carefully, it repudiated the deal and dismissed its CEO, Chuck Watson. The new chairman and CEO, the late Daniel Dienstbier, had been president of NNG and an Enron executive at one time, and had been forced out by Ken Lay.[citation needed] Dienstbier was an acquaintance of Warren Buffett. NNG remains profitable today.

2001 accounting scandals

In 2001, after a series of revelations involving irregular accounting procedures perpetrated throughout the 1990s involving Enron and its auditor Arthur Andersen that bordered on fraud, Enron filed for the then-largest Chapter 11 bankruptcy in history (since surpassed by those of WorldCom in 2002 and Lehman Brothers in 2008), resulting in $11 billion in shareholder losses. As the scandal progressed, Enron's share price fell from US$90 in the summer of 2000 to just pennies. Enron's demise occurred after the revelation that much of its profit and revenue were the result of deals with special-purpose entities (limited partnerships which it controlled). This maneuver allowed many of Enron's debts and losses to disappear from its financial statements.
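The off-balance-sheet effect of such entities can be sketched numerically. The figures below are entirely hypothetical (not Enron's actual numbers); they show only why moving debt into an unconsolidated special-purpose entity flatters a reported leverage ratio without changing the underlying economics.

```python
# Hypothetical balance-sheet figures (not Enron's actual numbers):
# parking debt in an unconsolidated special-purpose entity (SPE)
# shrinks reported leverage without changing economic reality.
debt_total = 30.0   # $bn of debt actually owed, economically
equity = 10.0       # $bn of shareholder equity
spe_debt = 12.0     # $bn moved into an off-books SPE

reported_leverage = (debt_total - spe_debt) / equity  # what investors saw
economic_leverage = debt_total / equity               # the true picture

print(f"Reported debt/equity: {reported_leverage:.1f}x")  # 1.8x
print(f"Economic debt/equity: {economic_leverage:.1f}x")  # 3.0x
```

Accounting rules of the era permitted non-consolidation when independent outside investors held as little as 3% of an SPE's equity, which is what made this presentation possible on paper.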
Enron filed for bankruptcy on December 2, 2001. In addition, the scandal caused the dissolution of Arthur Andersen, which at the time was one of the Big Five of the world's accounting firms. The company was found guilty of obstruction of justice in 2002 for destroying documents related to the Enron audit. Since the SEC is not allowed to accept audits from convicted felons, Andersen was forced to stop auditing public companies. Although the conviction was dismissed in 2005 by the Supreme Court, the damage to the Andersen name has prevented it from recovering or reviving itself as a viable business even on a limited scale. Enron also withdrew a naming-rights deal with the Houston Astros Major League Baseball club for its new stadium, which was known formerly as Enron Field (now Daikin Park). Enron used a variety of deceptive and fraudulent tactics and accounting practices to cover its fraud in reporting Enron's financial information. Special-purpose entities were created to mask significant liabilities from Enron's financial statements. These entities made Enron seem more profitable than it was, and created a dangerous spiral in which, each quarter, corporate officers would have to perform more and more financial deception to create the illusion of billions of dollars in profit while the company was actually losing money. This practice increased their stock price to new levels, at which point the executives began to work on insider information and trade millions of dollars' worth of Enron stock. The executives and insiders at Enron knew about the offshore accounts that were hiding losses for the company; the investors, however, did not. 
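One mechanism behind such illusory profits was mark-to-market accounting, under which the full present value of a long-term deal's projected profits could be recognized the moment the deal was signed. The sketch below uses invented numbers purely for illustration.

```python
# Toy mark-to-market example with invented numbers: the present value
# of a 10-year deal's *projected* profits is recognized as income at
# signing, even though no cash has yet been collected.
annual_profit_estimate = 5.0   # $m per year, management's own projection
years = 10
discount_rate = 0.08

pv_booked_now = sum(
    annual_profit_estimate / (1 + discount_rate) ** t
    for t in range(1, years + 1)
)
print(f"Income recognized at signing: ${pv_booked_now:.1f}m")  # ≈ $33.6m
print("Cash actually received so far: $0.0m")
```

The asymmetry is the point: income appears up front, while the cash, and any shortfall versus the projection, arrives (or fails to arrive) over the following decade.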
Chief Financial Officer Andrew Fastow directed the team that created the off-books companies and manipulated the deals to provide himself, his family, and his friends with hundreds of millions of dollars in guaranteed revenue, at the expense of the corporation for which he worked and its stockholders.[citation needed] In 1999, Enron initiated EnronOnline, an Internet-based trading operation, which was used by virtually every energy company in the United States. By promoting the company's aggressive investment strategy, Enron's president and chief operating officer Jeffrey Skilling helped make Enron the biggest wholesaler of gas and electricity, trading over $27 billion per quarter. The corporation's financial claims, however, had to be accepted at face value. Under Skilling, Enron adopted mark-to-market accounting, in which anticipated future profits from any deal were tabulated as if currently real. Thus, Enron could record gains from what over time might turn out to be losses, as the company's fiscal health became secondary to manipulating its stock price during the so-called Tech boom. But when a company's success is measured by undocumented financial statements, actual balance sheets are inconvenient. Indeed, Enron's unscrupulous actions were often gambles to keep the deception going and so increase the stock price. An advancing price meant a continued infusion of investor capital on which debt-ridden Enron in large part subsisted (much like a financial "pyramid" or "Ponzi scheme"). Attempting to maintain the illusion, Skilling verbally attacked Wall Street analyst Richard Grubman, who questioned Enron's unusual accounting practice during a recorded conference telephone call. When Grubman complained that Enron was the only company that could not release a balance sheet along with its earnings statements, Skilling replied, "Well, thank you very much, we appreciate that ... asshole." 
Though the comment was met with dismay and astonishment by the press, Wall Street analysts, and the public, it became an inside joke among many Enron employees, mocking Grubman for his perceived meddling rather than Skilling's offensiveness.

Enron initially planned to retain its three domestic pipeline companies as well as most of its overseas assets. However, before emerging from bankruptcy, Enron sold its domestic pipeline companies as CrossCountry Energy for $2.45 billion and later sold other assets to Vulcan Capital Management. Enron sold its last business, Prisma Energy, during 2006, leaving Enron asset-less. During early 2007, its name was changed to Enron Creditors Recovery Corporation. Its goal was to repay the old Enron's remaining creditors and end Enron's affairs. In December 2008, it was announced that Enron's creditors would receive $7.2 billion from the company's liquidation (approximately 17 percent of the debts owed by the company). After Citigroup and JP Morgan Chase were sued for their role in abetting Enron's practices with loans, the two companies agreed to pay billions of dollars to Enron's creditors. By May 2011, $21.8 billion had been distributed to the creditors, totaling 53 percent of Enron's debts at the time of bankruptcy. Enron Creditors Recovery Corporation was ultimately dissolved on November 28, 2016. Azurix, the former water utility part of the company, remains under Enron ownership, although it is currently asset-less. It is involved in several litigations against the government of Argentina, claiming compensation for the negligence and corruption of the local governance during its management of the Buenos Aires water concession in 1999, which resulted in substantial amounts of debt (approximately $620 million) and the eventual collapse of the branch.
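The two creditor-recovery milestones quoted above ($7.2 billion as roughly 17 percent of debts in 2008, and $21.8 billion as 53 percent by 2011) imply a consistent total debt at bankruptcy of roughly $41–42 billion, as a quick back-of-the-envelope check shows:

```python
# Back-of-the-envelope check that the two reported recovery milestones
# imply a consistent total debt at the time of bankruptcy.
implied_total_2008 = 7.2 / 0.17    # $bn implied by the 2008 announcement
implied_total_2011 = 21.8 / 0.53   # $bn implied by the 2011 distribution

print(f"Implied total debt (2008 figure): ${implied_total_2008:.1f}bn")  # ≈ $42.4bn
print(f"Implied total debt (2011 figure): ${implied_total_2011:.1f}bn")  # ≈ $41.1bn
```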
Soon after emerging from bankruptcy in November 2004, Enron's new board of directors sued 11 financial institutions for helping Lay, Fastow, Skilling, and others hide Enron's true financial condition. The proceedings were dubbed the "megaclaims litigation". Among the defendants were Royal Bank of Scotland, Deutsche Bank, and Citigroup. As of 2008, Enron had settled with all of the institutions, ending with Citigroup. Enron was able to obtain nearly $7.2 billion to distribute to its creditors as a result of the megaclaims litigation. As of December 2009, some claim and process payments were still being distributed.

Enron has been featured since its bankruptcy in popular culture, including in The Simpsons episodes "That '90s Show" (Homer buys Enron stock while Marge chooses to keep her own Microsoft shares) and "Special Edna", which features a scene of an Enron-themed amusement park ride. The 2007 film Bee Movie also featured a joke reference to a parody company of Enron called "Honron" (a play on the words honey and Enron). The 2003 documentary The Corporation made frequent references to Enron post-bankruptcy, calling the company a "bad apple".

Insider trading scandal

In August 2000, Enron's stock price attained its greatest value, closing at $90 on the 23rd. At this time, Enron executives, who possessed inside information on the hidden losses, began to sell their stock. At the same time, the general public and Enron's investors were told to buy the stock. Executives told the investors that the stock would continue to increase until it attained possibly the $130 to $140 range, while secretly unloading their shares. As executives sold their shares, the price began to decrease. Investors were told to continue buying stock or hold steady if they already owned Enron because the stock price would rebound shortly. Kenneth Lay's strategy for responding to Enron's continuing problems was his demeanor.
As he did many times, Lay would issue a statement or make an appearance to calm investors and assure them that Enron was doing well. In March 2001, an article by Bethany McLean appeared in Fortune magazine noting that no one understood how the company made money and questioning whether Enron stock was overvalued. By August 15, 2001, Enron's stock price had decreased to $42. Many of the investors still trusted Lay and believed that Enron would rule the market. They continued to buy or retain their stock as the equity value decreased. As October ended, the stock had decreased to $15. Many considered this a great opportunity to buy Enron stock because of what Lay had been telling them in the media.

Lay was accused of selling more than $70 million worth of stock at this time, which he used to repay cash advances on lines of credit. He sold another $29 million worth of stock in the open market. Also, Lay's wife, Linda, was accused of selling 500,000 shares of Enron stock totaling $1.2 million on November 28, 2001. The money earned from this sale did not go to the family but rather to charitable organizations, which had already received pledges of contributions from the foundation. Records show that Mrs. Lay made the sale order sometime between 10:00 and 10:20 am. News of Enron's problems, including the millions of dollars in losses they had hidden, became public about 10:30 that morning, and the stock price soon decreased to less than one dollar.

Former Enron executive Paula Rieker was charged with criminal insider trading and sentenced to two years' probation. Rieker had obtained 18,380 Enron shares for $15.51 a share. She sold that stock for $49.77 a share in July 2001, a week before the public was told what she already knew about the $102 million loss. In 2002, after the tumultuous fall of Enron's external auditor and management consultant Andersen LLP, former Andersen director John M. Cunningham coined the phrase, "We have all been Enroned."
The fallout resulted in both Lay and Skilling being convicted of conspiracy, fraud, and insider trading. Lay died before sentencing; Skilling received a sentence of 24 years and 4 months and a $45 million penalty (later reduced). Fastow was sentenced to six years of jail time, and Lou Pai settled out of court for $31.5 million.

California's deregulation and subsequent energy crisis

In October 2000, Daniel Scotto, the most renowned utility analyst on Wall Street, suspended his ratings on all energy companies conducting business in California because of the possibility that the companies would not receive full and adequate compensation for the deferred energy accounts used as the basis for the California Deregulation Plan enacted during the late 1990s. Five months later, Pacific Gas & Electric (PG&E) was forced into bankruptcy. Republican Senator Phil Gramm, husband of Enron board member Wendy Gramm and the second-largest recipient of campaign contributions from Enron, succeeded in legislating California's energy commodity trading deregulation. Despite warnings from prominent consumer groups that this law would give energy traders too much influence over energy commodity prices, the legislation was passed in December 2000. As the periodical Public Citizen reported, because of Enron's new, unregulated power auction, the company's "Wholesale Services" revenues quadrupled, from $12 billion in the first quarter of 2000 to $48.4 billion in the first quarter of 2001.

After the passage of the deregulation law, California had a total of 38 Stage 3 rolling blackouts declared, until federal regulators intervened in June 2001. These blackouts occurred as a result of a poorly designed market system that was manipulated by traders and marketers, as well as from poor state management and regulatory oversight.
Subsequently, Enron traders were revealed to have intentionally encouraged the removal of power from the market during California's energy crisis by encouraging suppliers to shut down plants to perform unnecessary maintenance, as documented in recordings made at the time. These acts contributed to the need for rolling blackouts, which adversely affected many businesses dependent upon a reliable supply of electricity and inconvenienced a large number of retail customers. The artificially reduced supply increased prices, and Enron traders were thus able to sell power at premium prices, sometimes up to 20 times its normal peak value.

The callousness of the traders' attitude toward ratepayers was documented in an evidence tape of a conversation regarding the matter, sarcastically referencing the confusion of retiree voters in Florida's Miami-Dade County in the November 2000 presidential election:

"They're fucking taking all the money back from you guys? All the money you guys stole from those poor grandmothers in California?"

"Yeah, Grandma Millie man. But she's the one who couldn't figure out how to fucking vote on the butterfly ballot."

(Laughing from both sides.)

"Yeah, now she wants her fucking money back for all the power you've charged right up, jammed right up her ass for fucking $250 a megawatt-hour."

The traders had been discussing the efforts of the Snohomish PUD in northwestern Washington state to recover the massive overcharges that Enron had engineered. Morgan Stanley, which had taken Enron's place in the lawsuit, fought the release of the documents that the PUD had sought to make its case, but which were being withheld by the Federal Energy Regulatory Commission.

Former management and corporate governance

Products

Enron traded in more than 30 different products, including oil and LNG transportation, broadband, principal investments, risk management for commodities, shipping/freight, streaming media, and water and wastewater.
Products traded on EnronOnline in particular included petrochemicals, plastics, power, pulp and paper, steel, and weather risk management. Enron was also an extensive futures trader, including sugar, coffee, grains, hogs, and other meat futures. At the time of its bankruptcy filing in December 2001, Enron was structured into seven distinct business units. Enron manufactured gas valves, circuit breakers, thermostats, and electrical equipment in Venezuela using INSELA SA, a 50–50 joint venture with General Electric. Enron owned three paper and pulp products companies: Garden State Paper, a newsprint mill; as well as Papiers Stadacona and St. Aurelie Timberlands. Enron had a controlling stake in the Louisiana-based petroleum exploration and production company Mariner Energy. Enron opened EnronOnline, an electronic trading platform for energy commodities, on November 29, 1999. Conceptualized by the company's European Gas Trading team, it was the first web-based transaction system that allowed buyers and sellers to buy, sell, and trade commodity products globally. It allowed users to do business only with Enron. The site allowed Enron to transact with participants in the global energy markets. The main commodities offered on EnronOnline were natural gas and electricity, although there were 500 other products including credit derivatives, bankruptcy swaps, pulp, gas, plastics, paper, steel, metals, freight, and TV commercial time. At its maximum, more than $6 billion worth of commodities were transacted using EnronOnline every day, but specialists questioned how Enron reported trades and calculated its profits, saying that the same fraudulent accounting that was rampant at Enron's other operations may have been used in trading. After Enron's bankruptcy in late 2001, EnronOnline was sold to the Swiss financial giant UBS. Within a year, UBS abandoned its efforts to relaunch the division and closed it in November 2002. 
Enron International (EI) was Enron's wholesale asset development and asset management business. Its primary emphasis was developing and building natural gas power plants outside North America. Enron Engineering and Construction Company (EECC) was a wholly owned subsidiary of Enron International and built almost all of Enron International's power plants. Unlike other business units of Enron, Enron International had a strong cash flow at the bankruptcy filing.[citation needed] Enron International consisted of all of Enron's foreign power projects, including ones in Europe. The company's Teesside plant was one of the largest gas-fired power stations in the world, built and operated by Enron from 1989, and produced 3 percent of the United Kingdom's energy needs. Enron owned half of the plant's equity, with the remaining 50 percent split between four regional electricity companies. Rebecca Mark was the CEO of Enron International until she resigned to manage Enron's newly acquired water business, Azurix, in 1997. Mark had a major role in the development of the Dabhol project in India, Enron's largest international endeavor. Enron International constructed power plants and pipelines across the globe. Some are presently still operating, including the massive Teesside plant in England. Others, like a barge-mounted plant off Puerto Plata in the Dominican Republic, cost Enron money through lawsuits and investment losses. Puerto Plata was a barge-mounted power plant next to the hotel Hotelero del Atlantico. When the plant was activated, winds blew soot from the plant onto the hotel guests' meals, blackening their food. The winds also blew garbage from nearby slums into the plant's water-intake system. For some time the only solution was to hire men who would row out and push the garbage away with their paddles. Through mid-2000 the company collected a paltry $3.5 million from a $95 million investment. 
Enron also had other investment projects in Europe, Argentina, Brazil, Bolivia, Colombia, Mexico, Jamaica, Venezuela, elsewhere in South America, and across the Caribbean.

Around 1992, Indian experts came to the United States to find energy investors to help with India's energy shortage problems. During December 1993, Enron finalized a 20-year power-purchase contract with the Maharashtra State Electricity Board (MSEB). The contract allowed Enron to construct a massive 2,015-megawatt power plant on a remote volcanic bluff 100 miles (160 km) south of Mumbai through a two-phase project called the Dabhol Power Station. Construction would be completed in two phases, and Enron would form the Dabhol Power Company to help manage the plant. The power project was the first step in a $20 billion scheme to help rebuild and stabilize India's power grid. GE (which was selling turbines to the project) and Bechtel (which was constructing the plant) each contributed 10% of the equity, with the balance held by Enron and the MSEB.

In 1996, when India's Congress Party was no longer in power, the Indian government assessed the project as excessively expensive, refused to pay for the plant, and stopped construction. The MSEB was required by contract to continue to pay Enron plant maintenance charges, even if no power was purchased from the plant. The MSEB determined that it could not afford to purchase the power (at Rs. 8 per unit (kWh)) charged by Enron. The plant operator was unable to find alternate customers for Dabhol power due to the absence of a free market in the regulated structure of utilities in India.[citation needed] By 2000, the Dabhol plant was almost complete and Phase 1 had begun producing power. Enron as a whole, however, was heavily overextended, and in the summer of that year Mark and all the key executives at Enron International were asked to resign from Enron to reshape the company and divest it of asset businesses.
Shortly thereafter a payment dispute with the MSEB ensued, and Enron issued a stop-work order on the plant in June 2001. From 1996 until Enron's bankruptcy in 2001, the company tried without success to revive the project and renew interest in India's need for the power plant. By December 2001, the Enron scandal and bankruptcy cut short any opportunity to revive construction and complete the plant. In 2005, an Indian government-run company, Ratnagiri Gas and Power, was set up to finish construction of the Dabhol facility and operate the plant. During the summer of 2001, Enron attempted to sell several of Enron International's assets, many of which were not sold. It was publicly unclear why Enron wanted to sell these assets; observers suspected it was because Enron needed cash. Employees who worked with company assets were told in 2000 that Jeff Skilling believed that business assets were an outdated measure of a company's worth, and that he instead wanted to build a company based on "intellectual assets". Enron Global Exploration & Production Inc. (EGEP) was an Enron subsidiary born from the split of domestic assets via EOG Resources (formerly Enron Oil and Gas, EOG) and international assets via EGEP (formerly Enron Oil and Gas Int'l, Ltd, EOGIL). Among the EGEP assets were the Panna-Mukta and South Tapti fields, discovered by the Indian state-owned Oil and Natural Gas Corporation (ONGC), which operated the fields initially. In December 1994, a joint venture began between ONGC (40%), Enron (30%), and Reliance (30%). In mid-2002, British Gas (BG) completed the acquisition of EGEP's 30% share of the Panna-Mukta and Tapti fields for $350 million, a few months before Enron filed for bankruptcy.
Enron Prize for Distinguished Public Service During the mid-1990s, Enron established an endowment for the Enron Prize for Distinguished Public Service, awarded by Rice University's Baker Institute to "recognize outstanding individuals for their contributions to public service". Recipients were: Greenspan, because of his position as the Fed chairman, was not at liberty to accept the $10,000 honorarium, the $15,000 sculpture, or the crystal trophy, and accepted only the "honor" of being named an Enron Prize recipient. The situation was further complicated because, a few days earlier, Enron had filed paperwork admitting it had falsified financial statements for five years. Greenspan did not mention Enron a single time during his speech. At the ceremony, Ken Lay stated, "I'm looking forward to our first woman recipient." The next morning, the Houston Chronicle reported that no decision had been made on whether the name of the prize would be changed. Nineteen days after the prize was awarded to Greenspan, Enron declared bankruptcy. In early 2002, Enron was awarded the satirical Ig Nobel Prize for "Most Creative Use of Imaginary Numbers". The former members of the Enron management team all refused to accept the award in person, although no reason was given at the time. Enron's influence on politics "Women of Enron" In 2002, Playboy magazine featured a nude pictorial, "Women of Enron", with ten former and current female Enron employees. The women said they posed for fun and to earn some money. 2024 satirical reboot and meme coin On December 2, 2024, a new tweet was posted on the @Enron X account. The tweet had the caption "We're back. Can we talk?", along with a promotional video. Viewers also noticed that the website enron.com was functional again.
Some replies alleged a potential cryptocurrency scam, while others pointed to the new terms of service, including a clause explicitly stating that the content on the website was parody protected under the First Amendment. Many pointed out that this was likely a joke, as the domain seemed to be registered by Peter McIndoe, a performance artist and founder of Birds Aren't Real. Additionally, the rights to the Enron name had been purchased at auction by The College Company for $275 in 2020. On December 9, it was announced that the CEO of Enron was Connor Gaydos, another cofounder of The College Company and Birds Aren't Real. It was also announced that Enron planned to hold a new "Enron Power Summit" on January 6, 2025. On December 12, the Twitter page @Pubity posted a video of Gaydos being pied in the face. Many viewers quickly assumed the incident was staged, and it may have been a parody of an incident in which Jeff Skilling was pied in the face by a California woman. On January 6, 2025, Enron announced the Enron Egg, its first new product in 20 years. Gaydos claimed that the product was a micro nuclear reactor capable of powering a suburban home for 10 years using 20% enriched uranium in the form of uranium zirconium hydride. The announcement supposedly took place at the previously announced "Enron Power Summit", but the Houston Chronicle was unable to confirm that such an event actually took place. On February 4, 2025, Enron launched a crypto token named $ENRON on the Solana blockchain as part of its satire; at one point it traded with a market capitalization of $700 million before the price fell at least 76% within 24 hours of launch. See also Notes References Bibliography External links
========================================
[SOURCE: https://en.wikipedia.org/wiki/Python_(programming_language)#cite_ref-151] | [TOKENS: 4314]
Contents Python (programming language) Python is a high-level, general-purpose programming language. Its design philosophy emphasizes code readability with the use of significant indentation. Python is dynamically type-checked and garbage-collected. It supports multiple programming paradigms, including structured (particularly procedural), object-oriented and functional programming. Guido van Rossum began working on Python in the late 1980s as a successor to the ABC programming language. Python 3.0, released in 2008, was a major revision and not completely backward-compatible with earlier versions. Beginning with Python 3.5, capabilities and keywords for typing were added to the language, allowing optional static typing. As of 2026, the Python Software Foundation supports Python 3.10, 3.11, 3.12, 3.13, and 3.14, following the project's annual release cycle and five-year support policy. Python 3.15 is currently in the alpha development phase, and its stable release is expected in October 2026. Earlier versions in the 3.x series have reached end-of-life and no longer receive security updates. Python has gained widespread use in the machine learning community. It is widely taught as an introductory programming language. Since 2003, Python has consistently ranked in the top ten of the most popular programming languages in the TIOBE Programming Community Index, which ranks languages based on searches on 24 platforms. History Python was conceived in the late 1980s by Guido van Rossum at Centrum Wiskunde & Informatica (CWI) in the Netherlands. It was designed as a successor to the ABC programming language, which was inspired by SETL and capable of exception handling and interfacing with the Amoeba operating system. Python implementation began in December 1989. Van Rossum first released it in 1991 as Python 0.9.0.
Van Rossum assumed sole responsibility for the project, as the lead developer, until 12 July 2018, when he announced his "permanent vacation" from responsibilities as Python's "benevolent dictator for life" (BDFL); this title was bestowed on him by the Python community to reflect his long-term commitment as the project's chief decision-maker. (He has since come out of retirement and is self-titled "BDFL-emeritus".) In January 2019, active Python core developers elected a five-member Steering Council to lead the project. The name Python derives from the British comedy series Monty Python's Flying Circus. (See § Naming.) Python 2.0 was released on 16 October 2000, introducing features such as list comprehensions, cycle-detecting garbage collection, reference counting, and Unicode support. Python 2.7's end-of-life was initially set for 2015, and then postponed to 2020 out of concern that a large body of existing code could not easily be forward-ported to Python 3. It no longer receives security patches or updates. While Python 2.7 and older versions are officially unsupported, a different unofficial Python implementation, PyPy, continues to support Python 2, i.e., "2.7.18+" (plus 3.11), with the plus signifying (at least some) "backported security updates". Python 3.0 was released on 3 December 2008; it was a major revision, not completely backward-compatible with earlier versions, with some new semantics and changed syntax. Python 2.7.18, released in 2020, was the last release of Python 2. Several releases in the Python 3.x series have added new syntax to the language and made a few (considered very minor) backward-incompatible changes. As of January 2026, Python 3.14.3 is the latest stable release. Older 3.x branches received final security updates ending with Python 3.9.24 and then 3.9.25, the last release in the 3.9 series. Python 3.10 has been the oldest supported branch since November 2025.
Python 3.15 has had an alpha release, and an official downloadable Python 3.14 executable is available for Android. Releases receive two years of full support followed by three years of security support. Design philosophy and features Python is a multi-paradigm programming language. Object-oriented programming and structured programming are fully supported, and many of their features support functional programming and aspect-oriented programming – including metaprogramming and metaobjects. Many other paradigms are supported via extensions, including design by contract and logic programming. Python is often referred to as a 'glue language' because it is purposely designed to integrate components written in other languages. Python uses dynamic typing and a combination of reference counting and a cycle-detecting garbage collector for memory management. It uses dynamic name resolution (late binding), which binds method and variable names during program execution. Python's design offers some support for functional programming in the "Lisp tradition". It has filter, map, and reduce functions; list comprehensions, dictionaries, sets, and generator expressions. The standard library has two modules (itertools and functools) that implement functional tools borrowed from Haskell and Standard ML. Python's core philosophy is summarized in the Zen of Python (PEP 20), written by Tim Peters, which includes aphorisms such as these: However, Python has received criticism for violating these principles and adding unnecessary language bloat. Responses to these criticisms note that the Zen of Python is a guideline rather than a rule. The addition of some new features has been controversial: Guido van Rossum resigned as Benevolent Dictator for Life after conflict over adding the assignment expression operator in Python 3.8. Nevertheless, rather than building all functionality into its core, Python was designed to be highly extensible via modules.
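The functional tools mentioned above (filter, map, and reduce, plus comprehensions and the itertools and functools modules) can be exercised in a short, self-contained sketch; the variable names here are purely illustrative:

```python
from functools import reduce
from itertools import accumulate

nums = [1, 2, 3, 4, 5]

# filter and map return lazy iterators in Python 3; list() materializes them.
evens = list(filter(lambda n: n % 2 == 0, nums))   # [2, 4]
squares = list(map(lambda n: n * n, nums))         # [1, 4, 9, 16, 25]

# reduce has lived in functools since Python 3.
total = reduce(lambda a, b: a + b, nums)           # 15

# A list comprehension expressing the filter+map combination in one step.
square_comp = [n * n for n in nums if n % 2 == 0]  # [4, 16]

# itertools.accumulate yields running totals lazily.
running = list(accumulate(nums))                   # [1, 3, 6, 10, 15]
```

In idiomatic Python, the comprehension form is usually preferred over chaining filter and map with lambdas, since it reads as a single declarative expression.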
This compact modularity has made it particularly popular as a means of adding programmable interfaces to existing applications. Van Rossum's vision of a small core language with a large standard library and easily extensible interpreter stemmed from his frustrations with ABC, which represented the opposite approach. Python strives for a simpler, less-cluttered syntax and grammar, while giving developers a choice in their coding methodology. Python lacks do .. while loops, which Van Rossum considered harmful. In contrast to Perl's motto "there is more than one way to do it", Python advocates an approach where "there should be one – and preferably only one – obvious way to do it". In practice, however, Python provides many ways to achieve a given goal. There are at least three ways to format a string literal, with no certainty as to which one a programmer should use. Alex Martelli, a Fellow of the Python Software Foundation and a Python book author, wrote that "To describe something as 'clever' is not considered a compliment in the Python culture." Python's developers typically prioritize readability over performance. For example, they reject patches to non-critical parts of the CPython reference implementation that would offer increases in speed that do not justify the cost of clarity and readability. Execution speed can be improved by moving speed-critical functions to extension modules written in languages such as C, or by using a just-in-time compiler like PyPy. It is also possible to transpile to other languages. However, this approach either fails to achieve the expected speed-up, since Python is a very dynamic language, or compiles only a restricted subset of Python (with potential minor semantic changes). Python is meant to be a fun language to use. This goal is reflected in the name – a tribute to the British comedy group Monty Python – and in playful approaches to some tutorials and reference materials.
For instance, some code examples use the terms "spam" and "eggs" (in reference to a Monty Python sketch), rather than the typical terms "foo" and "bar". A common neologism in the Python community is pythonic, which has a broad range of meanings related to program style: Pythonic code may use Python idioms well; be natural or show fluency in the language; or conform with Python's minimalist philosophy and emphasis on readability. Syntax and semantics Python is meant to be an easily readable language. Its formatting is visually uncluttered and often uses English keywords where other languages use punctuation. Unlike many other languages, it does not use curly brackets to delimit blocks, and semicolons after statements are allowed but rarely used. It has fewer syntactic exceptions and special cases than C or Pascal. Python uses whitespace indentation, rather than curly brackets or keywords, to delimit blocks. An increase in indentation comes after certain statements; a decrease in indentation signifies the end of the current block. Thus, the program's visual structure accurately represents its semantic structure. This feature is sometimes termed the off-side rule. Some other languages use indentation this way; but in most, indentation has no semantic meaning. The recommended indent size is four spaces. Python's statements include the following: The assignment statement (=) binds a name as a reference to a separate, dynamically allocated object. Variables may subsequently be rebound at any time to any object. In Python, a variable name is a generic reference holder without a fixed data type; however, it always refers to some object with a type. This is called dynamic typing—in contrast to statically-typed languages, where each variable may contain only a value of a certain type. Python does not support tail call optimization or first-class continuations; according to Van Rossum, the language never will. 
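The name-binding behavior described above, where a variable name is a generic reference holder that can be rebound at any time while the object itself always carries a type, can be sketched in a few lines; the names x, a, and b are illustrative only:

```python
# A name is just a reference; it can be rebound to objects of any type.
x = 42
assert isinstance(x, int)

x = "now a string"        # rebinds the name; the int object is unchanged
assert isinstance(x, str)

# The object carries the type, and two names can refer to one object.
a = [1, 2, 3]
b = a                     # b references the same list object as a
b.append(4)
assert a == [1, 2, 3, 4]  # mutation is visible through both names
```

This is why assignment in Python never copies data: it only makes a name point at an object, and rebinding one name never affects another.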
However, better support for coroutine-like functionality is provided by extending Python's generators. Before 2.5, generators were lazy iterators; data was passed unidirectionally out of the generator. From Python 2.5 on, it is possible to pass data back into a generator function; and from version 3.3, data can be passed through multiple stack levels. Python's expressions include the following: In Python, a distinction between expressions and statements is rigidly enforced, in contrast to languages such as Common Lisp, Scheme, or Ruby. This distinction leads to duplicating some functionality, for example: A statement cannot be part of an expression; because of this restriction, expressions such as list and dict comprehensions (and lambda expressions) cannot contain statements. As a particular case, an assignment statement such as a = 1 cannot be part of the conditional expression of a conditional statement. Python uses duck typing, and it has typed objects but untyped variable names. Type constraints are not checked at definition time; rather, operations on an object may fail at usage time, indicating that the object is not of an appropriate type. Despite being dynamically typed, Python is strongly typed, forbidding operations that are poorly defined (e.g., adding a number and a string) rather than quietly attempting to interpret them. Python allows programmers to define their own types using classes, most often for object-oriented programming. New instances of classes are constructed by calling the class, for example, SpamClass() or EggsClass(); the classes are instances of the metaclass type (which is an instance of itself), thereby allowing metaprogramming and reflection. Before version 3.0, Python had two kinds of classes, both using the same syntax: old-style and new-style. Current Python versions support the semantics of only the new style. Python supports optional type annotations.
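The generator enhancements described above, passing data back in via send() (since 2.5) and delegating through multiple stack levels via yield from (since 3.3), can be sketched as follows; running_average and flatten are hypothetical examples written for illustration:

```python
def running_average():
    """Coroutine-style generator: values are sent in, averages come out."""
    total = 0.0
    count = 0
    average = None
    while True:
        value = yield average   # receives the argument passed to send()
        total += value
        count += 1
        average = total / count

avg = running_average()
next(avg)                  # prime the generator: run it up to the first yield
assert avg.send(10) == 10.0
assert avg.send(20) == 15.0

def flatten():
    # 'yield from' delegates to sub-iterables through the generator stack.
    yield from range(2)
    yield from "ab"

assert list(flatten()) == [0, 1, "a", "b"]
```

Note the priming call to next(): a generator must be advanced to its first yield before send() can deliver a value into it.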
These annotations are not enforced by the language, but may be used by external tools such as mypy to catch errors. Python's standard library includes a typing module that provides several type names for use in annotations. Also, the mypy project includes a compiler called mypyc, which leverages type annotations for optimization. Python includes conventional symbols for arithmetic operators (+, -, *, /), the floor-division operator //, and the modulo operator %. (With the modulo operator, a remainder can be negative, e.g., 4 % -3 == -2.) Also, Python offers the ** symbol for exponentiation, e.g. 5**3 == 125 and 9**0.5 == 3.0, and the matrix-multiplication operator @. These operators work as in traditional mathematics, with the same precedence rules; the infix operators + and - can also be unary, to represent positive and negative numbers respectively. Division between integers produces floating-point results. The behavior of division has changed significantly over time: In Python terms, the / operator represents true division (or simply division), while the // operator represents floor division. Before version 3.0, the / operator represented classic division. Rounding towards negative infinity, though a different method than in most languages, adds consistency to Python. For instance, this rounding implies that the equation (a + b)//b == a//b + 1 is always true. Also, the rounding implies that the equation b*(a//b) + a%b == a is valid for both positive and negative values of a. As expected, the result of a%b lies in the half-open interval [0, b), where b is a positive integer; however, maintaining the validity of the equation requires that the result lie in the interval (b, 0] when b is negative. Python provides a round function for rounding a float to the nearest integer. For tie-breaking, Python 3 uses the round-to-even method: round(1.5) and round(2.5) both produce 2.
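The division, modulo, and rounding rules above can be stated as a handful of checkable assertions, each consistent with the identities given in the text:

```python
# True division always yields a float; floor division rounds toward -infinity.
assert 7 / 2 == 3.5
assert 7 // 2 == 3
assert -7 // 2 == -4          # floored, not truncated toward zero

# The remainder takes the sign of the divisor, so b*(a//b) + a%b == a holds
# for negative operands too.
assert 4 % -3 == -2
assert -3 * (4 // -3) + (4 % -3) == 4

# Exponentiation, and round-to-even ("banker's rounding") tie-breaking.
assert 5 ** 3 == 125
assert 9 ** 0.5 == 3.0
assert round(1.5) == 2 and round(2.5) == 2
```

Languages such as C instead truncate integer division toward zero, so the remainder there takes the sign of the dividend; Python's floored convention is what keeps the identity b*(a//b) + a%b == a intact for all sign combinations.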
Python versions before 3 used the round-away-from-zero method: round(0.5) is 1.0, and round(-0.5) is −1.0. Python allows Boolean expressions that contain multiple equality relations to be consistent with general usage in mathematics. For example, the expression a < b < c tests whether a is less than b and b is less than c. C-derived languages interpret this expression differently: in C, the expression would first evaluate a < b, resulting in 0 or 1, and that result would then be compared with c. Python uses arbitrary-precision arithmetic for all integer operations. The Decimal type/class in the decimal module provides decimal floating-point numbers to a pre-defined arbitrary precision with several rounding modes. The Fraction class in the fractions module provides arbitrary precision for rational numbers. Due to Python's extensive mathematics library and the third-party library NumPy, the language is frequently used for scientific scripting in tasks such as numerical data processing and manipulation. Functions are created in Python by using the def keyword. A function is defined similarly to how it is called, by first providing the function name and then the required parameters. Here is an example of a function that prints its inputs: To assign a default value to a function parameter in case no actual value is provided at run time, variable-definition syntax can be used inside the function header. Code examples "Hello, World!" program: Program to calculate the factorial of a non-negative integer: Libraries Python's large standard library is commonly cited as one of its greatest strengths. For Internet-facing applications, many standard formats and protocols such as MIME and HTTP are supported. The language includes modules for creating graphical user interfaces, connecting to relational databases, generating pseudorandom numbers, arithmetic with arbitrary-precision decimals, manipulating regular expressions, and unit testing. 
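A minimal sketch of the examples described above: a function that prints its inputs with a default parameter value supplied in the function header, the "Hello, World!" program, and a factorial function for a non-negative integer (the names greet and factorial are illustrative, not prescribed by the text):

```python
def greet(name, greeting="Hello"):
    """Prints its inputs; 'greeting' falls back to a default value."""
    print(greeting + ", " + name + "!")

greet("World")              # prints: Hello, World!
greet("World", "Goodbye")   # prints: Goodbye, World!

# "Hello, World!" program:
print("Hello, World!")

# Factorial of a non-negative integer:
def factorial(n):
    if n < 0:
        raise ValueError("n must be non-negative")
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

assert factorial(0) == 1
assert factorial(5) == 120
```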
Some parts of the standard library are covered by specifications—for example, the Web Server Gateway Interface (WSGI) implementation wsgiref follows PEP 333—but most parts are specified by their code, internal documentation, and test suites. However, because most of the standard library is cross-platform Python code, only a few modules must be altered or rewritten for variant implementations. As of 13 March 2025, the Python Package Index (PyPI), the official repository for third-party Python software, contains over 614,339 packages. Development environments Most Python implementations (including CPython) include a read–eval–print loop (REPL); this permits the environment to function as a command line interpreter, with which users enter statements sequentially and receive results immediately. Also, CPython is bundled with an integrated development environment (IDE) called IDLE, which is oriented toward beginners. Other shells, including IDLE and IPython, add additional capabilities such as improved auto-completion, session-state retention, and syntax highlighting. Standard desktop IDEs include PyCharm, Spyder, and Visual Studio Code; there are web browser-based IDEs, such as the following environments: Implementations CPython is the reference implementation of Python. This implementation is written in C, meeting the C11 standard since version 3.11. Older versions use the C89 standard with several select C99 features, but third-party extensions are not limited to older C versions—e.g., they can be implemented using C11 or C++. CPython compiles Python programs into an intermediate bytecode, which is then executed by a virtual machine. CPython is distributed with a large standard library written in a combination of C and native Python. CPython is available for many platforms, including Windows and most modern Unix-like systems, including macOS (and Apple M1 Macs, since Python 3.9.1, using an experimental installer).
Starting with Python 3.9, the Python installer intentionally fails to install on Windows 7 and 8; Windows XP was supported until Python 3.5, with unofficial support for VMS. Platform portability was one of Python's earliest priorities. During development of Python 1 and 2, even OS/2 and Solaris were supported; since that time, support has been dropped for many platforms. All current Python versions (since 3.7) support only operating systems that feature multithreading; many outdated platforms have been dropped, so far fewer operating systems are supported now than in the past. All alternative implementations have at least slightly different semantics. For example, an alternative may include unordered dictionaries, in contrast to other current Python versions. As another example in the larger Python ecosystem, PyPy does not support the full CPython C API. Creating an executable with Python is often done by bundling an entire Python interpreter into the executable, which makes binary sizes massive for small programs, yet there exist implementations that are capable of truly compiling Python. Alternative implementations include the following: Stackless Python is a significant fork of CPython that implements microthreads. This implementation uses the call stack differently, thus allowing massively concurrent programs. PyPy also offers a stackless version. Just-in-time Python compilers have been developed, but are now unsupported: There are several compilers/transpilers to high-level object languages; the source language is unrestricted Python, a subset of Python, or a language similar to Python: There are also specialized compilers: Some older projects existed, as well as compilers not designed for use with Python 3.x and related syntax: A performance comparison among various Python implementations, using a non-numerical (combinatorial) workload, was presented at EuroSciPy '13.
In addition, Python's performance relative to other programming languages is benchmarked by The Computer Language Benchmarks Game. There are several approaches to optimizing Python performance, despite the inherent slowness of an interpreted language. These approaches include the following strategies or tools: Language Development Python's development is conducted mostly through the Python Enhancement Proposal (PEP) process; this process is the primary mechanism for proposing major new features, collecting community input on issues, and documenting Python design decisions. Python coding style is covered in PEP 8. Outstanding PEPs are reviewed and commented on by the Python community and the steering council. Enhancement of the language corresponds with development of the CPython reference implementation. The mailing list python-dev is the primary forum for the language's development. Specific issues were originally discussed in the Roundup bug tracker hosted by the foundation. In 2022, all issues and discussions were migrated to GitHub. Development originally took place on a self-hosted source-code repository running Mercurial, until Python moved to GitHub in January 2017. CPython's public releases have three types, distinguished by which part of the version number is incremented: Many alpha, beta, and release-candidates are also released as previews and for testing before final releases. Although there is a rough schedule for releases, they are often delayed if the code is not ready yet. Python's development team monitors the state of the code by running a large unit test suite during development. The major academic conference on Python is PyCon. Also, there are special Python mentoring programs, such as PyLadies. Naming Python's name is inspired by the British comedy group Monty Python, whom Python creator Guido van Rossum enjoyed while developing the language. 
Monty Python references appear frequently in Python code and culture; for example, the metasyntactic variables often used in Python literature are spam and eggs, rather than the traditional foo and bar. Also, the official Python documentation contains various references to Monty Python routines. Python users are sometimes referred to as "Pythonistas". Languages influenced by Python See also Notes References Further reading External links
========================================
[SOURCE: https://en.wikipedia.org/wiki/Social_network#cite_ref-77] | [TOKENS: 5247]
Contents Social network A social network is a social structure consisting of a set of social actors (such as individuals or organizations), networks of dyadic ties, and other social interactions between actors. The social network perspective provides a set of methods for analyzing the structure of whole social entities along with a variety of theories explaining the patterns observed in these structures. The study of these structures uses social network analysis to identify local and global patterns, locate influential entities, and examine the dynamics of networks. For instance, social network analysis has been used to study the spread of misinformation on social media platforms and to analyze the influence of key figures in social networks. Social networks and their analysis constitute an inherently interdisciplinary academic field which emerged from social psychology, sociology, statistics, and graph theory. Georg Simmel authored early structural theories in sociology emphasizing the dynamics of triads and the "web of group affiliations". Jacob Moreno is credited with developing the first sociograms in the 1930s to study interpersonal relationships. These approaches were mathematically formalized in the 1950s, and theories and methods of social networks became pervasive in the social and behavioral sciences by the 1980s. Social network analysis is now one of the major paradigms in contemporary sociology, and is also employed in a number of other social and formal sciences. Together with other complex networks, it forms part of the nascent field of network science. Overview The social network is a theoretical construct useful in the social sciences to study relationships between individuals, groups, organizations, or even entire societies (social units, see differentiation).
The term is used to describe a social structure determined by such interactions. The ties through which any given social unit connects represent the convergence of the various social contacts of that unit. This theoretical approach is, necessarily, relational. An axiom of the social network approach to understanding social interaction is that social phenomena should be primarily conceived and investigated through the properties of relations between and within units, instead of the properties of these units themselves. Thus, one common criticism of social network theory is that individual agency is often ignored, although this may not be the case in practice (see agent-based modeling). Precisely because many different types of relations, singular or in combination, form these network configurations, network analytics are useful to a broad range of research enterprises. In social science, these fields of study include, but are not limited to, anthropology, biology, communication studies, economics, geography, information science, organizational studies, social psychology, sociology, and sociolinguistics. History In the late 1890s, both Émile Durkheim and Ferdinand Tönnies foreshadowed the idea of social networks in their theories and research of social groups. Tönnies argued that social groups can exist as personal and direct social ties that either link individuals who share values and belief (Gemeinschaft, German, commonly translated as "community") or impersonal, formal, and instrumental social links (Gesellschaft, German, commonly translated as "society"). Durkheim gave a non-individualistic explanation of social facts, arguing that social phenomena arise when interacting individuals constitute a reality that can no longer be accounted for in terms of the properties of individual actors.
Georg Simmel, writing at the turn of the twentieth century, pointed to the nature of networks and the effect of network size on interaction, and examined the likelihood of interaction in loosely knit networks rather than groups. Major developments in the field came in the 1930s from several groups in psychology, anthropology, and mathematics working independently. In psychology, in the 1930s, Jacob L. Moreno began systematic recording and analysis of social interaction in small groups, especially classrooms and work groups (see sociometry). In anthropology, the foundation for social network theory is the theoretical and ethnographic work of Bronislaw Malinowski, Alfred Radcliffe-Brown, and Claude Lévi-Strauss. A group of social anthropologists associated with Max Gluckman and the Manchester School, including John A. Barnes, J. Clyde Mitchell and Elizabeth Bott Spillius, often are credited with performing some of the first fieldwork from which network analyses were performed, investigating community networks in southern Africa, India and the United Kingdom. Concomitantly, British anthropologist S. F. Nadel codified a theory of social structure that was influential in later network analysis. In sociology, the early (1930s) work of Talcott Parsons set the stage for taking a relational approach to understanding social structure. Later, drawing upon Parsons' theory, the work of sociologist Peter Blau provided a strong impetus for analyzing the relational ties of social units with his work on social exchange theory. By the 1970s, a growing number of scholars worked to combine the different tracks and traditions. One group consisted of sociologist Harrison White and his students at the Harvard University Department of Social Relations.
Also independently active in the Harvard Social Relations department at the time were Charles Tilly, who focused on networks in political and community sociology and social movements, and Stanley Milgram, who developed the "six degrees of separation" thesis. Mark Granovetter and Barry Wellman are among the former students of White who elaborated and championed the analysis of social networks. Beginning in the late 1990s, social network analysis drew new work from sociologists, political scientists, and physicists such as Duncan J. Watts, Albert-László Barabási, Peter Bearman, Nicholas A. Christakis, James H. Fowler, and others, who developed and applied new models and methods to emerging data about online social networks, as well as "digital traces" of face-to-face networks. Levels of analysis In general, social networks are self-organizing, emergent, and complex, such that a globally coherent pattern appears from the local interaction of the elements that make up the system. These patterns become more apparent as network size increases. However, a global network analysis of, for example, all interpersonal relationships in the world is not feasible and is likely to contain so much information as to be uninformative. Practical limitations of computing power, ethics, and participant recruitment and payment also limit the scope of a social network analysis. The nuances of a local system may be lost in a large network analysis, hence the quality of information may be more important than its scale for understanding network properties. Thus, social networks are analyzed at the scale relevant to the researcher's theoretical question. Although levels of analysis are not necessarily mutually exclusive, there are three general levels into which networks may fall: micro-level, meso-level, and macro-level.
At the micro-level, social network research typically begins with an individual, snowballing as social relationships are traced, or may begin with a small group of individuals in a particular social context. Dyadic level: A dyad is a social relationship between two individuals. Network research on dyads may concentrate on the structure of the relationship (e.g. multiplexity, strength), social equality, and tendencies toward reciprocity/mutuality. Triadic level: Add one individual to a dyad, and you have a triad. Research at this level may concentrate on factors such as balance and transitivity, as well as social equality and tendencies toward reciprocity/mutuality. In the balance theory of Fritz Heider the triad is the key to social dynamics. The discord in a rivalrous love triangle is an example of an unbalanced triad, likely to change to a balanced triad by a change in one of the relations. The dynamics of social friendships in society have been modeled by balancing triads. The study is carried forward with the theory of signed graphs. Actor level: The smallest unit of analysis in a social network is an individual in their social setting, i.e., an "actor" or "ego." Ego network analysis focuses on network characteristics such as size, relationship strength, density, centrality, prestige and roles such as isolates, liaisons, and bridges. Such analyses are most commonly used in the fields of psychology or social psychology, ethnographic kinship analysis or other genealogical studies of relationships between individuals. Subset level: Subset levels of network research problems begin at the micro-level, but may cross over into the meso-level of analysis. Subset level research may focus on distance and reachability, cliques, cohesive subgroups, or other group actions or behavior. In general, meso-level theories begin with a population size that falls between the micro- and macro-levels.
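Heider's balance condition described above reduces to a sign check: a triad is balanced when the product of its three edge signs is positive. A minimal sketch in Python (the node names and the dict layout are illustrative, not from any particular SNA library):

```python
from itertools import combinations

def triad_is_balanced(signs):
    """Heider balance: the product of the three edge signs is positive.
    signs: three values in {+1, -1}."""
    s1, s2, s3 = signs
    return s1 * s2 * s3 > 0

def balanced_triads(nodes, sign):
    """Enumerate balanced triads in a complete signed graph.
    sign: dict mapping frozenset({u, v}) -> +1 (friendly) or -1 (hostile)."""
    out = []
    for a, b, c in combinations(nodes, 3):
        edges = (sign[frozenset({a, b})],
                 sign[frozenset({b, c})],
                 sign[frozenset({a, c})])
        if triad_is_balanced(edges):
            out.append((a, b, c))
    return out

# A rivalrous love triangle: two positive ties and one negative -> unbalanced.
sign = {frozenset({"A", "B"}): +1,
        frozenset({"B", "C"}): +1,
        frozenset({"A", "C"}): -1}
print(balanced_triads(["A", "B", "C"], sign))  # -> []
```

Flipping any one of the three signs restores balance, which is the "change in one of the relations" the text describes.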
However, meso-level may also refer to analyses that are specifically designed to reveal connections between micro- and macro-levels. Meso-level networks are low density and may exhibit causal processes distinct from interpersonal micro-level networks. Organizations: Formal organizations are social groups that distribute tasks for a collective goal. Network research on organizations may focus on either intra-organizational or inter-organizational ties in terms of formal or informal relationships. Intra-organizational networks themselves often contain multiple levels of analysis, especially in larger organizations with multiple branches, franchises or semi-autonomous departments. In these cases, research is often conducted at a work group level and organization level, focusing on the interplay between the two structures. Experiments with networked groups online have documented ways to optimize group-level coordination through diverse interventions, including the addition of autonomous agents to the groups. Randomly distributed networks: Exponential random graph models of social networks became state-of-the-art methods of social network analysis in the 1980s. This framework has the capacity to represent social-structural effects commonly observed in many human social networks, including general degree-based structural effects as well as reciprocity and transitivity, and, at the node level, homophily and attribute-based activity and popularity effects, as derived from explicit hypotheses about dependencies among network ties. Parameters are given in terms of the prevalence of small subgraph configurations in the network and can be interpreted as describing the combinations of local social processes from which a given network emerges.
These probability models for networks on a given set of actors allow generalization beyond the restrictive dyadic independence assumption of micro-networks, allowing models to be built from theoretical structural foundations of social behavior. Scale-free networks: A scale-free network is a network whose degree distribution follows a power law, at least asymptotically. In network theory a scale-free ideal network is a random network with a degree distribution that unravels the size distribution of social groups. Specific characteristics of scale-free networks vary with the theories and analytical tools used to create them; in general, however, scale-free networks have some common characteristics. One notable characteristic of a scale-free network is the relative commonness of vertices with a degree that greatly exceeds the average. The highest-degree nodes are often called "hubs", and may serve specific purposes in their networks, although this depends greatly on the social context. Another general characteristic of scale-free networks is the clustering coefficient distribution, which decreases as the node degree increases. This distribution also follows a power law. The Barabási model of network evolution is an example of a scale-free network. Rather than tracing interpersonal interactions, macro-level analyses generally trace the outcomes of interactions, such as economic or other resource transfer interactions over a large population. Large-scale networks: Large-scale network is a term somewhat synonymous with "macro-level." It is primarily used in social and behavioral sciences, and in economics. Originally, the term was used extensively in the computer sciences (see large-scale network mapping).
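The preferential-attachment process behind the Barabási model can be sketched in a few lines: each new node links to m existing nodes with probability proportional to their current degree, which is what produces the high-degree "hubs" mentioned above. This is a simplified illustration, not the reference implementation:

```python
import random

def barabasi_albert(n, m, seed=None):
    """Sketch of preferential attachment: each new node attaches m edges,
    choosing targets with probability proportional to current degree.
    Starts from a small clique of m + 1 nodes; a simplified illustration."""
    rng = random.Random(seed)
    edges = [(i, j) for i in range(m + 1) for j in range(i)]
    # Repeated membership in this list encodes degree-proportional odds:
    # a node of degree k appears k times, so rng.choice picks it k times as often.
    targets = [v for e in edges for v in e]
    for new in range(m + 1, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(targets))
        for t in chosen:
            edges.append((new, t))
            targets.extend((new, t))
    return edges

edges = barabasi_albert(200, 2, seed=1)
degree = {}
for u, v in edges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1
# A few high-degree hubs coexist with many nodes of degree close to m.
print(max(degree.values()), min(degree.values()))
```

Plotting the resulting degree counts on log-log axes would show the heavy power-law tail the text describes.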
Complex networks: Most larger social networks display features of social complexity, which involves substantial non-trivial features of network topology, with patterns of complex connections between elements that are neither purely regular nor purely random (see complexity science, dynamical system and chaos theory), as do biological and technological networks. Such complex network features include a heavy tail in the degree distribution, a high clustering coefficient, assortativity or disassortativity among vertices, community structure (see stochastic block model), and hierarchical structure. In the case of agency-directed networks these features also include reciprocity, triad significance profile (TSP, see network motif), and other features. In contrast, many of the mathematical models of networks that have been studied in the past, such as lattices and random graphs, do not show these features. Theoretical links Various theoretical frameworks have been imported for the use of social network analysis. The most prominent of these are graph theory, balance theory, social comparison theory, and, more recently, the social identity approach. Few complete theories have been produced from social network analysis. Two that have are structural role theory and heterophily theory. The basis of heterophily theory was the finding in one study that more numerous weak ties can be important in seeking information and innovation, as cliques have a tendency to have more homogeneous opinions as well as share many common traits. This homophilic tendency was the reason for the members of the cliques to be attracted together in the first place. However, being similar, each member of the clique would also know more or less what the other members knew. To find new information or insights, members of the clique will have to look beyond the clique to its other friends and acquaintances. This is what Granovetter called "the strength of weak ties".
Structural holes In the context of networks, social capital exists where people have an advantage because of their location in a network. Contacts in a network provide information, opportunities and perspectives that can be beneficial to the central player in the network. Most social structures tend to be characterized by dense clusters of strong connections. Information within these clusters tends to be rather homogeneous and redundant. Non-redundant information is most often obtained through contacts in different clusters. When two separate clusters possess non-redundant information, there is said to be a structural hole between them. Thus, a network that bridges structural holes will provide network benefits that are in some degree additive, rather than overlapping. An ideal network structure has a vine and cluster structure, providing access to many different clusters and structural holes. Networks rich in structural holes are a form of social capital in that they offer information benefits. The main player in a network that bridges structural holes is able to access information from diverse sources and clusters. For example, in business networks, this is beneficial to an individual's career because he is more likely to hear of job openings and opportunities if his network spans a wide range of contacts in different industries/sectors. This concept is similar to Mark Granovetter's theory of weak ties, which rests on the basis that having a broad range of contacts is most effective for job attainment. Structural holes have been widely applied in social network analysis, resulting in applications in a wide range of practical scenarios as well as machine learning-based social prediction. Research clusters Research has used network analysis to examine networks created when artists are exhibited together in museum exhibitions.
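Burt's structural-hole ideas are often operationalized with ego-network redundancy measures. The sketch below uses Borgatti's simplification of effective size for an undirected, unweighted ego network (n − 2t/n, with n alters and t ties among them); the toy graph and node names are hypothetical:

```python
def effective_size(ego, adj):
    """Borgatti's simplification of Burt's effective size for an
    undirected, unweighted ego network: n - 2t/n, where n is the number
    of alters and t the number of ties among them.
    adj: dict mapping node -> set of neighbours."""
    alters = adj[ego]
    n = len(alters)
    if n == 0:
        return 0.0
    # Each alter-alter tie is seen from both ends, so divide by 2.
    t = sum(1 for a in alters for b in adj[a] if b in alters) / 2
    return n - 2 * t / n

# Toy broker: ego bridges two otherwise disconnected pairs (two clusters).
adj = {
    "ego": {"a", "b", "c", "d"},
    "a": {"ego", "b"}, "b": {"ego", "a"},   # one cluster
    "c": {"ego", "d"}, "d": {"ego", "c"},   # another cluster
}
print(effective_size("ego", adj))  # 4 alters, 2 redundant ties -> 3.0
```

The more structural holes an ego bridges, the closer the effective size gets to the raw number of contacts; a fully interconnected ego network collapses toward 1.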
Such networks have been shown to affect an artist's recognition in history and historical narratives, even when controlling for individual accomplishments of the artist. Other work examines how network grouping of artists can affect an individual artist's auction performance. An artist's status has been shown to increase when associated with higher status networks, though this association has diminishing returns over an artist's career. In J.A. Barnes' day, a "community" referred to a specific geographic location and studies of community ties had to do with who talked, associated, traded, and attended church with whom. Today, however, there are extended "online" communities developed through telecommunications devices and social network services. Such devices and services require extensive and ongoing maintenance and analysis, often using network science methods. Community development studies, today, also make extensive use of such methods. Complex networks require methods specific to modelling and interpreting social complexity and complex adaptive systems, including techniques of dynamic network analysis. Mechanisms such as Dual-phase evolution explain how temporal changes in connectivity contribute to the formation of structure in social networks. The study of social networks is being used to examine the nature of interdependencies between actors and the ways in which these are related to outcomes of conflict and cooperation. Areas of study include cooperative behavior among participants in collective actions such as protests; promotion of peaceful behavior, social norms, and public goods within communities through networks of informal governance; the role of social networks in both intrastate conflict and interstate conflict; and social networking among politicians, constituents, and bureaucrats. In criminology and urban sociology, much attention has been paid to the social networks among criminal actors. 
For example, murders can be seen as a series of exchanges between gangs. Murders can be seen to diffuse outwards from a single source, because weaker gangs cannot afford to kill members of stronger gangs in retaliation, but must commit other violent acts to maintain their reputation for strength. Diffusion of ideas and innovations studies focus on the spread and use of ideas from one actor to another or from one culture to another. This line of research seeks to explain why some become "early adopters" of ideas and innovations, and links social network structure with facilitating or impeding the spread of an innovation. A case in point is the social diffusion of linguistic innovation such as neologisms. Experiments and large-scale field trials (e.g., by Nicholas Christakis and collaborators) have shown that cascades of desirable behaviors can be induced in social groups, in settings as diverse as Honduran villages, Indian slums, or the lab. Still other experiments have documented the experimental induction of social contagion of voting behavior, emotions, risk perception, and commercial products. In demography, the study of social networks has led to new sampling methods for estimating and reaching populations that are hard to enumerate (for example, homeless people or intravenous drug users). For example, respondent-driven sampling is a network-based sampling technique that relies on respondents to a survey recommending further respondents. The field of sociology focuses almost entirely on networks of outcomes of social interactions. More narrowly, economic sociology considers behavioral interactions of individuals and groups through social capital and social "markets". Sociologists, such as Mark Granovetter, have developed core principles about the interactions of social structure, information, ability to punish or reward, and trust that frequently recur in their analyses of political, economic and other institutions.
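The diffusion studies mentioned above are commonly modeled with simple contagion processes. The following is a toy independent-cascade sketch, not any specific published model: each newly activated node gets one chance to activate each inactive neighbour with probability p.

```python
import random

def independent_cascade(adj, seeds, p, rng=None):
    """Toy independent-cascade diffusion. Each newly activated node tries
    once to activate each inactive neighbour, succeeding with probability p.
    adj: dict node -> iterable of neighbours; seeds: initially active nodes."""
    rng = rng or random.Random(0)
    active = set(seeds)
    frontier = list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v in adj.get(u, ()):
                if v not in active and rng.random() < p:
                    active.add(v)
                    nxt.append(v)
        frontier = nxt
    return active

# Line network: adoption spreads outward from a single early adopter.
adj = {i: [i - 1, i + 1] for i in range(1, 9)}
adj[0] = [1]
adj[9] = [8]
print(sorted(independent_cascade(adj, {0}, p=1.0)))  # p=1 -> everyone adopts
```

Varying p and the seed set is the usual way such toy models link network structure to whether an innovation spreads or stalls.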
Granovetter examines how social structures and social networks can affect economic outcomes like hiring, price, productivity and innovation, and describes sociologists' contributions to analyzing the impact of social structure and networks on the economy. Analysis of social networks is increasingly incorporated into health care analytics, not only in epidemiological studies but also in models of patient communication and education, disease prevention, mental health diagnosis and treatment, and in the study of health care organizations and systems. Human ecology is an interdisciplinary and transdisciplinary study of the relationship between humans and their natural, social, and built environments. The scientific philosophy of human ecology has a diffuse history with connections to geography, sociology, psychology, anthropology, zoology, and natural ecology. In the study of literary systems, network analysis has been applied by Anheier, Gerhards and Romo, De Nooy, Senekal, and Lotker, to study various aspects of how literature functions. The basic premise is that polysystem theory, which has been around since the writings of Even-Zohar, can be integrated with network theory and the relationships between different actors in the literary network, e.g. writers, critics, publishers, literary histories, etc., can be mapped using visualization from SNA. Research in this area also studies formal and informal organizational relationships, organizational communication, economics, economic sociology, and other resource transfers. Social networks have also been used to examine how organizations interact with each other, characterizing the many informal connections that link executives together, as well as associations and connections between individual employees at different organizations. Many organizational social network studies focus on teams.
Within team network studies, research assesses, for example, the predictors and outcomes of centrality and power, density and centralization of team instrumental and expressive ties, and the role of between-team networks. Intra-organizational networks have been found to affect organizational commitment, organizational identification, and interpersonal citizenship behaviour. Social capital is a form of economic and cultural capital in which social networks are central, transactions are marked by reciprocity, trust, and cooperation, and market agents produce goods and services not mainly for themselves, but for a common good. Social capital is split into three dimensions: the structural, the relational and the cognitive dimension. The structural dimension describes how partners interact with each other and which specific partners meet in a social network. Also, the structural dimension of social capital indicates the level of ties among organizations. This dimension is highly connected to the relational dimension, which refers to trustworthiness, norms, expectations and identifications of the bonds between partners. The relational dimension explains the nature of these ties, which is mainly illustrated by the level of trust accorded to the network of organizations. The cognitive dimension analyses the extent to which organizations share common goals and objectives as a result of their ties and interactions. Social capital is a sociological concept about the value of social relations and the role of cooperation and confidence to achieve positive outcomes. The term refers to the value one can get from their social ties. For example, newly arrived immigrants can make use of their social ties to established migrants to acquire jobs they may otherwise have trouble getting (e.g., because of unfamiliarity with the local language). A positive relationship exists between social capital and the intensity of social network use.
In a dynamic framework, higher activity in a network feeds into higher social capital, which itself encourages more activity. This particular cluster focuses on brand image and promotional strategy effectiveness, taking into account the impact of customer participation on sales and brand image. This is gauged through techniques such as sentiment analysis, which rely on mathematical areas of study such as data mining and analytics. This area of research produces vast numbers of commercial applications, as the main goal of any study is to understand consumer behaviour and drive sales. In many organizations, members tend to focus their activities inside their own groups, which stifles creativity and restricts opportunities. A player whose network bridges structural holes has an advantage in detecting and developing rewarding opportunities. Such a player can mobilize social capital by acting as a "broker" of information between two clusters that otherwise would not have been in contact, thus providing access to new ideas, opinions and opportunities. British philosopher and political economist John Stuart Mill writes, "it is hardly possible to overrate the value of placing human beings in contact with persons dissimilar to themselves.... Such communication [is] one of the primary sources of progress." Thus, a player with a network rich in structural holes can add value to an organization through new ideas and opportunities. This, in turn, helps an individual's career development and advancement. A social capital broker also reaps control benefits of being the facilitator of information flow between contacts. Full communication with exploratory mindsets and information exchange generated by dynamically alternating positions in a social network promotes creative and deep thinking.
In the case of consulting firm Eden McCallum, the founders were able to advance their careers by bridging their connections with former big three consulting firm consultants and mid-size industry firms. By bridging structural holes and mobilizing social capital, players can advance their careers by executing new opportunities between contacts. There has been research that both substantiates and refutes the benefits of information brokerage. A study of high tech Chinese firms by Zhixing Xiao found that the control benefits of structural holes are "dissonant to the dominant firm-wide spirit of cooperation and the information benefits cannot materialize due to the communal sharing values" of such organizations. However, this study only analyzed Chinese firms, which tend to have strong communal sharing values. Information and control benefits of structural holes are still valuable in firms that are not quite as inclusive and cooperative on the firm-wide level. In 2004, Ronald Burt studied 673 managers who ran the supply chain for one of America's largest electronics companies. He found that managers who often discussed issues with other groups were better paid, received more positive job evaluations and were more likely to be promoted. Thus, bridging structural holes can be beneficial to an organization, and in turn, to an individual's career. Computer networks combined with social networking software produce a new medium for social interaction. A relationship over a computerized social networking service can be characterized by content, direction, and strength. The content of a relation refers to the resource that is exchanged. In a computer-mediated communication context, social pairs exchange different kinds of information, including sending a data file or a computer program as well as providing emotional support or arranging a meeting.
With the rise of electronic commerce, information exchanged may also correspond to exchanges of money, goods or services in the "real" world. Social network analysis methods have become essential to examining these types of computer-mediated communication. In addition, the sheer size and the volatile nature of social media have given rise to new network metrics. A key concern with networks extracted from social media is the lack of robustness of network metrics given missing data. Based on the pattern of homophily, ties between people are most likely to occur between nodes that are most similar to each other; under neighbourhood segregation, individuals are most likely to inhabit the same regional areas as other individuals who are like them. Therefore, social networks can be used as a tool to measure the degree of segregation or homophily within a social network. Social networks can be used to simulate the process of homophily, and can also serve as a measure of the level of exposure of different groups to each other within a current social network of individuals in a certain area.
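A crude way to quantify the homophily and segregation idea above is to compare the observed share of within-group ties with the share expected if ties ignored group membership (the sum of squared group proportions). This is an illustrative measure with made-up data, not a specific published index:

```python
def homophily_index(edges, group):
    """Compare the observed share of within-group ties against the share
    expected under random mixing (sum of squared group proportions).
    edges: list of (u, v) pairs; group: dict node -> group label.
    Returns (observed_same_share, expected_same_share)."""
    nodes = {v for e in edges for v in e}
    same = sum(1 for u, v in edges if group[u] == group[v]) / len(edges)
    sizes = {}
    for n in nodes:
        sizes[group[n]] = sizes.get(group[n], 0) + 1
    expected = sum((c / len(nodes)) ** 2 for c in sizes.values())
    return same, expected

# Hypothetical four-person network, two groups of two.
group = {"a": "x", "b": "x", "c": "y", "d": "y"}
edges = [("a", "b"), ("c", "d"), ("a", "c")]
same, expected = homophily_index(edges, group)
print(round(same, 2), round(expected, 2))  # 0.67 observed vs 0.5 expected
```

An observed share well above the expected share suggests homophilous or segregated tie formation; a share near it suggests group-blind mixing.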
========================================
[SOURCE: https://en.wikipedia.org/wiki/Extraterrestrial_life#cite_note-133] | [TOKENS: 11349]
Extraterrestrial life Extraterrestrial life, or alien life (colloquially aliens), is life that originates from another world rather than on Earth. No extraterrestrial life has yet been scientifically or conclusively detected. Such life might range from simple forms such as prokaryotes to intelligent beings, possibly bringing forth civilizations that might be far more, or far less, advanced than humans. The Drake equation speculates about the existence of sapient life elsewhere in the universe. The science of extraterrestrial life is known as astrobiology. Speculation about inhabited worlds beyond Earth dates back to antiquity. Early Christian writers, including Augustine, discussed ideas from thinkers like Democritus and Epicurus about countless worlds in the vast universe. Pre-modern writers typically assumed extraterrestrial "worlds" were inhabited by living beings. William Vorilong, in the 15th century, acknowledged the possibility Jesus could have visited extraterrestrial worlds to redeem their inhabitants. In 1440, Nicholas of Cusa suggested Earth is a "brilliant star"; he theorized that all celestial bodies, even the Sun, could host life. Descartes wrote that there were no means to prove the stars were not inhabited by "intelligent creatures", but their existence was a matter of speculation. In comparison to the life-abundant Earth, the vast majority of intrasolar and extrasolar planets and moons have harsh surface conditions and disparate atmospheric chemistry, or lack an atmosphere. However, there are many extreme and chemically harsh ecosystems on Earth that do support forms of life and are often hypothesized to be the origin of life on Earth. Examples include life surrounding hydrothermal vents, acidic hot springs, and volcanic lakes, as well as halophiles and the deep biosphere. Since the mid-20th century, researchers have searched for extraterrestrial life and intelligence.
Solar system studies focus on Venus, Mars, Europa, and Titan, while exoplanet discoveries now total 6,022 confirmed planets in 4,490 systems as of October 2025. Depending on the category of search, methods range from analysis of telescope and specimen data to radios used to detect and transmit interstellar communication. Interstellar travel remains largely hypothetical, with only the Voyager 1 and Voyager 2 probes confirmed to have entered the interstellar medium. The concept of extraterrestrial life, especially intelligent life, has greatly influenced culture and fiction. A key debate centers on contacting extraterrestrial intelligence: some advocate active attempts, while others warn it could be risky, given human history of exploiting other societies. Context Initially, after the Big Bang, the universe was too hot to allow life. It is estimated that the temperature of the universe was around 10 billion Kelvin at the one-second mark. Roughly 15 million years later, it cooled to temperate levels, though the elements of organic life were yet nonexistent. The only freely available elements at that point were hydrogen and helium. Carbon and oxygen (and later, water) would not appear until 50 million years later, created through stellar fusion. At that point, the difficulty for life to appear was not the temperature, but the scarcity of free heavy elements. Planetary systems emerged, and the first organic compounds may have formed in the protoplanetary disk of dust grains that would eventually create rocky planets like Earth. Although Earth was in a molten state after its birth and may have burned any organics that fell on it, it would have been more receptive once it cooled down. Once the right conditions on Earth were met, life started by a chemical process known as abiogenesis. Alternatively, life may have formed less frequently, then spread—by meteoroids, for example—between habitable planets in a process called panspermia. 
During most of their stellar evolution, stars combine hydrogen nuclei to make helium nuclei by stellar fusion, and the comparatively lighter weight of helium allows the star to release the extra energy. The process continues until the star uses all of its available fuel, with the speed of consumption being related to the size of the star. During their last stages, stars start combining helium nuclei to form carbon nuclei. The larger stars can further combine carbon nuclei to create oxygen and silicon, oxygen into neon and sulfur, and so on until iron. Ultimately, the star blows much of its content back into the interstellar medium, where it would join clouds that would eventually become new generations of stars and planets. Many of those materials are the raw components of life on Earth. As this process takes place all over the universe, said materials are ubiquitous in the cosmos and not a rarity of the Solar System. Earth is a planet in the Solar System, a planetary system formed by a star at the center, the Sun, and the objects that orbit it: other planets, moons, asteroids, and comets. The Sun is part of the Milky Way, a galaxy. The Milky Way is part of the Local Group, a galaxy group that is in turn part of the Laniakea Supercluster. The universe is composed of all similar structures in existence. The immense distances between celestial objects are a difficulty for studying extraterrestrial life. So far, humans have only set foot on the Moon and sent robotic probes to other planets and moons in the Solar System. Although probes can withstand conditions that may be lethal to humans, the distances cause time delays: New Horizons took nine years after launch to reach Pluto. No probe has ever reached extrasolar planetary systems. Voyager 2 left the Solar System at a speed of 50,000 kilometers per hour; if it headed towards the Alpha Centauri system, the closest one to Earth at 4.4 light years, it would reach it in 100,000 years.
Under current technology, such systems can only be studied by telescopes, which have limitations. It is estimated that dark matter accounts for a larger amount of combined mass than stars and gas clouds, but as it plays no role in the evolution of stars and planets, it is usually not taken into account by astrobiology. There is an area around a star, the circumstellar habitable zone or "Goldilocks zone", wherein water may be at the right temperature to exist in liquid form at a planetary surface. This area is neither too close to the star, where water would become steam, nor too far away, where water would be frozen as ice. However, although useful as an approximation, planetary habitability is complex and defined by several factors. Being in the habitable zone is not enough for a planet to be habitable, nor even to actually have such liquid water. Venus is located in the solar system's habitable zone, but does not have liquid water because of the conditions of its atmosphere. Jovian planets or gas giants are not considered habitable even if they orbit close enough to their stars as hot Jupiters, due to crushing atmospheric pressures. The actual distances for the habitable zones vary according to the type of star, and even the solar activity of each specific star influences the local habitability. The type of star also defines the time the habitable zone will exist, as its presence and limits will change along with the star's stellar evolution. The Big Bang occurred 13.8 billion years ago, the Solar System was formed 4.6 billion years ago, and the first hominids appeared 6 million years ago. Life on other planets may have started, evolved, given birth to extraterrestrial intelligences, and perhaps even faced a planetary extinction event millions or billions of years ago. When considered from a cosmic perspective, the brief times of existence of Earth's species may suggest that extraterrestrial life may be equally fleeting under such a scale.
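To first order, the habitable-zone boundaries discussed above scale with the square root of stellar luminosity, since they track lines of constant stellar flux. A small sketch, using one commonly quoted set of solar-system boundary estimates (roughly 0.95 to 1.37 AU; published limits vary considerably by model, so treat these defaults as assumptions):

```python
import math

def habitable_zone_au(luminosity_solar, inner=0.95, outer=1.37):
    """First-order habitable-zone estimate: boundaries scale with the
    square root of stellar luminosity (constant flux). The default solar
    boundaries (0.95-1.37 AU) are one commonly quoted conservative
    estimate; different climate models give different limits."""
    scale = math.sqrt(luminosity_solar)
    return inner * scale, outer * scale

# A Sun-like star keeps the default boundaries; a dim red dwarf at 1% of
# solar luminosity pulls the zone in by a factor of 10.
print(habitable_zone_au(1.0))
print(habitable_zone_au(0.01))
```

This scaling is why hot Jupiters and close-orbiting planets of dim stars can sit inside or outside the zone in ways that do not match solar-system intuition.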
During a period of about 7 million years, from about 10 to 17 million years after the Big Bang, the background temperature was between 373 and 273 K (100 and 0 °C; 212 and 32 °F), allowing the possibility of liquid water if any planets existed. Avi Loeb (2014) speculated that primitive life might in principle have appeared during this window, which he called "the Habitable Epoch of the Early Universe". Life on Earth is quite ubiquitous across the planet and has adapted over time to almost all the available environments in it; extremophiles and the deep biosphere thrive at even the most hostile ones. As a result, it is inferred that life in other celestial bodies may be equally adaptive. However, the origin of life is unrelated to its ease of adaptation and may have stricter requirements. A celestial body may not have any life on it, even if it were habitable. Likelihood of existence Life in the cosmos beyond Earth has not been observed. The hypothesis of ubiquitous extraterrestrial life relies on three main ideas. The first one, the size of the universe, allows for plenty of planets to have a similar habitability to Earth, and the age of the universe gives enough time for a long process analog to the history of Earth to happen there. The second is that the substances that make life, such as carbon and water, are ubiquitous in the universe. The third is that the physical laws are universal, which means that the forces that would facilitate or prevent the existence of life would be the same ones as on Earth. According to this argument, made by scientists such as Carl Sagan and Stephen Hawking, it would be improbable for life not to exist somewhere else other than Earth. This argument is embodied in the Copernican principle, which states that Earth does not occupy a unique position in the Universe, and the mediocrity principle, which states that there is nothing special about life on Earth.
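The temperature window behind Loeb's "habitable epoch" mentioned above can be checked with the standard scaling of the cosmic background temperature, T(z) = T0 · (1 + z), with T0 ≈ 2.725 K today:

```python
# Back-of-envelope check of the "habitable epoch" temperature window:
# the cosmic background temperature scales as T(z) = T0 * (1 + z).
T0 = 2.725  # present-day CMB temperature in kelvin

def redshift_at_temperature(T):
    """Redshift at which the background temperature equals T kelvin."""
    return T / T0 - 1

z_hot = redshift_at_temperature(373.0)   # hotter than boiling before this epoch
z_cold = redshift_at_temperature(273.0)  # colder than freezing after this epoch
print(round(z_hot), round(z_cold))  # roughly z ~ 136 down to z ~ 99
```

In standard cosmology that redshift range corresponds to roughly 10 to 17 million years after the Big Bang, matching the window quoted in the text.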
Other authors consider instead that life in the cosmos, or at least multicellular life, may actually be rare. The Rare Earth hypothesis maintains that life on Earth is possible because of a series of factors that range from the location in the galaxy and the configuration of the Solar System to local characteristics of the planet, and that it is unlikely that another planet simultaneously meets all such requirements. The proponents of this hypothesis consider that very little evidence suggests the existence of extraterrestrial life, and that at this point it is just a desired result and not a reasonable scientific explanation for any gathered data. In 1961, astronomer and astrophysicist Frank Drake devised the Drake equation as a way to stimulate scientific dialogue at a meeting on the search for extraterrestrial intelligence (SETI). The Drake equation is a probabilistic argument used to estimate the number of active, communicative extraterrestrial civilizations in the Milky Way galaxy. The equation is

N = R* · fp · ne · fl · fi · fc · L

where N is the number of communicative civilizations in the galaxy, R* is the rate of star formation, fp is the fraction of stars with planets, ne is the number of potentially habitable planets per planetary system, fl is the fraction of those planets on which life appears, fi is the fraction of life-bearing planets that develop intelligence, fc is the fraction of intelligent civilizations that release detectable signals, and L is the length of time over which such signals are released. Drake's proposed estimates are as follows, though the numbers on the right side of the equation are agreed to be speculative and open to substitution:

10,000 = 5 · 0.5 · 2 · 1 · 0.2 · 1 · 10,000

The Drake equation has proved controversial since, although it is written as a mathematical equation, none of its values were known at the time. Although some values may eventually be measured, others are based on social sciences and are not knowable by their very nature. This does not allow one to draw noteworthy conclusions from the equation. Based on observations from the Hubble Space Telescope, there are nearly 2 trillion galaxies in the observable universe. It is estimated that at least ten percent of all Sun-like stars have a system of planets. In other words, there are 6.25×10^18 stars with planets orbiting them in the observable universe.
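The arithmetic of Drake's illustrative estimate above can be checked directly. A minimal sketch; the factor names follow the standard formulation, and the numeric values are the speculative ones quoted in the text:

```python
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """Drake equation: N = R* · fp · ne · fl · fi · fc · L."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Drake's illustrative 1961 values as quoted above:
N = drake(5, 0.5, 2, 1, 0.2, 1, 10_000)   # → 10000.0
```

Because the equation is a simple product, the result is dominated by whichever factor is least constrained, which is one reason substituting different guesses yields answers spanning many orders of magnitude.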
Even if it is assumed that only one out of a billion of these stars has planets supporting life, there would be some 6.25 billion life-supporting planetary systems in the observable universe. A 2013 study based on results from the Kepler spacecraft estimated that the Milky Way contains at least as many planets as it does stars, resulting in 100–400 billion exoplanets. The nebular hypothesis that explains the formation of the Solar System and other planetary systems suggests that planetary systems can have several configurations, and not all of them may have rocky planets within the habitable zone. The apparent contradiction between high estimates of the probability of the existence of extraterrestrial civilisations and the lack of evidence for such civilisations is known as the Fermi paradox. Dennis W. Sciama claimed that life's existence in the universe depends on various fundamental constants. Zhi-Wei Wang and Samuel L. Braunstein suggest that a random universe capable of supporting life is likely to be just barely able to do so, offering a potential explanation of the Fermi paradox.

Biochemical basis

If extraterrestrial life exists, it could range from simple microorganisms and multicellular organisms similar to animals or plants, to complex alien intelligences akin to humans. When scientists talk about extraterrestrial life, they consider all those types. Although it is possible that extraterrestrial life may have other configurations, scientists use the hierarchy of lifeforms from Earth for simplicity, as it is the only one known to exist. The first basic requirement for life is an environment with non-equilibrium thermodynamics, which means that the thermodynamic equilibrium must be broken by a source of energy. The traditional sources of energy in the cosmos are stars, as for life on Earth, which depends on the energy of the Sun. However, there are alternative energy sources, such as volcanoes, plate tectonics, and hydrothermal vents.
There are ecosystems on Earth in deep areas of the ocean that do not receive sunlight, and take energy from black smokers instead. Magnetic fields and radioactivity have also been proposed as sources of energy, although they would be less efficient ones. Life on Earth requires water in a liquid state as a solvent in which biochemical reactions take place. It is highly unlikely that an abiogenesis process can start within a gaseous or solid medium: atoms in those media move either too fast or too slow, making it difficult for specific ones to meet and start chemical reactions. A liquid medium also allows the transport of nutrients and substances required for metabolism. Sufficient quantities of carbon and other elements, along with water, might enable the formation of living organisms on terrestrial planets with a chemical make-up and temperature range similar to that of Earth. Life based on ammonia rather than water has been suggested as an alternative, though this solvent appears less suitable than water. It is also conceivable that there are forms of life whose solvent is a liquid hydrocarbon, such as methane, ethane or propane. Another unknown aspect of potential extraterrestrial life is the chemical elements that would compose it. Life on Earth is largely composed of carbon, but there could be other hypothetical types of biochemistry. A replacement for carbon would need to be able to create complex molecules, store the information required for evolution, and be freely available in the medium. To create DNA, RNA, or a close analog, such an element should be able to bind its atoms with many others, creating complex and stable molecules. It should be able to create at least three covalent bonds: two for making long strings and at least a third to add new links and allow for diverse information. Only nine elements meet this requirement: boron, nitrogen, phosphorus, arsenic and antimony (three bonds), and carbon, silicon, germanium and tin (four bonds).
Of these, carbon, nitrogen, and silicon are the most abundant in the universe, far more so than the others. In Earth's crust the most abundant of those elements is silicon; in the hydrosphere it is carbon, and in the atmosphere it is carbon and nitrogen. Silicon, however, has disadvantages compared to carbon. The molecules formed with silicon atoms are less stable and more vulnerable to acids, oxygen, and light. An ecosystem of silicon-based lifeforms would require very low temperatures, high atmospheric pressure, an atmosphere devoid of oxygen, and a solvent other than water. The low temperatures required would add an extra problem: the difficulty of kickstarting a process of abiogenesis to create life in the first place. Norman Horowitz, head of the Jet Propulsion Laboratory bioscience section for the Mariner and Viking missions from 1965 to 1976, considered that the great versatility of the carbon atom makes it the element most likely to provide solutions, even exotic solutions, to the problems of survival of life on other planets. However, he also considered that the conditions found on Mars were incompatible with carbon-based life. Even if extraterrestrial life is based on carbon and uses water as a solvent, like Earth life, it may still have a radically different biochemistry. Life is generally considered to be a product of natural selection. It has been proposed that to undergo natural selection, a living entity must have the capacity to replicate itself, the capacity to avoid damage and decay, and the capacity to acquire and process resources in support of the first two capacities. Life on Earth may have started with an RNA world and later evolved to its current form, where some of the RNA tasks were transferred to DNA and proteins. Extraterrestrial life may still be stuck using RNA, or may have evolved into other configurations. It is unclear whether our biochemistry is the most efficient one that could be generated, or which elements would follow a similar pattern.
However, it is likely that, even if cells had a different composition from those on Earth, they would still have a cell membrane. Life on Earth jumped from prokaryotes to eukaryotes and from unicellular organisms to multicellular organisms through evolution. So far no alternative process to achieve such a result has been conceived, even hypothetically. Evolution requires life to be divided into individual organisms, and no alternative organisation has been satisfactorily proposed either. At the basic level, membranes define the limit of a cell, between it and its environment, while remaining partially open to exchange energy and resources with it. The evolution from simple cells to eukaryotes, and from them to multicellular lifeforms, is not guaranteed. The Cambrian explosion took place billions of years after the origin of life, and its causes are not yet fully known. On the other hand, the jump to multicellularity took place several times, which suggests that it could be a case of convergent evolution, and so likely to take place on other planets as well. Palaeontologist Simon Conway Morris considers that convergent evolution would lead to kingdoms similar to our plants and animals, and that many features are likely to develop in alien animals as well, such as bilateral symmetry, limbs, digestive systems and heads with sensory organs. Scientists from the University of Oxford analysed the question from the perspective of evolutionary theory and wrote in a study in the International Journal of Astrobiology that aliens may be similar to humans. The planetary context would also have an influence: a planet with higher gravity would have smaller animals, and other types of stars can lead to non-green photosynthesizers.
The amount of energy available would also affect biodiversity, as an ecosystem sustained by black smokers or hydrothermal vents would have less energy available than one sustained by a star's light and heat, and so its lifeforms would not grow beyond a certain complexity. There is also research assessing the capacity of life for developing intelligence. It has been suggested that this capacity arises with the number of potential niches a planet contains, and that the complexity of life itself is reflected in the information density of planetary environments, which in turn can be computed from its niches. Conditions on the other planets of the Solar System, and presumably on many worlds beyond it, are very harsh and seem too extreme to harbor any life. These environments can combine intense UV radiation with extreme temperatures, a lack of water, and other factors that do not seem to favor the creation or maintenance of extraterrestrial life. However, there is considerable evidence that some of the earliest and most basic forms of life on Earth originated in extreme environments that, at first glance, seem unlikely to have harbored it. Fossil evidence, along with well-researched theories, marks environments such as hydrothermal vents and acidic hot springs as some of the first places where life could have originated on Earth. These environments can be considered extreme when compared to the typical ecosystems that the majority of life on Earth now inhabits, as hydrothermal vents are scorching hot where magma escaping from the Earth's mantle meets the much colder oceanic water.
Even today, diverse populations of bacteria inhabit the areas surrounding these hydrothermal vents, which suggests that some form of life could be supported even in environments as harsh as those on other planets of the Solar System. What makes these harsh environments plausible sites for the origin of life, on Earth and possibly on other planets, is that chemical reactions form spontaneously in them. For example, the hydrothermal vents found on the ocean floor are known to support many chemosynthetic processes, which allow organisms to obtain energy from reduced chemical compounds that fix carbon. In turn, these reactions allow organisms to live in relatively low-oxygen environments while maintaining enough energy to support themselves. The early Earth environment was reducing, and these carbon-fixing compounds were therefore necessary for the survival and possible origin of life on Earth. From the little information that scientists have gathered about the atmospheres of other planets in the Milky Way galaxy and beyond, those atmospheres are most likely reducing or very low in oxygen, especially when compared with Earth's atmosphere. If the necessary elements and ions were present on these planets, the same carbon-fixing chemistry of reduced compounds occurring around hydrothermal vents could also occur on those planets' surfaces and possibly result in the origin of extraterrestrial life.

Planetary habitability in the Solar System

The Solar System has a wide variety of planets, dwarf planets, and moons, and each one is studied for its potential to host life. Each one has its own specific conditions that may benefit or harm life. So far, the only lifeforms found are those from Earth. There is no evidence that any extraterrestrial intelligence other than humans exists or has ever existed within the Solar System.
Astrobiologist Mary Voytek points out that it would be unlikely to find large ecosystems, as they would have already been detected by now. The inner Solar System is likely devoid of life. However, Venus is still of interest to astrobiologists, as it is a terrestrial planet that was likely similar to Earth in its early stages and developed in a different way. A runaway greenhouse effect gives it the hottest surface in the Solar System; it has sulfuric acid clouds, has lost all its surface liquid water, and retains a thick carbon-dioxide atmosphere with enormous pressure. Comparing the two planets helps to understand the precise differences that lead to conditions beneficial or harmful to life. Despite the conditions against life on Venus, there are suspicions that microbial life-forms may still survive in its high-altitude clouds. Mars is a cold and almost airless desert, inhospitable to life. However, recent studies revealed that water on Mars used to be quite abundant, forming rivers, lakes, and perhaps even oceans. Mars may have been habitable back then, and life on Mars may have been possible. But when the planetary core ceased to generate a magnetic field, solar winds removed the atmosphere and the planet became vulnerable to solar radiation. Ancient life-forms may still have left fossilised remains, and microbes may still survive deep underground. As mentioned, the gas giants and ice giants are unlikely to contain life. The most distant Solar System bodies, found in the Kuiper Belt and outwards, are locked in permanent deep-freeze, but cannot be ruled out completely. Although the giant planets themselves are highly unlikely to have life, there is much hope of finding it on moons orbiting these planets. Europa, in the Jovian system, has a subsurface ocean below a thick layer of ice. Ganymede and Callisto also have subsurface oceans, but life is less likely in them because their water is sandwiched between layers of solid ice.
Europa would have contact between the ocean and the rocky surface, which helps the chemical reactions. It may be difficult to dig deep enough to study that ocean, though. Enceladus, a tiny moon of Saturn with another subsurface ocean, may not need to be dug into, as it releases water to space in eruption columns. The space probe Cassini flew inside one of these, but could not make a full study because NASA had not expected this phenomenon and had not equipped the probe to study ocean water. Still, Cassini detected complex organic molecules, salts, evidence of hydrothermal activity, hydrogen, and methane. Titan is the only celestial body in the Solar System besides Earth that has liquid bodies on its surface. It has rivers, lakes, and rain of hydrocarbons such as methane and ethane, and even a cycle similar to Earth's water cycle. This special context encourages speculation about lifeforms with a different biochemistry, but the cold temperatures would make such chemistry take place at a very slow pace. Water is rock-solid on the surface, but Titan does have a subsurface water ocean like several other moons. However, it lies at such a great depth that it would be very difficult to access for study.

Scientific search

The science that searches for and studies life in the universe, both on Earth and elsewhere, is called astrobiology. Through the study of Earth's life, the only known form of life, astrobiology seeks to understand how life starts and evolves and what the requirements are for its continuous existence. This helps to determine what to look for when searching for life on other celestial bodies. It is a complex area of study, combining the perspectives of several scientific disciplines, such as astronomy, biology, chemistry, geology, oceanography, and atmospheric sciences. The scientific search for extraterrestrial life is being carried out both directly and indirectly.
As of September 2017, 3,667 exoplanets in 2,747 systems have been identified, and other planets and moons in the Solar System hold the potential for hosting primitive life such as microorganisms. As of 8 February 2021, an updated status of studies considering the possible detection of lifeforms on Venus (via phosphine) and Mars (via methane) was reported. Scientists search for biosignatures within the Solar System by studying planetary surfaces and examining meteorites. Some claim to have identified evidence that microbial life has existed on Mars. In 1996, a controversial report stated that structures resembling nanobacteria were discovered in a meteorite, ALH84001, formed of rock ejected from Mars. Although all the unusual properties of the meteorite were eventually explained as the result of inorganic processes, the controversy over its discovery laid the groundwork for the development of astrobiology. An experiment on the two Viking Mars landers reported gas emissions from heated Martian soil samples that some scientists argue are consistent with the presence of living microorganisms. Lack of corroborating evidence from other experiments on the same samples suggests that a non-biological reaction is a more likely hypothesis. In February 2005, NASA scientists reported that they may have found some evidence of extraterrestrial life on Mars. The two scientists, Carol Stoker and Larry Lemke of NASA's Ames Research Center, based their claim on methane signatures found in Mars's atmosphere resembling the methane production of some forms of primitive life on Earth, as well as on their own study of primitive life near the Rio Tinto river in Spain. NASA officials soon distanced NASA from the scientists' claims, and Stoker herself backed off from her initial assertions. In November 2011, NASA launched the Mars Science Laboratory that landed the Curiosity rover on Mars.
It is designed to assess the past and present habitability of Mars using a variety of scientific instruments. The rover landed on Mars at Gale Crater in August 2012. A group of scientists at Cornell University started a catalog of microorganisms recording the way each one reacts to sunlight. The goal is to help with the search for similar organisms on exoplanets, as the starlight reflected by planets rich in such organisms would have a specific spectrum, unlike that of starlight reflected from lifeless planets. If Earth were studied from afar with this system, it would reveal a shade of green, as a result of the abundance of plants that use photosynthesis. In August 2011, NASA studied meteorites found in Antarctica, finding adenine, guanine, hypoxanthine, and xanthine. Adenine and guanine are components of DNA, and the others are used in other biological processes. The studies ruled out terrestrial contamination of the meteorites, as those components would not be freely available in the way they were found in the samples. This discovery suggests that several organic molecules that serve as building blocks of life may be generated within asteroids and comets. In October 2011, scientists reported that cosmic dust contains complex organic compounds ("amorphous organic solids with a mixed aromatic-aliphatic structure") that could be created naturally, and rapidly, by stars. It is still unclear whether those compounds played a role in the creation of life on Earth, but Sun Kwok, of the University of Hong Kong, thinks so. "If this is the case, life on Earth may have had an easier time getting started as these organics can serve as basic ingredients for life." In August 2012, and in a world first, astronomers at Copenhagen University reported the detection of a specific sugar molecule, glycolaldehyde, in a distant star system. The molecule was found around the protostellar binary IRAS 16293-2422, which is located 400 light years from Earth.
Glycolaldehyde is needed to form ribonucleic acid, or RNA, which is similar in function to DNA. This finding suggests that complex organic molecules may form in stellar systems prior to the formation of planets, eventually arriving on young planets early in their formation. In December 2023, astronomers reported the first detection, in the plumes of Saturn's moon Enceladus, of hydrogen cyanide, a chemical possibly essential for life as we know it, as well as other organic molecules, some of which are yet to be better identified and understood. According to the researchers, "these [newly discovered] compounds could potentially support extant microbial communities or drive complex organic synthesis leading to the origin of life." Although most searches are focused on the biology of extraterrestrial life, an extraterrestrial intelligence capable enough to develop a civilization may be detectable by other means as well. Technology may generate technosignatures: effects on the native planet that natural processes would not produce. There are three main types of technosignatures considered: interstellar communications, effects on the atmosphere, and planetary-sized structures such as Dyson spheres. Organizations such as the SETI Institute search the cosmos for potential forms of communication. They started with radio waves, and now search for laser pulses as well. The challenge for this search is that there are natural sources of such signals as well, such as gamma-ray bursts and supernovae, and the difference between a natural signal and an artificial one would lie in its specific patterns. Astronomers intend to use artificial intelligence for this, as it can manage large amounts of data and is devoid of biases and preconceptions. Besides, even if there is an advanced extraterrestrial civilization, there is no guarantee that it is transmitting radio communications in the direction of Earth.
The length of time required for a signal to travel across space means that a potential answer may arrive decades or centuries after the initial message. The atmosphere of Earth is rich in nitrogen dioxide as a result of air pollution, which can be detectable. The natural abundance of carbon, which is also relatively reactive, makes it likely to be a basic component of the development of a potential extraterrestrial technological civilization, as it is on Earth. Fossil fuels would likely be generated and used on such worlds as well. The abundance of chlorofluorocarbons in an atmosphere can also be a clear technosignature, considering their role in ozone depletion. Light pollution may be another technosignature, as multiple lights on the night side of a rocky planet can be a sign of advanced technological development. However, modern telescopes are not strong enough to study exoplanets at the level of detail required to perceive it. The Kardashev scale proposes that a civilization may eventually start consuming energy directly from its local star. This would require giant structures built next to it, called Dyson spheres. Those speculative structures would cause excess infrared radiation, which telescopes may notice. Infrared excess is typical of young stars, surrounded by dusty protoplanetary disks that will eventually form planets; an older star such as the Sun would have no natural reason to show it. The presence of heavy elements in a star's light-spectrum is another potential technosignature; such elements would (in theory) be found if the star were being used as an incinerator/repository for nuclear waste products. Some astronomers search for extrasolar planets that may be conducive to life, narrowing the search to terrestrial planets within the habitable zones of their stars.
Since 1992, over four thousand exoplanets have been discovered (6,128 planets in 4,584 planetary systems, including 1,017 multiple planetary systems, as of 30 October 2025). The extrasolar planets so far discovered range in size from that of terrestrial planets similar to Earth to that of gas giants larger than Jupiter. The number of observed exoplanets is expected to increase greatly in the coming years. The Kepler space telescope has also detected a few thousand candidate planets, of which about 11% may be false positives. There is at least one planet on average per star. About 1 in 5 Sun-like stars have an "Earth-sized" planet in the habitable zone, with the nearest expected to be within 12 light-years of Earth. Assuming 200 billion stars in the Milky Way, that would be 11 billion potentially habitable Earth-sized planets in the Milky Way, rising to 40 billion if red dwarfs are included. The rogue planets in the Milky Way possibly number in the trillions. The nearest known exoplanet is Proxima Centauri b, located 4.2 light-years (1.3 pc) from Earth in the southern constellation of Centaurus. As of March 2014, the least massive exoplanet known is PSR B1257+12 A, which is about twice the mass of the Moon. The most massive planet listed on the NASA Exoplanet Archive is DENIS-P J082303.1−491201 b, about 29 times the mass of Jupiter, although according to most definitions of a planet, it is too massive to be a planet and may be a brown dwarf instead. Almost all of the planets detected so far are within the Milky Way, but there have also been a few possible detections of extragalactic planets. The study of planetary habitability also considers a wide range of other factors in determining the suitability of a planet for hosting life.
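The Milky Way figures quoted above can be cross-checked against one another. A small sketch; note that the roughly 27.5% Sun-like fraction is inferred here from the quoted numbers, not stated in the text:

```python
milky_way_stars = 200e9    # assumed star count quoted above
hz_rate_sunlike = 0.2      # "about 1 in 5 Sun-like stars"
habitable_earths = 11e9    # quoted estimate for Earth-sized habitable-zone planets

# Fraction of Milky Way stars implicitly counted as Sun-like in that estimate:
implied_sunlike_fraction = habitable_earths / (milky_way_stars * hz_rate_sunlike)
# ≈ 0.275, i.e. roughly 27.5% of stars
```

The jump from 11 billion to 40 billion when red dwarfs are included reflects how numerically dominant low-mass stars are in the galaxy.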
One sign that a planet probably already contains life is the presence of an atmosphere with significant amounts of oxygen, since that gas is highly reactive and generally would not last long without constant replenishment. This replenishment occurs on Earth through photosynthetic organisms. One way to analyse the atmosphere of an exoplanet is through spectroscopy when it transits its star, though this might only be feasible with dim stars like white dwarfs.

History and cultural impact

The modern concept of extraterrestrial life is based on assumptions that were not commonplace during the early days of astronomy. The first explanations for the celestial objects seen in the night sky were based on mythology. Scholars from Ancient Greece were the first to consider that the universe is inherently understandable, and they rejected explanations based on supernatural, incomprehensible forces, such as the myth of the Sun being pulled across the sky in the chariot of Apollo. They had not yet developed the scientific method and based their ideas on pure thought and speculation, but they developed precursor ideas to it, such as the idea that explanations had to be discarded if they contradicted observable facts. The discussions of those Greek scholars established many of the pillars that would eventually lead to the idea of extraterrestrial life, such as Earth being round and not flat. The cosmos was first structured in a geocentric model, which held that the Sun and all other celestial bodies revolve around Earth. However, those bodies were not considered worlds: in Greek understanding, the world was composed of both Earth and the celestial objects with noticeable movements. Anaximander thought that the cosmos was made from apeiron, a substance that created the world, and that the world would eventually return to the cosmos.
Eventually two groups emerged: the atomists, who thought that matter both on Earth and in the cosmos was made of small atoms of the classical elements (earth, water, fire and air), and the Aristotelians, who thought that those elements were exclusive to Earth and that the cosmos was made of a fifth one, the aether. The atomist Epicurus thought that the processes that created the world, its animals and plants should have created other worlds elsewhere, along with their own animals and plants. Aristotle thought instead that all the earth element naturally fell towards the center of the universe, which would make it impossible for other planets to exist elsewhere. Under that reasoning, Earth was not only at the center, it was also the only planet in the universe. Cosmic pluralism, the plurality of worlds, or simply pluralism, describes the philosophical belief in numerous "worlds" in addition to Earth, which might harbor extraterrestrial life. The earliest recorded assertion of extraterrestrial human life is found in the ancient scriptures of Jainism. There are multiple "worlds" mentioned in Jain scriptures that support human life. These include, among others, Bharat Kshetra, Mahavideh Kshetra, Airavat Kshetra, and Hari kshetra. Medieval Muslim writers like Fakhr al-Din al-Razi and Muhammad al-Baqir supported cosmic pluralism on the basis of the Qur'an. Chaucer's poem The House of Fame engaged in medieval thought experiments that postulated the plurality of worlds. However, those ideas about other worlds were different from the current knowledge about the structure of the universe, and did not postulate the existence of planetary systems other than the Solar System. When those authors talked about other worlds, they meant places located at the center of their own systems, with their own stellar vaults and cosmos surrounding them. The Greek ideas and the disputes between atomists and Aristotelians outlived ancient Greek civilisation itself.
The Great Library of Alexandria compiled this knowledge, part of which was translated by Islamic scholars and thus survived the end of the Library. Baghdad combined the knowledge of the Greeks, the Indians, the Chinese and its own scholars, and this knowledge also spread through the Byzantine Empire. From there it eventually returned to Europe by the time of the Middle Ages. However, as the Greek atomist doctrine held that the world was created by random movements of atoms, with no need for a creator deity, it became associated with atheism, and the dispute intertwined with religious ones. Still, the Church did not react to those topics in a homogeneous way, and there were stricter and more permissive views within the church itself. The first known mention of the term 'panspermia' was in the writings of the 5th-century BC Greek philosopher Anaxagoras, who proposed the idea that life exists everywhere. By the late Middle Ages there were many known inaccuracies in the geocentric model, but it was kept in use because naked-eye observations provided limited data. Nicolaus Copernicus started the Copernican Revolution by proposing that the planets revolve around the Sun rather than Earth. His proposal had little acceptance at first because, as he kept the assumption that orbits were perfect circles, his model led to as many inaccuracies as the geocentric one. Tycho Brahe improved the available data with naked-eye observatories, which worked with highly complex sextants and quadrants. Tycho could not make sense of his observations, but Johannes Kepler did: orbits were not perfect circles, but ellipses. This knowledge benefited the Copernican model, which now worked almost perfectly. The invention of the telescope a short time later, perfected by Galileo Galilei, dispelled the final doubts, and the paradigm shift was completed.
Under this new understanding, the notion of extraterrestrial life became feasible: if Earth is just one planet orbiting a star, there may be planets similar to Earth elsewhere. The astronomical study of distant bodies also proved that physical laws are the same elsewhere in the universe as on Earth, with nothing making the planet truly special. The new ideas met resistance from the Catholic Church: Galileo was tried for defending the heliocentric model, which was considered heretical, and was forced to recant it. The best-known early-modern proponent of ideas of extraterrestrial life was the Italian philosopher Giordano Bruno, who argued in the 16th century for an infinite universe in which every star is surrounded by its own planetary system. Bruno wrote that other worlds "have no less virtue nor a nature different to that of our earth" and, like Earth, "contain animals and inhabitants". Bruno's belief in the plurality of worlds was one of the charges leveled against him by the Venetian Holy Inquisition, which tried and executed him. The heliocentric model was further strengthened by Isaac Newton's theory of gravity, which provided the mathematics explaining the motions of all things in the universe, including planetary orbits. By this point, the geocentric model was definitively discarded. The use of the scientific method had also become standard by then, and new discoveries were expected to provide evidence and rigorous mathematical explanations. Science also took a deeper interest in the mechanics of natural phenomena, trying to explain not just the way nature works but also the reasons for its working that way. There had been very little actual discussion of extraterrestrial life before this point, as the Aristotelian ideas remained influential while geocentrism was still accepted.
When geocentrism was finally proved wrong, it not only meant that Earth was not the center of the universe, but also that the lights seen in the sky were not just lights but physical objects. The notion that life may exist on them as well soon became an ongoing topic of discussion, although one with no practical way to investigate it. The possibility of extraterrestrials remained a widespread speculation as scientific discovery accelerated. William Herschel, the discoverer of Uranus, was one of many 18th- and 19th-century astronomers who believed that the Solar System is populated by alien life. Other scholars of the period who championed "cosmic pluralism" included Immanuel Kant and Benjamin Franklin. At the height of the Enlightenment, even the Sun and Moon were considered candidates for extraterrestrial inhabitants. Speculation about life on Mars increased in the late 19th century, following telescopic observation of apparent Martian canals, which soon, however, turned out to be optical illusions. Despite this, in 1895, American astronomer Percival Lowell published his book Mars, followed by Mars and its Canals in 1906, proposing that the canals were the work of a long-gone civilisation. Spectroscopic analysis of Mars's atmosphere began in earnest in 1894, when U.S. astronomer William Wallace Campbell showed that neither water nor oxygen was present in the Martian atmosphere. By 1909, better telescopes and the best perihelic opposition of Mars since 1877 conclusively put an end to the canal hypothesis. As a consequence of the belief in spontaneous generation, there was little thought about the conditions of each celestial body: it was simply assumed that life would thrive anywhere. Spontaneous generation was disproved by Louis Pasteur in the 19th century.
Popular belief in thriving alien civilisations elsewhere in the Solar System remained strong until Mariner 4 and Mariner 9 provided close images of Mars, which definitively debunked the idea of the existence of Martians and lowered expectations of finding alien life in general. The end of the belief in spontaneous generation forced investigation into the origin of life. Although abiogenesis is the more accepted theory, a number of authors reclaimed the term "panspermia" and proposed that life was brought to Earth from elsewhere; among them were Jöns Jacob Berzelius (1834), Kelvin (1871), Hermann von Helmholtz (1879) and, somewhat later, Svante Arrhenius (1903). The science fiction genre, although not yet so named, developed during the late 19th century, and the proliferation of extraterrestrials in fiction influenced popular perception of the real-life topic, making people eager to jump to conclusions about the discovery of aliens. Science marched at a slower pace: some discoveries fueled expectations, while others dashed excessive hopes. For example, with the advent of telescopes, most structures seen on the Moon or Mars were immediately attributed to Selenites or Martians, but more powerful telescopes later revealed all such "discoveries" to be natural features. A famous case is the Cydonia region of Mars, first imaged by the Viking 1 orbiter: the low-resolution photos showed a rock formation that resembled a human face, but later spacecraft took photos in higher detail that showed there was nothing special about the site. The search for and study of extraterrestrial life became a science of its own, astrobiology. Also known as exobiology, the discipline is pursued by NASA, ESA, INAF, and other agencies. Astrobiology studies life from Earth as well, but with a cosmic perspective.
For example, abiogenesis is of interest to astrobiology not because of the origin of life on Earth itself, but for the chances of a similar process taking place on other celestial bodies. Many aspects of life, from its definition to its chemistry, are analyzed as either likely to be shared by all forms of life across the cosmos or native only to Earth. Astrobiology, however, remains constrained by the current lack of extraterrestrial life-forms to study: as all life on Earth comes from the same ancestor, it is hard to infer general characteristics from a group with a single example to analyse. The 20th century brought great technological advances, speculation about future hypothetical technologies, and an increased basic knowledge of science among the general population thanks to science popularization through the mass media. Public interest in extraterrestrial life and the lack of discoveries by mainstream science led to the emergence of pseudosciences that provided affirmative, if questionable, answers to the existence of aliens. Ufology claims that many unidentified flying objects (UFOs) are spaceships of alien species, and the ancient astronauts hypothesis claims that aliens visited Earth in antiquity and prehistoric times but that people of the era failed to understand what they were seeing. Most UFOs or UFO sightings can be readily explained as sightings of Earth-based aircraft (including top-secret aircraft), known astronomical objects or weather phenomena, or as hoaxes. Looking beyond the pseudosciences, Lewis White Beck strove to elevate the level of public discourse on the topic of extraterrestrial life by tracing the evolution of philosophical thought over the centuries from ancient times into the modern era.
His review of the contributions made by Lucretius, Plutarch, Aristotle, Copernicus, Immanuel Kant, John Wilkins, Charles Darwin and Karl Marx demonstrated that even in modern times, humanity could be profoundly influenced in its search for extraterrestrial life by subtle and comforting archetypal ideas largely derived from firmly held religious, philosophical and existential belief systems. On a positive note, however, Beck further argued that even if the search for extraterrestrial life proves unsuccessful, the endeavor itself could have beneficial consequences by assisting humanity in its attempt to actualize superior ways of living here on Earth. By the 21st century, it was accepted that multicellular life in the Solar System can exist only on Earth, but interest in extraterrestrial life increased regardless. This is a result of advances in several sciences. Knowledge of planetary habitability allows the likelihood of finding life on each specific celestial body to be considered in scientific terms, as it is known which features are beneficial and which are harmful to life. Astronomy and telescopes have also improved to the point that exoplanets can be confirmed and even studied, increasing the number of places to search. Life may still exist elsewhere in the Solar System in unicellular form, and advances in spacecraft allow robots to be sent to study samples in situ, with tools of growing complexity and reliability. Although no extraterrestrial life has been found and life may still be unique to Earth, there are scientific reasons to suspect that it can exist elsewhere, and technological advances that may detect it if it does. Many scientists are optimistic about the chances of finding alien life. In the words of SETI's Frank Drake, "All we know for sure is that the sky is not littered with powerful microwave transmitters".
Drake noted that it is entirely possible that advanced technology results in communication being carried out in some way other than conventional radio transmission. At the same time, the data returned by space probes, and giant strides in detection methods, have allowed science to begin delineating habitability criteria on other worlds, and to confirm that at least other planets are plentiful, though aliens remain a question mark. The Wow! signal, detected in 1977 by a SETI project, remains a subject of speculative debate. On the other hand, other scientists are pessimistic. Jacques Monod wrote that "Man knows at last that he is alone in the indifferent immensity of the universe, whence he has emerged by chance". In 2000, geologist and paleontologist Peter Ward and astrobiologist Donald Brownlee published a book entitled Rare Earth: Why Complex Life is Uncommon in the Universe, in which they discussed the Rare Earth hypothesis, claiming that Earth-like life is rare in the universe, whereas microbial life is common. Ward and Brownlee are open to the idea of evolution on other planets that is not based on essential Earth-like characteristics such as DNA and carbon. As for the possible risks, theoretical physicist Stephen Hawking warned in 2010 that humans should not try to contact alien life forms, as aliens might pillage Earth for resources. "If aliens visit us, the outcome would be much as when Columbus landed in America, which didn't turn out well for the Native Americans", he said. Jared Diamond had earlier expressed similar concerns. On 20 July 2015, Hawking and Russian billionaire Yuri Milner, along with the SETI Institute, announced a well-funded effort, called the Breakthrough Initiatives, to expand efforts to search for extraterrestrial life. The group contracted the services of the 100-meter Robert C.
Byrd Green Bank Telescope in West Virginia in the United States and the 64-meter Parkes Telescope in New South Wales, Australia. On 13 February 2015, scientists (including Geoffrey Marcy, Seth Shostak, Frank Drake and David Brin) at a convention of the American Association for the Advancement of Science discussed Active SETI and whether transmitting a message to possible intelligent extraterrestrials in the cosmos was a good idea; one result was a statement, signed by many, that a "worldwide scientific, political and humanitarian discussion must occur before any message is sent". Government responses The 1967 Outer Space Treaty and the 1979 Moon Agreement define rules of planetary protection against potentially hazardous extraterrestrial life, and COSPAR also provides guidelines for planetary protection. In 1977, a committee of the United Nations Office for Outer Space Affairs spent a year discussing strategies for interacting with extraterrestrial life or intelligence, but the discussion ended without any conclusions. As of 2010, the UN lacked response mechanisms for the case of an extraterrestrial contact. One of the NASA divisions is the Office of Safety and Mission Assurance (OSMA), also known as the Planetary Protection Office; part of its mission is to "rigorously preclude backward contamination of Earth by extraterrestrial life." In 2016, the Chinese government released a white paper detailing its space program. According to the document, one of the research objectives of the program is the search for extraterrestrial life, which is also one of the objectives of the Chinese Five-hundred-meter Aperture Spherical Telescope (FAST) program. In 2020, Dmitry Rogozin, the head of the Russian space agency, said the search for extraterrestrial life is one of the main goals of deep space research, and acknowledged the possibility of the existence of primitive life on other planets of the Solar System.
The French space agency has an office for the study of "unidentified aerospace phenomena" and maintains a publicly accessible database of such phenomena, with over 1,600 detailed entries. According to the head of the office, the vast majority of entries have a mundane explanation, but for 25% of them an extraterrestrial origin can neither be confirmed nor denied. In 2020, chairman of the Israel Space Agency Isaac Ben-Israel stated that the probability of detecting life in outer space is "quite large", though he disagrees with his former colleague Haim Eshed, who stated that there are contacts between an advanced alien civilisation and some of Earth's governments. In fiction Although the idea of extraterrestrial peoples became feasible once astronomy developed enough to understand the nature of planets, such beings were not at first thought of as being any different from humans: with no scientific explanation for the origin of mankind and its relation to other species, there was no reason to expect them to be any other way. This changed with the 1859 book On the Origin of Species by Charles Darwin, which proposed the theory of evolution. With the notion that evolution on other planets might take other directions, science fiction authors created bizarre aliens, clearly distinct from humans; a common way to do so was to add body features from other animals, such as insects or octopuses. The feasibility of costuming and special effects, alongside budget considerations, forced films and TV series to tone down the fantasy, but these limitations have lessened since the 1990s with the advent of computer-generated imagery (CGI), and later on as CGI became more effective and less expensive. Real-life events sometimes captivate people's imagination, and this influences works of fiction.
For example, during the Barney and Betty Hill incident, the first recorded claim of an alien abduction, the couple reported that they were abducted and experimented on by aliens with oversized heads, big eyes, pale grey skin, and small noses, a description that, once taken up in works of fiction, eventually became the grey alien archetype.
========================================
[SOURCE: https://en.wikipedia.org/wiki/United_States#cite_note-Office-246]
United States The United States of America (USA), also known as the United States (U.S.) or America, is a country primarily located in North America. It is a federal republic of 50 states and a federal capital district, Washington, D.C. The 48 contiguous states border Canada to the north and Mexico to the south, with the semi-exclave of Alaska in the northwest and the archipelago of Hawaii in the Pacific Ocean. The United States also asserts sovereignty over five major island territories and various uninhabited islands in Oceania and the Caribbean.[j] It is a megadiverse country, with the world's third-largest land area[c] and third-largest population, exceeding 341 million.[k] Paleo-Indians first migrated from North Asia to North America at least 15,000 years ago, and formed various civilizations. Spanish colonization established Spanish Florida in 1513, the first European colony in what is now the continental United States. British colonization followed with the 1607 settlement of Virginia, the first of the Thirteen Colonies. Enslavement of Africans was practiced in all colonies by 1770 and supplied most of the labor for the Southern Colonies' plantation economy. Clashes with the British Crown began as a civil protest over the illegality of taxation without representation in Parliament and the denial of other English rights. They evolved into the American Revolution, which led to the Declaration of Independence and a society based on universal rights. Victory in the 1775–1783 Revolutionary War brought international recognition of U.S. sovereignty and fueled westward expansion, further dispossessing native inhabitants. As more states were admitted, a North–South division over slavery led the Confederate States of America to declare secession and fight the Union in the 1861–1865 American Civil War. With the United States' victory and reunification, slavery was abolished nationally. By the late 19th century, the U.S.
economy outpaced the French, German and British economies combined. As of 1900, the country had established itself as a great power, a status solidified after its involvement in World War I. Following Japan's attack on Pearl Harbor in 1941, the U.S. entered World War II. Its aftermath left the U.S. and the Soviet Union as rival superpowers, competing for ideological dominance and international influence during the Cold War. The Soviet Union's collapse in 1991 ended the Cold War, leaving the U.S. as the world's sole superpower. The U.S. federal government is a representative democracy with a president and a constitution that grants separation of powers under three branches: legislative, executive, and judicial. The United States Congress is a bicameral national legislature composed of the House of Representatives (a lower house based on population) and the Senate (an upper house based on equal representation for each state). Federalism grants substantial autonomy to the 50 states. In addition, 574 Native American tribes have sovereignty rights, and there are 326 Native American reservations. Since the 1850s, the Democratic and Republican parties have dominated American politics. American ideals and values are based on a democratic tradition inspired by the American Enlightenment movement. A developed country, the U.S. ranks high in economic competitiveness, innovation, and higher education. Accounting for over a quarter of nominal global GDP, its economy has been the world's largest since about 1890. It is the wealthiest country, with the highest disposable household income per capita among OECD members, though its wealth inequality is highly pronounced. Shaped by centuries of immigration, the culture of the U.S. is diverse and globally influential. Making up more than a third of global military spending, the country has one of the strongest armed forces and is a designated nuclear state. A member of numerous international organizations, the U.S. 
plays a major role in global political, cultural, economic, and military affairs. Etymology Documented use of the phrase "United States of America" dates back to January 2, 1776. On that day, Stephen Moylan, a Continental Army aide to General George Washington, wrote a letter to Joseph Reed, Washington's aide-de-camp, seeking to go "with full and ample powers from the United States of America to Spain" to seek assistance in the Revolutionary War effort. The first known public usage is an anonymous essay published in the Williamsburg newspaper The Virginia Gazette on April 6, 1776. Sometime on or after June 11, 1776, Thomas Jefferson wrote "United States of America" in a rough draft of the Declaration of Independence, which was adopted by the Second Continental Congress on July 4, 1776. The term "United States" and its initialism "U.S.", used as nouns or as adjectives in English, are common short names for the country. The initialism "USA", a noun, is also common. "United States" and "U.S." are the established terms throughout the U.S. federal government, with prescribed rules.[l] "The States" is an established colloquial shortening of the name, used particularly from abroad; "stateside" is the corresponding adjective or adverb. "America" is the feminine form of the first word of Americus Vesputius, the Latinized name of Italian explorer Amerigo Vespucci (1454–1512);[m] it was first used as a place name by the German cartographers Martin Waldseemüller and Matthias Ringmann in 1507.[n] Vespucci first proposed that the West Indies discovered by Christopher Columbus in 1492 were part of a previously unknown landmass and not among the Indies at the eastern limit of Asia. In English, the term "America" usually does not refer to topics unrelated to the United States, despite the usage of "the Americas" to describe the totality of the continents of North and South America. 
History The first inhabitants of North America migrated from Siberia approximately 15,000 years ago, either across the Bering land bridge or along the now-submerged Ice Age coastline. Small isolated groups of hunter-gatherers are said to have migrated alongside herds of large herbivores far into Alaska, with ice-free corridors developing along the Pacific coast and valleys of North America in c. 16,500 – c. 13,500 BCE (c. 18,500 – c. 15,500 BP). The Clovis culture, which appeared around 11,000 BCE, is believed to be the first widespread culture in the Americas. Over time, Indigenous North American cultures grew increasingly sophisticated, and some, such as the Mississippian culture, developed agriculture, architecture, and complex societies. In the post-archaic period, the Mississippian cultures were located in the midwestern, eastern, and southern regions, and the Algonquian in the Great Lakes region and along the Eastern Seaboard, while the Hohokam culture and Ancestral Puebloans inhabited the Southwest. Native population estimates of what is now the United States before the arrival of European colonizers range from around 500,000 to nearly 10 million. Christopher Columbus began exploring the Caribbean for Spain in 1492, leading to Spanish-speaking settlements and missions from what are now Puerto Rico and Florida to New Mexico and California. The first Spanish colony in the present-day continental United States was Spanish Florida, chartered in 1513. After several settlements failed there due to starvation and disease, Spain's first permanent town, Saint Augustine, was founded in 1565. France established its own settlements in French Florida in 1562, but they were either abandoned (Charlesfort, 1578) or destroyed by Spanish raids (Fort Caroline, 1565). Permanent French settlements were founded much later along the Great Lakes (Fort Detroit, 1701), the Mississippi River (Saint Louis, 1764) and especially the Gulf of Mexico (New Orleans, 1718). 
Early European colonies also included the thriving Dutch colony of New Netherland (settled 1626, present-day New York) and the small Swedish colony of New Sweden (settled 1638 in what became Delaware). British colonization of the East Coast began with the Virginia Colony (1607) and the Plymouth Colony (Massachusetts, 1620). The Mayflower Compact in Massachusetts and the Fundamental Orders of Connecticut established precedents for local representative self-governance and constitutionalism that would develop throughout the American colonies. While European settlers in what is now the United States experienced conflicts with Native Americans, they also engaged in trade, exchanging European tools for food and animal pelts.[o] Relations ranged from close cooperation to warfare and massacres. The colonial authorities often pursued policies that forced Native Americans to adopt European lifestyles, including conversion to Christianity. Along the eastern seaboard, settlers trafficked Africans through the Atlantic slave trade, largely to provide manual labor on plantations. The original Thirteen Colonies[p] that would later found the United States were administered as possessions of the British Empire by Crown-appointed governors, though local governments held elections open to most white male property owners. The colonial population grew rapidly from Maine to Georgia, eclipsing Native American populations; by the 1770s, the natural increase of the population was such that only a small minority of Americans had been born overseas. The colonies' distance from Britain facilitated the entrenchment of self-governance, and the First Great Awakening, a series of Christian revivals, fueled colonial interest in guaranteed religious liberty. Following its victory in the French and Indian War, Britain began to assert greater control over local affairs in the Thirteen Colonies, resulting in growing political resistance.
One of the primary grievances of the colonists was the denial of their rights as Englishmen, particularly the right to representation in the British government that taxed them. To demonstrate their dissatisfaction and resolve, the First Continental Congress met in 1774 and passed the Continental Association, a colonial boycott of British goods enforced by local "committees of safety" that proved effective. The British attempt to then disarm the colonists resulted in the 1775 Battles of Lexington and Concord, igniting the American Revolutionary War. At the Second Continental Congress, the colonies appointed George Washington commander-in-chief of the Continental Army, and created a committee that named Thomas Jefferson to draft the Declaration of Independence. Two days after the Second Continental Congress passed the Lee Resolution to create an independent, sovereign nation, the Declaration was adopted on July 4, 1776. The political values of the American Revolution evolved from an armed rebellion demanding reform within an empire to a revolution that created a new social and governing system founded on the defense of liberty and the protection of inalienable natural rights; sovereignty of the people; republicanism over monarchy, aristocracy, and other hereditary political power; civic virtue; and an intolerance of political corruption. The Founding Fathers of the United States, who included Washington, Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, James Madison, Thomas Paine, and many others, were inspired by Classical, Renaissance, and Enlightenment philosophies and ideas. Though in practical effect since its drafting in 1777, the Articles of Confederation was ratified in 1781 and formally established a decentralized government that operated until 1789. After the British surrender at the siege of Yorktown in 1781, American sovereignty was internationally recognized by the Treaty of Paris (1783), through which the U.S. 
gained territory stretching west to the Mississippi River, north to present-day Canada, and south to Spanish Florida. The Northwest Ordinance (1787) established the precedent by which the country's territory would expand with the admission of new states, rather than the expansion of existing states. The U.S. Constitution was drafted at the 1787 Constitutional Convention to overcome the limitations of the Articles. It went into effect in 1789, creating a federal republic governed by three separate branches that together formed a system of checks and balances. George Washington was elected the country's first president under the Constitution, and the Bill of Rights was adopted in 1791 to allay skeptics' concerns about the power of the more centralized government. Washington's resignation as commander-in-chief after the Revolutionary War and his later refusal to run for a third term as the country's first president established a precedent for the supremacy of civil authority in the United States and the peaceful transfer of power. In the late 18th century, American settlers began to expand westward in larger numbers, many with a sense of manifest destiny. The Louisiana Purchase of 1803 from France nearly doubled the territory of the United States. Lingering issues with Britain remained, leading to the War of 1812, which was fought to a draw. Spain ceded Florida and its Gulf Coast territory in 1819. The Missouri Compromise of 1820, which admitted Missouri as a slave state and Maine as a free state, attempted to balance the desire of northern states to prevent the expansion of slavery into new territories with that of southern states to extend it there. Primarily, the compromise prohibited slavery in all other lands of the Louisiana Purchase north of the 36°30′ parallel. As Americans expanded further into territory inhabited by Native Americans, the federal government implemented policies of Indian removal or assimilation.
The most significant such legislation was the Indian Removal Act of 1830, a key policy of President Andrew Jackson. It resulted in the Trail of Tears (1830–1850), in which an estimated 60,000 Native Americans living east of the Mississippi River were forcibly removed and displaced to lands far to the west, causing 13,200 to 16,700 deaths along the forced march. Settler expansion as well as this influx of Indigenous peoples from the East resulted in the American Indian Wars west of the Mississippi. During the colonial period, slavery became legal in all of the Thirteen Colonies, but by 1770 it provided the main labor force in the large-scale, agriculture-dependent economies of the Southern Colonies from Maryland to Georgia. The practice began to be significantly questioned during the American Revolution, and spurred by an active abolitionist movement that had reemerged in the 1830s, states in the North enacted laws to prohibit slavery within their boundaries. At the same time, support for slavery had strengthened in Southern states, with widespread use of inventions such as the cotton gin (1793) having made slavery immensely profitable for Southern elites. The United States annexed the Republic of Texas in 1845, and the 1846 Oregon Treaty led to U.S. control of the present-day American Northwest. A dispute with Mexico over Texas led to the Mexican–American War (1846–1848). After the victory of the U.S., Mexico recognized U.S. sovereignty over Texas, New Mexico, and California in the 1848 Mexican Cession; the cession's lands also included the future states of Nevada, Colorado and Utah. The California gold rush of 1848–1849 spurred a huge migration of white settlers to the Pacific coast, leading to even more confrontations with Native populations. One of the most violent, the California genocide of thousands of Native inhabitants, lasted into the mid-1870s. Additional western territories and states were created.
Throughout the 1850s, the sectional conflict regarding slavery was further inflamed by national legislation in the U.S. Congress and decisions of the Supreme Court. In Congress, the Fugitive Slave Act of 1850 mandated the forcible return to their owners in the South of slaves taking refuge in non-slave states, while the Kansas–Nebraska Act of 1854 effectively gutted the anti-slavery requirements of the Missouri Compromise. In its Dred Scott decision of 1857, the Supreme Court ruled against a slave brought into non-slave territory, simultaneously declaring the entire Missouri Compromise to be unconstitutional. These and other events exacerbated tensions between North and South that would culminate in the American Civil War (1861–1865). Beginning with South Carolina, 11 slave-state governments voted to secede from the United States in 1861, joining to create the Confederate States of America. All other state governments remained loyal to the Union.[q] War broke out in April 1861 after the Confederacy bombarded Fort Sumter. Following the Emancipation Proclamation on January 1, 1863, many freed slaves joined the Union army. The war began to turn in the Union's favor following the 1863 Siege of Vicksburg and Battle of Gettysburg, and the Confederates surrendered in 1865 after the Union's victory in the Battle of Appomattox Court House. Efforts toward reconstruction in the secessionist South had begun as early as 1862, but it was only after President Lincoln's assassination that the three Reconstruction Amendments to the Constitution were ratified to protect civil rights. The amendments codified nationally the abolition of slavery and involuntary servitude except as punishment for crimes, promised equal protection under the law for all persons, and prohibited discrimination on the basis of race or previous enslavement. As a result, African Americans took an active political role in ex-Confederate states in the decade following the Civil War. 
The former Confederate states were readmitted to the Union, beginning with Tennessee in 1866 and ending with Georgia in 1870. National infrastructure, including transcontinental telegraph and railroads, spurred growth along the American frontier. This was accelerated by the Homestead Acts, through which nearly 10 percent of the total land area of the United States was given away free to some 1.6 million homesteaders. From 1865 through 1917, an unprecedented stream of immigrants arrived in the United States, including 24.4 million from Europe. Most came through the Port of New York, as New York City and other large cities on the East Coast became home to large Jewish, Irish, and Italian populations. Many Northern Europeans, as well as significant numbers of Germans and other Central Europeans, moved to the Midwest. At the same time, about one million French Canadians migrated from Quebec to New England. During the Great Migration, millions of African Americans left the rural South for urban areas in the North. Alaska was purchased from Russia in 1867. The Compromise of 1877 is generally considered the end of the Reconstruction era, as it resolved the electoral crisis following the 1876 presidential election and led President Rutherford B. Hayes to reduce the role of federal troops in the South. The Redeemers then evicted the carpetbaggers and quickly regained local control of Southern politics in the name of white supremacy. African Americans endured a period of heightened, overt racism following Reconstruction, a time often considered the nadir of American race relations. A series of Supreme Court decisions, including Plessy v. Ferguson, emptied the Fourteenth and Fifteenth Amendments of their force, allowing Jim Crow laws in the South to remain unchecked, sundown towns in the Midwest, and segregation in communities across the country, which would be reinforced in part by the policy of redlining later adopted by the federal Home Owners' Loan Corporation. 
An explosion of technological advancement, accompanied by the exploitation of cheap immigrant labor, led to rapid economic expansion during the Gilded Age of the late 19th century. It continued into the early 20th century, by which time the United States outpaced the economies of Britain, France, and Germany combined. This fostered the amassing of power by a few prominent industrialists, largely by their formation of trusts and monopolies to prevent competition. Tycoons led the nation's expansion in the railroad, petroleum, and steel industries. The United States emerged as a pioneer of the automotive industry. These changes resulted in significant increases in economic inequality, slum conditions, and social unrest, creating the environment for labor unions and socialist movements to begin to flourish. This period eventually ended with the advent of the Progressive Era, which was characterized by significant economic and social reforms. Pro-American elements in Hawaii overthrew the Hawaiian monarchy; the islands were annexed in 1898. That same year, Puerto Rico, the Philippines, and Guam were ceded to the U.S. by Spain after the latter's defeat in the Spanish–American War. (The Philippines was granted full independence from the U.S. on July 4, 1946, following World War II. Puerto Rico and Guam have remained U.S. territories.) American Samoa was acquired by the United States in 1900 after the Second Samoan Civil War. The U.S. Virgin Islands were purchased from Denmark in 1917. The United States entered World War I alongside the Allies in 1917, helping to turn the tide against the Central Powers. In 1920, a constitutional amendment granted nationwide women's suffrage. During the 1920s and 1930s, radio for mass communication and the advent of early television transformed communications nationwide. The Wall Street Crash of 1929 triggered the Great Depression, to which President Franklin D. 
Roosevelt responded with the New Deal plan of "reform, recovery and relief", a series of unprecedented and sweeping recovery programs and employment relief projects combined with financial reforms and regulations. Initially neutral during World War II, the U.S. began supplying war materiel to the Allies of World War II in March 1941 and entered the war in December after Japan's attack on Pearl Harbor. Agreeing to a "Europe first" policy, the U.S. concentrated its wartime efforts on Japan's allies Italy and Germany until their final defeat in May 1945. The U.S. developed the first nuclear weapons and used them against the Japanese cities of Hiroshima and Nagasaki in August 1945, ending the war. The United States was one of the "Four Policemen" who met to plan the post-war world, alongside the United Kingdom, the Soviet Union, and China. The U.S. emerged relatively unscathed from the war, with even greater economic power and international political influence. The end of World War II in 1945 left the U.S. and the Soviet Union as superpowers, each with its own political, military, and economic sphere of influence. Geopolitical tensions between the two superpowers soon led to the Cold War. The U.S. implemented a policy of containment intended to limit the Soviet Union's sphere of influence; engaged in regime change against governments perceived to be aligned with the Soviets; and prevailed in the Space Race, which culminated with the first crewed Moon landing in 1969. Domestically, the U.S. experienced economic growth, urbanization, and population growth following World War II. The civil rights movement emerged, with Martin Luther King Jr. becoming a prominent leader in the early 1960s. The Great Society plan of President Lyndon B. Johnson's administration resulted in groundbreaking and broad-reaching laws, policies and a constitutional amendment to counteract some of the worst effects of lingering institutional racism. The counterculture movement in the U.S. 
brought significant social changes, including the liberalization of attitudes toward recreational drug use and sexuality. It also encouraged open defiance of the military draft (leading to the end of conscription in 1973) and wide opposition to U.S. intervention in Vietnam, with the U.S. withdrawing completely in 1975. A societal shift in the roles of women was significantly responsible for the large increase in female paid labor participation starting in the 1970s, and by 1985 the majority of American women aged 16 and older were employed. The fall of communism and the dissolution of the Soviet Union from 1989 to 1991 marked the end of the Cold War and left the United States as the world's sole superpower. This cemented the United States' global influence, reinforcing the concept of the "American Century" as the U.S. dominated international political, cultural, economic, and military affairs. The 1990s saw the longest recorded economic expansion in American history, a dramatic decline in U.S. crime rates, and advances in technology. Throughout this decade, technological innovations such as the World Wide Web, the evolution of the Pentium microprocessor in accordance with Moore's law, rechargeable lithium-ion batteries, the first gene therapy trial, and cloning either emerged in the U.S. or were improved upon there. The Human Genome Project was formally launched in 1990, while Nasdaq became the first stock market in the United States to trade online in 1998. In the Gulf War of 1991, an American-led international coalition of states expelled an Iraqi invasion force that had occupied neighboring Kuwait. The September 11 attacks on the United States in 2001 by the pan-Islamist militant organization al-Qaeda led to the war on terror and subsequent military interventions in Afghanistan and in Iraq. The U.S. housing bubble culminated in 2007 with the Great Recession, the largest economic contraction since the Great Depression. 
In the 2010s and early 2020s, the United States has experienced increased political polarization and democratic backsliding. The country's polarization was violently reflected in the January 2021 Capitol attack, when a mob of insurrectionists entered the U.S. Capitol and sought to prevent the peaceful transfer of power in an attempted self-coup d'état. Geography The United States is the world's third-largest country by total area behind Russia and Canada.[c] The 48 contiguous states and the District of Columbia have a combined area of 3,119,885 square miles (8,080,470 km2). In 2021, the United States had 8% of the Earth's permanent meadows and pastures and 10% of its cropland. Starting in the east, the coastal plain of the Atlantic seaboard gives way to inland forests and rolling hills in the Piedmont plateau region. The Appalachian Mountains and the Adirondack Massif separate the East Coast from the Great Lakes and the grasslands of the Midwest. The Mississippi River System, the world's fourth-longest river system, runs predominantly north–south through the center of the country. The flat and fertile prairie of the Great Plains stretches to the west, interrupted by a highland region in the southeast. The Rocky Mountains, west of the Great Plains, extend north to south across the country, peaking at over 14,000 feet (4,300 m) in Colorado. The supervolcano underlying Yellowstone National Park in the Rocky Mountains, the Yellowstone Caldera, is the continent's largest volcanic feature. Farther west are the rocky Great Basin and the Chihuahuan, Sonoran, and Mojave deserts. In the northwest corner of Arizona, carved by the Colorado River, is the Grand Canyon, a steep-sided canyon and popular tourist destination known for its overwhelming visual size and intricate, colorful landscape. The Cascade and Sierra Nevada mountain ranges run close to the Pacific coast. 
The lowest and highest points in the contiguous United States are in the State of California, about 84 miles (135 km) apart. At an elevation of 20,310 feet (6,190.5 m), Alaska's Denali (also called Mount McKinley) is the highest peak in the country and on the continent. Active volcanoes are common throughout Alaska's Alexander Archipelago and the Aleutian Islands. Located entirely outside North America, the archipelago of Hawaii consists of volcanic islands, physiographically and ethnologically part of the Polynesian subregion of Oceania. In addition to its total land area, the United States has one of the world's largest marine exclusive economic zones, spanning approximately 4.5 million square miles (11.7 million km2) of ocean. With its large size and geographic variety, the United States includes most climate types. East of the 100th meridian, the climate ranges from humid continental in the north to humid subtropical in the south. The western Great Plains are semi-arid. Many mountainous areas of the American West have an alpine climate. The climate is arid in the Southwest, Mediterranean in coastal California, and oceanic in coastal Oregon, Washington, and southern Alaska. Most of Alaska is subarctic or polar. Hawaii, the southern tip of Florida, and U.S. territories in the Caribbean and Pacific are tropical. The United States receives more high-impact extreme weather incidents than any other country. States bordering the Gulf of Mexico are prone to hurricanes, and most of the world's tornadoes occur in the country, mainly in Tornado Alley. Due to climate change, extreme weather has become more frequent in the U.S. in the 21st century, with three times the number of reported heat waves compared to the 1960s. Since the 1990s, droughts in the American Southwest have become more persistent and more severe. The regions most attractive to new residents are often among the most vulnerable to extreme weather. The U.S. 
is one of 17 megadiverse countries containing large numbers of endemic species: about 17,000 species of vascular plants occur in the contiguous United States and Alaska, and over 1,800 species of flowering plants are found in Hawaii, few of which occur on the mainland. The United States is home to 428 mammal species, 784 birds, 311 reptiles, 295 amphibians, and around 91,000 insect species. There are 63 national parks, and hundreds of other federally managed monuments, forests, and wilderness areas, administered by the National Park Service and other agencies. About 28% of the country's land is publicly owned and federally managed, primarily in the Western States. Most of this land is protected, though some is leased for commercial use, and less than one percent is used for military purposes. Environmental issues in the United States include debates on non-renewable resources and nuclear energy, air and water pollution, biodiversity, logging and deforestation, and climate change. The U.S. Environmental Protection Agency (EPA) is the federal agency charged with addressing most environment-related issues. The idea of wilderness has shaped the management of public lands since the passage of the Wilderness Act in 1964. The Endangered Species Act of 1973 provides a way to protect threatened and endangered species and their habitats; the United States Fish and Wildlife Service implements and enforces the Act. In 2024, the U.S. ranked 35th among 180 countries in the Environmental Performance Index. Government and politics The United States is a federal republic of 50 states and a federal capital district, Washington, D.C. The U.S. asserts sovereignty over five unincorporated territories and several uninhabited island possessions. It is the world's oldest surviving federation, and its presidential system of federal government has been adopted, in whole or in part, by many newly independent states worldwide following their decolonization. 
The Constitution of the United States serves as the country's supreme legal document. Most scholars describe the United States as a liberal democracy.[r] Composed of three branches, all headquartered in Washington, D.C., the federal government is the national government of the United States. The U.S. Constitution establishes a separation of powers intended to provide a system of checks and balances to prevent any of the three branches from becoming supreme. The three-branch system is known as the presidential system, in contrast to the parliamentary system where the executive is part of the legislative body. Many countries around the world adopted this aspect of the 1789 Constitution of the United States, especially in the postcolonial Americas. In the U.S. federal system, sovereign powers are shared between three levels of government specified in the Constitution: the federal government, the states, and Indian tribes. The U.S. also asserts sovereignty over five permanently inhabited territories: American Samoa, Guam, the Northern Mariana Islands, Puerto Rico, and the U.S. Virgin Islands. Residents of the 50 states are governed by their elected state government, under state constitutions compatible with the national constitution, and by elected local governments that are administrative divisions of a state. States are subdivided into counties or county equivalents, and (except for Hawaii) further divided into municipalities, each administered by elected representatives. The District of Columbia is a federal district containing the U.S. capital, Washington, D.C. The federal district is an administrative division of the federal government. Indian country is made up of 574 federally recognized tribes and 326 Indian reservations. They hold a government-to-government relationship with the U.S. federal government in Washington and are legally defined as domestic dependent nations with inherent tribal sovereignty rights. In addition to the five major territories, the U.S. 
also asserts sovereignty over the United States Minor Outlying Islands in the Pacific Ocean and the Caribbean. The seven undisputed islands without permanent populations are Baker Island, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Atoll, and Palmyra Atoll. U.S. sovereignty over the unpopulated Bajo Nuevo Bank, Navassa Island, Serranilla Bank, and Wake Island is disputed. The Constitution is silent on political parties. However, they developed independently in the 18th century with the Federalist and Anti-Federalist parties. Since then, the United States has operated as a de facto two-party system, though the parties have changed over time. Since the mid-19th century, the two main national parties have been the Democratic Party and the Republican Party; the former is perceived as relatively liberal in its political platform, while the latter is perceived as relatively conservative. The United States has an established structure of foreign relations, with the world's second-largest diplomatic corps as of 2024. It is a permanent member of the United Nations Security Council and home to the United Nations headquarters. The United States is a member of the G7, G20, and OECD intergovernmental organizations. Almost all countries have embassies, and many have consulates (official representatives), in the country. Likewise, nearly all countries maintain formal diplomatic missions with the United States; the exceptions are Iran, North Korea, and Bhutan. Though Taiwan does not have formal diplomatic relations with the U.S., it maintains close unofficial relations, and the United States regularly supplies Taiwan with military equipment to deter potential Chinese aggression. U.S. geopolitical attention has also turned to the Indo-Pacific, where the United States participates in the Quadrilateral Security Dialogue with Australia, India, and Japan. 
The United States has a "Special Relationship" with the United Kingdom and strong ties with Canada, Australia, New Zealand, the Philippines, Japan, South Korea, Israel, and several European Union countries such as France, Italy, Germany, Spain, and Poland. The U.S. works closely with its NATO allies on military and national security issues, and with countries in the Americas through the Organization of American States and the United States–Mexico–Canada Agreement (USMCA). The U.S. exercises full international defense authority and responsibility for Micronesia, the Marshall Islands, and Palau through the Compact of Free Association. It has increasingly conducted strategic cooperation with India, while its ties with China have steadily deteriorated. Beginning in 2014, the U.S. became a key ally of Ukraine. After Donald Trump was elected U.S. president in 2024, he sought to negotiate an end to the Russo-Ukrainian War. He paused all military aid to Ukraine in March 2025, although the aid resumed later. Trump also ended U.S. intelligence sharing with the country, but this too was eventually restored. The president is the commander-in-chief of the United States Armed Forces and appoints its leaders, the secretary of defense and the Joint Chiefs of Staff. The Department of Defense, headquartered at the Pentagon near Washington, D.C., administers five of the six service branches, which are made up of the U.S. Army, Marine Corps, Navy, Air Force, and Space Force. The Coast Guard is administered by the Department of Homeland Security in peacetime and can be transferred to the Department of the Navy in wartime. The total strength of the military is about 1.3 million active-duty personnel, with an additional 400,000 in reserve. The United States spent $997 billion on its military in 2024, by far the largest amount of any country, making up 37% of global military spending and accounting for 3.4% of the country's GDP. The U.S. 
possesses 42% of the world's nuclear weapons, the second-largest stockpile after Russia's. The U.S. military is widely regarded as the most powerful and advanced in the world. The United States has the third-largest combined armed forces in the world, behind the Chinese People's Liberation Army and the Indian Armed Forces. The U.S. military operates about 800 bases and facilities abroad and maintains deployments of more than 100 active-duty personnel in 25 foreign countries. The United States has engaged in over 400 military interventions since its founding in 1776, with over half of these occurring between 1950 and 2019 and 25% occurring in the post-Cold War era. State defense forces (SDFs) are military units that operate under the sole authority of a state government. SDFs are authorized by state and federal law but are under the command of the state's governor. By contrast, the 54 U.S. National Guard organizations[t] fall under the dual control of state or territorial governments and the federal government; their units can also become federalized entities, but SDFs cannot be federalized. The National Guard personnel of a state or territory can be federalized by the president under the National Defense Act Amendments of 1933; this legislation created the Guard and provides for the integration of Army National Guard and Air National Guard units and personnel into the U.S. Army and (since 1947) the U.S. Air Force. The total number of National Guard members is about 430,000, while the estimated combined strength of SDFs is less than 10,000. There are about 18,000 police agencies in the United States, operating at every level from local to national. Law in the United States is mainly enforced by local police departments and sheriff departments in their municipal or county jurisdictions. State police departments have authority in their respective states, and federal agencies such as the Federal Bureau of Investigation (FBI) and the U.S. 
Marshals Service have national jurisdiction and specialized duties, such as protecting civil rights and national security, enforcing the rulings of U.S. federal courts and federal laws, and combating interstate criminal activity. State courts conduct almost all civil and criminal trials, while federal courts adjudicate the much smaller number of civil and criminal cases that relate to federal law. There is no unified "criminal justice system" in the United States. The American prison system is largely heterogeneous, with thousands of relatively independent systems operating across federal, state, local, and tribal levels. In 2025, "these systems hold nearly 2 million people in 1,566 state prisons, 98 federal prisons, 3,116 local jails, 1,277 juvenile correctional facilities, 133 immigration detention facilities, and 80 Indian country jails, as well as in military prisons, civil commitment centers, state psychiatric hospitals, and prisons in the U.S. territories." Despite disparate systems of confinement, four main institutions dominate: federal prisons, state prisons, local jails, and juvenile correctional facilities. Federal prisons are run by the Federal Bureau of Prisons and hold pretrial detainees as well as people who have been convicted of federal crimes. State prisons, run by the department of corrections of each state, hold people sentenced and serving prison time (usually longer than one year) for felony offenses. Local jails are county or municipal facilities that incarcerate defendants prior to trial; they also hold those serving short sentences (typically under a year). Juvenile correctional facilities are operated by local or state governments and serve as longer-term placements for any minor adjudicated as delinquent and ordered by a judge to be confined. 
In January 2023, the United States had the sixth-highest per capita incarceration rate in the world—531 people per 100,000 inhabitants—and the largest prison and jail population in the world, with more than 1.9 million people incarcerated. An analysis of the World Health Organization Mortality Database from 2010 showed U.S. homicide rates "were 7 times higher than in other high-income countries, driven by a gun homicide rate that was 25 times higher". Economy The U.S. has a highly developed mixed economy that has been the world's largest nominally since about 1890. Its 2024 gross domestic product (GDP)[e] of more than $29 trillion constituted over 25% of nominal global economic output, or 15% at purchasing power parity (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks first in the world by nominal GDP, second when adjusted for purchasing power parities (PPP), and ninth by PPP-adjusted GDP per capita. In February 2024, the total U.S. federal government debt was $34.4 trillion. Of the world's 500 largest companies by revenue, 138 were headquartered in the U.S. in 2025, the highest number of any country. The U.S. dollar is the currency most used in international transactions and the world's foremost reserve currency, backed by the country's dominant economy, its military, the petrodollar system, its large U.S. treasuries market, and its linked eurodollar. Several countries use it as their official currency, and in others it is the de facto currency. The U.S. has free trade agreements with several countries, including the USMCA. Although the United States has reached a post-industrial level of economic development and is often described as having a service economy, it remains a major industrial power; in 2024, the U.S. manufacturing sector was the world's second-largest by value output after China's. 
New York City is the world's principal financial center, and its metropolitan area is the world's largest metropolitan economy. The New York Stock Exchange and Nasdaq, both located in New York City, are the world's two largest stock exchanges by market capitalization and trade volume. The United States is at the forefront of technological advancement and innovation in many economic fields, especially in artificial intelligence; electronics and computers; pharmaceuticals; and medical, aerospace and military equipment. The country's economy is fueled by abundant natural resources, a well-developed infrastructure, and high productivity. The largest trading partners of the United States are the European Union, Mexico, Canada, China, Japan, South Korea, the United Kingdom, Vietnam, India, and Taiwan. The United States is the world's largest importer and second-largest exporter.[u] It is by far the world's largest exporter of services. Americans have the highest average household and employee income among OECD member states, and the fourth-highest median household income in 2023, up from sixth-highest in 2013. With personal consumption expenditures of over $18.5 trillion in 2023, the U.S. has a heavily consumer-driven economy and is the world's largest consumer market. The U.S. ranked first in the number of dollar billionaires and millionaires in 2023, with 735 billionaires and nearly 22 million millionaires. Wealth in the United States is highly concentrated; in 2011, the richest 10% of the adult population owned 72% of the country's household wealth, while the bottom 50% owned just 2%. U.S. wealth inequality increased substantially since the late 1980s, and income inequality in the U.S. reached a record high in 2019. In 2024, the country had some of the highest wealth and income inequality levels among OECD countries. Since the 1970s, there has been a decoupling of U.S. wage gains from worker productivity. 
In 2016, the top fifth of earners took home more than half of all income, giving the U.S. one of the widest income distributions among OECD countries. There were about 771,480 homeless persons in the U.S. in 2024. In 2022, 6.4 million children experienced food insecurity. Feeding America estimates that around one in five, or approximately 13 million, children experience hunger in the U.S. and do not know where or when they will get their next meal. Also in 2022, about 37.9 million people, or 11.5% of the U.S. population, were living in poverty. The United States has a smaller welfare state and redistributes less income through government action than most other high-income countries. It is the only advanced economy that does not guarantee its workers paid vacation nationally and one of a few countries in the world without federal paid family leave as a legal right. The United States has a higher percentage of low-income workers than almost any other developed country, largely because of a weak collective bargaining system and lack of government support for at-risk workers. The United States has been a leader in technological innovation since the late 19th century and scientific research since the mid-20th century. Methods for producing interchangeable parts and the establishment of a machine tool industry enabled the large-scale manufacturing of U.S. consumer products in the late 19th century. By the early 20th century, factory electrification, the introduction of the assembly line, and other labor-saving techniques created the system of mass production. In the 21st century, the United States continues to be one of the world's foremost scientific powers, though China has emerged as a major competitor in many fields. The U.S. has the highest research and development expenditures of any country and ranks ninth as a percentage of GDP. In 2022, the United States was (after China) the country with the second-highest number of published scientific papers. 
In 2021, the U.S. ranked second (also after China) by the number of patent applications, and third by trademark and industrial design applications (after China and Germany), according to World Intellectual Property Indicators. In 2025, the United States ranked third (after Switzerland and Sweden) in the Global Innovation Index. The United States is considered to be a world leader in the development of artificial intelligence technology. In 2023, the United States was ranked the second most technologically advanced country in the world (after South Korea) by Global Finance magazine. The United States has maintained a space program since the late 1950s, beginning with the establishment of the National Aeronautics and Space Administration (NASA) in 1958. NASA's Apollo program (1961–1972) achieved the first crewed Moon landing with the 1969 Apollo 11 mission; it remains one of the agency's most significant milestones. Other major endeavors by NASA include the Space Shuttle program (1981–2011), the Voyager program (1972–present), the Hubble and James Webb space telescopes (launched in 1990 and 2021, respectively), and the multi-mission Mars Exploration Program (Spirit and Opportunity, Curiosity, and Perseverance). NASA is one of five agencies collaborating on the International Space Station (ISS); U.S. contributions to the ISS include several modules, including Destiny (2001), Harmony (2007), and Tranquility (2010), as well as ongoing logistical and operational support. The United States private sector dominates the global commercial spaceflight industry. Prominent American spaceflight contractors include Blue Origin, Boeing, Lockheed Martin, Northrop Grumman, and SpaceX. NASA programs such as the Commercial Crew Program, Commercial Resupply Services, Commercial Lunar Payload Services, and NextSTEP have facilitated growing private-sector involvement in American spaceflight. 
In 2023, the United States received approximately 84% of its energy from fossil fuels; its largest source of energy was petroleum (38%), followed by natural gas (36%), renewable sources (9%), coal (9%), and nuclear power (9%). In 2022, the United States constituted about 4% of the world's population but consumed around 16% of the world's energy. The U.S. ranks as the second-highest emitter of greenhouse gases, behind China. The U.S. is the world's largest producer of nuclear power, generating around 30% of the world's nuclear electricity, and has the highest number of nuclear power reactors of any country. As of 2024, the U.S. plans to triple its nuclear power capacity by 2050. The United States' 4 million miles (6.4 million kilometers) of road network, owned almost entirely by state and local governments, is the longest in the world. The extensive Interstate Highway System that connects all major U.S. cities is funded mostly by the federal government but maintained by state departments of transportation. The system is further extended by state highways and some private toll roads. In 2022, the U.S. was among the top ten countries in vehicle ownership per capita, with 850 vehicles per 1,000 people. A 2022 study found that 76% of U.S. commuters drive alone and 14% ride a bicycle, including bike owners and users of bike-sharing networks. About 11% use some form of public transportation. Public transportation in the United States is well developed in the largest urban areas, notably New York City, Washington, D.C., Boston, Philadelphia, Chicago, and San Francisco; otherwise, coverage is generally less extensive than in most other developed countries. The U.S. also has many relatively car-dependent localities. Long-distance intercity travel is provided primarily by airlines, but travel by rail is more common along the Northeast Corridor, the only high-speed rail corridor in the U.S. that meets international standards. 
Amtrak, the country's government-sponsored national passenger rail company, has a relatively sparse network compared to that of Western European countries. Service is concentrated in the Northeast, California, the Midwest, the Pacific Northwest, and Virginia/Southeast. The United States has an extensive air transportation network. U.S. civilian airlines are all privately owned. The three largest airlines in the world, by total number of passengers carried, are U.S.-based; American Airlines became the global leader after its 2013 merger with US Airways. Of the 50 busiest airports in the world, 16 are in the United States, as well as five of the top 10. The world's busiest airport by passenger volume is Hartsfield–Jackson Atlanta International in Atlanta, Georgia. In 2022, most of the 19,969 U.S. airports were owned and operated by local government authorities, and there are also some private airports. Some 5,193 are designated as "public use", including for general aviation. The Transportation Security Administration (TSA) has provided security at most major airports since 2001. The country's rail transport network, the longest in the world at 182,412 mi (293,564 km), handles mostly freight (in contrast to more passenger-centered rail in Europe). Because they are often privately owned operations, U.S. railroads lag behind those of the rest of the world in terms of electrification. The country's inland waterways are the world's fifth-longest, totaling 25,482 mi (41,009 km). They are used extensively for freight, recreation, and a small amount of passenger traffic. Of the world's 50 busiest container ports, four are located in the United States, with the busiest in the country being the Port of Los Angeles.

Demographics

The U.S. Census Bureau reported 331,449,281 residents on April 1, 2020,[v] making the United States the third-most-populous country in the world, after India and China.
The Census Bureau's official 2025 population estimate was 341,784,857, an increase of 3.1% since the 2020 census. According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S. population had a net gain of one person every 16 seconds, or about 5,400 people per day. In 2023, 51% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 34% had never been married. In 2023, the total fertility rate for the U.S. stood at 1.6 children per woman, and, at 23%, it had the world's highest rate of children living in single-parent households in 2019. Most Americans live in the suburbs of major metropolitan areas. The United States has a diverse population; 37 ancestry groups have more than one million members. White Americans with ancestry from Europe, the Middle East, or North Africa form the largest racial and ethnic group at 57.8% of the United States population. Hispanic and Latino Americans form the second-largest group and are 18.7% of the United States population. African Americans constitute the country's third-largest ancestry group and are 12.1% of the total U.S. population. Asian Americans are the country's fourth-largest group, composing 5.9% of the United States population. The country's 3.7 million Native Americans account for about 1%, and some 574 native tribes are recognized by the federal government. In 2024, the median age of the United States population was 39.1 years. While many languages and dialects are spoken in the United States, English is by far the most commonly spoken and written. English is the de facto official language of the United States, and in 2025 Executive Order 14224 declared it the country's official language. However, the U.S. has never had a de jure official language, as Congress has never passed a law designating English as official for all three federal branches. Some laws, such as U.S. naturalization requirements, nonetheless standardize English.
Twenty-eight states and the United States Virgin Islands have laws that designate English as the sole official language; 19 states and the District of Columbia have no official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English: Hawaii (Hawaiian), Alaska (twenty Native languages),[w] South Dakota (Sioux), American Samoa (Samoan), Puerto Rico (Spanish), Guam (Chamorro), and the Northern Mariana Islands (Carolinian and Chamorro). In total, 169 Native American languages are spoken in the United States. In Puerto Rico, Spanish is more widely spoken than English. According to the American Community Survey (2020), some 245.4 million people in the U.S. age five and older spoke only English at home. About 41.2 million spoke Spanish at home, making it the second most commonly used language. Other languages spoken at home by one million people or more include Chinese (3.40 million), Tagalog (1.71 million), Vietnamese (1.52 million), Arabic (1.39 million), French (1.18 million), Korean (1.07 million), and Russian (1.04 million). German, spoken by 1 million people at home in 2010, fell to 857,000 total speakers in 2020. America's immigrant population is by far the world's largest in absolute terms. In 2022, there were 87.7 million immigrants and U.S.-born children of immigrants in the United States, accounting for nearly 27% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 6% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants. In 2019, the top countries of origin for immigrants were Mexico (24% of immigrants), India (6%), China (5%), the Philippines (4.5%), and El Salvador (3%). In fiscal year 2022, over one million immigrants (most of whom entered through family reunification) were granted legal residence. 
The undocumented immigrant population in the U.S. reached a record high of 14 million in 2023. The First Amendment guarantees the free exercise of religion in the country and forbids Congress from passing laws respecting its establishment. Religious practice is widespread, among the most diverse in the world, and profoundly vibrant. The country has the world's largest Christian population, which includes the fourth-largest population of Catholics. Other notable faiths include Judaism, Buddhism, Hinduism, Islam, New Age, and Native American religions. Religious practice varies significantly by region. "Ceremonial deism" is common in American culture. The overwhelming majority of Americans believe in a higher power or spiritual force, engage in spiritual practices such as prayer, and consider themselves religious or spiritual. In the Southern United States' "Bible Belt", evangelical Protestantism plays a significant role culturally; New England and the Western United States tend to be more secular. Mormonism, a Restorationist movement founded in the U.S. in 1830, is the predominant religion in Utah and a major religion in Idaho. About 82% of Americans live in metropolitan areas, particularly in suburbs; about half of those reside in cities with populations over 50,000. In 2022, 333 incorporated municipalities had populations over 100,000, nine cities had more than one million residents, and four cities—New York City, Los Angeles, Chicago, and Houston—had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West. According to the Centers for Disease Control and Prevention (CDC), average U.S. life expectancy at birth reached 79.0 years in 2024, its highest recorded level. This was an increase of 0.6 years over 2023.
The CDC attributed the improvement to a significant fall in the number of fatal drug overdoses in the country, noting that "heart disease continues to be the leading cause of death in the United States, followed by cancer and unintentional injuries." In 2024, life expectancy at birth for American men rose to 76.5 years (+0.7 years compared to 2023), while life expectancy for women was 81.4 years (+0.3 years). Starting in 1998, life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been increasing ever since. The Commonwealth Fund reported in 2020 that the U.S. had the highest suicide rate among high-income countries. Approximately one-third of the U.S. adult population is obese and another third is overweight. The U.S. healthcare system far outspends that of any other country, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes when compared to peer countries for reasons that are debated. The United States is the only developed country without a system of universal healthcare, and a significant proportion of its population does not carry health insurance. Government-funded healthcare coverage for the poor (Medicaid) and for those age 65 and older (Medicare) is available to Americans who meet the programs' income or age qualifications. In 2010, then-President Obama signed the Patient Protection and Affordable Care Act into law.[x] Abortion in the United States is not federally protected, and is illegal or restricted in 17 states. American primary and secondary education, known in the U.S. as K–12 ("kindergarten through 12th grade"), is decentralized. School systems are operated by state, territorial, and sometimes municipal governments and regulated by the U.S. Department of Education.
In general, children are required to attend school or an approved homeschool from the age of five or six (kindergarten or first grade) until they are 18 years old. This often brings students through the 12th grade, the final year of a U.S. high school, but some states and territories allow them to leave school earlier, at age 16 or 17. The U.S. spends more on education per student than any other country, an average of $18,614 per year per public elementary and secondary school student in 2020–2021. Among Americans age 25 and older, 92.2% graduated from high school, 62.7% attended some college, 37.7% earned a bachelor's degree, and 14.2% earned a graduate degree. The U.S. literacy rate is near-universal. The U.S. has produced the most Nobel Prize winners of any country, with 411 (having won 413 awards). U.S. tertiary or higher education has earned a global reputation. Many of the world's top universities, as listed by various ranking organizations, are in the United States, including 19 of the top 25. American higher education is dominated by state university systems, although the country's many private universities and colleges enroll about 20% of all American students. Local community colleges generally offer open admissions, lower tuition, and coursework leading to a two-year associate degree or a non-degree certificate. As for public expenditures on higher education, the U.S. spends more per student than the OECD average, and Americans spend more than any other nation in combined public and private spending. Colleges and universities directly funded by the federal government, including the U.S. service academies, the Naval Postgraduate School, and military staff colleges, do not charge tuition and are limited to military personnel and government employees. Despite some student loan forgiveness programs in place, student loan debt increased by 102% between 2010 and 2020, and exceeded $1.7 trillion in 2022.
Culture and society

The United States is home to a wide variety of ethnic groups, traditions, and customs. The country has been described as having the values of individualism and personal autonomy, as well as a strong work ethic and competitiveness. Voluntary altruism towards others also plays a major role; according to a 2016 study by the Charities Aid Foundation, Americans donated 1.44% of total GDP to charity—the highest rate in the world by a large margin. Americans have traditionally been characterized by a unifying political belief in an "American Creed" emphasizing consent of the governed, liberty, equality under the law, democracy, social equality, property rights, and a preference for limited government. The U.S. has acquired significant hard and soft power through its diplomatic influence, economic power, military alliances, and cultural exports such as American movies, music, video games, sports, and food. The influence that the United States exerts on other countries through soft power is referred to as Americanization. Nearly all present Americans or their ancestors came from Europe, Africa, or Asia (the "Old World") within the past five centuries. Mainstream American culture is a Western culture largely derived from the traditions of European immigrants with influences from many other sources, such as traditions brought by slaves from Africa. More recent immigration from Asia and especially Latin America has added to a cultural mix that has been described as a homogenizing melting pot, and a heterogeneous salad bowl, with immigrants contributing to, and often assimilating into, mainstream American culture. Under the First Amendment to the Constitution, the United States is considered to have the strongest protections of free speech of any country. Flag desecration, hate speech, blasphemy, and lese majesty are all forms of protected expression.
A 2016 Pew Research Center poll found that Americans were the most supportive of free expression of any polity measured. Additionally, they are the "most supportive of freedom of the press and the right to use the Internet without government censorship". The U.S. is a socially progressive country with permissive attitudes surrounding human sexuality. LGBTQ rights in the United States are among the most advanced by global standards. The American Dream, or the perception that Americans enjoy high levels of social mobility, plays a key role in attracting immigrants. Whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a classless society, scholars identify significant differences between the country's social classes, affecting socialization, language, and values. Americans tend to greatly value socioeconomic achievement, but being ordinary or average is promoted by some as a noble condition as well. The National Foundation on the Arts and the Humanities is an agency of the United States federal government that was established in 1965 with the purpose to "develop and promote a broadly conceived national policy of support for the humanities and the arts in the United States, and for institutions which preserve the cultural heritage of the United States." It is composed of four sub-agencies.

Colonial American authors were influenced by John Locke and other Enlightenment philosophers. The American Revolutionary Period (1765–1783) is notable for the political writings of Benjamin Franklin, Alexander Hamilton, Thomas Paine, and Thomas Jefferson. Shortly before and after the Revolutionary War, the newspaper rose to prominence, filling a demand for anti-British national literature. An early novel is William Hill Brown's The Power of Sympathy, published in 1789.
Writer and critic John Neal in the early- to mid-19th century helped advance America toward a unique literature and culture by criticizing predecessors such as Washington Irving for imitating their British counterparts, and by influencing writers such as Edgar Allan Poe, who took American poetry and short fiction in new directions. Ralph Waldo Emerson and Margaret Fuller pioneered the influential Transcendentalism movement; Henry David Thoreau, author of Walden, was influenced by this movement. The conflict surrounding abolitionism inspired writers, like Harriet Beecher Stowe, and authors of slave narratives, such as Frederick Douglass. Nathaniel Hawthorne's The Scarlet Letter (1850) explored the dark side of American history, as did Herman Melville's Moby-Dick (1851). Major American poets of the 19th century American Renaissance include Walt Whitman, Melville, and Emily Dickinson. Mark Twain was the first major American writer to be born in the West. Henry James achieved international recognition with novels like The Portrait of a Lady (1881). As literacy rates rose, periodicals published more stories centered around industrial workers, women, and the rural poor. Naturalism, regionalism, and realism were the major literary movements of the period. While modernism generally took on an international character, modernist authors working within the United States more often rooted their work in specific regions, peoples, and cultures. Following the Great Migration to northern cities, African-American and black West Indian authors of the Harlem Renaissance developed an independent tradition of literature that rebuked a history of inequality and celebrated black culture. An important cultural export during the Jazz Age, these writings were a key influence on Négritude, a philosophy emerging in the 1930s among francophone writers of the African diaspora. 
In the 1950s, an ideal of homogeneity led many authors to attempt to write the Great American Novel, while the Beat Generation rejected this conformity, using styles that elevated the impact of the spoken word over mechanics to describe drug use, sexuality, and the failings of society. Contemporary literature is more pluralistic than in previous eras, with the closest thing to a unifying feature being a trend toward self-conscious experiments with language. Twelve American laureates have won the Nobel Prize in Literature. Media in the United States is broadly uncensored, with the First Amendment providing significant protections, as reiterated in New York Times Co. v. United States. The four major broadcasters in the U.S. are the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), American Broadcasting Company (ABC), and Fox Broadcasting Company (Fox). The four major broadcast television networks are all commercial entities. The U.S. cable television system offers hundreds of channels catering to a variety of niches. In 2021, about 83% of Americans over age 12 listened to broadcast radio, while about 40% listened to podcasts. In the prior year, there were 15,460 licensed full-power radio stations in the U.S. according to the Federal Communications Commission (FCC). Much of the public radio broadcasting is supplied by National Public Radio (NPR), incorporated in February 1970 under the Public Broadcasting Act of 1967. U.S. newspapers with a global reach and reputation include The Wall Street Journal, The New York Times, The Washington Post, and USA Today. About 800 publications are produced in Spanish. With few exceptions, newspapers are privately owned, either by large chains such as Gannett or McClatchy, which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in an increasingly rare situation, by individuals or families. 
Major cities often have alternative newspapers to complement the mainstream daily papers, such as The Village Voice in New York City and LA Weekly in Los Angeles. The five most-visited websites in the world are Google, YouTube, Facebook, Instagram, and ChatGPT—all of them American-owned. Other popular platforms used include X (formerly Twitter) and Amazon. In 2025, the U.S. was the world's second-largest video game market by revenue (after China). In 2015, the U.S. video game industry consisted of 2,457 companies that supported around 220,000 jobs and generated $30.4 billion in revenue. There are 444 game publishers, developers, and hardware companies in California alone. According to the Game Developers Conference (GDC), the U.S. is the top location for video game development, with 58% of the world's game developers based there in 2025. The United States is well known for its theater. Mainstream theater in the United States derives from the old European theatrical tradition and has been heavily influenced by the British theater. By the middle of the 19th century, America had created new distinct dramatic forms in the Tom Shows, the showboat theater and the minstrel show. The central hub of the American theater scene is the Theater District in Manhattan, with its divisions of Broadway, off-Broadway, and off-off-Broadway. Many movie and television celebrities have gotten their big break working in New York productions. Outside New York City, many cities have professional regional or resident theater companies that produce their own seasons. The biggest-budget theatrical productions are musicals. U.S. theater has an active community theater culture. The Tony Awards recognize excellence in live Broadway theater and are presented at an annual ceremony in Manhattan. The awards are given for Broadway productions and performances. One is also given for regional theater.
Several discretionary non-competitive awards are given as well, including a Special Tony Award, the Tony Honors for Excellence in Theatre, and the Isabelle Stevenson Award. Folk art in colonial America grew out of artisanal craftsmanship in communities that allowed commonly trained people to individually express themselves. It was distinct from Europe's tradition of high art, which was less accessible and generally less relevant to early American settlers. Cultural movements in art and craftsmanship in colonial America generally lagged behind those of Western Europe. For example, the prevailing medieval style of woodworking and primitive sculpture became integral to early American folk art, despite the emergence of Renaissance styles in England in the late 16th and early 17th centuries. The new English styles would have been early enough to make a considerable impact on American folk art, but American styles and forms had already been firmly adopted. Not only did styles change slowly in early America, but there was a tendency for rural artisans there to continue their traditional forms longer than their urban counterparts did—and far longer than those in Western Europe. The Hudson River School was a mid-19th-century movement in the visual arts tradition of European naturalism. The 1913 Armory Show in New York City, an exhibition of European modernist art, shocked the public and transformed the U.S. art scene. American Realism and American Regionalism sought to reflect and give America new ways of looking at itself. Georgia O'Keeffe, Marsden Hartley, and others experimented with new and individualistic styles, which would become known as American modernism. Major artistic movements such as the abstract expressionism of Jackson Pollock and Willem de Kooning and the pop art of Andy Warhol and Roy Lichtenstein developed largely in the United States. 
Major photographers include Alfred Stieglitz, Edward Steichen, Dorothea Lange, Edward Weston, James Van Der Zee, Ansel Adams, and Gordon Parks. The tide of modernism and then postmodernism has brought global fame to American architects, including Frank Lloyd Wright, Philip Johnson, and Frank Gehry. The Metropolitan Museum of Art in Manhattan is the largest art museum in the United States and the fourth-largest in the world. American folk music encompasses numerous music genres, variously known as traditional music, traditional folk music, contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and sometimes trace back to such origins as the British Isles, mainland Europe, or Africa. The rhythmic and lyrical styles of African-American music in particular have influenced American music. Banjos were brought to America through the slave trade. Minstrel shows incorporating the instrument into their acts led to its increased popularity and widespread production in the 19th century. The electric guitar, first invented in the 1930s and mass-produced by the 1940s, had an enormous influence on popular music, in particular due to the development of rock and roll. The synthesizer, turntablism, and electronic music were also largely developed in the U.S. Elements from folk idioms such as the blues and old-time music were adopted and transformed into popular genres with global audiences. Jazz grew from blues and ragtime in the early 20th century, developing from the innovations and recordings of composers such as W.C. Handy and Jelly Roll Morton. Louis Armstrong and Duke Ellington increased its popularity early in the 20th century. Country music developed in the 1920s, bluegrass and rhythm and blues in the 1940s, and rock and roll in the 1950s. In the 1960s, Bob Dylan emerged from the folk revival to become one of the country's most celebrated songwriters.
The musical forms of punk and hip hop both originated in the United States in the 1970s. The United States has the world's largest music market, with a total retail value of $15.9 billion in 2022. Most of the world's major record companies are based in the U.S.; they are represented by the Recording Industry Association of America (RIAA). Mid-20th-century American pop stars, such as Frank Sinatra and Elvis Presley, became global celebrities and best-selling music artists, as have artists of the late 20th century, such as Michael Jackson, Madonna, Whitney Houston, and Mariah Carey, and of the early 21st century, such as Eminem, Britney Spears, Lady Gaga, Katy Perry, Taylor Swift and Beyoncé. The United States has the world's largest apparel market by revenue. Apart from professional business attire, American fashion is eclectic and predominantly informal. Americans' diverse cultural roots are reflected in their clothing; however, sneakers, jeans, T-shirts, and baseball caps are emblematic of American styles. New York, with its Fashion Week, is considered to be one of the "Big Four" global fashion capitals, along with Paris, Milan, and London. A study demonstrated that general proximity to Manhattan's Garment District has been synonymous with American fashion since its inception in the early 20th century. A number of well-known designer labels, among them Tommy Hilfiger, Ralph Lauren, Tom Ford and Calvin Klein, are headquartered in Manhattan. Labels cater to niche markets, such as preteens. New York Fashion Week is one of the most influential fashion shows in the world, and is held twice each year in Manhattan; the annual Met Gala, also in Manhattan, has been called the fashion world's "biggest night". The U.S. film industry has a worldwide influence and following. Hollywood, a district in central Los Angeles, the nation's second-most populous city, is also metonymous for the American filmmaking industry. 
The major film studios of the United States are the primary source of the most commercially successful and most ticket-selling movies in the world. Largely centered in the New York City region from its beginnings in the late 19th century through the first decades of the 20th century, the U.S. film industry has since been primarily based in and around Hollywood. Nonetheless, American film companies have been subject to the forces of globalization in the 21st century, and an increasing number of films are made elsewhere. The Academy Awards, popularly known as "the Oscars", have been held annually by the Academy of Motion Picture Arts and Sciences since 1929, and the Golden Globe Awards have been held annually since January 1944. The industry peaked in what is commonly referred to as the "Golden Age of Hollywood", from the early sound period until the early 1960s, with screen actors such as John Wayne and Marilyn Monroe becoming iconic figures. In the 1970s, "New Hollywood", or the "Hollywood Renaissance", was defined by grittier films influenced by French and Italian realist pictures of the post-war period. The 21st century has been marked by the rise of American streaming platforms, which came to rival traditional cinema. Early settlers were introduced by Native Americans to foods such as turkey, sweet potatoes, corn, squash, and maple syrup. Among the most enduring and pervasive examples are variations of the native dish called succotash. Early settlers and later immigrants combined these with foods they were familiar with, such as wheat flour, beef, and milk, to create a distinctive American cuisine. New World crops, especially pumpkin, corn, potatoes, and turkey as the main course are part of a shared national menu on Thanksgiving, when many Americans prepare or purchase traditional dishes to celebrate the occasion.
Characteristic American dishes such as apple pie, fried chicken, doughnuts, french fries, macaroni and cheese, ice cream, hamburgers, hot dogs, and American pizza derive from the recipes of various immigrant groups. Mexican dishes such as burritos and tacos preexisted the United States in areas later annexed from Mexico, and adaptations of Chinese cuisine as well as pasta dishes freely adapted from Italian sources are all widely consumed. American chefs have had a significant impact on society both domestically and internationally. In 1946, the Culinary Institute of America was founded by Katharine Angell and Frances Roth. This would become the United States' most prestigious culinary school, where many of the most talented American chefs would study prior to successful careers. The United States restaurant industry was projected at $899 billion in sales for 2020, and employed more than 15 million people, representing 10% of the nation's workforce directly. It is the country's second-largest private employer and the third-largest employer overall. The United States is home to over 220 Michelin star-rated restaurants, 70 of which are in New York City. Wine has been produced in what is now the United States since the 1500s, with the first widespread production beginning in what is now New Mexico in 1628. In the modern U.S., wine production is undertaken in all fifty states, with California producing 84 percent of all U.S. wine. With more than 1,100,000 acres (4,500 km2) under vine, the United States is the fourth-largest wine-producing country in the world, after Italy, Spain, and France. The classic American diner, a casual restaurant type originally intended for the working class, emerged during the 19th century from converted railroad dining cars made stationary. The diner soon evolved into purpose-built structures whose number expanded greatly in the 20th century. The American fast-food industry developed alongside the nation's car culture. 
American restaurants developed the drive-in format in the 1920s, which they began to replace with the drive-through format by the 1940s. American fast-food restaurant chains, such as McDonald's, Burger King, Chick-fil-A, Kentucky Fried Chicken, Dunkin' Donuts and many others, have numerous outlets around the world. The most popular spectator sports in the U.S. are American football, basketball, baseball, soccer, and ice hockey. Their premier leagues are, respectively, the National Football League, the National Basketball Association, Major League Baseball, Major League Soccer, and the National Hockey League. All these leagues enjoy wide-ranging domestic media coverage and, except for the MLS, all are considered the preeminent leagues in their respective sports in the world. While most major U.S. sports such as baseball and American football have evolved out of European practices, basketball, volleyball, skateboarding, and snowboarding are American inventions, many of which have become popular worldwide. Lacrosse and surfing arose from Native American and Native Hawaiian activities that predate European contact. The market for professional sports in the United States was approximately $69 billion in July 2013, roughly 50% larger than that of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States. Although American football does not have a substantial following in other nations, the NFL does have the highest average attendance (67,254) of any professional sports league in the world. In 2024, the NFL generated over $23 billion in revenue, making it the most valuable professional sports league in the United States and the world. Baseball has been regarded as the U.S. "national sport" since the late 19th century. The most-watched individual sports in the U.S. are golf and auto racing, particularly NASCAR and IndyCar.
On the collegiate level, earnings for the member institutions exceed $1 billion annually, and college football and basketball attract large audiences, as the NCAA March Madness tournament and the College Football Playoff are some of the most watched national sporting events. In the U.S., the intercollegiate sports level serves as the main feeder system for professional and Olympic sports, with significant exceptions such as Minor League Baseball. This differs greatly from practices in nearly all other countries, where publicly and privately funded sports organizations serve this function. Eight Olympic Games have taken place in the United States. The 1904 Summer Olympics in St. Louis, Missouri, were the first-ever Olympic Games held outside of Europe. The Olympic Games will be held in the U.S. for a ninth time when Los Angeles hosts the 2028 Summer Olympics. U.S. athletes have won a total of 2,968 medals (1,179 gold) at the Olympic Games, the most of any country. In other international competition, the United States is the home of a number of prestigious events, including the America's Cup, World Baseball Classic, the U.S. Open, and the Masters Tournament. The U.S. men's national soccer team has qualified for eleven World Cups, while the women's national team has won the FIFA Women's World Cup and Olympic soccer tournament four and five times, respectively. The 1999 FIFA Women's World Cup was hosted by the United States. Its final match was attended by 90,185, setting the world record for largest women's sporting event crowd at the time. The United States hosted the 1994 FIFA World Cup and will co-host, along with Canada and Mexico, the 2026 FIFA World Cup. This article incorporates text from a free content work. Licensed under CC BY-SA IGO 3.0 (license statement/permission). Text taken from World Food and Agriculture – Statistical Yearbook 2023, FAO. 
========================================
[SOURCE: https://en.wikipedia.org/wiki/OpenAI#cite_ref-156] | [TOKENS: 8773]
Contents OpenAI OpenAI is an American artificial intelligence research organization comprising both a non-profit foundation and a controlled for-profit public benefit corporation (PBC), headquartered in San Francisco. It aims to develop "safe and beneficial" artificial general intelligence (AGI), which it defines as "highly autonomous systems that outperform humans at most economically valuable work". OpenAI is widely recognized for its development of the GPT family of large language models, the DALL-E series of text-to-image models, and the Sora series of text-to-video models, which have influenced industry research and commercial applications. Its release of ChatGPT in November 2022 has been credited with catalyzing widespread interest in generative AI. The organization was founded in 2015 in Delaware but evolved a complex corporate structure. As of October 2025, following restructuring approved by California and Delaware regulators, the non-profit OpenAI Foundation holds 26% of the for-profit OpenAI Group PBC, with Microsoft holding 27% and employees/other investors holding 47%. Under its governance arrangements, the OpenAI Foundation holds the authority to appoint the board of the for-profit OpenAI Group PBC, a mechanism designed to align the entity’s strategic direction with the Foundation’s charter. Microsoft previously invested over $13 billion into OpenAI, and provides Azure cloud computing resources. In October 2025, OpenAI conducted a $6.6 billion share sale that valued the company at $500 billion. In 2023 and 2024, OpenAI faced multiple lawsuits for alleged copyright infringement against authors and media companies whose work was used to train some of OpenAI's products. In November 2023, OpenAI's board removed Sam Altman as CEO, citing a lack of confidence in him, but reinstated him five days later following a reconstruction of the board. 
Throughout 2024, roughly half of then-employed AI safety researchers left OpenAI, citing the company's prominent role in an industry-wide problem. Founding In December 2015, OpenAI was founded as a not-for-profit organization by Sam Altman, Elon Musk, Ilya Sutskever, Greg Brockman, Trevor Blackwell, Vicki Cheung, Andrej Karpathy, Durk Kingma, John Schulman, Pamela Vagata, and Wojciech Zaremba, with Sam Altman and Elon Musk as the co-chairs. A total of $1 billion in capital was pledged by Sam Altman, Greg Brockman, Elon Musk, Reid Hoffman, Jessica Livingston, Peter Thiel, Amazon Web Services (AWS), and Infosys. However, the capital actually collected lagged significantly behind the pledges. According to company disclosures, only $130 million had been received by 2019. In its founding charter, OpenAI stated an intention to collaborate openly with other institutions by making certain patents and research publicly available, but later restricted access to its most capable models, citing competitive and safety concerns. OpenAI was initially run from Brockman's living room. It was later headquartered at the Pioneer Building in the Mission District, San Francisco. According to OpenAI's charter, its founding mission is "to ensure that artificial general intelligence (AGI)—by which we mean highly autonomous systems that outperform humans at most economically valuable work—benefits all of humanity." Musk and Altman stated in 2015 that they were partly motivated by concerns about AI safety and existential risk from artificial general intelligence. OpenAI stated that "it's hard to fathom how much human-level AI could benefit society", and that it is equally difficult to comprehend "how much it could damage society if built or used incorrectly". 
The startup also wrote that AI "should be an extension of individual human wills and, in the spirit of liberty, as broadly and evenly distributed as possible", and that "because of AI's surprising history, it's hard to predict when human-level AI might come within reach. When it does, it'll be important to have a leading research institution which can prioritize a good outcome for all over its own self-interest." Co-chair Sam Altman expected a decades-long project that would eventually surpass human intelligence. Brockman met with Yoshua Bengio, one of the "founding fathers" of deep learning, and drew up a list of great AI researchers. Brockman was able to hire nine of them as the first employees in December 2015. OpenAI did not pay AI researchers salaries comparable to those of Facebook or Google. Nor did it offer the stock options that AI researchers typically receive. Nevertheless, OpenAI spent $7 million on its first 52 employees in 2016. OpenAI's potential and mission drew these researchers to the firm; a Google employee said he was willing to leave Google for OpenAI "partly because of the very strong group of people and, to a very large extent, because of its mission." OpenAI co-founder Wojciech Zaremba stated that he turned down "borderline crazy" offers of two to three times his market value to join OpenAI instead. In April 2016, OpenAI released a public beta of "OpenAI Gym", its platform for reinforcement learning research. Nvidia gifted its first DGX-1 supercomputer to OpenAI in August 2016 to help it train larger and more complex AI models, reducing processing time for some workloads from six days to two hours. In December 2016, OpenAI released "Universe", a software platform for measuring and training an AI's general intelligence across the world's supply of games, websites, and other applications. Corporate structure In 2019, OpenAI transitioned from non-profit to "capped" for-profit, with the profit being capped at 100 times any investment. 
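The "100x" cap described above can be sketched as simple arithmetic. The following is an illustrative model only; the function name and the dollar figures are hypothetical and do not reflect OpenAI's actual investment terms:

```python
def capped_return(investment: float, gross_return: float, cap_multiple: float = 100.0) -> float:
    """Illustrative payout under a capped-profit structure: an investor's
    return is limited to cap_multiple times the original investment, with
    any value beyond the cap flowing back to the controlling nonprofit."""
    return min(gross_return, cap_multiple * investment)

# Under a 100x cap, a $10 million investment pays out at most $1 billion,
# even if the stake's gross value grows to $5 billion.
print(capped_return(10_000_000, 5_000_000_000))  # 1000000000.0
```

In this sketch, returns below the cap are paid in full; only the excess above the multiple is forgone.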
According to OpenAI, the capped-profit model allows OpenAI Global, LLC to legally attract investment from venture funds and, in addition, to grant employees stakes in the company. Many top researchers work for Google Brain, DeepMind, or Facebook, which offer equity that a nonprofit would be unable to match. Before the transition, OpenAI was legally required to publicly disclose the compensation of its top employees. The company then distributed equity to its employees and partnered with Microsoft, announcing an investment package of $1 billion into the company. Since then, OpenAI systems have run on an Azure-based supercomputing platform from Microsoft. OpenAI Global, LLC then announced its intention to commercially license its technologies. It planned to spend $1 billion "within five years, and possibly much faster". Altman stated that even a billion dollars may turn out to be insufficient, and that the lab may ultimately need "more capital than any non-profit has ever raised" to achieve artificial general intelligence. The nonprofit, OpenAI, Inc., is the sole controlling shareholder of OpenAI Global, LLC, which, despite being a for-profit company, retains a formal fiduciary responsibility to OpenAI, Inc.'s nonprofit charter. A majority of OpenAI, Inc.'s board is barred from having financial stakes in OpenAI Global, LLC. In addition, minority members with a stake in OpenAI Global, LLC are barred from certain votes due to conflict of interest. Some researchers have argued that OpenAI Global, LLC's switch to for-profit status is inconsistent with OpenAI's claims to be "democratizing" AI. On February 29, 2024, Elon Musk filed a lawsuit against OpenAI and CEO Sam Altman, accusing them of shifting focus from public benefit to profit maximization—a case OpenAI dismissed as "incoherent" and "frivolous," though Musk later revived legal action against Altman and others in August. 
On April 9, 2025, OpenAI countersued Musk in federal court, alleging that he had engaged in "bad-faith tactics" to slow the company's progress and seize its innovations for his personal benefit. OpenAI also argued that Musk had previously supported the creation of a for-profit structure and had expressed interest in controlling OpenAI himself. The countersuit seeks damages and legal measures to prevent further alleged interference. On February 10, 2025, a consortium of investors led by Elon Musk submitted a $97.4 billion unsolicited bid to buy the nonprofit that controls OpenAI, declaring its willingness to match or exceed any rival bid. The offer was rejected on February 14, 2025, with OpenAI stating that it was not for sale, but the bid complicated Altman's restructuring plan by setting a benchmark for how much the nonprofit's controlling stake should be valued. OpenAI, Inc. was originally designed as a nonprofit in order to ensure that AGI "benefits all of humanity" rather than "the private gain of any person". In 2019, it created OpenAI Global, LLC, a capped-profit subsidiary controlled by the nonprofit. In December 2024, OpenAI proposed a restructuring plan to convert the capped-profit into a Delaware-based public benefit corporation (PBC), and to release it from the control of the nonprofit. The nonprofit would sell its control and other assets, getting equity in return, and would use it to fund and pursue separate charitable projects, including in science and education. OpenAI's leadership described the change as necessary to secure additional investments, and claimed that the nonprofit's founding mission to ensure AGI "benefits all of humanity" would be better fulfilled. The plan has been criticized by former employees. A legal letter named "Not For Private Gain" asked the attorneys general of California and Delaware to intervene, stating that the restructuring is illegal and would remove governance safeguards from the nonprofit and the attorneys general. 
The letter argues that OpenAI's complex structure was deliberately designed to remain accountable to its mission, without the conflicting pressure of maximizing profits. It contends that the nonprofit is best positioned to advance its mission of ensuring AGI benefits all of humanity by continuing to control OpenAI Global, LLC, whatever amount of equity it might receive in exchange. PBCs can choose how they balance their mission with profit-making. Controlling shareholders have a large influence on how closely a PBC sticks to its mission. On October 28, 2025, OpenAI announced that it had adopted the new PBC corporate structure after receiving approval from the attorneys general of California and Delaware. Under the new structure, OpenAI's for-profit branch became a public benefit corporation known as OpenAI Group PBC, while the non-profit was renamed the OpenAI Foundation. The OpenAI Foundation holds a 26% stake in the PBC, while Microsoft holds a 27% stake and the remaining 47% is owned by employees and other investors. All members of the OpenAI Group PBC board of directors will be appointed by the OpenAI Foundation, which can remove them at any time. Members of the Foundation's board will also serve on the for-profit board. The new structure allows the for-profit PBC to raise investor funds like most traditional tech companies, including through an initial public offering, which Altman claimed was the most likely path forward. In January 2023, OpenAI Global, LLC was in talks for funding that would value the company at $29 billion, double its 2021 value. On January 23, 2023, Microsoft announced a new US$10 billion investment in OpenAI Global, LLC over multiple years, needed in part to fund OpenAI's use of Microsoft's cloud-computing service Azure. From September to December 2023, Microsoft rebranded all variants of its Copilot to Microsoft Copilot, added Copilot to many installations of Windows, and released Microsoft Copilot mobile apps. 
Following OpenAI's 2025 restructuring, Microsoft owns a 27% stake in the for-profit OpenAI Group PBC, valued at $135 billion. In a deal announced the same day, OpenAI agreed to purchase $250 billion of Azure services, with Microsoft ceding its right of first refusal over OpenAI's future cloud computing purchases. As part of the deal, OpenAI will continue to share 20% of its revenue with Microsoft until it achieves AGI, which must now be verified by an independent panel of experts. The deal also loosened restrictions on both companies working with third parties, allowing Microsoft to pursue AGI independently and allowing OpenAI to develop products with other companies. In 2017, OpenAI spent $7.9 million, a quarter of its functional expenses, on cloud computing alone. In comparison, DeepMind's total expenses in 2017 were $442 million. In the summer of 2018, training OpenAI's Dota 2 bots required renting 128,000 CPUs and 256 GPUs from Google for multiple weeks. In October 2024, OpenAI completed a $6.6 billion capital raise at a $157 billion valuation, including investments from Microsoft, Nvidia, and SoftBank. On January 21, 2025, Donald Trump announced The Stargate Project, a joint venture between OpenAI, Oracle, SoftBank and MGX to build an AI infrastructure system in conjunction with the US government. The project takes its name from OpenAI's existing "Stargate" supercomputer project and is estimated to cost $500 billion. The partners planned to fund the project over the next four years. In July 2025, the United States Department of Defense announced that OpenAI had received a $200 million contract for AI in the military, along with Anthropic, Google, and xAI. In the same month, the company made a deal with the UK Government to use ChatGPT and other AI tools in public services. OpenAI subsequently began a $50 million fund to support nonprofit and community organizations. 
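The stake figures reported above are mutually consistent: a 27% stake valued at $135 billion implies the same overall valuation as the company's October 2025 share sale. A quick arithmetic cross-check (illustrative only; variable names are ours, not from any source):

```python
# Cross-check of reported figures: Microsoft's 27% stake, valued at
# $135 billion, implies a total company valuation of $500 billion,
# matching the valuation reported for the October 2025 share sale.
stake_value_bn = 135   # reported value of Microsoft's stake, in $bn
stake_percent = 27     # reported ownership share, in percent
implied_valuation_bn = stake_value_bn * 100 / stake_percent
print(implied_valuation_bn)  # 500.0
```

The same check works for the other holders: 26% (Foundation) and 47% (employees and other investors) of the implied $500 billion complete the 100%.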
In April 2025, OpenAI raised $40 billion at a $300 billion post-money valuation, the highest-value private technology deal in history. The financing round was led by SoftBank, with other participants including Microsoft, Coatue, Altimeter and Thrive. In July 2025, the company reported annualized revenue of $12 billion, up from $3.7 billion in 2024, driven by ChatGPT subscriptions, which reached 20 million paid subscribers by April 2025 (up from 15.5 million at the end of 2024), alongside a rapidly expanding enterprise customer base that grew to five million business users. The company's cash burn remains high because of the intensive computational costs required to train and operate large language models. It projects an $8 billion operating loss in 2025. OpenAI reports revised long-term spending projections totaling approximately $115 billion through 2029, with annual expenditures projected to escalate significantly, reaching $17 billion in 2026, $35 billion in 2027, and $45 billion in 2028. These expenditures are primarily allocated toward expanding compute infrastructure, developing proprietary AI chips, constructing data centers, and funding intensive model training programs, with more than half of the spending through the end of the decade expected to support research-intensive compute for model training and development. The company's financial strategy prioritizes market expansion and technological advancement over near-term profitability, with OpenAI targeting cash-flow-positive operations by 2029 and projecting revenue of approximately $200 billion by 2030. This spending trajectory reflects both the enormous capital requirements of scaling cutting-edge AI and OpenAI's intent to maintain a leading position in the industry. In October 2025, OpenAI completed an employee share sale of up to $10 billion to existing investors, which valued the company at $500 billion. 
The deal made OpenAI the most valuable privately owned company in the world, surpassing SpaceX. On November 17, 2023, Sam Altman was removed as CEO when OpenAI's board of directors (composed of Helen Toner, Ilya Sutskever, Adam D'Angelo and Tasha McCauley) cited a lack of confidence in him. Chief Technology Officer Mira Murati took over as interim CEO. Greg Brockman, the president of OpenAI, was also removed as chairman of the board and resigned from the company's presidency shortly thereafter. Three senior OpenAI researchers subsequently resigned: director of research and GPT-4 lead Jakub Pachocki, head of AI risk Aleksander Mądry, and researcher Szymon Sidor. On November 18, 2023, there were reportedly talks of Altman returning as CEO amid pressure placed upon the board by investors such as Microsoft and Thrive Capital, who objected to Altman's departure. Although Altman himself spoke in favor of returning to OpenAI, he has since stated that he considered starting a new company and bringing former OpenAI employees with him if talks to reinstate him did not work out. The board members agreed "in principle" to resign if Altman returned. On November 19, 2023, negotiations with Altman to return failed and Murati was replaced by Emmett Shear as interim CEO. The board initially contacted Anthropic CEO Dario Amodei (a former OpenAI executive) about replacing Altman, and proposed a merger of the two companies, but both offers were declined. On November 20, 2023, Microsoft CEO Satya Nadella announced Altman and Brockman would be joining Microsoft to lead a new advanced AI research team, but added that they were still committed to OpenAI despite recent events. Before the partnership with Microsoft was finalized, Altman gave the board another opportunity to negotiate with him. 
About 738 of OpenAI's 770 employees, including Murati and Sutskever, signed an open letter stating they would quit their jobs and join Microsoft if the board did not rehire Altman and then resign. This prompted OpenAI investors to consider legal action against the board as well. In response, OpenAI management sent an internal memo to employees stating that negotiations with Altman and the board had resumed and would take some time. On November 21, 2023, after continued negotiations, Altman and Brockman returned to the company in their prior roles along with a reconstructed board made up of new members Bret Taylor (as chairman) and Lawrence Summers, with D'Angelo remaining. According to subsequent reporting, shortly before Altman's firing, some employees raised concerns to the board about how he had handled the safety implications of a recent internal AI capability discovery. On November 29, 2023, OpenAI announced that an anonymous Microsoft employee had joined the board as a non-voting observer of the company's operations; Microsoft relinquished the seat in July 2024. In February 2024, the Securities and Exchange Commission subpoenaed OpenAI's internal communications to determine if Altman's alleged lack of candor misled investors. In 2024, following the temporary removal of Sam Altman and his return, many employees gradually left OpenAI, including most of the original leadership team and a significant number of AI safety researchers. In August 2023, it was announced that OpenAI had acquired the New York-based start-up Global Illumination, a company that deploys AI to develop digital infrastructure and creative tools. In June 2024, OpenAI acquired Multi, a startup focused on remote collaboration. In March 2025, OpenAI reached a deal with CoreWeave to acquire $350 million worth of CoreWeave shares and access to AI infrastructure, in return for $11.9 billion paid over five years. Microsoft was already CoreWeave's biggest customer in 2024. 
Alongside their other business dealings, OpenAI and Microsoft were renegotiating the terms of their partnership to facilitate a potential future initial public offering by OpenAI, while ensuring Microsoft's continued access to advanced AI models. On May 21, 2025, OpenAI announced the $6.5 billion acquisition of io, an AI hardware start-up founded by former Apple designer Jony Ive in 2024. In September 2025, OpenAI agreed to acquire the product testing startup Statsig for $1.1 billion in an all-stock deal and appointed Statsig's founding CEO Vijaye Raji as OpenAI's chief technology officer of applications. The company also announced development of an AI-driven hiring service designed to rival LinkedIn. OpenAI acquired personal finance app Roi in October 2025. In October 2025, OpenAI acquired Software Applications Incorporated, the developer of Sky, a macOS-based natural language interface designed to operate across desktop applications. The Sky team joined OpenAI, and the company announced plans to integrate Sky's capabilities into ChatGPT. In December 2025, it was announced that OpenAI had agreed to acquire Neptune, an AI tooling startup that helps companies track and manage model training, for an undisclosed amount. In January 2026, it was announced that OpenAI had acquired healthcare technology startup Torch for approximately $60 million. The acquisition followed the launch of OpenAI's ChatGPT Health product and was intended to strengthen the company's medical data and healthcare artificial intelligence capabilities. OpenAI has been criticized for outsourcing the annotation of data sets to Sama, a company based in San Francisco that employed workers in Kenya. These annotations were used to train an AI model to detect toxicity, which could then be used to moderate toxic content, notably from ChatGPT's training data and outputs. However, these pieces of text usually contained detailed descriptions of various types of violence, including sexual violence. 
A Time investigation found that OpenAI began sending snippets of data to Sama as early as November 2021. The four Sama employees interviewed by Time described themselves as mentally scarred. OpenAI paid Sama $12.50 per hour of work, while Sama paid its annotators the equivalent of between $1.32 and $2.00 per hour after tax. Sama's spokesperson said that the $12.50 was also covering other implicit costs, among which were infrastructure expenses, quality assurance and management. In 2024, OpenAI began collaborating with Broadcom to design a custom AI chip capable of both training and inference, targeted for mass production in 2026 and to be manufactured by TSMC on a 3 nm process node. This initiative was intended to reduce OpenAI's dependence on Nvidia GPUs, which are costly and face high demand in the market. In January 2024, Arizona State University purchased ChatGPT Enterprise in OpenAI's first deal with a university. In June 2024, Apple Inc. signed a contract with OpenAI to integrate ChatGPT features into its products as part of its new Apple Intelligence initiative. In June 2025, OpenAI began renting Google Cloud's Tensor Processing Units (TPUs) to support ChatGPT and related services, marking its first meaningful use of non-Nvidia AI chips. In September 2025, it was revealed that OpenAI signed a contract with Oracle to purchase $300 billion in computing power over the next five years. In September 2025, OpenAI and NVIDIA announced a memorandum of understanding that included a potential deployment of at least 10 gigawatts of NVIDIA systems and a $100 billion investment from NVIDIA in OpenAI. OpenAI expected the negotiations to be completed within weeks. As of January 2026, this has not been realized, and the two sides are rethinking the future of their partnership. In October 2025, OpenAI announced a multi-billion dollar deal with AMD. OpenAI committed to purchasing six gigawatts worth of AMD chips, starting with the MI450. 
OpenAI will have the option to buy up to 160 million shares of AMD, about 10% of the company, depending on development, performance and share price targets. In December 2025, Disney said it would make a $1 billion investment in OpenAI, and signed a three-year licensing deal that will let users generate videos using Sora—OpenAI's short-form AI video platform. More than 200 Disney, Marvel, Star Wars and Pixar characters will be available to OpenAI users. In early 2026, Amazon entered advanced discussions to invest up to $50 billion in OpenAI as part of a potential artificial intelligence partnership. Under the proposed agreement, OpenAI’s models could be integrated into Amazon’s digital assistant Alexa and other internal projects. OpenAI provides LLMs to the Artificial Intelligence Cyber Challenge and to the Advanced Research Projects Agency for Health. In October 2024, The Intercept revealed that OpenAI's tools are considered "essential" for AFRICOM's mission and included in an "Exception to Fair Opportunity" contractual agreement between the United States Department of Defense and Microsoft. In December 2024, OpenAI said it would partner with defense-tech company Anduril to build drone defense technologies for the United States and its allies. In 2025, OpenAI's Chief Product Officer, Kevin Weil, was commissioned lieutenant colonel in the U.S. Army to join Detachment 201 as senior advisor. In June 2025, the U.S. Department of Defense awarded OpenAI a $200 million one-year contract to develop AI tools for military and national security applications. OpenAI announced a new program, OpenAI for Government, to give federal, state, and local governments access to its models, including ChatGPT. Services In February 2019, GPT-2 was announced, which gained attention for its ability to generate human-like text. In 2020, OpenAI announced GPT-3, a language model trained on large internet datasets. 
GPT-3 is aimed at natural-language question answering, but it can also translate between languages and coherently generate improvised text. OpenAI also announced that an associated API, named simply "the API", would form the heart of its first commercial product. Eleven employees left OpenAI, mostly between December 2020 and January 2021, in order to establish Anthropic. In 2021, OpenAI introduced DALL-E, a specialized deep learning model adept at generating complex digital images from textual descriptions, utilizing a variant of the GPT-3 architecture. In December 2022, OpenAI received widespread media coverage after launching a free preview of ChatGPT, its new AI chatbot based on GPT-3.5. According to OpenAI, the preview received over a million signups within the first five days. According to anonymous sources cited by Reuters in December 2022, OpenAI Global, LLC was projecting $200 million of revenue in 2023 and $1 billion in revenue in 2024. After ChatGPT was launched, Google announced a similar chatbot, Bard, amid internal concerns that ChatGPT could threaten Google's position as a primary source of online information. On February 7, 2023, Microsoft announced that it was building AI technology based on the same foundation as ChatGPT into Microsoft Bing, Edge, Microsoft 365 and other products. On March 14, 2023, OpenAI released GPT-4, both as an API (with a waitlist) and as a feature of ChatGPT Plus. On November 6, 2023, OpenAI launched GPTs, allowing individuals to create customized versions of ChatGPT for specific purposes, further expanding the possibilities of AI applications across various industries. On November 14, 2023, OpenAI announced it had temporarily suspended new sign-ups for ChatGPT Plus due to high demand. Access for newer subscribers re-opened a month later on December 13. In December 2024, the company launched the Sora model. It also launched OpenAI o1, an early reasoning model that was internally codenamed strawberry. 
Additionally, ChatGPT Pro—a $200/month subscription service offering unlimited o1 access and enhanced voice features—was introduced, and preliminary benchmark results for the upcoming OpenAI o3 models were shared. On January 23, 2025, OpenAI released Operator, an AI agent and web automation tool for accessing websites to execute goals defined by users. The feature was only available to Pro users in the United States. OpenAI released its deep research agent nine days later; it scored 27% accuracy on the benchmark Humanity's Last Exam (HLE). Altman later stated GPT-4.5 would be the last model without full chain-of-thought reasoning. In July 2025, reports indicated that AI models by both OpenAI and Google DeepMind solved mathematics problems at the level of top-performing students in the International Mathematical Olympiad. OpenAI's large language model was able to achieve gold medal-level performance, reflecting significant progress in AI's reasoning abilities. On October 6, 2025, OpenAI unveiled its Agent Builder platform during the company's DevDay event. The platform includes a visual drag-and-drop interface that lets developers and businesses design, test, and deploy agentic workflows with limited coding. On October 21, 2025, OpenAI introduced ChatGPT Atlas, a browser integrating the ChatGPT assistant directly into web navigation, to compete with existing browsers such as Google Chrome and Apple Safari. On December 11, 2025, OpenAI announced GPT-5.2, a model it said would be better at creating spreadsheets, building presentations, perceiving images, writing code, and understanding long context. On January 27, 2026, OpenAI introduced Prism, a LaTeX-native workspace meant to assist scientists with research and writing. The platform utilizes GPT-5.2 as a backend to automate the drafting of scientific papers, including features for managing citations, complex equation formatting, and real-time collaborative editing. 
In March 2023, the company was criticized for disclosing particularly few technical details about products like GPT-4, contradicting its initial commitment to openness and making it harder for independent researchers to replicate its work and develop safeguards. OpenAI cited competitiveness and safety concerns to justify this reversal. OpenAI's former chief scientist Ilya Sutskever argued in 2023 that open-sourcing increasingly capable models carried growing risks, and that the safety reasons for not open-sourcing the most potent AI models would become "obvious" in a few years. In September 2025, OpenAI published a study on how people use ChatGPT for everyday tasks. The study found that "non-work tasks" (according to an LLM-based classifier) account for more than 72 percent of all ChatGPT usage, with a minority of overall usage related to business productivity. In July 2023, OpenAI launched the superalignment project, aiming within four years to determine how to align future superintelligent systems. OpenAI promised to dedicate 20% of its computing resources to the project, although team members later said they never received anything close to that share. OpenAI ended the project in May 2024 after its co-leaders Ilya Sutskever and Jan Leike left the company. In August 2025, OpenAI was criticized after thousands of private ChatGPT conversations were inadvertently exposed to public search engines like Google due to an experimental "share with search engines" feature. The opt-in toggle, intended to allow users to make specific chats discoverable, resulted in some discussions including personal details such as names, locations, and intimate topics appearing in search results when users accidentally enabled it while sharing links. OpenAI announced the feature's permanent removal on August 1, 2025, and the company began coordinating with search providers to remove the exposed content, emphasizing that it was not a security breach but a design flaw that heightened privacy risks. 
CEO Sam Altman acknowledged the issue in a podcast, noting that users often treat ChatGPT as a confidant for deeply personal matters, which amplified concerns about AI handling sensitive data.
Management
In 2018, Musk resigned from his Board of Directors seat, citing "a potential future conflict [of interest]" with his role as CEO of Tesla due to Tesla's AI development for self-driving cars. OpenAI stated that Musk's financial contributions were below $45 million. On March 3, 2023, Reid Hoffman resigned from his board seat, citing a desire to avoid conflicts of interest with his investments in AI companies via Greylock Partners and his co-founding of the AI startup Inflection AI. Hoffman remained on the board of Microsoft, a major investor in OpenAI. In May 2024, Chief Scientist Ilya Sutskever resigned and was succeeded by Jakub Pachocki. Co-leader Jan Leike also departed amid concerns over safety and trust. OpenAI then signed deals with Reddit, News Corp, Axios, and Vox Media. Paul Nakasone then joined the board of OpenAI. In August 2024, cofounder John Schulman left OpenAI to join Anthropic, and OpenAI's president Greg Brockman took extended leave until November. In September 2024, CTO Mira Murati left the company. In November 2025, Lawrence Summers resigned from the board of directors.
Governance and legal issues
In May 2023, Sam Altman, Greg Brockman and Ilya Sutskever posted recommendations for the governance of superintelligence. They stated that superintelligence could happen within the next 10 years, allowing a "dramatically more prosperous future", and that "given the possibility of existential risk, we can't just be reactive". They proposed creating an international watchdog organization similar to the IAEA to oversee AI systems above a certain capability threshold, suggesting that relatively weak AI systems below that threshold should not be overly regulated.
They also called for more technical safety research for superintelligences, and asked for more coordination, for example through governments launching a joint project which "many current efforts become part of". In July 2023, the FTC issued a civil investigative demand to OpenAI to investigate whether the company's data security and privacy practices in developing ChatGPT were unfair or harmed consumers (including by reputational harm) in violation of Section 5 of the Federal Trade Commission Act of 1914. Such demands are typically preliminary investigative matters and are nonpublic, but the FTC's document was leaked. In July 2023, the FTC launched an investigation into OpenAI over allegations that the company had scraped public data and published false and defamatory information. It asked OpenAI for comprehensive information about its technology and privacy safeguards, as well as any steps taken to prevent the recurrence of situations in which its chatbot generated false and derogatory content about people. The agency also raised concerns about "circular" spending arrangements, for example Microsoft extending Azure credits to OpenAI while both companies shared engineering talent, and warned that such structures could negatively affect the public. In September 2024, OpenAI's global affairs chief endorsed the UK's "smart" AI regulation during testimony to a House of Lords committee. In February 2025, OpenAI CEO Sam Altman stated that the company was interested in collaborating with the People's Republic of China, despite regulatory restrictions imposed by the U.S. government. This shift came in response to the growing influence of the Chinese artificial intelligence company DeepSeek, which has disrupted the AI market with open models, including DeepSeek V3 and DeepSeek R1. Following DeepSeek's market emergence, OpenAI enhanced security protocols to protect proprietary development techniques from industrial espionage.
Some industry observers noted similarities between DeepSeek's model distillation approach and OpenAI's methodology, though no formal intellectual property claim was filed. According to Oliver Roberts, as of March 2025 the United States had 781 state AI bills or laws. OpenAI advocated for preempting state AI laws with federal legislation. According to Scott Kohler, OpenAI has opposed California's AI legislation and suggested that such regulation is better handled at the federal level. Public Citizen opposed federal preemption on AI and pointed to OpenAI's growth and valuation as evidence that existing state laws have not hampered innovation. Before May 2024, OpenAI required departing employees to sign a lifelong non-disparagement agreement forbidding them from criticizing OpenAI or acknowledging the existence of the agreement. Daniel Kokotajlo, a former employee, publicly stated that he forfeited his vested equity in OpenAI in order to leave without signing the agreement. Sam Altman stated that he was unaware of the equity-cancellation provision, and that OpenAI had never enforced it to cancel any employee's vested equity; however, leaked documents and emails contradict this claim. On May 23, 2024, OpenAI sent a memo releasing former employees from the agreement. OpenAI was sued for copyright infringement by authors Sarah Silverman, Matthew Butterick, Paul Tremblay and Mona Awad in July 2023. In September 2023, 17 authors, including George R. R. Martin, John Grisham, Jodi Picoult and Jonathan Franzen, joined the Authors Guild in filing a class action lawsuit against OpenAI, alleging that the company's technology was illegally using their copyrighted work. The New York Times also sued the company in late December 2023. In May 2024 it was revealed that OpenAI had destroyed its Books1 and Books2 training datasets, which were used in the training of GPT-3 and which the Authors Guild believed to have contained over 100,000 copyrighted books.
In 2021, OpenAI developed a speech recognition tool called Whisper. OpenAI used it to transcribe more than one million hours of YouTube videos into text for training GPT-4. The automated transcription of YouTube videos raised concerns among OpenAI employees regarding potential violations of YouTube's terms of service, which prohibit the use of videos for applications independent of the platform, as well as any type of automated access to its videos. Despite these concerns, the project proceeded with notable involvement from OpenAI's president, Greg Brockman. The resulting dataset proved instrumental in training GPT-4. In February 2024, The Intercept as well as Raw Story and Alternate Media Inc. filed a copyright lawsuit against OpenAI. The lawsuit is said to have charted a new legal strategy for digital-only publishers to sue OpenAI. On April 30, 2024, eight newspapers filed a lawsuit in the Southern District of New York against OpenAI and Microsoft, claiming illegal harvesting of their copyrighted articles. The suing publications included The Mercury News, The Denver Post, The Orange County Register, St. Paul Pioneer Press, Chicago Tribune, Orlando Sentinel, Sun Sentinel, and New York Daily News. In June 2023, a lawsuit claimed that OpenAI had scraped 300 billion words online without consent and without registering as a data broker. It was filed in San Francisco, California, by sixteen anonymous plaintiffs, who also claimed that OpenAI and its partner and customer Microsoft continued to unlawfully collect and use personal data from millions of consumers worldwide to train artificial intelligence models. On May 22, 2024, OpenAI entered into an agreement with News Corp to integrate news content from The Wall Street Journal, the New York Post, The Times, and The Sunday Times into its AI platform.
Meanwhile, other publications like The New York Times chose to sue OpenAI and Microsoft for copyright infringement over the use of their content to train AI models. In November 2024, a coalition of Canadian news outlets, including the Toronto Star, Metroland Media, Postmedia, The Globe and Mail, The Canadian Press and CBC, sued OpenAI for using their news articles to train its software without permission. In an October 2024 interview with The New York Times, Suchir Balaji accused OpenAI of violating copyright law in developing its commercial LLMs, which he had helped engineer. He was a likely witness in a major copyright trial against the AI company, and was one of several of its current or former employees named in court filings as potentially having documents relevant to the case. On November 26, 2024, Balaji died by suicide. His death prompted the circulation of conspiracy theories alleging that he had been deliberately silenced. California Congressman Ro Khanna endorsed calls for an investigation. On April 24, 2025, Ziff Davis sued OpenAI in Delaware federal court for copyright infringement. Ziff Davis is known for publications such as ZDNet, PCMag, CNET, IGN and Lifehacker. In April 2023, the EU's European Data Protection Board (EDPB) formed a dedicated task force on ChatGPT "to foster cooperation and to exchange information on possible enforcement actions conducted by data protection authorities" based on the "enforcement action undertaken by the Italian data protection authority against OpenAI about the ChatGPT service". In late April 2024, NOYB filed a complaint with the Austrian Datenschutzbehörde against OpenAI for violating the European General Data Protection Regulation: a text created with ChatGPT gave a false date of birth for a living person without giving the individual the option to see the personal data used in the process, and a request to correct the mistake was denied.
Additionally, OpenAI claimed that neither the recipients of ChatGPT's output nor the sources it used could be made available. OpenAI was criticized for lifting its ban on using ChatGPT for "military and warfare". Up until January 10, 2024, its "usage policies" included a ban on "activity that has high risk of physical harm, including", specifically, "weapons development" and "military and warfare". Its new policies prohibit "[using] our service to harm yourself or others" and using it to "develop or use weapons". In August 2025, the parents of a 16-year-old boy who died by suicide filed a wrongful death lawsuit against OpenAI (and CEO Sam Altman), alleging that months of conversations with ChatGPT about mental health and methods of self-harm contributed to their son's death and that safeguards were inadequate for minors. OpenAI expressed condolences and said it was strengthening protections (including updated crisis response behavior and parental controls). Coverage described it as a first-of-its-kind wrongful death case targeting the company's chatbot. The complaint was filed in California state court in San Francisco. In November 2025, the Social Media Victims Law Center and Tech Justice Law Project filed seven lawsuits against OpenAI, four of which alleged wrongful death. The suits were filed on behalf of Zane Shamblin, 23, of Texas; Amaurie Lacey, 17, of Georgia; Joshua Enneking, 26, of Florida; and Joe Ceccanti, 48, of Oregon, each of whom died by suicide after prolonged ChatGPT usage. In December 2025, Stein-Erik Soelberg, then 56 years old, allegedly murdered his mother, Suzanne Adams. In the months prior, Soelberg, who was reportedly experiencing paranoid delusions, had often discussed his ideas with ChatGPT. Adams's estate then sued OpenAI, claiming that the company shared responsibility due to the risk of "chatbot psychosis", although this is not a recognized medical diagnosis. OpenAI responded that it would work to make ChatGPT safer for users experiencing detachment from reality.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Dana_White] | [TOKENS: 3923]
Dana White
Dana Frederick White Jr. (born July 28, 1969) is an American businessman who is the CEO and president of the Ultimate Fighting Championship (UFC), a global mixed martial arts organization. In 2025, White's net worth was estimated at over $600 million by Forbes. Outside the UFC, he is the owner of Power Slap, a slap fighting promotion he founded in 2022. Meta Platforms elected White to its board of directors in 2025.
Early life and education
White was born in Manchester, Connecticut, to June and Dana White Sr. on July 28, 1969. He is an Irish American. White spent many of his early years residing in Ware, Massachusetts. White and his sister, Kelly, were raised by their mother and her family for the majority of their childhoods. White's mother was a nurse, and the family moved to Las Vegas when White was in third grade, as Vegas offered higher wages for nurses. White attended Bishop Gorman High School, where he first met Lorenzo Fertitta, although they did not become close friends until years later. White said he disliked school and "got kicked out of Gorman twice". Despite living in Nevada, the Whites returned to the East Coast in the summers, to spend time with White's grandparents in Levant, Maine, a small town near Bangor. White spent his senior year of high school in Maine. After graduating from Hermon High School in 1987, White started college twice, once at Quincy College and once at UMass Boston, but dropped out during his first semester each time. During this time, he had various jobs, such as laying asphalt, working as a bouncer at an Irish bar, and being a bellhop at the Boston Harbor Hotel. White had begun boxing at age 17, and befriended former Golden Gloves champion Peter Welch. Through this relationship, White decided he wanted to enter the fight business, and he started a boxing gym in Boston with Welch.
White initially intended to become a professional boxer himself, but was put off the idea after seeing a punch-drunk boxer and worrying that he would suffer the same neurodegeneration. White then worked as a boxercise coach. White has stated he left Boston to return to Las Vegas in the early 1990s after being threatened by mobster Whitey Bulger and his associate Kevin Weeks. "He basically said, 'You owe us money.' It was like $2,500, which was like $25,000 to me back then, and I didn't pay him. This went on for a while and one day I was at my place and I got a call and they said, 'You owe us the money tomorrow by 1 o'clock.' I literally hung up the phone, picked up the phone and called Delta and bought a ticket to Vegas." In Las Vegas, White continued running boxercise gyms and also began training jiu-jitsu under the tutelage of John Lewis (a competitor in UFC 22 and UFC 28), alongside Lorenzo Fertitta and his older brother Frank Fertitta III. White had reconnected with the Fertitta brothers after meeting Lorenzo at a mutual friend's wedding; they had not spoken to each other in 10 years prior to this encounter. It was in Lewis' practices that White met mixed martial artists Tito Ortiz and Chuck Liddell and ultimately became their manager.
Career
While working as a manager for Ortiz and Liddell, White met Bob Meyrowitz, the owner of Semaphore Entertainment Group, the then-parent company of the Ultimate Fighting Championship (UFC). When White learned that Meyrowitz was looking to sell the UFC, he contacted his friend Lorenzo Fertitta (an executive and co-founder of Station Casinos, and former commissioner of the Nevada Athletic Commission), to ask if he would be interested in acquiring the company. In January 2001, Lorenzo and his older brother Frank acquired the UFC for $2 million (equivalent to $3.6 million in 2025); it subsequently became a subsidiary of Zuffa. White was installed as the company's president.
He was also granted a stake in the company as a finder's fee and as sweat equity. White said that when he and the Fertittas acquired the UFC, all they received was the brand name "UFC" and an old octagon. The previous owners had stripped the company's assets to avoid bankruptcy, so much so that the UFC.com website had been sold to a company named "User Friendly Computers". The first UFC cards under White's leadership, UFC 30 and UFC 31, were held in the Trump Taj Mahal, which led to the development of White's long-term friendship with Donald Trump. White re-hired Joe Rogan, who had previously conducted backstage interviews at UFC events; Rogan made his debut as a color commentator at UFC 37.5 in June 2002. The UFC did not immediately have success after the Zuffa takeover, and by 2004 the Fertittas had invested over $40 million (equivalent to $68 million in 2025) into the company without attaining significant growth or reaching profitability. White, alongside the Fertittas and television producer Craig Piligian, developed the idea of an MMA-based reality series, The Ultimate Fighter, as an attempt to create interest in the sport. The company self-funded the show as television networks refused to pay for the cost of production. The Ultimate Fighter, particularly the finale fight between Stephan Bonnar and Forrest Griffin, is credited for having "saved the UFC". Under White's leadership the UFC subsequently developed into a highly successful business; in 2006, UFC 66 headlined by Chuck Liddell and Tito Ortiz generated over 1,000,000 pay-per-view buys, and by 2008 the company was valued by Forbes at $1.1 billion. In 2010, Flash Entertainment, a subsidiary of the Abu Dhabi Government, bought a 10% stake in Zuffa for a reported $150–175 million, leaving White with a 9% stake in the company and the Fertittas with 40.5% each. In a 2011 street interview with TMZ Sports, White responded "never ever" to the question: "When will we see women in the UFC?" 
He later reversed this stance, and in February 2013 the UFC held its first women's bout, featuring Ronda Rousey and Liz Carmouche. Rousey subsequently became one of the UFC's two main marquee stars in the mid-2010s, alongside Conor McGregor. The pair accounted for 4.6 million pay-per-view buys in 2015, 61% of the company's total buyrate that year, as the UFC's gross revenue reached $600 million in 2015. In July 2016, Zuffa was sold to a consortium of investors led by WME-IMG for $4.025 billion. White owned 9% of the company at the time of the sale. White continued in his role as president, and was given a stake in the new business. In March 2019, White signed a new, seven-year contract to remain president of the UFC, as the UFC signed a deal with ESPN. In April 2023, Endeavor (renamed from WME-IMG) announced a deal that would see professional wrestling company WWE merge with UFC to form a new public company, TKO Group Holdings. White was subsequently given the title of chief executive officer of the UFC. In August 2025, White stated the UFC had reached a seven-year, $7.7-billion broadcast deal with Paramount and CBS. White said the deal, set to begin in 2026 when the ESPN contract expires, would mean "[f]or the first time ever, fans in the US will have access to all UFC content without a Pay-Per-View model." In February 2026, White testified in Nevada federal district court, as part of an antitrust investigation into the UFC, that he was no longer involved in negotiating contracts or matchmaking with fighters, stating that since 2017 this business had been handled by his delegates Hunter Campbell, Mick Maynard and Sean Shelby. Campbell testified that White is focused on the UFC's strategy and overall growth, stating: "I run the fight business, and [White] runs the entire company."
White has announced the first UFC card set to take place on the South Lawn of the White House, calling it a "one-of-one" event that will be the "most-watched UFC event ever". The card will contain around six or seven fights, considerably fewer than a usual UFC event. Many fans have speculated about a possible return by UFC legend Conor McGregor for the card, which White has neither confirmed nor denied. White has since announced that the card was made and that he "laid out two different options", while not yet directly informing the public of the fighters. White entered the boxing scene by co-promoting Floyd Mayweather Jr. vs. Conor McGregor, as McGregor was contracted to the UFC. In October 2017, White said at Freddie Roach's Wild Card West boxing club that he was "getting into boxing, 100 percent". In 2019, White said that he wanted to incorporate boxing into the UFC's company portfolio. He later backpedaled on these plans, stating in 2022 that boxing promotion is "a broken business that is an absolute nightmare to try to fix". In 2024, White promoted a card featuring Irish boxer Callum Walsh which was streamed on UFC Fight Pass. He subsequently re-announced his intention to enter boxing promotion, saying in 2025 that he was "coming in guns blazing". In March 2025, White announced a partnership with Turki Al-Sheikh, head of Saudi Arabia's General Entertainment Authority, to create a new boxing league named Zuffa Boxing, modeled after the UFC. The UFC's parent company TKO Group Holdings and the Saudi entertainment conglomerate Sela were announced as backers of the venture. White was named as executive leader of the league. Zuffa Boxing held its first numbered event, Zuffa Boxing 01, in January 2026. It was hosted at the Meta Apex in Las Vegas and was headlined by a bout between Callum Walsh and Carlos Ocampo.
White announced in February that the first bout for a Zuffa Boxing title would be contested between cruiserweights Jai Opetaia and Brandon Glanton, at Zuffa Boxing 04 in March. In 2017, White began hosting Dana White's Contender Series. Available initially through UFC Fight Pass, the promotion's digital streaming service, and licensed separately from the UFC brand, the show allows burgeoning fighters to compete for a contract in the UFC. White was an executive producer of The Ultimate Surfer, a surfing competition television series styled after The Ultimate Fighter. It was created by Kelly Slater, a professional surfer and long-time friend of White. The show premiered on ABC in August 2021, and was canceled after one season due to low ratings. In 2022, White became a co-owner of Thrill One Sports & Entertainment, which owns the properties of Nitro Circus, Nitrocross, and Street League Skateboarding. White alongside the Fertittas co-founded Power Slap, a slap fighting competition that debuted in January 2023 and was previously televised on TBS. After TBS did not renew its deal with Power Slap, the competition began to be aired on the digital streaming platform Rumble. In January 2025, Meta Platforms announced that White had been elected to its board of directors. In May 2025, White partnered with 1st Phorm and Anheuser-Busch to found Phorm Energy, a zero-sugar energy drink venture.
Personal life
White met his wife Anne (née Stella) when they were in the eighth grade, and they married in 1996. They have two sons and one daughter. White bought a mansion in Pine Island Court, Las Vegas, in 2006 from Frank Fertitta III for $1.95 million. White bought three other mansions in the same area from October 2016 to June 2017 for a combined total of around $6.2 million. Demolition permits were issued for the houses, presumably with the intent of creating a mega-mansion for White and his family.
White was raised as a Catholic and was an altar boy when he was a child, but said in 2008 that he is an atheist. White stated in 2023: "I don't believe in much, but I believe in karma for some reason. I never really was guided by religion in any way shape or form as I got into my 20s." In 2011, White's mother, June, released the book Dana White, King of MMA: An Unauthorized Biography. June claimed in the book that, since his success with the UFC, Dana had "turned his back on his family and friends who were there for him when he needed help and support". In May 2012, White revealed that he had been diagnosed with Ménière's disease, an inner ear disorder. He said, "It's like vertigo but on steroids." White said that the disease was brought on by a large fight he was involved in during his youth. The UFC on Fuel TV 3: Korean Zombie vs. Poirier event in May 2012 was the first UFC card he had missed in 11 years, with White staying home on medical advice. White underwent Orthokine treatment for Ménière's disease in 2013, which he says has greatly reduced his symptoms. In 2022, White stated he had undergone medical testing and was diagnosed with extremely high triglyceride levels and other irregularities. He was told he had 10.4 years left to live if he did not rectify the problems. White said that since then he has adhered to a keto diet, which he says has remedied his sleep apnea and alleviated his leg pain. The regimen was prescribed by biologist Gary Brecka, whom White credited for the improvement in his health. White spoke at the 2016 Republican National Convention, where he endorsed Republican nominee Donald Trump. White said that Trump helped the UFC at its beginnings, allowing the UFC to host its first event under Zuffa ownership (UFC 30) at the Trump Taj Mahal when other venues refused to host the UFC. White said, "No arenas wanted us. This guy reached out, and he's always been a friend to me."
White reiterated his support for Trump for the 2020 election and spoke at the podium at a Trump reelection rally in Colorado. In February 2020, White donated $1 million to America First Action, a super political action committee that supported Trump's re-election bid. In April 2020, White joined a group of industry leaders to help the United States rebuild its economy hit by the coronavirus pandemic. White again endorsed Trump in the 2024 election and introduced him at the 2024 Republican National Convention. White is a fan of the Beastie Boys, Red Hot Chili Peppers and Rage Against the Machine. Speaking on the passing of Adam Yauch, White said: "I seriously haven't been impacted by a death in a long time like I was with the Beastie Boys." White also has a guitar hand-signed by all three members of the Beastie Boys in his office. White has been a fan of the New England Patriots since childhood, and has stated that he is an admirer of the former Patriots quarterback Tom Brady. White said in a 2022 interview alongside Rob Gronkowski that he had worked to negotiate a deal which would have brought Brady and Gronkowski to the Las Vegas Raiders in 2020, but Jon Gruden nixed the deal and the duo instead signed with the Tampa Bay Buccaneers. White began training Brazilian jiu-jitsu in 1998, and stated it had a positive impact on him and made him realize the importance of having ground skills in fighting situations. In a season 15 episode of History Channel's Pawn Stars (originally airing on February 5, 2018), White purchased around $60,000 worth of katana swords, including one of Rick Harrison's 17th-century Japanese katanas. White is a recreational gambler. He plays high-stakes blackjack, and has been limited from playing at multiple casinos in Las Vegas. He states he plays $75,000 a hand, and if he wins the first couple of hands, he takes the $150,000 profit and leaves. White also gambles on sports. 
He stated in 2021 that the biggest sports bet he had placed was in 2007, when he wagered $1 million on Jermain Taylor to beat Kelly Pavlik. Taylor lost the bout by technical knockout, and White said about the experience, "I was in Cabo, relaxing having fun. That ruined my fun. Ruined my whole trip." White stated in 2024 that he plays high-stakes baccarat, and wagers up to $350,000 per hand at Caesars Palace. He claims to have lost $8 million in one night, and from January to March 2024 to have won $26 million at Caesars Palace. White stated that a "goal in life before I die" is for a casino to allow him to play $1 million per hand. During a 2022 New Year's Eve party at a nightclub in Cabo San Lucas, Mexico, White and his wife Anne were filmed arguing and then getting into a physical altercation with each other. Anne slapped White, and White responded by slapping her in return. They both apologized for their actions, and said they had consumed too much alcohol that night. The UFC did not publicly address White's actions after the incident. The California Legislative Women's Caucus sent a letter to Ari Emanuel, CEO of Endeavor, the company that owns the UFC, requesting that White be removed from his role in the UFC. In 2010, White donated $50,000 for a liver transplant for Tuptim Jadnooleum, the daughter of "Kru Nai" Rattanachai, one of the instructors for Tiger Muay Thai and MMA in Phuket, Thailand. In 2011, White donated $100,000 to his former high school to fund renovations to its athletic facilities. In 2017, White donated $1 million to the victims of the Las Vegas shooting. In 2019, after UFC 242, Khabib Nurmagomedov raised $100,000 for Dustin Poirier's charity, The Good Fight Foundation; White followed suit and also donated $100,000 to Poirier's foundation. In 2024, White donated $50,000 towards the support of victims affected by the attempted assassination of former president Donald J. Trump during a rally held on July 13 in Butler, Pennsylvania.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Python_(programming_language)#cite_ref-153] | [TOKENS: 4314]
Python (programming language)
Python is a high-level, general-purpose programming language. Its design philosophy emphasizes code readability with the use of significant indentation. Python is dynamically type-checked and garbage-collected. It supports multiple programming paradigms, including structured (particularly procedural), object-oriented and functional programming. Guido van Rossum began working on Python in the late 1980s as a successor to the ABC programming language. Python 3.0, released in 2008, was a major revision and not completely backward-compatible with earlier versions. Beginning with Python 3.5, capabilities and keywords for typing were added to the language, allowing optional static typing. As of 2026, the Python Software Foundation supports Python 3.10, 3.11, 3.12, 3.13, and 3.14, following the project's annual release cycle and five-year support policy. Python 3.15 is currently in the alpha development phase, and its stable release is expected in October 2026. Earlier versions in the 3.x series have reached end-of-life and no longer receive security updates. Python has gained widespread use in the machine learning community. It is widely taught as an introductory programming language. Since 2003, Python has consistently ranked in the top ten of the most popular programming languages in the TIOBE Programming Community Index, which ranks languages based on searches across 24 platforms.
History
Python was conceived in the late 1980s by Guido van Rossum at Centrum Wiskunde & Informatica (CWI) in the Netherlands. It was designed as a successor to the ABC programming language, which was inspired by SETL, and to be capable of exception handling and of interfacing with the Amoeba operating system. Python implementation began in December 1989. Van Rossum first released it in 1991 as Python 0.9.0.
Van Rossum assumed sole responsibility for the project, as the lead developer, until 12 July 2018, when he announced his "permanent vacation" from his responsibilities as Python's "benevolent dictator for life" (BDFL), a title the Python community bestowed on him to reflect his long-term commitment as the project's chief decision-maker. (He has since come out of retirement and is self-titled "BDFL-emeritus".) In January 2019, active Python core developers elected a five-member Steering Council to lead the project. The name Python derives from the British comedy series Monty Python's Flying Circus. (See § Naming.) Python 2.0 was released on 16 October 2000, introducing features such as list comprehensions, cycle-detecting garbage collection, reference counting, and Unicode support. Python 2.7's end-of-life was initially set for 2015, and then postponed to 2020 out of concern that a large body of existing code could not easily be forward-ported to Python 3. It no longer receives security patches or updates. While Python 2.7 and older versions are officially unsupported, a different, unofficial Python implementation, PyPy, continues to support Python 2, i.e., "2.7.18+" (plus 3.11), with the plus signifying (at least some) "backported security updates". Python 3.0 was released on 3 December 2008; it was a major revision, not completely backward-compatible with earlier versions, with some new semantics and changed syntax. Python 2.7.18, released in 2020, was the last release of Python 2. Several releases in the Python 3.x series have added new syntax to the language and made a few (considered very minor) backward-incompatible changes. As of January 2026, Python 3.14.3 is the latest stable release. All older 3.x versions received final security updates, down to Python 3.9.24 and then 3.9.25, the last release in the 3.9 series. Python 3.10 has been the oldest supported branch since November 2025.
Python 3.15 has reached the alpha stage, and an official downloadable Python 3.14 executable is available for Android. Releases receive two years of full support followed by three years of security support.

Design philosophy and features

Python is a multi-paradigm programming language. Object-oriented programming and structured programming are fully supported, and many of their features support functional programming and aspect-oriented programming, including metaprogramming and metaobjects. Many other paradigms are supported via extensions, including design by contract and logic programming. Python is often referred to as a 'glue language' because it is purposely designed to integrate components written in other languages. Python uses dynamic typing and a combination of reference counting and a cycle-detecting garbage collector for memory management. It uses dynamic name resolution (late binding), which binds method and variable names during program execution. Python's design offers some support for functional programming in the Lisp tradition. It has filter, map, and reduce functions; list comprehensions, dictionaries, sets, and generator expressions. The standard library has two modules (itertools and functools) that implement functional tools borrowed from Haskell and Standard ML. Python's core philosophy is summarized in the Zen of Python (PEP 20), written by Tim Peters, a collection of guiding aphorisms for writing Python code. However, Python has received criticism for violating these principles and adding unnecessary language bloat. Responses to these criticisms note that the Zen of Python is a guideline rather than a rule. The addition of some new features has been controversial: Guido van Rossum resigned as Benevolent Dictator for Life after conflict over adding the assignment expression operator in Python 3.8. Nevertheless, rather than building all functionality into its core, Python was designed to be highly extensible via modules.
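The functional tools mentioned above can be sketched briefly; the variable names and data here are illustrative only, not taken from the article:

```python
# Functional-style tools: comprehensions, filter, functools.reduce,
# itertools.accumulate, and a lazy generator expression.
from functools import reduce
from itertools import accumulate

nums = [1, 2, 3, 4, 5]

squares = [n * n for n in nums]                   # list comprehension
evens = list(filter(lambda n: n % 2 == 0, nums))  # filter
total = reduce(lambda a, b: a + b, nums)          # functools.reduce -> 15
running = list(accumulate(nums))                  # itertools -> [1, 3, 6, 10, 15]
lazy_squares = (n * n for n in nums)              # generator expression (lazy)

print(squares, evens, total, running)
```

These names (itertools, functools, filter, map, reduce) are the ones the article itself lists; everything else in the snippet is invented example data.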
This compact modularity has made it particularly popular as a means of adding programmable interfaces to existing applications. Van Rossum's vision of a small core language with a large standard library and easily extensible interpreter stemmed from his frustrations with ABC, which took the opposite approach. Python strives for a simpler, less-cluttered syntax and grammar, while giving developers a choice in their coding methodology. Python lacks do ... while loops, which Van Rossum considered harmful. In contrast to Perl's motto "there is more than one way to do it", Python advocates the view that "there should be one – and preferably only one – obvious way to do it". In practice, however, Python provides many ways to achieve a given goal; there are at least three ways to format a string literal, with no certainty as to which one a programmer should use. Alex Martelli, a Fellow of the Python Software Foundation and a Python book author, wrote that "To describe something as 'clever' is not considered a compliment in the Python culture." Python's developers typically prioritize readability over performance. For example, they reject patches to non-critical parts of the CPython reference implementation that would offer speed increases that do not justify the cost in clarity and readability. Execution speed can be improved by moving speed-critical functions to extension modules written in languages such as C, or by using a just-in-time compiler like PyPy. It is also possible to transpile to other languages, but this approach either fails to achieve the expected speed-up, since Python is a very dynamic language, or compiles only a restricted subset of Python (with potential minor semantic changes). Python is meant to be a fun language to use. This goal is reflected in the name – a tribute to the British comedy group Monty Python – and in playful approaches to some tutorials and reference materials.
For instance, some code examples use the terms "spam" and "eggs" (in reference to a Monty Python sketch), rather than the typical terms "foo" and "bar". A common neologism in the Python community is pythonic, which has a broad range of meanings related to program style: Pythonic code may use Python idioms well; be natural or show fluency in the language; or conform with Python's minimalist philosophy and emphasis on readability.

Syntax and semantics

Python is meant to be an easily readable language. Its formatting is visually uncluttered, and it often uses English keywords where other languages use punctuation. Unlike many other languages, it does not use curly brackets to delimit blocks, and semicolons after statements are allowed but rarely used. It has fewer syntactic exceptions and special cases than C or Pascal. Python uses whitespace indentation, rather than curly brackets or keywords, to delimit blocks. An increase in indentation comes after certain statements; a decrease in indentation signifies the end of the current block. Thus, the program's visual structure accurately represents its semantic structure. This feature is sometimes termed the off-side rule. Some other languages use indentation this way, but in most, indentation has no semantic meaning. The recommended indent size is four spaces. Among Python's statements is the assignment statement (=), which binds a name as a reference to a separate, dynamically allocated object. Variables may subsequently be rebound at any time to any object. In Python, a variable name is a generic reference holder without a fixed data type; however, it always refers to some object with a type. This is called dynamic typing, in contrast to statically-typed languages, where each variable may contain only a value of a certain type. Python does not support tail call optimization or first-class continuations; according to Van Rossum, the language never will.
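The indentation and name-binding behavior described above can be sketched in a few lines; the function name and values are invented for illustration:

```python
# Indentation alone delimits the if/else blocks; no braces are needed.
def parity(n):
    if n % 2 == 0:
        return "even"
    else:
        return "odd"

# A name is a generic reference holder: it can be rebound at any time
# to an object of any type (dynamic typing).
x = 42            # x refers to an int
x = "forty-two"   # the same name now refers to a str

print(parity(10))  # even
print(parity(7))   # odd
```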
However, better support for coroutine-like functionality is provided by extending Python's generators. Before 2.5, generators were lazy iterators; data was passed unidirectionally out of the generator. From Python 2.5 on, it is possible to pass data back into a generator function; and from version 3.3, data can be passed through multiple stack levels. A distinction between expressions and statements is rigidly enforced in Python, in contrast to languages such as Common Lisp, Scheme, or Ruby. This distinction leads to some duplicated functionality. A statement cannot be part of an expression; because of this restriction, expressions such as list and dict comprehensions (and lambda expressions) cannot contain statements. As a particular case, an assignment statement such as a = 1 cannot be part of the conditional expression of a conditional statement. Python uses duck typing: it has typed objects but untyped variable names. Type constraints are not checked at definition time; rather, operations on an object may fail at usage time, indicating that the object is not of an appropriate type. Despite being dynamically typed, Python is strongly typed, forbidding operations that are poorly defined (e.g., adding a number and a string) rather than quietly attempting to interpret them. Python allows programmers to define their own types using classes, most often for object-oriented programming. New instances of classes are constructed by calling the class, for example, SpamClass() or EggsClass(); the classes are instances of the metaclass type (which is an instance of itself), thereby allowing metaprogramming and reflection. Before version 3.0, Python had two kinds of classes, both using the same syntax: old-style and new-style. Current Python versions support only the new-style semantics. Python supports optional type annotations.
These annotations are not enforced by the language, but may be used by external tools such as mypy to catch errors. Python includes a typing module that provides several names for use in type annotations. Also, mypy includes a Python compiler called mypyc, which leverages type annotations for optimization. Python includes conventional symbols for the arithmetic operators (+, -, *, /), the floor-division operator //, and the modulo operator %. (With the modulo operator, a remainder can be negative, e.g., 4 % -3 == -2.) Python also offers the ** symbol for exponentiation, e.g. 5**3 == 125 and 9**0.5 == 3.0, and the matrix-multiplication operator @. These operators work as in traditional mathematics, with the same precedence rules; the infix operators + and - can also be unary, representing positive and negative numbers respectively. Division between integers with / produces floating-point results. The behavior of division has changed significantly over time: in Python terms, the / operator represents true division (or simply division), while the // operator represents floor division; before version 3.0, the / operator represented classic division. Rounding towards negative infinity, though a different approach from that of most languages, adds consistency to Python. For instance, this rounding implies that the equation (a + b)//b == a//b + 1 is always true (for nonzero b). It also implies that the equation b*(a//b) + a%b == a is valid for both positive and negative values of a. As expected, the result of a%b lies in the half-open interval [0, b) when b is a positive integer; however, maintaining the validity of the equation requires that the result lie in the interval (b, 0] when b is negative. Python provides a round function for rounding a float to the nearest integer. For tie-breaking, Python 3 uses the round-to-even method: round(1.5) and round(2.5) both produce 2.
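The division, modulo, and rounding rules above can be checked concretely (the operand values are arbitrary examples):

```python
# True division (/) always yields a float; floor division (//)
# rounds toward negative infinity; and a % b takes the sign of b,
# preserving the identity b*(a//b) + a%b == a.
print(7 / 2)    # 3.5
print(-7 // 2)  # -4  (floored, not truncated toward zero)
print(-7 % 2)   # 1   (in [0, b) for positive b)
print(4 % -3)   # -2  (in (b, 0] for negative b)

# The identity holds for every sign combination.
for a, b in [(7, 2), (-7, 2), (7, -2), (-7, -2)]:
    assert b * (a // b) + a % b == a

# Round-half-to-even tie-breaking in Python 3:
print(round(1.5), round(2.5))  # 2 2
```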
Python versions before 3 used the round-away-from-zero method: round(0.5) is 1.0, and round(-0.5) is −1.0. Python allows Boolean expressions containing multiple relations in a manner consistent with general usage in mathematics. For example, the expression a < b < c tests whether a is less than b and b is less than c. C-derived languages interpret this expression differently: in C, the expression would first evaluate a < b, resulting in 0 or 1, and that result would then be compared with c. Python uses arbitrary-precision arithmetic for all integer operations. The Decimal type/class in the decimal module provides decimal floating-point numbers to a pre-defined arbitrary precision, with several rounding modes. The Fraction class in the fractions module provides arbitrary precision for rational numbers. Due to Python's extensive mathematics library and the third-party library NumPy, the language is frequently used for scientific scripting in tasks such as numerical data processing and manipulation. Functions are created in Python with the def keyword. A function is defined similarly to how it is called, by first providing the function name and then the required parameters. To assign a default value to a function parameter in case no actual value is provided at run time, variable-definition syntax can be used inside the function header.

Code examples

Typical introductory examples include the "Hello, World!" program and a program to calculate the factorial of a non-negative integer.

Libraries

Python's large standard library is commonly cited as one of its greatest strengths. For Internet-facing applications, many standard formats and protocols, such as MIME and HTTP, are supported. The language includes modules for creating graphical user interfaces, connecting to relational databases, generating pseudorandom numbers, performing arithmetic with arbitrary-precision decimals, manipulating regular expressions, and unit testing.
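The introductory examples named above (whose listings did not survive extraction) are conventionally written as follows; the greet function is an invented illustration of a default parameter value supplied in the function header:

```python
# "Hello, World!" in Python is a single statement.
print("Hello, World!")

# Factorial of a non-negative integer, defined recursively.
def factorial(n):
    if n < 0:
        raise ValueError("n must be non-negative")
    return 1 if n < 2 else n * factorial(n - 1)

# A default value is assigned directly in the function header.
def greet(name, greeting="Hello"):
    return f"{greeting}, {name}!"

print(factorial(5))    # 120
print(greet("World"))  # Hello, World!
```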
Some parts of the standard library are covered by specifications—for example, the Web Server Gateway Interface (WSGI) implementation wsgiref follows PEP 333—but most parts are specified by their code, internal documentation, and test suites. However, because most of the standard library is cross-platform Python code, only a few modules must be altered or rewritten for variant implementations. As of 13 March 2025, the Python Package Index (PyPI), the official repository for third-party Python software, contains over 614,000 packages.

Development environments

Most Python implementations (including CPython) include a read–eval–print loop (REPL); this permits the environment to function as a command line interpreter, with which users enter statements sequentially and receive results immediately. CPython is also bundled with an integrated development environment (IDE) called IDLE, which is oriented toward beginners. Other shells, including IDLE and IPython, add capabilities such as improved auto-completion, session-state retention, and syntax highlighting. Standard desktop IDEs include PyCharm, Spyder, and Visual Studio Code, and there are also web browser-based IDEs.

Implementations

CPython is the reference implementation of Python. This implementation is written in C, meeting the C11 standard since version 3.11. Older versions use the C89 standard with several select C99 features; third-party extensions are not limited to older C versions—e.g., they can be implemented using C11 or C++. CPython compiles Python programs into an intermediate bytecode, which is then executed by a virtual machine. CPython is distributed with a large standard library written in a combination of C and native Python. CPython is available for many platforms, including Windows and most modern Unix-like systems, including macOS (and Apple M1 Macs, since Python 3.9.1, using an experimental installer).
Starting with Python 3.9, the Python installer intentionally fails to install on Windows 7 and 8; Windows XP was supported until Python 3.5, with unofficial support for VMS. Platform portability was one of Python's earliest priorities. During the development of Python 1 and 2, even OS/2 and Solaris were supported; since then, support has been dropped for many platforms. All current Python versions (since 3.7) support only operating systems that feature multithreading; far fewer operating systems are supported now than in the past, as many outdated platforms have been dropped. All alternative implementations have at least slightly different semantics. For example, an alternative implementation may have unordered dictionaries, in contrast to current CPython versions. As another example in the larger Python ecosystem, PyPy does not support the full CPython C API. Creating an executable with Python is often done by bundling an entire Python interpreter into the executable, which makes binaries large even for small programs, although some implementations are capable of truly compiling Python. Alternative implementations include Stackless Python, a significant fork of CPython that implements microthreads; it uses the call stack differently, thus allowing massively concurrent programs, and PyPy also offers a stackless version. Just-in-time Python compilers have been developed, but are now unsupported. There are several compilers/transpilers to high-level object languages, where the source language is unrestricted Python, a subset of Python, or a language similar to Python; there are also specialized compilers, as well as older projects and compilers not designed for use with Python 3.x syntax. A performance comparison among various Python implementations, using a non-numerical (combinatorial) workload, was presented at EuroSciPy '13.
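Performance comparisons like the one mentioned above can be reproduced informally with the standard library's timeit module; the workload here is an invented illustration, not the EuroSciPy benchmark:

```python
# Time a small combinatorial workload with timeit.
import timeit

def workload():
    # All pairs from a small range, filtered by a predicate.
    return [(a, b) for a in range(50) for b in range(50) if (a * b) % 7 == 0]

seconds = timeit.timeit(workload, number=100)
print(f"100 runs took {seconds:.3f} s")
```

Running the same script under different implementations (e.g., CPython and PyPy) gives a rough, workload-specific comparison.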
In addition, Python's performance relative to other programming languages is benchmarked by The Computer Language Benchmarks Game. There are several approaches to optimizing Python performance despite the inherent slowness of an interpreted language, including moving speed-critical code to extension modules, just-in-time compilation, and transpilation.

Language development

Python's development is conducted mostly through the Python Enhancement Proposal (PEP) process; this process is the primary mechanism for proposing major new features, collecting community input on issues, and documenting Python design decisions. Python coding style is covered in PEP 8. Outstanding PEPs are reviewed and commented on by the Python community and the steering council. Enhancement of the language corresponds with development of the CPython reference implementation. The mailing list python-dev is the primary forum for the language's development. Specific issues were originally discussed in the Roundup bug tracker hosted by the foundation; in 2022, all issues and discussions were migrated to GitHub. Development originally took place on a self-hosted source-code repository running Mercurial, until Python moved to GitHub in January 2017. CPython's public releases have three types, distinguished by which part of the version number is incremented. Many alpha, beta, and release candidates are also released as previews and for testing before final releases. Although there is a rough schedule for releases, they are often delayed if the code is not ready. Python's development team monitors the state of the code by running a large unit test suite during development. The major academic conference on Python is PyCon. There are also special Python mentoring programs, such as PyLadies.

Naming

Python's name is inspired by the British comedy group Monty Python, whom Python creator Guido van Rossum enjoyed while developing the language.
Monty Python references appear frequently in Python code and culture; for example, the metasyntactic variables often used in Python literature are spam and eggs, rather than the traditional foo and bar. The official Python documentation also contains various references to Monty Python routines. Python users are sometimes referred to as "Pythonistas".
========================================
[SOURCE: https://en.wikipedia.org/wiki/Audio_mixing] | [TOKENS: 585]
Audio mixing

Audio mixing is the process by which multiple sounds are combined into one or more audio channels. In the process, a source's volume level, frequency content, dynamics, and panoramic position are manipulated or enhanced. This practical, aesthetic, or otherwise creative treatment is done in order to produce a finished version that is appealing to listeners. Audio mixing is practiced for music, film, television, and live sound. The process is generally carried out by a mixing engineer operating a mixing console or digital audio workstation.

Recorded music

Before the introduction of multitrack recording, all the sounds and effects that were to be part of a recording were mixed together at one time during a live performance. If the sound blend was not satisfactory, or if one musician made a mistake, the selection had to be performed again until the desired balance and performance were obtained. With the introduction of multitrack recording, however, the production phase of a modern recording has changed radically into one that generally involves three stages: recording, overdubbing, and mixdown.

Film and television

During production, dialogue recording of actors is done by a person variously known as the location sound mixer, production sound mixer, or some similar designation. That person is a department head with a crew consisting of a boom operator and sometimes a cable person. Audio mixing for film and television is a process during the post-production stage of a moving-image program by which a multitude of recorded sounds are combined. In the editing process, the source's signal level, frequency content, dynamics, and panoramic position are commonly manipulated and effects added. In video production, this is called sweetening. The process takes place on a mixing stage, typically in a studio or purpose-built theater, once the picture elements are edited into a final version.
Normally the engineers will mix four main audio elements called stems: speech (dialogue, ADR, voice-overs, etc.), ambience (or atmosphere), sound effects, and music. As multi-machine synchronization became available, filmmakers were able to split elements into multiple reels. With the advent of digital workstations and growing complexity, track counts in excess of 100 became common. Since the 2010s, critics and members of the audience have reported that dialogue in films tends to be increasingly more difficult to understand than in older films, to the point where viewers need to rely on subtitles to understand what is being said. Ben Pearson of SlashFilm attributed this to a combination of factors, only some of which can be addressed through audio mixing.

Live sound

Live sound mixing is the process of electrically blending together multiple sound sources at a live event using a mixing console. Sounds used include those from instruments, voices, and pre-recorded material. Individual sources may be equalised and routed to effect processors to ultimately be amplified and reproduced via loudspeakers. The live sound engineer balances the various audio sources in a way that best suits the needs of the event.
========================================