United States

The United States of America (USA), also known as the United States (U.S.) or America, is a country primarily located in North America. It is a federal republic of 50 states and a federal capital district, Washington, D.C. The 48 contiguous states border Canada to the north and Mexico to the south, with the semi-exclave of Alaska in the northwest and the archipelago of Hawaii in the Pacific Ocean. The United States also asserts sovereignty over five major island territories and various uninhabited islands in Oceania and the Caribbean.[j] It is a megadiverse country, with the world's third-largest land area[c] and third-largest population, exceeding 341 million.[k]

Paleo-Indians first migrated from North Asia to North America at least 15,000 years ago, and formed various civilizations. Spanish colonization established Spanish Florida in 1513, the first European colony in what is now the continental United States. British colonization followed with the 1607 settlement of Virginia, the first of the Thirteen Colonies. Enslavement of Africans was practiced in all colonies by 1770 and supplied most of the labor for the Southern Colonies' plantation economy. Clashes with the British Crown began as a civil protest over the illegality of taxation without representation in Parliament and the denial of other English rights. They evolved into the American Revolution, which led to the Declaration of Independence and a society based on universal rights. Victory in the 1775–1783 Revolutionary War brought international recognition of U.S. sovereignty and fueled westward expansion, further dispossessing native inhabitants. As more states were admitted, a North–South division over slavery led the Confederate States of America to declare secession and fight the Union in the 1861–1865 American Civil War. With the United States' victory and reunification, slavery was abolished nationally.

By the late 19th century, the U.S. economy outpaced the French, German, and British economies combined. By 1900, the country had established itself as a great power, a status solidified after its involvement in World War I. Following Japan's attack on Pearl Harbor in 1941, the U.S. entered World War II. Its aftermath left the U.S. and the Soviet Union as rival superpowers, competing for ideological dominance and international influence during the Cold War. The Soviet Union's collapse in 1991 ended the Cold War, leaving the U.S. as the world's sole superpower.

The U.S. federal government is a representative democracy with a president and a constitution that grants separation of powers under three branches: legislative, executive, and judicial. The United States Congress is a bicameral national legislature composed of the House of Representatives (a lower house based on population) and the Senate (an upper house based on equal representation for each state). Federalism grants substantial autonomy to the 50 states. In addition, 574 Native American tribes have sovereignty rights, and there are 326 Native American reservations. Since the 1850s, the Democratic and Republican parties have dominated American politics. American ideals and values are based on a democratic tradition inspired by the American Enlightenment movement.

A developed country, the U.S. ranks high in economic competitiveness, innovation, and higher education. Accounting for over a quarter of nominal global GDP, its economy has been the world's largest since about 1890.
It is the wealthiest country, with the highest disposable household income per capita among OECD members, though its wealth inequality is highly pronounced. Shaped by centuries of immigration, the culture of the U.S. is diverse and globally influential. Making up more than a third of global military spending, the country has one of the strongest armed forces and is a recognized nuclear-weapon state. A member of numerous international organizations, the U.S. plays a major role in global political, cultural, economic, and military affairs.

Etymology

Documented use of the phrase "United States of America" dates back to January 2, 1776. On that day, Stephen Moylan, a Continental Army aide to General George Washington, wrote a letter to Joseph Reed, Washington's aide-de-camp, seeking to go "with full and ample powers from the United States of America to Spain" to seek assistance in the Revolutionary War effort. The first known public usage is an anonymous essay published in the Williamsburg newspaper The Virginia Gazette on April 6, 1776. Sometime on or after June 11, 1776, Thomas Jefferson wrote "United States of America" in a rough draft of the Declaration of Independence, which was adopted by the Second Continental Congress on July 4, 1776.

The term "United States" and its initialism "U.S.", used as nouns or as adjectives in English, are common short names for the country. The initialism "USA", a noun, is also common. "United States" and "U.S." are the established terms throughout the U.S. federal government, with prescribed rules.[l] "The States" is an established colloquial shortening of the name, used particularly from abroad; "stateside" is the corresponding adjective or adverb.

"America" is the feminine form of Americus, the first word of Americus Vesputius, the Latinized name of Italian explorer Amerigo Vespucci (1454–1512);[m] it was first used as a place name by the German cartographers Martin Waldseemüller and Matthias Ringmann in 1507.[n] Vespucci first proposed that the West Indies discovered by Christopher Columbus in 1492 were part of a previously unknown landmass and not among the Indies at the eastern limit of Asia. In English, the term "America" rarely refers to topics unrelated to the United States, despite the use of "the Americas" to describe the totality of the continents of North and South America.

History

The first inhabitants of North America migrated from Siberia approximately 15,000 years ago, either across the Bering land bridge or along the now-submerged Ice Age coastline. Small isolated groups of hunter-gatherers are thought to have migrated alongside herds of large herbivores far into Alaska, with ice-free corridors developing along the Pacific coast and valleys of North America in c. 16,500 – c. 13,500 BCE (c. 18,500 – c. 15,500 BP). The Clovis culture, which appeared around 11,000 BCE, is believed to be the first widespread culture in the Americas. Over time, Indigenous North American cultures grew increasingly sophisticated, and some, such as the Mississippian culture, developed agriculture, architecture, and complex societies. In the post-archaic period, the Mississippian cultures were located in the midwestern, eastern, and southern regions, and the Algonquian in the Great Lakes region and along the Eastern Seaboard, while the Hohokam culture and Ancestral Puebloans inhabited the Southwest. Native population estimates of what is now the United States before the arrival of European colonizers range from around 500,000 to nearly 10 million.
Christopher Columbus began exploring the Caribbean for Spain in 1492, leading to Spanish-speaking settlements and missions from what are now Puerto Rico and Florida to New Mexico and California. The first Spanish colony in the present-day continental United States was Spanish Florida, claimed in 1513. After several settlements there failed due to starvation and disease, Spain's first permanent town, Saint Augustine, was founded in 1565. France established its own settlements in French Florida in 1562, but they were either abandoned (Charlesfort, 1563) or destroyed by Spanish raids (Fort Caroline, 1565). Permanent French settlements were founded much later along the Great Lakes (Fort Detroit, 1701), the Mississippi River (Saint Louis, 1764), and especially the Gulf of Mexico (New Orleans, 1718). Early European colonies also included the thriving Dutch colony of New Netherland (settled 1626, present-day New York) and the small Swedish colony of New Sweden (settled 1638 in what became Delaware).

British colonization of the East Coast began with the Virginia Colony (1607) and the Plymouth Colony (Massachusetts, 1620). The Mayflower Compact in Massachusetts and the Fundamental Orders of Connecticut established precedents for local representative self-governance and constitutionalism that would develop throughout the American colonies. While European settlers in what is now the United States experienced conflicts with Native Americans, they also engaged in trade, exchanging European tools for food and animal pelts.[o] Relations ranged from close cooperation to warfare and massacres. The colonial authorities often pursued policies that forced Native Americans to adopt European lifestyles, including conversion to Christianity. Along the eastern seaboard, settlers trafficked Africans through the Atlantic slave trade, largely to provide manual labor on plantations.

The original Thirteen Colonies[p] that would later found the United States were administered as possessions of the British Empire by Crown-appointed governors, though local governments held elections open to most white male property owners. The colonial population grew rapidly from Maine to Georgia, eclipsing Native American populations; by the 1770s, the natural increase of the population was such that only a small minority of Americans had been born overseas. The colonies' distance from Britain facilitated the entrenchment of self-governance, and the First Great Awakening, a series of Christian revivals, fueled colonial interest in guaranteed religious liberty.

Following its victory in the French and Indian War, Britain began to assert greater control over local affairs in the Thirteen Colonies, resulting in growing political resistance. One of the primary grievances of the colonists was the denial of their rights as Englishmen, particularly the right to representation in the British government that taxed them. To demonstrate their dissatisfaction and resolve, the First Continental Congress met in 1774 and passed the Continental Association, a colonial boycott of British goods enforced by local "committees of safety" that proved effective. The subsequent British attempt to disarm the colonists resulted in the 1775 Battles of Lexington and Concord, igniting the American Revolutionary War. At the Second Continental Congress, the colonies appointed George Washington commander-in-chief of the Continental Army and created a committee that named Thomas Jefferson to draft the Declaration of Independence.
Two days after the Second Continental Congress passed the Lee Resolution to create an independent, sovereign nation, the Declaration was adopted on July 4, 1776. The political values of the American Revolution evolved from an armed rebellion demanding reform within an empire to a revolution that created a new social and governing system founded on the defense of liberty and the protection of inalienable natural rights; sovereignty of the people; republicanism over monarchy, aristocracy, and other hereditary political power; civic virtue; and an intolerance of political corruption. The Founding Fathers of the United States, who included Washington, Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, James Madison, Thomas Paine, and many others, were inspired by Classical, Renaissance, and Enlightenment philosophies and ideas.

Though in practical effect since their drafting in 1777, the Articles of Confederation were ratified in 1781 and formally established a decentralized government that operated until 1789. After the British surrender at the siege of Yorktown in 1781, American sovereignty was internationally recognized by the Treaty of Paris (1783), through which the U.S. gained territory stretching west to the Mississippi River, north to present-day Canada, and south to Spanish Florida. The Northwest Ordinance (1787) established the precedent by which the country's territory would expand with the admission of new states, rather than the expansion of existing states. The U.S. Constitution was drafted at the 1787 Constitutional Convention to overcome the limitations of the Articles. It went into effect in 1789, creating a federal republic governed by three separate branches that together formed a system of checks and balances. George Washington was elected the country's first president under the Constitution, and the Bill of Rights was adopted in 1791 to allay skeptics' concerns about the power of the more centralized government. Washington's resignation as commander-in-chief after the Revolutionary War and his later refusal to run for a third term as the country's first president established a precedent for the supremacy of civil authority in the United States and the peaceful transfer of power.

In the late 18th century, American settlers began to expand westward in larger numbers, many with a sense of manifest destiny. The Louisiana Purchase of 1803 from France nearly doubled the territory of the United States. Lingering issues with Britain remained, leading to the War of 1812, which was fought to a draw. Spain ceded Florida and its Gulf Coast territory in 1819. The Missouri Compromise of 1820, which admitted Missouri as a slave state and Maine as a free state, attempted to balance the desire of northern states to prevent the expansion of slavery into new territories with that of southern states to extend it there. Primarily, the compromise prohibited slavery in all other lands of the Louisiana Purchase north of the 36°30′ parallel.

As Americans expanded further into territory inhabited by Native Americans, the federal government implemented policies of Indian removal or assimilation. The most significant such legislation was the Indian Removal Act of 1830, a key policy of President Andrew Jackson. It resulted in the Trail of Tears (1830–1850), in which an estimated 60,000 Native Americans living east of the Mississippi River were forcibly removed and displaced to lands far to the west, causing 13,200 to 16,700 deaths along the forced march.
Settler expansion as well as this influx of Indigenous peoples from the East resulted in the American Indian Wars west of the Mississippi. During the colonial period, slavery became legal in all of the Thirteen Colonies, but by 1770 it provided the main labor force in the large-scale, agriculture-dependent economies of the Southern Colonies from Maryland to Georgia. The practice began to be significantly questioned during the American Revolution, and states in the North enacted laws to prohibit slavery within their boundaries; an active abolitionist movement reemerged in the 1830s. At the same time, support for slavery had strengthened in Southern states, with widespread use of inventions such as the cotton gin (1793) having made slavery immensely profitable for Southern elites.

The United States annexed the Republic of Texas in 1845, and the 1846 Oregon Treaty led to U.S. control of the present-day American Northwest. A dispute with Mexico over Texas led to the Mexican–American War (1846–1848). After the victory of the U.S., Mexico recognized U.S. sovereignty over Texas, New Mexico, and California in the 1848 Mexican Cession; the cession's lands also included the future states of Nevada, Colorado, and Utah. The California gold rush of 1848–1849 spurred a huge migration of white settlers to the Pacific coast, leading to even more confrontations with Native populations. One of the most violent, the California genocide of thousands of Native inhabitants, lasted into the mid-1870s. Additional western territories and states were created.

Throughout the 1850s, the sectional conflict regarding slavery was further inflamed by national legislation in the U.S. Congress and decisions of the Supreme Court. In Congress, the Fugitive Slave Act of 1850 mandated the forcible return to their owners in the South of slaves taking refuge in non-slave states, while the Kansas–Nebraska Act of 1854 effectively gutted the anti-slavery requirements of the Missouri Compromise. In its Dred Scott decision of 1857, the Supreme Court ruled against a slave brought into non-slave territory, simultaneously declaring the entire Missouri Compromise to be unconstitutional. These and other events exacerbated tensions between North and South that would culminate in the American Civil War (1861–1865).

Beginning with South Carolina in December 1860, 11 slave-state governments voted to secede from the United States, joining to create the Confederate States of America. All other state governments remained loyal to the Union.[q] War broke out in April 1861 after the Confederacy bombarded Fort Sumter. Following the Emancipation Proclamation on January 1, 1863, many freed slaves joined the Union army. The war began to turn in the Union's favor following the 1863 Siege of Vicksburg and Battle of Gettysburg, and the Confederates surrendered in 1865 after the Union's victory in the Battle of Appomattox Court House. Efforts toward reconstruction in the secessionist South had begun as early as 1862, but it was only after President Lincoln's assassination that the three Reconstruction Amendments to the Constitution were ratified to protect civil rights. The amendments codified nationally the abolition of slavery and involuntary servitude except as punishment for crimes, promised equal protection under the law for all persons, and prohibited discrimination on the basis of race or previous enslavement. As a result, African Americans took an active political role in ex-Confederate states in the decade following the Civil War.
The former Confederate states were readmitted to the Union, beginning with Tennessee in 1866 and ending with Georgia in 1870. National infrastructure, including transcontinental telegraph and railroads, spurred growth in the American frontier. This was accelerated by the Homestead Acts, through which nearly 10 percent of the total land area of the United States was given away free to some 1.6 million homesteaders. From 1865 through 1917, an unprecedented stream of immigrants arrived in the United States, including 24.4 million from Europe. Most came through the Port of New York, as New York City and other large cities on the East Coast became home to large Jewish, Irish, and Italian populations. Many Northern Europeans as well as significant numbers of Germans and other Central Europeans moved to the Midwest. At the same time, about one million French Canadians migrated from Quebec to New England. During the Great Migration, millions of African Americans left the rural South for urban areas in the North. Alaska was purchased from Russia in 1867.

The Compromise of 1877 is generally considered the end of the Reconstruction era, as it resolved the electoral crisis following the 1876 presidential election and led President Rutherford B. Hayes to reduce the role of federal troops in the South. Immediately, the Redeemers began evicting the carpetbaggers and quickly regained local control of Southern politics in the name of white supremacy. African Americans endured a period of heightened, overt racism following Reconstruction, a time often considered the nadir of American race relations. A series of Supreme Court decisions, including Plessy v. Ferguson, emptied the Fourteenth and Fifteenth Amendments of their force, allowing Jim Crow laws in the South, sundown towns in the Midwest, and segregation in communities across the country to go unchecked; segregation would later be reinforced in part by the policy of redlining adopted by the federal Home Owners' Loan Corporation.

An explosion of technological advancement, accompanied by the exploitation of cheap immigrant labor, led to rapid economic expansion during the Gilded Age of the late 19th century. It continued into the early 20th century, by which time the United States outpaced the economies of Britain, France, and Germany combined. This fostered the amassing of power by a few prominent industrialists, largely by their formation of trusts and monopolies to prevent competition. Tycoons led the nation's expansion in the railroad, petroleum, and steel industries. The United States emerged as a pioneer of the automotive industry. These changes resulted in significant increases in economic inequality, slum conditions, and social unrest, creating the environment for labor unions and socialist movements to begin to flourish. This period eventually ended with the advent of the Progressive Era, which was characterized by significant economic and social reforms.

Pro-American elements in Hawaii overthrew the Hawaiian monarchy; the islands were annexed in 1898. That same year, Puerto Rico, the Philippines, and Guam were ceded to the U.S. by Spain after the latter's defeat in the Spanish–American War. (The Philippines was granted full independence from the U.S. on July 4, 1946, following World War II. Puerto Rico and Guam have remained U.S. territories.) American Samoa was acquired by the United States in 1900 after the Second Samoan Civil War. The U.S. Virgin Islands were purchased from Denmark in 1917.
The United States entered World War I alongside the Allies in 1917, helping to turn the tide against the Central Powers. In 1920, a constitutional amendment granted nationwide women's suffrage. During the 1920s and 1930s, the rise of radio for mass communication and the invention of early television transformed communications nationwide. The Wall Street Crash of 1929 triggered the Great Depression, to which President Franklin D. Roosevelt responded with the New Deal plan of "reform, recovery and relief", a series of unprecedented and sweeping recovery programs and employment relief projects combined with financial reforms and regulations.

Initially neutral during World War II, the U.S. began supplying war materiel to the Allies in March 1941 and entered the war in December after Japan's attack on Pearl Harbor. Agreeing to a "Europe first" policy, the U.S. concentrated its wartime efforts on Japan's allies, Italy and Germany, until their final defeat in May 1945. The U.S. developed the first nuclear weapons and used them against the Japanese cities of Hiroshima and Nagasaki in August 1945, ending the war. The United States was one of the "Four Policemen" who met to plan the post-war world, alongside the United Kingdom, the Soviet Union, and China. The U.S. emerged relatively unscathed from the war, with even greater economic power and international political influence.

The end of World War II in 1945 left the U.S. and the Soviet Union as superpowers, each with its own political, military, and economic sphere of influence. Geopolitical tensions between the two superpowers soon led to the Cold War. The U.S. implemented a policy of containment intended to limit the Soviet Union's sphere of influence; engaged in regime change against governments perceived to be aligned with the Soviets; and prevailed in the Space Race, which culminated with the first crewed Moon landing in 1969.

Domestically, the U.S. experienced economic growth, urbanization, and population growth following World War II. The civil rights movement emerged, with Martin Luther King Jr. becoming a prominent leader in the early 1960s. The Great Society plan of President Lyndon B. Johnson's administration resulted in groundbreaking and broad-reaching laws, policies, and a constitutional amendment to counteract some of the worst effects of lingering institutional racism. The counterculture movement in the U.S. brought significant social changes, including the liberalization of attitudes toward recreational drug use and sexuality. It also encouraged open defiance of the military draft (leading to the end of conscription in 1973) and wide opposition to U.S. intervention in Vietnam, with the U.S. withdrawing completely in 1975. A societal shift in the roles of women was significantly responsible for the large increase in female paid labor participation starting in the 1970s, and by 1985 the majority of American women aged 16 and older were employed.

The fall of communism and the dissolution of the Soviet Union from 1989 to 1991 marked the end of the Cold War and left the United States as the world's sole superpower. This cemented the United States' global influence, reinforcing the concept of the "American Century" as the U.S. dominated international political, cultural, economic, and military affairs. The 1990s saw the longest recorded economic expansion in American history, a dramatic decline in U.S. crime rates, and advances in technology.
Throughout this decade, technological innovations such as the World Wide Web, the evolution of the Pentium microprocessor in accordance with Moore's law, rechargeable lithium-ion batteries, the first gene therapy trial, and cloning either emerged in the U.S. or were improved upon there. The Human Genome Project was formally launched in 1990, while Nasdaq became the first stock market in the United States to trade online in 1998. In the Gulf War of 1991, an American-led international coalition of states expelled an Iraqi invasion force that had occupied neighboring Kuwait.

The September 11 attacks on the United States in 2001 by the pan-Islamist militant organization al-Qaeda led to the war on terror and subsequent military interventions in Afghanistan and in Iraq. The U.S. housing bubble culminated in 2007 with the Great Recession, the largest economic contraction since the Great Depression. In the 2010s and early 2020s, the United States has experienced increased political polarization and democratic backsliding. The country's polarization was violently reflected in the January 2021 Capitol attack, when a mob of insurrectionists entered the U.S. Capitol and sought to prevent the peaceful transfer of power in an attempted self-coup d'état.

Geography

The United States is the world's third-largest country by total area behind Russia and Canada.[c] The 48 contiguous states and the District of Columbia have a combined area of 3,119,885 square miles (8,080,470 km2). In 2021, the United States had 8% of the Earth's permanent meadows and pastures and 10% of its cropland.

Starting in the east, the coastal plain of the Atlantic seaboard gives way to inland forests and rolling hills in the Piedmont plateau region. The Appalachian Mountains and the Adirondack Massif separate the East Coast from the Great Lakes and the grasslands of the Midwest. The Mississippi River System, the world's fourth-longest river system, runs predominantly north–south through the center of the country. The flat and fertile prairie of the Great Plains stretches to the west, interrupted by a highland region in the southeast. The Rocky Mountains, west of the Great Plains, extend north to south across the country, peaking at over 14,000 feet (4,300 m) in Colorado. The supervolcano underlying Yellowstone National Park in the Rocky Mountains, the Yellowstone Caldera, is the continent's largest volcanic feature. Farther west are the rocky Great Basin and the Chihuahuan, Sonoran, and Mojave deserts. In the northwest corner of Arizona, carved by the Colorado River, is the Grand Canyon, a steep-sided canyon and popular tourist destination known for its overwhelming visual size and intricate, colorful landscape.

The Cascade and Sierra Nevada mountain ranges run close to the Pacific coast. The lowest and highest points in the contiguous United States are in the State of California, about 84 miles (135 km) apart. At an elevation of 20,310 feet (6,190.5 m), Alaska's Denali (also called Mount McKinley) is the highest peak in the country and on the continent. Active volcanoes in the U.S. are common throughout Alaska's Alexander and Aleutian Islands. Located entirely outside North America, the archipelago of Hawaii consists of volcanic islands, physiographically and ethnologically part of the Polynesian subregion of Oceania. In addition to its total land area, the United States has one of the world's largest marine exclusive economic zones, spanning approximately 4.5 million square miles (11.7 million km2) of ocean.
With its large size and geographic variety, the United States includes most climate types. East of the 100th meridian, the climate ranges from humid continental in the north to humid subtropical in the south. The western Great Plains are semi-arid. Many mountainous areas of the American West have an alpine climate. The climate is arid in the Southwest, Mediterranean in coastal California, and oceanic in coastal Oregon, Washington, and southern Alaska. Most of Alaska is subarctic or polar. Hawaii, the southern tip of Florida, and U.S. territories in the Caribbean and Pacific are tropical. The United States experiences more high-impact extreme weather events than any other country. States bordering the Gulf of Mexico are prone to hurricanes, and most of the world's tornadoes occur in the country, mainly in Tornado Alley. Due to climate change, extreme weather has become more frequent in the U.S. in the 21st century, with three times the number of reported heat waves compared to the 1960s. Since the 1990s, droughts in the American Southwest have become more persistent and more severe. The regions considered most attractive to the population are also the most vulnerable.

The U.S. is one of 17 megadiverse countries containing large numbers of endemic species: about 17,000 species of vascular plants occur in the contiguous United States and Alaska, and over 1,800 species of flowering plants are found in Hawaii, few of which occur on the mainland. The United States is home to 428 mammal species, 784 birds, 311 reptiles, 295 amphibians, and around 91,000 insect species. There are 63 national parks, and hundreds of other federally managed monuments, forests, and wilderness areas, administered by the National Park Service and other agencies. About 28% of the country's land is publicly owned and federally managed, primarily in the Western States. Most of this land is protected, though some is leased for commercial use, and less than one percent is used for military purposes.

Environmental issues in the United States include debates on non-renewable resources and nuclear energy, air and water pollution, biodiversity, logging and deforestation, and climate change. The U.S. Environmental Protection Agency (EPA) is the federal agency charged with addressing most environment-related issues. The idea of wilderness has shaped the management of public lands since the Wilderness Act of 1964. The Endangered Species Act of 1973 provides a way to protect threatened and endangered species and their habitats; the United States Fish and Wildlife Service implements and enforces the Act. In 2024, the U.S. ranked 35th among 180 countries in the Environmental Performance Index.

Government and politics

The United States is a federal republic of 50 states and a federal capital district, Washington, D.C. The U.S. asserts sovereignty over five unincorporated territories and several uninhabited island possessions. It is the world's oldest surviving federation, and its presidential system of federal government has been adopted, in whole or in part, by many newly independent states worldwide following their decolonization. The Constitution of the United States serves as the country's supreme legal document. Most scholars describe the United States as a liberal democracy.[r]

Composed of three branches, all headquartered in Washington, D.C., the federal government is the national government of the United States. The U.S. Constitution establishes a separation of powers intended to provide a system of checks and balances to prevent any of the three branches from becoming supreme. The three-branch system is known as the presidential system, in contrast to the parliamentary system, where the executive is part of the legislative body. Many countries around the world adopted this aspect of the 1789 Constitution of the United States, especially in the postcolonial Americas.

In the U.S. federal system, sovereign powers are shared among three levels of government specified in the Constitution: the federal government, the states, and Indian tribes. The U.S. also asserts sovereignty over five permanently inhabited territories: American Samoa, Guam, the Northern Mariana Islands, Puerto Rico, and the U.S. Virgin Islands. Residents of the 50 states are governed by their elected state government, under state constitutions compatible with the national constitution, and by elected local governments that are administrative divisions of a state. States are subdivided into counties or county equivalents, and (except for Hawaii) further divided into municipalities, each administered by elected representatives. The District of Columbia is a federal district containing the U.S. capital, Washington, D.C.; the federal district is an administrative division of the federal government. Indian country is made up of 574 federally recognized tribes and 326 Indian reservations. The tribes hold a government-to-government relationship with the U.S. federal government in Washington and are legally defined as domestic dependent nations with inherent tribal sovereignty rights. In addition to the five major territories, the U.S. also asserts sovereignty over the United States Minor Outlying Islands in the Pacific Ocean and the Caribbean. The seven undisputed islands without permanent populations are Baker Island, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Atoll, and Palmyra Atoll. U.S. sovereignty over the unpopulated Bajo Nuevo Bank, Navassa Island, Serranilla Bank, and Wake Island is disputed.

The Constitution is silent on political parties. However, they developed independently in the 18th century with the Federalist and Anti-Federalist parties. Since then, the United States has operated as a de facto two-party system, though the parties have changed over time. Since the mid-19th century, the two main national parties have been the Democratic Party and the Republican Party. The former is perceived as relatively liberal in its political platform, while the latter is perceived as relatively conservative.

The United States has an established structure of foreign relations, with the world's second-largest diplomatic corps as of 2024. It is a permanent member of the United Nations Security Council and home to the United Nations headquarters. The United States is a member of the G7, G20, and OECD intergovernmental organizations. Almost all countries have embassies and many have consulates (official representatives) in the country. Likewise, nearly all countries host formal diplomatic missions with the United States, except Iran, North Korea, and Bhutan. Though Taiwan does not have formal diplomatic relations with the U.S., it maintains close unofficial relations. The United States regularly supplies Taiwan with military equipment to deter potential Chinese aggression.
Its geopolitical attention also turned to the Indo-Pacific when the United States joined the Quadrilateral Security Dialogue with Australia, India, and Japan. The United States has a "Special Relationship" with the United Kingdom and strong ties with Canada, Australia, New Zealand, the Philippines, Japan, South Korea, Israel, and several European Union countries such as France, Italy, Germany, Spain, and Poland. The U.S. works closely with its NATO allies on military and national security issues, and with countries in the Americas through the Organization of American States and the United States–Mexico–Canada Agreement (USMCA). The U.S. exercises full international defense authority and responsibility for Micronesia, the Marshall Islands, and Palau through the Compact of Free Association. It has increasingly conducted strategic cooperation with India, while its ties with China have steadily deteriorated. Beginning in 2014, the U.S. became a key ally of Ukraine. After Donald Trump was elected U.S. president in 2024, he sought to negotiate an end to the Russo-Ukrainian War. He paused all military aid to Ukraine in March 2025, although the aid later resumed. Trump also ended U.S. intelligence sharing with the country, but this too was eventually restored.

The president is the commander-in-chief of the United States Armed Forces and appoints its leaders, the secretary of defense and the Joint Chiefs of Staff. The Department of Defense, headquartered at the Pentagon near Washington, D.C., administers five of the six service branches: the U.S. Army, Marine Corps, Navy, Air Force, and Space Force. The Coast Guard is administered by the Department of Homeland Security in peacetime and can be transferred to the Department of the Navy in wartime. The total strength of the military is about 1.3 million active duty personnel, with an additional 400,000 in reserve. The United States spent $997 billion on its military in 2024, by far the largest amount of any country, making up 37% of global military spending and accounting for 3.4% of the country's GDP. The U.S. possesses 42% of the world's nuclear weapons—the second-largest stockpile after that of Russia. The U.S. military is widely regarded as the most powerful and advanced in the world. The United States has the third-largest combined armed forces in the world, behind the Chinese People's Liberation Army and Indian Armed Forces. The U.S. military operates about 800 bases and facilities abroad and maintains deployments of more than 100 active-duty personnel in 25 foreign countries. The United States has engaged in over 400 military interventions since its founding in 1776, with over half of these occurring between 1950 and 2019 and 25% occurring in the post-Cold War era.

State defense forces (SDFs) are military units that operate under the sole authority of a state government. SDFs are authorized by state and federal law but are under the command of the state's governor. By contrast, the 54 U.S. National Guard organizations[t] fall under the dual control of state or territorial governments and the federal government; their units can also become federalized entities, but SDFs cannot be federalized. The National Guard personnel of a state or territory can be federalized by the president under the National Defense Act Amendments of 1933; this legislation created the Guard and provides for the integration of Army National Guard and Air National Guard units and personnel into the U.S. Army and (since 1947) the U.S. Air Force.
The total number of National Guard members is about 430,000, while the estimated combined strength of SDFs is less than 10,000.

There are about 18,000 police agencies in the United States, operating at levels from local to national. Law in the United States is mainly enforced by local police departments and sheriff's departments in their municipal or county jurisdictions. The state police departments have authority in their respective state, and federal agencies such as the Federal Bureau of Investigation (FBI) and the U.S. Marshals Service have national jurisdiction and specialized duties, such as protecting civil rights, national security, enforcing U.S. federal courts' rulings and federal laws, and combating interstate criminal activity. State courts conduct almost all civil and criminal trials, while federal courts adjudicate the much smaller number of civil and criminal cases that relate to federal law.

There is no unified "criminal justice system" in the United States. The American prison system is largely heterogeneous, with thousands of relatively independent systems operating across federal, state, local, and tribal levels. In 2025, "these systems hold nearly 2 million people in 1,566 state prisons, 98 federal prisons, 3,116 local jails, 1,277 juvenile correctional facilities, 133 immigration detention facilities, and 80 Indian country jails, as well as in military prisons, civil commitment centers, state psychiatric hospitals, and prisons in the U.S. territories." Despite disparate systems of confinement, four main institutions dominate: federal prisons, state prisons, local jails, and juvenile correctional facilities. Federal prisons are run by the Federal Bureau of Prisons and hold pretrial detainees as well as people who have been convicted of federal crimes. State prisons, run by the department of corrections of each state, hold people sentenced and serving prison time (usually longer than one year) for felony offenses. Local jails are county or municipal facilities that incarcerate defendants prior to trial; they also hold those serving short sentences (typically under a year). Juvenile correctional facilities are operated by local or state governments and serve as longer-term placements for any minor adjudicated as delinquent and ordered by a judge to be confined. In January 2023, the United States had the sixth-highest per capita incarceration rate in the world—531 people per 100,000 inhabitants—and the largest prison and jail population in the world, with more than 1.9 million people incarcerated. An analysis of the World Health Organization Mortality Database from 2010 showed U.S. homicide rates "were 7 times higher than in other high-income countries, driven by a gun homicide rate that was 25 times higher".

Economy

The U.S. has a highly developed mixed economy that has been the world's largest nominally since about 1890. Its 2024 gross domestic product (GDP)[e] of more than $29 trillion constituted over 25% of nominal global economic output, or 15% at purchasing power parity (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks first in the world by nominal GDP, second when adjusted for PPP, and ninth by PPP-adjusted GDP per capita. In February 2024, the total U.S. federal government debt was $34.4 trillion. Of the world's 500 largest companies by revenue, 138 were headquartered in the U.S. in 2025, the highest number of any country.

The U.S. dollar is the currency most used in international transactions and the world's foremost reserve currency, backed by the country's dominant economy, its military, the petrodollar system, its large U.S. Treasuries market, and the linked eurodollar market. Several countries use it as their official currency, and in others it is the de facto currency. The U.S. is party to several free trade agreements, including the USMCA. Although the United States has reached a post-industrial level of economic development and is often described as having a service economy, it remains a major industrial power; in 2024, the U.S. manufacturing sector was the world's second-largest by value output after China's. New York City is the world's principal financial center, and its metropolitan area is the world's largest metropolitan economy. The New York Stock Exchange and Nasdaq, both located in New York City, are the world's two largest stock exchanges by market capitalization and trade volume.

The United States is at the forefront of technological advancement and innovation in many economic fields, especially in artificial intelligence; electronics and computers; pharmaceuticals; and medical, aerospace, and military equipment. The country's economy is fueled by abundant natural resources, a well-developed infrastructure, and high productivity. The largest trading partners of the United States are the European Union, Mexico, Canada, China, Japan, South Korea, the United Kingdom, Vietnam, India, and Taiwan. The United States is the world's largest importer and second-largest exporter.[u] It is by far the world's largest exporter of services.

Americans have the highest average household and employee income among OECD member states, and had the fourth-highest median household income in 2023, up from sixth-highest in 2013. With personal consumption expenditures of over $18.5 trillion in 2023, the U.S. has a heavily consumer-driven economy and is the world's largest consumer market. The U.S. ranked first in the number of dollar billionaires and millionaires in 2023, with 735 billionaires and nearly 22 million millionaires. Wealth in the United States is highly concentrated; in 2011, the richest 10% of the adult population owned 72% of the country's household wealth, while the bottom 50% owned just 2%. U.S. wealth inequality has increased substantially since the late 1980s, and income inequality in the U.S. reached a record high in 2019. In 2024, the country had some of the highest wealth and income inequality levels among OECD countries. Since the 1970s, there has been a decoupling of U.S. wage gains from worker productivity. In 2016, the top fifth of earners took home more than half of all income, giving the U.S. one of the widest income distributions among OECD countries.

There were about 771,480 homeless persons in the U.S. in 2024. In 2022, 6.4 million children experienced food insecurity. Feeding America estimates that around one in five, or approximately 13 million, children experience hunger in the U.S. and do not know where or when they will get their next meal. Also in 2022, about 37.9 million people, or 11.5% of the U.S. population, were living in poverty. The United States has a smaller welfare state and redistributes less income through government action than most other high-income countries. It is the only advanced economy that does not guarantee its workers paid vacation nationally and one of a few countries in the world without federal paid family leave as a legal right.
The United States has a higher percentage of low-income workers than almost any other developed country, largely because of a weak collective bargaining system and lack of government support for at-risk workers.

The United States has been a leader in technological innovation since the late 19th century and scientific research since the mid-20th century. Methods for producing interchangeable parts and the establishment of a machine tool industry enabled the large-scale manufacturing of U.S. consumer products in the late 19th century. By the early 20th century, factory electrification, the introduction of the assembly line, and other labor-saving techniques created the system of mass production. In the 21st century, the United States continues to be one of the world's foremost scientific powers, though China has emerged as a major competitor in many fields. The U.S. has the highest research and development expenditures of any country and ranks ninth in such spending as a percentage of GDP. In 2022, the United States was (after China) the country with the second-highest number of published scientific papers. In 2021, the U.S. ranked second (also after China) by the number of patent applications, and third by trademark and industrial design applications (after China and Germany), according to World Intellectual Property Indicators. In 2025, the United States ranked third (after Switzerland and Sweden) in the Global Innovation Index. The United States is considered to be a world leader in the development of artificial intelligence technology. In 2023, the United States was ranked the second most technologically advanced country in the world (after South Korea) by Global Finance magazine.

The United States has maintained a space program since the late 1950s, beginning with the establishment of the National Aeronautics and Space Administration (NASA) in 1958. NASA's Apollo program (1961–1972) achieved the first crewed Moon landing with the 1969 Apollo 11 mission; it remains one of the agency's most significant milestones. Other major endeavors by NASA include the Space Shuttle program (1981–2011), the Voyager program (1972–present), the Hubble and James Webb space telescopes (launched in 1990 and 2021, respectively), and the multi-mission Mars Exploration Program (Spirit and Opportunity, Curiosity, and Perseverance). NASA is one of five agencies collaborating on the International Space Station (ISS); U.S. contributions to the ISS include several modules, including Destiny (2001), Harmony (2007), and Tranquility (2010), as well as ongoing logistical and operational support. The U.S. private sector dominates the global commercial spaceflight industry. Prominent American spaceflight contractors include Blue Origin, Boeing, Lockheed Martin, Northrop Grumman, and SpaceX. NASA programs such as the Commercial Crew Program, Commercial Resupply Services, Commercial Lunar Payload Services, and NextSTEP have facilitated growing private-sector involvement in American spaceflight.

In 2023, the United States received approximately 84% of its energy from fossil fuels; its largest source of energy was petroleum (38%), followed by natural gas (36%), renewable sources (9%), coal (9%), and nuclear power (9%). In 2022, the United States constituted about 4% of the world's population but consumed around 16% of the world's energy. The U.S. ranks as the second-highest emitter of greenhouse gases, behind China. The U.S. is the world's largest producer of nuclear power, generating around 30% of the world's nuclear electricity.
It also has the highest number of nuclear power reactors of any country. As of 2024, the U.S. plans to triple its nuclear power capacity by 2050.

The United States' road network, at 4 million miles (6.4 million kilometers) and owned almost entirely by state and local governments, is the longest in the world. The extensive Interstate Highway System that connects all major U.S. cities is funded mostly by the federal government but maintained by state departments of transportation. The system is further extended by state highways and some private toll roads. In 2022, the U.S. was among the top ten countries in vehicle ownership per capita, with 850 vehicles per 1,000 people. A 2022 study found that 76% of U.S. commuters drive alone and 14% ride a bicycle, including bike owners and users of bike-sharing networks. About 11% use some form of public transportation. Public transportation in the United States is well developed in the largest urban areas, notably New York City, Washington, D.C., Boston, Philadelphia, Chicago, and San Francisco; otherwise, coverage is generally less extensive than in most other developed countries. The U.S. also has many relatively car-dependent localities. Long-distance intercity travel is provided primarily by airlines, but travel by rail is more common along the Northeast Corridor, the only high-speed rail line in the U.S. that meets international standards. Amtrak, the country's government-sponsored national passenger rail company, has a relatively sparse network compared to those of Western European countries. Service is concentrated in the Northeast, California, the Midwest, the Pacific Northwest, and Virginia/Southeast.

The United States has an extensive air transportation network. U.S. civilian airlines are all privately owned. The three largest airlines in the world, by total number of passengers carried, are U.S.-based; American Airlines became the global leader after its 2013 merger with US Airways. Of the 50 busiest airports in the world, 16 are in the United States, including five of the top 10. The world's busiest airport by passenger volume is Hartsfield–Jackson Atlanta International in Atlanta, Georgia. In 2022, most of the 19,969 U.S. airports were owned and operated by local government authorities, and there are also some private airports. Some 5,193 are designated as "public use", including for general aviation. The Transportation Security Administration (TSA) has provided security at most major airports since 2001.

The country's rail transport network, the longest in the world at 182,412.3 mi (293,564.2 km), handles mostly freight (in contrast to more passenger-centered rail in Europe). Because they are often privately owned operations, U.S. railroads lag behind those of the rest of the world in terms of electrification. The country's inland waterways are the world's fifth-longest, totaling 25,482 mi (41,009 km). They are used extensively for freight, recreation, and a small amount of passenger traffic. Of the world's 50 busiest container ports, four are located in the United States, with the busiest in the country being the Port of Los Angeles.

Demographics

The U.S. Census Bureau reported 331,449,281 residents on April 1, 2020,[v] making the United States the third-most-populous country in the world, after India and China. The Census Bureau's official 2025 population estimate was 341,784,857, an increase of 3.1% since the 2020 census. According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S. population had a net gain of one person every 16 seconds, or about 5,400 people per day. In 2023, 51% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 34% had never been married. In 2023, the total fertility rate for the U.S. stood at 1.6 children per woman, and, at 23%, the country had the world's highest rate of children living in single-parent households in 2019. Most Americans live in the suburbs of major metropolitan areas.

The United States has a diverse population; 37 ancestry groups have more than one million members. White Americans with ancestry from Europe, the Middle East, or North Africa form the largest racial and ethnic group, at 57.8% of the United States population. Hispanic and Latino Americans form the second-largest group, at 18.7% of the population. African Americans constitute the country's third-largest ancestry group, at 12.1% of the total U.S. population. Asian Americans are the country's fourth-largest group, composing 5.9% of the population. The country's 3.7 million Native Americans account for about 1%, and some 574 native tribes are recognized by the federal government. In 2024, the median age of the United States population was 39.1 years.

While many languages and dialects are spoken in the United States, English is by far the most commonly spoken and written. English is the de facto official language of the United States, and in 2025 Executive Order 14224 declared it official. The U.S. has never had a de jure official language, however, as Congress has never passed a law designating English as official for all three federal branches. Some laws, such as U.S. naturalization requirements, nonetheless standardize English. Twenty-eight states and the United States Virgin Islands have laws that designate English as the sole official language; 19 states and the District of Columbia have no official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English: Hawaii (Hawaiian), Alaska (twenty Native languages),[w] South Dakota (Sioux), American Samoa (Samoan), Puerto Rico (Spanish), Guam (Chamorro), and the Northern Mariana Islands (Carolinian and Chamorro). In total, 169 Native American languages are spoken in the United States. In Puerto Rico, Spanish is more widely spoken than English.

According to the American Community Survey (2020), some 245.4 million people in the U.S. age five and older spoke only English at home. About 41.2 million spoke Spanish at home, making it the second most commonly used language. Other languages spoken at home by one million people or more include Chinese (3.40 million), Tagalog (1.71 million), Vietnamese (1.52 million), Arabic (1.39 million), French (1.18 million), Korean (1.07 million), and Russian (1.04 million). German, spoken by 1 million people at home in 2010, fell to 857,000 total speakers in 2020.

America's immigrant population is by far the world's largest in absolute terms. In 2022, there were 87.7 million immigrants and U.S.-born children of immigrants in the United States, accounting for nearly 27% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 6% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants.
In 2019, the top countries of origin for immigrants were Mexico (24% of immigrants), India (6%), China (5%), the Philippines (4.5%), and El Salvador (3%). In fiscal year 2022, over one million immigrants (most of whom entered through family reunification) were granted legal residence. The undocumented immigrant population in the U.S. reached a record high of 14 million in 2023.

The First Amendment guarantees the free exercise of religion in the country and forbids Congress from passing laws respecting its establishment. Religious practice is widespread, among the most diverse in the world, and profoundly vibrant. The country has the world's largest Christian population, which includes the fourth-largest population of Catholics. Other notable faiths include Judaism, Buddhism, Hinduism, Islam, New Age, and Native American religions. Religious practice varies significantly by region. "Ceremonial deism" is common in American culture. The overwhelming majority of Americans believe in a higher power or spiritual force, engage in spiritual practices such as prayer, and consider themselves religious or spiritual. In the Southern United States' "Bible Belt", evangelical Protestantism plays a significant role culturally; New England and the Western United States tend to be more secular. Mormonism, a Restorationist movement founded in the U.S. in 1830, is the predominant religion in Utah and a major religion in Idaho.

About 82% of Americans live in metropolitan areas, particularly in suburbs; about half of those reside in cities with populations over 50,000. In 2022, 333 incorporated municipalities had populations over 100,000, nine cities had more than one million residents, and four cities—New York City, Los Angeles, Chicago, and Houston—had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West.

According to the Centers for Disease Control and Prevention (CDC), average U.S. life expectancy at birth reached 79.0 years in 2024, its highest recorded level. This was an increase of 0.6 years over 2023. The CDC attributed the improvement to a significant fall in the number of fatal drug overdoses in the country, noting that "heart disease continues to be the leading cause of death in the United States, followed by cancer and unintentional injuries." In 2024, life expectancy at birth for American men rose to 76.5 years (+0.7 years compared to 2023), while life expectancy for women was 81.4 years (+0.3 years). Starting in 1998, life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been increasing ever since. The Commonwealth Fund reported in 2020 that the U.S. had the highest suicide rate among high-income countries. Approximately one-third of the U.S. adult population is obese and another third is overweight.

The U.S. healthcare system far outspends that of any other country, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes when compared to peer countries for reasons that are debated. The United States is the only developed country without a system of universal healthcare, and a significant proportion of its population does not carry health insurance. Government-funded healthcare coverage for the poor (Medicaid) and for those age 65 and older (Medicare) is available to Americans who meet the programs' income or age qualifications.
In 2010, then-President Obama signed the Patient Protection and Affordable Care Act into law.[x] Abortion in the United States is not federally protected, and is illegal or restricted in 17 states. American primary and secondary education, known in the U.S. as K–12 ("kindergarten through 12th grade"), is decentralized. School systems are operated by state, territorial, and sometimes municipal governments and regulated by the U.S. Department of Education. In general, children are required to attend school or an approved homeschool from the age of five or six (kindergarten or first grade) until they are 18 years old. This often brings students through the 12th grade, the final year of a U.S. high school, but some states and territories allow them to leave school earlier, at age 16 or 17. The U.S. spends more on education per student than any other country, an average of $18,614 per year per public elementary and secondary school student in 2020–2021. Among Americans age 25 and older, 92.2% graduated from high school, 62.7% attended some college, 37.7% earned a bachelor's degree, and 14.2% earned a graduate degree. The U.S. literacy rate is near-universal. The U.S. has produced the most Nobel Prize winners of any country, with 411 laureates (who have together won 413 awards). U.S. tertiary or higher education has earned a global reputation. Many of the world's top universities, as listed by various ranking organizations, are in the United States, including 19 of the top 25. American higher education is dominated by state university systems, although the country's many private universities and colleges enroll about 20% of all American students. Local community colleges generally offer open admissions, lower tuition, and coursework leading to a two-year associate degree or a non-degree certificate. In public expenditures on higher education, the U.S. spends more per student than the OECD average, and it spends more than any other nation in combined public and private spending. Colleges and universities directly funded by the federal government do not charge tuition and are limited to military personnel and government employees, including the U.S. service academies, the Naval Postgraduate School, and military staff colleges. Despite some student loan forgiveness programs being in place, student loan debt increased by 102% between 2010 and 2020, and exceeded $1.7 trillion in 2022. Culture and society The United States is home to a wide variety of ethnic groups, traditions, and customs. The country has been described as having the values of individualism and personal autonomy, as well as a strong work ethic and competitiveness. Voluntary altruism towards others also plays a major role; according to a 2016 study by the Charities Aid Foundation, Americans donated 1.44% of total GDP to charity—the highest rate in the world by a large margin. Americans have traditionally been characterized by a unifying political belief in an "American Creed" emphasizing consent of the governed, liberty, equality under the law, democracy, social equality, property rights, and a preference for limited government. The U.S. has acquired significant hard and soft power through its diplomatic influence, economic power, military alliances, and cultural exports such as American movies, music, video games, sports, and food. The influence that the United States exerts on other countries through soft power is referred to as Americanization.
Nearly all present Americans or their ancestors came from Europe, Africa, or Asia (the "Old World") within the past five centuries. Mainstream American culture is a Western culture largely derived from the traditions of European immigrants with influences from many other sources, such as traditions brought by slaves from Africa. More recent immigration from Asia and especially Latin America has added to a cultural mix that has been described as both a homogenizing melting pot and a heterogeneous salad bowl, with immigrants contributing to, and often assimilating into, mainstream American culture. Under the First Amendment to the Constitution, the United States is considered to have the strongest protections of free speech of any country. Flag desecration, hate speech, blasphemy, and lese majesty are all forms of protected expression. A 2016 Pew Research Center poll found that Americans were the most supportive of free expression of any polity measured. Additionally, they are the "most supportive of freedom of the press and the right to use the Internet without government censorship". The U.S. is a socially progressive country with permissive attitudes surrounding human sexuality. LGBTQ rights in the United States are among the most advanced by global standards. The American Dream, or the perception that Americans enjoy high levels of social mobility, plays a key role in attracting immigrants. Whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a classless society, scholars identify significant differences between the country's social classes, affecting socialization, language, and values. Americans tend to greatly value socioeconomic achievement, but being ordinary or average is promoted by some as a noble condition as well. The National Foundation on the Arts and the Humanities is an agency of the United States federal government that was established in 1965 to "develop and promote a broadly conceived national policy of support for the humanities and the arts in the United States, and for institutions which preserve the cultural heritage of the United States." It is composed of four sub-agencies: the National Endowment for the Arts, the National Endowment for the Humanities, the Institute of Museum and Library Services, and the Federal Council on the Arts and the Humanities. Colonial American authors were influenced by John Locke and other Enlightenment philosophers. The American Revolutionary Period (1765–1783) is notable for the political writings of Benjamin Franklin, Alexander Hamilton, Thomas Paine, and Thomas Jefferson. Shortly before and after the Revolutionary War, the newspaper rose to prominence, filling a demand for anti-British national literature. An early novel is William Hill Brown's The Power of Sympathy, published in 1789. Writer and critic John Neal in the early- to mid-19th century helped advance America toward a unique literature and culture by criticizing predecessors such as Washington Irving for imitating their British counterparts, and by influencing writers such as Edgar Allan Poe, who took American poetry and short fiction in new directions. Ralph Waldo Emerson and Margaret Fuller pioneered the influential Transcendentalism movement; Henry David Thoreau, author of Walden, was influenced by this movement. The conflict surrounding abolitionism inspired writers like Harriet Beecher Stowe and authors of slave narratives such as Frederick Douglass. Nathaniel Hawthorne's The Scarlet Letter (1850) explored the dark side of American history, as did Herman Melville's Moby-Dick (1851).
Major American poets of the 19th century American Renaissance include Walt Whitman, Melville, and Emily Dickinson. Mark Twain was the first major American writer to be born in the West. Henry James achieved international recognition with novels like The Portrait of a Lady (1881). As literacy rates rose, periodicals published more stories centered around industrial workers, women, and the rural poor. Naturalism, regionalism, and realism were the major literary movements of the period. While modernism generally took on an international character, modernist authors working within the United States more often rooted their work in specific regions, peoples, and cultures. Following the Great Migration to northern cities, African-American and black West Indian authors of the Harlem Renaissance developed an independent tradition of literature that rebuked a history of inequality and celebrated black culture. An important cultural export during the Jazz Age, these writings were a key influence on Négritude, a philosophy emerging in the 1930s among francophone writers of the African diaspora. In the 1950s, an ideal of homogeneity led many authors to attempt to write the Great American Novel, while the Beat Generation rejected this conformity, using styles that elevated the impact of the spoken word over mechanics to describe drug use, sexuality, and the failings of society. Contemporary literature is more pluralistic than in previous eras, with the closest thing to a unifying feature being a trend toward self-conscious experiments with language. Twelve American laureates have won the Nobel Prize in Literature. Media in the United States is broadly uncensored, with the First Amendment providing significant protections, as reiterated in New York Times Co. v. United States. The four major broadcasters in the U.S. are the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), American Broadcasting Company (ABC), and Fox Broadcasting Company (Fox). The four major broadcast television networks are all commercial entities. The U.S. cable television system offers hundreds of channels catering to a variety of niches. In 2021, about 83% of Americans over age 12 listened to broadcast radio, while about 40% listened to podcasts. In the prior year, there were 15,460 licensed full-power radio stations in the U.S. according to the Federal Communications Commission (FCC). Much of the public radio broadcasting is supplied by National Public Radio (NPR), incorporated in February 1970 under the Public Broadcasting Act of 1967. U.S. newspapers with a global reach and reputation include The Wall Street Journal, The New York Times, The Washington Post, and USA Today. About 800 publications are produced in Spanish. With few exceptions, newspapers are privately owned, either by large chains such as Gannett or McClatchy, which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in an increasingly rare situation, by individuals or families. Major cities often have alternative newspapers to complement the mainstream daily papers, such as The Village Voice in New York City and LA Weekly in Los Angeles. The five most-visited websites in the world are Google, YouTube, Facebook, Instagram, and ChatGPT—all of them American-owned. Other popular platforms used include X (formerly Twitter) and Amazon. In 2025, the U.S. was the world's second-largest video game market by revenue (after China). In 2015, the U.S. 
video game industry consisted of 2,457 companies that provided around 220,000 jobs and generated $30.4 billion in revenue. There are 444 game publishers, developers, and hardware companies in California alone. According to the Game Developers Conference (GDC), the U.S. is the top location for video game development, with 58% of the world's game developers based there in 2025. The United States is well known for its theater. Mainstream theater in the United States derives from the old European theatrical tradition and has been heavily influenced by the British theater. By the middle of the 19th century, America had created new, distinct dramatic forms in the Tom Shows, the showboat theater, and the minstrel show. The central hub of the American theater scene is the Theater District in Manhattan, with its divisions of Broadway, off-Broadway, and off-off-Broadway. Many movie and television celebrities have gotten their big break working in New York productions. Outside New York City, many cities have professional regional or resident theater companies that produce their own seasons. The biggest-budget theatrical productions are musicals. U.S. theater also has an active community theater culture. The Tony Awards recognize excellence in live Broadway theater and are presented at an annual ceremony in Manhattan. The awards are given for Broadway productions and performances; one is also given for regional theater. Several discretionary non-competitive awards are given as well, including a Special Tony Award, the Tony Honors for Excellence in Theatre, and the Isabelle Stevenson Award. Folk art in colonial America grew out of artisanal craftsmanship in communities that allowed commonly trained people to individually express themselves. It was distinct from Europe's tradition of high art, which was less accessible and generally less relevant to early American settlers. Cultural movements in art and craftsmanship in colonial America generally lagged behind those of Western Europe. For example, the prevailing medieval style of woodworking and primitive sculpture became integral to early American folk art, despite the emergence of Renaissance styles in England in the late 16th and early 17th centuries. The new English styles would have arrived early enough to make a considerable impact on American folk art, but American styles and forms had already been firmly adopted. Not only did styles change slowly in early America, but there was a tendency for rural artisans there to continue their traditional forms longer than their urban counterparts did—and far longer than those in Western Europe. The Hudson River School was a mid-19th-century movement in the visual arts tradition of European naturalism. The 1913 Armory Show in New York City, an exhibition of European modernist art, shocked the public and transformed the U.S. art scene. American Realism and American Regionalism sought to reflect and give America new ways of looking at itself. Georgia O'Keeffe, Marsden Hartley, and others experimented with new and individualistic styles, which would become known as American modernism. Major artistic movements such as the abstract expressionism of Jackson Pollock and Willem de Kooning and the pop art of Andy Warhol and Roy Lichtenstein developed largely in the United States. Major photographers include Alfred Stieglitz, Edward Steichen, Dorothea Lange, Edward Weston, James Van Der Zee, Ansel Adams, and Gordon Parks.
The tide of modernism and then postmodernism has brought global fame to American architects, including Frank Lloyd Wright, Philip Johnson, and Frank Gehry. The Metropolitan Museum of Art in Manhattan is the largest art museum in the United States and the fourth-largest in the world. American folk music encompasses numerous music genres, variously known as traditional music, traditional folk music, contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and sometimes trace back to such origins as the British Isles, mainland Europe, or Africa. The rhythmic and lyrical styles of African-American music in particular have influenced American music. Banjos were brought to America through the slave trade. Minstrel shows incorporating the instrument into their acts led to its increased popularity and widespread production in the 19th century. The electric guitar, first invented in the 1930s and mass-produced by the 1940s, had an enormous influence on popular music, in particular due to the development of rock and roll. The synthesizer, turntablism, and electronic music were also largely developed in the U.S. Elements from folk idioms such as the blues and old-time music were adopted and transformed into popular genres with global audiences. Jazz grew from blues and ragtime in the early 20th century, developing from the innovations and recordings of composers such as W.C. Handy and Jelly Roll Morton; Louis Armstrong and Duke Ellington later broadened its popularity. Country music developed in the 1920s, bluegrass and rhythm and blues in the 1940s, and rock and roll in the 1950s. In the 1960s, Bob Dylan emerged from the folk revival to become one of the country's most celebrated songwriters. The musical forms of punk and hip hop both originated in the United States in the 1970s. The United States has the world's largest music market, with a total retail value of $15.9 billion in 2022. Most of the world's major record companies are based in the U.S.; they are represented by the Recording Industry Association of America (RIAA). Mid-20th-century American pop stars, such as Frank Sinatra and Elvis Presley, became global celebrities and best-selling music artists, as have artists of the late 20th century, such as Michael Jackson, Madonna, Whitney Houston, and Mariah Carey, and of the early 21st century, such as Eminem, Britney Spears, Lady Gaga, Katy Perry, Taylor Swift, and Beyoncé. The United States has the world's largest apparel market by revenue. Apart from professional business attire, American fashion is eclectic and predominantly informal. Americans' diverse cultural roots are reflected in their clothing; however, sneakers, jeans, T-shirts, and baseball caps are emblematic of American styles. New York, with its Fashion Week, is considered to be one of the "Big Four" global fashion capitals, along with Paris, Milan, and London. Proximity to Manhattan's Garment District has been closely associated with American fashion since the district's emergence in the early 20th century. A number of well-known designer labels, among them Tommy Hilfiger, Ralph Lauren, Tom Ford, and Calvin Klein, are headquartered in Manhattan. Some labels cater to niche markets, such as preteens. New York Fashion Week is one of the most influential fashion shows in the world, and is held twice each year in Manhattan; the annual Met Gala, also in Manhattan, has been called the fashion world's "biggest night". The U.S.
film industry has a worldwide influence and following. Hollywood, a district in central Los Angeles, the nation's second-most populous city, is also metonymous for the American filmmaking industry. The major film studios of the United States are the primary source of the most commercially successful and highest-grossing films in the world. Largely centered in the New York City region from its beginnings in the late 19th century through the first decades of the 20th century, the U.S. film industry has since been primarily based in and around Hollywood. Nonetheless, American film companies have been subject to the forces of globalization in the 21st century, and an increasing number of films are made elsewhere. The Academy Awards, popularly known as "the Oscars", have been held annually by the Academy of Motion Picture Arts and Sciences since 1929, and the Golden Globe Awards have been held annually since January 1944. The industry peaked in what is commonly referred to as the "Golden Age of Hollywood", from the early sound period until the early 1960s, with screen actors such as John Wayne and Marilyn Monroe becoming iconic figures. In the 1970s, "New Hollywood", or the "Hollywood Renaissance", was defined by grittier films influenced by French and Italian realist pictures of the post-war period. The 21st century has been marked by the rise of American streaming platforms, which came to rival traditional cinema. Early settlers were introduced by Native Americans to foods such as turkey, sweet potatoes, corn, squash, and maple syrup. Among the most enduring and pervasive examples are variations of the native dish succotash. Early settlers and later immigrants combined these with foods they were familiar with, such as wheat flour, beef, and milk, to create a distinctive American cuisine. New World foods, especially pumpkin, corn, and potatoes, along with turkey as the main course, are part of a shared national menu on Thanksgiving, when many Americans prepare or purchase traditional dishes to celebrate the occasion. Characteristic American dishes such as apple pie, fried chicken, doughnuts, french fries, macaroni and cheese, ice cream, hamburgers, hot dogs, and American pizza derive from the recipes of various immigrant groups. Mexican dishes such as burritos and tacos preexisted the United States in areas later annexed from Mexico, and adaptations of Chinese cuisine as well as pasta dishes freely adapted from Italian sources are all widely consumed. American chefs have had a significant impact on society both domestically and internationally. In 1946, the Culinary Institute of America was founded by Katharine Angell and Frances Roth. It became the United States' most prestigious culinary school, where many of the country's most talented chefs have trained before going on to successful careers. The United States restaurant industry was projected at $899 billion in sales for 2020, and employed more than 15 million people, representing 10% of the nation's workforce directly. It is the country's second-largest private employer and the third-largest employer overall. The United States is home to over 220 Michelin star-rated restaurants, 70 of which are in New York City. Wine has been produced in what is now the United States since the 1500s, with the first widespread production beginning in what is now New Mexico in 1628. In the modern U.S., wine production is undertaken in all fifty states, with California producing 84 percent of all U.S. wine.
With more than 1,100,000 acres (4,500 km2) under vine, the United States is the fourth-largest wine-producing country in the world, after Italy, Spain, and France. The classic American diner, a casual restaurant type originally intended for the working class, emerged during the 19th century from converted railroad dining cars made stationary. The diner soon evolved into purpose-built structures whose number expanded greatly in the 20th century. The American fast-food industry developed alongside the nation's car culture. American restaurants developed the drive-in format in the 1920s, which they began to replace with the drive-through format by the 1940s. American fast-food restaurant chains, such as McDonald's, Burger King, Chick-fil-A, Kentucky Fried Chicken, Dunkin' Donuts and many others, have numerous outlets around the world. The most popular spectator sports in the U.S. are American football, basketball, baseball, soccer, and ice hockey. Their premier leagues are, respectively, the National Football League, the National Basketball Association, Major League Baseball, Major League Soccer, and the National Hockey League. All of these leagues enjoy wide-ranging domestic media coverage and, except for MLS, all are considered the preeminent leagues in their respective sports in the world. While most major U.S. sports such as baseball and American football have evolved out of European practices, basketball, volleyball, skateboarding, and snowboarding are American inventions, many of which have become popular worldwide. Lacrosse and surfing arose from Native American and Native Hawaiian activities that predate European contact. The market for professional sports in the United States was approximately $69 billion in July 2013, roughly 50% larger than that of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States. Although American football does not have a substantial following in other nations, the NFL does have the highest average attendance (67,254) of any professional sports league in the world. In 2024, the NFL generated over $23 billion in revenue, making it the most valuable professional sports league in the world. Baseball has been regarded as the U.S. "national sport" since the late 19th century. The most-watched individual sports in the U.S. are golf and auto racing, particularly NASCAR and IndyCar. At the collegiate level, earnings for member institutions exceed $1 billion annually, and college football and basketball attract large audiences, as the NCAA March Madness tournament and the College Football Playoff are some of the most watched national sporting events. In the U.S., intercollegiate sports serve as the main feeder system for professional and Olympic sports, with significant exceptions such as Minor League Baseball. This differs greatly from practices in nearly all other countries, where publicly and privately funded sports organizations serve this function. Eight Olympic Games have taken place in the United States. The 1904 Summer Olympics in St. Louis, Missouri, were the first-ever Olympic Games held outside of Europe. The Olympic Games will be held in the U.S. for a ninth time when Los Angeles hosts the 2028 Summer Olympics. U.S. athletes have won a total of 2,968 medals (1,179 gold) at the Olympic Games, the most of any country.
In other international competition, the United States is the home of a number of prestigious events, including the America's Cup, World Baseball Classic, the U.S. Open, and the Masters Tournament. The U.S. men's national soccer team has qualified for eleven World Cups, while the women's national team has won the FIFA Women's World Cup and Olympic soccer tournament four and five times, respectively. The 1999 FIFA Women's World Cup was hosted by the United States. Its final match was attended by 90,185, setting the world record for largest women's sporting event crowd at the time. The United States hosted the 1994 FIFA World Cup and will co-host, along with Canada and Mexico, the 2026 FIFA World Cup.
The United States is the wealthiest country, with the highest disposable household income per capita among OECD members, though its wealth inequality is highly pronounced. Shaped by centuries of immigration, the culture of the U.S. is diverse and globally influential. Making up more than a third of global military spending, the country has one of the world's strongest militaries and is a designated nuclear state. A member of numerous international organizations, the U.S. plays a major role in global political, cultural, economic, and military affairs. Etymology Documented use of the phrase "United States of America" dates back to January 2, 1776. On that day, Stephen Moylan, a Continental Army aide to General George Washington, wrote a letter to Joseph Reed, Washington's aide-de-camp, seeking to go "with full and ample powers from the United States of America to Spain" to seek assistance in the Revolutionary War effort. The first known public usage is an anonymous essay published in the Williamsburg newspaper The Virginia Gazette on April 6, 1776. Sometime on or after June 11, 1776, Thomas Jefferson wrote "United States of America" in a rough draft of the Declaration of Independence, which was adopted by the Second Continental Congress on July 4, 1776. The term "United States" and its initialism "U.S.", used as nouns or as adjectives in English, are common short names for the country. The initialism "USA", a noun, is also common. "United States" and "U.S." are the established terms throughout the U.S. federal government, with prescribed rules.[l] "The States" is an established colloquial shortening of the name, used particularly from abroad; "stateside" is the corresponding adjective or adverb. "America" is the feminine form of the first word of Americus Vesputius, the Latinized name of Italian explorer Amerigo Vespucci (1454–1512);[m] it was first used as a place name by the German cartographers Martin Waldseemüller and Matthias Ringmann in 1507.[n] Vespucci first proposed that the West Indies discovered by Christopher Columbus in 1492 were part of a previously unknown landmass and not among the Indies at the eastern limit of Asia. In English, the term "America" usually does not refer to topics unrelated to the United States, despite the usage of "the Americas" to describe the totality of the continents of North and South America. History The first inhabitants of North America migrated from Siberia approximately 15,000 years ago, either across the Bering land bridge or along the now-submerged Ice Age coastline. Small isolated groups of hunter-gatherers are said to have migrated alongside herds of large herbivores far into Alaska, with ice-free corridors developing along the Pacific coast and valleys of North America in c. 16,500 – c. 13,500 BCE (c. 18,500 – c. 15,500 BP). The Clovis culture, which appeared around 11,000 BCE, is believed to be the first widespread culture in the Americas. Over time, Indigenous North American cultures grew increasingly sophisticated, and some, such as the Mississippian culture, developed agriculture, architecture, and complex societies. In the post-archaic period, the Mississippian cultures were located in the midwestern, eastern, and southern regions, and the Algonquian in the Great Lakes region and along the Eastern Seaboard, while the Hohokam culture and Ancestral Puebloans inhabited the Southwest. Native population estimates of what is now the United States before the arrival of European colonizers range from around 500,000 to nearly 10 million.
Christopher Columbus began exploring the Caribbean for Spain in 1492, leading to Spanish-speaking settlements and missions from what are now Puerto Rico and Florida to New Mexico and California. The first Spanish colony in the present-day continental United States was Spanish Florida, chartered in 1513. After several settlements failed there due to starvation and disease, Spain's first permanent town, Saint Augustine, was founded in 1565. France established its own settlements in French Florida in 1562, but they were either abandoned (Charlesfort, 1578) or destroyed by Spanish raids (Fort Caroline, 1565). Permanent French settlements were founded much later along the Great Lakes (Fort Detroit, 1701), the Mississippi River (Saint Louis, 1764) and especially the Gulf of Mexico (New Orleans, 1718). Early European colonies also included the thriving Dutch colony of New Netherland (settled 1626, present-day New York) and the small Swedish colony of New Sweden (settled 1638 in what became Delaware). British colonization of the East Coast began with the Virginia Colony (1607) and the Plymouth Colony (Massachusetts, 1620). The Mayflower Compact in Massachusetts and the Fundamental Orders of Connecticut established precedents for local representative self-governance and constitutionalism that would develop throughout the American colonies. While European settlers in what is now the United States experienced conflicts with Native Americans, they also engaged in trade, exchanging European tools for food and animal pelts.[o] Relations ranged from close cooperation to warfare and massacres. The colonial authorities often pursued policies that forced Native Americans to adopt European lifestyles, including conversion to Christianity. Along the eastern seaboard, settlers trafficked Africans through the Atlantic slave trade, largely to provide manual labor on plantations. The original Thirteen Colonies[p] that would later found the United States were administered as possessions of the British Empire by Crown-appointed governors, though local governments held elections open to most white male property owners. The colonial population grew rapidly from Maine to Georgia, eclipsing Native American populations; by the 1770s, the natural increase of the population was such that only a small minority of Americans had been born overseas. The colonies' distance from Britain facilitated the entrenchment of self-governance, and the First Great Awakening, a series of Christian revivals, fueled colonial interest in guaranteed religious liberty. Following its victory in the French and Indian War, Britain began to assert greater control over local affairs in the Thirteen Colonies, resulting in growing political resistance. One of the primary grievances of the colonists was the denial of their rights as Englishmen, particularly the right to representation in the British government that taxed them. To demonstrate their dissatisfaction and resolve, the First Continental Congress met in 1774 and passed the Continental Association, a colonial boycott of British goods enforced by local "committees of safety" that proved effective. The subsequent British attempt to disarm the colonists resulted in the 1775 Battles of Lexington and Concord, igniting the American Revolutionary War. At the Second Continental Congress, the colonies appointed George Washington commander-in-chief of the Continental Army, and created a committee that named Thomas Jefferson to draft the Declaration of Independence.
Two days after the Second Continental Congress passed the Lee Resolution to create an independent, sovereign nation, the Declaration was adopted on July 4, 1776. The political values of the American Revolution evolved from an armed rebellion demanding reform within an empire to a revolution that created a new social and governing system founded on the defense of liberty and the protection of inalienable natural rights; sovereignty of the people; republicanism over monarchy, aristocracy, and other hereditary political power; civic virtue; and an intolerance of political corruption. The Founding Fathers of the United States, who included Washington, Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, James Madison, Thomas Paine, and many others, were inspired by Classical, Renaissance, and Enlightenment philosophies and ideas. Though in effect in practice since their drafting in 1777, the Articles of Confederation were ratified in 1781 and formally established a decentralized government that operated until 1789. After the British surrender at the siege of Yorktown in 1781, American sovereignty was internationally recognized by the Treaty of Paris (1783), through which the U.S. gained territory stretching west to the Mississippi River, north to present-day Canada, and south to Spanish Florida. The Northwest Ordinance (1787) established the precedent by which the country's territory would expand with the admission of new states, rather than the expansion of existing states. The U.S. Constitution was drafted at the 1787 Constitutional Convention to overcome the limitations of the Articles. It went into effect in 1789, creating a federal republic governed by three separate branches that together formed a system of checks and balances. George Washington was elected the country's first president under the Constitution, and the Bill of Rights was adopted in 1791 to allay skeptics' concerns about the power of the more centralized government. Washington's resignation as commander-in-chief after the Revolutionary War and his later refusal to run for a third term as president established a precedent for the supremacy of civil authority in the United States and the peaceful transfer of power. In the late 18th century, American settlers began to expand westward in larger numbers, many with a sense of manifest destiny. The Louisiana Purchase of 1803 from France nearly doubled the territory of the United States. Lingering issues with Britain remained, leading to the War of 1812, which was fought to a draw. Spain ceded Florida and its Gulf Coast territory in 1819. The Missouri Compromise of 1820, which admitted Missouri as a slave state and Maine as a free state, attempted to balance the desire of northern states to prevent the expansion of slavery into new territories with that of southern states to extend it there. Primarily, the compromise prohibited slavery in all other lands of the Louisiana Purchase north of the 36°30′ parallel. As Americans expanded further into territory inhabited by Native Americans, the federal government implemented policies of Indian removal or assimilation. The most significant such legislation was the Indian Removal Act of 1830, a key policy of President Andrew Jackson. It resulted in the Trail of Tears (1830–1850), in which an estimated 60,000 Native Americans living east of the Mississippi River were forcibly removed and displaced to lands far to the west, causing between 13,200 and 16,700 deaths along the forced marches.
Settler expansion as well as this influx of Indigenous peoples from the East resulted in the American Indian Wars west of the Mississippi. During the colonial period, slavery became legal in all of the Thirteen Colonies, but by 1770 it provided the main labor force in the large-scale, agriculture-dependent economies of the Southern Colonies from Maryland to Georgia. The practice began to be significantly questioned during the American Revolution, and spurred by an active abolitionist movement that had reemerged in the 1830s, states in the North enacted laws to prohibit slavery within their boundaries. At the same time, support for slavery had strengthened in Southern states, as the widespread use of inventions such as the cotton gin (1793) had made slavery immensely profitable for Southern elites. The United States annexed the Republic of Texas in 1845, and the 1846 Oregon Treaty led to U.S. control of the present-day American Northwest. A dispute with Mexico over Texas led to the Mexican–American War (1846–1848). After the U.S. victory, Mexico recognized U.S. sovereignty over Texas, New Mexico, and California in the 1848 Mexican Cession; the cession's lands also included the future states of Nevada, Colorado, and Utah. The California gold rush of 1848–1849 spurred a huge migration of white settlers to the Pacific coast, leading to even more confrontations with Native populations. One of the most violent, the California genocide of thousands of Native inhabitants, lasted into the mid-1870s. Additional western territories and states were created. Throughout the 1850s, the sectional conflict regarding slavery was further inflamed by national legislation in the U.S. Congress and decisions of the Supreme Court. In Congress, the Fugitive Slave Act of 1850 mandated that slaves who had taken refuge in non-slave states be forcibly returned to their owners in the South, while the Kansas–Nebraska Act of 1854 effectively gutted the anti-slavery requirements of the Missouri Compromise. In its Dred Scott decision of 1857, the Supreme Court ruled against the freedom claim of an enslaved man who had been taken into non-slave territory, simultaneously declaring the entire Missouri Compromise unconstitutional. These and other events exacerbated tensions between North and South that would culminate in the American Civil War (1861–1865). Beginning with South Carolina, 11 slave-state governments voted to secede from the United States in 1861, joining to create the Confederate States of America. All other state governments remained loyal to the Union.[q] War broke out in April 1861 after the Confederacy bombarded Fort Sumter. Following the Emancipation Proclamation on January 1, 1863, many freed slaves joined the Union army. The war began to turn in the Union's favor following the 1863 Siege of Vicksburg and Battle of Gettysburg, and the Confederates surrendered in 1865 after the Union's victory in the Battle of Appomattox Court House. Efforts toward reconstruction in the secessionist South had begun as early as 1862, but it was only after President Lincoln's assassination that the three Reconstruction Amendments to the Constitution were ratified to protect civil rights. The amendments codified nationally the abolition of slavery and involuntary servitude except as punishment for crimes, promised equal protection under the law for all persons, and prohibited discrimination on the basis of race or previous enslavement. As a result, African Americans took an active political role in ex-Confederate states in the decade following the Civil War.
The former Confederate states were readmitted to the Union, beginning with Tennessee in 1866 and ending with Georgia in 1870. National infrastructure, including transcontinental telegraph and railroads, spurred growth in the American frontier. This was accelerated by the Homestead Acts, through which nearly 10 percent of the total land area of the United States was given away free to some 1.6 million homesteaders. From 1865 through 1917, an unprecedented stream of immigrants arrived in the United States, including 24.4 million from Europe. Most came through the Port of New York, as New York City and other large cities on the East Coast became home to large Jewish, Irish, and Italian populations. Many Northern Europeans as well as significant numbers of Germans and other Central Europeans moved to the Midwest. At the same time, about one million French Canadians migrated from Quebec to New England. During the Great Migration, millions of African Americans left the rural South for urban areas in the North. Alaska was purchased from Russia in 1867. The Compromise of 1877 is generally considered the end of the Reconstruction era, as it resolved the electoral crisis following the 1876 presidential election and led President Rutherford B. Hayes to reduce the role of federal troops in the South. Immediately, the Redeemers began evicting the Carpetbaggers and quickly regained local control of Southern politics in the name of white supremacy. African Americans endured a period of heightened, overt racism following Reconstruction, a time often considered the nadir of American race relations. A series of Supreme Court decisions, including Plessy v. Ferguson, emptied the Fourteenth and Fifteenth Amendments of their force, allowing Jim Crow laws in the South to remain unchecked, sundown towns in the Midwest, and segregation in communities across the country, which would be reinforced in part by the policy of redlining later adopted by the federal Home Owners' Loan Corporation. An explosion of technological advancement, accompanied by the exploitation of cheap immigrant labor, led to rapid economic expansion during the Gilded Age of the late 19th century. It continued into the early 20th century, by which time the United States already outpaced the economies of Britain, France, and Germany combined. This fostered the amassing of power by a few prominent industrialists, largely by their formation of trusts and monopolies to prevent competition. Tycoons led the nation's expansion in the railroad, petroleum, and steel industries. The United States emerged as a pioneer of the automotive industry. These changes resulted in significant increases in economic inequality, slum conditions, and social unrest, creating the environment for labor unions and socialist movements to begin to flourish. This period eventually ended with the advent of the Progressive Era, which was characterized by significant economic and social reforms. Pro-American elements in Hawaii overthrew the Hawaiian monarchy; the islands were annexed in 1898. That same year, Puerto Rico, the Philippines, and Guam were ceded to the U.S. by Spain after the latter's defeat in the Spanish–American War. (The Philippines was granted full independence from the U.S. on July 4, 1946, following World War II. Puerto Rico and Guam have remained U.S. territories.) American Samoa was acquired by the United States in 1900 after the Second Samoan Civil War. The U.S. Virgin Islands were purchased from Denmark in 1917.
The United States entered World War I alongside the Allies in 1917, helping to turn the tide against the Central Powers. In 1920, a constitutional amendment granted nationwide women's suffrage. During the 1920s and 1930s, the rise of radio for mass communication and the advent of early television transformed communications nationwide. The Wall Street Crash of 1929 triggered the Great Depression, to which President Franklin D. Roosevelt responded with the New Deal plan of "reform, recovery and relief", a series of unprecedented and sweeping recovery programs and employment relief projects combined with financial reforms and regulations. Initially neutral during World War II, the U.S. began supplying war materiel to the Allies of World War II in March 1941 and entered the war in December after Japan's attack on Pearl Harbor. Agreeing to a "Europe first" policy, the U.S. concentrated its wartime efforts on Japan's allies, Italy and Germany, until their final defeat in May 1945. The U.S. developed the first nuclear weapons and used them against the Japanese cities of Hiroshima and Nagasaki in August 1945, ending the war. The United States was one of the "Four Policemen" who met to plan the post-war world, alongside the United Kingdom, the Soviet Union, and China. The U.S. emerged relatively unscathed from the war, with even greater economic power and international political influence. The end of World War II in 1945 left the U.S. and the Soviet Union as superpowers, each with its own political, military, and economic sphere of influence. Geopolitical tensions between the two superpowers soon led to the Cold War. The U.S. implemented a policy of containment intended to limit the Soviet Union's sphere of influence; engaged in regime change against governments perceived to be aligned with the Soviets; and prevailed in the Space Race, which culminated with the first crewed Moon landing in 1969. Domestically, the U.S. experienced economic growth, urbanization, and population growth following World War II. The civil rights movement emerged, with Martin Luther King Jr. becoming a prominent leader in the early 1960s. The Great Society plan of President Lyndon B. Johnson's administration resulted in groundbreaking and broad-reaching laws, policies, and a constitutional amendment to counteract some of the worst effects of lingering institutional racism. The counterculture movement in the U.S. brought significant social changes, including the liberalization of attitudes toward recreational drug use and sexuality. It also encouraged open defiance of the military draft (leading to the end of conscription in 1973) and wide opposition to U.S. intervention in Vietnam, from which the U.S. completely withdrew in 1975. A societal shift in the roles of women was significantly responsible for the large increase in female paid labor participation starting in the 1970s, and by 1985 the majority of American women aged 16 and older were employed. The fall of communism and the dissolution of the Soviet Union from 1989 to 1991 marked the end of the Cold War and left the United States as the world's sole superpower. This cemented the United States' global influence, reinforcing the concept of the "American Century" as the U.S. dominated international political, cultural, economic, and military affairs. The 1990s saw the longest recorded economic expansion in American history, a dramatic decline in U.S. crime rates, and advances in technology.
Throughout this decade, technological innovations such as the World Wide Web, the evolution of the Pentium microprocessor in accordance with Moore's law, rechargeable lithium-ion batteries, the first gene therapy trial, and cloning either emerged in the U.S. or were improved upon there. The Human Genome Project was formally launched in 1990, while Nasdaq became the first stock market in the United States to trade online in 1998. In the Gulf War of 1991, an American-led international coalition of states expelled an Iraqi invasion force that had occupied neighboring Kuwait. The September 11 attacks on the United States in 2001 by the pan-Islamist militant organization al-Qaeda led to the war on terror and subsequent military interventions in Afghanistan and Iraq. The collapse of the U.S. housing bubble in 2007 triggered the Great Recession, the largest economic contraction since the Great Depression. In the 2010s and early 2020s, the United States has experienced increased political polarization and democratic backsliding. The country's polarization was violently reflected in the January 2021 Capitol attack, when a mob of insurrectionists entered the U.S. Capitol and sought to prevent the peaceful transfer of power in an attempted self-coup d'état. Geography The United States is the world's third-largest country by total area behind Russia and Canada.[c] The 48 contiguous states and the District of Columbia have a combined area of 3,119,885 square miles (8,080,470 km2). In 2021, the United States had 8% of the Earth's permanent meadows and pastures and 10% of its cropland. Starting in the east, the coastal plain of the Atlantic seaboard gives way to inland forests and rolling hills in the Piedmont plateau region. The Appalachian Mountains and the Adirondack Massif separate the East Coast from the Great Lakes and the grasslands of the Midwest. The Mississippi River System, the world's fourth-longest river system, runs predominantly north–south through the center of the country. The flat and fertile prairie of the Great Plains stretches to the west, interrupted by a highland region in the southeast. The Rocky Mountains, west of the Great Plains, extend north to south across the country, peaking at over 14,000 feet (4,300 m) in Colorado. The supervolcano underlying Yellowstone National Park in the Rocky Mountains, the Yellowstone Caldera, is the continent's largest volcanic feature. Farther west are the rocky Great Basin and the Chihuahuan, Sonoran, and Mojave deserts. In the northwest corner of Arizona, carved by the Colorado River, is the Grand Canyon, a steep-sided canyon and popular tourist destination known for its overwhelming visual size and intricate, colorful landscape. The Cascade and Sierra Nevada mountain ranges run close to the Pacific coast. The lowest and highest points in the contiguous United States are in the State of California, about 84 miles (135 km) apart. At an elevation of 20,310 feet (6,190.5 m), Alaska's Denali (also called Mount McKinley) is the highest peak in the country and on the continent. Active volcanoes are common throughout Alaska's Alexander and Aleutian Islands. Located entirely outside North America, the archipelago of Hawaii consists of volcanic islands, physiographically and ethnologically part of the Polynesian subregion of Oceania. In addition to its total land area, the United States has one of the world's largest marine exclusive economic zones, spanning approximately 4.5 million square miles (11.7 million km2) of ocean.
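The geography figures above pair U.S. customary and metric units; the snippet below is a minimal illustration using standard conversion factors (the factors themselves are not from the source) showing that the paired values are consistent:

    # Illustrative unit-conversion check of the figures cited above.
    SQ_MI_TO_KM2 = 2.589988   # square miles to square kilometers
    FT_TO_M = 0.3048          # feet to meters

    print(3_119_885 * SQ_MI_TO_KM2)   # ~8,080,465 km2: the 48 contiguous states plus D.C.
    print(4_500_000 * SQ_MI_TO_KM2)   # ~11.65 million km2: the marine exclusive economic zone
    print(20_310 * FT_TO_M)           # 6,190.5 m: Denali's summit elevation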
With its large size and geographic variety, the United States includes most climate types. East of the 100th meridian, the climate ranges from humid continental in the north to humid subtropical in the south. The western Great Plains are semi-arid. Many mountainous areas of the American West have an alpine climate. The climate is arid in the Southwest, Mediterranean in coastal California, and oceanic in coastal Oregon, Washington, and southern Alaska. Most of Alaska is subarctic or polar. Hawaii, the southern tip of Florida, and U.S. territories in the Caribbean and Pacific are tropical. The United States receives more high-impact extreme weather incidents than any other country. States bordering the Gulf of Mexico are prone to hurricanes, and most of the world's tornadoes occur in the country, mainly in Tornado Alley. Due to climate change, extreme weather has become more frequent in the U.S. in the 21st century, with three times the number of reported heat waves compared to the 1960s. Since the 1990s, droughts in the American Southwest have become more persistent and more severe. The regions considered most attractive to new residents are often also the most vulnerable to these hazards. The U.S. is one of 17 megadiverse countries containing large numbers of endemic species: about 17,000 species of vascular plants occur in the contiguous United States and Alaska, and over 1,800 species of flowering plants are found in Hawaii, few of which occur on the mainland. The United States is home to 428 mammal species, 784 birds, 311 reptiles, 295 amphibians, and around 91,000 insect species. There are 63 national parks and hundreds of other federally managed monuments, forests, and wilderness areas, administered by the National Park Service and other agencies. About 28% of the country's land is publicly owned and federally managed, primarily in the Western States. Most of this land is protected, though some is leased for commercial use, and less than one percent is used for military purposes. Environmental issues in the United States include debates on non-renewable resources and nuclear energy, air and water pollution, biodiversity, logging and deforestation, and climate change. The U.S. Environmental Protection Agency (EPA) is the federal agency charged with addressing most environment-related issues. The idea of wilderness has shaped the management of public lands since the passage of the Wilderness Act in 1964. The Endangered Species Act of 1973 provides a way to protect threatened and endangered species and their habitats. The United States Fish and Wildlife Service implements and enforces the Act. In 2024, the U.S. ranked 35th among 180 countries in the Environmental Performance Index. Government and politics The United States is a federal republic of 50 states and a federal capital district, Washington, D.C. The U.S. asserts sovereignty over five unincorporated territories and several uninhabited island possessions. It is the world's oldest surviving federation, and its presidential system of federal government has been adopted, in whole or in part, by many newly independent states worldwide following their decolonization. The Constitution of the United States serves as the country's supreme legal document. Most scholars describe the United States as a liberal democracy.[r] Composed of three branches, all headquartered in Washington, D.C., the federal government is the national government of the United States. The U.S.
Constitution establishes a separation of powers intended to provide a system of checks and balances that prevents any of the three branches from becoming supreme. This three-branch arrangement is known as the presidential system, in contrast to the parliamentary system, in which the executive is part of the legislative body. Many countries around the world adopted this aspect of the 1789 Constitution of the United States, especially in the postcolonial Americas. In the U.S. federal system, sovereign powers are shared among three levels of government specified in the Constitution: the federal government, the states, and Indian tribes. The U.S. also asserts sovereignty over five permanently inhabited territories: American Samoa, Guam, the Northern Mariana Islands, Puerto Rico, and the U.S. Virgin Islands. Residents of the 50 states are governed by their elected state government, under state constitutions compatible with the national constitution, and by elected local governments that are administrative divisions of a state. States are subdivided into counties or county equivalents, and (except for Hawaii) further divided into municipalities, each administered by elected representatives. The District of Columbia is a federal district containing the U.S. capital, Washington, D.C., and is an administrative division of the federal government. Indian country is made up of 574 federally recognized tribes and 326 Indian reservations. The tribes hold a government-to-government relationship with the U.S. federal government in Washington and are legally defined as domestic dependent nations with inherent tribal sovereignty rights. In addition to the five major territories, the U.S. also asserts sovereignty over the United States Minor Outlying Islands in the Pacific Ocean and the Caribbean. The seven undisputed islands without permanent populations are Baker Island, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Atoll, and Palmyra Atoll. U.S. sovereignty over the unpopulated Bajo Nuevo Bank, Navassa Island, Serranilla Bank, and Wake Island is disputed.

The Constitution is silent on political parties. However, they developed independently in the 18th century with the Federalist and Anti-Federalist parties. Since then, the United States has operated as a de facto two-party system, though the parties have changed over time. Since the mid-19th century, the two main national parties have been the Democratic Party and the Republican Party; the former is perceived as relatively liberal in its platform, while the latter is perceived as relatively conservative.

The United States has an established structure of foreign relations, with the world's second-largest diplomatic corps as of 2024. It is a permanent member of the United Nations Security Council and home to the United Nations headquarters. The United States is a member of the G7, G20, and OECD intergovernmental organizations. Almost all countries have embassies in the United States, and many also maintain consulates (official representatives). Likewise, nearly all countries maintain formal diplomatic relations with the United States; the exceptions are Iran, North Korea, and Bhutan. Though Taiwan does not have formal diplomatic relations with the U.S., it maintains close unofficial relations. The United States regularly supplies Taiwan with military equipment to deter potential Chinese aggression.
Its geopolitical attention also turned to the Indo-Pacific when the United States joined the Quadrilateral Security Dialogue with Australia, India, and Japan. The United States has a "Special Relationship" with the United Kingdom and strong ties with Canada, Australia, New Zealand, the Philippines, Japan, South Korea, Israel, and several European Union countries such as France, Italy, Germany, Spain, and Poland. The U.S. works closely with its NATO allies on military and national security issues, and with countries in the Americas through the Organization of American States and the United States–Mexico–Canada Agreement. The U.S. exercises full international defense authority and responsibility for Micronesia, the Marshall Islands, and Palau through the Compact of Free Association. It has increasingly conducted strategic cooperation with India, while its ties with China have steadily deteriorated. Beginning in 2014, the U.S. became a key ally of Ukraine. After Donald Trump was elected U.S. president in 2024, he sought to negotiate an end to the Russo-Ukrainian War. He paused all military aid to Ukraine in March 2025, although the aid later resumed. Trump also ended U.S. intelligence sharing with Ukraine, but this too was eventually restored.

The president is the commander-in-chief of the United States Armed Forces and appoints its leaders, the secretary of defense and the Joint Chiefs of Staff. The Department of Defense, headquartered at the Pentagon near Washington, D.C., administers five of the six service branches: the U.S. Army, Marine Corps, Navy, Air Force, and Space Force. The Coast Guard is administered by the Department of Homeland Security in peacetime and can be transferred to the Department of the Navy in wartime. The military's total strength is about 1.3 million active-duty personnel, with an additional 400,000 in reserve. The United States spent $997 billion on its military in 2024, by far the largest amount of any country, making up 37% of global military spending and accounting for 3.4% of the country's GDP. The U.S. possesses 42% of the world's nuclear weapons, the second-largest stockpile after that of Russia. The U.S. military is widely regarded as the most powerful and advanced in the world. The United States has the third-largest combined armed forces in the world, behind the Chinese People's Liberation Army and the Indian Armed Forces. The U.S. military operates about 800 bases and facilities abroad and maintains deployments of more than 100 active-duty personnel in 25 foreign countries. The United States has engaged in over 400 military interventions since its founding in 1776, with over half of these occurring between 1950 and 2019 and 25% occurring in the post-Cold War era. State defense forces (SDFs) are military units that operate under the sole authority of a state government. SDFs are authorized by state and federal law but are under the command of the state's governor. By contrast, the 54 U.S. National Guard organizations[t] fall under the dual control of state or territorial governments and the federal government; their units can also become federalized entities, but SDFs cannot be federalized. The National Guard personnel of a state or territory can be federalized by the president under the National Defense Act Amendments of 1933; this legislation created the Guard and provides for the integration of Army National Guard and Air National Guard units and personnel into the U.S. Army and (since 1947) the U.S. Air Force.
The total number of National Guard members is about 430,000, while the estimated combined strength of SDFs is less than 10,000. There are about 18,000 police agencies in the United States, operating at every level from local to national. Law in the United States is mainly enforced by local police departments and sheriff's departments within their municipal or county jurisdictions. State police departments have authority in their respective states, while federal agencies such as the Federal Bureau of Investigation (FBI) and the U.S. Marshals Service have national jurisdiction and specialized duties, such as protecting civil rights and national security, enforcing the rulings of U.S. federal courts and federal laws, and investigating interstate criminal activity. State courts conduct almost all civil and criminal trials, while federal courts adjudicate the much smaller number of civil and criminal cases that relate to federal law. There is no unified "criminal justice system" in the United States. The American prison system is largely heterogeneous, with thousands of relatively independent systems operating across federal, state, local, and tribal levels. In 2025, "these systems hold nearly 2 million people in 1,566 state prisons, 98 federal prisons, 3,116 local jails, 1,277 juvenile correctional facilities, 133 immigration detention facilities, and 80 Indian country jails, as well as in military prisons, civil commitment centers, state psychiatric hospitals, and prisons in the U.S. territories." Despite disparate systems of confinement, four main institutions dominate: federal prisons, state prisons, local jails, and juvenile correctional facilities. Federal prisons are run by the Federal Bureau of Prisons and hold pretrial detainees as well as people convicted of federal crimes. State prisons, run by each state's department of corrections, hold people sentenced and serving prison time (usually longer than one year) for felony offenses. Local jails are county or municipal facilities that incarcerate defendants prior to trial; they also hold those serving short sentences (typically under a year). Juvenile correctional facilities are operated by local or state governments and serve as longer-term placements for any minor adjudicated as delinquent and ordered by a judge to be confined. In January 2023, the United States had the sixth-highest per capita incarceration rate in the world—531 people per 100,000 inhabitants—and the largest prison and jail population in the world, with more than 1.9 million people incarcerated. An analysis of the World Health Organization Mortality Database from 2010 showed U.S. homicide rates "were 7 times higher than in other high-income countries, driven by a gun homicide rate that was 25 times higher".

Economy

The U.S. has a highly developed mixed economy that has been the world's largest nominally since about 1890. Its 2024 gross domestic product (GDP)[e] of more than $29 trillion constituted over 25% of nominal global economic output, or 15% at purchasing power parity (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks first in the world by nominal GDP, second when adjusted for purchasing power parity, and ninth by PPP-adjusted GDP per capita. In February 2024, the total U.S. federal government debt was $34.4 trillion. Of the world's 500 largest companies by revenue, 138 were headquartered in the U.S. in 2025, the highest number of any country. The U.S.
dollar is the currency most used in international transactions and the world's foremost reserve currency, backed by the country's dominant economy, its military, the petrodollar system, its large U.S. Treasuries market, and the linked eurodollar market. Several countries use it as their official currency, and in others it is the de facto currency. The U.S. has free trade agreements with several countries, including the USMCA with Canada and Mexico. Although the United States has reached a post-industrial level of economic development and is often described as having a service economy, it remains a major industrial power; in 2024, the U.S. manufacturing sector was the world's second-largest by value output, after China's. New York City is the world's principal financial center, and its metropolitan area is the world's largest metropolitan economy. The New York Stock Exchange and Nasdaq, both located in New York City, are the world's two largest stock exchanges by market capitalization and trade volume. The United States is at the forefront of technological advancement and innovation in many economic fields, especially in artificial intelligence; electronics and computers; pharmaceuticals; and medical, aerospace, and military equipment. The country's economy is fueled by abundant natural resources, a well-developed infrastructure, and high productivity. The largest trading partners of the United States are the European Union, Mexico, Canada, China, Japan, South Korea, the United Kingdom, Vietnam, India, and Taiwan. The United States is the world's largest importer and second-largest exporter.[u] It is by far the world's largest exporter of services.

Americans have the highest average household and employee income among OECD member states and had the fourth-highest median household income in 2023, up from sixth-highest in 2013. With personal consumption expenditures of over $18.5 trillion in 2023, the U.S. has a heavily consumer-driven economy and is the world's largest consumer market. The U.S. ranked first in the number of dollar billionaires and millionaires in 2023, with 735 billionaires and nearly 22 million millionaires. Wealth in the United States is highly concentrated; in 2011, the richest 10% of the adult population owned 72% of the country's household wealth, while the bottom 50% owned just 2%. U.S. wealth inequality has increased substantially since the late 1980s, and income inequality in the U.S. reached a record high in 2019. In 2024, the country had some of the highest wealth and income inequality levels among OECD countries. Since the 1970s, there has been a decoupling of U.S. wage gains from worker productivity. In 2016, the top fifth of earners took home more than half of all income, giving the U.S. one of the widest income distributions among OECD countries. There were about 771,480 homeless persons in the U.S. in 2024. In 2022, 6.4 million children experienced food insecurity; Feeding America estimates that around one in five children, or approximately 13 million, experience hunger in the U.S. and do not know where or when they will get their next meal. Also in 2022, about 37.9 million people, or 11.5% of the U.S. population, were living in poverty. The United States has a smaller welfare state and redistributes less income through government action than most other high-income countries. It is the only advanced economy that does not guarantee its workers paid vacation nationally and one of a few countries in the world without federal paid family leave as a legal right.
The United States has a higher percentage of low-income workers than almost any other developed country, largely because of a weak collective bargaining system and a lack of government support for at-risk workers.

The United States has been a leader in technological innovation since the late 19th century and in scientific research since the mid-20th century. Methods for producing interchangeable parts and the establishment of a machine tool industry enabled the large-scale manufacturing of U.S. consumer products in the late 19th century. By the early 20th century, factory electrification, the introduction of the assembly line, and other labor-saving techniques created the system of mass production. In the 21st century, the United States continues to be one of the world's foremost scientific powers, though China has emerged as a major competitor in many fields. The U.S. has the highest research and development expenditures of any country, though it ranks ninth as a share of GDP. In 2022, the United States was (after China) the country with the second-highest number of published scientific papers. In 2021, the U.S. ranked second (also after China) by the number of patent applications, and third by trademark and industrial design applications (after China and Germany), according to World Intellectual Property Indicators. In 2025, the United States ranked third (after Switzerland and Sweden) in the Global Innovation Index. The United States is considered a world leader in the development of artificial intelligence technology. In 2023, the United States was ranked the second most technologically advanced country in the world (after South Korea) by Global Finance magazine.

The United States has maintained a space program since the late 1950s, beginning with the establishment of the National Aeronautics and Space Administration (NASA) in 1958. NASA's Apollo program (1961–1972) achieved the first crewed Moon landing with the 1969 Apollo 11 mission; it remains one of the agency's most significant milestones. Other major endeavors by NASA include the Space Shuttle program (1981–2011), the Voyager program (1972–present), the Hubble and James Webb space telescopes (launched in 1990 and 2021, respectively), and the multi-mission Mars Exploration Program (Spirit and Opportunity, Curiosity, and Perseverance). NASA is one of five agencies collaborating on the International Space Station (ISS); U.S. contributions to the ISS include several modules, among them Destiny (2001), Harmony (2007), and Tranquility (2010), as well as ongoing logistical and operational support. The United States private sector dominates the global commercial spaceflight industry. Prominent American spaceflight contractors include Blue Origin, Boeing, Lockheed Martin, Northrop Grumman, and SpaceX. NASA programs such as the Commercial Crew Program, Commercial Resupply Services, Commercial Lunar Payload Services, and NextSTEP have facilitated growing private-sector involvement in American spaceflight.

In 2023, the United States received approximately 84% of its energy from fossil fuels; its largest source of energy was petroleum (38%), followed by natural gas (36%), renewable sources (9%), coal (9%), and nuclear power (9%). In 2022, the United States constituted about 4% of the world's population but consumed around 16% of the world's energy. The U.S. ranks as the second-highest emitter of greenhouse gases, behind China. The U.S. is the world's largest producer of nuclear power, generating around 30% of the world's nuclear electricity.
It also has the highest number of nuclear power reactors of any country. As of 2024, the U.S. plans to triple its nuclear power capacity by 2050.

The United States' road network, owned almost entirely by state and local governments, is the longest in the world at 4 million miles (6.4 million kilometers). The extensive Interstate Highway System, which connects all major U.S. cities, is funded mostly by the federal government but maintained by state departments of transportation. The system is further extended by state highways and some private toll roads. In 2022, the U.S. was among the top ten countries in vehicle ownership per capita, with 850 vehicles per 1,000 people. A 2022 study found that 76% of U.S. commuters drive alone and 14% ride a bicycle, including bike owners and users of bike-sharing networks; about 11% use some form of public transportation. Public transportation in the United States is well developed in the largest urban areas, notably New York City, Washington, D.C., Boston, Philadelphia, Chicago, and San Francisco; otherwise, coverage is generally less extensive than in most other developed countries, and many localities remain relatively car-dependent. Long-distance intercity travel is provided primarily by airlines, though travel by rail is more common along the Northeast Corridor, the only high-speed rail line in the U.S. that meets international standards. Amtrak, the country's government-sponsored national passenger rail company, has a relatively sparse network compared to those of Western European countries; service is concentrated in the Northeast, California, the Midwest, the Pacific Northwest, and Virginia and the Southeast.

The United States has an extensive air transportation network, and U.S. civilian airlines are all privately owned. The three largest airlines in the world by total number of passengers carried are U.S.-based; American Airlines became the global leader after its 2013 merger with US Airways. Sixteen of the world's 50 busiest airports, including five of the top ten, are in the United States. The world's busiest airport by passenger volume is Hartsfield–Jackson Atlanta International in Atlanta, Georgia. In 2022, most of the 19,969 U.S. airports were owned and operated by local government authorities, while some were privately owned. Some 5,193 are designated as "public use", including for general aviation. The Transportation Security Administration (TSA) has provided security at most major airports since 2001. The country's rail transport network, the longest in the world at 182,412.3 mi (293,564.2 km), handles mostly freight, in contrast to the more passenger-centered rail networks of Europe. Because they are often privately owned operations, U.S. railroads lag behind the rest of the world in electrification. The country's inland waterways are the world's fifth-longest, totaling 25,482 mi (41,009 km); they are used extensively for freight, recreation, and a small amount of passenger traffic. Of the world's 50 busiest container ports, four are located in the United States, the busiest of which is the Port of Los Angeles.

Demographics

The U.S. Census Bureau reported 331,449,281 residents on April 1, 2020,[v] making the United States the third-most-populous country in the world, after India and China. The Census Bureau's official 2025 population estimate was 341,784,857, an increase of 3.1% since the 2020 census. According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S.
population had a net gain of one person every 16 seconds, or about 5,400 people per day. In 2023, 51% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 34% had never been married. In 2023, the total fertility rate for the U.S. stood at 1.6 children per woman. At 23%, the country had the world's highest rate of children living in single-parent households in 2019. Most Americans live in the suburbs of major metropolitan areas.

The United States has a diverse population; 37 ancestry groups have more than one million members. White Americans with ancestry from Europe, the Middle East, or North Africa form the largest racial and ethnic group, at 57.8% of the United States population. Hispanic and Latino Americans form the second-largest group, at 18.7%. African Americans constitute the country's third-largest ancestry group, at 12.1%, and Asian Americans are the fourth-largest, at 5.9%. The country's 3.7 million Native Americans account for about 1%, and some 574 native tribes are recognized by the federal government. In 2024, the median age of the United States population was 39.1 years.

While many languages and dialects are spoken in the United States, English is by far the most commonly spoken and written. English has long been the country's de facto official language, and in 2025, Executive Order 14224 declared English the official language of the United States. However, Congress has never passed a law designating English as official for all three federal branches, so the country has never had a de jure official language. Some laws, such as U.S. naturalization requirements, nonetheless mandate the use of English. Twenty-eight states and the United States Virgin Islands have laws that designate English as the sole official language; 19 states and the District of Columbia have no official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English: Hawaii (Hawaiian), Alaska (twenty Native languages),[w] South Dakota (Sioux), American Samoa (Samoan), Puerto Rico (Spanish), Guam (Chamorro), and the Northern Mariana Islands (Carolinian and Chamorro). In total, 169 Native American languages are spoken in the United States. In Puerto Rico, Spanish is more widely spoken than English. According to the American Community Survey (2020), some 245.4 million people in the U.S. age five and older spoke only English at home. About 41.2 million spoke Spanish at home, making it the second most commonly used language. Other languages spoken at home by one million people or more include Chinese (3.40 million), Tagalog (1.71 million), Vietnamese (1.52 million), Arabic (1.39 million), French (1.18 million), Korean (1.07 million), and Russian (1.04 million). German, spoken at home by 1 million people in 2010, fell to 857,000 speakers in 2020.

America's immigrant population is by far the world's largest in absolute terms. In 2022, there were 87.7 million immigrants and U.S.-born children of immigrants in the United States, accounting for nearly 27% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 6% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants.
In 2019, the top countries of origin for immigrants were Mexico (24% of immigrants), India (6%), China (5%), the Philippines (4.5%), and El Salvador (3%). In fiscal year 2022, over one million immigrants (most of whom entered through family reunification) were granted legal residence. The undocumented immigrant population in the U.S. reached a record high of 14 million in 2023.

The First Amendment guarantees the free exercise of religion in the country and forbids Congress from passing laws respecting its establishment. Religious practice is widespread, among the most diverse in the world, and profoundly vibrant. The country has the world's largest Christian population, which includes the fourth-largest population of Catholics. Other notable faiths include Judaism, Buddhism, Hinduism, Islam, New Age, and Native American religions. Religious practice varies significantly by region. "Ceremonial deism" is common in American culture. The overwhelming majority of Americans believe in a higher power or spiritual force, engage in spiritual practices such as prayer, and consider themselves religious or spiritual. In the Southern United States' "Bible Belt", evangelical Protestantism plays a significant role culturally; New England and the Western United States tend to be more secular. Mormonism, a Restorationist movement founded in the U.S. in 1830, is the predominant religion in Utah and a major religion in Idaho.

About 82% of Americans live in metropolitan areas, particularly in suburbs; about half of those reside in cities with populations over 50,000. In 2022, 333 incorporated municipalities had populations over 100,000, nine cities had more than one million residents, and four cities—New York City, Los Angeles, Chicago, and Houston—had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West.

According to the Centers for Disease Control and Prevention (CDC), average U.S. life expectancy at birth reached 79.0 years in 2024, its highest recorded level and an increase of 0.6 years over 2023. The CDC attributed the improvement to a significant fall in the number of fatal drug overdoses in the country, noting that "heart disease continues to be the leading cause of death in the United States, followed by cancer and unintentional injuries." In 2024, life expectancy at birth for American men rose to 76.5 years (+0.7 years compared to 2023), while life expectancy for women was 81.4 years (+0.3 years). Starting in 1998, life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been growing since. The Commonwealth Fund reported in 2020 that the U.S. had the highest suicide rate among high-income countries. Approximately one-third of the U.S. adult population is obese and another third is overweight. The U.S. healthcare system far outspends that of any other country, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes than peer countries for reasons that are debated. The United States is the only developed country without a system of universal healthcare, and a significant proportion of the population does not carry health insurance. Government-funded healthcare coverage for the poor (Medicaid) and for those age 65 and older (Medicare) is available to Americans who meet the programs' income or age qualifications.
In 2010, President Barack Obama signed the Patient Protection and Affordable Care Act into law.[x] Abortion in the United States is not federally protected and is illegal or restricted in 17 states.

American primary and secondary education, known in the U.S. as K–12 ("kindergarten through 12th grade"), is decentralized. School systems are operated by state, territorial, and sometimes municipal governments and regulated by the U.S. Department of Education. In general, children are required to attend school or an approved homeschool from the age of five or six (kindergarten or first grade) until they are 18 years old. This often brings students through the 12th grade, the final year of a U.S. high school, but some states and territories allow them to leave school earlier, at age 16 or 17. The U.S. spends more on education per student than any other country, an average of $18,614 per public elementary and secondary school student in 2020–2021. Among Americans age 25 and older, 92.2% graduated from high school, 62.7% attended some college, 37.7% earned a bachelor's degree, and 14.2% earned a graduate degree. The U.S. literacy rate is near-universal. The U.S. has produced the most Nobel Prize winners of any country, with 411 laureates (having won 413 awards). U.S. tertiary or higher education has earned a global reputation. Many of the world's top universities, as listed by various ranking organizations, are in the United States, including 19 of the top 25. American higher education is dominated by state university systems, although the country's many private universities and colleges enroll about 20% of all American students. Local community colleges generally offer open admissions, lower tuition, and coursework leading to a two-year associate degree or a non-degree certificate. In public expenditures on higher education, the U.S. spends more per student than the OECD average, and Americans spend more than all other nations in combined public and private spending. Colleges and universities directly funded by the federal government do not charge tuition and are limited to military personnel and government employees; they include the U.S. service academies, the Naval Postgraduate School, and military staff colleges. Despite some student loan forgiveness programs, student loan debt increased by 102% between 2010 and 2020 and exceeded $1.7 trillion in 2022.

Culture and society

The United States is home to a wide variety of ethnic groups, traditions, and customs. The country has been described as having the values of individualism and personal autonomy, as well as a strong work ethic and competitiveness. Voluntary altruism towards others also plays a major role; according to a 2016 study by the Charities Aid Foundation, Americans donated 1.44% of total GDP to charity, the highest rate in the world by a large margin. Americans have traditionally been characterized by a unifying political belief in an "American Creed" emphasizing consent of the governed, liberty, equality under the law, democracy, social equality, property rights, and a preference for limited government. The U.S. has acquired significant hard and soft power through its diplomatic influence, economic power, military alliances, and cultural exports such as American movies, music, video games, sports, and food. The influence that the United States exerts on other countries through soft power is referred to as Americanization.
Nearly all present Americans or their ancestors came from Europe, Africa, or Asia (the "Old World") within the past five centuries. Mainstream American culture is a Western culture largely derived from the traditions of European immigrants, with influences from many other sources, such as traditions brought by slaves from Africa. More recent immigration from Asia and especially Latin America has added to a cultural mix that has been described as both a homogenizing melting pot and a heterogeneous salad bowl, with immigrants contributing to, and often assimilating into, mainstream American culture. Under the First Amendment to the Constitution, the United States is considered to have the strongest protections of free speech of any country. Flag desecration, hate speech, blasphemy, and lese majesty are all forms of protected expression. A 2016 Pew Research Center poll found that Americans were the most supportive of free expression of any polity measured, as well as the "most supportive of freedom of the press and the right to use the Internet without government censorship". The U.S. is a socially progressive country with permissive attitudes surrounding human sexuality, and LGBTQ rights in the United States are among the most advanced by global standards. The American Dream, or the perception that Americans enjoy high levels of social mobility, plays a key role in attracting immigrants; whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a classless society, scholars identify significant differences between the country's social classes, affecting socialization, language, and values. Americans tend to greatly value socioeconomic achievement, but being ordinary or average is promoted by some as a noble condition as well. The National Foundation on the Arts and the Humanities, an agency of the United States federal government established in 1965, exists to "develop and promote a broadly conceived national policy of support for the humanities and the arts in the United States, and for institutions which preserve the cultural heritage of the United States." It is composed of four sub-agencies.

Colonial American authors were influenced by John Locke and other Enlightenment philosophers. The American Revolutionary Period (1765–1783) is notable for the political writings of Benjamin Franklin, Alexander Hamilton, Thomas Paine, and Thomas Jefferson. Shortly before and after the Revolutionary War, the newspaper rose to prominence, filling a demand for anti-British national literature. An early novel is William Hill Brown's The Power of Sympathy, published in 1789. Writer and critic John Neal in the early- to mid-19th century helped advance America toward a unique literature and culture by criticizing predecessors such as Washington Irving for imitating their British counterparts, and by influencing writers such as Edgar Allan Poe, who took American poetry and short fiction in new directions. Ralph Waldo Emerson and Margaret Fuller pioneered the influential Transcendentalism movement; Henry David Thoreau, author of Walden, was influenced by it. The conflict surrounding abolitionism inspired writers like Harriet Beecher Stowe and authors of slave narratives such as Frederick Douglass. Nathaniel Hawthorne's The Scarlet Letter (1850) explored the dark side of American history, as did Herman Melville's Moby-Dick (1851).
Major American poets of the 19th-century American Renaissance include Walt Whitman, Melville, and Emily Dickinson. Mark Twain was the first major American writer to be born in the West. Henry James achieved international recognition with novels like The Portrait of a Lady (1881). As literacy rates rose, periodicals published more stories centered on industrial workers, women, and the rural poor. Naturalism, regionalism, and realism were the major literary movements of the period. While modernism generally took on an international character, modernist authors working within the United States more often rooted their work in specific regions, peoples, and cultures. Following the Great Migration to northern cities, African-American and black West Indian authors of the Harlem Renaissance developed an independent tradition of literature that rebuked a history of inequality and celebrated black culture. An important cultural export during the Jazz Age, these writings were a key influence on Négritude, a philosophy emerging in the 1930s among francophone writers of the African diaspora. In the 1950s, an ideal of homogeneity led many authors to attempt to write the Great American Novel, while the Beat Generation rejected this conformity, using styles that elevated the impact of the spoken word over mechanics to describe drug use, sexuality, and the failings of society. Contemporary literature is more pluralistic than in previous eras, with the closest thing to a unifying feature being a trend toward self-conscious experiments with language. Twelve American laureates have won the Nobel Prize in Literature.

Media in the United States is broadly uncensored, with the First Amendment providing significant protections, as reiterated in New York Times Co. v. United States. The four major broadcasters in the U.S. are the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), American Broadcasting Company (ABC), and Fox Broadcasting Company (Fox); all four major broadcast television networks are commercial entities. The U.S. cable television system offers hundreds of channels catering to a variety of niches. In 2021, about 83% of Americans over age 12 listened to broadcast radio, while about 40% listened to podcasts. In 2020, there were 15,460 licensed full-power radio stations in the U.S., according to the Federal Communications Commission (FCC). Much of the public radio broadcasting is supplied by National Public Radio (NPR), incorporated in February 1970 under the Public Broadcasting Act of 1967. U.S. newspapers with a global reach and reputation include The Wall Street Journal, The New York Times, The Washington Post, and USA Today. About 800 publications are produced in Spanish. With few exceptions, newspapers are privately owned, either by large chains such as Gannett or McClatchy, which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in an increasingly rare situation, by individuals or families. Major cities often have alternative newspapers that complement the mainstream daily papers, such as The Village Voice in New York City and LA Weekly in Los Angeles. The five most-visited websites in the world are Google, YouTube, Facebook, Instagram, and ChatGPT, all of them American-owned. Other popular platforms include X (formerly Twitter) and Amazon. In 2025, the U.S. was the world's second-largest video game market by revenue (after China). In 2015, the U.S.
video game industry consisted of 2,457 companies that supported around 220,000 jobs and generated $30.4 billion in revenue. There are 444 game publishers, developers, and hardware companies in California alone. According to the Game Developers Conference (GDC), the U.S. is the top location for video game development, with 58% of the world's game developers based there in 2025.

The United States is well known for its theater. Mainstream theater in the United States derives from the old European theatrical tradition and has been heavily influenced by British theater. By the middle of the 19th century, America had created new distinct dramatic forms in the Tom Shows, the showboat theater, and the minstrel show. The central hub of the American theater scene is the Theater District in Manhattan, with its divisions of Broadway, off-Broadway, and off-off-Broadway. Many movie and television celebrities have gotten their big break working in New York productions. Outside New York City, many cities have professional regional or resident theater companies that produce their own seasons. The biggest-budget theatrical productions are musicals, and U.S. theater also has an active community theater culture. The Tony Awards recognize excellence in live Broadway theater and are presented at an annual ceremony in Manhattan. The awards are given for Broadway productions and performances, and one is also given for regional theater. Several discretionary non-competitive awards are given as well, including a Special Tony Award, the Tony Honors for Excellence in Theatre, and the Isabelle Stevenson Award.

Folk art in colonial America grew out of artisanal craftsmanship in communities that allowed commonly trained people to individually express themselves. It was distinct from Europe's tradition of high art, which was less accessible and generally less relevant to early American settlers. Cultural movements in art and craftsmanship in colonial America generally lagged behind those of Western Europe. For example, the prevailing medieval style of woodworking and primitive sculpture became integral to early American folk art, despite the emergence of Renaissance styles in England in the late 16th and early 17th centuries. The new English styles arrived early enough that they could have made a considerable impact on American folk art, but American styles and forms had already been firmly adopted. Not only did styles change slowly in early America, but rural artisans there tended to continue their traditional forms longer than their urban counterparts did—and far longer than those in Western Europe. The Hudson River School was a mid-19th-century movement in the visual arts tradition of European naturalism. The 1913 Armory Show in New York City, an exhibition of European modernist art, shocked the public and transformed the U.S. art scene. American Realism and American Regionalism sought to reflect and give America new ways of looking at itself. Georgia O'Keeffe, Marsden Hartley, and others experimented with new and individualistic styles, which would become known as American modernism. Major artistic movements such as the abstract expressionism of Jackson Pollock and Willem de Kooning and the pop art of Andy Warhol and Roy Lichtenstein developed largely in the United States. Major photographers include Alfred Stieglitz, Edward Steichen, Dorothea Lange, Edward Weston, James Van Der Zee, Ansel Adams, and Gordon Parks.
The tide of modernism and then postmodernism has brought global fame to American architects, including Frank Lloyd Wright, Philip Johnson, and Frank Gehry. The Metropolitan Museum of Art in Manhattan is the largest art museum in the United States and the fourth-largest in the world.

American folk music encompasses numerous music genres, variously known as traditional music, traditional folk music, contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and some trace back to such origins as the British Isles, mainland Europe, or Africa. The rhythmic and lyrical styles of African-American music in particular have influenced American music. Banjos were brought to America through the slave trade; minstrel shows incorporating the instrument into their acts led to its increased popularity and widespread production in the 19th century. The electric guitar, first invented in the 1930s and mass-produced by the 1940s, had an enormous influence on popular music, in particular due to the development of rock and roll. The synthesizer, turntablism, and electronic music were also largely developed in the U.S. Elements from folk idioms such as the blues and old-time music were adopted and transformed into popular genres with global audiences. Jazz grew from blues and ragtime in the early 20th century, developing from the innovations and recordings of composers such as W. C. Handy and Jelly Roll Morton; Louis Armstrong and Duke Ellington increased its popularity early in the century. Country music developed in the 1920s, bluegrass and rhythm and blues in the 1940s, and rock and roll in the 1950s. In the 1960s, Bob Dylan emerged from the folk revival to become one of the country's most celebrated songwriters. The musical forms of punk and hip hop both originated in the United States in the 1970s. The United States has the world's largest music market, with a total retail value of $15.9 billion in 2022. Most of the world's major record companies are based in the U.S.; they are represented by the Recording Industry Association of America (RIAA). Mid-20th-century American pop stars such as Frank Sinatra and Elvis Presley became global celebrities and best-selling music artists, as have artists of the late 20th century, such as Michael Jackson, Madonna, Whitney Houston, and Mariah Carey, and of the early 21st century, such as Eminem, Britney Spears, Lady Gaga, Katy Perry, Taylor Swift, and Beyoncé.

The United States has the world's largest apparel market by revenue. Apart from professional business attire, American fashion is eclectic and predominantly informal. Americans' diverse cultural roots are reflected in their clothing; however, sneakers, jeans, T-shirts, and baseball caps are emblematic of American styles. New York, with its Fashion Week, is considered one of the "Big Four" global fashion capitals, along with Paris, Milan, and London. Proximity to Manhattan's Garment District has been synonymous with American fashion since the district's emergence in the early 20th century. A number of well-known designer labels, among them Tommy Hilfiger, Ralph Lauren, Tom Ford, and Calvin Klein, are headquartered in Manhattan, and some labels cater to niche markets, such as preteens. New York Fashion Week is one of the most influential fashion shows in the world and is held twice each year in Manhattan; the annual Met Gala, also in Manhattan, has been called the fashion world's "biggest night". The U.S.
film industry has a worldwide influence and following. Hollywood, a district in central Los Angeles, the nation's second-most populous city, is also a metonym for the American filmmaking industry. The major U.S. film studios are the primary source of the world's most commercially successful and most widely attended films. Largely centered in the New York City region from its beginnings in the late 19th century through the first decades of the 20th century, the U.S. film industry has since been primarily based in and around Hollywood. Nonetheless, American film companies have been subject to the forces of globalization in the 21st century, and an increasing number of films are made elsewhere. The Academy Awards, popularly known as "the Oscars", have been held annually by the Academy of Motion Picture Arts and Sciences since 1929, and the Golden Globe Awards have been held annually since January 1944. The industry peaked in what is commonly referred to as the "Golden Age of Hollywood", from the early sound period until the early 1960s, with screen actors such as John Wayne and Marilyn Monroe becoming iconic figures. In the 1970s, "New Hollywood", or the "Hollywood Renaissance", was defined by grittier films influenced by French and Italian realist pictures of the post-war period. The 21st century has been marked by the rise of American streaming platforms, which have come to rival traditional cinema.

Early settlers were introduced by Native Americans to foods such as turkey, sweet potatoes, corn, squash, and maple syrup. Among the most enduring and pervasive examples are variations of the native dish called succotash. Early settlers and later immigrants combined these with foods they were familiar with, such as wheat flour, beef, and milk, to create a distinctive American cuisine. New World crops, especially pumpkin, corn, potatoes, and turkey as the main course, are part of a shared national menu on Thanksgiving, when many Americans prepare or purchase traditional dishes to celebrate the occasion. Characteristic American dishes such as apple pie, fried chicken, doughnuts, french fries, macaroni and cheese, ice cream, hamburgers, hot dogs, and American pizza derive from the recipes of various immigrant groups. Mexican dishes such as burritos and tacos preexisted the United States in areas later annexed from Mexico, and adaptations of Chinese cuisine as well as pasta dishes freely adapted from Italian sources are all widely consumed. American chefs have had a significant impact on society both domestically and internationally. In 1946, the Culinary Institute of America was founded by Katharine Angell and Frances Roth; it would become the United States' most prestigious culinary school, where many of the most talented American chefs would study prior to successful careers. The United States restaurant industry was projected at $899 billion in sales for 2020 and employed more than 15 million people, representing 10% of the nation's workforce directly. It is the country's second-largest private employer and the third-largest employer overall. The United States is home to over 220 Michelin star-rated restaurants, 70 of which are in New York City. Wine has been produced in what is now the United States since the 1500s, with the first widespread production beginning in what is now New Mexico in 1628. In the modern U.S., wine production is undertaken in all fifty states, with California producing 84 percent of all U.S. wine.
With more than 1,100,000 acres (4,500 km2) under vine, the United States is the fourth-largest wine-producing country in the world, after Italy, Spain, and France. The classic American diner, a casual restaurant type originally intended for the working class, emerged during the 19th century from converted railroad dining cars made stationary. The diner soon evolved into purpose-built structures whose number expanded greatly in the 20th century. The American fast-food industry developed alongside the nation's car culture: American restaurants developed the drive-in format in the 1920s, which they began to replace with the drive-through format by the 1940s. American fast-food restaurant chains, such as McDonald's, Burger King, Chick-fil-A, Kentucky Fried Chicken, Dunkin' Donuts, and many others, have numerous outlets around the world.

The most popular spectator sports in the U.S. are American football, basketball, baseball, soccer, and ice hockey. Their premier leagues are, respectively, the National Football League (NFL), the National Basketball Association (NBA), Major League Baseball (MLB), Major League Soccer (MLS), and the National Hockey League (NHL). All of these leagues enjoy wide-ranging domestic media coverage and, except for MLS, are considered the preeminent leagues in their respective sports in the world. While most major U.S. sports such as baseball and American football have evolved out of European practices, basketball, volleyball, skateboarding, and snowboarding are American inventions, many of which have become popular worldwide. Lacrosse and surfing arose from Native American and Native Hawaiian activities that predate European contact. The market for professional sports in the United States was approximately $69 billion in July 2013, roughly 50% larger than that of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States. Although American football does not have a substantial following in other nations, the NFL has the highest average attendance (67,254) of any professional sports league in the world. In 2024, the NFL generated over $23 billion in revenue, making it the most valuable professional sports league in the United States and the world. Baseball has been regarded as the U.S. "national sport" since the late 19th century. The most-watched individual sports in the U.S. are golf and auto racing, particularly NASCAR and IndyCar. On the collegiate level, earnings for the member institutions exceed $1 billion annually, and college football and basketball attract large audiences, with the NCAA March Madness tournament and the College Football Playoff among the most watched national sporting events. In the U.S., intercollegiate sports serve as the main feeder system for professional and Olympic sports, with significant exceptions such as Minor League Baseball. This differs greatly from practices in nearly all other countries, where publicly and privately funded sports organizations serve this function. Eight Olympic Games have taken place in the United States. The 1904 Summer Olympics in St. Louis, Missouri, were the first Olympic Games held outside Europe. The Olympic Games will be held in the U.S. for a ninth time when Los Angeles hosts the 2028 Summer Olympics. U.S. athletes have won a total of 2,968 medals (1,179 gold) at the Olympic Games, the most of any country.
In other international competition, the United States is the home of a number of prestigious events, including the America's Cup, World Baseball Classic, the U.S. Open, and the Masters Tournament. The U.S. men's national soccer team has qualified for eleven World Cups, while the women's national team has won the FIFA Women's World Cup and Olympic soccer tournament four and five times, respectively. The 1999 FIFA Women's World Cup was hosted by the United States. Its final match was attended by 90,185, setting the world record for largest women's sporting event crowd at the time. The United States hosted the 1994 FIFA World Cup and will co-host, along with Canada and Mexico, the 2026 FIFA World Cup.
It is the wealthiest country, with the highest disposable household income per capita among OECD members, though its wealth inequality is highly pronounced. Shaped by centuries of immigration, the culture of the U.S. is diverse and globally influential. Accounting for more than a third of global military spending, the country has one of the strongest armed forces and is a designated nuclear state. A member of numerous international organizations, the U.S. plays a major role in global political, cultural, economic, and military affairs.

Etymology

Documented use of the phrase "United States of America" dates back to January 2, 1776. On that day, Stephen Moylan, a Continental Army aide to General George Washington, wrote a letter to Joseph Reed, Washington's aide-de-camp, asking to go "with full and ample powers from the United States of America to Spain" to seek assistance in the Revolutionary War effort. The first known public use appeared in an anonymous essay published in the Williamsburg newspaper The Virginia Gazette on April 6, 1776. Sometime on or after June 11, 1776, Thomas Jefferson wrote "United States of America" in a rough draft of the Declaration of Independence, which was adopted by the Second Continental Congress on July 4, 1776. The term "United States" and its initialism "U.S.", used as nouns or as adjectives in English, are common short names for the country. The initialism "USA", a noun, is also common. "United States" and "U.S." are the established terms throughout the U.S. federal government, with prescribed rules.[l] "The States" is an established colloquial shortening of the name, used particularly from abroad; "stateside" is the corresponding adjective or adverb. "America" is the feminine form of the first word of Americus Vesputius, the Latinized name of Italian explorer Amerigo Vespucci (1454–1512);[m] it was first used as a place name by the German cartographers Martin Waldseemüller and Matthias Ringmann in 1507.[n] Vespucci first proposed that the West Indies discovered by Christopher Columbus in 1492 were part of a previously unknown landmass and not among the Indies at the eastern limit of Asia. In English, the term "America" usually refers to the United States, while "the Americas" describes the continents of North and South America as a whole.

History

The first inhabitants of North America migrated from Siberia approximately 15,000 years ago, either across the Bering land bridge or along the now-submerged Ice Age coastline. Small isolated groups of hunter-gatherers are believed to have migrated alongside herds of large herbivores far into Alaska, with ice-free corridors developing along the Pacific coast and valleys of North America in c. 16,500 – c. 13,500 BCE (c. 18,500 – c. 15,500 BP). The Clovis culture, which appeared around 11,000 BCE, is believed to be the first widespread culture in the Americas. Over time, Indigenous North American cultures grew increasingly sophisticated, and some, such as the Mississippian culture, developed agriculture, architecture, and complex societies. In the post-archaic period, the Mississippian cultures were located in the midwestern, eastern, and southern regions, and the Algonquian in the Great Lakes region and along the Eastern Seaboard, while the Hohokam culture and Ancestral Puebloans inhabited the Southwest. Native population estimates of what is now the United States before the arrival of European colonizers range from around 500,000 to nearly 10 million.
Christopher Columbus began exploring the Caribbean for Spain in 1492, leading to Spanish-speaking settlements and missions from what are now Puerto Rico and Florida to New Mexico and California. The first Spanish colony in the present-day continental United States was Spanish Florida, chartered in 1513. After several settlements failed there due to starvation and disease, Spain's first permanent town, Saint Augustine, was founded in 1565. France established its own settlements in French Florida in 1562, but they were either abandoned (Charlesfort, 1578) or destroyed by Spanish raids (Fort Caroline, 1565). Permanent French settlements were founded much later along the Great Lakes (Fort Detroit, 1701), the Mississippi River (Saint Louis, 1764) and especially the Gulf of Mexico (New Orleans, 1718). Early European colonies also included the thriving Dutch colony of New Netherland (settled 1626, present-day New York) and the small Swedish colony of New Sweden (settled 1638 in what became Delaware). British colonization of the East Coast began with the Virginia Colony (1607) and the Plymouth Colony (Massachusetts, 1620). The Mayflower Compact in Massachusetts and the Fundamental Orders of Connecticut established precedents for local representative self-governance and constitutionalism that would develop throughout the American colonies. While European settlers in what is now the United States experienced conflicts with Native Americans, they also engaged in trade, exchanging European tools for food and animal pelts.[o] Relations ranged from close cooperation to warfare and massacres. The colonial authorities often pursued policies that forced Native Americans to adopt European lifestyles, including conversion to Christianity. Along the eastern seaboard, settlers trafficked Africans through the Atlantic slave trade, largely to provide manual labor on plantations. The original Thirteen Colonies[p] that would later found the United States were administered as possessions of the British Empire by Crown-appointed governors, though local governments held elections open to most white male property owners. The colonial population grew rapidly from Maine to Georgia, eclipsing Native American populations; by the 1770s, the natural increase of the population was such that only a small minority of Americans had been born overseas. The colonies' distance from Britain facilitated the entrenchment of self-governance, and the First Great Awakening, a series of Christian revivals, fueled colonial interest in guaranteed religious liberty. Following its victory in the French and Indian War, Britain began to assert greater control over local affairs in the Thirteen Colonies, resulting in growing political resistance. One of the primary grievances of the colonists was the denial of their rights as Englishmen, particularly the right to representation in the British government that taxed them. To demonstrate their dissatisfaction and resolve, the First Continental Congress met in 1774 and passed the Continental Association, a colonial boycott of British goods enforced by local "committees of safety" that proved effective. The subsequent British attempt to disarm the colonists resulted in the 1775 Battles of Lexington and Concord, igniting the American Revolutionary War. At the Second Continental Congress, the colonies appointed George Washington commander-in-chief of the Continental Army, and created a committee that named Thomas Jefferson to draft the Declaration of Independence.
Two days after the Second Continental Congress passed the Lee Resolution to create an independent, sovereign nation, the Declaration was adopted on July 4, 1776. The political values of the American Revolution evolved from an armed rebellion demanding reform within an empire to a revolution that created a new social and governing system founded on the defense of liberty and the protection of inalienable natural rights; sovereignty of the people; republicanism over monarchy, aristocracy, and other hereditary political power; civic virtue; and an intolerance of political corruption. The Founding Fathers of the United States, who included Washington, Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, James Madison, Thomas Paine, and many others, were inspired by Classical, Renaissance, and Enlightenment philosophies and ideas. Though in practical effect since their drafting in 1777, the Articles of Confederation were ratified in 1781 and formally established a decentralized government that operated until 1789. After the British surrender at the siege of Yorktown in 1781, American sovereignty was internationally recognized by the Treaty of Paris (1783), through which the U.S. gained territory stretching west to the Mississippi River, north to present-day Canada, and south to Spanish Florida. The Northwest Ordinance (1787) established the precedent by which the country's territory would expand with the admission of new states, rather than the expansion of existing states. The U.S. Constitution was drafted at the 1787 Constitutional Convention to overcome the limitations of the Articles. It went into effect in 1789, creating a federal republic governed by three separate branches that together formed a system of checks and balances. George Washington was elected the country's first president under the Constitution, and the Bill of Rights was adopted in 1791 to allay skeptics' concerns about the power of the more centralized government. Washington's resignation as commander-in-chief after the Revolutionary War and his later refusal to run for a third term as the country's first president established a precedent for the supremacy of civil authority in the United States and the peaceful transfer of power. In the late 18th century, American settlers began to expand westward in larger numbers, many with a sense of manifest destiny. The Louisiana Purchase of 1803 from France nearly doubled the territory of the United States. Lingering issues with Britain remained, leading to the War of 1812, which was fought to a draw. Spain ceded Florida and its Gulf Coast territory in 1819. The Missouri Compromise of 1820, which admitted Missouri as a slave state and Maine as a free state, attempted to balance the desire of northern states to prevent the expansion of slavery into new territories with that of southern states to extend it there. In particular, the compromise prohibited slavery in all other lands of the Louisiana Purchase north of the 36°30′ parallel. As Americans expanded further into territory inhabited by Native Americans, the federal government implemented policies of Indian removal or assimilation. The most significant such legislation was the Indian Removal Act of 1830, a key policy of President Andrew Jackson. It resulted in the Trail of Tears (1830–1850), in which an estimated 60,000 Native Americans living east of the Mississippi River were forcibly removed and displaced to lands far to the west, causing 13,200 to 16,700 deaths along the forced march.
Settler expansion as well as this influx of Indigenous peoples from the East resulted in the American Indian Wars west of the Mississippi. During the colonial period, slavery became legal in all the Thirteen Colonies, but by 1770 it provided the main labor force in the large-scale, agriculture-dependent economies of the Southern Colonies from Maryland to Georgia. The practice began to be significantly questioned during the American Revolution, and spurred by an active abolitionist movement that had reemerged in the 1830s, states in the North enacted laws to prohibit slavery within their boundaries. At the same time, support for slavery had strengthened in Southern states, with widespread use of inventions such as the cotton gin (1793) having made slavery immensely profitable for Southern elites. The United States annexed the Republic of Texas in 1845, and the 1846 Oregon Treaty led to U.S. control of the present-day American Northwest. A dispute with Mexico over Texas led to the Mexican–American War (1846–1848). After the U.S. victory, Mexico recognized U.S. sovereignty over Texas, New Mexico, and California in the 1848 Mexican Cession; the cession's lands also included the future states of Nevada, Colorado, and Utah. The California gold rush of 1848–1849 spurred a huge migration of white settlers to the Pacific coast, leading to even more confrontations with Native populations. One of the most violent, the California genocide of thousands of Native inhabitants, lasted into the mid-1870s. Additional western territories and states were created. Throughout the 1850s, the sectional conflict regarding slavery was further inflamed by national legislation in the U.S. Congress and decisions of the Supreme Court. In Congress, the Fugitive Slave Act of 1850 mandated the forcible return to their owners in the South of slaves taking refuge in non-slave states, while the Kansas–Nebraska Act of 1854 effectively gutted the anti-slavery requirements of the Missouri Compromise. In its Dred Scott decision of 1857, the Supreme Court ruled against a slave who had sought his freedom after being brought into non-slave territory, simultaneously declaring the entire Missouri Compromise to be unconstitutional. These and other events exacerbated tensions between North and South that would culminate in the American Civil War (1861–1865). Beginning with South Carolina, 11 slave-state governments voted to secede from the United States in 1861, joining to create the Confederate States of America. All other state governments remained loyal to the Union.[q] War broke out in April 1861 after the Confederacy bombarded Fort Sumter. Following the Emancipation Proclamation on January 1, 1863, many freed slaves joined the Union army. The war began to turn in the Union's favor following the 1863 Siege of Vicksburg and Battle of Gettysburg, and the Confederates surrendered in 1865 after the Union's victory in the Battle of Appomattox Court House. Efforts toward reconstruction in the secessionist South had begun as early as 1862, but it was only after President Lincoln's assassination that the three Reconstruction Amendments to the Constitution were ratified to protect civil rights. The amendments codified nationally the abolition of slavery and involuntary servitude except as punishment for crimes, promised equal protection under the law for all persons, and prohibited discrimination on the basis of race or previous enslavement. As a result, African Americans took an active political role in ex-Confederate states in the decade following the Civil War.
The former Confederate states were readmitted to the Union, beginning with Tennessee in 1866 and ending with Georgia in 1870. National infrastructure, including transcontinental telegraph and railroads, spurred growth in the American frontier. This was accelerated by the Homestead Acts, through which nearly 10 percent of the total land area of the United States was given away free to some 1.6 million homesteaders. From 1865 through 1917, an unprecedented stream of immigrants arrived in the United States, including 24.4 million from Europe. Most came through the Port of New York, as New York City and other large cities on the East Coast became home to large Jewish, Irish, and Italian populations. Many Northern Europeans as well as significant numbers of Germans and other Central Europeans moved to the Midwest. At the same time, about one million French Canadians migrated from Quebec to New England. During the Great Migration, millions of African Americans left the rural South for urban areas in the North. Alaska was purchased from Russia in 1867. The Compromise of 1877 is generally considered the end of the Reconstruction era, as it resolved the electoral crisis following the 1876 presidential election and led President Rutherford B. Hayes to reduce the role of federal troops in the South. Almost immediately, the Redeemers began evicting the carpetbaggers and regained local control of Southern politics in the name of white supremacy. African Americans endured a period of heightened, overt racism following Reconstruction, a time often considered the nadir of American race relations. A series of Supreme Court decisions, including Plessy v. Ferguson, emptied the Fourteenth and Fifteenth Amendments of their force, allowing Jim Crow laws in the South to remain unchecked, sundown towns in the Midwest, and segregation in communities across the country, which would be reinforced in part by the policy of redlining later adopted by the federal Home Owners' Loan Corporation. An explosion of technological advancement, accompanied by the exploitation of cheap immigrant labor, led to rapid economic expansion during the Gilded Age of the late 19th century. It continued into the early 20th century, by which time the United States already outpaced the economies of Britain, France, and Germany combined. This fostered the amassing of power by a few prominent industrialists, largely by their formation of trusts and monopolies to prevent competition. Tycoons led the nation's expansion in the railroad, petroleum, and steel industries. The United States emerged as a pioneer of the automotive industry. These changes resulted in significant increases in economic inequality, slum conditions, and social unrest, creating the environment for labor unions and socialist movements to begin to flourish. This period eventually ended with the advent of the Progressive Era, which was characterized by significant economic and social reforms. Pro-American elements in Hawaii overthrew the Hawaiian monarchy; the islands were annexed in 1898. That same year, Puerto Rico, the Philippines, and Guam were ceded to the U.S. by Spain after the latter's defeat in the Spanish–American War. (The Philippines was granted full independence from the U.S. on July 4, 1946, following World War II. Puerto Rico and Guam have remained U.S. territories.) American Samoa was acquired by the United States in 1900 after the Second Samoan Civil War. The U.S. Virgin Islands were purchased from Denmark in 1917.
The United States entered World War I alongside the Allies in 1917, helping to turn the tide against the Central Powers. In 1920, a constitutional amendment granted nationwide women's suffrage. During the 1920s and 1930s, the rise of radio for mass communication and early television transformed communications nationwide. The Wall Street Crash of 1929 triggered the Great Depression, to which President Franklin D. Roosevelt responded with the New Deal plan of "reform, recovery and relief", a series of unprecedented and sweeping recovery programs and employment relief projects combined with financial reforms and regulations. Initially neutral during World War II, the U.S. began supplying war materiel to the Allies in March 1941 and entered the war in December after Japan's attack on Pearl Harbor. Agreeing to a "Europe first" policy, the U.S. concentrated its wartime efforts on Japan's allies, Italy and Germany, until their final defeat in May 1945. The U.S. developed the first nuclear weapons and used them against the Japanese cities of Hiroshima and Nagasaki in August 1945, ending the war. The United States was one of the "Four Policemen" that met to plan the post-war world, alongside the United Kingdom, the Soviet Union, and China. The U.S. emerged relatively unscathed from the war, with even greater economic power and international political influence. The end of World War II in 1945 left the U.S. and the Soviet Union as superpowers, each with its own political, military, and economic sphere of influence. Geopolitical tensions between the two superpowers soon led to the Cold War. The U.S. implemented a policy of containment intended to limit the Soviet Union's sphere of influence; engaged in regime change against governments perceived to be aligned with the Soviets; and prevailed in the Space Race, which culminated with the first crewed Moon landing in 1969. Domestically, the U.S. experienced economic growth, urbanization, and population growth following World War II. The civil rights movement emerged, with Martin Luther King Jr. becoming a prominent leader in the early 1960s. The Great Society plan of President Lyndon B. Johnson's administration resulted in groundbreaking and broad-reaching laws, policies and a constitutional amendment to counteract some of the worst effects of lingering institutional racism. The counterculture movement in the U.S. brought significant social changes, including the liberalization of attitudes toward recreational drug use and sexuality. It also encouraged open defiance of the military draft (leading to the end of conscription in 1973) and wide opposition to U.S. intervention in Vietnam, with the U.S. withdrawing completely in 1975. A societal shift in the roles of women was significantly responsible for the large increase in female paid labor participation starting in the 1970s, and by 1985 the majority of American women aged 16 and older were employed. The fall of communism and the dissolution of the Soviet Union from 1989 to 1991 marked the end of the Cold War and left the United States as the world's sole superpower. This cemented the United States' global influence, reinforcing the concept of the "American Century" as the U.S. dominated international political, cultural, economic, and military affairs. The 1990s saw the longest recorded economic expansion in American history, a dramatic decline in U.S. crime rates, and advances in technology.
Throughout this decade, technological innovations such as the World Wide Web, the evolution of the Pentium microprocessor in accordance with Moore's law, rechargeable lithium-ion batteries, the first gene therapy trial, and cloning either emerged in the U.S. or were improved upon there. The Human Genome Project was formally launched in 1990, while Nasdaq became the first stock market in the United States to trade online in 1998. In the Gulf War of 1991, an American-led international coalition of states expelled an Iraqi invasion force that had occupied neighboring Kuwait. The September 11 attacks on the United States in 2001 by the pan-Islamist militant organization al-Qaeda led to the war on terror and subsequent military interventions in Afghanistan and in Iraq. The U.S. housing bubble culminated in 2007 with the Great Recession, the largest economic contraction since the Great Depression. In the 2010s and early 2020s, the United States experienced increased political polarization and democratic backsliding. The country's polarization was violently reflected in the January 2021 Capitol attack, when a mob of insurrectionists entered the U.S. Capitol and sought to prevent the peaceful transfer of power in an attempted self-coup d'état.

Geography

The United States is the world's third-largest country by total area behind Russia and Canada.[c] The 48 contiguous states and the District of Columbia have a combined area of 3,119,885 square miles (8,080,470 km2). In 2021, the United States had 8% of the Earth's permanent meadows and pastures and 10% of its cropland. Starting in the east, the coastal plain of the Atlantic seaboard gives way to inland forests and rolling hills in the Piedmont plateau region. The Appalachian Mountains and the Adirondack Massif separate the East Coast from the Great Lakes and the grasslands of the Midwest. The Mississippi River System, the world's fourth-longest river system, runs predominantly north–south through the center of the country. The flat and fertile prairie of the Great Plains stretches to the west, interrupted by a highland region in the southeast. The Rocky Mountains, west of the Great Plains, extend north to south across the country, peaking at over 14,000 feet (4,300 m) in Colorado. The supervolcano underlying Yellowstone National Park in the Rocky Mountains, the Yellowstone Caldera, is the continent's largest volcanic feature. Farther west are the rocky Great Basin and the Chihuahuan, Sonoran, and Mojave deserts. In the northwest corner of Arizona, carved by the Colorado River, is the Grand Canyon, a steep-sided canyon and popular tourist destination known for its overwhelming visual size and intricate, colorful landscape. The Cascade and Sierra Nevada mountain ranges run close to the Pacific coast. The lowest and highest points in the contiguous United States are in the State of California, about 84 miles (135 km) apart. At an elevation of 20,310 feet (6,190.5 m), Alaska's Denali (also called Mount McKinley) is the highest peak in the country and on the continent. Active volcanoes are common throughout Alaska's Alexander and Aleutian Islands. Located entirely outside North America, the archipelago of Hawaii consists of volcanic islands, physiographically and ethnologically part of the Polynesian subregion of Oceania. In addition to its total land area, the United States has one of the world's largest marine exclusive economic zones, spanning approximately 4.5 million square miles (11.7 million km2) of ocean.
With its large size and geographic variety, the United States includes most climate types. East of the 100th meridian, the climate ranges from humid continental in the north to humid subtropical in the south. The western Great Plains are semi-arid. Many mountainous areas of the American West have an alpine climate. The climate is arid in the Southwest, Mediterranean in coastal California, and oceanic in coastal Oregon, Washington, and southern Alaska. Most of Alaska is subarctic or polar. Hawaii, the southern tip of Florida and U.S. territories in the Caribbean and Pacific are tropical. The United States experiences more high-impact extreme weather events than any other country. States bordering the Gulf of Mexico are prone to hurricanes, and most of the world's tornadoes occur in the country, mainly in Tornado Alley. Due to climate change, extreme weather has become more frequent in the U.S. in the 21st century, with three times the number of reported heat waves compared to the 1960s. Since the 1990s, droughts in the American Southwest have become more persistent and more severe. The regions considered most attractive to the population are also the most vulnerable. The U.S. is one of 17 megadiverse countries containing large numbers of endemic species: about 17,000 species of vascular plants occur in the contiguous United States and Alaska, and over 1,800 species of flowering plants are found in Hawaii, few of which occur on the mainland. The United States is home to 428 mammal species, 784 birds, 311 reptiles, 295 amphibians, and around 91,000 insect species. There are 63 national parks, and hundreds of other federally managed monuments, forests, and wilderness areas, administered by the National Park Service and other agencies. About 28% of the country's land is publicly owned and federally managed, primarily in the Western States. Most of this land is protected, though some is leased for commercial use, and less than one percent is used for military purposes. Environmental issues in the United States include debates on non-renewable resources and nuclear energy, air and water pollution, biodiversity, logging and deforestation, and climate change. The U.S. Environmental Protection Agency (EPA) is the federal agency charged with addressing most environmental issues. The idea of wilderness has shaped the management of public lands since the passage of the Wilderness Act in 1964. The Endangered Species Act of 1973 provides a way to protect threatened and endangered species and their habitats. The United States Fish and Wildlife Service implements and enforces the Act. In 2024, the U.S. ranked 35th among 180 countries in the Environmental Performance Index.

Government and politics

The United States is a federal republic of 50 states and a federal capital district, Washington, D.C. The U.S. asserts sovereignty over five unincorporated territories and several uninhabited island possessions. It is the world's oldest surviving federation, and its presidential system of federal government has been adopted, in whole or in part, by many newly independent states worldwide following their decolonization. The Constitution of the United States serves as the country's supreme legal document. Most scholars describe the United States as a liberal democracy.[r] Composed of three branches, all headquartered in Washington, D.C., the federal government is the national government of the United States. The U.S.
Constitution establishes a separation of powers intended to provide a system of checks and balances to prevent any of the three branches from becoming supreme. The three-branch system is known as the presidential system, in contrast to the parliamentary system, where the executive is part of the legislative body. Many countries around the world adopted this aspect of the 1789 Constitution of the United States, especially in the postcolonial Americas. In the U.S. federal system, sovereign powers are shared among three levels of government specified in the Constitution: the federal government, the states, and Indian tribes. The U.S. also asserts sovereignty over five permanently inhabited territories: American Samoa, Guam, the Northern Mariana Islands, Puerto Rico, and the U.S. Virgin Islands. Residents of the 50 states are governed by their elected state government, under state constitutions compatible with the national constitution, and by elected local governments that are administrative divisions of a state. States are subdivided into counties or county equivalents, and (except for Hawaii) further divided into municipalities, each administered by elected representatives. The District of Columbia is a federal district containing the U.S. capital, Washington, D.C. The federal district is an administrative division of the federal government. Indian country is made up of 574 federally recognized tribes and 326 Indian reservations. They hold a government-to-government relationship with the U.S. federal government in Washington and are legally defined as domestic dependent nations with inherent tribal sovereignty rights. In addition to the five major territories, the U.S. also asserts sovereignty over the United States Minor Outlying Islands in the Pacific Ocean and the Caribbean. The seven undisputed islands without permanent populations are Baker Island, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Atoll, and Palmyra Atoll. U.S. sovereignty over the unpopulated Bajo Nuevo Bank, Navassa Island, Serranilla Bank, and Wake Island is disputed. The Constitution is silent on political parties. However, they developed independently in the 18th century with the Federalist and Anti-Federalist parties. Since then, the United States has operated as a de facto two-party system, though the parties have changed over time. Since the mid-19th century, the two main national parties have been the Democratic Party and the Republican Party; the former is perceived as relatively liberal and the latter as relatively conservative in its political platform. The United States has an established structure of foreign relations, with the world's second-largest diplomatic corps as of 2024. It is a permanent member of the United Nations Security Council and home to the United Nations headquarters. The United States is a member of the G7, G20, and OECD intergovernmental organizations. Almost all countries have embassies and many have consulates (official representatives) in the country. Likewise, nearly all countries maintain formal diplomatic missions with the United States; the exceptions are Iran, North Korea, and Bhutan. Though Taiwan does not have formal diplomatic relations with the U.S., it maintains close unofficial relations. The United States regularly supplies Taiwan with military equipment to deter potential Chinese aggression.
Its geopolitical attention also turned to the Indo-Pacific when the United States joined the Quadrilateral Security Dialogue with Australia, India, and Japan. The United States has a "Special Relationship" with the United Kingdom and strong ties with Canada, Australia, New Zealand, the Philippines, Japan, South Korea, Israel, and several European Union countries such as France, Italy, Germany, Spain, and Poland. The U.S. works closely with its NATO allies on military and national security issues, and with countries in the Americas through the Organization of American States and the United States–Mexico–Canada Agreement (USMCA). The U.S. exercises full international defense authority and responsibility for Micronesia, the Marshall Islands, and Palau through the Compact of Free Association. It has increasingly conducted strategic cooperation with India, while its ties with China have steadily deteriorated. Beginning in 2014, the U.S. became a key ally of Ukraine. After Donald Trump was elected U.S. president in 2024, he sought to negotiate an end to the Russo-Ukrainian War. He paused all military aid to Ukraine in March 2025, although the aid later resumed. Trump also ended U.S. intelligence sharing with the country, but this too was eventually restored. The president is the commander-in-chief of the United States Armed Forces and appoints its leaders, the secretary of defense and the Joint Chiefs of Staff. The Department of Defense, headquartered at the Pentagon near Washington, D.C., administers five of the six service branches: the U.S. Army, Marine Corps, Navy, Air Force, and Space Force. The Coast Guard is administered by the Department of Homeland Security in peacetime and can be transferred to the Department of the Navy in wartime. The total strength of the military is about 1.3 million active-duty personnel, with an additional 400,000 in reserve. The United States spent $997 billion on its military in 2024, by far the largest amount of any country, making up 37% of global military spending and accounting for 3.4% of the country's GDP. The U.S. possesses 42% of the world's nuclear weapons—the second-largest stockpile after that of Russia. The U.S. military is widely regarded as the most powerful and advanced in the world. The United States has the third-largest combined armed forces in the world, behind the Chinese People's Liberation Army and Indian Armed Forces. The U.S. military operates about 800 bases and facilities abroad, and maintains deployments of more than 100 active-duty personnel in 25 foreign countries. The United States has engaged in over 400 military interventions since its founding in 1776, with over half of these occurring between 1950 and 2019 and 25% occurring in the post-Cold War era. State defense forces (SDFs) are military units that operate under the sole authority of a state government. SDFs are authorized by state and federal law but are under the command of the state's governor. By contrast, the 54 U.S. National Guard organizations[t] fall under the dual control of state or territorial governments and the federal government; their units can also become federalized entities, but SDFs cannot be federalized. The National Guard personnel of a state or territory can be federalized by the president under the National Defense Act Amendments of 1933; this legislation created the Guard and provides for the integration of Army National Guard and Air National Guard units and personnel into the U.S. Army and (since 1947) the U.S. Air Force.
The total number of National Guard members is about 430,000, while the estimated combined strength of SDFs is less than 10,000. There are about 18,000 police agencies in the United States, operating at every level from local to national. Law in the United States is mainly enforced by local police departments and sheriff's departments in their municipal or county jurisdictions. The state police departments have authority in their respective state, and federal agencies such as the Federal Bureau of Investigation (FBI) and the U.S. Marshals Service have national jurisdiction and specialized duties, such as protecting civil rights and national security, enforcing U.S. federal courts' rulings and federal laws, and investigating interstate criminal activity. State courts conduct almost all civil and criminal trials, while federal courts adjudicate the much smaller number of civil and criminal cases that relate to federal law. There is no unified "criminal justice system" in the United States. The American prison system is largely heterogeneous, with thousands of relatively independent systems operating across federal, state, local, and tribal levels. In 2025, "these systems hold nearly 2 million people in 1,566 state prisons, 98 federal prisons, 3,116 local jails, 1,277 juvenile correctional facilities, 133 immigration detention facilities, and 80 Indian country jails, as well as in military prisons, civil commitment centers, state psychiatric hospitals, and prisons in the U.S. territories." Despite disparate systems of confinement, four main institutions dominate: federal prisons, state prisons, local jails, and juvenile correctional facilities. Federal prisons are run by the Federal Bureau of Prisons and hold pretrial detainees as well as people who have been convicted of federal crimes. State prisons, run by the department of corrections of each state, hold people sentenced and serving prison time (usually longer than one year) for felony offenses. Local jails are county or municipal facilities that incarcerate defendants prior to trial; they also hold those serving short sentences (typically under a year). Juvenile correctional facilities are operated by local or state governments and serve as longer-term placements for any minor adjudicated as delinquent and ordered by a judge to be confined. In January 2023, the United States had the sixth-highest per capita incarceration rate in the world—531 people per 100,000 inhabitants—and the largest prison and jail population in the world, with more than 1.9 million people incarcerated. An analysis of the World Health Organization Mortality Database from 2010 showed U.S. homicide rates "were 7 times higher than in other high-income countries, driven by a gun homicide rate that was 25 times higher".

Economy

The U.S. has a highly developed mixed economy that has been the world's largest nominally since about 1890. Its 2024 gross domestic product (GDP)[e] of more than $29 trillion constituted over 25% of nominal global economic output, or 15% at purchasing power parity (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks first in the world by nominal GDP, second when adjusted for PPP, and ninth by PPP-adjusted GDP per capita. In February 2024, the total U.S. federal government debt was $34.4 trillion. Of the world's 500 largest companies by revenue, 138 were headquartered in the U.S. in 2025, the highest number of any country. The U.S.
dollar is the currency most used in international transactions and the world's foremost reserve currency, backed by the country's dominant economy, its military, the petrodollar system, its large U.S. Treasuries market, and the linked eurodollar market. Several countries use it as their official currency, and in others it is the de facto currency. The U.S. is party to free trade agreements with several countries, including the USMCA with Canada and Mexico. Although the United States has reached a post-industrial level of economic development and is often described as having a service economy, it remains a major industrial power; in 2024, the U.S. manufacturing sector was the world's second-largest by value output after China's. New York City is the world's principal financial center, and its metropolitan area is the world's largest metropolitan economy. The New York Stock Exchange and Nasdaq, both located in New York City, are the world's two largest stock exchanges by market capitalization and trade volume. The United States is at the forefront of technological advancement and innovation in many economic fields, especially in artificial intelligence; electronics and computers; pharmaceuticals; and medical, aerospace and military equipment. The country's economy is fueled by abundant natural resources, a well-developed infrastructure, and high productivity. The largest trading partners of the United States are the European Union, Mexico, Canada, China, Japan, South Korea, the United Kingdom, Vietnam, India, and Taiwan. The United States is the world's largest importer and second-largest exporter.[u] It is by far the world's largest exporter of services. Americans have the highest average household and employee income among OECD member states and had the fourth-highest median household income in 2023, up from sixth-highest in 2013. With personal consumption expenditures of over $18.5 trillion in 2023, the U.S. has a heavily consumer-driven economy and is the world's largest consumer market. The U.S. ranked first in the number of dollar billionaires and millionaires in 2023, with 735 billionaires and nearly 22 million millionaires. Wealth in the United States is highly concentrated; in 2011, the richest 10% of the adult population owned 72% of the country's household wealth, while the bottom 50% owned just 2%. U.S. wealth inequality has increased substantially since the late 1980s, and income inequality in the U.S. reached a record high in 2019. In 2024, the country had some of the highest wealth and income inequality levels among OECD countries. Since the 1970s, there has been a decoupling of U.S. wage gains from worker productivity. In 2016, the top fifth of earners took home more than half of all income, giving the U.S. one of the widest income distributions among OECD countries. There were about 771,480 homeless persons in the U.S. in 2024. In 2022, 6.4 million children experienced food insecurity. Feeding America estimates that around one in five children in the U.S., approximately 13 million, experience hunger and do not know where or when they will get their next meal. Also in 2022, about 37.9 million people, or 11.5% of the U.S. population, were living in poverty. The United States has a smaller welfare state and redistributes less income through government action than most other high-income countries. It is the only advanced economy that does not guarantee its workers paid vacation nationally and is one of the few countries in the world without federal paid family leave as a legal right.
The United States has a higher percentage of low-income workers than almost any other developed country, largely because of a weak collective bargaining system and lack of government support for at-risk workers. The United States has been a leader in technological innovation since the late 19th century and in scientific research since the mid-20th century. Methods for producing interchangeable parts and the establishment of a machine tool industry enabled the large-scale manufacturing of U.S. consumer products in the late 19th century. By the early 20th century, factory electrification, the introduction of the assembly line, and other labor-saving techniques created the system of mass production. In the 21st century, the United States continues to be one of the world's foremost scientific powers, though China has emerged as a major competitor in many fields. The U.S. has the highest research and development expenditures of any country and ranks ninth as a percentage of GDP. In 2022, the United States had the second-highest number of published scientific papers, after China. In 2021, the U.S. ranked second (also after China) by the number of patent applications, and third by trademark and industrial design applications (after China and Germany), according to World Intellectual Property Indicators. In 2025, the United States ranked third (after Switzerland and Sweden) in the Global Innovation Index. The United States is considered to be a world leader in the development of artificial intelligence technology. In 2023, the United States was ranked the second most technologically advanced country in the world (after South Korea) by Global Finance magazine. The United States has maintained a space program since the late 1950s, beginning with the establishment of the National Aeronautics and Space Administration (NASA) in 1958. NASA's Apollo program (1961–1972) achieved the first crewed Moon landing with the 1969 Apollo 11 mission; it remains one of the agency's most significant milestones. Other major endeavors by NASA include the Space Shuttle program (1981–2011), the Voyager program (1972–present), the Hubble and James Webb space telescopes (launched in 1990 and 2021, respectively), and the multi-mission Mars Exploration Program (Spirit and Opportunity, Curiosity, and Perseverance). NASA is one of five agencies collaborating on the International Space Station (ISS); U.S. contributions to the ISS include several modules, among them Destiny (2001), Harmony (2007), and Tranquility (2010), as well as ongoing logistical and operational support. The United States private sector dominates the global commercial spaceflight industry. Prominent American spaceflight contractors include Blue Origin, Boeing, Lockheed Martin, Northrop Grumman, and SpaceX. NASA programs such as the Commercial Crew Program, Commercial Resupply Services, Commercial Lunar Payload Services, and NextSTEP have facilitated growing private-sector involvement in American spaceflight. In 2023, the United States received approximately 84% of its energy from fossil fuels, and its largest source of energy was petroleum (38%), followed by natural gas (36%), renewable sources (9%), coal (9%), and nuclear power (9%). In 2022, the United States constituted about 4% of the world's population but consumed around 16% of the world's energy. The U.S. ranks as the second-highest emitter of greenhouse gases, behind China. The U.S. is the world's largest producer of nuclear power, generating around 30% of the world's nuclear electricity.
It also has the highest number of nuclear power reactors of any country. As of 2024, the U.S. plans to triple its nuclear power capacity by 2050. The United States' road network, spanning 4 million miles (6.4 million kilometers) and owned almost entirely by state and local governments, is the longest in the world. The extensive Interstate Highway System that connects all major U.S. cities is funded mostly by the federal government but maintained by state departments of transportation. The system is further extended by state highways and some private toll roads. In 2022, the U.S. was among the top ten countries in vehicle ownership per capita, with 850 vehicles per 1,000 people. A 2022 study found that 76% of U.S. commuters drive alone and 14% ride a bicycle, including bike owners and users of bike-sharing networks. About 11% use some form of public transportation. Public transportation in the United States is well developed in the largest urban areas, notably New York City, Washington, D.C., Boston, Philadelphia, Chicago, and San Francisco; otherwise, coverage is generally less extensive than in most other developed countries. The U.S. also has many relatively car-dependent localities. Long-distance intercity travel is provided primarily by airlines, but travel by rail is more common along the Northeast Corridor, the only high-speed rail in the U.S. that meets international standards. Amtrak, the country's government-sponsored national passenger rail company, has a relatively sparse network compared to that of Western European countries. Service is concentrated in the Northeast, California, the Midwest, the Pacific Northwest, and Virginia/Southeast. The United States has an extensive air transportation network. U.S. civilian airlines are all privately owned. The three largest airlines in the world, by total number of passengers carried, are U.S.-based; American Airlines became the global leader after its 2013 merger with US Airways. Of the 50 busiest airports in the world, 16 are in the United States, as well as five of the top 10. The world's busiest airport by passenger volume is Hartsfield–Jackson Atlanta International in Atlanta, Georgia. In 2022, most of the 19,969 U.S. airports were owned and operated by local government authorities, and there are also some private airports. Some 5,193 are designated as "public use", including for general aviation. The Transportation Security Administration (TSA) has provided security at most major airports since 2001. The country's rail transport network, the longest in the world at 182,412.3 mi (293,564.2 km), handles mostly freight (in contrast to more passenger-centered rail in Europe). Because U.S. railroads are often privately owned operations, they lag behind the rest of the world in electrification. The country's inland waterways are the world's fifth-longest, totaling 25,482 mi (41,009 km). They are used extensively for freight, recreation, and a small amount of passenger traffic. Of the world's 50 busiest container ports, four are located in the United States, with the busiest in the country being the Port of Los Angeles.

Demographics

The U.S. Census Bureau reported 331,449,281 residents on April 1, 2020,[v] making the United States the third-most-populous country in the world, after India and China. The Census Bureau's official 2025 population estimate was 341,784,857, an increase of 3.1% since the 2020 census. According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S.
population had a net gain of one person every 16 seconds, or about 5,400 people per day. In 2023, 51% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 34% had never been married. In 2023, the total fertility rate for the U.S. stood at 1.6 children per woman, and in 2019, at 23%, the country had the world's highest rate of children living in single-parent households. Most Americans live in the suburbs of major metropolitan areas. The United States has a diverse population; 37 ancestry groups have more than one million members. White Americans with ancestry from Europe, the Middle East, or North Africa form the largest racial and ethnic group at 57.8% of the United States population. Hispanic and Latino Americans form the second-largest group and are 18.7% of the United States population. African Americans constitute the country's third-largest ancestry group and are 12.1% of the total U.S. population. Asian Americans are the country's fourth-largest group, composing 5.9% of the United States population. The country's 3.7 million Native Americans account for about 1% of the population, and some 574 native tribes are recognized by the federal government. In 2024, the median age of the United States population was 39.1 years. While many languages and dialects are spoken in the United States, English is by far the most commonly spoken and written. English is the de facto official language of the United States, and in 2025 Executive Order 14224 declared it official. However, the U.S. has never had a de jure official language, as Congress has never passed a law to designate English as official for all three federal branches. Some laws, such as U.S. naturalization requirements, nonetheless standardize English. Twenty-eight states and the United States Virgin Islands have laws that designate English as the sole official language; 19 states and the District of Columbia have no official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English: Hawaii (Hawaiian), Alaska (twenty Native languages),[w] South Dakota (Sioux), American Samoa (Samoan), Puerto Rico (Spanish), Guam (Chamorro), and the Northern Mariana Islands (Carolinian and Chamorro). In total, 169 Native American languages are spoken in the United States. In Puerto Rico, Spanish is more widely spoken than English. According to the American Community Survey (2020), some 245.4 million people in the U.S. age five and older spoke only English at home. About 41.2 million spoke Spanish at home, making it the second most commonly used language. Other languages spoken at home by one million people or more include Chinese (3.40 million), Tagalog (1.71 million), Vietnamese (1.52 million), Arabic (1.39 million), French (1.18 million), Korean (1.07 million), and Russian (1.04 million). German, spoken at home by 1 million people in 2010, fell to 857,000 speakers in 2020. America's immigrant population is by far the world's largest in absolute terms. In 2022, there were 87.7 million immigrants and U.S.-born children of immigrants in the United States, accounting for nearly 27% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 6% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants.
In 2019, the top countries of origin for immigrants were Mexico (24% of immigrants), India (6%), China (5%), the Philippines (4.5%), and El Salvador (3%). In fiscal year 2022, over one million immigrants (most of whom entered through family reunification) were granted legal residence. The undocumented immigrant population in the U.S. reached a record high of 14 million in 2023. The First Amendment guarantees the free exercise of religion in the country and forbids Congress from passing laws respecting its establishment. Religious practice is widespread, among the most diverse in the world, and profoundly vibrant. The country has the world's largest Christian population, which includes the fourth-largest population of Catholics. Other notable faiths include Judaism, Buddhism, Hinduism, Islam, New Age, and Native American religions. Religious practice varies significantly by region. "Ceremonial deism" is common in American culture. The overwhelming majority of Americans believe in a higher power or spiritual force, engage in spiritual practices such as prayer, and consider themselves religious or spiritual. In the Southern United States' "Bible Belt", evangelical Protestantism plays a significant role culturally; New England and the Western United States tend to be more secular. Mormonism, a Restorationist movement founded in the U.S. in 1830, is the predominant religion in Utah and a major religion in Idaho. About 82% of Americans live in metropolitan areas, particularly in suburbs; about half of those reside in cities with populations over 50,000. In 2022, 333 incorporated municipalities had populations over 100,000, nine cities had more than one million residents, and four cities—New York City, Los Angeles, Chicago, and Houston—had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West. According to the Centers for Disease Control and Prevention (CDC), average U.S. life expectancy at birth reached 79.0 years in 2024, its highest recorded level. This was an increase of 0.6 years over 2023. The CDC attributed the improvement to a significant fall in the number of fatal drug overdoses in the country, noting that "heart disease continues to be the leading cause of death in the United States, followed by cancer and unintentional injuries." In 2024, life expectancy at birth for American men rose to 76.5 years (+0.7 years compared to 2023), while life expectancy for women was 81.4 years (+0.3 years). Starting in 1998, life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been increasing ever since. The Commonwealth Fund reported in 2020 that the U.S. had the highest suicide rate among high-income countries. Approximately one-third of the U.S. adult population is obese and another third is overweight. The U.S. healthcare system far outspends that of any other country, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes when compared to peer countries for reasons that are debated. The United States is the only developed country without a system of universal healthcare, and a significant proportion of its population does not carry health insurance. Government-funded healthcare coverage for the poor (Medicaid) and for those age 65 and older (Medicare) is available to Americans who meet the programs' income or age qualifications.
In 2010, then-President Obama signed into law the Patient Protection and Affordable Care Act.[x] Abortion in the United States is not federally protected, and is illegal or restricted in 17 states. American primary and secondary education, known in the U.S. as K–12 ("kindergarten through 12th grade"), is decentralized. School systems are operated by state, territorial, and sometimes municipal governments and regulated by the U.S. Department of Education. In general, children are required to attend school or an approved homeschool from the age of five or six (kindergarten or first grade) until they are 18 years old. This often brings students through the 12th grade, the final year of a U.S. high school, but some states and territories allow them to leave school earlier, at age 16 or 17. The U.S. spends more on education per student than any other country, an average of $18,614 per year per public elementary and secondary school student in 2020–2021. Among Americans age 25 and older, 92.2% graduated from high school, 62.7% attended some college, 37.7% earned a bachelor's degree, and 14.2% earned a graduate degree. The U.S. literacy rate is near-universal. The U.S. has produced the most Nobel Prize winners of any country, with 411 (having won 413 awards). U.S. tertiary or higher education has earned a global reputation. Many of the world's top universities, as listed by various ranking organizations, are in the United States, including 19 of the top 25. American higher education is dominated by state university systems, although the country's many private universities and colleges enroll about 20% of all American students. Local community colleges generally offer open admissions, lower tuition, and coursework leading to a two-year associate degree or a non-degree certificate. In public expenditures on higher education, the U.S. spends more per student than the OECD average, and Americans spend more than any other nation in combined public and private spending. Colleges and universities directly funded by the federal government do not charge tuition and are limited to military personnel and government employees; they include the U.S. service academies, the Naval Postgraduate School, and military staff colleges. Despite some student loan forgiveness programs, student loan debt increased by 102% between 2010 and 2020 and exceeded $1.7 trillion in 2022.

Culture and society

The United States is home to a wide variety of ethnic groups, traditions, and customs. The country has been described as having the values of individualism and personal autonomy, as well as a strong work ethic and competitiveness. Voluntary altruism towards others also plays a major role; according to a 2016 study by the Charities Aid Foundation, Americans donated 1.44% of total GDP to charity—the highest rate in the world by a large margin. Americans have traditionally been characterized by a unifying political belief in an "American Creed" emphasizing consent of the governed, liberty, equality under the law, democracy, social equality, property rights, and a preference for limited government. The U.S. has acquired significant hard and soft power through its diplomatic influence, economic power, military alliances, and cultural exports such as American movies, music, video games, sports, and food. The influence that the United States exerts on other countries through soft power is referred to as Americanization.
Nearly all present Americans or their ancestors came from Europe, Africa, or Asia (the "Old World") within the past five centuries. Mainstream American culture is a Western culture largely derived from the traditions of European immigrants with influences from many other sources, such as traditions brought by slaves from Africa. More recent immigration from Asia and especially Latin America has added to a cultural mix that has been described both as a homogenizing melting pot and as a heterogeneous salad bowl, with immigrants contributing to, and often assimilating into, mainstream American culture. Under the First Amendment to the Constitution, the United States is considered to have the strongest protections of free speech of any country. Flag desecration, hate speech, blasphemy, and lese-majesty are all forms of protected expression. A 2016 Pew Research Center poll found that Americans were the most supportive of free expression of any polity measured. Additionally, they are the "most supportive of freedom of the press and the right to use the Internet without government censorship". The U.S. is a socially progressive country with permissive attitudes surrounding human sexuality. LGBTQ rights in the United States are among the most advanced by global standards. The American Dream, or the perception that Americans enjoy high levels of social mobility, plays a key role in attracting immigrants. Whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a classless society, scholars identify significant differences between the country's social classes, affecting socialization, language, and values. Americans tend to greatly value socioeconomic achievement, but being ordinary or average is promoted by some as a noble condition as well. The National Foundation on the Arts and the Humanities is an agency of the United States federal government that was established in 1965 with the purpose to "develop and promote a broadly conceived national policy of support for the humanities and the arts in the United States, and for institutions which preserve the cultural heritage of the United States." It is composed of four sub-agencies: the National Endowment for the Arts, the National Endowment for the Humanities, the Institute of Museum and Library Services, and the Federal Council on the Arts and the Humanities. Colonial American authors were influenced by John Locke and other Enlightenment philosophers. The American Revolutionary Period (1765–1783) is notable for the political writings of Benjamin Franklin, Alexander Hamilton, Thomas Paine, and Thomas Jefferson. Shortly before and after the Revolutionary War, the newspaper rose to prominence, filling a demand for anti-British national literature. An early novel is William Hill Brown's The Power of Sympathy, published in 1789. Writer and critic John Neal in the early- to mid-19th century helped advance America toward a unique literature and culture by criticizing predecessors such as Washington Irving for imitating their British counterparts, and by influencing writers such as Edgar Allan Poe, who took American poetry and short fiction in new directions. Ralph Waldo Emerson and Margaret Fuller pioneered the influential Transcendentalism movement; Henry David Thoreau, author of Walden, was influenced by this movement. The conflict surrounding abolitionism inspired writers like Harriet Beecher Stowe and authors of slave narratives such as Frederick Douglass. Nathaniel Hawthorne's The Scarlet Letter (1850) explored the dark side of American history, as did Herman Melville's Moby-Dick (1851). 
Major American poets of the 19th century American Renaissance include Walt Whitman, Melville, and Emily Dickinson. Mark Twain was the first major American writer to be born in the West. Henry James achieved international recognition with novels like The Portrait of a Lady (1881). As literacy rates rose, periodicals published more stories centered around industrial workers, women, and the rural poor. Naturalism, regionalism, and realism were the major literary movements of the period. While modernism generally took on an international character, modernist authors working within the United States more often rooted their work in specific regions, peoples, and cultures. Following the Great Migration to northern cities, African-American and black West Indian authors of the Harlem Renaissance developed an independent tradition of literature that rebuked a history of inequality and celebrated black culture. An important cultural export during the Jazz Age, these writings were a key influence on Négritude, a philosophy emerging in the 1930s among francophone writers of the African diaspora. In the 1950s, an ideal of homogeneity led many authors to attempt to write the Great American Novel, while the Beat Generation rejected this conformity, using styles that elevated the impact of the spoken word over mechanics to describe drug use, sexuality, and the failings of society. Contemporary literature is more pluralistic than in previous eras, with the closest thing to a unifying feature being a trend toward self-conscious experiments with language. Twelve American laureates have won the Nobel Prize in Literature. Media in the United States is broadly uncensored, with the First Amendment providing significant protections, as reiterated in New York Times Co. v. United States. The four major broadcasters in the U.S. are the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), American Broadcasting Company (ABC), and Fox Broadcasting Company (Fox). The four major broadcast television networks are all commercial entities. The U.S. cable television system offers hundreds of channels catering to a variety of niches. In 2021, about 83% of Americans over age 12 listened to broadcast radio, while about 40% listened to podcasts. In the prior year, there were 15,460 licensed full-power radio stations in the U.S. according to the Federal Communications Commission (FCC). Much of the public radio broadcasting is supplied by National Public Radio (NPR), incorporated in February 1970 under the Public Broadcasting Act of 1967. U.S. newspapers with a global reach and reputation include The Wall Street Journal, The New York Times, The Washington Post, and USA Today. About 800 publications are produced in Spanish. With few exceptions, newspapers are privately owned, either by large chains such as Gannett or McClatchy, which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in an increasingly rare situation, by individuals or families. Major cities often have alternative newspapers to complement the mainstream daily papers, such as The Village Voice in New York City and LA Weekly in Los Angeles. The five most-visited websites in the world are Google, YouTube, Facebook, Instagram, and ChatGPT—all of them American-owned. Other popular platforms used include X (formerly Twitter) and Amazon. In 2025, the U.S. was the world's second-largest video game market by revenue (after China). In 2015, the U.S. 
video game industry consisted of 2,457 companies that supported around 220,000 jobs and generated $30.4 billion in revenue. There are 444 game publishers, developers, and hardware companies in California alone. According to the Game Developers Conference (GDC), the U.S. is the top location for video game development, with 58% of the world's game developers based there in 2025. The United States is well known for its theater. Mainstream theater in the United States derives from the old European theatrical tradition and has been heavily influenced by the British theater. By the middle of the 19th century, America had created new distinct dramatic forms in the Tom Shows, the showboat theater, and the minstrel show. The central hub of the American theater scene is the Theater District in Manhattan, with its divisions of Broadway, off-Broadway, and off-off-Broadway. Many movie and television celebrities have gotten their big break working in New York productions. Outside New York City, many cities have professional regional or resident theater companies that produce their own seasons. The biggest-budget theatrical productions are musicals. The U.S. also has an active community theater culture. The Tony Awards recognize excellence in live Broadway theater and are presented at an annual ceremony in Manhattan. The awards are given for Broadway productions and performances, and one is also given for regional theater. Several discretionary non-competitive awards are given as well, including a Special Tony Award, the Tony Honors for Excellence in Theatre, and the Isabelle Stevenson Award. Folk art in colonial America grew out of artisanal craftsmanship in communities that allowed commonly trained people to individually express themselves. It was distinct from Europe's tradition of high art, which was less accessible and generally less relevant to early American settlers. Cultural movements in art and craftsmanship in colonial America generally lagged behind those of Western Europe. For example, the prevailing medieval style of woodworking and primitive sculpture became integral to early American folk art, despite the emergence of Renaissance styles in England in the late 16th and early 17th centuries. The new English styles arrived early enough to have made a considerable impact on American folk art, but American styles and forms had already been firmly adopted. Not only did styles change slowly in early America, but rural artisans there tended to continue their traditional forms longer than their urban counterparts did—and far longer than those in Western Europe. The Hudson River School was a mid-19th-century movement in the visual arts tradition of European naturalism. The 1913 Armory Show in New York City, an exhibition of European modernist art, shocked the public and transformed the U.S. art scene. American Realism and American Regionalism sought to reflect and give America new ways of looking at itself. Georgia O'Keeffe, Marsden Hartley, and others experimented with new and individualistic styles, which would become known as American modernism. Major artistic movements such as the abstract expressionism of Jackson Pollock and Willem de Kooning and the pop art of Andy Warhol and Roy Lichtenstein developed largely in the United States. Major photographers include Alfred Stieglitz, Edward Steichen, Dorothea Lange, Edward Weston, James Van Der Zee, Ansel Adams, and Gordon Parks. 
The tide of modernism and then postmodernism has brought global fame to American architects, including Frank Lloyd Wright, Philip Johnson, and Frank Gehry. The Metropolitan Museum of Art in Manhattan is the largest art museum in the United States and the fourth-largest in the world. American folk music encompasses numerous music genres, variously known as traditional music, traditional folk music, contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and some trace back to such origins as the British Isles, mainland Europe, or Africa. The rhythmic and lyrical styles of African-American music in particular have influenced American music. Banjos were brought to America through the slave trade. Minstrel shows incorporating the instrument into their acts led to its increased popularity and widespread production in the 19th century. The electric guitar, first invented in the 1930s and mass-produced by the 1940s, had an enormous influence on popular music, in particular due to the development of rock and roll. The synthesizer, turntablism, and electronic music were also largely developed in the U.S. Elements from folk idioms such as the blues and old-time music were adopted and transformed into popular genres with global audiences. Jazz grew from blues and ragtime in the early 20th century, developing from the innovations and recordings of composers such as W.C. Handy and Jelly Roll Morton. Louis Armstrong and Duke Ellington increased its popularity early in the 20th century. Country music developed in the 1920s, bluegrass and rhythm and blues in the 1940s, and rock and roll in the 1950s. In the 1960s, Bob Dylan emerged from the folk revival to become one of the country's most celebrated songwriters. The musical forms of punk and hip hop both originated in the United States in the 1970s. The United States has the world's largest music market, with a total retail value of $15.9 billion in 2022. Most of the world's major record companies are based in the U.S.; they are represented by the Recording Industry Association of America (RIAA). Mid-20th-century American pop stars, such as Frank Sinatra and Elvis Presley, became global celebrities and best-selling music artists, as have artists of the late 20th century, such as Michael Jackson, Madonna, Whitney Houston, and Mariah Carey, and of the early 21st century, such as Eminem, Britney Spears, Lady Gaga, Katy Perry, Taylor Swift, and Beyoncé. The United States has the world's largest apparel market by revenue. Apart from professional business attire, American fashion is eclectic and predominantly informal. Americans' diverse cultural roots are reflected in their clothing; however, sneakers, jeans, T-shirts, and baseball caps are emblematic of American styles. New York, with its Fashion Week, is considered to be one of the "Big Four" global fashion capitals, along with Paris, Milan, and London. A study found that general proximity to Manhattan's Garment District has been synonymous with American fashion since the industry's inception in the early 20th century. A number of well-known designer labels, among them Tommy Hilfiger, Ralph Lauren, Tom Ford and Calvin Klein, are headquartered in Manhattan; some labels cater to niche markets, such as preteens. New York Fashion Week is one of the most influential fashion shows in the world, and is held twice each year in Manhattan; the annual Met Gala, also in Manhattan, has been called the fashion world's "biggest night". The U.S. 
film industry has a worldwide influence and following. Hollywood, a district in central Los Angeles, the nation's second-most populous city, is also metonymous for the American filmmaking industry. The major film studios of the United States are the primary source of the world's most commercially successful and best-attended movies. Largely centered in the New York City region from its beginnings in the late 19th century through the first decades of the 20th century, the U.S. film industry has since been primarily based in and around Hollywood. Nonetheless, American film companies have been subject to the forces of globalization in the 21st century, and an increasing number of films are made elsewhere. The Academy Awards, popularly known as "the Oscars", have been held annually by the Academy of Motion Picture Arts and Sciences since 1929, and the Golden Globe Awards have been held annually since January 1944. The industry peaked in what is commonly referred to as the "Golden Age of Hollywood", from the early sound period until the early 1960s, with screen actors such as John Wayne and Marilyn Monroe becoming iconic figures. In the 1970s, "New Hollywood", or the "Hollywood Renaissance", was defined by grittier films influenced by French and Italian realist pictures of the post-war period. The 21st century has been marked by the rise of American streaming platforms, which came to rival traditional cinema. Early settlers were introduced by Native Americans to foods such as turkey, sweet potatoes, corn, squash, and maple syrup. Among the most enduring and pervasive examples are variations of the native dish called succotash. Early settlers and later immigrants combined these with foods they were familiar with, such as wheat flour, beef, and milk, to create a distinctive American cuisine. New World foods, especially pumpkin, corn, and potatoes, with turkey as the main course, are part of a shared national menu on Thanksgiving, when many Americans prepare or purchase traditional dishes to celebrate the occasion. Characteristic American dishes such as apple pie, fried chicken, doughnuts, french fries, macaroni and cheese, ice cream, hamburgers, hot dogs, and American pizza derive from the recipes of various immigrant groups. Mexican dishes such as burritos and tacos preexisted the United States in areas later annexed from Mexico, and adaptations of Chinese cuisine as well as pasta dishes freely adapted from Italian sources are all widely consumed. American chefs have had a significant impact on society both domestically and internationally. In 1946, the Culinary Institute of America was founded by Katharine Angell and Frances Roth; it would become the United States' most prestigious culinary school, where many of the most talented American chefs would study before successful careers. The United States restaurant industry was projected at $899 billion in sales for 2020, and directly employed more than 15 million people, representing 10% of the nation's workforce. It is the country's second-largest private employer and the third-largest employer overall. The United States is home to over 220 Michelin star-rated restaurants, 70 of which are in New York City. Wine has been produced in what is now the United States since the 1500s, with the first widespread production beginning in what is now New Mexico in 1628. In the modern U.S., wine production is undertaken in all fifty states, with California producing 84 percent of all U.S. wine. 
With more than 1,100,000 acres (4,500 km2) under vine, the United States is the fourth-largest wine-producing country in the world, after Italy, Spain, and France. The classic American diner, a casual restaurant type originally intended for the working class, emerged during the 19th century from converted railroad dining cars made stationary. The diner soon evolved into purpose-built structures whose number expanded greatly in the 20th century. The American fast-food industry developed alongside the nation's car culture. American restaurants developed the drive-in format in the 1920s, which they began to replace with the drive-through format by the 1940s. American fast-food restaurant chains, such as McDonald's, Burger King, Chick-fil-A, Kentucky Fried Chicken, Dunkin' Donuts and many others, have numerous outlets around the world. The most popular spectator sports in the U.S. are American football, basketball, baseball, soccer, and ice hockey. Their premier leagues are, respectively, the National Football League, the National Basketball Association, Major League Baseball, Major League Soccer, and the National Hockey League. All of these leagues enjoy wide-ranging domestic media coverage and, except for MLS, all are considered the preeminent leagues in their respective sports in the world. While most major U.S. sports such as baseball and American football have evolved out of European practices, basketball, volleyball, skateboarding, and snowboarding are American inventions, many of which have become popular worldwide. Lacrosse and surfing arose from Native American and Native Hawaiian activities that predate European contact. The market for professional sports in the United States was approximately $69 billion in July 2013, roughly 50% larger than that of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States. Although American football does not have a substantial following in other nations, the NFL has the highest average attendance (67,254) of any professional sports league in the world. In 2024, the NFL generated over $23 billion in revenue, making it the most valuable professional sports league in the United States and the world. Baseball has been regarded as the U.S. "national sport" since the late 19th century. The most-watched individual sports in the U.S. are golf and auto racing, particularly NASCAR and IndyCar. At the collegiate level, earnings for member institutions exceed $1 billion annually, and college football and basketball attract large audiences, with the NCAA March Madness tournament and the College Football Playoff among the most watched national sporting events. In the U.S., intercollegiate sports serve as the main feeder system for professional and Olympic sports, with significant exceptions such as Minor League Baseball. This differs greatly from practices in nearly all other countries, where publicly and privately funded sports organizations serve this function. Eight Olympic Games have taken place in the United States. The 1904 Summer Olympics in St. Louis, Missouri, were the first-ever Olympic Games held outside of Europe. The Olympic Games will be held in the U.S. for a ninth time when Los Angeles hosts the 2028 Summer Olympics. U.S. athletes have won a total of 2,968 medals (1,179 gold) at the Olympic Games, the most of any country. 
In other international competition, the United States is home to a number of prestigious events, including the America's Cup, the World Baseball Classic, the U.S. Open, and the Masters Tournament. The U.S. men's national soccer team has qualified for eleven World Cups, while the women's national team has won the FIFA Women's World Cup and the Olympic soccer tournament four and five times, respectively. The 1999 FIFA Women's World Cup was hosted by the United States; its final match was attended by 90,185 spectators, setting the world record at the time for the largest crowd at a women's sporting event. The United States hosted the 1994 FIFA World Cup and will co-host, along with Canada and Mexico, the 2026 FIFA World Cup.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Slapstick] | [TOKENS: 1431]
Slapstick Slapstick is a style of humor involving exaggerated physical activity that exceeds the boundaries of normal physical comedy. Slapstick may involve both intentional violence and violence by mishap, often resulting from physical abuse and/or inept use of props such as saws and ladders. The term arises from a device developed for use in the broad, physical comedy style known as commedia dell'arte in 16th-century Italy. The "slap stick" consists of two thin slats of wood that make a loud "slap" when striking another actor, with little force needed to produce the sound. The physical slap stick remains a key component of the plot in the traditional and popular Punch and Judy puppet show. More contemporary examples of slapstick humor include The Three Stooges, The Naked Gun and Mr. Bean. Origins The name "slapstick" originates from the Italian batacchio or bataccio—called the "slap stick" in English—a club-like object composed of two wooden slats used in commedia dell'arte. When struck, the batacchio produces a loud smacking noise, though only a little force is transferred from the object to the person being struck. Actors may thus hit one another repeatedly with great audible effect while causing no damage and only very minor, if any, pain. Along with the inflatable bladder (of which the whoopee cushion is a modern variant), it was among the earliest special effects. Early uses Slapstick comedy's history is measured in centuries. Shakespeare incorporated many chase scenes and beatings into his comedies, such as in his play The Comedy of Errors. In early 19th-century England, pantomime acquired its present form, which includes slapstick comedy; its most famous performer, Joseph Grimaldi—the father of modern clowning—"was a master of physical comedy". Comedy routines also featured heavily in British music hall theatre, which became popular in the 1850s. In Punch and Judy shows, which first appeared in England on 9 May 1662, a large slapstick is wielded by Punch against the other characters. Fred Karno and music hall British comedians who honed their skills at pantomime and music hall sketches include Charlie Chaplin, Stan Laurel, George Formby and Dan Leno. The English music hall comedian and theatre impresario Fred Karno developed a form of sketch comedy without dialogue in the 1890s, and Chaplin and Laurel were among the young comedians who worked for him as part of "Fred Karno's London Comedians". Chaplin's fifteen-year music hall career inspired the comedy in all his later film work, especially as pantomimicry. In 1904, Karno's Komics produced a new sketch for the Hackney Empire in London called Mumming Birds, which included, among other innovations, the "pie in the face" gag, in which one person hits another with a pie. Immensely popular, it became the longest-running sketch the music halls produced. Chaplin and Laurel were among the music hall comedians who performed in the sketch, while Charlie's older brother Sydney was the first of the brothers to perform it for Karno. In a biography of Karno, Laurel stated: "Fred Karno didn't teach Charlie [Chaplin] and me all we know about comedy. He just taught us most of it". American film producer Hal Roach described Karno as "not only a genius, he is the man who originated slapstick comedy. We in Hollywood owe much to him." 
In film and television Building on its popularity in 19th- and early 20th-century music hall routines in Britain and American vaudeville houses, the style was explored extensively during the "golden era" of black and white movies directed by Hal Roach and Mack Sennett that featured such notables as Charlie Chaplin, Mabel Normand, Abbott and Costello, Laurel and Hardy, the Three Stooges, and Larry Semon. The pie in the face gag was used extensively in this era. Chaplin's 1915 film A Night in the Show, which includes the pie in the face gag, brings one of the classic music hall comedy sketches, Mumming Birds, known as A Night in an English Music Hall when Chaplin performed it on tour, into his film work. Silent slapstick comedy was also popular in early French films and included films by Max Linder, Charles Prince, and Sarah Duhamel. Slapstick also became a common element in animated cartoons starting in the 1930s and 1940s; examples include Disney's Mickey Mouse and Donald Duck shorts, Walter Lantz's Woody Woodpecker, the Beary Family, MGM's Tom and Jerry, the unrelated Tom and Jerry cartoons of Van Beuren Studios, Warner Bros. Looney Tunes/Merrie Melodies, MGM's Barney Bear, and Tex Avery's Screwy Squirrel. Slapstick was later used in the Japanese tokusatsu TV series Kamen Rider Den-O, Kamen Rider Gaim, and Kamen Rider Drive; by Benny Hill in The Benny Hill Show in the UK; and in the US in the 1960s TV series Gilligan's Island, Batman, and The Flying Nun, as well as in the 1950s sitcom I Love Lucy. Hill, whose comedy sketches first appeared on British television in the early 1950s, was described by writer Anthony Burgess as "a comic genius steeped in the British music hall tradition". In the 1970s, the sitcom Three's Company featured slapstick-infused scenes in most episodes. In 1990, Mr. Bean, starring Rowan Atkinson, debuted on British television, and, like Benny Hill, cartoons and other comedians whose "visual humour transcended language barriers" (description of Hill by the BFI), the show would be exported around the world. 20th century fad Examples of the use of the slapstick in public places as a fad in the early 20th century include: During the 1911 Veiled Prophet Parade in St. Louis, according to the St. Louis Post-Dispatch, "The slapstick, so long indispensable to low comedy, found a new use among the crowds ... they used the slapstick to the extreme embarrassment of many women. The carnival spirit, for the most part tempered by high good humor, at times verged on rowdyism. Girls used a stick tipped with feathers to tickle the faces of young men, and they retaliated vigorously with the slapstick." An editorial in the Asbury Park Press, New Jersey, said in 1914: "Slapsticks are the latest 'fun-making' fad for masque fetes ... Orders to stop the slapstick nuisance should be issued by the police and the Asbury Park carnival commissioners. Any device that cannot be operated or used without inflicting unmerited pain and injury should be excluded ..."
========================================
[SOURCE: https://en.wikipedia.org/wiki/Star_of_David] | [TOKENS: 3909]
Star of David The Star of David (Hebrew: מָגֵן דָּוִד, romanized: Māḡēn Dāvīḏ, [maˈɡen daˈvid], lit. 'Shield of David')[a] is a symbol generally recognized as representing both Jewish identity and the Jewish people's ethnic religion, Judaism. Its shape is that of a hexagram: the compound of two equilateral triangles. A derivation of the Seal of Solomon was used for decorative and mystical purposes by Kabbalistic Jews and Muslims. The hexagram has appeared occasionally in Jewish contexts since antiquity as a decorative motif, such as on a stone bearing a hexagram from the arch of the 3rd–4th century Khirbet Shura synagogue. A hexagram found in a religious context can be seen in the Leningrad Codex, a manuscript of the Hebrew Bible from 11th-century Cairo. Its association as a distinctive symbol for the Jewish people and their religion dates to 17th-century Prague. In the 19th century, the symbol began to be widely used by the Jewish communities of Eastern Europe, ultimately coming to represent Jewish identity or religious beliefs. It became representative of Zionism after it was chosen as the central symbol for a Jewish national flag at the First Zionist Congress in 1897. By the end of World War I, it was an internationally accepted symbol for the Jewish people, used on the gravestones of fallen Jewish soldiers. Today, the star is the central symbol on the national flag of the State of Israel. Roots Unlike the menorah, the Lion of Judah, the shofar and the lulav, the hexagram was not originally a uniquely Jewish symbol. The hexagram, being an inherently simple geometric construction, has been used throughout human history in various motifs which were not exclusively religious. Kabbalah scholar Gershom Scholem noted how the symbol was found on a Jewish seal in Sidon from the 7th century BCE, and how it was also found alongside other symbols that were known not to be of Jewish origin. It appeared as a decorative motif in both 4th-century synagogues and Christian churches in the Galilee region. Gershom Scholem writes that the term "seal of Solomon" was adopted by Jews from Islamic magic literature, while he could not assert with certainty whether the term "shield of David" originated in Islamic or Jewish mysticism. Scholem noted how the hexagram was also found in Hinduism, where it is a symbol of the goddess Lakshmi, and in Buddhism, where it is used as a meditation aid to achieve a sense of peace and harmony. Leonora Leet argues, though, that not just the terminology but also the esoteric philosophy behind it had pre-Islamic Jewish roots. She also shows that Jewish alchemists were the teachers of their Muslim and Christian counterparts, and that a pioneer such as Maria Hebraea of Alexandria (2nd or 3rd century CE; others date her earlier) already used concepts which were later adopted by Muslim and Christian alchemists and could be graphically associated with the symbolism of the upper and lower triangles constituting the hexagram, which came into explicit use after her time. The hexagram, however, only becomes widespread in Jewish magical texts and amulets (segulot) in the early Middle Ages, which is why most modern authors have seen Islamic mysticism as the source of the medieval Spanish Kabbalists' use of the hexagram. The name "Star of David" originates from King David of ancient Israel. 
Use as Jewish emblem Only around one millennium later, however, did the star begin to be used as a symbol to identify Jewish communities, a tradition that seems to have started in Prague before the 17th century, and from there spread to much of Eastern Europe. In the 19th century, it came to be adopted by European Jews as a symbol to represent Jewish religion or identity in the same manner the Christian cross identified that religion's believers. The symbol became representative of the worldwide Zionist community after it was chosen as the central symbol on a flag at the First Zionist Congress in 1897, due to its usage in some Jewish communities and its lack of specifically religious connotations. It was not considered an exclusively Jewish symbol until after it began to be used on the gravestones of fallen Jewish soldiers in World War I. History of Jewish usage The hexagram does appear occasionally in Jewish contexts since antiquity, apparently as a decorative motif. For example, in Israel, there is a stone bearing a hexagram from the arch of the 3rd–4th century Khirbet Shura synagogue in Galilee. It also appears on a temple depicted on Bar Kokhba Revolt coinage dating from 135 CE. Originally, the hexagram may have been employed as an architectural ornament on synagogues, as it is, for example, on the cathedrals of Brandenburg and Stendal, and on the Marktkirche at Hanover. A hexagram in this form is found on the ancient synagogue at Capernaum. The use of the hexagram in a Jewish context as a possibly meaningful symbol may occur as early as the 11th century, in the decoration of the carpet page of the famous Tanakh manuscript, the Leningrad Codex, dated 1008. Similarly, the symbol illuminates a medieval Tanakh manuscript dated 1307 belonging to Rabbi Yosef bar Yehuda ben Marvas from Toledo, Spain. A hexagram has been noted on a Jewish tombstone in Taranto, Apulia in Southern Italy, which may date as early as the third century CE. The Jews of Apulia were noted for their scholarship in Kabbalah, which has been connected to the use of the Star of David. Medieval Kabbalistic grimoires show hexagrams among the tables of segulot, but without identifying them as "Shield of David". In the Renaissance, in the 16th-century Land of Israel, the book Ets Khayim conveys the Kabbalah of Ha-Ari (Rabbi Isaac Luria), who arranges the traditional items on the seder plate for Passover into two triangles, where they explicitly correspond to Jewish mystical concepts. The six sfirot of the masculine Zer Anpin correspond to the six items on the seder plate, while the seventh sfira, the feminine Malkhut, corresponds to the plate itself. However, these seder-plate triangles are parallel, one above the other, and do not actually form a hexagram. According to G. S. Oegema (1996): "Isaac Luria provided the hexagram with a further mystical meaning. In his book Etz Chayim he teaches that the elements of the plate for the Seder evening have to be placed in the order of the hexagram: above the three sefirot 'Crown', 'Wisdom', and 'Insight', below the other seven." Similarly, M. Costa wrote that M. Gudemann and other researchers in the 1920s claimed that Isaac Luria was influential in turning the Star of David into a national Jewish emblem by teaching that the elements of the plate for the Seder evening have to be placed in the order of the hexagram. 
Gershom Scholem (1990) disagrees with this view, arguing that Isaac Luria talked about parallel triangles one beneath the other and not about the hexagram. The Star of David, at least since the 20th century, remains associated with the number seven and thus with the Menorah, and popular accounts associate it with the six directions of space plus the center (under the influence of the description of space found in the Sefer Yetsira: Up, Down, East, West, South, North, and Center), or the Six Sefirot of the Male (Zeir Anpin) united with the Seventh Sefirot of the Female (Nukva). Some say that one triangle represents the ruling tribe of Judah and the other the former ruling tribe of Benjamin. It is also seen as a dalet and yud, the two letters assigned to Judah. There are 12 Vav, or "men", representing the 12 tribes or patriarchs of Israel. In 1354, King Charles IV of Bohemia approved for the Jews of Prague a red flag with a hexagram. In 1460, the Jews of Ofen (Buda, now part of Budapest, Hungary) received King Matthias Corvinus with a red flag on which were two Shields of David and two stars. In the first Hebrew prayer book, printed in Prague in 1512, a large hexagram appears on the cover. In the colophon is written: "Each man beneath his flag according to the house of their fathers...and he will merit to bestow a bountiful gift on anyone who grasps the Shield of David." In 1592, Mordechai Maizel was allowed to affix "a flag of King David, similar to that located on the Main Synagogue" on his synagogue in Prague. Following the Battle of Prague (1648), the Jews of Prague were again granted a flag, in recognition of their contribution to the city's defense. That flag showed a yellow hexagram on a red background, with a "Swedish star" placed in the center of the hexagram. In the 1650s, the Jews of Vienna adopted a seal with the hexagram on it, likely choosing the motif used on the seal for the Jews of Prague. When a boundary was fixed between Vienna and the Jewish ghetto, a marker was fashioned which separated the two communities. The Christians were identified by the cross and the Jews by the hexagram. When the Jews of Vienna were expelled in 1669, many refugees fled to other cities, which in turn used the symbol for their community seal. The early proto-Zionist Hibbat Zion societies used the Star of David as a national emblem, although Herzl was not aware of this. The symbol became representative of the worldwide Zionist community, and later the broader Jewish community, after it was chosen to represent the First Zionist Congress in 1897. A year before the congress, Herzl had written in his 1896 Der Judenstaat: "We have no flag, and we need one. If we desire to lead many men, we must raise a symbol above their heads. I would suggest a white flag, with seven golden stars. The white field symbolizes our pure new life; the stars are the seven golden hours of our working-day. For we shall march into the Promised Land carrying the badge of honor." David Wolffsohn (1856–1914), a businessman prominent in the early Zionist movement, aware that the nascent Zionist movement had no official flag and that the design proposed by Theodor Herzl was gaining no significant support, wrote: "At the behest of our leader Herzl, I came to Basle to make preparations for the Zionist Congress. Among many other problems that occupied me then was one that contained something of the essence of the Jewish problem. What flag would we hang in the Congress Hall? Then an idea struck me. 
We have a flag—and it is blue and white. The talith (prayer shawl) with which we wrap ourselves when we pray: that is our symbol. Let us take this talith from its bag and unroll it before the eyes of Israel and the eyes of all nations. So I ordered a blue and white flag with the Shield of David painted upon it. That is how the national flag, that flew over Congress Hall, came into being." In the early 20th century, the symbol began to be used to express Jewish affiliations in sports. Hakoah Vienna was a Jewish sports club founded in Vienna, Austria, in 1909, whose teams competed with the Star of David on the chest of their uniforms and won the 1925 Austrian League soccer championship. Similarly, the Philadelphia Sphas basketball team (whose name was an acronym of its founding South Philadelphia Hebrew Association) wore a large Star of David on their jerseys to proudly proclaim their Jewish identity as they competed in the first half of the 20th century. In boxing, Benny "the Ghetto Wizard" Leonard (who said he felt as though he was fighting for all Jews) fought with a Star of David embroidered on his trunks in the 1910s. World heavyweight boxing champion Max Baer fought with a Star of David on his trunks as well, notably, for the first time, as he knocked out Nazi Germany hero Max Schmeling in 1933; Hitler never permitted Schmeling to fight a Jew again. A Star of David, often yellow, was used by the Nazis during the Holocaust to identify Jews. After the German invasion of Poland in 1939, local German occupation commanders ordered Jewish Poles to wear an identifying mark (e.g. in the General Government, a white armband with a blue Star of David; in the Warthegau, a yellow badge, in the form of a Star of David, on the left breast and on the back). If a Jew was found in public without the star, he could be severely punished. The requirement to wear the Star of David with the word Jude (German for Jew) was then extended to all Jews over the age of six in the Reich and in the Protectorate of Bohemia and Moravia (by a decree issued on September 1, 1941, and signed by Reinhard Heydrich) and was gradually introduced in other Nazi-occupied areas. Others, however, wore the Star of David as a symbol of defiance against Nazi antisemitism, as in the case of United States Army private Hal Baumgarten, who wore a Star of David emblazoned on his back during the 1944 invasion of Normandy. Contemporary use The flag of Israel, depicting a blue Star of David on a white background between two horizontal blue stripes, was adopted on October 28, 1948, five months after the country's establishment. The origins of the flag's design date from the First Zionist Congress in 1897; the flag has subsequently been known as the "flag of Zion". Many Modern Orthodox, Conservative, and Reform synagogues have the Israeli flag with the Star of David prominently displayed on the bimah. Magen David Adom (MDA) ("Red Star of David" or, translated literally, "Red Shield of David") is Israel's only official emergency medical, disaster, and ambulance service. It has been officially recognized by the International Committee of the Red Cross since June 2006. According to the Israel Ministry of Foreign Affairs, Magen David Adom was boycotted by the International Committee of the Red Cross, which refused to grant the organization membership because "it was [...] argued that having an emblem used by only one country was contrary to the principles of universality." 
Other commentators said the ICRC did not recognize the medical and humanitarian use of this Jewish symbol, a Red Shield, alongside the Christian cross and the Muslim crescent. Since 1948, the Star of David has carried the dual significance of representing both the state of Israel and Jewish identity in general. In the United States especially, it continues to be used in the latter sense by a number of athletes. In baseball, Jewish major leaguer Gabe Kapler had a Star of David tattooed on his left calf in 2000, with the words "strong-willed" and "strong-minded"; major leaguer Mike "Superjew" Epstein drew a Star of David on his baseball glove; and major leaguer Ron Blomberg had a Star of David emblazoned on the knob of his bat, which is on display at the Baseball Hall of Fame. NBA basketball star Amar'e Stoudemire, who says he is spiritually and culturally Jewish, had a Star of David tattoo put on his left hand in 2010. NFL football defensive end Igor Olshansky has Star of David tattoos on each side of his neck, near his shoulders. Israeli golfer Laetitia Beck displays a blue-and-white Magen David symbol on her golf apparel. In boxing, Jewish light heavyweight world champion Mike "The Jewish Bomber" Rossman fought with a Star of David embroidered on his boxing trunks, and also has a blue Star of David tattoo on the outside of his right calf. Other boxers who fought with Stars of David embroidered on their trunks include world light heavyweight boxing champion Battling Levinsky; Barney Ross (world champion as a lightweight, as a junior welterweight, and as a welterweight); world flyweight boxing champion Victor "Young" Peres; world bantamweight champion Alphonse Halimi; and, more recently, World Boxing Association super welterweight champion Yuri Foreman, light welterweight champion Cletus Seldin, and light middleweight Boyd Melson. Welterweight Zachary "Kid Yamaka" Wohlman has a tattoo of a Star of David across his stomach, and welterweight Dmitriy Salita even boxes under the nickname "Star of David". Maccabi clubs still use the Star of David in their emblems. Etymology The Jewish Encyclopedia cites a 12th-century Karaite document as the earliest Jewish literary source to mention a symbol called "Magen Dawid" (without specifying its shape). The name 'Shield of David' was used by at least the 11th century as a title of the God of Israel, independent of the use of the symbol. The phrase occurs independently as a divine title in the Siddur, the traditional Jewish prayer book, where it poetically refers to the divine protection of ancient King David and the anticipated restoration of his dynastic house, perhaps based on Psalm 18, which is attributed to David, and in which God is compared to a shield (v. 31 and v. 36). The term occurs at the end of the "Samkhaynu/Gladden us" blessing, which is recited after the reading of the Haftara portion on Saturday and holidays. The earliest known text related to Judaism which mentions a sign called the "Shield of David" is Eshkol Ha-Kofer by the Karaite Judah Hadassi, in the mid-12th century CE: "Seven names of angels precede the mezuzah: Michael, Gabriel, etc. ...Tetragrammaton protect you! And likewise the sign, called the 'Shield of David', is placed beside the name of each angel." This book is of Karaite, and not of Rabbinic Jewish, origin, and it does not describe the shape of the sign in any way.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Age_of_marriage] | [TOKENS: 6273]
Marriageable age Marriageable age is the minimum legal age of marriage. Age and other prerequisites to marriage vary between jurisdictions, but in the vast majority of jurisdictions, the marriageable age as a right is set at the age of majority. Nevertheless, most jurisdictions allow marriage at a younger age with parental or judicial approval, especially if the female is pregnant. Among most indigenous cultures, people marry at fifteen, the age of sexual maturity for both the male and the female. In industrialized cultures, the age of marriage is most commonly 18 years old, but there are variations, and the marriageable age should not be confused with the age of majority or the age of consent, though they may be the same. The 55 parties to the 1962 Convention on Consent to Marriage, Minimum Age for Marriage, and Registration of Marriages have agreed to specify a minimum marriageable age by statute law, to override customary, religious, and tribal laws and traditions. When the marriageable age under a law of a religious community is lower than that under the law of the land, the state law prevails. However, some religious communities do not accept the supremacy of state law in this respect, which may lead to child marriage or forced marriage. The 123 parties to the 1956 Supplementary Convention on the Abolition of Slavery have agreed to adopt a prescribed "suitable" minimum age for marriage. In many developing countries, the official age prescriptions stand as mere guidelines. UNICEF, the United Nations children's organization, regards a marriage of a minor (legal child), a person below the adult age, as child marriage and a violation of rights. Until recently, the minimum marriageable age for females was lower in many jurisdictions than for males, on the premise that females mature at an earlier age than males. Such laws have been viewed by some as discriminatory, so that in many countries the marriageable age of females has been raised to equal that of males. History and social attitudes In ancient Greece, females married as young as 14 or 16. In Spartan marriages, females were around 18 and males were around 25. In the Roman Empire, the Emperor Augustus introduced marriage legislation, the Lex Papia Poppaea, which rewarded marriage and childbearing. The legislation also imposed penalties on both men and women who remained unmarried, or who married but for whatever reason failed to have children; for men these applied between the ages of 25 and 60, and for women between the ages of 20 and 50. Women who were Vestal Virgins were selected between the ages of 10 and 13 to serve as priestesses in the temple of the goddess Vesta in the Roman Forum for 30 years, after which they could marry. In Roman law, the age of marriage was 12 years for females and 14 years for males, and the age of betrothal was 7 years for both males and females. The father had the right and duty to seek a good and useful match for his children. To further the interests of their birth families, daughters of the elite would marry into respectable families. If a daughter could prove the proposed husband to be of bad character, she could legitimately refuse the match. 
Individuals remained under the authority of the pater familias until his death, and the latter had the power to approve or reject marriages for his sons and daughters, but by the late antique period, Roman law permitted women over 25 to marry without parental consent. Noblewomen were known to marry as young as 12 years of age, whereas women in the lower social classes were more likely to marry slightly further into their teenage years: 43% of Pagan females married at 12–15 years and 42% of Christian females married at 15–18 years. In late antiquity, most Roman women married in their late teens to early twenties, but noble women married younger than those of the lower classes, as an aristocratic girl was expected to be a virgin until her first marriage. In late antiquity, under Roman law, daughters inherited equally from their parents if no will was produced. In addition, Roman law recognized wives' property as legally separate from husbands' property, as did some legal systems in parts of Europe and colonial Latin America. In 380 CE, the Emperor Theodosius issued the Edict of Thessalonica, which made Nicene Christianity the official religion of the Roman Empire. The Holy See adapted Roman law into Canon law. After the fall of the Western Roman Empire and the rise of the Holy Roman Empire, manorialism also helped weaken the ties of kinship and thus the power of clans. As early as the 9th century in northwestern France, families that worked on manors were small, consisting of parents and children and occasionally a grandparent. The Roman Catholic Church and State had become allies in erasing the solidarity and thus the political power of the clans; the Roman Catholic Church sought to replace traditional religion, whose vehicle was the kin group, and substitute the authority of the elders of the kin group with that of a religious elder. At the same time, the king's rule was undermined by revolts by the most powerful kin groups, clans or sections, whose conspiracies and murders threatened the power of the state, and also by the demands of manorial lords for obedient, compliant workers. As the peasants and serfs lived and worked on farms that they rented from the lord of the manor, they also needed the permission of the lord to marry. Couples therefore had to comply with the lord of the manor and wait until a small farm became available before they could marry and thus produce children. Those who delayed marriage were presumably rewarded by the landlord, while those who did not delay were presumably denied that reward. For example, marriageable ages in Medieval England varied depending on economic circumstances, with couples delaying marriage until their early twenties when times were bad, but marrying in their late teens after the Black Death, when there was a severe labour shortage; by appearances, marriage of adolescents was not the norm in England. In medieval Western Europe, the rise of Catholicism and manorialism had both created incentives to keep families nuclear, and thus the age of marriage increased; the Western Church instituted marriage laws and practices that undermined large kinship groups. The Roman Catholic Church prohibited consanguineous marriages, a marriage pattern that had been a means to maintain clans (and thus their power) throughout history. The Roman Catholic Church curtailed arranged marriages in which the bride did not clearly agree to the union. 
In the 12th century, the Roman Catholic Church drastically changed legal standards for marital consent by allowing daughters over 12 years old and sons over 14 years old to marry without their parents' approval, which was previously required, even if their marriage was made clandestinely. Parish studies have confirmed that in the late medieval period, females did sometimes marry without their parents' approval in England. In the 12th century, the Canon law jurist Gratian stated that consent for marriage could not take place before the age of 12 years old for females and 14 years old for males; also, consent for betrothal could not take place before the age of 7 years old for females and males, as that is the age of reason. The Church of England, after breaking away from the Roman Catholic Church, carried with it the same minimum age requirements. Ages of consent for marriage of 12 years old for girls and of 14 years old for boys were written into English civil law. The first recorded age-of-consent law, in England, dates back 800 years. The age-of-consent law in question has to do with the law of rape and not the law of marriage, as is sometimes misunderstood. In 1275, in England, as part of the rape law, the Statute of Westminster 1275 made it a misdemeanor to have sex with a "maiden within age", whether with or without her consent. The phrase "within age" was interpreted by jurist Sir Edward Coke as meaning the age of marriage, which at the time was 12 years old. A 1576 law imposed more severe punishments for having sex with a girl, setting the age of consent at 10 years old. Under English common law the age of consent, as part of the law of rape, was 10 or 12 years old, and rape was defined as forceful sexual intercourse with a woman against her will. To convict a man of rape, both force and lack of consent had to be proved, except in the case of a girl who was under the age of consent. Since the age of consent applied in all circumstances, not just in physical assaults, the law also made it impossible for an underage girl (under 12 years old) to consent to sexual activity. There was one exception: a man's acts with his wife (females over 12 years old), to which rape law did not apply. Jurist Sir Matthew Hale stated that both rape laws were valid at the same time. In 1875, the Offences Against the Person Act raised the age to 13 years in England; an act of sexual intercourse with a girl younger than 13 was a felony. There were some fathers who arranged marriages for a son or a daughter before he or she reached the age of maturity, which is similar to what some fathers in ancient Rome did. Consummation would not take place until the age of maturity. Roman Catholic Canon law defines a marriage as consummated when the "spouses have performed between themselves in a human fashion a conjugal act which is suitable in itself for the procreation of offspring, to which marriage is ordered by its nature and by which the spouses become one flesh." There are recorded marriages of two- and three-year-olds: in 1564, a three-year-old named John was married to a two-year-old named Jane in the Bishop's Court in Chester, England. The policy of the Roman Catholic Church, and later various Protestant churches, of considering clandestine marriages and marriages made without parental consent to be valid was controversial, and in the 16th century both the French monarchy and the Lutheran Church sought to end these practices, with limited success. 
In most of Northwestern Europe, marriage at very early ages was rare. One thousand marriage certificates from 1619 to 1660 in the Archdiocese of Canterbury show that only one bride was 13 years old, four were 15, twelve were 16, and seventeen were 17; the other 966 brides were at least 19 years old. In England and Wales, the Marriage Act 1753 required a marriage to be covered by a licence (requiring parental consent for those under 21) or the publication of banns (which parents of those under 21 could forbid). Additionally, the Church of England dictated that both the bride and groom must be at least 21 years of age to marry without the consent of their families. In the certificates, the most common age for the brides is 22 years; for the grooms, 24 was the most common age, with average ages of 24 years for the brides and 27 for the grooms. While European noblewomen often married early, they were a small minority of the population, and the marriage certificates from Canterbury show that even among the nobility it was very rare to marry women off at very early ages. The minimum age requirements of 12 and 14 were eventually written into English civil law, and by default these provisions became the minimum marriageable ages in colonial America. On average, marriages occurred several years earlier in colonial America than in Europe, and much higher proportions of the population eventually got married. Community-based studies suggest an average age at marriage of about 20 years for women in the early colonial period and about 26 years for men. In the late 19th century and throughout the 20th century, U.S. states began to slowly raise the minimum legal age at which individuals were allowed to marry. Age restrictions, as in most developed countries, have been revised upward so that they are now between 15 and 21 years of age. Before 1929, Scottish law followed Roman law in allowing a girl to marry at twelve years of age and a boy at fourteen, without any requirement for parental consent; in practice, however, marriage in Scotland at such young ages was almost unknown. The highest average age at first marriage was in the Netherlands: on average 27 years for women and 30 years for men, in both the rural and urban population, from the late 1400s onward until the end of the Second World War, rising at times to 30 years for women and 32 years for men. On average, 25–30% of people in the Netherlands remained unmarried throughout their lives between 1500 and 1950. In Amsterdam, the mean age at first marriage for women fluctuated between 23.5 and 25 years from the late 15th century until the 1660s, when it started to rise even further. From early on, the Roman Catholic Church promoted sexual abstinence over marriage, but marriage over sexual promiscuity; this meant that remaining unmarried became socially acceptable in Western Europe. In the Middle Ages, marriage was often not recorded and therefore could depend on the word of the couple, who could either confirm or deny that it had taken place. A majority of unmarried women would be in the service of the church as nuns or as lay women. A vast number of women also provided for themselves in specialised professions until the financial freedoms of women were curtailed by the guilds in the late Middle Ages; until then, many women could also run businesses to sustain themselves outside of marriage.
After the 1400s, ages at first marriage became better recorded and appear to have been influenced largely by the economic situation. In times of economic uncertainty, both women and men tended to marry younger (between 20 and 25 years old for women), but the age gap was somewhat larger. A major factor was that by marrying their daughter off young, the parents had one less mouth to feed, and the dowry was often lower for younger girls, who had learned fewer skills and built up less in savings. This also explains the larger age gap between husband and wife in economically harsher times: an older husband would already have established an income with which to sustain a wife and children. Though the nobility, for political reasons, often became engaged and married far younger than the general population, in many cases the actual consummation of the marriage was postponed until both partners had reached a more mature age. Another contributing factor to a later marriage age was that in the Middle Ages a culture of nuclear family structures developed out of the multigenerational extended family structures that had been common in pre-Christian tribal societies in Western Europe. Both men and women would typically spend several years working as a maid, farmhand, labourer or apprentice in order to gain work experience, develop skills and save up money to sustain their own nuclear family, rather than continuing to live in a multigenerational household. This development raised the socially accepted first marriage age of women from puberty onset (12–14 years old) in the early Middle Ages to their late teens and older by the late medieval period, and to their mid-twenties on average during the Renaissance. It also brought the first marriage ages of women and men far closer together. The great general wealth in the Netherlands from the spice trade likewise meant that women married later in life; the highest marriage ages for both men and women, past 30 years old, are found in times of national financial prosperity. Another contributing reason was that a late marriage age was a recognised method of birth control: the later a woman married, the fewer children she would bear and the fewer children a couple had to raise. It was also generally recognised that giving birth at a very young age was detrimental to the woman's health, and it was therefore socially disapproved of. Social disapproval of a young marriage age for the woman, and of a large age gap between the marriage partners, can still be recognised in sayings originating in those centuries; a well-known example from neighbouring Britain is the cautionary tale of William Shakespeare's play Romeo and Juliet, whose protagonists' young ages were considered scandalous at the time. In France, until the French Revolution, the marriageable age was 12 years for females and 14 for males. Revolutionary legislation in 1792 increased the age to 13 years for females and 15 for males. Under the Napoleonic Code in 1804, the marriageable age was set at 15 years for females and 18 years for males; in 2006, the marriageable age for females was increased to 18, the same as for males. In jurisdictions where the ages are not the same, the marriageable age for females is more commonly two or three years lower than that for males. In 17th-century Poland, in the Warsaw parish of St John, the average age of women entering marriage was 20.1 and that of men 23.7; in the second half of the eighteenth century, women in the parish of Holy Cross married at 21.8, while men married at 29.
In medieval Eastern Europe, the Slavic traditions of patrilocality and of early and universal marriage (usually of a bride aged 13–15 years, with menarche occurring on average at age 14) lingered; the manorial system had yet to penetrate Eastern Europe and generally had less effect on clan systems there. The bans on cross-cousin marriages had also not been firmly enforced. In Russia, before 1830 the age of consent for marriage was 15 years old for males and 13 years old for females (though 15 was preferred for females, so much so that it was written into the Law Code of 1649). Teenage marriage was practised to ensure chastity. Both female and male teenagers needed the consent of their parents to marry, because they were under 20 years old, the age of majority. In 1830, the age of consent for marriage was raised to 18 years old for males and 16 years old for females (though 18 was preferred for females); the average age of marriage for females was around 19. Aztec family law generally followed customary law: men married between the ages of 20 and 22, and women generally married at 15 to 18 years of age. Maya family law appears to have been based on customary law as well: Maya men and women usually married at around the age of 20, though women sometimes married at 16 or 17. Marriageable age as a right vs exceptions In the majority of countries, a right to marry at age 18 is enshrined along with all other rights and responsibilities of adulthood. However, most of these countries allow those younger than that age to marry, usually with parental consent or judicial authorization, and these exceptions vary considerably by country. The United Nations Population Fund stated: In 2010, 158 countries reported that 18 years was the minimum legal age for marriage for women without parental consent or approval by a pertinent authority. However, in 146 [of those] countries, state or customary law allows girls younger than 18 to marry with the consent of parents or other authorities; in 52 countries, girls under age 15 can marry with parental consent. In contrast, 18 is the legal age for marriage without consent among males in 180 countries. Additionally, in 105 countries, boys can marry with the consent of a parent or a pertinent authority, and in 23 countries, boys under age 15 can marry with parental consent. In recent years, many countries in the EU have tightened their marriage laws, either banning marriage under 18 completely or requiring judicial approval for such marriages. Countries which have reformed their marriage laws in recent years include Sweden (2014), Denmark (2017), Germany (2017), Luxembourg (2014), Spain (2015), the Netherlands (2015), Finland (2019) and Ireland (2019). Many developing countries have also enacted similar laws in recent years: Honduras (2017), Ecuador (2015), Costa Rica (2017), Panama (2015), Trinidad & Tobago (2017) and Malawi (2017). The minimum age requirements of 12 years old for females and 14 years old for males were written into English civil law, and by default these provisions became the minimum marriageable ages in colonial America. This English common law inherited from the British remained in force in America unless a specific state law was enacted to replace it. In the United States, as in most developed countries, age restrictions have been revised upward so that they are now between 15 and 21 years of age.
In Western countries, marriages of teenagers have become rare in recent years, with their frequency declining during the past few decades. For instance, in Finland, where in the early 21st century underage youth could obtain a special judicial authorization to marry, there were only 30–40 such marriages per year during that period (with most of the spouses being aged 17), while in the early 1990s more than 100 such marriages were registered each year. Since 1 June 2019, Finland has banned marriages of anyone under 18, with no exemptions. Relation to the age of majority Marriageable age as a right is usually the same as the age of majority, which is 18 years old in most countries. However, in some countries the age of majority is under 18, while in others it is 19, 20 or 21 years. In Canada, for example, the age of majority is 19 in Nova Scotia, New Brunswick, British Columbia, Newfoundland and Labrador, the Northwest Territories, Yukon and Nunavut, and marriage under 19 in these provinces and territories requires parental or court consent (see Marriage in Canada). In the United States, the age of majority is 21 in Mississippi and 19 in Nebraska, and marriage below those ages requires parental consent. In many jurisdictions of North America, married minors become legally emancipated. Listed by country: in Puerto Rico, the marriageable age is 21. Minors under 18 cannot marry under any circumstance in the states of New York, Pennsylvania, New Jersey, Delaware, Minnesota, Rhode Island, Connecticut, Massachusetts, Virginia, New Hampshire, Washington, Michigan and Vermont; this also holds true for the territories of the U.S. Virgin Islands and American Samoa. On 30 November 2022, the High Court of Jharkhand held that a Muslim woman can marry a person of her choice after attaining the age of 15. However, the next article allows persons between the ages of 16 and 18 to be married if they have been "commissioned the right of full legal capacity" in accordance with the Civil Code. Notwithstanding anything contained in clause (b) of sub-section (1), nothing shall bar the conclusion, or causing the conclusion of, a marriage within the relationship that is allowed to marry in accordance with the practices prevailing in their ethnic community or clan. The marriageable age as a right is 18 years in all European countries, with the exception of Scotland, where it is 16 (regardless of gender). Existing exceptions to this general rule (usually requiring special judicial or parental consent) are discussed below. The Istanbul Convention, the first legally binding instrument in Europe in the field of violence against women and domestic violence, only requires countries which ratify it to prohibit forced marriage (Article 37) and to ensure that forced marriages can be easily voided without further victimization (Article 32); it makes no reference to a minimum age of marriage. England and Wales: 18. Scotland: 16. Northern Ireland: 16 with parental consent (with the court able to give consent in some cases). By religion In ancient Israel, men twenty years old and older would become warriors, and when they married they were given a one-year leave of absence to be with their wives. Rabbis estimated the age of maturity from about the beginning of the thirteenth year for women and about the beginning of the fourteenth year for men. On the practice of levirate marriage, the Talmud advised against a large age gap between a man and his brother's widow.
A younger woman marrying a significantly older man, however, is especially problematic: marrying one's young daughter to an old man was declared by the Sanhedrin to be as reprehensible as forcing her into prostitution. In Rabbinic Judaism, males cannot consent to marriage until they reach the age of 13 years and a day and have undergone puberty, and females cannot consent to marriage until they reach the age of 12 years and a day and have undergone puberty. Males and females are considered minors until the age of twenty. After twenty, males are not considered adults if they show signs of impotence; if males show no signs of puberty or do show impotence, they automatically become adults by age 35 and can marry. Marriage involved a double ceremony, which included the formal betrothal and the wedding rites. The minimum age for marriage was 13 for males and 12 for females, but formal betrothal could take place before that and often did. The Talmud advises males to marry at 18, or between 16 and 24. A ketannah (literally meaning "little [one]") was any girl between the age of 3 years and that of 12 years plus one day; she was subject to her father's authority, and he could arrange a marriage for her without her agreement, and that marriage remained binding even after she reached the age of maturity. If a girl was orphaned from her father, or was married by his authority and subsequently divorced, her mother or her brothers could arrange a marriage for her in a quasi-binding fashion: until the age of maturity, she could annul the marriage retroactively, and after reaching the age of maturity, intercourse with her husband rendered her officially married. Catholic canon law adopted Roman law, which set the minimum age of marriage at 12 for females and 14 for males. The Roman Catholic Church raised the minimum age of marriage to 14 for females and 16 for males in 1917, and lowered the age of majority to 18 in 1983. The Code of Canons of the Eastern Churches states the same requirements in canon 800. Büchler and Schlater state that "marriageable age according to classical Islamic law coincides with the occurrence of puberty. The notion of puberty refers to signs of physical maturity such as the emission of semen or the onset of menstruation". The Hanafi school of classical Islamic jurisprudence interprets the "age of marriage" in the Quran (24:59; 65:4) as the beginning of puberty, while the Shafiʽi, Hanbali, Maliki, and Ja'fari schools interpret it (24:59) as the completion of puberty. For the Shafiʽi, Hanbali, and Maliki schools of Islamic jurisprudence, in Sunni Islam, the conditions for marriage are physical (bulugh) maturity and mental (rushd) maturity. In his Shafiʽi jurisprudential compilation, The Stocks of the Sojourner, Ahmad Ibn Naqib Al-Misri (died 1368 A.D.) writes: Guardians are, moreover, two types, a binder and a non-binder. The binder is the father and the grandfather, mainly as to the marriage of a virgin, and so is the master as to the marriage of his slave girl. The meaning of "binder" is that he may marry her off without her consent. The non-binder may not marry her off without her consent and permission. When virgin, though, the father or the grandfather may marry her off without her permission, but it is commendable to ask her, and her silence should signify acquiescence.
The sane-minded non-virgin, however, may not be married off by anyone after maturity unless with her express consent, be it by the father, the grandfather, or anyone else. Before maturity, the non-virgin may not be married off at all. Marriages are traditionally contracted by the father or guardian of the bride and her intended husband. The 1917 codification of Islamic family law in the Ottoman Empire distinguished between the age of competence for marriage, which was set at 18 years for boys and 17 years for girls, and the minimum age for marriage, which followed the traditional Hanafi minimum ages of 12 for boys and 9 for girls. Marriage below the age of competence was permissible only if proof of sexual maturity was accepted in court, while marriage under the minimum age was forbidden. During the 20th century, most countries in the Middle East followed the Ottoman precedent in defining the age of competence, while raising the minimum age to 15 or 16 for boys and 15–16 for girls. Marriage below the age of competence is subject to approval by a judge and the legal guardian of the child. Egypt diverged from this pattern by setting age limits of 18 years for boys and 16 years for girls, without a distinction between competence for marriage and minimum age. Many senior clerics in Saudi Arabia have opposed setting a minimum age for marriage, arguing that a girl reaches adulthood at puberty. However, in 2019 members of the Saudi Shoura Council approved new regulations for child marriage that outlaw marrying off children under 15 and require court approval for marriages of those under 18. The Chairman of the Human Rights Committee at the Shoura Council, Dr. Hadi Al-Yami, said that the introduced controls were based on in-depth studies presented to the body. He pointed out that the regulation, vetted by the Islamic Affairs Committee at the Shoura Council, raised the age of marriage to 18 and prohibited it for those under 15. In the Baháʼí Faith's religious book, the Kitáb-i-Aqdas, the age of marriage is set at 15 years for both boys and girls, and it is forbidden to become engaged before the age of 15. The Hindu text Manusmriti states that a female aged 8 should marry a man aged 24, while a female aged 12 should marry a man aged 30.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Python_(programming_language)#cite_note-alt-sources-history-2] | [TOKENS: 4314]
Python (programming language) Python is a high-level, general-purpose programming language. Its design philosophy emphasizes code readability with the use of significant indentation. Python is dynamically type-checked and garbage-collected. It supports multiple programming paradigms, including structured (particularly procedural), object-oriented and functional programming. Guido van Rossum began working on Python in the late 1980s as a successor to the ABC programming language. Python 3.0, released in 2008, was a major revision and not completely backward-compatible with earlier versions. Beginning with Python 3.5, capabilities and keywords for typing were added to the language, allowing optional static typing. As of 2026, the Python Software Foundation supports Python 3.10, 3.11, 3.12, 3.13, and 3.14, following the project's annual release cycle and five-year support policy. Python 3.15 is currently in the alpha development phase, and the stable release is expected to come out in October 2026. Earlier versions in the 3.x series have reached end-of-life and no longer receive security updates. Python has gained widespread use in the machine learning community. It is widely taught as an introductory programming language. Since 2003, Python has consistently ranked in the top ten of the most popular programming languages in the TIOBE Programming Community Index, which ranks languages based on searches across 24 platforms. History Python was conceived in the late 1980s by Guido van Rossum at Centrum Wiskunde & Informatica (CWI) in the Netherlands. It was designed as a successor to the ABC programming language (itself inspired by SETL), capable of exception handling and interfacing with the Amoeba operating system. Python implementation began in December 1989, and van Rossum first released it in 1991 as Python 0.9.0. Van Rossum assumed sole responsibility for the project, as the lead developer, until 12 July 2018, when he announced his "permanent vacation" from responsibilities as Python's "benevolent dictator for life" (BDFL), a title bestowed on him by the Python community to reflect his long-term commitment as the project's chief decision-maker. (He has since come out of retirement and is self-titled "BDFL-emeritus".) In January 2019, active Python core developers elected a five-member Steering Council to lead the project. The name Python derives from the British comedy series Monty Python's Flying Circus. (See § Naming.) Python 2.0 was released on 16 October 2000, featuring many new features such as list comprehensions, cycle-detecting garbage collection, reference counting, and Unicode support. Python 2.7's end-of-life was initially set for 2015, then postponed to 2020 out of concern that a large body of existing code could not easily be forward-ported to Python 3. It no longer receives security patches or updates. While Python 2.7 and older versions are officially unsupported, a different, unofficial Python implementation, PyPy, continues to support Python 2, i.e. "2.7.18+" (plus 3.11), with the plus signifying (at least some) backported security updates. Python 3.0 was released on 3 December 2008; it was a major revision, not completely backward-compatible with earlier versions, with some new semantics and changed syntax. Python 2.7.18, released in 2020, was the last release of Python 2. Several releases in the Python 3.x series have added new syntax to the language and made a few (considered very minor) backward-incompatible changes.
As of January 2026, Python 3.14.3 is the latest stable release. All older 3.x branches received a final security update, down to Python 3.9.24 and then 3.9.25, the last release in the 3.9 series. Python 3.10 has been the oldest supported branch since November 2025. Python 3.15 has had an alpha release, and Android has an official downloadable executable available for Python 3.14. Releases receive two years of full support followed by three years of security support. Design philosophy and features Python is a multi-paradigm programming language. Object-oriented programming and structured programming are fully supported, and many of their features support functional programming and aspect-oriented programming – including metaprogramming and metaobjects. Many other paradigms are supported via extensions, including design by contract and logic programming. Python is often referred to as a 'glue language' because it is purposely designed to integrate components written in other languages. Python uses dynamic typing and a combination of reference counting and a cycle-detecting garbage collector for memory management. It uses dynamic name resolution (late binding), which binds method and variable names during program execution. Python's design offers some support for functional programming in the "Lisp tradition". It has filter, map, and reduce functions; list comprehensions, dictionaries, sets, and generator expressions. The standard library has two modules (itertools and functools) that implement functional tools borrowed from Haskell and Standard ML (a brief sketch appears below). Python's core philosophy is summarized in the Zen of Python (PEP 20), written by Tim Peters, which includes aphorisms such as "Beautiful is better than ugly" and "Readability counts". However, Python has received criticism for violating these principles and adding unnecessary language bloat. Responses to these criticisms note that the Zen of Python is a guideline rather than a rule. The addition of some new features has been controversial: Guido van Rossum resigned as Benevolent Dictator for Life after conflict over adding the assignment expression operator in Python 3.8. Nevertheless, rather than building all functionality into its core, Python was designed to be highly extensible via modules. This compact modularity has made it particularly popular as a means of adding programmable interfaces to existing applications. Van Rossum's vision of a small core language with a large standard library and an easily extensible interpreter stemmed from his frustrations with ABC, which represented the opposite approach. Python claims to strive for a simpler, less-cluttered syntax and grammar while giving developers a choice in their coding methodology. Python lacks do..while loops, which van Rossum considered harmful. In contrast to Perl's motto "there is more than one way to do it", Python advocates the approach that "there should be one – and preferably only one – obvious way to do it". In practice, however, Python provides many ways to achieve a given goal; there are at least three ways to format a string literal, with no certainty as to which one a programmer should use. Alex Martelli, a Fellow at the Python Software Foundation and a Python book author, wrote that "To describe something as 'clever' is not considered a compliment in the Python culture."
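A brief illustration of the functional tools mentioned above (filter, map, reduce, comprehensions, and the itertools/functools modules); this is a minimal sketch with invented variable names, not code from the article:

```python
from functools import reduce
from itertools import accumulate

nums = [1, 2, 3, 4, 5]

# filter/map/reduce in the "Lisp tradition"
evens = list(filter(lambda n: n % 2 == 0, nums))   # [2, 4]
squares = list(map(lambda n: n * n, nums))         # [1, 4, 9, 16, 25]
total = reduce(lambda acc, n: acc + n, nums)       # 15

# list comprehension and generator expression equivalents
squares_comp = [n * n for n in nums]               # eager list
squares_lazy = (n * n for n in nums)               # evaluated on demand

# itertools provides functional tools borrowed from Haskell and
# Standard ML, e.g. running totals:
print(list(accumulate(nums)))                      # [1, 3, 6, 10, 15]
```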
Python's developers typically prioritize readability over performance. For example, they reject patches to non-critical parts of the CPython reference implementation that would offer increases in speed that do not justify the cost in clarity and readability. Execution speed can be improved by moving speed-critical functions to extension modules written in languages such as C, or by using a just-in-time compiler like PyPy. It is also possible to transpile to other languages, but this approach either fails to achieve the expected speed-up, since Python is a very dynamic language, or compiles only a restricted subset of Python (with potential minor semantic changes). Python is meant to be a fun language to use. This goal is reflected in the name – a tribute to the British comedy group Monty Python – and in playful approaches to some tutorials and reference materials. For instance, some code examples use the terms "spam" and "eggs" (in reference to a Monty Python sketch) rather than the typical terms "foo" and "bar". A common neologism in the Python community is pythonic, which has a broad range of meanings related to program style: Pythonic code may use Python idioms well; be natural or show fluency in the language; or conform with Python's minimalist philosophy and emphasis on readability. Syntax and semantics Python is meant to be an easily readable language. Its formatting is visually uncluttered and often uses English keywords where other languages use punctuation. Unlike many other languages, it does not use curly brackets to delimit blocks, and semicolons after statements are allowed but rarely used. It has fewer syntactic exceptions and special cases than C or Pascal. Python uses whitespace indentation, rather than curly brackets or keywords, to delimit blocks. An increase in indentation comes after certain statements; a decrease in indentation signifies the end of the current block. Thus, the program's visual structure accurately represents its semantic structure. This feature is sometimes termed the off-side rule. Some other languages use indentation this way, but in most, indentation has no semantic meaning. The recommended indent size is four spaces. Python's statements include the assignment statement (=), which binds a name as a reference to a separate, dynamically allocated object. Variables may subsequently be rebound at any time to any object. In Python, a variable name is a generic reference holder without a fixed data type; however, it always refers to some object with a type. This is called dynamic typing – in contrast to statically-typed languages, where each variable may contain only a value of a certain type. Python does not support tail call optimization or first-class continuations; according to van Rossum, the language never will. However, better support for coroutine-like functionality is provided by extending Python's generators. Before 2.5, generators were lazy iterators; data was passed unidirectionally out of the generator. From Python 2.5 on, it is possible to pass data back into a generator function (see the sketch below), and from version 3.3, data can be passed through multiple stack levels.
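A minimal sketch of passing data back into a generator via send(); the running_average coroutine is an invented example, not from the article:

```python
def running_average():
    """Coroutine-style generator: values are sent in, averages come out."""
    total = 0.0
    count = 0
    average = None
    while True:
        value = yield average   # send() resumes execution here
        total += value
        count += 1
        average = total / count

avg = running_average()
next(avg)            # prime the generator: run to the first yield
print(avg.send(10))  # 10.0
print(avg.send(20))  # 15.0
print(avg.send(30))  # 20.0
```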
In Python, a distinction between expressions and statements is rigidly enforced, in contrast to languages such as Common Lisp, Scheme, or Ruby. This distinction leads to some duplicated functionality. A statement cannot be part of an expression; because of this restriction, expressions such as list and dict comprehensions (and lambda expressions) cannot contain statements. As a particular case, an assignment statement such as a = 1 cannot be part of the conditional expression of a conditional statement. Python uses duck typing: it has typed objects but untyped variable names. Type constraints are not checked at definition time; rather, operations on an object may fail at usage time, indicating that the object is not of an appropriate type. Despite being dynamically typed, Python is strongly typed, forbidding operations that are poorly defined (e.g., adding a number and a string) rather than quietly attempting to interpret them. Python allows programmers to define their own types using classes, most often for object-oriented programming. New instances of classes are constructed by calling the class, for example SpamClass() or EggsClass(); the classes are instances of the metaclass type (which is an instance of itself), thereby allowing metaprogramming and reflection. Before version 3.0, Python had two kinds of classes, both using the same syntax: old-style and new-style. Current Python versions support the semantics of only the new style. Python supports optional type annotations. These annotations are not enforced by the language, but may be used by external tools such as mypy to catch errors; Python includes a typing module that provides several names for use in annotations. Also, mypy ships with a Python compiler called mypyc, which leverages type annotations for optimization. Python includes conventional symbols for the arithmetic operators (+, -, *, /), the floor-division operator //, and the modulo operator %. (With the modulo operator, a remainder can be negative, e.g., 4 % -3 == -2.) Python also offers the ** symbol for exponentiation, e.g. 5**3 == 125 and 9**0.5 == 3.0, and the matrix-multiplication operator @. These operators work as in traditional mathematics, with the same precedence rules; the infix operators + and - can also be unary, to represent positive and negative numbers respectively. Division between integers produces floating-point results. The behavior of division has changed significantly over time: in Python terms, the / operator represents true division (or simply division), while the // operator represents floor division; before version 3.0, the / operator represented classic division. Rounding towards negative infinity, though a different method than in most languages, adds consistency to Python. For instance, this rounding implies that the equation (a + b)//b == a//b + 1 is always true. Also, the rounding implies that the equation b*(a//b) + a%b == a is valid for both positive and negative values of a. As expected, the result of a%b lies in the half-open interval [0, b), where b is a positive integer; however, maintaining the validity of the equation requires that the result lie in the interval (b, 0] when b is negative. Python provides a round function for rounding a float to the nearest integer. For tie-breaking, Python 3 uses the round-to-even method: round(1.5) and round(2.5) both produce 2. Python versions before 3 used the round-away-from-zero method: round(0.5) is 1.0, and round(-0.5) is −1.0.
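These division and rounding rules can be checked directly in the interpreter; the following short demonstration (not from the article) exercises the identities quoted above:

```python
# floor division rounds toward negative infinity, unlike C's truncation
print(7 // 2, 7 // -2)    # 3 -4

# the remainder takes the sign of the divisor, so 4 % -3 == -2
print(4 % -3)             # -2

# the identities from the text hold for negative operands as well
a, b = 7, -3
assert (a + b) // b == a // b + 1
assert b * (a // b) + a % b == a

# Python 3 breaks ties by rounding to the nearest even integer
print(round(1.5), round(2.5))   # 2 2
```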
Python allows Boolean expressions that contain multiple equality relations, consistent with general usage in mathematics. For example, the expression a < b < c tests whether a is less than b and b is less than c. C-derived languages interpret this expression differently: in C, the expression would first evaluate a < b, resulting in 0 or 1, and that result would then be compared with c. Python uses arbitrary-precision arithmetic for all integer operations. The Decimal type/class in the decimal module provides decimal floating-point numbers to a pre-defined arbitrary precision with several rounding modes. The Fraction class in the fractions module provides arbitrary precision for rational numbers. Due to Python's extensive mathematics library and the third-party library NumPy, the language is frequently used for scientific scripting in tasks such as numerical data processing and manipulation. Functions are created in Python by using the def keyword. A function is defined similarly to how it is called, by first providing the function name and then the required parameters. To assign a default value to a function parameter in case no actual value is provided at run time, variable-definition syntax can be used inside the function header. Code examples An example of a function that prints its inputs, along with the canonical "Hello, World!" program and a program to calculate the factorial of a non-negative integer, is reconstructed in the sketch at the end of this passage. Libraries Python's large standard library is commonly cited as one of its greatest strengths. For Internet-facing applications, many standard formats and protocols such as MIME and HTTP are supported. The language includes modules for creating graphical user interfaces, connecting to relational databases, generating pseudorandom numbers, performing arithmetic with arbitrary-precision decimals, manipulating regular expressions, and unit testing. Some parts of the standard library are covered by specifications – for example, the Web Server Gateway Interface (WSGI) implementation wsgiref follows PEP 333 – but most parts are specified by their code, internal documentation, and test suites. However, because most of the standard library is cross-platform Python code, only a few modules must be altered or rewritten for variant implementations. As of 13 March 2025, the Python Package Index (PyPI), the official repository for third-party Python software, contains over 614,339 packages. Development environments Most Python implementations (including CPython) include a read–eval–print loop (REPL); this permits the environment to function as a command-line interpreter, with which users enter statements sequentially and receive results immediately. CPython is also bundled with an integrated development environment (IDE) called IDLE, which is oriented toward beginners. Other shells, including IDLE and IPython, add further capabilities such as improved auto-completion, session-state retention, and syntax highlighting. Standard desktop IDEs include PyCharm, Spyder, and Visual Studio Code; there are also web browser-based IDEs.
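The code listings referenced under "Code examples" above were lost in extraction; the following is a plausible reconstruction of the canonical versions (the greet and factorial definitions are illustrative, not the article's exact code):

```python
# exact arithmetic with the standard library's Fraction and Decimal
from fractions import Fraction
from decimal import Decimal
print(Fraction(1, 3) + Fraction(1, 6))    # 1/2
print(Decimal("0.10") + Decimal("0.20"))  # 0.30

# "Hello, World!" program
print("Hello, World!")

# a function that prints its inputs; 'greeting' has a default value,
# assigned with variable-definition syntax in the function header
def greet(name, greeting="Hello"):
    print(greeting, name)

greet("World")           # Hello World
greet("World", "Howdy")  # Howdy World

# factorial of a non-negative integer
def factorial(n):
    if n < 0:
        raise ValueError("n must be non-negative")
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

print(factorial(5))  # 120
```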
Implementations CPython is the reference implementation of Python. It is written in C, meeting the C11 standard since version 3.11. Older versions use the C89 standard with several select C99 features, but third-party extensions are not limited to older C versions – e.g., they can be implemented using C11 or C++. CPython compiles Python programs into an intermediate bytecode, which is then executed by a virtual machine. CPython is distributed with a large standard library written in a combination of C and native Python. CPython is available for many platforms, including Windows and most modern Unix-like systems, including macOS (and Apple M1 Macs since Python 3.9.1, using an experimental installer). Starting with Python 3.9, the Python installer deliberately fails to install on Windows 7 and 8; Windows XP was supported until Python 3.5, and there was unofficial support for VMS. Platform portability was one of Python's earliest priorities: during the development of Python 1 and 2, even OS/2 and Solaris were supported, but support has since been dropped for many platforms. All current Python versions (since 3.7) support only operating systems that feature multithreading, and Python now supports far fewer operating systems than in the past, many outdated ones having been dropped. All alternative implementations have at least slightly different semantics; for example, an alternative may use unordered dictionaries, in contrast to other current Python versions. As another example in the larger Python ecosystem, PyPy does not support the full CPython C API. Creating an executable with Python is often done by bundling an entire Python interpreter into the executable, which makes binaries massive for small programs, yet there exist implementations that are capable of truly compiling Python. Alternative implementations include Stackless Python, a significant fork of CPython that implements microthreads; this implementation uses the call stack differently, thus allowing massively concurrent programs. PyPy also offers a stackless version. Just-in-time Python compilers have been developed but are now unsupported. There are several compilers/transpilers to high-level object languages, where the source language is unrestricted Python, a subset of Python, or a language similar to Python, as well as specialized compilers and some older projects and compilers not designed for use with Python 3.x and its syntax. A performance comparison among various Python implementations, using a non-numerical (combinatorial) workload, was presented at EuroSciPy '13. In addition, Python's performance relative to other programming languages is benchmarked by The Computer Language Benchmarks Game. There are several approaches to optimizing Python performance despite the inherent slowness of an interpreted language. Language Development Python's development is conducted mostly through the Python Enhancement Proposal (PEP) process; this process is the primary mechanism for proposing major new features, collecting community input on issues, and documenting Python design decisions. Python coding style is covered in PEP 8. Outstanding PEPs are reviewed and commented on by the Python community and the steering council. Enhancement of the language corresponds with development of the CPython reference implementation. The mailing list python-dev is the primary forum for the language's development. Specific issues were originally discussed in the Roundup bug tracker hosted by the foundation; in 2022, all issues and discussions were migrated to GitHub. Development originally took place on a self-hosted source-code repository running Mercurial, until Python moved to GitHub in January 2017. CPython's public releases have three types, distinguished by which part of the version number is incremented. Many alpha, beta, and release candidates are also released as previews and for testing before final releases.
Although there is a rough schedule for releases, they are often delayed if the code is not ready. Python's development team monitors the state of the code by running a large unit-test suite during development. The major academic conference on Python is PyCon. There are also special Python mentoring programs, such as PyLadies. Naming Python's name is inspired by the British comedy group Monty Python, whom Python creator Guido van Rossum enjoyed while developing the language. Monty Python references appear frequently in Python code and culture; for example, the metasyntactic variables often used in Python literature are spam and eggs, rather than the traditional foo and bar. The official Python documentation also contains various references to Monty Python routines. Python users are sometimes referred to as "Pythonistas".
========================================
[SOURCE: https://www.fast.ai/posts/2016-10-07-fastai-launch.html] | [TOKENS: 1049]
Launching fast.ai Jeremy Howard October 7, 2016 Jeremy is the past president of Kaggle, founder of Enlitic, FastMail.FM, and Optimal Decisions Group, and is on the faculty at Singularity University. See fast.ai's About page for a brief bio. About six months ago, I resigned from my position as CEO of Enlitic, the company that I founded to bring medical diagnostics and treatment planning into the data-driven world. I created Enlitic because there are 4 billion people in the world without access to modern medical diagnostics, and it will take about 300 years to train enough doctors to fill this gap – but with deep learning, we can make doctors 10 times more productive, and therefore bring modern medicine to these people within 5 to 10 years. This is only possible thanks to the power of deep learning and neural networks, a technology which I have been using for over 20 years, but which just in the last couple of years has reached a point where it can help solve many previously unsolved problems. I discussed the implications of this a couple of years ago in my TED.com talk, back when I was first launching Enlitic. And indeed, many of the predictions I made then have since come to pass. Deep learning is now becoming embedded in products such as Apple's Siri, Google Photos, and self-driving cars. (Some people are even claiming that deep learning is "overhyped". This is as ridiculous a claim as somebody in the early 90s claiming that the Internet was overhyped. Deep learning is clearly going to be even more widely used and far-reaching and transformative than the Internet.) But for all the successes, I discovered during my two years at Enlitic that deep learning has a very long way to go before it can help most people. Creating a deep learning model is, ironically, a highly manual process. Training a model takes a long time, and even for the top practitioners, it is a hit-or-miss affair where you don't know whether it will work until the end. No mature tools exist to ensure models train successfully, or to ensure that the original setup is done appropriately for the data. Therefore, Dr. Rachel Thomas (a math PhD with past experience as a quant, Uber data scientist, full-stack developer, and educator) and I decided to create fast.ai, a research lab dedicated to doing everything necessary to allow deep learning to meet its enormous potential. We believe that this requires allowing domain experts to leverage the technology themselves, rather than leaving it in the hands of a small and exclusive group of mathematicians. Only domain experts fully understand and appreciate the most important problems in their field, have access to the data necessary to solve those problems, and understand the opportunities and constraints of implementing data-driven solutions. Consider this chart shared by Jeff Dean, leader of Google Brain: at the start of 2012, deep learning was not being used at Google outside of Google Brain research. Since then, its use has grown exponentially, and it was being used in approximately 1,200 different projects by late 2015. Now imagine the impact deep learning can have as it spreads beyond the Bay Area tech elite, and we see this exponential growth in every organization around the world. The impact will be greatest in the two-thirds world, where resources are most constrained. For instance: there are only 14 pediatric radiologists for the entire continent of Africa (and half of those are in a single country, South Africa); many African countries have none!
What if medical technology could read x-rays? Tens of millions of children would have access to medical image diagnostics for the first time. And the value of this technology in automating the identification of tuberculosis – a disease that receives little research attention in the West but is the leading cause of death from infectious disease worldwide, killing almost 4,000 people daily – could be even higher. In India, Indonesia, and China, there are over 3 billion people, most of whom live in areas with similarly poor access to medical image diagnostics. During my eight years in management consulting I worked with hundreds of domain experts across dozens of fields and industries. I saw people who were highly creative in figuring out how to solve their problems, given the tools that they were familiar with. Nowadays we receive requests for help nearly every day from people who want to use deep learning to solve everything from treating mental illness, to increasing agricultural yields in the developing world, to identifying and treating plant disease, to developing adaptive educational materials. The best way we can help these people is by giving them the tools and knowledge to solve their own problems, using their own expertise and experience. We believe a series of steps is necessary to meet our goal of democratising deep learning, so we're starting at the start! In our next post, we'll talk about how we're trying to deal with step 1 – fixing the shortage of data scientists with deep learning expertise. If you can't wait, check out the official USF Data Institute description of our upcoming deep learning course on Monday evenings and send your resume to [email protected] to apply.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Electronic_voice_phenomenon] | [TOKENS: 5061]
Electronic voice phenomenon Within ghost hunting and parapsychology, electronic voice phenomena (EVP) are sounds found on electronic recordings that are interpreted as spirit voices. Parapsychologist Konstantīns Raudive, who popularized the idea in the 1970s, described EVP as typically brief, usually the length of a word or short phrase. Enthusiasts consider EVP to be a form of paranormal phenomenon often found in recordings with static or other background noise. Scientists regard EVP as a form of auditory pareidolia (interpreting random sounds as voices in one's own language) and a pseudoscience promulgated by popular culture. Prosaic explanations for EVP include apophenia (perceiving patterns in random information), equipment artifacts, and hoaxes. History As the Spiritualist religious movement became prominent in the 1840s–1940s with a distinguishing belief that the spirits of the dead can be contacted by mediums, new technologies of the era, including photography, were employed by spiritualists in an effort to demonstrate contact with a spirit world. So popular were such ideas that Thomas Edison was asked in an interview with Scientific American to comment on the possibility of using his inventions to communicate with spirits. He replied that if the spirits were only capable of subtle influences, a sensitive recording device would provide a better chance of spirit communication than the table tipping and ouija boards mediums employed at the time. However, there is no indication that Edison ever designed or constructed a device for such a purpose. As sound recording became widespread, mediums explored using this technology to demonstrate communication with the dead as well. Spiritualism declined in the latter part of the 20th century, but attempts to use portable recording devices and modern digital technologies to communicate with spirits continued. American photographer Attila von Szalay was among the first to try recording what he believed to be voices of the dead as a way to augment his investigations in photographing ghosts. He began his attempts in 1941 using a 78 rpm record, but it wasn't until 1956 – after switching to a reel-to-reel tape recorder – that he believed he was successful. Working with Raymond Bayless, von Szalay conducted several recording sessions with a custom-made apparatus, consisting of a microphone in an insulated cabinet connected to an external recording device and speaker. Von Szalay reported finding many sounds on the tape that could not be heard on the speaker at the time of recording, some of which were recorded when there was no one in the cabinet. He believed these sounds to be the voices of discarnate spirits. Among the first recordings believed to be spirit voices were such messages as "This is G!", "Hot dog, Art!", and "Merry Christmas and Happy New Year to you all". Von Szalay and Raymond Bayless's work was published by the Journal of the American Society for Psychical Research in 1959. Bayless later went on to co-author the 1979 book, Phone Calls From the Dead. In 1959, Swedish painter and film producer Friedrich Jürgenson was recording bird songs. Upon playing the tape later, he heard what he interpreted to be his dead father's voice and then the spirit of his deceased wife calling his name. He went on to make several more recordings, including one that he said contained a message from his late mother.
Konstantin Raudive, a Latvian psychologist who had taught at Uppsala University in Sweden and who had worked in conjunction with Jürgenson, made over 100,000 recordings which he described as communications with discarnate people. Some of these recordings were conducted in an RF-screened laboratory and contained words Raudive said were identifiable. In an attempt to confirm the content of his collection of recordings, Raudive invited listeners to hear and interpret them. He believed that the clarity of the voices heard in his recordings implied that they could not be readily explained by normal means. Raudive published his first book, Breakthrough: An Amazing Experiment in Electronic Communication with the Dead, in 1968; it was translated into English in 1971. In 1980, William O'Neil constructed an electronic audio device called "The Spiricom". O'Neil claimed the device was built to specifications which he received psychically from George Mueller, a scientist who had died six years previously. At a Washington, DC press conference on April 6, 1982, O'Neil stated that he was able to hold two-way conversations with spirits through the Spiricom device, and he provided the design specifications to researchers for free. However, nobody is known to have replicated O'Neil's claimed results using Spiricom devices of their own. O'Neil's partner, retired industrialist George Meek, attributed O'Neil's success, and the inability of others to replicate it, to O'Neil's mediumistic abilities forming part of the loop that made the system work. In 2020, Kenny Biddle wrote a comprehensive article explaining the origins of the Spiricom as developed by O'Neil and Meek. He was prompted to do so by the re-emergence of the device on the television series Ghosthunters, and he comprehensively debunked the "science" behind the device in both the original development and the Ghosthunters episode. Another electronic device specifically constructed in an attempt to capture EVP is "Frank's Box", or the "Ghost Box", created in 2002 by EVP enthusiast Frank Sumption for supposed real-time communication with the dead. Sumption claims he received his design instructions from the spirit world. The device is described as a combination white-noise generator and AM radio receiver modified to sweep back and forth through the AM band, selecting split-second snippets of sound. Critics of the device say its effect is subjective and incapable of being replicated, and that since it relies on radio noise, any meaningful response a user gets is purely coincidental, or simply the result of pareidolia. Paranormal researcher Ben Radford writes that Frank's Box is a "modern version of the Ouija board... also known as the 'broken radio'". In 1982, Sarah Estep founded the American Association of Electronic Voice Phenomena (AA-EVP) in Severna Park, Maryland, a nonprofit organization with the purpose of increasing awareness of EVP and of teaching standardized methods for capturing it. Estep began her exploration of EVP in 1976, and says she has made hundreds of recordings of messages from deceased friends, relatives, and extraterrestrials who she speculated originated from other planets or dimensions. The term Instrumental Trans-Communication (ITC) was coined by Ernst Senkowski in the 1970s to refer more generally to communication through any sort of electronic device – such as tape recorders, fax machines, television sets or computers – between spirits or other discarnate entities and the living.
One particularly famous claimed incident of ITC occurred when the image of EVP enthusiast Friedrich Jürgenson (whose funeral was held that day) was said to have appeared on a television in the home of a colleague, which had been purposefully tuned to a vacant channel. ITC enthusiasts also look at the TV and video camera feedback loop of the Droste effect. In 1979, parapsychologist D. Scott Rogo described an alleged paranormal phenomenon in which people report that they receive simple, brief, and usually single-occurrence telephone calls from spirits of deceased relatives, friends, or strangers. Rosemary Guiley has written "within the parapsychology establishment, Rogo was often faulted for poor scholarship, which, critics said, led to erroneous conclusions." In 1995, the parapsychologist David Fontana proposed in an article that poltergeists could haunt tape recorders. He speculated that this may have happened to the parapsychologist Maurice Grosse, who investigated the Enfield Poltergeist case. However, Tom Flynn, a media expert for the Committee for Skeptical Inquiry, examined Fontana's article and suggested an entirely naturalistic explanation for the phenomena. According to the skeptical investigator Joe Nickell, "Occasionally, especially with older tape and under humid conditions, as the tape travels it can adhere to one of the guide posts. When this happens on a deck where both supply and take-up spindles are powered, the tape continues to feed, creating a fold. It was such a loop of tape, Flynn theorizes, that threaded its way amid the works of Grosse's recorder." In 1997, Imants Barušs, of the Department of Psychology at the University of Western Ontario, conducted a series of experiments using the methods of EVP investigator Konstantin Raudive, and the work of "instrumental transcommunication researcher" Mark Macy, as a guide. A radio was tuned to an empty frequency, and over 81 sessions a total of 60 hours and 11 minutes of recordings were collected. During recordings, a person either sat in silence or attempted to make verbal contact with potential sources of EVP. Barušs stated that he did record several events that sounded like voices, but they were too few and too random to represent viable data and too open to interpretation to be described definitively as EVP. He concluded: "While we did replicate EVP in the weak sense of finding voices on audio tapes, none of the phenomena found in our study was clearly anomalous, let alone attributable to discarnate beings. Hence we have failed to replicate EVP in the strong sense." The findings were published in the Journal of Scientific Exploration in 2001, and include a literature review. In 2005, the Journal of the Society for Psychical Research published a report by paranormal investigator Alexander MacRae. MacRae conducted recording sessions using a device of his own design that generated EVP. In an attempt to demonstrate that different individuals would interpret EVP in the recordings the same way, MacRae asked seven people to compare some selections to a list of five phrases he provided, and to choose the best match. MacRae said the results of the listening panels indicated that the selections were of paranormal origin. Portable digital voice recorders are currently the technology of choice for some EVP investigators. Since some of these devices are very susceptible to radio frequency (RF) contamination, EVP enthusiasts sometimes try to record EVP in RF- and sound-screened rooms.
Some EVP enthusiasts describe hearing the words in EVP as an ability, much like learning a new language. Skeptics suggest that the claimed instances may be misinterpretations of natural phenomena, inadvertent influence on the electronic equipment by researchers, or deliberate influence on the researchers and the equipment by third parties. EVP and ITC are seldom researched within the scientific community, so most research in the field is carried out by amateur researchers who lack the training and resources to conduct scientific research, and who are motivated by subjective notions.

Explanations and origins

Paranormal claims for the origin of EVP include living humans imprinting thoughts directly on an electronic medium through psychokinesis, and communication by discarnate entities such as spirits, nature energies, beings from other dimensions, or extraterrestrials. Paranormal explanations for EVP generally assume that the EVP is produced by a communicating intelligence through means other than the typical functioning of communication technologies. Natural explanations for reported instances of EVP tend to dispute this assumption explicitly and provide explanations which do not require novel mechanisms outside recognized scientific phenomena. One study, by the psychologist Imants Barušs, was unable to replicate suggested paranormal origins for EVP recorded under controlled conditions. Brian Regal, in Pseudoscience: A Critical Encyclopedia (2009), has written: "A case can be made for the idea that many EVPs are artifacts of the recording process itself with which the operators are unfamiliar. The majority of EVPs have alternative, nonspiritual sources; anomalous ones have no clear proof they are of spiritual origin." There are a number of simple scientific explanations that can account for why some listeners to the static on audio devices may believe they hear voices, including radio interference and the tendency of the human brain to recognize patterns in random stimuli. Some recordings may be hoaxes created by frauds or pranksters.

Auditory pareidolia is a situation created when the brain incorrectly interprets random patterns as familiar patterns. In the case of EVP it could result in an observer interpreting random noise on an audio recording as the familiar sound of a human voice. The propensity for an apparent voice heard in white noise recordings to be in a language understood well by those researching it, rather than in an unfamiliar language, has been cited as evidence of this, and a broad class of phenomena referred to by author Joe Banks as Rorschach Audio has been described as a global explanation for all manifestations of EVP.

In 2019, paranormal investigator Kenny Biddle examined claims by the owner of a West Virginia museum and by ghost hunters that an EVP recording clearly saying the name "Annie" is really the voice of the woman in a supposedly haunted portrait. The name Annie is written on the back of the portrait, which primes anyone listening to know what name to listen for. The EVP was created using a Radio Shack radio "modified to allow it to continually scan through the available AM or FM frequencies without muting the sound."
Regarding a general question by the ghost hunter such as "What is your name?", Biddle writes, "I can guarantee sooner or later you'll hear something that sounds like a name, and there is a good chance of being a name, because you're listening to radio broadcasts, news reports, commercials, and so on—which often include names." Biddle lists words such as "company, anything, anyone, mahogany, many, or even any" as words that can commonly be heard while listening to the radio. The phrase "... and he ..." would also sound like "Annie" to anyone primed to listen for that name.

Skeptics such as David Federlein, Chris French, Terence Hines and Michael Shermer say that EVP are usually recorded by raising the "noise floor" – the electrical noise created by all electrical devices – in order to create white noise. When this noise is filtered, it can be made to produce noises which sound like speech. Federlein says that this is no different from using a wah pedal on a guitar, which is a focused sweep filter that moves around the spectrum and creates open vowel sounds; this, according to Federlein, sounds exactly like some EVP. Such filtering, in combination with phenomena such as cross modulation of radio stations or faulty ground loops, can create the impression of paranormal voices. The human brain evolved to recognize patterns, and if a person listens to enough noise the brain will detect words, even when there is no intelligent source for them. Expectation also plays an important part in making people believe they are hearing voices in random noise.

Apophenia is related to, but distinct from, pareidolia. Apophenia is defined as "the spontaneous finding of connections or meaning in things which are random, unconnected or meaningless", and has been put forward as a possible explanation. According to the psychologist James Alcock, what people hear in EVP recordings can best be explained by apophenia, cross-modulation, or expectation and wishful thinking. Alcock concluded: "Electronic Voice Phenomena are the products of hope and expectation; the claims wither away under the light of scientific scrutiny."

Interference, for example, is seen in EVP recordings, especially those recorded on devices which contain RLC circuitry. These cases represent radio signals of voices or other sounds from broadcast sources. Interference from CB radio transmissions and wireless baby monitors, and anomalies generated through cross modulation from other electronic devices, are all documented phenomena. It is even possible for circuits to resonate without any internal power source by means of radio reception.

Capture errors are anomalies created by the method used to capture audio signals, such as noise generated through the over-amplification of a signal at the point of recording. Artifacts created during attempts to boost the clarity of an existing recording might also explain some EVP. Such methods include re-sampling, frequency isolation, and noise reduction or enhancement, which can cause recordings to take on qualities significantly different from those present in the original recording.
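Federlein's wah-pedal analogy can be illustrated with a short signal-processing sketch. The following is a minimal, hypothetical demonstration, assuming Python with numpy and scipy available; the filter settings are invented for illustration and are not drawn from any EVP study.

    # Minimal sketch: sweep a narrow band-pass filter through the speech
    # range so that pure white noise takes on vowel-like colouration,
    # analogous to the "wah pedal" effect described above.
    import numpy as np
    from scipy.signal import butter, lfilter

    RATE = 16000  # sample rate in Hz (illustrative)
    rng = np.random.default_rng(0)
    noise = rng.standard_normal(RATE * 5)  # five seconds of white noise

    def bandpass(x, low_hz, high_hz, rate=RATE, order=2):
        """Butterworth band-pass, standing in for one resonant filter stage."""
        b, a = butter(order, [low_hz / (rate / 2), high_hz / (rate / 2)],
                      btype="band")
        return lfilter(b, a, x)

    # Sweep the pass band through roughly 300-3000 Hz in 100 ms blocks
    # (block edges are not smoothed; this is only a demonstration).
    out = np.zeros_like(noise)
    block = RATE // 10
    for i, start in enumerate(range(0, len(noise) - block, block)):
        centre = 300 + (i % 20) * 135          # slowly sweeping centre
        lo, hi = centre * 0.8, centre * 1.2    # narrow band around it
        out[start:start + block] = bandpass(noise[start:start + block], lo, hi)

    # 'out' is nothing but shaped noise, yet listeners primed to hear words
    # will often report syllable-like sounds in it (auditory pareidolia).

The point of the sketch is that the "voice-like" quality comes entirely from the filter, not from any signal hidden in the noise.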
The first EVP recordings may have originated from the use of tape recording equipment with poorly aligned erasure and recording heads, resulting in the incomplete erasure of previous audio recordings on the tape. This could allow a small percentage of previous content to be superimposed or mixed into a new, "silent" recording.[citation needed]

For all radio transmissions above 30 MHz (which are not reflected by the ionosphere) there is a possibility of meteor reflection of the radio signal. Meteors leave a trail of ionised particles and electrons as they pass through the upper atmosphere (a process called ablation), which reflects transmitted radio waves that would usually flow into space. These reflected waves come from transmitters below the horizon of the receiving station; in Europe this means the brief scattered wave may carry a foreign voice which can interfere with radio receivers. Meteor-reflected radio waves last between 0.05 seconds and 1 second, depending on the size of the meteor.

Organizations that show interest in EVP

There are a number of organizations dedicated to studying EVP and instrumental transcommunication, or which otherwise express interest in the subject. Individuals within these organizations may participate in investigations, author books or journal articles, deliver presentations, and hold conferences where they share experiences. In addition, organizations exist which dispute the validity of the phenomena on scientific grounds. The Association TransCommunication (ATransC), formerly the American Association of Electronic Voice Phenomena (AA-EVP), and the International Ghost Hunters Society conduct ongoing investigations of EVP and ITC, including collecting examples of purported EVP available over the internet. Other groups include the Rorschach Audio Project, initiated by sound artist Joe Banks, which presents EVP as a product of radio interference combined with auditory pareidolia, and the Interdisciplinary Laboratory for Biopsychocybernetics Research, a non-profit organization dedicated to studying anomalous phenomena related to neurophysiological conditions; according to the AA-EVP, the latter is "the only organized group of researchers we know of specializing in the study of ITC".

Parapsychologists and spiritualists have an ongoing interest in EVP. Many spiritualists experiment with a variety of techniques for spirit communication which they believe provide evidence of the continuation of life. According to the National Spiritualist Association of Churches, "An important modern day development in mediumship is spirit communications via an electronic device. This is most commonly known as Electronic Voice Phenomena (EVP)". An informal survey by the organization's Department Of Phenomenal Evidence found that about one-third of its churches conduct sessions in which participants seek to communicate with spirit entities using EVP. The James Randi Educational Foundation offered a million dollars for proof that any phenomena, including EVP, are caused paranormally.

Demographics

In 2015, an investigation by sociologist Marc Eaton into the demography of United States paranormal groups that used electronic voice phenomena found an overrepresentation of white participants raised in the Roman Catholic Church (which accounts for only 21% of the U.S. population), mainly with some post-secondary education. Although a preponderance of research shows that women and "less socially integrated individuals" are more likely to believe in ghosts, the demographic samples in Eaton's research did not reflect this.

Cultural impact

The concept of EVP has influenced popular culture. It is popular as an entertaining pursuit, as in ghost hunting, and as a means of dealing with grief.
It has influenced literature, radio, film, television, and music. Investigation of EVP is the subject of hundreds of regional and national groups and Internet message boards. Paranormal investigator John Zaffis claims, "There's been a boom in ghost hunting ever since the Internet took off." Investigators, equipped with electronic gear – like EMF meters, video cameras, and audio recorders – scour reportedly haunted venues, trying to uncover visual and audio evidence of ghosts. Many use portable recording devices in an attempt to capture EVP.

Films involving EVP include Poltergeist, The Sixth Sense, and White Noise.

Sylvio is an indie-developed first-person horror adventure video game released on Steam in June 2015 for Microsoft Windows, PlayStation 4, Xbox One, and OS X, utilizing the Unity engine. The game is about an audio recordist called Juliette Waters, who records the voices of ghosts through electronic voice phenomena. She finds herself trapped in an old family park, shut down since a landslide in 1971, and must use her recorder to survive the night. A sequel, Sylvio 2, was released on October 11, 2017.[citation needed]

Phasmophobia is a co-op horror video game in which a team of one to four players play as ghost hunters who try to identify hostile ghosts in varying locations. The game features a Spirit Box item used to capture EVPs of certain ghost types, which helps the players identify the type of ghost they are dealing with. EVPs in Phasmophobia consist of recorded lines from news broadcasts: "Act of killing", "Elderly victim", and "From far away" are a few of the examples the ghost might give. Alternatively, the game has a sound recorder that can be used to record words or laughter directly from a ghost.[citation needed]

EVP has been featured on television series such as Ghost Whisperer, In Search Of… (1981), The Omega Factor, A Haunting, Ghost Hunters, MonsterQuest, Ghost Adventures, The Secret Saturdays, Fact or Faked: Paranormal Files, Supernatural, Derren Brown Investigates, Ghost Lab and Buzzfeed Unsolved: Supernatural.

Legion, a 1983 novel by William Peter Blatty, contains a subplot in which Dr. Vincent Amfortas, a terminally ill neurologist, leaves a "to-be-opened-upon-my-death" letter for Father Dyer detailing his accounts of contact with the dead, including the doctor's recently deceased wife, Ann, through EVP recordings. Amfortas's character and the EVP subplot do not appear in the film version of the novel, The Exorcist III, although in Kinderman's dream dead people are seen trying to communicate with the living by radio. In Pattern Recognition, a 2003 novel by William Gibson, the main character's mother tries to convince her that her father is communicating with her from recordings after his death/disappearance in the September 11, 2001 attacks. In Nyctivoe, a 2001 vampire-inspired play by Dimitris Lyacos, the male character and his deceased companion speak from a recording device amidst a static/white-noise background. In With the people from the bridge, a 2014 play by Dimitris Lyacos based on the idea of the return of the dead, the voice of the female character NCTV is transmitted from a television monitor amidst a static/white-noise background.

EVP is the subject of Vyktoria Pratt Keating's song "Disembodied Voices on Tape" from her 2003 album Things that Fall from the Sky, produced by Andrew Giddings of Jethro Tull.
Laurie Anderson's "Example #22", from her 1981 album Big Science, interposes spoken sentences and phrases in German with sung passages in English representing EVP. During the outro to "Rubber Ring" by The Smiths, a sample from an EVP recording is repeated. The phrase "You are sleeping, you do not want to believe" is a "translation" of the "spirit voices" from a 1970s flexi disc; the original recording is from the 1971 record which accompanied Raudive's book Breakthrough, and which was re-issued as a flexi disc in the 1980s, free with The Unexplained magazine. Bass Communion's 2004 album Ghosts on Magnetic Tape was inspired by EVP. The band Giles Corey composed the song "Empty Churches", which samples three tracks ("Raymond Cass", track 2; "Justified Theft", track 36; and "Tramping", track 38) from The Ghost Orchid's album An Introduction to EVP, a collection of excerpts from EVP experiments produced by many researchers; although most are unidentified, some have been attributed to better-known EVP researchers, including Friedrich Jürgenson, Raymond Cass and Konstantin Raudive. The 2017 album Katharsis (A Small Victory) by Jarosław Pijarowski's Polish theatre group Teatr Tworzenia contains EVP recordings in the background of its second track, "Katharsis – Pandemonium".
========================================
[SOURCE: https://en.wikipedia.org/wiki/Joke#cite_note-FOOTNOTEJolles1930-4] | [TOKENS: 8460]
Joke

A joke is a display of humour in which words are used within a specific and well-defined narrative structure to make people laugh and is usually not meant to be interpreted literally. It usually takes the form of a story, often with dialogue, and ends in a punch line, whereby the humorous element of the story is revealed; this can be done using a pun or other type of word play, irony or sarcasm, logical incompatibility, hyperbole, or other means. Linguist Robert Hetzron offers the definition:

A joke is a short humorous piece of oral literature in which the funniness culminates in the final sentence, called the punchline… In fact, the main condition is that the tension should reach its highest level at the very end. No continuation relieving the tension should be added. As for its being "oral," it is true that jokes may appear printed, but when further transferred, there is no obligation to reproduce the text verbatim, as in the case of poetry.

It is generally held that jokes benefit from brevity, containing no more detail than is needed to set the scene for the punchline at the end. In the case of riddle jokes or one-liners, the setting is implicitly understood, leaving only the dialogue and punchline to be verbalised. However, subverting these and other common guidelines can also be a source of humour – the shaggy dog story is an example of an anti-joke; although presented as a joke, it contains a long drawn-out narrative of time, place and character, rambles through many pointless inclusions and finally fails to deliver a punchline. Jokes are a form of humour, but not all humour is in the form of a joke. Some humorous forms which are not verbal jokes are: involuntary humour, situational humour, practical jokes, slapstick and anecdotes.

Identified as one of the simple forms of oral literature by the Dutch linguist André Jolles, jokes are passed along anonymously. They are told in both private and public settings; a single person tells a joke to his friend in the natural flow of conversation, or a set of jokes is told to a group as part of scripted entertainment. Jokes are also passed along in written form or, more recently, through the internet. Stand-up comics, comedians and slapstick work with comic timing and rhythm in their performance, and may rely on actions as well as on the verbal punchline to evoke laughter. This distinction has been formulated in the popular saying "A comic says funny things; a comedian says things funny".[note 1]

History in print

Jokes do not belong to refined culture, but rather to the entertainment and leisure of all classes. As such, any printed versions were considered ephemera, i.e., temporary documents created for a specific purpose and intended to be thrown away. Many of these early jokes deal with scatological and sexual topics, entertaining to all social classes but not to be valued and saved.[citation needed]

Various kinds of jokes have been identified in ancient pre-classical texts.[note 2] The oldest identified joke is an ancient Sumerian proverb from 1900 BC containing toilet humour: "Something which has never occurred since time immemorial; a young woman did not fart in her husband's lap." Its records were dated to the Old Babylonian period and the joke may go as far back as 2300 BC. The second oldest joke found, discovered on the Westcar Papyrus and believed to be about Sneferu, was from Ancient Egypt c. 1600 BC: "How do you entertain a bored pharaoh?
You sail a boatload of young women dressed only in fishing nets down the Nile and urge the pharaoh to go catch a fish." The tale of the three ox drivers from Adab completes the three known oldest jokes in the world. This is a comic triple from Adab, dating back to 1200 BC. It concerns three men seeking justice from a king on the matter of ownership over a newborn calf, for whose birth they all consider themselves to be partially responsible. The king seeks advice from a priestess on how to rule the case, and she suggests a series of events involving the men's households and wives. The final portion of the story (which included the punch line) has not survived intact, though legible fragments suggest it was bawdy in nature.

Jokes can be notoriously difficult to translate from language to language, particularly puns, which depend on specific words and not just on their meanings. For instance, Julius Caesar once sold land at a surprisingly cheap price to his lover Servilia, who was rumoured to be prostituting her daughter Tertia to Caesar in order to keep his favour. Cicero remarked that "conparavit Servilia hunc fundum tertia deducta." The punny phrase "tertia deducta" can be translated as "with one-third off (in price)" or "with Tertia putting out."

The earliest extant joke book is the Philogelos (Greek for The Laughter-Lover), a collection of 265 jokes written in crude ancient Greek dating to the fourth or fifth century AD. The author of the collection is obscure and a number of different authors are attributed to it, including "Hierokles and Philagros the grammatikos", just "Hierokles", or, in the Suda, "Philistion". British classicist Mary Beard states that the Philogelos may have been intended as a jokester's handbook of quips to say on the fly, rather than a book meant to be read straight through. Many of the jokes in this collection are surprisingly familiar, even though the typical protagonists are less recognisable to contemporary readers: the absent-minded professor, the eunuch, and people with hernias or bad breath. The Philogelos even contains a joke similar to Monty Python's "Dead Parrot Sketch".

During the 15th century, the printing revolution spread across Europe following the development of the movable type printing press. This was coupled with the growth of literacy in all social classes. Printers turned out jestbooks along with Bibles to meet both lowbrow and highbrow interests of the populace. One early anthology of jokes was the Facetiae by the Italian Poggio Bracciolini, first published in 1470. The popularity of this jest book can be measured by the twenty editions of it documented for the 15th century alone. Another popular form was a collection of jests, jokes and funny situations attributed to a single character in a more connected, narrative form of the picaresque novel. Examples of this are the characters of Rabelais in France, Till Eulenspiegel in Germany, Lazarillo de Tormes in Spain and Master Skelton in England. There is also a jest book ascribed to William Shakespeare, the contents of which appear to both inform and borrow from his plays. All of these early jestbooks corroborate both the rise in literacy of the European populations and the general quest for leisure activities during the Renaissance in Europe. The practice of printers using jokes and cartoons as page fillers was also widely used in the broadsides and chapbooks of the 19th century and earlier.
With the increase in literacy in the general population and the growth of the printing industry, these publications were the most common forms of printed material between the 16th and 19th centuries throughout Europe and North America. Along with reports of events, executions, ballads and verse, they also contained jokes. One of many broadsides archived in the Harvard library is described as "1706. Grinning made easy; or, Funny Dick's unrivalled collection of curious, comical, odd, droll, humorous, witty, whimsical, laughable, and eccentric jests, jokes, bulls, epigrams, &c. With many other descriptions of wit and humour." These cheap publications, ephemera intended for mass distribution, were read alone, read aloud, posted and discarded.

There are many types of joke books in print today; a search on the internet provides a plethora of titles available for purchase. They can be read alone for solitary entertainment, or used to stock up on new jokes to entertain friends. Some people try to find a deeper meaning in jokes, as in Plato and a Platypus Walk into a Bar... Understanding Philosophy Through Jokes.[note 3] However, a deeper meaning is not necessary to appreciate their inherent entertainment value. Magazines frequently use jokes and cartoons as filler for the printed page. Reader's Digest closes out many articles with an (unrelated) joke at the bottom of the article. The New Yorker was first published in 1925 with the stated goal of being a "sophisticated humour magazine" and is still known for its cartoons.

Telling jokes

Telling a joke is a cooperative effort; it requires that the teller and the audience mutually agree in one form or another to understand the narrative which follows as a joke. In a study of conversation analysis, the sociologist Harvey Sacks describes in detail the sequential organisation in the telling of a single joke. "This telling is composed, as for stories, of three serially ordered and adjacently placed types of sequences … the preface [framing], the telling, and the response sequences." Folklorists expand this to include the context of the joking. Who is telling what jokes to whom? And why is he telling them when? The context of the joke-telling in turn leads into a study of joking relationships, a term coined by anthropologists to refer to social groups within a culture who engage in institutionalised banter and joking.

Framing is done with a (frequently formulaic) expression which keys the audience in to expect a joke. "Have you heard the one…", "Reminds me of a joke I heard…", "So, a lawyer and a doctor…"; these conversational markers are just a few examples of linguistic frames used to start a joke. Regardless of the frame used, it creates a social space and clear boundaries around the narrative which follows. Audience response to this initial frame can be acknowledgement and anticipation of the joke to follow. It can also be a dismissal, as in "this is no joking matter" or "this is no time for jokes". The performance frame serves to label joke-telling as a culturally marked form of communication. Both the performer and audience understand it to be set apart from the "real" world.
"An elephant walks into a bar…"; a person sufficiently familiar with both the English language and the way jokes are told automatically understands that such a compressed and formulaic story, being told with no substantiating details, and placing an unlikely combination of characters into an unlikely setting and involving them in an unrealistic plot, is the start of a joke, and the story that follows is not meant to be taken at face value (i.e. it is non-bona-fide communication). The framing itself invokes a play mode; if the audience is unable or unwilling to move into play, then nothing will seem funny. Following its linguistic framing the joke, in the form of a story, can be told. It is not required to be verbatim text like other forms of oral literature such as riddles and proverbs. The teller can and does modify the text of the joke, depending both on memory and the present audience. The important characteristic is that the narrative is succinct, containing only those details which lead directly to an understanding and decoding of the punchline. This requires that it support the same (or similar) divergent scripts which are to be embodied in the punchline. The punchline is intended to make the audience laugh. A linguistic interpretation of this punchline/response is elucidated by Victor Raskin in his Script-based Semantic Theory of Humour. Humour is evoked when a trigger contained in the punchline causes the audience to abruptly shift its understanding of the story from the primary (or more obvious) interpretation to a secondary, opposing interpretation. "The punchline is the pivot on which the joke text turns as it signals the shift between the [semantic] scripts necessary to interpret [re-interpret] the joke text." To produce the humour in the verbal joke, the two interpretations (i.e. scripts) need to both be compatible with the joke text and opposite or incompatible with each other. Thomas R. Shultz, a psychologist, independently expands Raskin's linguistic theory to include "two stages of incongruity: perception and resolution." He explains that "… incongruity alone is insufficient to account for the structure of humour. […] Within this framework, humour appreciation is conceptualized as a biphasic sequence involving first the discovery of incongruity followed by a resolution of the incongruity." In the case of a joke, that resolution generates laughter. This is the point at which the field of neurolinguistics offers some insight into the cognitive processing involved in this abrupt laughter at the punchline. Studies by the cognitive science researchers Coulson and Kutas directly address the theory of script switching articulated by Raskin in their work. The article "Getting it: Human event-related brain response to jokes in good and poor comprehenders" measures brain activity in response to reading jokes. Additional studies by others in the field support more generally the theory of two-stage processing of humour, as evidenced in the longer processing time they require. In the related field of neuroscience, it has been shown that the expression of laughter is caused by two partially independent neuronal pathways: an "involuntary" or "emotionally driven" system and a "voluntary" system. 
It is at this point, the punchline, that the field of neurolinguistics offers some insight into the cognitive processing involved in the abrupt laughter at the punchline. Studies by the cognitive science researchers Coulson and Kutas directly address the theory of script switching articulated by Raskin. Their article "Getting it: Human event-related brain response to jokes in good and poor comprehenders" measures brain activity in response to reading jokes. Additional studies by others in the field support more generally the theory of two-stage processing of humour, as evidenced in the longer processing time it requires. In the related field of neuroscience, it has been shown that the expression of laughter is caused by two partially independent neuronal pathways: an "involuntary" or "emotionally driven" system and a "voluntary" system. This study adds credence to the common experience when exposed to an off-colour joke: a laugh is followed in the next breath by a disclaimer, "Oh, that's bad…" Here the multiple steps in cognition are clearly evident in the stepped response, the perception being processed just a breath faster than the resolution of the moral/ethical content in the joke.

The expected response to a joke is laughter. The joke teller hopes the audience "gets it" and is entertained. This leads to the premise that a joke is actually an "understanding test" between individuals and groups. If the listeners do not get the joke, they are not understanding the two scripts which are contained in the narrative as they were intended. Or they do "get it" and do not laugh; it might be too obscene, too gross or too dumb for the current audience. A woman might respond differently to a joke told by a male colleague around the water cooler than she would to the same joke overheard in a women's lavatory. A joke involving toilet humour may be funnier told on the playground at elementary school than on a college campus. The same joke will elicit different responses in different settings; the punchline remains the same, but it is more or less appropriate depending on the current context.

The context explores the specific social situation in which joking occurs. The narrator automatically modifies the text of the joke to be acceptable to different audiences, while at the same time supporting the same divergent scripts in the punchline. The vocabulary used in telling the same joke at a university fraternity party and to one's grandmother might well vary. In each situation, it is important to identify both the narrator and the audience as well as their relationship with each other. This varies to reflect the complexities of a matrix of different social factors: age, sex, race, ethnicity, kinship, political views, religion, power relationships, etc. When all the potential combinations of such factors between the narrator and the audience are considered, then a single joke can take on infinite shades of meaning for each unique social setting.

The context, however, should not be confused with the function of the joking. "Function is essentially an abstraction made on the basis of a number of contexts". In one long-term observation of men coming off the late shift at a local café, joking with the waitresses was used to ascertain sexual availability for the evening. Different types of jokes, going from general to topical into explicitly sexual humour, signalled openness on the part of the waitress for a connection. This study describes how jokes and joking are used to communicate much more than just good humour. That is a single example of the function of joking in a social setting, but there are others. Sometimes jokes are used simply to get to know someone better: what makes them laugh, what do they find funny? Jokes concerning politics, religion or sexual topics can be used effectively to gauge the attitude of the audience to any one of these topics. They can also be used as a marker of group identity, signalling either inclusion or exclusion for the group. Among pre-adolescents, "dirty" jokes allow them to share information about their changing bodies. And sometimes joking is just simple entertainment for a group of friends.
Relationships

The context of joking in turn leads to a study of joking relationships, a term coined by anthropologists to refer to social groups within a culture who take part in institutionalised banter and joking. These relationships can be either one-way or a mutual back and forth between partners. The joking relationship is defined as a peculiar combination of friendliness and antagonism. The behaviour is such that in any other social context it would express and arouse hostility; but it is not meant seriously and must not be taken seriously. There is a pretence of hostility along with a real friendliness. To put it another way, the relationship is one of permitted disrespect. Joking relationships were first described by anthropologists within kinship groups in Africa, but they have since been identified in cultures around the world, where jokes and joking are used to mark and reinforce appropriate boundaries of a relationship.

Electronic

The advent of electronic communications at the end of the 20th century introduced new traditions into jokes. A verbal joke or cartoon is emailed to a friend or posted on a bulletin board; reactions include a replied email with a :-) or LOL, or a forward to further recipients. Interaction is limited to the computer screen and for the most part solitary. While preserving the text of a joke, both context and variants are lost in internet joking; for the most part, emailed jokes are passed along verbatim. The framing of the joke frequently occurs in the subject line: "RE: laugh for the day" or something similar. The forwarding of an email joke can increase the number of recipients exponentially.

Internet joking forces a re-evaluation of social spaces and social groups. They are no longer defined only by physical presence and locality; they also exist in the connectivity of cyberspace. "The computer networks appear to make possible communities that, although physically dispersed, display attributes of the direct, unconstrained, unofficial exchanges folklorists typically concern themselves with". This is particularly evident in the spread of topical jokes, "that genre of lore in which whole crops of jokes spring up seemingly overnight around some sensational event … flourish briefly and then disappear, as the mass media move on to fresh maimings and new collective tragedies". This correlates with the new understanding of the internet as an "active folkloric space" with evolving social and cultural forces and clearly identifiable performers and audiences.

A study by the folklorist Bill Ellis documented how an evolving joke cycle circulated over the internet. By accessing message boards that specialised in humour immediately following the 9/11 disaster, Ellis was able to observe in real time both the topical jokes being posted electronically and the responses to them. Previous folklore research had been limited to collecting and documenting successful jokes, and only after they had emerged and come to folklorists' attention. Now, an Internet-enhanced collection creates a time machine, as it were, in which we can observe what happens in the period before the risible moment, when attempts at humour are unsuccessful. Access to archived message boards also enables researchers to track the development of a single joke thread in the context of a more complicated virtual conversation.

Joke cycles

A joke cycle is a collection of jokes about a single target or situation which displays consistent narrative structure and type of humour.
Some well-known cycles are elephant jokes using nonsense humour, dead baby jokes incorporating black humour, and light bulb jokes, which describe all kinds of operational stupidity. Joke cycles can centre on ethnic groups, professions (viola jokes), catastrophes, settings (…walks into a bar), absurd characters (wind-up dolls), or logical mechanisms which generate the humour (knock-knock jokes). A joke can be reused in different joke cycles; an example of this is the same Head & Shoulders joke refitted to the tragedies of Vic Morrow, Admiral Mountbatten and the crew of the Challenger space shuttle.[note 4] These cycles seem to appear spontaneously, spread rapidly across countries and borders, and dissipate after some time. Folklorists and others have studied individual joke cycles in an attempt to understand their function and significance within the culture.

Several kinds of joke cycles have circulated in the recent past. As with the 9/11 disaster discussed above, cycles attach themselves to celebrities or national catastrophes such as the death of Diana, Princess of Wales, the death of Michael Jackson, and the Space Shuttle Challenger disaster. These cycles arise regularly as a response to terrible unexpected events which command the national news. An in-depth analysis of the Challenger joke cycle documents a change in the type of humour circulated following the disaster, from February to March 1986. "It shows that the jokes appeared in distinct 'waves', the first responding to the disaster with clever wordplay and the second playing with grim and troubling images associated with the event…The primary social function of disaster jokes appears to be to provide closure to an event that provoked communal grieving, by signalling that it was time to move on and pay attention to more immediate concerns".

The sociologist Christie Davies has written extensively on ethnic jokes told in countries around the world. In ethnic jokes he finds that the "stupid" ethnic target in the joke is no stranger to the culture, but rather a peripheral social group (geographic, economic, cultural, linguistic) well known to the joke tellers. So Americans tell jokes about Polacks and Italians, Germans tell jokes about Ostfriesens, and the English tell jokes about the Irish. In a review of Davies' theories it is said that "For Davies, [ethnic] jokes are more about how joke tellers imagine themselves than about how they imagine those others who serve as their putative targets…The jokes thus serve to center one in the world – to remind people of their place and to reassure them that they are in it."

A third category of joke cycles identifies absurd characters as the butt: for example, the grape, the dead baby or the elephant. Beginning in the 1960s, social and cultural interpretations of these joke cycles, spearheaded by the folklorist Alan Dundes, began to appear in academic journals. Dead baby jokes are posited to reflect societal changes and guilt caused by widespread use of contraception and abortion beginning in the 1960s.[note 5] Elephant jokes have been interpreted variously as stand-ins for American blacks during the Civil Rights Era or as an "image of something large and wild abroad in the land captur[ing] the sense of counterculture" of the sixties. These interpretations strive for a cultural understanding of the themes of these jokes which goes beyond the simple collection and documentation undertaken previously by folklorists and ethnologists.
Classification systems

As folktales and other types of oral literature became collectables throughout Europe in the 19th century (Brothers Grimm et al.), folklorists and anthropologists of the time needed a system to organise these items. The Aarne–Thompson classification system was first published in 1910 by Antti Aarne, and later expanded by Stith Thompson to become the most renowned classification system for European folktales and other types of oral literature. Its final section addresses anecdotes and jokes, listing traditional humorous tales ordered by their protagonist: "This section of the Index is essentially a classification of the older European jests, or merry tales – humorous stories characterized by short, fairly simple plots. …" Due to its focus on older tale types and obsolete actors (e.g., the numbskull), the Aarne–Thompson Index does not provide much help in identifying and classifying the modern joke.

A more granular classification system used widely by folklorists and cultural anthropologists is the Thompson Motif Index, which separates tales into their individual story elements. This system enables jokes to be classified according to the individual motifs included in the narrative: actors, items and incidents. It does not provide a way to classify a text by more than one element at a time, while at the same time making it theoretically possible to classify the same text under multiple motifs. The Thompson Motif Index has spawned further specialised motif indices, each of which focuses on a single aspect of one subset of jokes. A sampling of just a few of these specialised indices has been listed under other motif indices: one can select an index for medieval Spanish folk narratives, another for linguistic verbal jokes, and a third for sexual humour. To assist the researcher with this increasingly confusing situation, there are also multiple bibliographies of indices as well as a how-to guide on creating one's own index.

Several difficulties have been identified with these systems of identifying oral narratives according to either tale types or story elements. The first major problem is their hierarchical organisation: one element of the narrative is selected as the major element, while all other parts are arrayed subordinate to it. A second problem is that the listed motifs are not qualitatively equal; actors, items and incidents are all considered side by side. And because incidents always have at least one actor and usually an item, most narratives can be ordered under multiple headings. This leads to confusion about both where to order an item and where to find it. A third significant problem is that the "excessive prudery" common in the middle of the 20th century meant that obscene, sexual and scatological elements were regularly ignored in many of the indices.

The folklorist Robert Georges has summed up the concerns with these existing classification systems: …Yet what the multiplicity and variety of sets and subsets reveal is that folklore [jokes] not only takes many forms, but that it is also multifaceted, with purpose, use, structure, content, style, and function all being relevant and important. Any one or combination of these multiple and varied aspects of a folklore example [such as jokes] might emerge as dominant in a specific situation or for a particular inquiry.
It has proven difficult to organise all the different elements of a joke into a multi-dimensional classification system which could be of real value in the study and evaluation of this (primarily oral) complex narrative form. The General Theory of Verbal Humour (GTVH), developed by the linguists Victor Raskin and Salvatore Attardo, attempts to do exactly this. This classification system was developed specifically for jokes and later expanded to include longer types of humorous narratives. Six different aspects of the narrative, labelled Knowledge Resources (KRs), can be evaluated largely independently of each other and then combined into a concatenated classification label. These six KRs of the joke structure are Language (LA), Narrative Strategy (NS), Target (TA), Situation (SI), Logical Mechanism (LM) and Script Opposition (SO).

As development of the GTVH progressed, a hierarchy of the KRs was established to partially restrict the options for lower-level KRs depending on the KRs defined above them. For example, a lightbulb joke (SI) will always be in the form of a riddle (NS). Outside of these restrictions, the KRs can create a multitude of combinations, enabling a researcher to select jokes for analysis which contain only one or two defined KRs. It also allows for an evaluation of the similarity or dissimilarity of jokes depending on the similarity of their labels. "The GTVH presents itself as a mechanism … of generating [or describing] an infinite number of jokes by combining the various values that each parameter can take. … Descriptively, to analyze a joke in the GTVH consists of listing the values of the 6 KRs (with the caveat that TA and LM may be empty)." This classification system provides a functional multi-dimensional label for any joke, and indeed any verbal humour.
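As a purely illustrative rendering of this idea, the sketch below treats a GTVH-style label as a record of six KR values and compares two jokes by counting matching KRs. The field values are invented placeholders; actual GTVH analysis assigns KR values by linguistic argument, not by string comparison.

    # Illustrative sketch of a GTVH-style label: six Knowledge Resources
    # combined into one classification record (all values are invented).
    from dataclasses import dataclass, fields
    from typing import Optional

    @dataclass(frozen=True)
    class GTVHLabel:
        script_opposition: str            # SO, e.g. "smart/dumb"
        logical_mechanism: Optional[str]  # LM, may be empty per the GTVH
        situation: str                    # SI, e.g. "changing a lightbulb"
        target: Optional[str]             # TA, may be empty per the GTVH
        narrative_strategy: str           # NS, e.g. "riddle"
        language: str                     # LA, the actual wording

    def kr_similarity(a: GTVHLabel, b: GTVHLabel) -> int:
        """Count shared KR values: more matches = more similar jokes."""
        return sum(getattr(a, f.name) == getattr(b, f.name) for f in fields(a))

    bulb1 = GTVHLabel("smart/dumb", "faulty reasoning", "changing a lightbulb",
                      "group X", "riddle", "How many X does it take ...")
    bulb2 = GTVHLabel("smart/dumb", "faulty reasoning", "changing a lightbulb",
                      "group Y", "riddle", "How many Y does it take ...")
    print(kr_similarity(bulb1, bulb2))  # 4 of 6 KRs match

Two lightbulb jokes aimed at different targets share four of six KRs, which matches the intuition that they are variants within one cycle rather than unrelated jokes.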
Joke and humour research

Many academic disciplines lay claim to the study of jokes (and other forms of humour) as within their purview. Fortunately, there are enough jokes, good, bad and worse, to go around. The studies of jokes from each of the interested disciplines bring to mind the tale of the blind men and an elephant, where the observations, although accurate reflections of their own competent methodological inquiry, frequently fail to grasp the beast in its entirety. This attests to the joke as a traditional narrative form which is indeed complex, concise and complete in and of itself. It requires a "multidisciplinary, interdisciplinary, and cross-disciplinary field of inquiry" to truly appreciate these nuggets of cultural insight.[note 6]

Sigmund Freud was one of the first modern scholars to recognise jokes as an important object of investigation. In his 1905 study Jokes and their Relation to the Unconscious, Freud describes the social nature of humour and illustrates his text with many examples of contemporary Viennese jokes. His work is particularly noteworthy in this context because Freud distinguishes in his writings between jokes, humour and the comic. These are distinctions which become easily blurred in many subsequent studies, where everything funny tends to be gathered under the umbrella term of "humour", making for a much more diffuse discussion.

Since the publication of Freud's study, psychologists have continued to explore humour and jokes in their quest to explain, predict and control an individual's "sense of humour". Why do people laugh? Why do people find something funny? Can jokes predict character, or, vice versa, can character predict the jokes an individual laughs at? What is a "sense of humour"? A current review of the popular magazine Psychology Today lists over 200 articles discussing various aspects of humour; in psychological jargon, the subject area has become both an emotion to measure and a tool to use in diagnostics and treatment. A new psychological assessment tool, the Values in Action Inventory developed by the American psychologists Christopher Peterson and Martin Seligman, includes humour (and playfulness) as one of the core character strengths of an individual. As such, it could be a good predictor of life satisfaction. For psychologists, it would be useful to measure both how much of this strength an individual has and how it can be measurably increased.

A 2007 survey of existing tools to measure humour identified more than 60 psychological measurement instruments. These measurement tools use many different approaches to quantify humour along with its related states and traits. There are tools to measure an individual's physical response by their smile; the Facial Action Coding System (FACS) is one of several tools used to identify any one of multiple types of smiles. The laugh can also be measured to calculate the funniness of an individual's response; multiple types of laughter have been identified. It must be stressed here that both smiles and laughter are not always a response to something funny. In trying to develop a measurement tool, most systems use "jokes and cartoons" as their test materials. However, because no two tools use the same jokes, and across languages this would not be feasible, how does one determine that the assessment objects are comparable? Whom does one ask to rate the sense of humour of an individual: the person themselves, an impartial observer, or their family, friends and colleagues? Furthermore, has the current mood of the test subjects been considered? Someone with a recent death in the family might not be much inclined to laughter. Given the plethora of variants revealed by even a superficial glance at the problem, it becomes evident that these paths of scientific inquiry are mined with problematic pitfalls and questionable solutions.

The psychologist Willibald Ruch has been very active in the research of humour. He has collaborated with the linguists Raskin and Attardo on their General Theory of Verbal Humour (GTVH) classification system. Their goal is to empirically test both the six autonomous classification types (KRs) and the hierarchical ordering of these KRs. Advancement in this direction would be a win-win for both fields of study: linguistics would gain empirical verification of this multi-dimensional classification system for jokes, and psychology would gain a standardised joke classification with which to develop verifiably comparable measurement tools.

"The linguistics of humor has made gigantic strides forward in the last decade and a half and replaced the psychology of humor as the most advanced theoretical approach to the study of this important and universal human faculty." This recent statement by one noted linguist and humour researcher describes, from his perspective, contemporary linguistic humour research. Linguists study words, how words are strung together to build sentences, how sentences create meaning which can be communicated from one individual to another, and how our interaction with each other using words creates discourse. Jokes have been defined above as oral narratives in which words and sentences are engineered to build toward a punchline.
The linguist's question is: what exactly makes the punchline funny? This question focuses on how the words used in the punchline create humour, in contrast to the psychologist's concern (see above) with the audience's response to the punchline. The assessment of humour by psychologists "is made from the individual's perspective; e.g. the phenomenon associated with responding to or creating humor and not a description of humor itself." Linguistics, on the other hand, endeavours to provide a precise description of what makes a text funny.

Two major new linguistic theories have been developed and tested within the last decades. The first was advanced by Victor Raskin in Semantic Mechanisms of Humor, published in 1985. While a variant on the more general concepts of the incongruity theory of humour, it is the first theory to identify its approach as exclusively linguistic. The Script-based Semantic Theory of Humour (SSTH) begins by identifying the two linguistic conditions which make a text funny, and then goes on to identify the mechanisms involved in creating the punchline. This theory established the semantic/pragmatic foundation of humour as well as the humour competence of speakers.[note 7]

Several years later the SSTH was incorporated into a more expansive theory of jokes put forth by Raskin and his colleague Salvatore Attardo. In the General Theory of Verbal Humour, the SSTH's central notion was relabelled Script Opposition (SO), referring to the opposition between the different linguistic scripts in the joke, and joined by five other independent Knowledge Resources (KRs). Together these six KRs can function as a multi-dimensional descriptive label for any piece of humorous text.

Linguistics has developed further methodological tools which can be applied to jokes: discourse analysis and conversation analysis of joking. Both of these subspecialties within the field focus on "naturally occurring" language use, i.e. the analysis of real (usually recorded) conversations. One of these studies has already been discussed above, where Harvey Sacks describes in detail the sequential organisation in telling a single joke. Discourse analysis emphasises the entire context of social joking, the social interaction which cradles the words.

Folklore and cultural anthropology have perhaps the strongest claims on jokes as belonging to their bailiwick. Jokes remain one of the few remaining forms of traditional folk literature transmitted orally in western cultures. Identified as one of the "simple forms" of oral literature by André Jolles in 1930, they have been collected and studied since there were folklorists and anthropologists abroad in the lands. As a genre they were important enough at the beginning of the 20th century to be included under their own heading in the Aarne–Thompson index, first published in 1910: Anecdotes and jokes.

Beginning in the 1960s, cultural researchers began to expand their role from collectors and archivists of "folk ideas" to the more active role of interpreters of cultural artefacts. One of the foremost scholars active during this transitional time was the folklorist Alan Dundes. He started asking questions of tradition and transmission with the key observation that "No piece of folklore continues to be transmitted unless it means something, even if neither the speaker nor the audience can articulate what that meaning might be." In the context of jokes, this then becomes the basis for further research. Why is the joke told right now?
Only in this expanded perspective is an understanding of the joke's meaning to the participants possible. This questioning resulted in a blossoming of monographs exploring the significance of many joke cycles. What is so funny about absurd nonsense elephant jokes? Why make light of dead babies? In an article on contemporary German jokes about Auschwitz and the Holocaust, Dundes justifies this research: "Whether one finds Auschwitz jokes funny or not is not an issue. This material exists and should be recorded. Jokes are always an important barometer of the attitudes of a group. The jokes exist and they obviously must fill some psychic need for those individuals who tell them and those who listen to them."

A stimulating generation of new humour theories flourishes like mushrooms in the undergrowth: Elliott Oring's theoretical discussions on "appropriate ambiguity" and Amy Carrell's hypothesis of an "audience-based theory of verbal humor" (1993), to name just a few. In his book Humor and Laughter: An Anthropological Approach, the anthropologist Mahadev Apte presents a solid case for his own academic perspective: "Two axioms underlie my discussion, namely, that humor is by and large culture based and that humor can be a major conceptual and methodological tool for gaining insights into cultural systems." Apte goes on to call for legitimising the field of humour research as "humorology"; this would be a field of study incorporating the interdisciplinary character of humour studies. While the label "humorology" has yet to become a household word, great strides are being made in the international recognition of this interdisciplinary field of research. The International Society for Humor Studies was founded in 1989 with the stated purpose to "promote, stimulate and encourage the interdisciplinary study of humour; to support and cooperate with local, national, and international organizations having similar purposes; to organize and arrange meetings; and to issue and encourage publications concerning the purpose of the society". It also publishes Humor: International Journal of Humor Research and holds yearly conferences to promote and inform its speciality.

In 1872, Charles Darwin published one of the first "comprehensive and in many ways remarkably accurate description[s] of laughter in terms of respiration, vocalization, facial action and gesture and posture" in The Expression of the Emotions in Man and Animals. In this early study Darwin raises further questions about who laughs and why they laugh; the myriad responses since then illustrate the complexities of this behaviour. To understand laughter in humans and other primates, the science of gelotology (from the Greek gelos, meaning laughter) has been established; it is the study of laughter and its effects on the body from both a psychological and a physiological perspective. While jokes can provoke laughter, laughter cannot be used as a one-to-one marker of jokes, because there are multiple stimuli to laughter, humour being just one of them. The other six causes of laughter listed are social context, ignorance, anxiety, derision, acting apology, and tickling. As such, the study of laughter is a secondary, albeit entertaining, perspective in an understanding of jokes.

Computational humour is a new field of study which uses computers to model humour; it bridges the disciplines of computational linguistics and artificial intelligence.
A primary ambition of this field is to develop computer programs which can both generate a joke and recognise a text snippet as a joke. Early programming attempts dealt almost exclusively with punning, because punning lends itself to simple, straightforward rules. These primitive programs display no intelligence; instead, they work from a template with a finite set of pre-defined punning options upon which to build. More sophisticated computer joke programs have yet to be developed.

Based on our understanding of the SSTH/GTVH humour theories, it is easy to see why. The linguistic scripts (a.k.a. frames) referenced in these theories include, for any given word, a "large chunk of semantic information surrounding the word and evoked by it [...] a cognitive structure internalized by the native speaker". These scripts extend much further than the lexical definition of a word; they contain the speaker's complete knowledge of the concept as it exists in their world. As insentient machines, computers lack the encyclopaedic scripts which humans gain through life experience. They also lack the ability to gather the experiences needed to build wide-ranging semantic scripts and to understand language in a broader context, a context that any child picks up in daily interaction with their environment. Further development in this field must wait until computational linguists have succeeded in programming a computer with an ontological semantic natural language processing system. It is only "the most complex linguistic structures [which] can serve any formal and/or computational treatment of humor well". Toy systems (i.e. dummy punning programs) are completely inadequate to the task. Although the field of computational humour is small and underdeveloped, it is encouraging to note the many interdisciplinary efforts currently underway.
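To make the template-driven approach concrete, here is a minimal sketch in Python of how such a punning program might work. Everything in it (the knock-knock template, the three lexicon entries, and the function name) is invented for illustration and does not reproduce any specific historical system:

    # A minimal sketch of template-based pun generation, in the spirit of
    # the early punning programs described above. The lexicon and template
    # are invented; real systems drew on far richer lexical resources.

    # Each pair maps a word to a phrase beginning with a near-homophone of
    # that word. This finite list is all the "humour" the program knows.
    LEXICON = [
        ("Lettuce", "Lettuce in, it's cold out here"),
        ("Tuna", "Tuna piano and it will sound better"),
        ("Canoe", "Canoe come out and play"),
    ]

    TEMPLATE = "Knock, knock. Who's there? {word}. {word} who? {phrase}!"

    def generate_jokes():
        """Instantiate the fixed template with each pre-defined option.

        No semantic scripts are involved, which is exactly the limitation
        the SSTH/GTVH discussion above points to.
        """
        return [TEMPLATE.format(word=word, phrase=phrase)
                for word, phrase in LEXICON]

    if __name__ == "__main__":
        for joke in generate_jokes():
            print(joke)

The sketch also makes the limitation visible: the program can only permute a closed list, and nothing in it represents the opposed semantic scripts that the linguistic theories above identify as the actual source of the humour.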
========================================
[SOURCE: https://en.wikipedia.org/wiki/Andr%C3%A9_Jolles] | [TOKENS: 1143]
André Jolles Johannes Andreas Jolles, known as André Jolles (August 7, 1874 – February 22, 1946), was a Dutch-German art historian, literary critic and linguist who was affiliated with the Nazi Party. He is best known for his work on Einfache Formen [de] (transl. Simple Forms).

Life Jolles was born on August 7, 1874, in Den Helder, Netherlands. His father, Hendrik Jolle Jolles, died on February 25, 1888, in Naples. Jolles grew up as an only child with his mother, Jacoba Cornelia Singles (1847–1901), in Amsterdam, where he attended the Barlaeus Gymnasium. In the 1890s, he worked on the magazines Van Nu en Straks and De Kroniek, in 1893 and 1895 respectively, and was the editor for art and science at De Telegraaf from 1897 to 1898. He studied Egyptian and Semitic languages in Paris and Amsterdam from 1893 to 1894 and again in 1899 at the University of Leiden. In 1896, Jolles met Johan Huizinga in Groningen, who became a long-time friend. On a trip to Italy with Huizinga in 1899, he met his future wife, Mathilde "Tilli" Mönckeberg (1879–1958). They married in September 1900. Their first son, Hendrik, was born in June 1901 but died a year later. After that, they had five children: Jeltje, Jacoba, Jan Andries, Matthijs and Ruth.

Jolles, who became wealthy after his mother's death in 1901, began studying at the University of Freiburg im Breisgau, where he received his doctorate on August 3, 1905, with a thesis on Vitruvian aesthetics under Otto Puchstein. He gave his habilitation lecture, "On the narrative and the descriptive element in the fine arts in antiquity and the Middle Ages", in Freiburg (January 1907), and his habilitation thesis, The Egyptian-Mycenaean Ceremonial Vessels, appeared in 1908. Additionally, he co-wrote the pieces Vielliebchen and Alkestis with Carl Mönckeberg, both of which were staged in Hamburg.[clarification needed] His family moved to Berlin in 1908, where he taught from 1909 as a private lecturer on ancient art history at Friedrich-Wilhelm University.

When World War I began, the 40-year-old Jolles registered as a Dutch volunteer; an artillery regiment accepted him after several rejections. He was naturalised and served first as an ordinary soldier and finally as a lieutenant in the Landwehr. In 1916, as an officer in the occupying forces, he accepted a professorship in classical archaeology and art history at the University of Ghent. In 1920, he was sentenced in absentia to 15 years of forced labour in Ghent.[citation needed] While in Ghent, he lived with Margarethe "Grittli" Boecklen (1895–1967). After divorcing Mönckeberg, he married Boecklen in August 1918, shortly after the birth of their first child, Barbara.

Jolles became a professor of Flemish and Dutch language and literature at Leipzig University. In 1923 he also became professor of comparative history of literature. In 1930, he published his main work, Simple Forms (Einfache Formen), in which he set out a typology of oral narrative forms (myth, legend, fairy tale, memorabile, case, riddle, saying, joke). As stated in the preface, the book originated from Jolles' lectures, which Drs. Elisabeth Kutzer and Otto Görner wrote down and edited. Jolles' further considerations about the art forms were not substantial enough to be published.[citation needed] Simple Forms places Jolles in the company of Ernst Cassirer, Vladimir Propp, and other precursors of structuralism. Using anthropology and literary theory, they investigated the origins of aesthetics.
The book was translated into English only in 2017, "too late" in the words of Fredric Jameson.

On May 1, 1933, Jolles joined the Nazi Party, estranging several friends as well as his children from his first marriage: Jeltje was married to a Jewish engineer, and Jan Andries, a communist, was forced into exile in South America under a Spanish passport with the alias 'Manuel Enrique Cazón Arribar'. In 1937, Jolles joined the Sicherheitsdienst (SD), the intelligence agency of the Nazi Party and the SS. He retired in 1941 and, from 1942, worked on behalf of the SD on a study of Freemasonry. On his 70th birthday in 1944, he received the Goethe Medal for Art and Science from Hitler.[citation needed] On a questionnaire about his Nazi past that he filled out in May 1945, it is noted in handwriting: "is still a Nazi – too old (71 years) to be arrested". André Jolles committed suicide on February 22, 1946, in Leipzig, Germany.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Software_release_life_cycle] | [TOKENS: 2518]
Software release life cycle The software release life cycle is the process of developing, testing, and distributing a software product (e.g., an operating system). It typically consists of several stages, such as pre-alpha, alpha, beta, and release candidate, before the final version, or "gold", is released to the public.

Pre-alpha refers to the early stages of development, when the software is still being designed and built. Alpha testing is the first phase of formal testing, during which the software is tested internally using white-box techniques. Beta testing is the next phase, in which the software is tested by a larger group of users, typically outside the organization that developed it. The beta phase is focused on reducing impacts on users and may include usability testing. After beta testing, the software may go through one or more release candidate phases, in which it is refined and tested further, before the final version is released. Some software, particularly in the internet and technology industries, is released in a perpetual beta state, meaning that it is continuously being updated and improved and is never considered a fully completed product. This approach allows for a more agile development process and enables the software to be released to and used by users earlier in the development cycle.

Stages of development Pre-alpha refers to all activities performed during the software project before formal testing. These activities can include requirements analysis, software design, software development, and unit testing. In typical open source development, there are several types of pre-alpha versions. Milestone versions include specific sets of functions and are released as soon as the features are complete.[citation needed]

The alpha phase of the release life cycle is the first phase of software testing (alpha is the first letter of the Greek alphabet, used as the number 1). In this phase, developers generally test the software using white-box techniques. Additional validation is then performed using black-box or gray-box techniques by another testing team. Moving to black-box testing inside the organization is known as alpha release. Alpha software is not thoroughly tested by the developer before it is released to customers; it may contain serious errors, and any resulting instability could cause crashes or data loss. Alpha software may also not contain all of the features planned for the final version. In general, external availability of alpha software is uncommon for proprietary software, while open source software often has publicly available alpha versions. The alpha phase usually ends with a feature freeze, indicating that no more features will be added to the software. At this time, the software is said to be feature-complete. A beta test is carried out following acceptance testing at the supplier's site (the alpha test) and immediately before the general release of the software as a product. A feature-complete (FC) version of a piece of software has all of its planned or primary features implemented but is not yet final due to bugs, performance or stability issues. This occurs at the end of alpha testing. Usually, feature-complete software still has to undergo beta testing and bug fixing, as well as performance or stability enhancement, before it can go to release candidate and finally gold status.
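Because the stages described above form a strict progression, they can be modelled directly in code. The following minimal sketch in Python is our own illustration (the enum, its ordering, and the helper function are invented, not any standard library's API) of the pipeline running from pre-alpha to the stable release:

    from enum import IntEnum

    class ReleaseStage(IntEnum):
        """Stages of the release life cycle, ordered earliest to latest.

        The names and ordering follow the prose above; the class itself
        is purely illustrative.
        """
        PRE_ALPHA = 1          # design, development, unit testing
        ALPHA = 2              # internal white-box/black-box testing
        BETA = 3               # feature-complete, tested by outside users
        RELEASE_CANDIDATE = 4  # potentially stable, no known showstoppers
        STABLE = 5             # the production ("gold") release

    def available_outside_organization(stage: ReleaseStage) -> bool:
        # Per the text, the beta release is typically the first time the
        # software is available outside the developing organization.
        return stage >= ReleaseStage.BETA

    assert not available_outside_organization(ReleaseStage.ALPHA)
    assert available_outside_organization(ReleaseStage.RELEASE_CANDIDATE)

Using IntEnum means the comparison operators encode the stage ordering directly; a perpetual-beta product, as described above, is simply one that never advances past BETA.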
Beta, named after the second letter of the Greek alphabet, is the software development phase following alpha. A beta phase generally begins when the software is feature-complete but likely to contain several known or unknown bugs. Software in the beta phase will generally have many more bugs than completed software, as well as speed or performance issues, and may still cause crashes or data loss. The focus of beta testing is reducing impacts on users, often incorporating usability testing. The process of delivering a beta version to the users is called beta release, and it is typically the first time that the software is available outside of the organization that developed it. Software beta releases can be either open or closed, depending on whether they are openly available or available only to a limited audience. Beta version software is often useful for demonstrations and previews within an organization and to prospective customers. Some developers refer to this stage as a preview, preview release, prototype, technical preview, or technology preview (TP).

Beta testers are people who actively report issues with beta software. They are usually customers or representatives of prospective customers of the organization that develops the software. Beta testers tend to volunteer their services free of charge but often receive versions of the product they test, discounts on the release version, or other incentives. Some software is kept in so-called perpetual beta, where new features are continually added without establishing a final "stable" release. As the Internet has facilitated the rapid and inexpensive distribution of software, companies have begun to take a looser approach to their use of the word beta.

A release candidate (RC), also known as gamma testing or "going silver", is a beta version with the potential to be a stable product, which is ready to release unless significant bugs emerge. In this stage of product stabilization, all product features have been designed, coded, and tested through one or more beta cycles, with no known showstopper-class bugs. A release is called code complete when the development team agrees that no entirely new source code will be added to the release. There could still be source code changes to fix defects, changes to documentation and data files, and peripheral code for test cases or utilities.[citation needed]

Also called the production release, the stable release is the last release candidate (RC) which has passed all stages of verification and testing. Any known remaining bugs are considered acceptable. This release goes to production. Some software products (e.g. Linux distributions such as Debian) also have long-term support (LTS) releases, which are based on full releases that have already been tried and tested and receive only security updates.[citation needed]

Release Once released, the software is generally known as a "stable release". The formal term often depends on the method of release: physical media, online release, or a web application. Usually the released software is assigned an official version name or version number. (Pre-release software may or may not have a separate internal project code name or internal version number.) The term "release to manufacturing" (RTM), also known as "going gold", is used when a software product is ready to be delivered. This build may be digitally signed, allowing the end user to verify the integrity and authenticity of the software purchase.
The RTM build, known as the "gold master" or GM, is sent for mass duplication or disc replication, if applicable. The terminology is taken from the audio record-making industry, specifically the process of mastering. RTM precedes general availability (GA), when the product is released to the public. A golden master build (GM) is typically the final build of a piece of software in the beta stages for developers. Typically, for iOS, it is the final build before a major release; however, there have been a few exceptions.

RTM is typically used in retail mass-production software contexts (as opposed to specialized software production or projects in commercial or government production and distribution), often where the software is sold as part of a bundle in a related computer hardware sale and the software and hardware are ultimately to be sold to the public at retail stores; it indicates that the software has met a defined quality level and is ready for mass retail distribution. In other contexts, RTM can also mean that the software has been delivered or released to a client or customer for installation or distribution to the related end-user computers or machines. The term does not define the delivery mechanism or volume; it only states that the quality is sufficient for mass distribution. The deliverable from the engineering organization is frequently in the form of golden master media used for duplication or to produce the image for the web.

General availability (GA) is the marketing stage at which all necessary commercialization activities have been completed and a software product is available for purchase, depending, however, on language, region, and electronic vs. media availability. Commercialization activities can include security and compliance tests, as well as localization and worldwide availability. The time between RTM and GA can range from days to months, due to the time needed to complete all commercialization activities required by GA. At this stage, the software has "gone live".

Release to the Web (RTW), or Web release, is a means of software delivery that uses the Internet for distribution; no physical media are produced by the manufacturer in this type of release mechanism. Web releases have become more common as Internet usage has grown.[citation needed]

Support During its supported lifetime, software is sometimes subjected to service releases, patches or service packs, sometimes also called "interim releases" or "maintenance releases" (MR). For example, Microsoft released three major service packs for the 32-bit editions of Windows XP and two service packs for the 64-bit editions. Such service releases contain a collection of updates, fixes, and enhancements, delivered in the form of a single installable package. They may also implement new features. Some software is released with the expectation of regular support. Classes of software that generally involve protracted support as the norm include anti-virus suites and massively multiplayer online games. Continuing with the Windows XP example: extended support for the product ended on April 8, 2014, but Microsoft offered paid updates for five more years, meaning that support ended definitively on April 8, 2019.
When software is no longer sold or supported, the product is said to have reached end-of-life, to be discontinued, retired, deprecated, abandoned, or obsolete, but user loyalty may continue its existence for some time, even long after its platform is obsolete, as with the Common Desktop Environment and the Sinclair ZX Spectrum. After the end-of-life date, the developer will usually not implement any new features, fix existing defects, bugs, or vulnerabilities (whether known before that date or not), or provide any support for the product. If the developer wishes, they may release the source code so that the platform may be maintained by volunteers.

History Usage of the "alpha/beta" test terminology originated at IBM.[citation needed] Similar terminologies for IBM's software development were used by people involved with IBM from at least the 1950s (and probably earlier). The "A" test was the verification of a new product before its public announcement, the "B" test was the verification before releasing the product to be manufactured, and the "C" test was the final test before the general availability of the product. As software became a significant part of IBM's offerings, the alpha test terminology was used to denote the pre-announcement test and the beta test was used to show product readiness for general availability. Martin Belsky, a manager on some of IBM's earlier software projects, claimed to have invented the terminology. IBM dropped the alpha/beta terminology during the 1960s, but by then it had received fairly wide notice. IBM did not use the term "beta test" for testing done by customers; rather, it used the term "field test".

Major public betas developed afterward, with early customers having purchased a "pioneer edition" of the WordVision word processor for the IBM PC for $49.95. In 1984, Stephen Manes wrote that "in a brilliant marketing coup, Bruce and James Program Publishers managed to get people to pay for the privilege of testing the product." In September 2000, a boxed version of Apple's Mac OS X Public Beta operating system was released. Between September 2005 and May 2006, Microsoft released community technology previews (CTPs) for Windows Vista. From 2009 to 2011, Minecraft was in public beta.

In February 2005, ZDNet published an article about the phenomenon of a beta version often staying for years and being used as if it were at the production level. It noted that Gmail and Google News, for example, had been in beta for a long time although widely used; Google News left beta in January 2006, followed by Google Apps (now named Google Workspace), including Gmail, in July 2009. Since the introduction of Windows 8, Microsoft has called pre-release software a preview rather than a beta. All pre-release builds released through the Windows Insider Program launched in 2014 are termed "Insider Preview builds". "Beta" may also indicate something more like a release candidate, or serve as a form of time-limited demo or marketing technique.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Talk:United_States] | [TOKENS: 2341]
Talk:United States Article Name, Article Introduction, Human Rights, Culture Within Wikipedia articles it may be appropriate to add a modifier such as "oldest continuous, federal ..."; however, it is more useful to explain the strength and influence of the U.S. Constitution and political system both domestically and globally. One must also be careful using the word "democratic" due to the limited franchise in early U.S. history, and better explain the pioneering expansion of the democratic system and its subsequent influence. References

Warning: active arbitration remedies The following restrictions apply to everyone editing this article: Editors who repeatedly or seriously fail to adhere to the purpose of Wikipedia, any expected standards of behaviour, or any normal editorial process may be blocked or restricted by an administrator.

Demonym I think we should reconsider having only "American" listed as the demonym for this entry. As animosity is mounting against this country, more accurate names are being used intentionally to differentiate between inhabitants of the Americas (Americans) and citizens of the United States of America (also inaccurately referred to as Americans). "Citizens of the United States" etc. would be verbose and differ from each usage of the demonym. I've seen "Usanian" used a lot recently and that could perhaps be listed as an alternative to "American". ~2026-37622-9 (talk) 01:47, 18 January 2026 (UTC)[reply]

"The United States is now a competitive authoritarian system" Several sources on this, including this one. I'm curious how many sources it's going to take to change this article. Trump's America is a postliberal, electoral autocracy. There's no doubt about it. Viriditas (talk) 22:51, 22 January 2026 (UTC)[reply] @Tarlby: I believe this source might meet your criteria. (Oct 2025) It certainly meets my own. Viriditas (talk) 23:54, 22 January 2026 (UTC)[reply] What a load of bullshit you're writing, then you can say it's like the Chinese government!!!--Dorian88A (talk) 03:00, 23 January 2026 (UTC)[reply]

Quote: The USA dropped below the "democracy threshold" (+6) on the POLITY scale in 2020 and was considered an anocracy (+5) at the end of the year 2020; the USA score for 2021 returned to democracy (+8). Beginning on 1 July 2024, due to the US Supreme Court ruling granting the US Presidency broad, legal immunity, the USA is noted by the Polity Project as experiencing a regime transition through, at least, 20 January 2025. As of the latter date, the USA is coded EXREC=8, "Competitive Elections"; EXCONST=1 "Unlimited Executive Authority"; and POLCOMP=6 "Factional/Restricted Competition." Polity scores: DEMOC=4; AUTOC=4; POLITY=0. The USA is no longer considered a democracy and lies at the cusp of autocracy; it has experienced a Presidential Coup and an Adverse Regime Change event (8-point drop in its POLITY score).

Quote: Regarding the United States, once a global symbol of democracy, Lindberg said, "The United States, by my analysis, at this point is no longer a democracy." He went further to claim that the US is an "electoral autocracy." An electoral autocracy is a country that formally holds elections, but those elections are not fair or just and thereby do not guarantee actual democratic competition. He went on to clarify that it's possible to defend democracy without the US.

Quote: As the graph below indicates, the current U.S.
democracy rating of 54 among experts is closest to the 44 rating experts gave to our hypothetical illiberal democracy ("Country B"). Experts put the U.S. at approximately equal distance from the strong democracy ("Country A"), which received an average rating of 92, and the non-democracy ("Country C"), which received an average rating of 18.

Quote: Century's New Democracy Meter Shows America Took an Authoritarian Turn in 2025. In the first year of Trump 2.0, the United States went from being a passing if imperfect democracy to behaving like an authoritarian state: breaking the law, ignoring court rulings, engaging in grand corruption, targeting critics for persecution, and conducting a campaign against immigrants [...] that flagrantly violates civil rights. Crucially, elections are still free, providing for the time being an avenue to reverse the democratic decline.

Quote: Professor Christina Pagel mapped the first actions of the Trump administration in a Venn diagram that identifies "five broad domains that correspond to features of proto-authoritarian states". These five domains are: undermining democratic institutions and the rule of law, dismantling federal government; dismantling social protections and rights, enrichment and corruption; suppressing dissent and controlling information; attacking science, environment, health, arts and education, particularly universities; aggressive foreign policy and global destabilization.

Quote: "[It is] a process of regime change towards autocracy that makes the exercise of political power more arbitrary and repressive and that restricts the space for public contestation and political participation in the process of government selection".

Quote: Backsliding entails a deterioration of qualities associated with democratic governance, within any regime. In democratic regimes, it is a decline in the quality of democracy; in autocracies, it is a decline in democratic qualities of governance. ETQueEsteveEmVarginha (talk) 16:18, 25 January 2026 (UTC)[reply] User:ETQueEsteveEmVarginha, could you add to this list please? Viriditas (talk) 03:26, 27 January 2026 (UTC)[reply]

Has China joined the US as a Superpower? I note that there are posts/reverts going on over whether China has joined the US as a superpower. I note the Superpower article now seems to agree, though this is still somewhat argued on its Talk page. At least when the US is stated to be the sole superpower, this should be hedged; e.g., the intro sentence "The Soviet Union's collapse in 1991 ended the Cold War, leaving the U.S. as the world's sole superpower." could become "The Soviet Union's collapse in 1991 ended the Cold War, leaving the U.S. as the world's sole undisputed superpower until the recent rise of China." This also means extending the history section to a bit past 2021 and mentioning China there. Erp (talk) 15:46, 7 February 2026 (UTC)[reply] References Moxy🍁 19:13, 7 February 2026 (UTC)[reply]

English official language infobox An executive order cannot establish an official language. The wiki page for that executive order even states so. There is no congressional legislation, so there is no official language. Same as Gulf of Mexico and Kennedy Center, about which there has already been much discussion.
~2026-87337-6 (talk) 02:38, 9 February 2026 (UTC)[reply]

American Music and Musical History While I understand the importance and popularity of country/folk music to the American public, as well as the history of the Nashville scene, I don't think a photo of the Country Hall of Fame is the right choice at all for a brief overview of American music. Even ignoring the importance of black art forms in the history of country music, folk styles developed more concretely from European traditions than a significant amount of new American art forms did (see blues, jazz, rap, electronic, and so on). Blues and jazz are much more indicative not just of American innovation but also of how American and world music would develop (British rock taking the blues, pop music adopting jazz harmonic sensibilities, etc.). Even then, talking about the modern face of American music through a genre like hip-hop would, I think, make significantly more sense. An overwhelming amount of American music, past and present, is due to the African-American musical tradition, and as popular as country is, it feels strange to mitigate that. I'd nominate artists like Miles Davis (a decades-long career across multiple American genres and forms) or A Tribe Called Quest (first world exporters of hip-hop with feet dipped across the American musical tradition) as much better representatives of American music. ~2026-94665-0 (talk) 18:58, 11 February 2026 (UTC)[reply]

Extended-confirmed-protected edit request on 14 February 2026 Please change 'The United States of America (USA), also known as the United States (U.S.) or America' to 'The United States of America (USA), also known as the United States (U.S.)'. The term America is misleading in this case because it is a term that has referred to the whole continent of the New World since 1507, named by the Flemish cartographer Gerardus Mercator. This held only until the Second World War, when most American geographers agreed on separating South and North America, now also referred to as 'the Americas'. The United States of America is not 'also known as' America itself; this would be paradoxical. Sources: <Lester, Toby. "The Waldseemüller Map: Charting the New World." Smithsonian Magazine, December 2009. Web. November 5, 2014. http://www.smithsonianmag.com/history/the-waldseemuller-map-charting-the-new-world-148815355/> <Crane, Nicholas. Mercator: The Man Who Mapped the Planet. London: Phoenix, 2003. Print.> <The Myth of Continents. (z.d.). https://archive.nytimes.com/www.nytimes.com/books/first/l/lewis-myth.html> Roosvdm (talk) 20:21, 14 February 2026 (UTC)[reply]

Extended-confirmed-protected edit request on 16 February 2026 Change "...allowing the United States to outpace the economies of England, France and Germany combined." to "...allowing the United States to outpace the economies of Britain, France and Germany combined." PurpleLynx797 (talk) 14:30, 16 February 2026 (UTC)[reply]

"States (United)" listed at Redirects for discussion The redirect States (United) has been listed at redirects for discussion to determine whether its use and function meets the redirect guidelines. Readers of this page are welcome to comment on this redirect at Wikipedia:Redirects for discussion/Log/2026 February 17 § Various parenthesis redirects until a consensus is reached.
I am bad at usernames (talk · contribs) 22:13, 17 February 2026 (UTC)[reply] Extended-confirmed-protected edit request on 19 February 2026 may i please edit this page Doctor dingleberry (talk) 16:32, 19 February 2026 (UTC)[reply] Extended-confirmed-protected edit request on 19 February 2026 (2) leader1 {kolton donally}
========================================
[SOURCE: https://en.wikipedia.org/wiki/Chairman] | [TOKENS: 1700]
Chair (officer) The chair, also chairman, chairwoman, or chairperson, is the presiding officer of an organized group such as a board, committee, or deliberative assembly. The person holding the office, who is typically elected or appointed by members of the group or organisation, presides over meetings of the group and is required to conduct the group's business in an orderly fashion. In some organizations, the chair is also known as president (or another title). In others, where a board appoints a president (or other title), the two terms are used for distinct positions. The term chairman may be used in a neutral manner, not directly implying the gender of the holder. In meetings or conferences, to "chair" something (chairing) means to lead the event.

Terminology Terms for the office and its holder include chair, chairman, chairwoman, chairperson, convenor, facilitator, moderator, president, and presiding officer. The chair of a parliamentary chamber is sometimes called the speaker. Chair has been used to refer to a seat or office of authority since the middle of the 17th century; its earliest citation in the Oxford English Dictionary dates to 1658–1659, four years after the first citation for chairman. Feminist critiques have analysed chairman as a possible example of sexist language, associating the male gender with the exercise of authority; this has led to some use of the generic "chairperson". In World Schools Style debating, as of 2009, chair or chairman refers to the person who controls the debate, and the format recommends using Madame Chair or Mr. Chairman to address the chair. The FranklinCovey Style Guide for Business and Technical Communication and the American Psychological Association style guide advocate using chair or chairperson. The Oxford Dictionary of American Usage and Style (2000) suggested that the gender-neutral forms were gaining ground and advocated chair for both men and women. The Daily Telegraph's style guide bans the use of chair and chairperson; the newspaper's position, as of 2018, is that "chairman is correct English". The National Association of Parliamentarians adopted a resolution in 1975 discouraging the use of chairperson and rescinded it in 2017.

The word chair can refer to the place from which the holder of the office presides, whether on a chair, at a lectern, or elsewhere. During meetings, the person presiding is said to be "in the chair" and is also referred to as "the chair". Parliamentary procedure requires that members address the "chair" as "Mr. (or Madam) Chairman (or Chair or Chairperson)" rather than using a name, one of many customs intended to maintain the presiding officer's impartiality and to ensure an objective and impersonal approach. In the British music hall tradition, the chairman was the master of ceremonies who announced the performances and was responsible for controlling any rowdy elements in the audience. The role was popularised on British TV in the 1960s and 1970s by Leonard Sachs, the chairman on the variety show The Good Old Days.

"Chairman" as a quasi-title gained particular resonance when socialist states from 1917 onwards shunned more traditional leadership labels and stressed the collective control of soviets (councils or committees) by referring to executive figureheads as "Chairman of the X Committee". Vladimir Lenin, for example, officially functioned as the head of the Soviet Russian government not as prime minister or president, but as "Chairman of the Council of People's Commissars".
At the same time, the head of state was first called "Chairman of the Central Executive Committee" (until 1938) and then "Chairman of the Presidium of the Supreme Soviet". In China, Mao Zedong was commonly called "Chairman Mao", as he was officially Chairman of the Chinese Communist Party and Chairman of the Central Military Commission.

Roles and responsibilities In addition to administrative or executive duties in organizations, the chair presides over meetings. While presiding, the chair should remain impartial and not interrupt a speaker if the speaker has the floor and is following the rules of the group. In committees or small boards, the chair votes along with the other members; in assemblies or larger boards, the chair should vote only when it can affect the result. At a meeting, the chair has only one vote (i.e. the chair cannot vote twice and cannot override the decision of the group unless the organization has specifically given the chair such authority). The powers of the chair vary widely across organizations. In some organizations they have the authority to hire staff and make financial decisions. In others they only make recommendations to a board of directors, or may have no executive powers, in which case they are mainly a spokesperson for the organization. The power given depends upon the type of organization, its structure, and the rules it has created for itself. If the chair exceeds their authority, engages in misconduct, or fails to perform their duties, they may face disciplinary procedures, which may include censure, suspension, or removal from office. The rules of the organization would provide details on who can perform these disciplinary procedures; usually, whoever appointed or elected the chair has the power to discipline them.

Public corporations There are three common types of chair in public corporations. The chief executive officer (CEO) may also hold the title of chair, in which case the board frequently names an independent member of the board as a lead independent director. This position is equivalent to the position of président-directeur général in France.[citation needed] Executive chair is an office separate from that of CEO, where the titleholder wields influence over company operations, such as Larry Ellison of Oracle, Douglas Flint of HSBC and Steve Case of AOL Time Warner. In particular, the group chair of HSBC is considered the top position of that institution, outranking the chief executive, and is responsible for leading the board and representing the company in meetings with government figures. Before the creation of the group management board in 2006, HSBC's chair essentially held the duties of a chief executive at an equivalent institution, while HSBC's chief executive served as the deputy. After the 2006 reorganization, the management cadre ran the business, while the chair oversaw the controls of the business through compliance and audit and the direction of the business. Non-executive chair is also a post separate from that of CEO; unlike an executive chair, a non-executive chair does not interfere in day-to-day company matters. Across the world, many companies have separated the roles of chair and CEO, saying that this move improves corporate governance. The non-executive chair's duties are typically limited to matters directly related to the board. Many companies in the US have an executive chair; this method of organization is sometimes called the American model.
Having a non-executive chair is common in the UK and Canada; this is sometimes called the British model. Expert opinion is rather evenly divided over which is the preferable model. There is a growing push by public-market investors for companies with an executive chair to have a lead independent director to provide some element of an independent perspective. The role of the chair in a private equity-backed board differs from the role in non-profit or publicly listed organizations in several ways, including the pay, the role, and what makes an effective private-equity chair. Companies with both an executive chair and a CEO include Ford, HSBC, Alphabet Inc., and HP.

Vice-chair and deputy chair A vice- or deputy chair, subordinate to the chair, is sometimes chosen to assist the chair and to serve as chair in the latter's absence, or when a motion involving the chair is being discussed. In the absence of the chair and vice-chair, groups sometimes elect a chair pro tempore to fill the role for a single meeting. In some organizations that have both titles, deputy chair ranks higher than vice-chair, as there are often multiple vice-chairs but only a single deputy chair. This type of deputy chair title on its own usually carries only an advisory role, not an operational one (as with Ted Turner at Time Warner). An unrelated definition of vice- and deputy chair describes an executive who is higher-ranking or has more seniority than an executive vice-president (EVP).
========================================
[SOURCE: https://techcrunch.com/author/rebecca-bellan/] | [TOKENS: 254]
Rebecca Bellan, Senior Reporter, TechCrunch. Rebecca Bellan is a senior reporter at TechCrunch, where she covers the business, policy, and emerging trends shaping artificial intelligence. Her work has also appeared in Forbes, Bloomberg, The Atlantic, The Daily Beast, and other publications. You can contact or verify outreach from Rebecca by emailing rebecca.bellan@techcrunch.com or via encrypted message at rebeccabellan.491 on Signal.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Sony_Electronics] | [TOKENS: 11890]
Sony Sony Group Corporation[b], commonly referred to as Sony[c], is a Japanese multinational conglomerate headquartered at Sony City in Minato, Tokyo, Japan. The Sony Group encompasses various businesses, including electronics (Sony Corporation), imaging and sensing (Sony Semiconductor Solutions), film (Sony Pictures Entertainment), music (Sony Music Group and Sony Music Entertainment Japan), video games (Sony Interactive Entertainment), and others.

Sony was founded in 1946 as Tokyo Tsushin Kogyo K.K.[d] by Masaru Ibuka and Akio Morita. In 1958, the company adopted the name Sony Corporation.[e] Initially an electronics firm, it gained early recognition for products such as the TR-55 transistor radio and the CV-2000 home video tape recorder, contributing significantly to Japan's post-war economic recovery. After Ibuka's retirement in the 1970s, Morita served as chairman until 1994, overseeing Sony's rise as a global brand recognized for innovation in consumer electronics. Landmark products included the Trinitron color television, the Walkman portable audio player, and the co-development of the compact disc. Expanding beyond electronics, Sony acquired Columbia Records in 1988 and Columbia Pictures in 1989, while also entering the home video game console market with the launch of the PlayStation in 1994. In Japan, the company further diversified by establishing a financial services division, which was spun off as a separate company in September 2025, with the group retaining 20% of the shares. In 2021, the company was renamed Sony Group Corporation as it transitioned into a holding company structure, with its electronics business continuing under the name Sony Corporation.

As of 2020, Sony holds a 55% share of the global image sensor market, making it the largest image sensor manufacturer, the second-largest camera manufacturer, a semiconductor sales leader, and the world's third-largest television manufacturer by sales. Although Sony is not part of a traditional keiretsu, it has historical ties to the Sumitomo Mitsui Financial Group, dating back to the 1950s when it relied exclusively on Mitsui Bank for financing. Sony is publicly traded on the Tokyo Stock Exchange (a component of the Nikkei 225 and TOPIX Core30 indices) and also maintains American depositary receipts on the New York Stock Exchange, where it has been listed since 1961. It ranked 88th on the 2021 Fortune Global 500 and 57th on the 2023 Forbes Global 2000 list.

History Sony began in the wake of World War II. In 1946, Masaru Ibuka started an electronics shop in Shirokiya, a department store building in the Nihonbashi area of Tokyo. The company started with a capital of ¥190,000 and a total of eight employees. On 7 May 1946, Ibuka was joined by Akio Morita to establish a company called Tokyo Tsushin Kogyo (東京通信工業, Tōkyō Tsūshin Kōgyō; Tokyo Telecommunications Engineering Corporation). The company built Japan's first tape recorder, called the Type-G. In 1958, the company changed its name to "Sony". Tokyo Tsushin Kogyo founders Morita and Ibuka realized that to achieve success and grow, their business had to expand to the global market, which required labeling their products with a short and easy brand name. While looking for a romanized name, they at first strongly considered using their initials, TTK. The primary reason they did not is that the railway company Tokyo Kyuko was known as TTK.
The company occasionally used the syllabic acronym "Totsuko" in Japan, but during his visit to the United States, Morita discovered that Americans had trouble pronouncing that name. Another early name tried out for a while was "Tokyo Teletech", until Akio Morita discovered that an American company was already using Teletech as a brand name. The name "Sony" was chosen for the brand as a mix of two words: one was the Latin word "sonus", the root of "sonic" and "sound", and the other was "sonny", a common slang term used in 1950s America to address a young boy. In 1950s Japan, "sonny boys" was a loanword in Japanese which connoted smart and presentable young men, which Akio Morita and Masaru Ibuka considered themselves to be. The first Sony-branded product, the TR-55 transistor radio, appeared in 1955, but the company name did not change to Sony until January 1958.

At the time of the change, it was extremely unusual for a Japanese company to use Roman letters to spell its name instead of writing it in kanji. The move was not without opposition: TTK's principal bank at the time, Mitsui, had strong feelings about the name and pushed for an alternative such as Sony Electronic Industries or Sony Teletech. Akio Morita was firm, however, as he did not want the company name tied to any particular industry. Eventually, both Ibuka and Mitsui Bank's chairman gave their approval. According to Schiffer, Sony's TR-63 radio "cracked open the U.S. market and launched the new industry of consumer microelectronics." By the mid-1950s, American teens had begun buying portable transistor radios in huge numbers, helping to propel the fledgling industry from an estimated 100,000 units in 1955 to 5 million units by the end of 1968.

Sony co-founder Akio Morita founded Sony Corporation of America in 1960. In the process, he was struck by the mobility of employees between American companies, which was unheard of in Japan at that time. When he returned to Japan, he encouraged experienced, middle-aged employees of other companies to reevaluate their careers and consider joining Sony. The company filled many positions in this manner and inspired other Japanese companies to do the same. Moreover, Sony played a major role in the development of Japan as a powerful exporter during the 1960s, 1970s and 1980s, supplying the U.S. military with bomb parts used in the Vietnam War. It also helped to significantly improve American perceptions of "made in Japan" products. Known for its production quality, Sony was able to charge above-market prices for its consumer electronics and resisted lowering prices.

In 1971, Masaru Ibuka handed the position of president over to his co-founder Akio Morita. Sony began a life insurance company in 1979, one of its many peripheral businesses. Amid a global recession in the early 1980s, electronics sales dropped and the company was forced to cut prices. Sony's profits fell sharply. "It's over for Sony", one analyst concluded. "The company's best days are behind it." Around that time, Norio Ohga took up the role of president. He encouraged the development of the compact disc (CD) in the 1970s and 1980s, and of the PlayStation in the early 1990s. Ohga went on to purchase CBS Records in 1988 and Columbia Pictures in 1989, greatly expanding Sony's media presence. Ohga would succeed Morita as chief executive officer in 1989.[citation needed] Under the vision of co-founder Akio Morita and his successors, the company had aggressively expanded into new businesses.
Part of its motivation for doing so was the pursuit of "convergence", linking film, music and digital electronics via the Internet. This expansion proved unrewarding and unprofitable, threatening Sony's ability to charge a premium on its products as well as its brand name. In 2005, Howard Stringer replaced Nobuyuki Idei as chief executive officer, marking the first time that a foreigner had run a major Japanese electronics firm. Stringer helped to reinvigorate the company's struggling media businesses, encouraging blockbusters such as Spider-Man while cutting 9,000 jobs. He hoped to sell off peripheral businesses and focus the company again on electronics. Furthermore, he aimed to increase cooperation between business units, which he described as "silos" operating in isolation from one another. In a bid to provide a unified brand for its global operations, Sony introduced a slogan known as "make.believe" in 2009.[citation needed]

Despite some successes, the company faced continued struggles in the mid- to late 2000s. In 2012, Kazuo Hirai was promoted to president and CEO, replacing Stringer. Shortly thereafter, Hirai outlined his company-wide initiative, named "One Sony", to revive Sony from years of financial losses and a bureaucratic management structure, which had proved difficult for former CEO Stringer to accomplish, partly due to differences in business culture and native languages between Stringer and some of Sony's Japanese divisions and subsidiaries. Hirai outlined three major areas of focus for Sony's electronics business: imaging technology, gaming and mobile technology, as well as a focus on reducing the major losses from the television business.

In February 2014, Sony announced the sale of its Vaio PC division to a new corporation owned by the investment fund Japan Industrial Partners, and said it would spin its TV division off into its own corporation to make it more nimble and to turn the unit around after past losses totaling $7.8 billion over a decade. Later that month, the company announced that it would close 20 stores. In April, it announced that it would sell 9.5 million shares in Square Enix (roughly 8.2 percent of the game company's total shares) in a deal worth approximately $48 million. In May 2014, the company announced it was forming two joint ventures with Shanghai Oriental Pearl Group to manufacture and market Sony's PlayStation game consoles and associated software in China. In 2015, Sony purchased Toshiba's image sensor business.

It was reported in December 2016 by multiple news outlets that Sony was considering restructuring its U.S. operations by merging its TV and film business, Sony Pictures Entertainment, with its gaming business, Sony Interactive Entertainment. According to the reports, such a restructuring would have placed Sony Pictures under Sony Interactive's CEO, Andrew House, though House would not have taken over day-to-day operations of the film studio. According to one report, Sony was set to make a final decision on the possible merger of the TV, film, and gaming businesses by the end of its fiscal year in March of the following year (2017). In 2017, Sony sold its lithium-ion battery business to Murata Manufacturing. In 2019, Sony merged its mobile, TV and camera businesses. On 1 April 2020, Sony Electronics Corporation was established as an intermediate holding company to own and oversee its electronics and IT solutions businesses.
On 19 May 2020, the company announced that it would change its name to Sony Group Corporation as of 1 April 2021, with Sony Electronics Corporation to be renamed Sony Corporation. On the same day, the company announced that it would turn Sony Financial Holdings (currently Sony Financial Group), of which Sony already owned 65.06% of shares, into a wholly owned subsidiary through a takeover bid. On 1 April 2021, Sony Corporation was renamed Sony Group Corporation. On the same day, Sony Mobile Communications Inc. absorbed Sony Electronics Corporation, Sony Imaging Products & Solutions Inc., and Sony Home Entertainment & Sound Products Inc., and changed its trade name to Sony Corporation.

Formats and technologies Sony has historically been notable for creating its own in-house standards for new recording and storage technologies, instead of adopting those of other manufacturers and standards bodies, although its success in its early years owed much to smoothly capitalizing on the Compact Cassette standard introduced by Philips, with which Sony went on to enjoy a decades-long technological relationship in various areas. Sony (either alone or with partners) has introduced several of the most popular recording formats, including the 3.5-inch floppy disk, the compact disc, and the Blu-ray disc.

Sony introduced U-matic, the world's first videocassette format, in 1971, but the standard was unpopular for domestic use due to its high price. The company subsequently launched the Betamax format in 1975. Sony was involved in the videotape format war of the early 1980s, when it marketed the Betamax system for video cassette recorders against the VHS format developed by JVC. In the end, VHS gained critical mass in the market and became the worldwide standard for consumer VCRs; Betamax is, for all practical purposes, an obsolete format. Sony's professional-oriented component video format called Betacam, which was derived from Betamax, was used until 2016, when Sony announced it was stopping production of all remaining 1/2-inch video tape recorders and players, including the Digital Betacam format. In 1985, Sony launched its Handycam products and the Video8 format. Video8 and the follow-on hi-band Hi8 format became popular in the consumer camcorder market. In 1987, Sony launched the 4 mm DAT, or Digital Audio Tape, as a new digital audio tape standard.

Sony held a patent for its proprietary Trinitron until 1996. Sony introduced the Triluminos Display, the company's proprietary color reproduction enhancing technology, in 2004, featured in the world's first LED-backlit LCD televisions. It was widely used in other Sony products as well, including computer monitors, laptops, and smartphones. In 2013, Sony released a new line of televisions with an improved version of the technology, which incorporated quantum dots in the backlight system; this was the first commercial use of quantum dots. In 2012, the company revealed a prototype of an ultrafine RGB LED display, which it calls the Crystal LED Display.

Sony used the Compact Cassette format in many of its tape recorders and players, including the Walkman, the world's first portable music player. Sony introduced the MiniDisc format in 1992 as an alternative to Philips' DCC, or Digital Compact Cassette, and as a successor to the Compact Cassette. Since the introduction of MiniDisc, Sony has attempted to promote its own audio compression technologies under the ATRAC brand, against the more widely used MP3.
Until late 2004, Sony's Network Walkman line of digital portable music players did not support the MP3 standard natively. In 2004, Sony built upon the MiniDisc format by releasing Hi-MD. Hi-MD allows the playback and recording of audio on newly introduced 1 GB Hi-MD discs, in addition to playback and recording on regular MiniDiscs. Beyond saving audio on the discs, Hi-MD allows the storage of computer files such as documents, videos and photos.

In 1993, Sony challenged the industry-standard Dolby Digital 5.1 surround sound format with a newer and more advanced proprietary motion picture digital audio format called SDDS (Sony Dynamic Digital Sound). This format employed eight channels (7.1) of audio, as opposed to the six used in Dolby Digital 5.1 at the time. Ultimately, SDDS was vastly overshadowed by the preferred DTS (Digital Theatre System) and Dolby Digital standards in the motion picture industry. SDDS was developed solely for use in the theatre circuit; Sony never intended to develop a home theatre version of SDDS.

Sony and Philips jointly developed the Sony-Philips digital interface format (S/PDIF) and the high-fidelity audio system SACD. The latter became entrenched in a format war with DVD-Audio, and neither gained a major foothold with the general public. CDs had been preferred by consumers because of the ubiquitous presence of CD drives in consumer devices, until the early 2000s when the iPod and streaming services became available.

In 2015, Sony introduced LDAC, a proprietary audio coding technology which allows streaming high-resolution audio over Bluetooth connections at up to 990 kbit/s at 32-bit/96 kHz (a rough comparison with the corresponding uncompressed bitrate appears at the end of this section). Sony also contributed it to the Android Open Source Project starting from Android 8.0 "Oreo", enabling every OEM to integrate this standard into their own Android devices freely. However, the decoder library is proprietary, so receiving devices require licenses. On 17 September 2019, the Japan Audio Society (JAS) certified LDAC with its Hi-Res Audio Wireless certification. Currently, the only codecs with the Hi-Res Audio Wireless certification are LDAC and LHDC, a competing standard.

Sony demonstrated an optical digital audio disc in 1977 and soon joined forces with Philips, another major contender in the storage technology, to establish a worldwide standard. In 1983, the two companies jointly announced the Compact Disc (CD). In 1984, Sony launched the Discman series, an expansion of the Walkman brand to portable CD players. Sony then began to improve the performance and capacity of the novel format: it launched write-once optical discs (WO) in 1986 and magneto-optical discs of around 125 MB in 1988, both for the specific use of archival data storage. In the early 1990s, two high-density optical storage standards were being developed: one was the MultiMedia Compact Disc (MMCD), backed by Philips and Sony, and the other was the Super Density Disc (SD), supported by Toshiba and many others. Philips and Sony abandoned their MMCD format and agreed upon Toshiba's SD format with only one modification. The unified disc format was called DVD and was introduced in 1997. Sony was one of the leading developers of the Blu-ray optical disc format, the newest standard for disc-based content delivery. The first Blu-ray players became commercially available in 2006. The format emerged as the standard for HD media over the competing format, Toshiba's HD DVD, after a two-year-long high-definition optical disc format war.
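As promised above, a back-of-the-envelope calculation in Python puts LDAC's 990 kbit/s maximum next to the bitrate of uncompressed PCM at the same 32-bit/96 kHz resolution (the two-channel stereo figure is our assumption; the source does not state it):

    # Raw PCM bitrate at 32-bit/96 kHz; the 2-channel count is an assumption.
    bits_per_sample = 32
    sample_rate_hz = 96_000
    channels = 2

    raw_kbps = bits_per_sample * sample_rate_hz * channels / 1000  # 6144.0
    ldac_max_kbps = 990  # LDAC's highest transmission rate, per the text

    print(f"raw PCM:  {raw_kbps:.0f} kbit/s")
    print(f"LDAC max: {ldac_max_kbps} kbit/s "
          f"(about {raw_kbps / ldac_max_kbps:.1f}:1 data reduction)")

Even at its highest rate, LDAC therefore carries roughly a sixth of the raw data, which is why it is described as an audio coding technology rather than a bit-exact transport.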
Sony's laser communication devices for small satellites rely on technologies developed for the company's optical disc products. In 1983, Sony introduced 90 mm micro diskettes, better known as 3.5-inch (89 mm) floppy disks, which it had developed at a time when 4-inch floppy disks and many other variants from different companies were competing to replace the 5.25-inch format. Sony had great success and its format became dominant; 3.5-inch floppy disks gradually became obsolete as they were replaced by newer storage media. Sony still held more than a 70 percent share of the market when it decided to pull the plug on the format in 2010. Sony still develops magnetic tape storage technologies along with IBM, and is one of only two manufacturers of Linear Tape-Open (LTO) cartridges. In 1998, Sony launched the Memory Stick format, a family of flash memory cards for use in Sony's lines of digital cameras and portable music players. It has seen little support outside of Sony's own products, with Secure Digital (SD) cards commanding considerably greater popularity. Sony has made updates to the Memory Stick format with Memory Stick Duo and Memory Stick Micro. The company has also released USB flash drive products, branded under the Micro Vault line. Sony introduced FeliCa, a contactless IC card technology primarily used in contactless payment, and jointly developed and commercialized Near-Field Communication (NFC) with Philips. The standard is largely offered in two forms: chips embedded in smartphones, or plastic cards with embedded chips. Sony plans to implement this technology in train systems across Asia. In 2019, Sony launched ELTRES, the company's proprietary low-power wide-area network (LPWAN) wireless communication standard.

Until 1991, Sony had little direct involvement with the video game industry. The company supplied components for other consoles, such as the sound chip for the Super Famicom from Nintendo, and operated a video game studio, Sony Imagesoft. As part of a joint project between Nintendo and Sony that began as early as 1988, the two companies worked to create a CD-ROM version of the Super Famicom, though Nintendo denied the existence of the Sony deal as late as March 1991. At the Consumer Electronics Show in June 1991, Sony revealed a Super Famicom with a built-in CD-ROM drive, named the "Play Station" (also known as the SNES-CD). A day after the announcement at CES, however, Nintendo announced that it would be breaking its partnership with Sony, opting to go with Philips instead but using the same technology. Nintendo broke the deal after the two companies were unable to come to an agreement on how revenue would be split. The breakdown of the partnership infuriated Sony President Norio Ohga, who responded by putting Ken Kutaragi in charge of developing the PlayStation project to rival Nintendo. At that time, negotiations were still ongoing between Nintendo and Sony, with Nintendo offering Sony a "non-gaming role" in its new partnership with Philips. This proposal was swiftly rejected by Kutaragi, who was facing increasing criticism within Sony over his push into the video game industry. Negotiations officially ended in May 1992, and to decide the fate of the PlayStation project, a meeting was held in June 1992 consisting of Sony President Ohga, PlayStation Head Kutaragi, and several senior members of Sony's board.
At the meeting, Kutaragi unveiled to the board a proprietary CD-ROM-based system he had been working on, capable of playing video games with 3D graphics. Eventually, Sony President Ohga decided to retain the project after being reminded by Kutaragi of the humiliation he had suffered from Nintendo. Nevertheless, due to strong opposition from a majority present at the meeting, as well as widespread internal opposition to the project from the older generation of Sony executives, Kutaragi and his team were shifted from Sony's headquarters to Sony Music, a completely separate financial entity owned by Sony, so as to keep the project alive and maintain relationships with Philips for the MMCD development project (which helped lead to the creation of the DVD). In 2021, the WIPO's annual World Intellectual Property Indicators report ranked Sony ninth in the world for the number of patent applications published under the PCT System, with 1,793 patent applications published during 2020, up from 13th in 2019 with 1,566 applications.

Business units

Sony offers a wide variety of product lines in many areas. At its peak, it was dubbed a "corporate octopus" for its sprawling ventures, from private insurance to chemicals to cosmetics to home shopping to a Tokyo-based French restaurant, in addition to its core businesses of electronics and entertainment. Even after unwinding many business units, including Sony Chemicals and the VAIO PC line, Sony still runs diverse businesses. As of 2020, Sony is organized into the following business segments: Game & Network Services (G&NS), Music, Pictures, Electronics Products & Solutions (EP&S), Imaging & Sensing Solutions (I&SS), Financial Services, and Others. Usually, each business segment has a handful of corresponding intermediate holding companies into which the related businesses are folded; Columbia Records, for example, is part of Sony Music Group, a subsidiary that is at the same time a holding company for Sony's music businesses, along with SMEJ. Sony Corporation (Sony Electronics Corporation until 1 April 2021) is the electronics business unit of the Sony Group. It primarily conducts research and development (R&D), planning, design, manufacturing and marketing for electronics products. Sony Global Manufacturing & Operations Corporation (SGMO) is a wholly owned subsidiary of Sony Corporation responsible for managing manufacturing operations both in Japan and overseas, through its own factories as well as third-party contract manufacturers. In 1979, Sony released the world's first portable music player, the Walkman, bundled with the MDR-3L2 headphones. This line fostered a fundamental change in music listening habits by allowing people to carry music with them and listen through lightweight headphones. Originally used to refer to portable audio cassette players, the Walkman brand has been widely adopted by the company to encompass its portable digital audio and video players as well as a line of former Sony Ericsson mobile phones. For optical disc players, the Discman brand was used until the late 1990s. In 2025, a model TPS-L2 cassette Walkman from 1979 was included in Pirouette: Turning Points in Design, an exhibition at the Museum of Modern Art featuring "widely recognized design icons [...] highlighting pivotal moments in design history."
In 1999, Sony's first portable digital audio players were introduced: one using Memory Stick flash storage, created by the Walkman division, and a smaller pen-sized player with embedded flash storage, created by the VAIO division; both were accompanied by Sony's OpenMG copyright protection technology and PC software for music transfer. Sony continues to develop Walkman digital audio players, although it has been unable to capture the share and influence in the digital era that it enjoyed in the cassette era. Sony is a major audio products manufacturer and a leader in active noise control technology. Sony's high-end microphones and headphones for professional use are produced at Sony/Taiyo Corporation, a designated special subsidiary at which 67% of employees have a disability, in Ōita Prefecture, Japan. Sony has partnered on OEM car audio with Chery (for vehicles such as the Tiggo 8 and Tiggo 9), Ford (the F-150, Fiesta, Focus, Mondeo and Taurus), Jaecoo (the Jaecoo 7), Omoda (the Omoda 5 and Omoda 9), Toyota (the Avensis and Celica), and Volkswagen (the Golf, Passat and Polo). A specialist Sony Xplod audio system was fitted to the Ford GTX1 supercar. Sony currently produces aftermarket car audio with its Mobile ES series. Sony produced the TV8-301, the world's first all-transistor television, in 1959. In 1968, the company introduced the Trinitron brand name for its lines of aperture grille cathode-ray tube televisions, and afterwards computer monitors. Sony stopped production of Trinitron sets for most markets, but continued producing them for markets such as Pakistan, Bangladesh and China. Sony discontinued its series of Trinitron computer monitors in 2005, and the last Trinitron-based television set in the U.S. followed in early 2007. The end of Trinitron marked the end of Sony's analog television sets and monitors. Sony used the LCD WEGA name for its LCD TVs until summer 2005, when it introduced the BRAVIA name. BRAVIA is an in-house Sony brand covering high-definition LCD televisions, projection TVs and front projectors, and home cinema systems, including the BRAVIA home theatre range. All Sony high-definition flat-panel LCD televisions in North America have carried the BRAVIA logo since 2005. In 2006, Sony lost its decades-long No. 1 market share in the global television market. In November 2007, the Sony XEL-1, the first OLED television, was released, and it was manufactured for two years. In 2013, Sony demonstrated the first 4K OLED television. As of 2012, Sony was the third-largest maker of televisions in the world, yet the business unit had been unprofitable for eight consecutive years. From 2011, Sony restructured its loss-making television business, mainly by downsizing business units and outsourcing the manufacturing of display panels to companies such as Sharp Corporation, LG Display, and Samsung Electronics. In December 2011, Sony agreed to sell its entire stake in S-LCD, an LCD joint venture with Samsung Electronics, for about $940 million. On 28 March 2012, Sony and Sharp announced that they had agreed to further amend the joint venture agreement originally executed by the parties in July 2009, as amended in April 2011, for the establishment and operation of Sharp Display Products Corporation ("SDP"), a joint venture to produce and sell large-sized LCD panels and modules. The agreement was eventually terminated, with Sony exiting the venture.
Sony's small-sized LCD business subsidiary and its medium-to-large-sized OLED display business unit were spun off and became part of Japan Display and JOLED, respectively. In 2017, Sony launched OLED televisions under the BRAVIA brand. Sony has also sold a range of tapes, discs, recorders and players for videocassette, DVD, and Blu-ray formats for decades. Sony offers a wide range of digital cameras. Its point-and-shoot models are branded Cyber-shot, while its DSLRs and mirrorless models are branded Alpha, though Sony no longer makes DSLRs. It also produces action cameras and camcorders, with the company's cinema-grade products sold under the CineAlta name. Sony demonstrated a prototype of the Sony Mavica in 1981 and released it for the consumer market in 1988. The first Cyber-shot was introduced in 1996. Sony's share of the digital camera market fell from a high of 20% to 9% by 2005. Sony entered the market for digital single-lens reflex cameras in 2006, when it acquired the camera business of Konica Minolta and rebranded the company's line of cameras as its Alpha line. Sony is the world's third-largest camera manufacturer, behind Canon and Nikon. In 2010, Sony introduced its first mirrorless interchangeable-lens cameras, the NEX-3 and NEX-5, along with a new lens mount system, the E-mount. Sony released numerous NEX models before folding the NEX series into the Alpha series. The first Alpha mirrorless interchangeable-lens camera (MILC) was the α3000, introduced in August 2013. It was followed by the full-frame α7 and α7R in October, and in 2014 by the α5000 and α6000, successors to the NEX-5 and to the NEX-6 and NEX-7, respectively. The α6000 became the most popular MILC ever, and Sony became the largest MILC manufacturer. Sony produced computers (the SMC-777 [jp] personal computer, MSX home computers and NEWS workstations) during the 1980s, but withdrew from the computer business around 1990. Sony re-entered the global computer market in 1996 under the new VAIO brand. Short for "Video Audio Integrated Operation", the line was the first computer brand to highlight visual-audio features. Sony faced considerable controversy when some of its laptop batteries exploded and caught fire in 2006, resulting in the largest computer-related recall in history to that point. In a bid to join the tablet computer market, the company launched its Sony Tablet line of Android tablets in 2011. Since 2012, Sony's Android products have been marketed under the Xperia brand used for its smartphones. On 4 February 2014, Sony announced that it would sell its VAIO PC business due to poor sales, with Japanese company Japan Industrial Partners (JIP) purchasing the VAIO brand; the deal was finalized by the end of March 2014. As of 2018, Sony maintained a 5% stake in the new, independent company. In the 1990s, Sony was contracted to manufacture laptop computers for Apple and Dell. The Raspberry Pi Foundation delegates the manufacture of its single-board computers to Sony; most Raspberry Pi computers are made at the Sony UK Technology Centre in Pencoed, Wales. Sony has targeted the medical, healthcare and biotechnology businesses as future growth sectors. The company acquired iCyt Mission Technology, Inc. (renamed Sony Biotechnology Inc. in 2012), a manufacturer of flow cytometers, in 2010, and Micronics, Inc., a developer of microfluidics-based diagnostic tools, in 2011.
In 2012, Sony announced that it would acquire all shares of So-net Entertainment Corporation, the largest shareholder of M3, Inc., an operator of portal sites (m3.com, MR-kun, MDLinx and MEDI:GATE) for healthcare professionals. On 28 September 2012, Olympus and Sony announced that the two companies would establish a joint venture to develop new surgical endoscopes with 4K resolution (or higher) and 3D capability; Sony Olympus Medical Solutions Inc. (Sony 51%, Olympus 49%) was established on 16 April 2013. On 28 February 2014, Sony, M3 and Illumina established a joint venture called P5, Inc. to provide a genome analysis service for research institutions and enterprises in Japan. In 2000, Sony was a marginal player in the mobile phone market, with a share of less than 1 percent. In 2001, Sony entered into a joint venture with Swedish telecommunications company Ericsson, forming Sony Ericsson Mobile Communications. Initial sales were rocky, and the company posted losses in 2001 and 2002, but Sony Ericsson reached profitability in 2003. The company distinguished itself with multimedia-capable mobile phones, which included features such as cameras, unusual at the time. Despite its innovations, Sony Ericsson faced intense competition from Apple's iPhone, released in 2007. From 2008 to 2010, amid a global recession, Sony Ericsson slashed its workforce by several thousand. In 2009, Sony Ericsson was the fourth-largest mobile phone manufacturer in the world (after Nokia, Samsung and LG); by 2010, its market share had fallen to sixth place. Sony acquired Ericsson's share of the venture in 2012 for over US$1 billion, and Sony Mobile now focuses exclusively on the smartphone market under the Xperia brand. In 2013, Sony accounted for around two percent of the mobile phone market, with 37 million mobile phones sold. Sony Mobile's sales peaked in 2014 at 40 million handsets; volumes have since decreased, with 13.5 million phones shipped in 2017, 6.5 million in 2018, and 2.9 million in FY 2020. Since the late 1990s, Sony has released numerous consumer robots, including the dog-shaped AIBO robots, a music-playing robot called Rolly, and a humanoid robot called QRIO. Despite being a pioneer in the field, Sony ceased robotics-related operations for about a decade due to financial difficulties, before reviving them in 2016. In 2015, Sony partnered with autonomous driving startup ZMP Inc. to establish Aerosense, a manufacturer of aerial surveillance and reconnaissance drones. At CES 2021, Sony unveiled a drone under the Airpeak brand, which the company says is the smallest drone able to carry a Sony Alpha camera, entering the drone business on its own for the first time. In 2019, as part of the London Design Festival, Sony Design showcased Affinity in Autonomy, a conceptual environmental art installation in the Prince Consort Gallery of the Victoria and Albert Museum that represented the company's vision of the future of AI and robotics.
Sony traces its roots in the semiconductor business back to 1954, when it became the first Japanese company to commercialize the transistor, invented and licensed by Bell Labs. Some of the biggest and most established names in Japan at the time, such as Toshiba and Mitsubishi Electric, initially stuck with the vacuum tubes they had been thriving on; despite being an expert on the vacuum tube himself, Ibuka saw the potential of the novel technology and had Morita negotiate the licensing terms, making Sony one of the earliest, and the youngest, licensees of the transistor, together with Texas Instruments. In 1957, Sony employee Leo Esaki and his colleagues invented the tunnel diode (usually referred to as the Esaki diode), through which they discovered the quantum tunneling effect in solids; Esaki received the Nobel Prize in Physics for this work in 1973. Sony has commanded a dominant share of the charge-coupled device market. As of 2020, Sony is the world's largest manufacturer of CMOS image sensors, whose chips are widely used in digital cameras, tablet computers, smartphones, drones and, more recently, self-driving systems in automobiles. As of 2020, the company, through its chip business arm Sony Semiconductor Solutions, designs, manufactures, and sells a wide range of semiconductors and electronic components, including image sensors (HAD CCD, Exmor), image processors (BIONZ), laser diodes, system LSIs, mixed-signal LSIs, emerging memory storage, emerging displays (microLED, microOLED, and holographic displays), and a multi-functional microcomputer (SPRESENSE). In 2020, Sony launched the first intelligent vision sensors with AI edge computing capabilities. Sony Interactive Entertainment (formerly Sony Computer Entertainment) is most notable for producing PlayStation consoles. The line grew out of the failed partnership with Nintendo: Nintendo originally asked Sony to develop an add-on for its Super Nintendo Entertainment System that would play CD-ROMs. In 1991, Sony announced the add-on, as well as a dedicated console known as the "Play Station"; however, a disagreement over software licensing for the console caused the partnership to fall through, and Sony then continued the project independently. Launched in 1994, the first PlayStation gained 61% of global console sales and broke Nintendo's long-standing lead in the market. Sony followed up with the PlayStation 2 in 2000, which was even more successful: it has become the most successful console of all time, selling over 150 million units as of 2011. Sony released the PlayStation 3, a high-definition console, in 2006. It was the first console to use the Blu-ray format, and it was considerably more expensive than its competitors, the Xbox 360 and the Wii, due in part to the cost of the Cell processor. Early on, the console was sold at a loss, and poor sales performance resulted in significant losses for the company. The PlayStation 3 generally sold more poorly than its competitors in the early years of its release, but it managed to overtake the Xbox 360 in global sales later on. Sony later introduced the PlayStation Move, an accessory that allows players to control video games using motion gestures. Sony extended the brand to the portable games market in 2004 with the PlayStation Portable (PSP). The console sold reasonably well, but took second place to a rival handheld, the Nintendo DS. Sony developed the Universal Media Disc (UMD) optical disc medium for use on the PlayStation Portable.
Early on, the format was used for movies, but it has since lost major studio support. Sony released a disc-less version of the PlayStation Portable, the PSP Go, in 2009, and went on to release its second portable video game system, the PlayStation Vita, in 2011 and 2012. Sony launched its fourth console, the PlayStation 4, on 15 November 2013; as of 31 December 2017, it had sold 73.6 million units globally. On 18 March 2014, at GDC, president of SCE Worldwide Studios Shuhei Yoshida announced a new virtual reality technology dubbed Project Morpheus, later named PlayStation VR, for the PlayStation 4. The headset brought VR gaming and non-gaming software to the company's console. According to a report released by Houston-based patent consulting firm LexInnova in May 2015, Sony was leading the virtual reality patent race: of the nearly 12,000 patents or patent applications the firm analyzed, Sony held 366. PlayStation VR was released worldwide on 13 October 2016. On 31 March 2019, the successor to the PlayStation 4 was announced, and on 12 November 2020, the PlayStation 5 was released in North America, Australia, New Zealand, Japan, South Korea, and Singapore, followed by Indonesia on 22 January 2021. By the end of the fiscal quarter, Sony had sold 4.5 million PlayStation 5 consoles, keeping pace with the best-selling console of all time, the PlayStation 2. Sony Entertainment has two divisions: Sony Pictures Entertainment and Sony Music Group (comprising Sony Music Entertainment and Sony Music Publishing). Sony USA previously owned and operated Sony Trans Com, a technology business that provided in-flight entertainment programming as well as video and audio playback equipment for the airline industry; Sony purchased the business from Sundstrand Corp. in 1989 and sold it to Rockwell Collins in 2000. In 2012, Sony rolled most of its consumer content services (including video, music and gaming) into the Sony Entertainment Network, the predecessor of PlayStation Network. Sony Pictures Entertainment Inc. (SPE) is the television and film production and distribution unit of Sony. With a 12.5% box office market share in 2011, the company ranked third among movie studios. Its group sales in 2010 were US$7.2 billion. The company has produced many notable movie franchises, including Spider-Man, The Karate Kid and Men in Black, as well as the popular television game shows Jeopardy! and Wheel of Fortune. Sony entered the television and film production market when it acquired Columbia Pictures Entertainment in 1989 for $3.4 billion. Columbia lives on in the Sony Pictures Motion Picture Group, a division of SPE which in turn owns Columbia Pictures and TriStar Pictures, among other film production and distribution companies such as Screen Gems, Sony Pictures Classics, and Sony Pictures Home Entertainment. SPE's television division is known as Sony Pictures Television. For the first several years of its existence, Sony Pictures Entertainment performed poorly, leading many to suspect the company would sell off the division. In 2006, Sony started using ARccOS Protection on some of its film DVDs, but later issued a recall. In late 2014, Sony Pictures became the target of a hacking attack by a clandestine group calling itself the Guardians of Peace, weeks before releasing the anti-North Korean comedy film The Interview.
In February 2024, Sony entered into an agreement with Disney under which Sony Pictures Home Entertainment Corporation of Japan would handle the release of Disney products on DVD and Blu-ray under a licensing model, as well as production of the physical media. Sony will market, sell and distribute new Disney releases and catalog films on DVD, Blu-ray and 4K Ultra HD through Canadian and American retailers and distributors. Sony Music Entertainment (also known as SME or Sony Music) is one of the "big three" global recorded music companies and is controlled by Sony Corporation of America, the United States subsidiary of Sony. In one of its largest-ever acquisitions, Sony purchased CBS Records Group in 1988 for US$2 billion. In the process, Sony entered a partnership with Michael Jackson, considered by the Guinness Book of World Records to be the most successful entertainer of all time, and gained rights connected to his ATV catalogue. The acquisition of CBS Records provided the foundation for Sony Music Entertainment, which Sony established in 1991. In 1968, Sony and CBS Records had formed a 50:50 joint venture, CBS/Sony Records (later renamed CBS/Sony Group), in Japan. When CBS Records was acquired, the 50% stake in CBS/Sony Group owned by CBS was also transferred to Sony. In March 1988, four wholly owned subsidiaries were folded into CBS/Sony Group and the company was renamed Sony Music Entertainment Japan (SMEJ). It operates independently of Sony Music, as it is directly owned by the Japanese parent. In 2004, Sony entered into a joint venture with Bertelsmann AG, merging Sony Music Entertainment with Bertelsmann Music Group to create Sony BMG. In 2005, Sony BMG faced a copy protection scandal after its music CDs installed malware on users' computers, posing a security risk to affected customers. In 2007, the company acquired Famous Music for US$370 million, gaining the rights to the catalogues of Eminem and Akon, among others. Sony bought out Bertelsmann's share in Sony BMG and formed a new Sony Music Entertainment in 2008; since then, the company has undergone management changes. Sony purchased digital music recognition company Gracenote for US$260 million in 2008; Tribune Media Company acquired Gracenote from Sony in 2014 for $170 million. Besides its record label, Sony operates other music businesses. In 1995, Sony merged its publishing arm with Michael Jackson's ATV Music Publishing, forming Sony/ATV Music Publishing, at the time the second-largest music publisher in the world. The company owns the publishing rights to over 4 million compositions, including The Beatles' Lennon–McCartney catalogue and works by Bob Dylan, Eminem, Lady Gaga, Sam Smith, Ed Sheeran, and Taylor Swift. In 2012, Sony/ATV acquired a majority stake in EMI Music Publishing, becoming the world's largest music publishing company, and in 2018 Sony bought the remaining shares in the publisher, making it a wholly owned subsidiary. Sony has owned all of Sony/ATV itself since 2016. Sony entered the Japanese animation, or anime, business in 1995, when its Sony Music Entertainment Japan (SMEJ) division established Aniplex as a subsidiary managing creative productions; ten years later, Aniplex founded A-1 Pictures, Sony's first anime studio. Since then, through group-wide and international ventures, Sony has solidified its position in the industry, elevating the business to what The Nikkei calls the "fourth pillar of its entertainment portfolio".
The anime business operations of Sony are scattered around the group, mainly in its Pictures and Music units. SMEJ's notable related businesses include Aniplex and its subsidiaries CloverWorks and A-1 Pictures. Aniplex and U.S.-headquartered Sony Pictures co-own the U.S.-based anime distribution company Crunchyroll, which since 2022 has been the successor company to Funimation, acquired by Sony in 2017; Funimation's subsidiaries included Wakanim (absorbed into Crunchyroll itself) and Madman Anime (rebranded as Crunchyroll Pty. Ltd.). In December 2020, Funimation announced that it would buy AT&T's animation business Crunchyroll for $1.175 billion, a deal intended to help the company compete more globally with entertainment giants such as Netflix; the acquisition was completed in August 2021. Sony Financial Group is a holding company for Sony's financial services business, which includes Sony Life (in Japan and the Philippines), Sony Assurance and Sony Bank, among others. The unit proved to be the most profitable of Sony's businesses in FY 2005, earning $1.7 billion in profit. Sony Financial's low fees have aided the unit's popularity while threatening Sony's premium brand name. One of the companies behind the commercialization of the lithium-ion battery, Sony had been exploring the possibility of manufacturing batteries for electric vehicles. In 2014, Sony participated in NRG Energy's eVgo Ready for Electric Vehicle (REV) program for EV charging parking lots. However, the company then decided to sell its lithium-ion battery business to Murata Manufacturing in 2016. In 2015, Sony invested $842,000 in ZMP Inc., drawing speculation that it was contemplating developing self-driving cars. In January 2020, Sony unveiled a concept electric car at the Consumer Electronics Show, named Vision-S, designed in collaboration with components manufacturer Magna International. At the occasion, Sony also stated its goal of developing technology for the automotive sector, especially concerning autonomous driving, sensors, and in-car entertainment. In 2022, Sony Group and Honda launched a joint venture for their electric vehicle partnership, Sony Honda Mobility (SHM), which would deliver its first electric vehicles by 2026 and sell them online, starting in the United States and Japan. The joint venture announced its new "Afeela" brand and its first prototype model at CES 2023.

Corporate information

Sony is a kabushiki kaisha (joint stock company) listed on the Tokyo Stock Exchange in Japan, with American depositary receipts listed on the New York Stock Exchange. As of January 2024, Sony, one of the largest Japanese companies by market capitalization and operating profit, was valued at over $112 billion. In the same period, it was also recognized as the most cash-rich Japanese company, with net cash reserves of ¥1.8 trillion. The company was immensely profitable throughout the 1990s and early 2000s, in part because of the success of its new PlayStation line. The company encountered financial difficulty in the mid- to late 2000s due to several factors: the Great Recession, increased competition for PlayStation, and the 2011 Tōhoku earthquake and tsunami. The company faced three consecutive years of losses leading up to 2011.
While noting the negative effects of intervening circumstances such as natural disasters and fluctuating currency exchange rates, the Financial Times criticized the company for its "lack of resilience" and "inability to gauge the economy", voicing skepticism about Sony's revitalization efforts given the lack of tangible results. In September 2000, Sony had a market capitalization of $100 billion; by December 2011 it had plunged to $18 billion, reflecting Sony's falling prospects but also the grossly inflated share prices of the dot-com bubble years. Net worth, as measured by stockholder equity, grew steadily from $17.9 billion in March 2002 to $35.6 billion through December 2011. Earnings yield (the inverse of the price-to-earnings ratio) has never been more than 5% and has usually been much less; thus Sony has almost always traded in over-priced ranges, with the exception of the 2009 market bottom. On 9 December 2008, Sony announced that it would cut 8,000 jobs, drop 8,000 contractors and reduce its global manufacturing sites by 10% to save $1.1 billion per year. In April 2012, Sony announced that it would reduce its workforce by 10,000 (6% of its employee base) as part of CEO Kaz Hirai's effort to get the company back into the black. This came after a loss of 520 billion yen (roughly US$6.36 billion) for fiscal 2012, the worst since the company was founded; accumulated losses over the previous four years totaled 919.32 billion yen. Sony planned to increase its marketing expenses by 30% in 2012. 1,000 of the jobs cut came from the company's mobile phone unit, with 700 cut in the 2012–2013 fiscal year and the remaining 300 in the following fiscal year. Sony had revenues of ¥6.493 trillion in 2012 and maintained large cash reserves, with ¥895 billion on hand as of that year. In May 2012, Sony's market capitalization was valued at about $15 billion. In January 2013, Sony announced it was selling its US headquarters building for $1.1 billion to a consortium led by real estate developer The Chetrit Group. On 28 January 2014, Moody's Investors Service dropped Sony's credit rating to Ba1 ("judged to have speculative elements and a significant credit risk"), saying that the company's "profitability is likely to remain weak and volatile". On 6 February 2014, Sony announced it would trim as many as 5,000 jobs as it attempted to sell its PC business and focus on mobile and tablets. In 2014, Sony South Africa closed its TV, hi-fi and camera divisions in order to reconsider its local distribution model; it returned in 2017, facilitated by Premium Brand Distributors (Pty) Ltd. In November 2018, Sony posted its earnings report for the second quarter, showing that it had lost about US$480 million in the mobile phone division, prompting another round of downsizing in the unit, including the closure of a manufacturing plant and the halving of its workforce. In November 2025, Israeli business media outlets, including Globes, reported that the Japanese company Sony plans to cease its telecommunications chip development activities in Israel.

Criticism & controversies

Over the years, Sony has faced a number of allegations and criticisms pertaining to its corporate behavior, often leading to legal proceedings and customer dissatisfaction. In August 2000, then Sony Pictures Entertainment U.S. senior vice president Steve Heckler was quoted saying "The industry will take whatever steps it needs to protect itself and protect its revenue streams ...".
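To make the earnings-yield remark above concrete: because earnings yield is simply the inverse of the price-to-earnings ratio, a yield below 5% is the same statement as a P/E above 20. A minimal sketch with hypothetical numbers:

    // Hypothetical illustration of the earnings-yield relationship described above;
    // the P/E value here is made up and is not a real Sony figure.
    public class EarningsYieldSketch {
        public static void main(String[] args) {
            double priceToEarnings = 25.0;                 // assumed P/E ratio
            double earningsYield = 1.0 / priceToEarnings;  // earnings yield = 1 / (P/E)
            System.out.printf("P/E %.0f -> earnings yield %.1f%%%n",
                    priceToEarnings, earningsYield * 100); // P/E 25 -> earnings yield 4.0%
        }
    }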
Sony then worked on a DRM system that behaved like a rootkit in order to enforce its copyright claims upon users of music CDs. With respect to its gaming consoles, Sony has released updates that strip users of some originally advertised features, in order to save the company licensing fees or to protect itself from the modding community. On 1 April 2010, Sony released a patch for the PS3 that removed the ability to install OtherOS, after hackers had been looking for ways to exploit OtherOS to run homebrew software. Then, on 12 January 2011, Sony filed lawsuits against George Hotz (geohot) and fail0verflow for their efforts in exploiting the PS3. In December 2023, Sony announced that it would remove the Discovery app and its content, even if previously paid for, from its gaming consoles. In November 2025, Sony Semiconductor Israel and the headquarters of Sony Corporation in Japan jointly decided that Sony Israel would detach from Sony Corporation and recommence operations as an independent entity under its original name, Altair Semiconductor. In December 2025, Texas Attorney General Ken Paxton filed a lawsuit against Sony and four other smart TV manufacturers, alleging that the companies were illegally "spying on Texans by secretly recording what consumers watch in their own homes" using automated content recognition (ACR) technology. In November 2011, Sony was ranked ninth (jointly with Panasonic) in Greenpeace's Guide to Greener Electronics, which grades major electronics companies on their environmental work. The company scored 3.6/10, incurring a penalty point for comments it had made in opposition to energy efficiency standards in California; it also risked a further penalty point in future editions for being a member of trade associations that had commented against energy efficiency standards. Together with Philips, Sony received the highest score for energy policy advocacy after calling on the EU to adopt an unconditional 30% reduction target for greenhouse gas emissions by 2020, and it received full marks for the efficiency of its products. In June 2007, Sony ranked 14th on the Greenpeace guide, falling from its earlier 11th-place ranking due to Greenpeace's claims that Sony had double standards in its waste policies. As of May 2018, Greenpeace's 2017 Guide to Greener Electronics rated Sony approximately in the middle among electronics manufacturers, with a grade of D+. Since 1976, Sony has held an Environmental Conference. Sony's policies address its effects on global warming, the environment, and resources. The company is taking steps to reduce its greenhouse gas emissions, as well as regulating the products it sources from suppliers in a process it calls "green procurement". Sony has said that it has signed on to have about 75 percent of its Sony Building running on geothermal power. The "Sony Take Back Recycling Program" allows consumers to recycle the electronics products they buy from Sony by taking them to eCycle (recycling) drop-off points around the U.S. The company has also developed a biobattery that runs on sugars and carbohydrates, generating power in a way similar to living creatures; at the time, it was the most powerful small biobattery to date. In 2000, Sony faced criticism for a document entitled "NGO Strategy" that was leaked to the press.
The document detailed the company's surveillance of environmental activists and its plans to counter their movements. It specifically mentioned environmental groups that were trying to pass laws holding electronics-producing companies responsible for the cleanup of the toxic chemicals contained in their merchandise. In 2007, an investigation launched in 2002 by the European Commission culminated in Sony, Fuji and Maxell receiving fines totaling US$110 million for fixing professional videotape prices between 1999 and 2002 through regular meetings and other illegal contacts; at the time, the three corporations together controlled 85% of the market. Sony's share of the fine was raised by a third for trying to obstruct the investigation, both by refusing to answer inquiries made by EU officials and by shredding evidence during the multiple law-enforcement raids. In 2001, the year before the investigation began, Sony sold professional videotapes totaling €115 million inside the EU.
========================================
[SOURCE: https://en.wikipedia.org/wiki/List_of_programming_languages#R] | [TOKENS: 115]
List of programming languages

This is an index of notable programming languages, in current or historical use. Dialects of BASIC (which have their own page), esoteric programming languages, and markup languages are not included. A programming language does not need to be imperative or Turing-complete, but it must be executable; the list therefore excludes markup languages such as HTML and XML, but includes domain-specific languages such as SQL and its dialects.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Minecraft#cite_ref-131] | [TOKENS: 12858]
Minecraft

Minecraft is a sandbox game developed and published by Mojang Studios. Following its initial public alpha release in 2009, it was formally released in 2011 for personal computers. The game has since been ported to numerous platforms, including mobile devices and various video game consoles. In Minecraft, players explore a procedurally generated world with virtually infinite terrain made up of voxels (cubes). They can discover and extract raw materials, craft tools and items, build structures, fight hostile mobs, and cooperate with or compete against other players in multiplayer. The game's large community offers a wide variety of user-generated content, such as modifications, servers, player skins, texture packs, and custom maps, which add new game mechanics and possibilities. Originally created by Markus "Notch" Persson using the Java programming language, the game's development was handed to Jens "Jeb" Bergensten following its full release. In 2014, Mojang and the Minecraft intellectual property were purchased by Microsoft for US$2.5 billion; Xbox Game Studios holds the publishing rights for the Bedrock Edition, the unified cross-platform version which evolved from the Pocket Edition codebase[i] and replaced the legacy console versions. Bedrock is updated concurrently with Mojang's original Java Edition, although with numerous, generally small, differences. Minecraft is the best-selling video game in history, with over 350 million copies sold. It has received critical acclaim, winning several awards and being cited as one of the greatest video games of all time. Social media, parodies, adaptations, merchandise, and the annual Minecon conventions have played prominent roles in popularizing it. The wider Minecraft franchise includes several spin-off games, such as Minecraft: Story Mode, Minecraft Dungeons, and Minecraft Legends. A film adaptation, titled A Minecraft Movie, was released in 2025 and became the second highest-grossing video game film of all time.

Gameplay

Minecraft is a 3D sandbox video game with no required goals to accomplish, giving players a large amount of freedom in choosing how to play; an optional achievement system is available. Gameplay is in the first-person perspective by default, but players have the option of third-person perspectives. The game world is composed of rough 3D objects, mainly cubes referred to as blocks, representing various materials such as dirt, stone, ores, tree trunks, water, and lava. The core gameplay revolves around picking up and placing these objects. The blocks are arranged in a voxel grid, while players can move freely around the world. Players can break, or mine, blocks and then place them elsewhere, enabling them to build things. Very few blocks are affected by gravity; the rest maintain their voxel position even in the air. Players can also craft a wide variety of items, such as armor, which mitigates damage from attacks; weapons (such as swords or bows and arrows), which allow monsters and animals to be killed more easily; and tools (such as pickaxes or shovels), which break certain types of blocks more quickly. Some items have multiple tiers depending on the material used to craft them, with higher-tier items being more effective and durable. Players may also freely craft helpful blocks, such as furnaces, which can cook food and smelt ores, and torches, which produce light, or exchange items with villagers (NPCs) by trading emeralds for different goods and vice versa.
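The block-based world model described above is essentially a three-dimensional array of cell types. The toy sketch below illustrates that voxel-grid idea, with mining and placing as symmetric read/write operations; it is a minimal illustration, not Mojang's actual implementation, which streams the world in 16x16-block columns ("chunks") rather than holding one fixed array.

    // A toy voxel grid: the world as a 3D lattice of block types addressed by
    // integer coordinates. Illustrative only; not Minecraft's real data layout.
    public class VoxelGridSketch {
        enum Block { AIR, DIRT, STONE, WOOD }

        private final Block[][][] grid;

        VoxelGridSketch(int sizeX, int sizeY, int sizeZ) {
            grid = new Block[sizeX][sizeY][sizeZ];
            for (Block[][] plane : grid)
                for (Block[] column : plane)
                    java.util.Arrays.fill(column, Block.AIR); // start with an empty world
        }

        // "Mining" removes a block from the world and hands it to the player.
        Block mine(int x, int y, int z) {
            Block mined = grid[x][y][z];
            grid[x][y][z] = Block.AIR;
            return mined;
        }

        // "Placing" is the inverse operation.
        void place(int x, int y, int z, Block block) {
            grid[x][y][z] = block;
        }

        public static void main(String[] args) {
            VoxelGridSketch world = new VoxelGridSketch(16, 16, 16);
            world.place(1, 2, 3, Block.STONE);
            System.out.println(world.mine(1, 2, 3)); // prints STONE
        }
    }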
The game has an inventory system, allowing players to carry a limited number of items. The in-game time system follows a day and night cycle, with one full cycle lasting 20 real-time minutes. The game also contains a material called redstone, which can be used to make primitive mechanical devices, electrical circuits, and logic gates, allowing for the construction of many complex systems. New players are given a randomly selected default character skin from nine possibilities, including Steve and Alex, but are able to create and upload their own skins. Players encounter various mobs (short for mobile entities), including animals, villagers, and hostile creatures. Passive mobs, such as cows, pigs, and chickens, spawn during the daytime and can be hunted for food and crafting materials, while hostile mobs, including large spiders, witches, skeletons, and zombies, spawn during nighttime or in dark places such as caves. Some hostile mobs, such as zombies and skeletons, burn under the sun if they have no headgear and are not standing in water. Other creatures unique to Minecraft include the creeper (an exploding creature that sneaks up on the player) and the enderman (a creature with the ability to teleport as well as pick up and place blocks). There are also variants of mobs that spawn in different conditions; for example, zombies have husk and drowned variants that spawn in deserts and oceans, respectively. The Minecraft environment is procedurally generated as players explore it, using a map seed that is randomly chosen at the time of world creation or manually specified by the player. Divided into biomes representing different environments with unique resources and structures, worlds are designed to be effectively infinite in traditional gameplay, though technical limits on the player have existed throughout development, both intentional and not. The implementation of horizontally infinite generation initially resulted in a glitch termed the "Far Lands", over 12 million blocks away from the world center, where terrain generated as wall-like, fissured patterns. The Far Lands and associated glitches were considered the effective edge of the world until they were resolved; the current horizontal limit is instead a special impassable barrier called the world border, located 30 million blocks away. Vertical space is comparatively limited, with an unbreakable bedrock layer at the bottom and a building limit several hundred blocks into the sky. Minecraft features three independent dimensions, accessible through portals and providing alternate game environments. The Overworld is the starting dimension and represents the real world, with a terrestrial surface setting including plains, mountains, forests, oceans, caves, and small sources of lava. The Nether is a hell-like underworld dimension, accessed via an obsidian portal and composed mainly of lava. Mobs that populate the Nether include shrieking, fireball-shooting ghasts, alongside anthropomorphic pigs called piglins and their zombified counterparts. Piglins in particular have a bartering system, in which players can give them gold ingots and receive items in return. Structures known as Nether Fortresses generate in the Nether, containing mobs such as wither skeletons and blazes, which can drop the blaze rods needed to access the End dimension. The player can also choose to build an optional boss mob known as the Wither, using skulls obtained from wither skeletons and soul sand.
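The map-seed mechanism described above is what makes worlds reproducible: terrain is derived deterministically from the seed, so the same seed always yields the same world without any terrain being stored. The sketch below illustrates the principle by mixing the seed with column coordinates to get a deterministic per-column surface height; the mixing constants and height range are arbitrary choices for this toy example, and real Minecraft terrain is generated with layered gradient noise rather than anything this simple.

    import java.util.Random;

    // Toy seed-based terrain: same seed in, same "world" out.
    public class SeededTerrainSketch {
        static int surfaceHeight(long worldSeed, int x, int z) {
            // Mix the world seed with the column coordinates so every column gets
            // its own deterministic random stream (constants are arbitrary odd numbers).
            Random r = new Random(worldSeed ^ (x * 341873128712L + z * 132897987541L));
            return 60 + r.nextInt(8); // a surface height between y=60 and y=67
        }

        public static void main(String[] args) {
            long seed = 123456789L;
            // Identical inputs give identical terrain: the defining property of a map seed.
            System.out.println(surfaceHeight(seed, 10, 20) == surfaceHeight(seed, 10, 20)); // true
            for (int x = 0; x < 8; x++)
                System.out.print(surfaceHeight(seed, x, 0) + " "); // a reproducible height strip
        }
    }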
The End can be reached through an end portal, which consists of twelve end portal frames. End portals are found in underground Overworld structures known as strongholds. To find strongholds, players must craft eyes of ender using an ender pearl and blaze powder. Eyes of ender can then be thrown, traveling in the direction of the stronghold. Once the player reaches the stronghold, they can place eyes of ender into each portal frame to activate the end portal. The dimension consists of islands floating in a dark, bottomless void. A boss enemy called the Ender Dragon guards the largest, central island. Killing the dragon opens access to an exit portal which, when entered, cues the game's ending credits and the End Poem, a roughly 1,500-word work written by Irish novelist Julian Gough. Taking about nine minutes to scroll past, it is the game's only narrative text and the only text of significant length directed at the player. At the conclusion of the credits, the player is teleported back to their respawn point and may continue the game indefinitely. In Survival mode, players have to gather natural resources such as wood and stone found in the environment in order to craft certain blocks and items. Depending on the difficulty, monsters spawn in darker areas outside a certain radius of the character, requiring players to build a shelter in order to survive the night. The mode also has a health bar, which is depleted by attacks from mobs, falls, drowning, falling into lava, suffocation, starvation, and other events. Players also have a hunger bar, which must be periodically refilled by eating food in-game, except on peaceful difficulty. If the hunger bar is empty, the player starves. Health replenishes when players have a full hunger bar or, on peaceful difficulty, continuously. Upon losing all health, players die, and the items in their inventory are dropped unless the game is reconfigured not to do so. Players then respawn at their spawn point, which by default is where they first spawned in the game and which can be changed by sleeping in a bed or using a respawn anchor. Dropped items can be recovered if players reach them before they despawn five minutes later. Players may acquire experience points (commonly referred to as "xp" or "exp") by killing mobs and other players, mining, smelting ores, breeding animals, and cooking food. Experience can then be spent on enchanting tools, armor and weapons; enchanted items are generally more powerful, last longer, or have other special effects. The game features two more game modes based on Survival, known as Hardcore mode and Adventure mode. Hardcore mode plays identically to Survival mode, but with the game's difficulty setting locked to "Hard" and with permadeath, forcing players to delete the world or explore it as a spectator after dying. Adventure mode was added to the game in a post-launch update; it prevents players from directly modifying the game's world and was designed primarily for use in custom maps, allowing map designers to let players experience the map as intended. In Creative mode, players have access to an infinite number of all resources and items in the game through the inventory menu and can place or mine them instantly. Players can toggle the ability to fly freely around the game world at will, and their characters do not take damage and are not affected by hunger. The game mode helps players focus on building and creating projects of any size without disturbance.
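Several of the durations quoted above (the 20-minute day-night cycle, the five-minute item despawn) line up with the fixed simulation rate Minecraft is widely documented to use, 20 game "ticks" per second; that rate is not stated in this article, so treat it here as an outside assumption. A small conversion sketch:

    // Converting the real-time durations mentioned above into game ticks,
    // assuming the commonly documented rate of 20 ticks per second.
    public class TickTimingSketch {
        static final int TICKS_PER_SECOND = 20; // assumed simulation rate

        static int minutesToTicks(double minutes) {
            return (int) (minutes * 60 * TICKS_PER_SECOND);
        }

        public static void main(String[] args) {
            System.out.println(minutesToTicks(20)); // full day-night cycle: 24000 ticks
            System.out.println(minutesToTicks(5));  // dropped-item despawn:  6000 ticks
        }
    }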
Multiplayer in Minecraft enables multiple players to interact and communicate with each other in a single world. It is available through direct game-to-game multiplayer, local area network (LAN) play, local split screen (console-only), and servers (player-hosted and business-hosted). Players can run their own server by making a realm, using a host provider, or hosting one themselves, or they can connect directly to another player's game via Xbox Live, PlayStation Network or Nintendo Switch Online. Single-player worlds have LAN support, allowing players to join a world on locally interconnected computers without a server setup. Minecraft multiplayer servers are guided by server operators, who have access to server commands such as setting the time of day (/time set day) and teleporting players (/tp). Operators can also set up restrictions concerning which usernames or IP addresses are allowed or disallowed to enter the server. Multiplayer servers offer a wide range of activities, with some servers having their own unique rules and customs. The largest and most popular server is Hypixel, which has been visited by over 14 million unique players. Player versus player combat (PvP) can be enabled to allow fighting between players. In 2013, Mojang announced Minecraft Realms, a server hosting service intended to enable players to run server multiplayer games easily and safely without having to set up their own. Unlike a standard server, only invited players can join Realms servers, and these servers do not use server addresses. Minecraft: Java Edition Realms server owners can invite up to twenty people to play on their server, with up to ten players online at a time; Bedrock Edition Realms server owners can invite up to 3,000 people, likewise with up to ten players online at one time. Java Edition Realms servers do not support user-made plugins, but players can play custom Minecraft maps; Bedrock Edition Realms servers support user-made add-ons, resource packs, behavior packs, and custom Minecraft maps. At Electronic Entertainment Expo 2016, support for cross-platform play between Windows 10, iOS, and Android platforms was added through Realms starting in June 2016, with Xbox One and Nintendo Switch support to come later in 2017, along with support for virtual reality devices. On 31 July 2017, Mojang released the beta version of the update allowing cross-platform play. Nintendo Switch support for Realms was released in July 2018. The modding community consists of fans, users and third-party programmers. Using a variety of application programming interfaces that have arisen over time, they have produced a wide variety of downloadable content for Minecraft, such as modifications, texture packs and custom maps. Modifications of the Minecraft code, called mods, add a variety of gameplay changes, ranging from new blocks, items, and mobs to entire arrays of mechanisms. The modding community is responsible for a substantial supply of mods, from ones that enhance gameplay, such as mini-maps, waypoints, and durability counters, to ones that add elements from other video games and media to the game. While a variety of mod frameworks were independently developed by reverse engineering the code, Mojang has also enhanced vanilla Minecraft with official frameworks for modification, allowing the production of community-created resource packs, which alter certain game elements, including textures and sounds.
Players can also create their own "maps" (custom world save files) that often contain specific rules, challenges, puzzles and quests, and share them for others to play. Mojang added an adventure mode in August 2012 and "command blocks" in October 2012, both created specially for custom maps in the Java Edition. Data packs, introduced in version 1.13 of the Java Edition, allow further customization, including the ability to add new advancements, dimensions, functions, loot tables, predicates, recipes, structures, tags, and world generation. The Xbox 360 Edition supported downloadable content, which was available to purchase via the Xbox Games Store; these content packs usually contained additional character skins. The edition later received support for texture packs in its twelfth title update, while introducing "mash-up packs", which combined texture packs with skin packs and changes to the game's sounds, music and user interface. The first mash-up pack (and, by extension, the first texture pack) for the Xbox 360 Edition was released on 4 September 2013 and was themed after the Mass Effect franchise. Unlike the Java Edition, however, the Xbox 360 Edition did not support player-made mods or custom maps. A cross-promotional resource pack based on Nintendo's Super Mario franchise was released exclusively for the Wii U Edition worldwide on 17 May 2016, and was later bundled free with the Nintendo Switch Edition at launch. Another, based on Fallout, was released on consoles that December, and for Windows and mobile in April 2017. In April 2018, malware was discovered in several downloadable user-made Minecraft skins for use with the Java Edition of the game. Avast stated that nearly 50,000 accounts were infected and that, when activated, the malware would attempt to reformat the user's hard drive. Mojang promptly patched the issue and released a statement noting that "the code would not be run or read by the game itself" and would run only when the image containing the skin was opened. In June 2017, Mojang released the "1.1 Discovery Update" to the Pocket Edition of the game, which later became the Bedrock Edition. The update introduced the "Marketplace", a catalogue of purchasable user-generated content intended to give Minecraft creators "another way to make a living from the game". Various skins, maps, texture packs and add-ons from different creators can be bought with "Minecoins", a digital currency purchased with real money. Additionally, users can access specific content with a subscription service titled "Marketplace Pass". Alongside content from independent creators, the Marketplace also houses items published by Mojang and Microsoft themselves, as well as official collaborations between Minecraft and other intellectual properties. By 2022, the Marketplace had over 1.7 billion content downloads, generating over $500 million in revenue.

Development

Before creating Minecraft, Markus "Notch" Persson was a game developer at King, where he worked until March 2009. At King, he primarily developed browser games and learned several programming languages. During his free time, he prototyped his own games, often drawing inspiration from other titles, and was an active participant on the TIGSource forums for independent developers. One such project was "RubyDung", a base-building game inspired by Dwarf Fortress, but with an isometric, three-dimensional perspective similar to RollerCoaster Tycoon.
Development

Before creating Minecraft, Markus "Notch" Persson was a game developer at King, where he worked until March 2009. At King, he primarily developed browser games and learned several programming languages. During his free time, he prototyped his own games, often drawing inspiration from other titles, and was an active participant on the TIGSource forums for independent developers. One such project was "RubyDung", a base-building game inspired by Dwarf Fortress, but with an isometric, three-dimensional perspective similar to RollerCoaster Tycoon. Among the features in RubyDung that he explored was a first-person view similar to Dungeon Keeper, though he ultimately discarded the idea, feeling the graphics were too pixelated at the time. Around March 2009, Persson left King and joined jAlbum, while continuing to work on his prototypes. Infiniminer, a block-based open-ended mining game first released in April 2009, inspired Persson's vision for RubyDung's future direction. Infiniminer heavily influenced the visual style of the gameplay, including bringing back the first-person mode, the "blocky" visual style and the block-building fundamentals. However, unlike Infiniminer, Persson wanted Minecraft to have RPG elements.

The first public alpha build of Minecraft was released on 17 May 2009 on TIGSource. Over the years, Persson regularly released test builds that added new features, including tools, mobs, and entire new dimensions. Partly due to the game's rising popularity, Persson decided to release a full 1.0 version—the second part of the "Adventure Update"—on 18 November 2011. Shortly after, Persson stepped down from development, handing the project's lead to Jens "Jeb" Bergensten.

On 15 September 2014, Microsoft, the developer behind the Microsoft Windows operating system and Xbox video game console, announced a $2.5 billion acquisition of Mojang, which included the Minecraft intellectual property. Persson had suggested the deal on Twitter, asking a corporation to buy his stake in the game after receiving criticism for enforcing terms in the game's end-user license agreement (EULA), which had been in place for the previous three years. According to Persson, Mojang CEO Carl Manneh received a call from a Microsoft executive shortly after the tweet, asking if Persson was serious about a deal. Mojang was also approached by other companies, including Activision Blizzard and Electronic Arts. The deal with Microsoft was finalized on 6 November 2014 and led to Persson becoming one of Forbes' "World's Billionaires".

After 2014, Minecraft's primary versions usually received annual major updates—free to players who had purchased the game—each primarily centered around a specific theme. For instance, version 1.13, the Update Aquatic, focused on ocean-related features, while version 1.16, the Nether Update, introduced significant changes to the Nether dimension. However, in late 2024, Mojang announced a shift in their update strategy; rather than releasing large updates annually, they opted for a more frequent release schedule with smaller, incremental updates, stating, "We know that you want new Minecraft content more often." The Bedrock Edition has also received regular updates, now matching the themes of the Java Edition updates. Other versions of the game, such as various console editions and the Pocket Edition, were either merged into Bedrock or discontinued and have not received further updates.

On 7 May 2019, coinciding with Minecraft's 10th anniversary, a JavaScript recreation of an old 2009 Java Edition build named Minecraft Classic was made available to play online for free. On 16 April 2020, a Bedrock Edition-exclusive beta version of Minecraft, called Minecraft RTX, was released by Nvidia. It introduced physically-based rendering, real-time path tracing, and DLSS for RTX-enabled GPUs. The public release was made available on 8 December 2020.
Path tracing can only be enabled in supported worlds, which can be downloaded for free via the in-game Minecraft Marketplace, with a texture pack from Nvidia's website, or with compatible third-party texture packs; it cannot be enabled on arbitrary worlds with arbitrary texture packs. Initially, Minecraft RTX was affected by many bugs, display errors, and instability issues. On 22 March 2025, a new visual mode called Vibrant Visuals, an optional graphical overhaul similar to Minecraft RTX, was announced. It promises modern rendering features—such as dynamic shadows, screen space reflections, volumetric fog, and bloom—without the need for RTX-capable hardware. Vibrant Visuals was released as part of the Chase the Skies update on 17 June 2025 for Bedrock Edition and is planned for release on Java Edition at a later date.

Development of the original edition of Minecraft—then known as Cave Game, and now known as the Java Edition—began in May 2009.[k] On 13 May, Persson released a test video on YouTube of an early version of the game, dubbed the "Cave game tech test" or the "Cave game tech demo". The game was named Minecraft: Order of the Stone the next day, after a suggestion made by a player. "Order of the Stone" came from the webcomic The Order of the Stick, while "Minecraft" was chosen "because it's a good name". The title was later shortened to just Minecraft, omitting the subtitle. Persson completed the game's base programming over a weekend in May 2009, and private testing began on TigIRC on 16 May. The first public release followed on 17 May 2009 as a developmental version shared on the TIGSource forums. Based on feedback from forum users, Persson continued updating the game. This initial public build later became known as Classic. Further developmental phases—dubbed Survival Test, Indev, and Infdev—were released throughout 2009 and 2010.

The first major update, known as Alpha, was released on 30 June 2010. At the time, Persson was still working a day job at jAlbum but later resigned to focus on Minecraft full-time as sales of the alpha version surged. Updates were distributed automatically, introducing new blocks, items, mobs, and changes to game mechanics such as water flow. With revenue generated from the game, Persson founded Mojang, a video game studio, alongside former colleagues Jakob Porser and Carl Manneh. On 11 December 2010, Persson announced that Minecraft would enter its beta phase on 20 December. He assured players that bug fixes and all pre-release updates would remain free. As development progressed, Mojang expanded, hiring additional employees to work on the project. The game officially exited beta and launched in full on 18 November 2011. On 1 December 2011, Jens "Jeb" Bergensten took full creative control over Minecraft, replacing Persson as lead designer.

On 28 February 2012, Mojang announced the hiring of the developers behind Bukkit, a popular developer API for Minecraft servers, to improve Minecraft's support of server modifications. This move included Mojang taking apparent ownership of the CraftBukkit server mod, though the acquisition later became controversial and its legitimacy was questioned because CraftBukkit is open-source and licensed under the GNU General Public License and Lesser General Public License.

In August 2011, Minecraft: Pocket Edition was released as an early alpha for the Xperia Play via the Android Market, later expanding to other Android devices on 8 October 2011. The iOS version followed on 17 November 2011.
A port was made available for Windows Phones shortly after Microsoft acquired Mojang. Unlike Java Edition, Pocket Edition initially focused on Minecraft's creative building and basic survival elements but lacked many features of the PC version. Bergensten confirmed on Twitter that the Pocket Edition was written in C++ rather than Java, as iOS does not support Java. On 10 December 2014, a port of Pocket Edition was released for Windows Phone 8.1. In July 2015, a port of the Pocket Edition to Windows 10 was released as the Windows 10 Edition, with full cross-play with other Pocket versions. In January 2017, Microsoft announced that it would no longer maintain the Windows Phone versions of Pocket Edition. On 20 September 2017, with the "Better Together Update", the Pocket Edition was ported to the Xbox One and renamed the Bedrock Edition.

The console versions of Minecraft debuted with the Xbox 360 Edition, developed by 4J Studios and released on 9 May 2012. Announced as part of the Xbox Live Arcade NEXT promotion, this version introduced a redesigned crafting system, a new control interface, in-game tutorials, split-screen multiplayer, and online play via Xbox Live. Unlike the PC version, its worlds were finite, bordered by invisible walls. Initially, the Xbox 360 version resembled outdated PC versions but received updates to bring it closer to Java Edition before eventually being discontinued. The Xbox One version launched on 5 September 2014, featuring larger worlds and support for more players.

Minecraft expanded to PlayStation platforms with PlayStation 3 and PlayStation 4 editions released on 17 December 2013 and 4 September 2014, respectively. Originally planned as a PS4 launch title, it was delayed before its eventual release. A PlayStation Vita version followed in October 2014. Like the Xbox versions, the PlayStation editions were developed by 4J Studios.

Nintendo platforms received Minecraft: Wii U Edition on 17 December 2015, with a physical release in North America on 17 June 2016 and in Europe on 30 June. The Nintendo Switch version launched via the eShop on 11 May 2017. During a Nintendo Direct presentation on 13 September 2017, Nintendo announced that Minecraft: New Nintendo 3DS Edition, based on the Pocket Edition, would be available for download immediately after the livestream, with a physical copy available at a later date. The game is compatible only with the New Nintendo 3DS and New Nintendo 2DS XL systems and does not work with the original 3DS or 2DS systems.

On 20 September 2017, the Better Together Update introduced Bedrock Edition across Xbox One, Windows 10, VR, and mobile platforms, enabling cross-play between these versions. Bedrock Edition later expanded to Nintendo Switch and PlayStation 4, with the latter receiving the update in December 2019, allowing cross-platform play for users with a free Xbox Live account. The Bedrock Edition released a native version for PlayStation 5 on 22 October 2024, while the Xbox Series X/S version launched on 17 June 2025. On 18 December 2018, the PlayStation 3, PlayStation Vita, Xbox 360, and Wii U versions of Minecraft received their final update and would later become known as "Legacy Console Editions". On 15 January 2019, the New Nintendo 3DS version of Minecraft received its final update, effectively becoming discontinued as well.

An educational version of Minecraft, designed for use in schools, launched on 1 November 2016. It is available on Android, ChromeOS, iPadOS, iOS, MacOS, and Windows.
On 20 August 2018, Mojang announced that it would bring Education Edition to iPadOS in autumn 2018; it was released on the App Store on 6 September 2018. On 27 March 2019, it was announced that the Education Edition would be operated by JD.com in China. On 26 June 2020, a public beta for the Education Edition was made available to Google Play Store-compatible Chromebooks. The full game was released on the Google Play Store for Chromebooks on 7 August 2020.

On 20 May 2016, China Edition (also known as My World) was announced as a localized edition for China, where it was released under a licensing agreement between NetEase and Mojang. The PC edition was released for public testing on 8 August 2017. The iOS version was released on 15 September 2017, and the Android version was released on 12 October 2017. The PC edition is based on the original Java Edition, while the iOS and Android mobile versions are based on the Bedrock Edition. The edition is free-to-play and had over 700 million registered accounts by September 2023.

The Windows version of Bedrock Edition is exclusive to Microsoft's Windows 10 and Windows 11 operating systems. The beta release for Windows 10 launched on the Windows Store on 29 July 2015. After nearly a year and a half in beta, Microsoft fully released the version on 19 December 2016. Called the "Ender Update", this release implemented new features such as world templates and add-on packs. On 7 June 2022, the Java and Bedrock Editions of Minecraft were merged into a single bundle for purchase on Windows; those who owned one version would automatically gain access to the other. Both game versions would otherwise remain separate.

Around 2011, prior to Minecraft's full release, Mojang collaborated with The Lego Group to create a Lego brick-based Minecraft game called Brickcraft. This would have modified the base Minecraft game to use Lego bricks, which meant adapting the basic 1×1 block to account for the larger pieces typically used in Lego sets. Persson worked on an early version called "Project Rex Kwon Do", named after a character from the film Napoleon Dynamite. Although Lego approved the project and Mojang assigned two developers to it for six months, it was canceled due to the Lego Group's demands, according to Mojang's Daniel Kaplan. Lego considered buying Mojang to complete the game, but when Microsoft offered over $2 billion for the company, Lego stepped back, unsure of Minecraft's potential. On 26 June 2025, a build of Brickcraft dated 28 June 2012 was published on the community archive website Omniarchive.

Initially, Markus Persson planned to support the Oculus Rift with a Minecraft port. However, after Facebook acquired Oculus in 2014, he abruptly canceled the plans, stating, "Facebook creeps me out." In 2016, a community-made mod, Minecraft VR, added VR support for Java Edition, followed by Vivecraft for the HTC Vive. Later that year, Microsoft introduced official Oculus Rift support for the Windows 10 Edition, leading to the discontinuation of the Minecraft VR mod due to trademark complaints. Vivecraft was endorsed by Minecraft VR's contributors for its Rift support. Also available is a Gear VR version, titled Minecraft: Gear VR Edition. Windows Mixed Reality support was added in 2017. On 7 September 2020, Mojang Studios announced that the PlayStation 4 Bedrock version would receive PlayStation VR support later that month.
In September 2024, the Minecraft team announced they would no longer support PlayStation VR, which received its final update in March 2025.

Music and sound design

Minecraft's music and sound effects were produced by German musician Daniel Rosenfeld, better known as C418. To create the sound effects for the game, Rosenfeld made extensive use of Foley techniques. On learning the process, he remarked, "Foley's an interesting thing, and I had to learn its subtleties. Early on, I wasn't that knowledgeable about it. It's a whole trial-and-error process. You just make a sound and eventually you go, 'Oh my God, that's it! Get the microphone!' There's no set way of doing anything at all." He reminisced about creating the in-game sound for grass blocks, stating, "It turns out that to make grass sounds you don't actually walk on grass and record it, because grass sounds like nothing. What you want to do is get a VHS, break it apart, and just lightly touch the tape." According to Rosenfeld, his favorite sound to design for the game was the hisses of spiders. He elaborated, "I like the spiders. Recording that was a whole day of me researching what a spider sounds like. Turns out, there are spiders that make little screeching sounds, so I think I got this recording of a fire hose, put it in a sampler, and just pitched it around until it sounded like a weird spider was talking to you."

Many of Rosenfeld's sound design decisions were made accidentally or spontaneously. The creeper notably lacks any specific noises apart from a loud fuse-like sound when it is about to explode; Rosenfeld later recalled, "That was just a complete accident by Markus and me [sic]. We just put in a placeholder sound of burning a matchstick. It seemed to work hilariously well, so we kept it." On other sounds, such as those of the zombie, Rosenfeld remarked, "I actually never wanted the zombies so scary. I intentionally made them sound comical. It's nice to hear that they work so well [...]." Rosenfeld remarked that the sound engine was "terrible" to work with, remembering, "If you had two song files at once, it [the game engine] would actually crash. There were so many more weird glitches like that the guys never really fixed because they were too busy with the actual game and not the sound engine."

The background music in Minecraft consists of instrumental ambient music. To compose it, Rosenfeld used Ableton Live along with several additional plug-ins. Speaking of the plug-ins, Rosenfeld said, "They can be pretty much everything from an effect to an entire orchestra. Additionally, I've got some synthesizers that are attached to the computer. Like a Moog Voyager, Dave Smith Prophet 08 and a Virus TI." On 4 March 2011, Rosenfeld released a soundtrack titled Minecraft – Volume Alpha; it includes most of the tracks featured in Minecraft, as well as other music not featured in the game. Kirk Hamilton of Kotaku chose the music in Minecraft as one of the best video game soundtracks of 2011. On 9 November 2013, Rosenfeld released the second official soundtrack, titled Minecraft – Volume Beta, which included music added in the game's 2013 "Music Update". A physical release of Volume Alpha, consisting of CDs, black vinyl, and limited-edition transparent green vinyl LPs, was issued by indie electronic label Ghostly International on 21 August 2015.
On 14 August 2020, Ghostly released Volume Beta on CD and vinyl, with alternate-color LPs and lenticular cover pressings released in limited quantities. The final update Rosenfeld worked on was 2018's 1.13 Update Aquatic. His music remained the only music in the game until 2020's "Nether Update", which introduced pieces from Lena Raine. Since then, other composers have made contributions, including Kumi Tanioka, Samuel Åberg, Aaron Cherof, and Amos Roddy, with Raine as the new primary composer. Ownership of all music besides Rosenfeld's independently released albums has been retained by Microsoft, with their label publishing all of the other artists' releases. Gareth Coker also composed some of the music for the mini games of the Legacy Console editions.

Rosenfeld had stated his intent to create a third album of music for the game in a 2015 interview with Fact, and confirmed its existence in a 2017 tweet, stating that his work on the record had by then grown longer than the previous two albums combined, which together run over 3 hours and 18 minutes. However, due to licensing issues with Microsoft, the third volume has not been released. On 8 January 2021, Rosenfeld was asked in an interview with Anthony Fantano whether a third volume of his music was still intended for release. Rosenfeld responded, "I have something—I consider it finished—but things have become complicated, especially as Minecraft is now a big property, so I don't know."

Reception

Minecraft has received critical acclaim, with praise for the creative freedom it grants players in-game, as well as the ease of enabling emergent gameplay. Critics have expressed enjoyment of Minecraft's complex crafting system, commenting that it is an important aspect of the game's open-ended gameplay. Most publications were impressed by the game's "blocky" graphics, with IGN describing them as "instantly memorable". Reviewers also liked the game's adventure elements, noting that the game strikes a good balance between exploring and building. The game's multiplayer feature has been generally received favorably, with IGN commenting that "adventuring is always better with friends". Jaz McDougall of PC Gamer said Minecraft is "intuitively interesting and contagiously fun, with an unparalleled scope for creativity and memorable experiences". It has been regarded as having introduced millions of children to the digital world, insofar as its basic game mechanics are logically analogous to computer commands.

IGN was disappointed by the troublesome steps needed to set up multiplayer servers, calling the process a "hassle". Critics also said that visual glitches occur periodically. Despite its release out of beta in 2011, GameSpot said the game had an "unfinished feel", adding that some game elements seem "incomplete or thrown together in haste". A review of the alpha version by Scott Munro of the Daily Record called it "already something special" and urged readers to buy it. Jim Rossignol of Rock, Paper, Shotgun also recommended the alpha of the game, calling it "a kind of generative 8-bit Lego Stalker". On 17 September 2010, gaming webcomic Penny Arcade began a series of comics and news posts about the addictiveness of the game. The Xbox 360 version was generally received positively by critics, but did not receive as much praise as the PC version.
Although reviewers were disappointed by the lack of features such as mod support and content from the PC version, they acclaimed the port's addition of a tutorial, in-game tips, and crafting recipes, saying that these make the game more user-friendly. The Xbox One Edition was one of the best-received ports, praised for its relatively large worlds. The PlayStation 3 Edition also received generally favorable reviews, being compared to the Xbox 360 Edition and praised for its well-adapted controls. The PlayStation 4 edition was the best-received port to date, praised for having worlds 36 times larger than the PlayStation 3 edition's and described as nearly identical to the Xbox One edition. The PlayStation Vita Edition received generally positive reviews from critics but was noted for its technical limitations. The Wii U version received generally positive reviews from critics but was noted for a lack of GamePad integration. The 3DS version received mixed reviews, being criticized for its high price, technical issues, and lack of cross-platform play. The Nintendo Switch Edition received fairly positive reviews from critics, being praised, like other modern ports, for its relatively larger worlds.

Minecraft: Pocket Edition initially received mixed reviews from critics. Although reviewers appreciated the game's intuitive controls, they were disappointed by the lack of content. The inability to collect resources and craft items, as well as the limited types of blocks and lack of hostile mobs, were especially criticized. After updates added more content, Pocket Edition started receiving more positive reviews. Reviewers complimented the controls and the graphics, but still noted a lack of content.

Minecraft surpassed a million purchases less than a month after entering its beta phase in early 2011. At the same time, the game had no publisher backing and had never been commercially advertised except through word of mouth and various unpaid references in popular media such as the Penny Arcade webcomic. By April 2011, Persson estimated that Minecraft had made €23 million (US$33 million) in revenue, with 800,000 sales of the alpha version of the game and over 1 million sales of the beta version. In November 2011, prior to the game's full release, Minecraft beta surpassed 16 million registered users and 4 million purchases. By March 2012, Minecraft had become the 6th best-selling PC game of all time. As of 10 October 2014, the game had sold 17 million copies on PC, becoming the best-selling PC game of all time. On 25 February 2014, the game reached 100 million registered users. By May 2019, 180 million copies had been sold across all platforms, making it the single best-selling video game of all time. The free-to-play Minecraft China version had over 700 million registered accounts by September 2023. By 2023, the game had sold over 300 million copies. As of April 2025, Minecraft has sold over 350 million copies.

The Xbox 360 version of Minecraft became profitable within the first day of the game's release in 2012, when the game broke the Xbox Live sales records with 400,000 players online. Within a week of being on the Xbox Live Marketplace, Minecraft sold a million copies. GameSpot announced in December 2012 that Minecraft had sold over 4.48 million copies since the game debuted on Xbox Live Arcade in May 2012. In 2012, Minecraft was the most purchased title on Xbox Live Arcade; it was also the fourth most played title on Xbox Live based on average unique users per day.
As of 4 April 2014, the Xbox 360 version had sold 12 million copies. In addition, Minecraft: Pocket Edition had reached 21 million in sales. The PlayStation 3 Edition sold one million copies in five weeks. The release of the game's PlayStation Vita version boosted Minecraft sales by 79%, outselling both the PS3 and PS4 debut releases and becoming the largest Minecraft launch on a PlayStation console. The PS Vita version sold 100,000 digital copies in Japan within the first two months of release, according to an announcement by SCE Japan Asia. By January 2015, 500,000 digital copies of Minecraft had been sold in Japan across all PlayStation platforms, with a surge in primary school children purchasing the PS Vita version. As of 2022, the Vita version has sold over 1.65 million physical copies in Japan, making it the best-selling Vita game in the country. Minecraft helped improve Microsoft's total first-party revenue by $63 million for the 2015 second quarter. The game, including all of its versions, had over 112 million monthly active players by September 2019. On its 11th anniversary in May 2020, the company announced that Minecraft had reached over 200 million copies sold across platforms, with over 126 million monthly active players. By April 2021, the number of monthly active users had climbed to 140 million.

In July 2010, PC Gamer listed Minecraft as the fourth-best game to play at work. In December of that year, Good Game selected Minecraft as their choice for Best Downloadable Game of 2010, Gamasutra named it the eighth-best game of the year as well as the eighth-best indie game of the year, and Rock, Paper, Shotgun named it the "game of the year". Indie DB awarded the game the 2010 Indie of the Year award as chosen by voters, in addition to two out of five Editor's Choice awards, for Most Innovative and Best Singleplayer Indie. It was also awarded Game of the Year by PC Gamer UK. The game was nominated for the Seumas McNally Grand Prize, Technical Excellence, and Excellence in Design awards at the March 2011 Independent Games Festival and won the Grand Prize and the community-voted Audience Award. At the Game Developers Choice Awards 2011, Minecraft won awards in the categories of Best Debut Game, Best Downloadable Game and Innovation Award, winning every award for which it was nominated. It also won GameCity's video game arts award. On 5 May 2011, Minecraft was selected as one of the 80 games that would be displayed at the Smithsonian American Art Museum as part of The Art of Video Games exhibit that opened on 16 March 2012.

At the 2011 Spike Video Game Awards, Minecraft won the award for Best Independent Game and was nominated in the Best PC Game category. In 2012, at the British Academy Video Games Awards, Minecraft was nominated in the GAME Award of 2011 category and Persson received The Special Award. In 2012, Minecraft XBLA was awarded a Golden Joystick Award in the Best Downloadable Game category, and a TIGA Games Industry Award in the Best Arcade Game category. In 2013, it was nominated as the family game of the year at the British Academy Video Games Awards. During the 16th Annual D.I.C.E. Awards, the Academy of Interactive Arts & Sciences nominated the Xbox 360 version of Minecraft for "Strategy/Simulation Game of the Year". Minecraft Console Edition won the award for TIGA Game Of The Year in 2014. In 2015, the game placed 6th on USgamer's The 15 Best Games Since 2000 list. In 2016, Minecraft placed 6th on Time's The 50 Best Video Games of All Time list.
Minecraft was nominated for the 2013 Kids' Choice Awards for Favorite App, but lost to Temple Run. It was nominated for the 2014 Kids' Choice Awards for Favorite Video Game, but lost to Just Dance 2014. The game later won the award for the Most Addicting Game at the 2015 Kids' Choice Awards. In addition, the Java Edition was nominated for "Favorite Video Game" at the 2018 Kids' Choice Awards, while the game itself won the "Still Playing" award at the 2019 Golden Joystick Awards, as well as the "Favorite Video Game" award at the 2020 Kids' Choice Awards. Minecraft also won "Stream Game of the Year" at the inaugural Streamer Awards in 2021. The game later garnered a Nickelodeon Kids' Choice Award nomination for Favorite Video Game in 2021, and won the same category in 2022 and 2023. At the Golden Joystick Awards 2025, it won the Still Playing Award for PC and Console.

Minecraft has been subject to several notable controversies. In June 2014, Mojang announced that it would begin enforcing the portion of Minecraft's end-user license agreement (EULA) which prohibits servers from giving in-game advantages to players in exchange for donations or payments. Spokesperson Owen Hill stated that servers could still require players to pay a fee to access the server and could sell in-game cosmetic items. The change was supported by Persson, who cited emails he had received from parents of children who had spent hundreds of dollars on servers. The Minecraft community and server owners protested, arguing that the EULA's terms were broader than Mojang was claiming, that the crackdown would force smaller servers to shut down for financial reasons, and that Mojang was suppressing competition for its own Minecraft Realms subscription service. The controversy contributed to Notch's decision to sell Mojang.

In 2020, Mojang announced an eventual change to the Java Edition to require a login with a Microsoft account rather than a Mojang account, the latter of which would be sunsetted. This also required Java Edition players to create Xbox network Gamertags. Mojang defended the move to Microsoft accounts by saying it allowed improved security features, including two-factor authentication, blocking cyberbullies in chat, and improved parental controls. The community responded with intense backlash, citing various technical difficulties encountered in the process and the fact that account migration would be mandatory, even for those who do not play on servers. As of 10 March 2022, Microsoft required that all players migrate in order to maintain access to the Java Edition of Minecraft. Mojang announced a deadline of 19 September 2023 for account migration, after which all legacy Mojang accounts became inaccessible and unable to be migrated.

In June 2022, Mojang added a player-reporting feature to Java Edition. Players could report other players on multiplayer servers for sending messages prohibited by the Xbox Live Code of Conduct; report categories included profane language,[l] substance abuse, hate speech, threats of violence, and nudity. If a player was found to be in violation of Xbox Community Standards, they would be banned from all servers for a specific period of time or permanently. The update containing the report feature (1.19.1) was released on 27 July 2022. Mojang received substantial backlash and protest from community members, one of the most common complaints being that banned players would be forbidden from joining any server, even private ones.
Others took issue with what they saw as Microsoft increasing control over its player base and exercising censorship, leading some to start the hashtag #saveminecraft and dub the version "1.19.84", a reference to the dystopian novel Nineteen Eighty-Four.

The "Mob Vote" was an online event organized by Mojang in which the Minecraft community voted between three original mob concepts. Initially, the winning mob was to be implemented in a future update while the losing mobs were scrapped, though after the first vote this was changed so that losing mobs would have a chance to come to the game in the future. The first Mob Vote was held during Minecon Earth 2017 and became an annual event starting with Minecraft Live 2020. The Mob Vote was often criticized for forcing players to choose one mob instead of implementing all three, causing divisions and flaming within the community, and potentially allowing internet bots and Minecraft content creators with large fanbases to conduct vote brigading. The Mob Vote was also blamed for a perceived lack of new content added to Minecraft since Microsoft's acquisition of Mojang in 2014.

The 2023 Mob Vote featured three passive mobs—the crab, the penguin, and the armadillo—with voting scheduled to start on 13 October. In response, a Change.org petition was created on 6 October, demanding that Mojang eliminate the Mob Vote and instead implement all three mobs going forward. The petition received approximately 445,000 signatures by 13 October and was joined by calls to boycott the Mob Vote, as well as a partially tongue-in-cheek "revolutionary" propaganda campaign in which sympathizers created anti-Mojang and pro-boycott posters in the vein of real 20th-century propaganda posters. Mojang did not release an official response to the boycott, and the Mob Vote otherwise proceeded normally, with the armadillo winning. In September 2024, as part of a blog post detailing their future plans for Minecraft's development, Mojang announced that the Mob Vote would be retired.

Cultural impact

In September 2019, The Guardian classified Minecraft as the best video game of the 21st century to date, and in November 2019, Polygon called it the "most important game of the decade" in its 2010s "decade in review". In June 2020, Minecraft was inducted into the World Video Game Hall of Fame. Minecraft is recognized as one of the first successful games to use an early access model to draw in sales prior to its full release version to help fund development. As Minecraft helped to bolster indie game development in the early 2010s, it also helped to popularize the use of the early access model in indie game development.

Social media sites such as YouTube, Facebook, and Reddit have played a significant role in popularizing Minecraft. Research conducted by the Annenberg School for Communication at the University of Pennsylvania showed that one-third of Minecraft players learned about the game via Internet videos. In 2010, Minecraft-related videos began to gain influence on YouTube, often made by commentators. The videos usually contain screen-capture footage of the game and voice-overs. Common coverage in the videos includes creations made by players, walkthroughs of various tasks, and parodies of works in popular culture. By May 2012, over four million Minecraft-related YouTube videos had been uploaded. The game went on to be a prominent fixture of YouTube's gaming scene throughout the 2010s; in 2014, it was the second-most searched term on the entire platform.
By 2018, it was still YouTube's biggest game globally. Some popular commentators have received employment at Machinima, a now-defunct gaming video company that owned a highly watched entertainment channel on YouTube. The Yogscast is a British company that regularly produces Minecraft videos; their YouTube channel has attained billions of views, and their panel at Minecon 2011 had the highest attendance. Another well-known YouTube personality is Jordan Maron, known online as CaptainSparklez, who has created many Minecraft music parodies, including "Revenge", a parody of Usher's "DJ Got Us Fallin' in Love". Minecraft's popularity on YouTube was described by Polygon as quietly dominant, although in 2019, thanks in part to PewDiePie's playthroughs of the game, Minecraft experienced a visible uptick in popularity on the platform. Longer-running series include Far Lands or Bust, dedicated to reaching the obsolete "Far Lands" glitch on foot in an older version of the game. On 14 December 2021, YouTube announced that the total number of Minecraft-related views on the website had exceeded one trillion.

Minecraft has been referenced by other video games, such as Torchlight II, Team Fortress 2, Borderlands 2, Choplifter HD, Super Meat Boy, The Elder Scrolls V: Skyrim, The Binding of Isaac, The Stanley Parable, and FTL: Faster Than Light. Minecraft is officially represented in downloadable content for the crossover fighter Super Smash Bros. Ultimate, with Steve as a playable character with a moveset including references to building, crafting, and redstone, alongside an Overworld-themed stage. It was also referenced by electronic music artist Deadmau5 in his performances. The game is also referenced heavily in "Informative Murder Porn", the second episode of the seventeenth season of the animated television series South Park. In 2025, A Minecraft Movie was released; it made $313 million at the box office in its first week, a record-breaking opening for a video game adaptation. Minecraft has been noted as a cultural touchstone for Generation Z, as many of the generation's members played the game at a young age.

The possible applications of Minecraft have been discussed extensively, especially in the fields of computer-aided design (CAD) and education. In a panel at Minecon 2011, a Swedish developer discussed the possibility of using the game to redesign public buildings and parks, stating that rendering using Minecraft was much more user-friendly for the community, making it easier to envision the functionality of new buildings and parks. In 2012, Cody Sumter, a member of the Human Dynamics group at the MIT Media Lab, said: "Notch hasn't just built a game. He's tricked 40 million people into learning to use a CAD program." Various software tools have been developed to allow virtual designs to be printed using professional 3D printers or personal printers such as MakerBot and RepRap.

In September 2012, Mojang began the Block by Block project in cooperation with UN-Habitat to create real-world environments in Minecraft. The project allows young people who live in those environments to participate in designing the changes they would like to see. Using Minecraft, the community has helped reconstruct the areas of concern, and citizens are invited to enter the Minecraft servers and modify their own neighborhood.
Carl Manneh, Mojang's managing director, called the game "the perfect tool to facilitate this process", adding, "The three-year partnership will support UN-Habitat's Sustainable Urban Development Network to upgrade 300 public spaces by 2016." Mojang signed the Minecraft building community FyreUK to help render the environments into Minecraft. The first pilot project began in Kibera, one of Nairobi's informal settlements, and is in the planning phase. The Block by Block project is based on an earlier initiative started in October 2011, Mina Kvarter (My Block), which gave young people in Swedish communities a tool to visualize how they wanted to change their part of town. According to Manneh, the project was a helpful way to visualize urban planning ideas without necessarily having training in architecture. The ideas presented by the citizens were a template for political decisions.

In April 2014, the Danish Geodata Agency generated all of Denmark at full scale in Minecraft based on its own geodata. This was possible because Denmark is one of the flattest countries, with its highest point at 171 meters (it ranks as the country with the 30th-smallest elevation span), while the height limit in default Minecraft was around 192 meters above in-game sea level when the project was completed.

Taking advantage of the game's accessibility where other websites are censored, the non-governmental organization Reporters Without Borders has used an open Minecraft server to create the Uncensored Library, a repository within the game of journalism by authors from countries (including Egypt, Mexico, Russia, Saudi Arabia and Vietnam) who have been censored and arrested, such as Jamal Khashoggi. The neoclassical virtual building was created over about 250 hours by an international team of 24 people.

Despite its unpredictable nature, Minecraft speedrunning, in which players time themselves from spawning into a new world to reaching The End and defeating the Ender Dragon boss, is popular. Some speedrunners use a combination of mods, external programs, and debug menus, while other runners play the game in a more vanilla or more consistency-oriented way.

Minecraft has been used in educational settings through initiatives such as MinecraftEdu, founded in 2011 to make the game affordable and accessible for schools in collaboration with Mojang. MinecraftEdu provided features allowing teachers to monitor student progress, including screenshot submissions as evidence of lesson completion, and by 2012 reported that approximately 250,000 students worldwide had access to the platform. Mojang also developed Minecraft: Education Edition with pre-built lesson plans for up to 30 students in a closed environment. Educators have used Minecraft to teach subjects such as history, language arts, and science through custom-built environments, including reconstructions of historical landmarks and large-scale models of biological structures such as animal cells. The game's redstone circuitry has enabled the construction of functional virtual machines, such as a hard drive and an 8-bit computer, and mods have been created to use these mechanics for teaching programming. In 2014, the British Museum announced a project to reproduce its building and exhibits in Minecraft in collaboration with the public. Microsoft and Code.org have offered Minecraft-based tutorials and activities designed to teach programming, reporting by 2018 that more than 85 million children had used their resources.
In 2025, the Musée de Minéralogie in Paris held a temporary exhibition titled "Minerals in Minecraft."

Following the initial surge in popularity of Minecraft in 2010, other video games were criticized for having various similarities to Minecraft, and some were described as "clones", often due to direct inspiration from Minecraft or a superficial similarity. Examples include Ace of Spades, CastleMiner, CraftWorld, FortressCraft, Terraria, BlockWorld 3D, Total Miner, and Luanti (formerly Minetest). David Frampton, designer of The Blockheads, reported that one failure of his 2D game was the "low resolution pixel art" that too closely resembled the art in Minecraft, which resulted in "some resistance" from fans. A homebrew adaptation of the alpha version of Minecraft for the Nintendo DS, titled DScraft, has been released; it has been noted for its similarity to the original game considering the technical limitations of the system.

In response to Microsoft's acquisition of Mojang and the Minecraft IP, various developers announced further clone titles developed specifically for Nintendo's consoles, as they were the only major platforms not to officially receive Minecraft at the time. These clone titles include UCraft (Nexis Games), Cube Life: Island Survival (Cypronia), Discovery (Noowanda), Battleminer (Wobbly Tooth Games), Cube Creator 3D (Big John Games), and Stone Shire (Finger Gun Games). The fears of fans proved unfounded, however, as official Minecraft releases on Nintendo consoles eventually resumed.

Markus Persson made another similar game, Minicraft, for a Ludum Dare competition in 2011. In 2025, Persson announced through a poll on his X account that he was considering developing a spiritual successor to Minecraft. He later clarified that he was "100% serious" and that he had "basically announced Minecraft 2". Within days, however, Persson canceled the plans after speaking to his team.

In November 2024, artificial intelligence companies Decart and Etched released Oasis, an artificially generated version of Minecraft, as a proof of concept. Every in-game element is completely AI-generated in real time, and the model does not store world data, leading to "hallucinations" such as items and blocks appearing that were not there before. In January 2026, indie game developer Unomelon announced that their voxel sandbox game Allumeria would be playable in Steam Next Fest that year. On 10 February, Mojang issued a DMCA takedown of Allumeria on Steam through Valve, alleging that the game infringed on Minecraft's copyright. Some reports suggested that the takedown may have used an automatic AI copyright-claiming service. The DMCA notice was later withdrawn.

Minecon was an annual official fan convention dedicated to Minecraft. The first full Minecon was held in November 2011 at the Mandalay Bay Hotel and Casino in Las Vegas. The event included the official launch of Minecraft; keynote speeches, including one by Persson; building and costume contests; Minecraft-themed breakout classes; exhibits by leading gaming and Minecraft-related companies; commemorative merchandise; and autograph and picture times with Mojang employees and well-known contributors from the Minecraft community. In 2016, Minecon was held in person for the last time, with the following years featuring annual "Minecon Earth" livestreams on minecraft.net and YouTube instead. These livestreams, later rebranded as "Minecraft Live", included the mob and biome votes and announcements of new game updates.
In 2025, "Minecraft Live" became a biannual event as part of Minecraft's changing update schedule.
========================================
[SOURCE: https://www.reddit.com/user/StrawberryGS] | [TOKENS: 3599]
StrawberryGS (u/StrawberryGS)

In r/MinecraftCommands (a place for all things about commands, command blocks and data-packs in vanilla Minecraft), replying to GalSergey on "Create persistent slime, damageable but unkillable, for parkour course?":
After I removed enough characters to get it under 256, that seems to have worked! Thank you so much!!

In r/MrTechnodad (the official subreddit for everyone's favorite internet dad, Mr. Technodad), commenting on "Questions about Technoblade's MCC retirement":
This is what Smajor said about Techno not coming back to MCC: https://www.twitch.tv/smajor/clip/SmellyTallLyrebirdPoooound-wPVh2LRotgDH6B98 This was said in July '21, shortly before Techno knew he had cancer. Scott was reacting to chat messages asking for Techno to be in MCC and asking why he wasn't. He said that Techno didn't like the pressure chat put on him.

In r/Technoblade (official subreddit for remembering the YouTuber Technoblade), replying to ender_on_paws on "Regarding Camman...":
Someone thinks they caught Dream cheating at parkour back in MCC 11, and someone else claimed that this is why Techno quit MCC right before he got cancer. The latter is patently false, but is why this thread is posted here. The former, it's hard to say 100%, but Dream did a stream today pretty much explaining exactly how he did so well at all his jumps. I know I used to hang out in his Twitch chat when he was practicing, and he spent an extremely large amount of time practicing the smallest details. Some people really are just that good. I've seen IRL friends do amazing things. When they practice, more so. Dream's movement has always been beautiful and a mesmerizing sight to behold. If he was cheating, there'd be no need to practice that much. The most damning evidence was the dropout of his keyboard sounds during some sections of the VOD, but he explains the noise gate here and that also makes perfect sense: https://www.twitch.tv/dream/clip/SincereDoubtfulPeppermintGOWSkull-SgIy9DJWEC35mLJs My analysis: Dream is not guilty, but people who love to hate on him will continue to think so or use their hot takes for clout. I really wish the world would cut the guy a break.

In r/Technoblade, replying to apatheticchildofJen on "Regarding Camman...":
It is not at all certain that he cheated. He just did a whole stream today explaining how to do those jumps, and how, by crouching then jumping, you can get your timing almost perfect. It's much like the people that do well at rhythm games: once you get into the rhythm, you can be extremely accurate until you lose it. The clips amplifying his keyboard sounds, or lack thereof, may seem pretty convincing, but he has a noise gate on his mic specifically to remove those sounds, so if he's not talking or nothing loud has happened recently, then those sounds will drop from the VOD. I use one myself, so I understand what he means. It's more "let's hate Dream because that's fun and popular" drama.

In r/MinecraftCommands, the original post "Create persistent slime, damageable but unkillable, for parkour course?" (Help | Java 1.21.11):
Hi, On my SMP I'm making a parkour course. I generally try to keep things as "vanilla" as possible, but I'll make some lore around why a few things aren't strictly vanilla from time to time. The course is to use the trident, mace, and lance as in this video: https://www.youtube.com/watch?v=mkPFly9l8Pw I'm having trouble getting the mace parts to work. When I set Invulnerable:true then the slime won't take damage from players in survival, and the mace windburst effect won't trigger. Everything else I can think of to protect the slime from death doesn't seem to work. It will start dying after two mace hits when I use this command: /summon minecraft:slime ~ ~ ~ {Size:6, NoGravity:true, NoAI:true, PersistenceRequired:true, AbsorptionAmount:10000, active_effects:{resistance:10000, regeneration:10000, instant_health:10000, absorption:10000}} Does anyone have any recommendations? Thanks!
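The command quoted above fails partly because active_effects must be a list of effect compounds, each with id, amplifier, and duration fields, rather than a flat map of names to numbers. A minimal sketch of one possible approach follows, assuming current Java Edition NBT names (exact key names have shifted between versions, so treat the fields as illustrative): strong regeneration lets the slime register hits, so the mace's wind burst can still trigger, while a repeating command block heals it back.

    summon minecraft:slime ~ ~ ~ {Size:6,NoAI:1b,NoGravity:1b,PersistenceRequired:1b,Tags:["parkour_slime"],active_effects:[{id:"minecraft:regeneration",amplifier:4,duration:-1,show_particles:0b}]}

    (in a repeating, always-active command block near the course)
    execute as @e[type=minecraft:slime,tag=parkour_slime] run data merge entity @s {Health:49f}

The Health value of 49 assumes a size-6 slime's natural maximum of (Size+1) squared; the parkour_slime tag is an arbitrary label introduced here so that the command block only targets this slime.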
Technodad. Join Our Discord! https://discord.gg/mrtechnodad

• Questions about Technoblade's MCC retirement
StrawberryGS commented: I wonder if those same experts went through Techno's footage with the same critical eye. I've watched Dream spend countless hours practicing his skills. I unfortunately don't know enough to say one way or the other what he did.

r/SteamScams: A place to seek help if you have been scammed on Steam, help those who have been, alert others to new scams, and ridicule bad scammers. Please note that we are not affiliated with Steam, Valve Corp, or any other company or service in any way.

• This is a new one
StrawberryGS commented: My streamer-personality e-mail address just received an e-mail from the same person (searching it led to this Reddit post) asking about my daughter's account and offering me $5K for it. That's some serious stalking if they know it's my daughter's account. I know Minecraft leaked the e-mail addresses of old accounts a while ago, but this was still kind of creepy to receive.

r/HytaleServers: Welcome to /r/HytaleServers! This community is all about sharing and discovering new servers to play on in Hytale.

• Looking for a specific Vanilla server
StrawberryGS commented: Mine is close, but not quite what you're looking for. I implemented a "Strawberry Seeds" mod, just to show my chat how modding works. The strawberries are an exact duplicate of cauliflowers apart from the sprites, and the seeds craft instantly (they still grow at the same rate). That's the only mod I plan to add. I'm a 50-year-old "mom" content creator, and there's an application process to join, which cuts down on, but can't eliminate, griefing. Once CoreProtect is fully working I'll probably add it as well, because the server is pretty open to random people. We do have a "duping is OK" rule, if you're looking for servers that don't run mods blocking the duplication glitches present in vanilla.

StrawberryGS: Internet Gamer Mom. I stream live on Twitch M-F at 1pm Pacific. I play mostly Minecraft and give out Mom advice. Cake day: Sep 16, 2021.
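The "Strawberry Seeds" mod described in that last comment boils down to cloning an existing crop definition and changing two fields. Below is a minimal, self-contained Java sketch of the idea; CropDefinition, its field names, and the growth-tick value are all hypothetical illustrations, since Hytale's real modding API is not public and nothing here reflects an actual game interface.

// Illustrative sketch only: NOT a real Hytale or Minecraft modding API,
// just a self-contained model of the reskinned-crop idea from the comment.
record CropDefinition(String id, String spritePath, int growthTicks,
                      boolean seedsCraftInstantly) {

    // Clone an existing crop, changing only the id, the sprite, and the
    // instant-craft flag; the growth rate is copied unchanged.
    CropDefinition reskin(String newId, String newSprite) {
        return new CropDefinition(newId, newSprite, growthTicks, true);
    }
}

public class StrawberrySeedsMod {
    public static void main(String[] args) {
        // Hypothetical base crop; the tick count is an assumed placeholder.
        CropDefinition cauliflower =
                new CropDefinition("cauliflower", "sprites/cauliflower.png",
                                   24000, false);

        // "An exact duplicate of cauliflowers other than the sprites,
        // and the seeds craft instantly."
        CropDefinition strawberry =
                cauliflower.reskin("strawberry", "sprites/strawberry.png");

        System.out.println(strawberry);
    }
}

The point of the sketch is the reskin method: everything except the id, sprite, and instant-craft flag is copied from the source crop, which is exactly why the new crop grows at the same rate as the original.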
========================================
[SOURCE: https://techcrunch.com/author/kirsten-korosec/] | [TOKENS: 190]
Kirsten Korosec, Transportation Editor, TechCrunch.

Latest from Kirsten Korosec: Equity, TechCrunch's flagship podcast about the business of startups, unpacked by the writers who know best (1,065 episodes; last update: Feb 2026). Produced by Theresa… TechCrunch Mobility is your destination for transportation news and insight.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Tharsis_Montes] | [TOKENS: 584]
Tharsis Montes

The Tharsis Montes (/ˈθɑːrsɪs ˈmɒntiːz/) are three large shield volcanoes in the Tharsis region of the planet Mars. From north to south, the volcanoes are Ascraeus Mons, Pavonis Mons, and Arsia Mons. Mons (plural montes) is the Latin word for mountain; it is a descriptor term used in astrogeology for mountainous features in the Solar System.

The three Tharsis Montes volcanoes are enormous by terrestrial standards, ranging in diameter from 375 km (233 mi) (Pavonis Mons) to 475 km (295 mi) (Arsia Mons). Ascraeus Mons is the tallest, with a summit elevation of over 18 km (59,000 ft), or 15 km (49,000 ft) base-to-peak. For comparison, the tallest volcano on Earth, Mauna Kea in Hawaii, is about 120 km (75 mi) across and stands 9 km (30,000 ft) above the ocean floor.

The Tharsis Montes lie near the equator, along the crest of a vast volcanic plateau called the Tharsis region or Tharsis bulge. The Tharsis region is thousands of kilometers across and averages nearly 10 km (33,000 ft) above the mean elevation of the planet. Olympus Mons, the tallest known mountain in the Solar System, is located about 1,200 km (750 mi) northwest of the Tharsis Montes, at the edge of the Tharsis region.

The Tharsis Montes were discovered by the Mariner 9 spacecraft in 1971. They were among the few surface features visible as the spacecraft entered orbit during a global dust storm. Appearing as faint spots through the dusty haze, they were informally christened North Spot, Middle Spot, and South Spot. A fourth spot, corresponding to the albedo feature Nix Olympica, was also visible and later named Olympus Mons. As the dust cleared, it became obvious that the spots were the tops of enormous shield volcanoes with complex central calderas (collapse craters).

The three Tharsis Montes volcanoes are evenly spaced about 700 km (430 mi) apart from peak to peak, in a line oriented from southwest to northeast. This alignment is unlikely to be coincidental. Several smaller volcanic centers northeast of the Tharsis Montes lie on an extension of the line. The three volcanoes, most notably Arsia Mons, also all have collapse features and rifts, from which flank eruptions issued, that transect them along the same northeast–southwest trend. The line clearly represents a major structural feature of the planet, but its origin is uncertain.
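As a quick back-of-the-envelope comparison using only the figures quoted above, the relative scale works out to roughly:

\[
\frac{h_{\text{Ascraeus, base-to-peak}}}{h_{\text{Mauna Kea}}} \approx \frac{15\ \text{km}}{9\ \text{km}} \approx 1.7,
\qquad
\frac{D_{\text{Arsia}}}{D_{\text{Mauna Kea}}} \approx \frac{475\ \text{km}}{120\ \text{km}} \approx 4.0,
\]

i.e., these volcanoes stand nearly twice as tall base-to-peak as Mauna Kea while covering a footprint roughly four times as wide.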
========================================