United States
The United States of America (USA), also known as the United States (U.S.) or America, is a country primarily located in North America. It is a federal republic of 50 states and a federal capital district, Washington, D.C. The 48 contiguous states border Canada to the north and Mexico to the south, with the semi-exclave of Alaska in the northwest and the archipelago of Hawaii in the Pacific Ocean. The United States also asserts sovereignty over five major island territories and various uninhabited islands in Oceania and the Caribbean.[j] It is a megadiverse country, with the world's third-largest land area[c] and third-largest population, exceeding 341 million.[k] Paleo-Indians first migrated from North Asia to North America at least 15,000 years ago, and formed various civilizations. Spanish colonization established Spanish Florida in 1513, the first European colony in what is now the continental United States. British colonization followed with the 1607 settlement of Virginia, the first of the Thirteen Colonies. Enslavement of Africans was practiced in all colonies by 1770 and supplied most of the labor for the Southern Colonies' plantation economy. Clashes with the British Crown began as a civil protest over the illegality of taxation without representation in Parliament and the denial of other English rights. They evolved into the American Revolution, which led to the Declaration of Independence and a society based on universal rights. Victory in the 1775–1783 Revolutionary War brought international recognition of U.S. sovereignty and fueled westward expansion, further dispossessing native inhabitants. As more states were admitted, a North–South division over slavery led the Confederate States of America to declare secession and fight the Union in the 1861–1865 American Civil War. With the United States' victory and reunification, slavery was abolished nationally. By the late 19th century, the U.S. economy outpaced the French, German, and British economies combined. By 1900, the country had established itself as a great power, a status solidified after its involvement in World War I. Following Japan's attack on Pearl Harbor in 1941, the U.S. entered World War II. Its aftermath left the U.S. and the Soviet Union as rival superpowers, competing for ideological dominance and international influence during the Cold War. The Soviet Union's collapse in 1991 ended the Cold War, leaving the U.S. as the world's sole superpower. The U.S. federal government is a representative democracy with a president and a constitution that grants separation of powers under three branches: legislative, executive, and judicial. The United States Congress is a bicameral national legislature composed of the House of Representatives (a lower house based on population) and the Senate (an upper house based on equal representation for each state). Federalism grants substantial autonomy to the 50 states. In addition, 574 Native American tribes have sovereignty rights, and there are 326 Native American reservations. Since the 1850s, the Democratic and Republican parties have dominated American politics. American ideals and values are based on a democratic tradition inspired by the American Enlightenment movement. A developed country, the U.S. ranks high in economic competitiveness, innovation, and higher education. Accounting for over a quarter of nominal global GDP, its economy has been the world's largest since about 1890.
It is the wealthiest country, with the highest disposable household income per capita among OECD members, though its wealth inequality is highly pronounced. Shaped by centuries of immigration, the culture of the U.S. is diverse and globally influential. Making up more than a third of global military spending, the country has one of the strongest armed forces and is a designated nuclear state. A member of numerous international organizations, the U.S. plays a major role in global political, cultural, economic, and military affairs.
Etymology
Documented use of the phrase "United States of America" dates back to January 2, 1776. On that day, Stephen Moylan, a Continental Army aide to General George Washington, wrote a letter to Joseph Reed, Washington's aide-de-camp, seeking to go "with full and ample powers from the United States of America to Spain" to seek assistance in the Revolutionary War effort. The first known public usage is an anonymous essay published in the Williamsburg newspaper The Virginia Gazette on April 6, 1776. Sometime on or after June 11, 1776, Thomas Jefferson wrote "United States of America" in a rough draft of the Declaration of Independence, which was adopted by the Second Continental Congress on July 4, 1776. The term "United States" and its initialism "U.S.", used as nouns or as adjectives in English, are common short names for the country. The initialism "USA", a noun, is also common. "United States" and "U.S." are the established terms throughout the U.S. federal government, with prescribed rules.[l] "The States" is an established colloquial shortening of the name, used particularly from abroad; "stateside" is the corresponding adjective or adverb. "America" is the feminine form of the first word of Americus Vesputius, the Latinized name of Italian explorer Amerigo Vespucci (1454–1512);[m] it was first used as a place name by the German cartographers Martin Waldseemüller and Matthias Ringmann in 1507.[n] Vespucci first proposed that the West Indies discovered by Christopher Columbus in 1492 were part of a previously unknown landmass and not among the Indies at the eastern limit of Asia. In English, the term "America" rarely refers to topics unrelated to the United States, despite the usage of "the Americas" to describe the totality of the continents of North and South America.
History
The first inhabitants of North America migrated from Siberia approximately 15,000 years ago, either across the Bering land bridge or along the now-submerged Ice Age coastline. Small isolated groups of hunter-gatherers are said to have migrated alongside herds of large herbivores far into Alaska, with ice-free corridors developing along the Pacific coast and valleys of North America in c. 16,500 – c. 13,500 BCE (c. 18,500 – c. 15,500 BP). The Clovis culture, which appeared around 11,000 BCE, is believed to be the first widespread culture in the Americas. Over time, Indigenous North American cultures grew increasingly sophisticated, and some, such as the Mississippian culture, developed agriculture, architecture, and complex societies. In the post-archaic period, the Mississippian cultures were located in the midwestern, eastern, and southern regions, and the Algonquian in the Great Lakes region and along the Eastern Seaboard, while the Hohokam culture and Ancestral Puebloans inhabited the Southwest. Native population estimates of what is now the United States before the arrival of European colonizers range from around 500,000 to nearly 10 million.
Christopher Columbus began exploring the Caribbean for Spain in 1492, leading to Spanish-speaking settlements and missions from what are now Puerto Rico and Florida to New Mexico and California. The first Spanish colony in the present-day continental United States was Spanish Florida, chartered in 1513. After several settlements failed there due to starvation and disease, Spain's first permanent town, Saint Augustine, was founded in 1565. France established its own settlements in French Florida in 1562, but they were either abandoned (Charlesfort, 1578) or destroyed by Spanish raids (Fort Caroline, 1565). Permanent French settlements were founded much later along the Great Lakes (Fort Detroit, 1701), the Mississippi River (Saint Louis, 1764) and especially the Gulf of Mexico (New Orleans, 1718). Early European colonies also included the thriving Dutch colony of New Netherland (settled 1626, present-day New York) and the small Swedish colony of New Sweden (settled 1638 in what became Delaware). British colonization of the East Coast began with the Virginia Colony (1607) and the Plymouth Colony (Massachusetts, 1620). The Mayflower Compact in Massachusetts and the Fundamental Orders of Connecticut established precedents for local representative self-governance and constitutionalism that would develop throughout the American colonies. While European settlers in what is now the United States experienced conflicts with Native Americans, they also engaged in trade, exchanging European tools for food and animal pelts.[o] Relations ranged from close cooperation to warfare and massacres. The colonial authorities often pursued policies that forced Native Americans to adopt European lifestyles, including conversion to Christianity. Along the eastern seaboard, settlers trafficked Africans through the Atlantic slave trade, largely to provide manual labor on plantations. The original Thirteen Colonies[p] that would later found the United States were administered as possessions of the British Empire by Crown-appointed governors, though local governments held elections open to most white male property owners. The colonial population grew rapidly from Maine to Georgia, eclipsing Native American populations; by the 1770s, the natural increase of the population was such that only a small minority of Americans had been born overseas. The colonies' distance from Britain facilitated the entrenchment of self-governance, and the First Great Awakening, a series of Christian revivals, fueled colonial interest in guaranteed religious liberty. Following its victory in the French and Indian War, Britain began to assert greater control over local affairs in the Thirteen Colonies, resulting in growing political resistance. One of the primary grievances of the colonists was the denial of their rights as Englishmen, particularly the right to representation in the British government that taxed them. To demonstrate their dissatisfaction and resolve, the First Continental Congress met in 1774 and passed the Continental Association, a colonial boycott of British goods enforced by local "committees of safety" that proved effective. The British attempt to then disarm the colonists resulted in the 1775 Battles of Lexington and Concord, igniting the American Revolutionary War. At the Second Continental Congress, the colonies appointed George Washington commander-in-chief of the Continental Army, and created a committee that named Thomas Jefferson to draft the Declaration of Independence.
Two days after the Second Continental Congress passed the Lee Resolution to create an independent, sovereign nation, the Declaration was adopted on July 4, 1776. The political values of the American Revolution evolved from an armed rebellion demanding reform within an empire to a revolution that created a new social and governing system founded on the defense of liberty and the protection of inalienable natural rights; sovereignty of the people; republicanism over monarchy, aristocracy, and other hereditary political power; civic virtue; and an intolerance of political corruption. The Founding Fathers of the United States, who included Washington, Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, James Madison, Thomas Paine, and many others, were inspired by Classical, Renaissance, and Enlightenment philosophies and ideas. Though in practical effect since its drafting in 1777, the Articles of Confederation was ratified in 1781 and formally established a decentralized government that operated until 1789. After the British surrender at the siege of Yorktown in 1781, American sovereignty was internationally recognized by the Treaty of Paris (1783), through which the U.S. gained territory stretching west to the Mississippi River, north to present-day Canada, and south to Spanish Florida. The Northwest Ordinance (1787) established the precedent by which the country's territory would expand with the admission of new states, rather than the expansion of existing states. The U.S. Constitution was drafted at the 1787 Constitutional Convention to overcome the limitations of the Articles. It went into effect in 1789, creating a federal republic governed by three separate branches that together formed a system of checks and balances. George Washington was elected the country's first president under the Constitution, and the Bill of Rights was adopted in 1791 to allay skeptics' concerns about the power of the more centralized government. His resignation as commander-in-chief after the Revolutionary War and his later refusal to run for a third term as the country's first president established a precedent for the supremacy of civil authority in the United States and the peaceful transfer of power. In the late 18th century, American settlers began to expand westward in larger numbers, many with a sense of manifest destiny. The Louisiana Purchase of 1803 from France nearly doubled the territory of the United States. Lingering issues with Britain remained, leading to the War of 1812, which was fought to a draw. Spain ceded Florida and its Gulf Coast territory in 1819. The Missouri Compromise of 1820, which admitted Missouri as a slave state and Maine as a free state, attempted to balance the desire of northern states to prevent the expansion of slavery into new territories with that of southern states to extend it there. Primarily, the compromise prohibited slavery in all other lands of the Louisiana Purchase north of the 36°30′ parallel. As Americans expanded further into territory inhabited by Native Americans, the federal government implemented policies of Indian removal or assimilation. The most significant such legislation was the Indian Removal Act of 1830, a key policy of President Andrew Jackson. It resulted in the Trail of Tears (1830–1850), in which an estimated 60,000 Native Americans living east of the Mississippi River were forcibly removed and displaced to lands far to the west, causing 13,200 to 16,700 deaths along the forced march. 
Settler expansion as well as this influx of Indigenous peoples from the East resulted in the American Indian Wars west of the Mississippi. During the colonial period, slavery became legal in all the Thirteen colonies, but by 1770 it provided the main labor force in the large-scale, agriculture-dependent economies of the Southern Colonies from Maryland to Georgia. The practice began to be significantly questioned during the American Revolution, and spurred by an active abolitionist movement that had reemerged in the 1830s, states in the North enacted laws to prohibit slavery within their boundaries. At the same time, support for slavery had strengthened in Southern states, with widespread use of inventions such as the cotton gin (1793) having made slavery immensely profitable for Southern elites. The United States annexed the Republic of Texas in 1845, and the 1846 Oregon Treaty led to U.S. control of the present-day American Northwest. Dispute with Mexico over Texas led to the Mexican–American War (1846–1848). After the victory of the U.S., Mexico recognized U.S. sovereignty over Texas, New Mexico, and California in the 1848 Mexican Cession; the cession's lands also included the future states of Nevada, Colorado and Utah. The California gold rush of 1848–1849 spurred a huge migration of white settlers to the Pacific coast, leading to even more confrontations with Native populations. One of the most violent, the California genocide of thousands of Native inhabitants, lasted into the mid-1870s. Additional western territories and states were created. Throughout the 1850s, the sectional conflict regarding slavery was further inflamed by national legislation in the U.S. Congress and decisions of the Supreme Court. In Congress, the Fugitive Slave Act of 1850 mandated the forcible return to their owners in the South of slaves taking refuge in non-slave states, while the Kansas–Nebraska Act of 1854 effectively gutted the anti-slavery requirements of the Missouri Compromise. In its Dred Scott decision of 1857, the Supreme Court ruled against a slave brought into non-slave territory, simultaneously declaring the entire Missouri Compromise to be unconstitutional. These and other events exacerbated tensions between North and South that would culminate in the American Civil War (1861–1865). Beginning with South Carolina, 11 slave-state governments voted to secede from the United States in 1861, joining to create the Confederate States of America. All other state governments remained loyal to the Union.[q] War broke out in April 1861 after the Confederacy bombarded Fort Sumter. Following the Emancipation Proclamation on January 1, 1863, many freed slaves joined the Union army. The war began to turn in the Union's favor following the 1863 Siege of Vicksburg and Battle of Gettysburg, and the Confederates surrendered in 1865 after the Union's victory in the Battle of Appomattox Court House. Efforts toward reconstruction in the secessionist South had begun as early as 1862, but it was only after President Lincoln's assassination that the three Reconstruction Amendments to the Constitution were ratified to protect civil rights. The amendments codified nationally the abolition of slavery and involuntary servitude except as punishment for crimes, promised equal protection under the law for all persons, and prohibited discrimination on the basis of race or previous enslavement. As a result, African Americans took an active political role in ex-Confederate states in the decade following the Civil War. 
The former Confederate states were readmitted to the Union, beginning with Tennessee in 1866 and ending with Georgia in 1870. National infrastructure, including transcontinental telegraph and railroads, spurred growth in the American frontier. This was accelerated by the Homestead Acts, through which nearly 10 percent of the total land area of the United States was given away free to some 1.6 million homesteaders. From 1865 through 1917, an unprecedented stream of immigrants arrived in the United States, including 24.4 million from Europe. Most came through the Port of New York, as New York City and other large cities on the East Coast became home to large Jewish, Irish, and Italian populations. Many Northern Europeans as well as significant numbers of Germans and other Central Europeans moved to the Midwest. At the same time, about one million French Canadians migrated from Quebec to New England. During the Great Migration, millions of African Americans left the rural South for urban areas in the North. Alaska was purchased from Russia in 1867. The Compromise of 1877 is generally considered the end of the Reconstruction era, as it resolved the electoral crisis following the 1876 presidential election and led President Rutherford B. Hayes to reduce the role of federal troops in the South. Immediately, the Redeemers began evicting the Carpetbaggers and quickly regained local control of Southern politics in the name of white supremacy. African Americans endured a period of heightened, overt racism following Reconstruction, a time often considered the nadir of American race relations. A series of Supreme Court decisions, including Plessy v. Ferguson, emptied the Fourteenth and Fifteenth Amendments of their force, allowing Jim Crow laws in the South to remain unchecked, sundown towns in the Midwest, and segregation in communities across the country, which would be reinforced in part by the policy of redlining later adopted by the federal Home Owners' Loan Corporation. An explosion of technological advancement, accompanied by the exploitation of cheap immigrant labor, led to rapid economic expansion during the Gilded Age of the late 19th century. It continued into the early 20th century, when the United States already outpaced the economies of Britain, France, and Germany combined. This fostered the amassing of power by a few prominent industrialists, largely by their formation of trusts and monopolies to prevent competition. Tycoons led the nation's expansion in the railroad, petroleum, and steel industries. The United States emerged as a pioneer of the automotive industry. These changes resulted in significant increases in economic inequality, slum conditions, and social unrest, creating the environment for labor unions and socialist movements to begin to flourish. This period eventually ended with the advent of the Progressive Era, which was characterized by significant economic and social reforms. Pro-American elements in Hawaii overthrew the Hawaiian monarchy; the islands were annexed in 1898. That same year, Puerto Rico, the Philippines, and Guam were ceded to the U.S. by Spain after the latter's defeat in the Spanish–American War. (The Philippines was granted full independence from the U.S. on July 4, 1946, following World War II. Puerto Rico and Guam have remained U.S. territories.) American Samoa was acquired by the United States in 1900 after the Second Samoan Civil War. The U.S. Virgin Islands were purchased from Denmark in 1917.
The United States entered World War I alongside the Allies in 1917 helping to turn the tide against the Central Powers. In 1920, a constitutional amendment granted nationwide women's suffrage. During the 1920s and 1930s, radio for mass communication and early television transformed communications nationwide. The Wall Street Crash of 1929 triggered the Great Depression, to which President Franklin D. Roosevelt responded with the New Deal plan of "reform, recovery and relief", a series of unprecedented and sweeping recovery programs and employment relief projects combined with financial reforms and regulations. Initially neutral during World War II, the U.S. began supplying war materiel to the Allies of World War II in March 1941 and entered the war in December after Japan's attack on Pearl Harbor. Agreeing to a "Europe first" policy, the U.S. concentrated its wartime efforts on Japan's allies Italy and Germany until their final defeat in May 1945. The U.S. developed the first nuclear weapons and used them against the Japanese cities of Hiroshima and Nagasaki in August 1945, ending the war. The United States was one of the "Four Policemen" who met to plan the post-war world, alongside the United Kingdom, the Soviet Union, and China. The U.S. emerged relatively unscathed from the war, with even greater economic power and international political influence. The end of World War II in 1945 left the U.S. and the Soviet Union as superpowers, each with its own political, military, and economic sphere of influence. Geopolitical tensions between the two superpowers soon led to the Cold War. The U.S. implemented a policy of containment intended to limit the Soviet Union's sphere of influence; engaged in regime change against governments perceived to be aligned with the Soviets; and prevailed in the Space Race, which culminated with the first crewed Moon landing in 1969. Domestically, the U.S. experienced economic growth, urbanization, and population growth following World War II. The civil rights movement emerged, with Martin Luther King Jr. becoming a prominent leader in the early 1960s. The Great Society plan of President Lyndon B. Johnson's administration resulted in groundbreaking and broad-reaching laws, policies and a constitutional amendment to counteract some of the worst effects of lingering institutional racism. The counterculture movement in the U.S. brought significant social changes, including the liberalization of attitudes toward recreational drug use and sexuality. It also encouraged open defiance of the military draft (leading to the end of conscription in 1973) and wide opposition to U.S. intervention in Vietnam, with the U.S. totally withdrawing in 1975. A societal shift in the roles of women was significantly responsible for the large increase in female paid labor participation starting in the 1970s, and by 1985 the majority of American women aged 16 and older were employed. The Fall of Communism and the dissolution of the Soviet Union from 1989 to 1991 marked the end of the Cold War and left the United States as the world's sole superpower. This cemented the United States' global influence, reinforcing the concept of the "American Century" as the U.S. dominated international political, cultural, economic, and military affairs. The 1990s saw the longest recorded economic expansion in American history, a dramatic decline in U.S. crime rates, and advances in technology. 
Throughout this decade, technological innovations such as the World Wide Web, the evolution of the Pentium microprocessor in accordance with Moore's law, rechargeable lithium-ion batteries, the first gene therapy trial, and cloning either emerged in the U.S. or were improved upon there. The Human Genome Project was formally launched in 1990, while Nasdaq became the first stock market in the United States to trade online in 1998. In the Gulf War of 1991, an American-led international coalition of states expelled an Iraqi invasion force that had occupied neighboring Kuwait. The September 11 attacks on the United States in 2001 by the pan-Islamist militant organization al-Qaeda led to the war on terror and subsequent military interventions in Afghanistan and in Iraq. The U.S. housing bubble culminated in 2007 with the Great Recession, the largest economic contraction since the Great Depression. In the 2010s and early 2020s, the United States has experienced increased political polarization and democratic backsliding. The country's polarization was violently reflected in the January 2021 Capitol attack, when a mob of insurrectionists entered the U.S. Capitol and sought to prevent the peaceful transfer of power in an attempted self-coup d'état.
Geography
The United States is the world's third-largest country by total area behind Russia and Canada.[c] The 48 contiguous states and the District of Columbia have a combined area of 3,119,885 square miles (8,080,470 km2). In 2021, the United States had 8% of the Earth's permanent meadows and pastures and 10% of its cropland. Starting in the east, the coastal plain of the Atlantic seaboard gives way to inland forests and rolling hills in the Piedmont plateau region. The Appalachian Mountains and the Adirondack Massif separate the East Coast from the Great Lakes and the grasslands of the Midwest. The Mississippi River System, the world's fourth-longest river system, runs predominantly north–south through the center of the country. The flat and fertile prairie of the Great Plains stretches to the west, interrupted by a highland region in the southeast. The Rocky Mountains, west of the Great Plains, extend north to south across the country, peaking at over 14,000 feet (4,300 m) in Colorado. The supervolcano underlying Yellowstone National Park in the Rocky Mountains, the Yellowstone Caldera, is the continent's largest volcanic feature. Farther west are the rocky Great Basin and the Chihuahuan, Sonoran, and Mojave deserts. In the northwest corner of Arizona, carved by the Colorado River, is the Grand Canyon, a steep-sided canyon and popular tourist destination known for its overwhelming visual size and intricate, colorful landscape. The Cascade and Sierra Nevada mountain ranges run close to the Pacific coast. The lowest and highest points in the contiguous United States are in the State of California, about 84 miles (135 km) apart. At an elevation of 20,310 feet (6,190.5 m), Alaska's Denali (also called Mount McKinley) is the highest peak in the country and on the continent. Active volcanoes in the U.S. are common throughout Alaska's Alexander and Aleutian Islands. Located entirely outside North America, the archipelago of Hawaii consists of volcanic islands, physiographically and ethnologically part of the Polynesian subregion of Oceania. In addition to its total land area, the United States has one of the world's largest marine exclusive economic zones spanning approximately 4.5 million square miles (11.7 million km2) of ocean.
With its large size and geographic variety, the United States includes most climate types. East of the 100th meridian, the climate ranges from humid continental in the north to humid subtropical in the south. The western Great Plains are semi-arid. Many mountainous areas of the American West have an alpine climate. The climate is arid in the Southwest, Mediterranean in coastal California, and oceanic in coastal Oregon, Washington, and southern Alaska. Most of Alaska is subarctic or polar. Hawaii, the southern tip of Florida and U.S. territories in the Caribbean and Pacific are tropical. The United States receives more high-impact extreme weather incidents than any other country. States bordering the Gulf of Mexico are prone to hurricanes, and most of the world's tornadoes occur in the country, mainly in Tornado Alley. Due to climate change, extreme weather has become more frequent in the U.S. in the 21st century, with three times the number of reported heat waves compared to the 1960s. Since the 1990s, droughts in the American Southwest have become more persistent and more severe. The regions considered the most attractive to the population are also the most vulnerable. The U.S. is one of 17 megadiverse countries containing large numbers of endemic species: about 17,000 species of vascular plants occur in the contiguous United States and Alaska, and over 1,800 species of flowering plants are found in Hawaii, few of which occur on the mainland. The United States is home to 428 mammal species, 784 birds, 311 reptiles, 295 amphibians, and around 91,000 insect species. There are 63 national parks, and hundreds of other federally managed monuments, forests, and wilderness areas, administered by the National Park Service and other agencies. About 28% of the country's land is publicly owned and federally managed, primarily in the Western States. Most of this land is protected, though some is leased for commercial use, and less than one percent is used for military purposes. Environmental issues in the United States include debates on non-renewable resources and nuclear energy, air and water pollution, biodiversity, logging and deforestation, and climate change. The U.S. Environmental Protection Agency (EPA) is the federal agency charged with addressing most environment-related issues. The idea of wilderness has shaped the management of public lands since 1964, with the Wilderness Act. The Endangered Species Act of 1973 provides a way to protect threatened and endangered species and their habitats. The United States Fish and Wildlife Service implements and enforces the Act. In 2024, the U.S. ranked 35th among 180 countries in the Environmental Performance Index.
Government and politics
The United States is a federal republic of 50 states and a federal capital district, Washington, D.C. The U.S. asserts sovereignty over five unincorporated territories and several uninhabited island possessions. It is the world's oldest surviving federation, and its presidential system of federal government has been adopted, in whole or in part, by many newly independent states worldwide following their decolonization. The Constitution of the United States serves as the country's supreme legal document. Most scholars describe the United States as a liberal democracy.[r] Composed of three branches, all headquartered in Washington, D.C., the federal government is the national government of the United States. The U.S. 
Constitution establishes a separation of powers intended to provide a system of checks and balances to prevent any of the three branches from becoming supreme. The three-branch system is known as the presidential system, in contrast to the parliamentary system where the executive is part of the legislative body. Many countries around the world adopted this aspect of the 1789 Constitution of the United States, especially in the postcolonial Americas. In the U.S. federal system, sovereign powers are shared between three levels of government specified in the Constitution: the federal government, the states, and Indian tribes. The U.S. also asserts sovereignty over five permanently inhabited territories: American Samoa, Guam, the Northern Mariana Islands, Puerto Rico, and the U.S. Virgin Islands. Residents of the 50 states are governed by their elected state government, under state constitutions compatible with the national constitution, and by elected local governments that are administrative divisions of a state. States are subdivided into counties or county equivalents, and (except for Hawaii) further divided into municipalities, each administered by elected representatives. The District of Columbia is a federal district containing the U.S. capital, Washington, D.C. The federal district is an administrative division of the federal government. Indian country is made up of 574 federally recognized tribes and 326 Indian reservations. They hold a government-to-government relationship with the U.S. federal government in Washington and are legally defined as domestic dependent nations with inherent tribal sovereignty rights. In addition to the five major territories, the U.S. also asserts sovereignty over the United States Minor Outlying Islands in the Pacific Ocean and the Caribbean. The seven undisputed islands without permanent populations are Baker Island, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Atoll, and Palmyra Atoll. U.S. sovereignty over the unpopulated Bajo Nuevo Bank, Navassa Island, Serranilla Bank, and Wake Island is disputed. The Constitution is silent on political parties. However, they developed independently in the 18th century with the Federalist and Anti-Federalist parties. Since then, the United States has operated as a de facto two-party system, though the parties have changed over time. Since the mid-19th century, the two main national parties have been the Democratic Party and the Republican Party. The former is perceived as relatively liberal in its political platform while the latter is perceived as relatively conservative in its platform. The United States has an established structure of foreign relations, with the world's second-largest diplomatic corps as of 2024. It is a permanent member of the United Nations Security Council and home to the United Nations headquarters. The United States is a member of the G7, G20, and OECD intergovernmental organizations. Almost all countries have embassies and many have consulates (official representatives) in the country. Likewise, nearly all countries host formal diplomatic missions with the United States, except Iran, North Korea, and Bhutan. Though Taiwan does not have formal diplomatic relations with the U.S., it maintains close unofficial relations. The United States regularly supplies Taiwan with military equipment to deter potential Chinese aggression.
Its geopolitical attention also turned to the Indo-Pacific when the United States joined the Quadrilateral Security Dialogue with Australia, India, and Japan. The United States has a "Special Relationship" with the United Kingdom and strong ties with Canada, Australia, New Zealand, the Philippines, Japan, South Korea, Israel, and several European Union countries such as France, Italy, Germany, Spain, and Poland. The U.S. works closely with its NATO allies on military and national security issues, and with countries in the Americas through the Organization of American States and the United States–Mexico–Canada Free Trade Agreement. The U.S. exercises full international defense authority and responsibility for Micronesia, the Marshall Islands, and Palau through the Compact of Free Association. It has increasingly conducted strategic cooperation with India, while its ties with China have steadily deteriorated. Beginning in 2014, the U.S. had become a key ally of Ukraine. After Donald Trump was elected U.S. president in 2024, he sought to negotiate an end to the Russo-Ukrainian War. He paused all military aid to Ukraine in March 2025, although the aid resumed later. Trump also ended U.S. intelligence sharing with the country, but this too was eventually restored. The president is the commander-in-chief of the United States Armed Forces and appoints its leaders, the secretary of defense and the Joint Chiefs of Staff. The Department of Defense, headquartered at the Pentagon near Washington, D.C., administers five of the six service branches, which are made up of the U.S. Army, Marine Corps, Navy, Air Force, and Space Force. The Coast Guard is administered by the Department of Homeland Security in peacetime and can be transferred to the Department of the Navy in wartime. Total strength of the entire military is about 1.3 million active duty with an additional 400,000 in reserve. The United States spent $997 billion on its military in 2024, which is by far the largest amount of any country, making up 37% of global military spending and accounting for 3.4% of the country's GDP. The U.S. possesses 42% of the world's nuclear weapons—the second-largest stockpile after that of Russia. The U.S. military is widely regarded as the most powerful and advanced in the world. The United States has the third-largest combined armed forces in the world, behind the Chinese People's Liberation Army and Indian Armed Forces. The U.S. military operates about 800 bases and facilities abroad, and maintains deployments greater than 100 active duty personnel in 25 foreign countries. The United States has engaged in over 400 military interventions since its founding in 1776, with over half of these occurring between 1950 and 2019 and 25% occurring in the post-Cold War era. State defense forces (SDFs) are military units that operate under the sole authority of a state government. SDFs are authorized by state and federal law but are under the command of the state's governor. By contrast, the 54 U.S. National Guard organizations[t] fall under the dual control of state or territorial governments and the federal government; their units can also become federalized entities, but SDFs cannot be federalized. The National Guard personnel of a state or territory can be federalized by the president under the National Defense Act Amendments of 1933; this legislation created the Guard and provides for the integration of Army National Guard and Air National Guard units and personnel into the U.S. Army and (since 1947) the U.S. Air Force. 
The total number of National Guard members is about 430,000, while the estimated combined strength of SDFs is less than 10,000. There are about 18,000 police agencies in the United States, ranging from the local to the national level. Law in the United States is mainly enforced by local police departments and sheriff departments in their municipal or county jurisdictions. The state police departments have authority in their respective state, and federal agencies such as the Federal Bureau of Investigation (FBI) and the U.S. Marshals Service have national jurisdiction and specialized duties, such as protecting civil rights, national security, enforcing U.S. federal courts' rulings and federal laws, and interstate criminal activity. State courts conduct almost all civil and criminal trials, while federal courts adjudicate the much smaller number of civil and criminal cases that relate to federal law. There is no unified "criminal justice system" in the United States. The American prison system is largely heterogeneous, with thousands of relatively independent systems operating across federal, state, local, and tribal levels. In 2025, "these systems hold nearly 2 million people in 1,566 state prisons, 98 federal prisons, 3,116 local jails, 1,277 juvenile correctional facilities, 133 immigration detention facilities, and 80 Indian country jails, as well as in military prisons, civil commitment centers, state psychiatric hospitals, and prisons in the U.S. territories." Despite disparate systems of confinement, four main institutions dominate: federal prisons, state prisons, local jails, and juvenile correctional facilities. Federal prisons are run by the Federal Bureau of Prisons and hold pretrial detainees as well as people who have been convicted of federal crimes. State prisons, run by the department of corrections of each state, hold people sentenced and serving prison time (usually longer than one year) for felony offenses. Local jails are county or municipal facilities that incarcerate defendants prior to trial; they also hold those serving short sentences (typically under a year). Juvenile correctional facilities are operated by local or state governments and serve as longer-term placements for any minor adjudicated as delinquent and ordered by a judge to be confined. In January 2023, the United States had the sixth-highest per capita incarceration rate in the world—531 people per 100,000 inhabitants—and the largest prison and jail population in the world, with more than 1.9 million people incarcerated. An analysis of the World Health Organization Mortality Database from 2010 showed U.S. homicide rates "were 7 times higher than in other high-income countries, driven by a gun homicide rate that was 25 times higher".
Economy
The U.S. has a highly developed mixed economy that has been the world's largest nominally since about 1890. Its 2024 gross domestic product (GDP)[e] of more than $29 trillion constituted over 25% of nominal global economic output, or 15% at purchasing power parity (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks first in the world by nominal GDP, second when adjusted for purchasing power parities (PPP), and ninth by PPP-adjusted GDP per capita. In February 2024, the total U.S. federal government debt was $34.4 trillion. Of the world's 500 largest companies by revenue, 138 were headquartered in the U.S. in 2025, the highest number of any country. The U.S. 
dollar is the currency most used in international transactions and the world's foremost reserve currency, backed by the country's dominant economy, its military, the petrodollar system, its large U.S. treasuries market, and its linked eurodollar. Several countries use it as their official currency, and in others it is the de facto currency. The U.S. has free trade agreements with several countries, including the USMCA. Although the United States has reached a post-industrial level of economic development and is often described as having a service economy, it remains a major industrial power; in 2024, the U.S. manufacturing sector was the world's second-largest by value output after China's. New York City is the world's principal financial center, and its metropolitan area is the world's largest metropolitan economy. The New York Stock Exchange and Nasdaq, both located in New York City, are the world's two largest stock exchanges by market capitalization and trade volume. The United States is at the forefront of technological advancement and innovation in many economic fields, especially in artificial intelligence; electronics and computers; pharmaceuticals; and medical, aerospace and military equipment. The country's economy is fueled by abundant natural resources, a well-developed infrastructure, and high productivity. The largest trading partners of the United States are the European Union, Mexico, Canada, China, Japan, South Korea, the United Kingdom, Vietnam, India, and Taiwan. The United States is the world's largest importer and second-largest exporter.[u] It is by far the world's largest exporter of services. Americans have the highest average household and employee income among OECD member states, and the fourth-highest median household income in 2023, up from sixth-highest in 2013. With personal consumption expenditures of over $18.5 trillion in 2023, the U.S. has a heavily consumer-driven economy and is the world's largest consumer market. The U.S. ranked first in the number of dollar billionaires and millionaires in 2023, with 735 billionaires and nearly 22 million millionaires. Wealth in the United States is highly concentrated; in 2011, the richest 10% of the adult population owned 72% of the country's household wealth, while the bottom 50% owned just 2%. U.S. wealth inequality increased substantially since the late 1980s, and income inequality in the U.S. reached a record high in 2019. In 2024, the country had some of the highest wealth and income inequality levels among OECD countries. Since the 1970s, there has been a decoupling of U.S. wage gains from worker productivity. In 2016, the top fifth of earners took home more than half of all income, giving the U.S. one of the widest income distributions among OECD countries. There were about 771,480 homeless persons in the U.S. in 2024. In 2022, 6.4 million children experienced food insecurity. Feeding America estimates that around one in five, or approximately 13 million, children experience hunger in the U.S. and do not know where or when they will get their next meal. Also in 2022, about 37.9 million people, or 11.5% of the U.S. population, were living in poverty. The United States has a smaller welfare state and redistributes less income through government action than most other high-income countries. It is the only advanced economy that does not guarantee its workers paid vacation nationally and one of a few countries in the world without federal paid family leave as a legal right. 
The United States has a higher percentage of low-income workers than almost any other developed country, largely because of a weak collective bargaining system and lack of government support for at-risk workers. The United States has been a leader in technological innovation since the late 19th century and scientific research since the mid-20th century. Methods for producing interchangeable parts and the establishment of a machine tool industry enabled the large-scale manufacturing of U.S. consumer products in the late 19th century. By the early 20th century, factory electrification, the introduction of the assembly line, and other labor-saving techniques created the system of mass production. In the 21st century, the United States continues to be one of the world's foremost scientific powers, though China has emerged as a major competitor in many fields. The U.S. has the highest research and development expenditures of any country and ranks ninth as a percentage of GDP. In 2022, the United States was (after China) the country with the second-highest number of published scientific papers. In 2021, the U.S. ranked second (also after China) by the number of patent applications, and third by trademark and industrial design applications (after China and Germany), according to World Intellectual Property Indicators. In 2025 the United States ranked third (after Switzerland and Sweden) in the Global Innovation Index. The United States is considered to be a world leader in the development of artificial intelligence technology. In 2023, the United States was ranked the second most technologically advanced country in the world (after South Korea) by Global Finance magazine. The United States has maintained a space program since the late 1950s, beginning with the establishment of the National Aeronautics and Space Administration (NASA) in 1958. NASA's Apollo program (1961–1972) achieved the first crewed Moon landing with the 1969 Apollo 11 mission; it remains one of the agency's most significant milestones. Other major endeavors by NASA include the Space Shuttle program (1981–2011), the Voyager program (1972–present), the Hubble and James Webb space telescopes (launched in 1990 and 2021, respectively), and the multi-mission Mars Exploration Program (Spirit and Opportunity, Curiosity, and Perseverance). NASA is one of five agencies collaborating on the International Space Station (ISS); U.S. contributions to the ISS include several modules, including Destiny (2001), Harmony (2007), and Tranquility (2010), as well as ongoing logistical and operational support. The United States private sector dominates the global commercial spaceflight industry. Prominent American spaceflight contractors include Blue Origin, Boeing, Lockheed Martin, Northrop Grumman, and SpaceX. NASA programs such as the Commercial Crew Program, Commercial Resupply Services, Commercial Lunar Payload Services, and NextSTEP have facilitated growing private-sector involvement in American spaceflight. In 2023, the United States received approximately 84% of its energy from fossil fuel, and its largest source of energy was petroleum (38%), followed by natural gas (36%), renewable sources (9%), coal (9%), and nuclear power (9%). In 2022, the United States constituted about 4% of the world's population, but consumed around 16% of the world's energy. The U.S. ranks as the second-highest emitter of greenhouse gases behind China. The U.S. is the world's largest producer of nuclear power, generating around 30% of the world's nuclear electricity. 
It also has the highest number of nuclear power reactors of any country. As of 2024, the U.S. plans to triple its nuclear power capacity by 2050. The United States' 4 million miles (6.4 million kilometers) of road network, owned almost entirely by state and local governments, is the longest in the world. The extensive Interstate Highway System that connects all major U.S. cities is funded mostly by the federal government but maintained by state departments of transportation. The system is further extended by state highways and some private toll roads. The U.S. is among the top ten countries with the highest vehicle ownership per capita (850 vehicles per 1,000 people) in 2022. A 2022 study found that 76% of U.S. commuters drive alone and 14% ride a bicycle, including bike owners and users of bike-sharing networks. About 11% use some form of public transportation. Public transportation in the United States is well developed in the largest urban areas, notably New York City, Washington, D.C., Boston, Philadelphia, Chicago, and San Francisco; otherwise, coverage is generally less extensive than in most other developed countries. The U.S. also has many relatively car-dependent localities. Long-distance intercity travel is provided primarily by airlines, but travel by rail is more common along the Northeast Corridor, the only high-speed rail in the U.S. that meets international standards. Amtrak, the country's government-sponsored national passenger rail company, has a relatively sparse network compared to that of Western European countries. Service is concentrated in the Northeast, California, the Midwest, the Pacific Northwest, and Virginia/Southeast. The United States has an extensive air transportation network. U.S. civilian airlines are all privately owned. The three largest airlines in the world, by total number of passengers carried, are U.S.-based; American Airlines became the global leader after its 2013 merger with US Airways. Of the 50 busiest airports in the world, 16 are in the United States, as well as five of the top 10. The world's busiest airport by passenger volume is Hartsfield–Jackson Atlanta International in Atlanta, Georgia. In 2022, most of the 19,969 U.S. airports were owned and operated by local government authorities, and there are also some private airports. Some 5,193 are designated as "public use", including for general aviation. The Transportation Security Administration (TSA) has provided security at most major airports since 2001. The country's rail transport network, the longest in the world at 182,412.3 mi (293,564.2 km), handles mostly freight (in contrast to more passenger-centered rail in Europe). Because they are often privately owned operations, U.S. railroads lag behind those of the rest of the world in terms of electrification. The country's inland waterways are the world's fifth-longest, totaling 25,482 mi (41,009 km). They are used extensively for freight, recreation, and a small amount of passenger traffic. Of the world's 50 busiest container ports, four are located in the United States, with the busiest in the country being the Port of Los Angeles.
Demographics
The U.S. Census Bureau reported 331,449,281 residents on April 1, 2020,[v] making the United States the third-most-populous country in the world, after India and China. The Census Bureau's official 2025 population estimate was 341,784,857, an increase of 3.1% since the 2020 census. According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S. 
population had a net gain of one person every 16 seconds, or about 5400 people per day. In 2023, 51% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 34% had never been married. In 2023, the total fertility rate for the U.S. stood at 1.6 children per woman, and, at 23%, it had the world's highest rate of children living in single-parent households in 2019. Most Americans live in the suburbs of major metropolitan areas. The United States has a diverse population; 37 ancestry groups have more than one million members. White Americans with ancestry from Europe, the Middle East, or North Africa form the largest racial and ethnic group at 57.8% of the United States population. Hispanic and Latino Americans form the second-largest group and are 18.7% of the United States population. African Americans constitute the country's third-largest ancestry group and are 12.1% of the total U.S. population. Asian Americans are the country's fourth-largest group, composing 5.9% of the United States population. The country's 3.7 million Native Americans account for about 1%, and some 574 native tribes are recognized by the federal government. In 2024, the median age of the United States population was 39.1 years. While many languages and dialects are spoken in the United States, English is by far the most commonly spoken and written. De facto, English is the official language of the United States, and in 2025, Executive Order 14224 declared English official. However, the U.S. has never had a de jure official language, as Congress has never passed a law to designate English as official for all three federal branches. Some laws, such as U.S. naturalization requirements, nonetheless standardize English. Twenty-eight states and the United States Virgin Islands have laws that designate English as the sole official language; 19 states and the District of Columbia have no official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English: Hawaii (Hawaiian), Alaska (twenty Native languages),[w] South Dakota (Sioux), American Samoa (Samoan), Puerto Rico (Spanish), Guam (Chamorro), and the Northern Mariana Islands (Carolinian and Chamorro). In total, 169 Native American languages are spoken in the United States. In Puerto Rico, Spanish is more widely spoken than English. According to the American Community Survey (2020), some 245.4 million people in the U.S. age five and older spoke only English at home. About 41.2 million spoke Spanish at home, making it the second most commonly used language. Other languages spoken at home by one million people or more include Chinese (3.40 million), Tagalog (1.71 million), Vietnamese (1.52 million), Arabic (1.39 million), French (1.18 million), Korean (1.07 million), and Russian (1.04 million). German, spoken by 1 million people at home in 2010, fell to 857,000 total speakers in 2020. America's immigrant population is by far the world's largest in absolute terms. In 2022, there were 87.7 million immigrants and U.S.-born children of immigrants in the United States, accounting for nearly 27% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 6% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants. 
In 2019, the top countries of origin for immigrants were Mexico (24% of immigrants), India (6%), China (5%), the Philippines (4.5%), and El Salvador (3%). In fiscal year 2022, over one million immigrants (most of whom entered through family reunification) were granted legal residence. The undocumented immigrant population in the U.S. reached a record high of 14 million in 2023. The First Amendment guarantees the free exercise of religion in the country and forbids Congress from passing laws respecting its establishment. Religious practice is widespread, among the most diverse in the world, and profoundly vibrant. The country has the world's largest Christian population, which includes the fourth-largest population of Catholics. Other notable faiths include Judaism, Buddhism, Hinduism, Islam, New Age, and Native American religions. Religious practice varies significantly by region. "Ceremonial deism" is common in American culture. The overwhelming majority of Americans believe in a higher power or spiritual force, engage in spiritual practices such as prayer, and consider themselves religious or spiritual. In the Southern United States' "Bible Belt", evangelical Protestantism plays a significant role culturally; New England and the Western United States tend to be more secular. Mormonism, a Restorationist movement founded in the U.S. in 1830, is the predominant religion in Utah and a major religion in Idaho. About 82% of Americans live in metropolitan areas, particularly in suburbs; about half of those reside in cities with populations over 50,000. In 2022, 333 incorporated municipalities had populations over 100,000, nine cities had more than one million residents, and four cities—New York City, Los Angeles, Chicago, and Houston—had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West. According to the Centers for Disease Control and Prevention (CDC), average U.S. life expectancy at birth reached 79.0 years in 2024, its highest recorded level. This was an increase of 0.6 years over 2023. The CDC attributed the improvement to a significant fall in the number of fatal drug overdoses in the country, noting that "heart disease continues to be the leading cause of death in the United States, followed by cancer and unintentional injuries." In 2024, life expectancy at birth for American men rose to 76.5 years (+0.7 years compared to 2023), while life expectancy for women was 81.4 years (+0.3 years). Starting in 1998, life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been increasing ever since. The Commonwealth Fund reported in 2020 that the U.S. had the highest suicide rate among high-income countries. Approximately one-third of the U.S. adult population is obese and another third is overweight. The U.S. healthcare system far outspends that of any other country, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes when compared to peer countries for reasons that are debated. The United States is the only developed country without a system of universal healthcare, and a significant proportion of the population does not carry health insurance. Government-funded healthcare coverage for the poor (Medicaid) and for those age 65 and older (Medicare) is available to Americans who meet the programs' income or age qualifications.
In 2010, then-President Obama signed the Patient Protection and Affordable Care Act into law.[x] Abortion in the United States is not federally protected and is illegal or restricted in 17 states. American primary and secondary education, known in the U.S. as K–12 ("kindergarten through 12th grade"), is decentralized. School systems are operated by state, territorial, and sometimes municipal governments and regulated by the U.S. Department of Education. In general, children are required to attend school or an approved homeschool from the age of five or six (kindergarten or first grade) until they are 18 years old. This often brings students through the 12th grade, the final year of a U.S. high school, but some states and territories allow them to leave school earlier, at age 16 or 17. The U.S. spends more on education per student than any other country, an average of $18,614 per year per public elementary and secondary school student in 2020–2021. Among Americans age 25 and older, 92.2% graduated from high school, 62.7% attended some college, 37.7% earned a bachelor's degree, and 14.2% earned a graduate degree. The U.S. literacy rate is near-universal. The U.S. has produced the most Nobel Prize winners of any country, with 411 laureates (413 awards). U.S. tertiary or higher education has earned a global reputation. Many of the world's top universities, as listed by various ranking organizations, are in the United States, including 19 of the top 25. American higher education is dominated by state university systems, although the country's many private universities and colleges enroll about 20% of all American students. Local community colleges generally offer open admissions, lower tuition, and coursework leading to a two-year associate degree or a non-degree certificate. As for public expenditures on higher education, the U.S. spends more per student than the OECD average, and Americans spend more than all other nations in combined public and private spending. Colleges and universities directly funded by the federal government do not charge tuition and are limited to military personnel and government employees; they include the U.S. service academies, the Naval Postgraduate School, and military staff colleges. Despite some student loan forgiveness programs being in place, student loan debt increased by 102% between 2010 and 2020, and exceeded $1.7 trillion in 2022. Culture and society The United States is home to a wide variety of ethnic groups, traditions, and customs. The country has been described as having the values of individualism and personal autonomy, as well as a strong work ethic and competitiveness. Voluntary altruism towards others also plays a major role; according to a 2016 study by the Charities Aid Foundation, Americans donated 1.44% of total GDP to charity—the highest rate in the world by a large margin. Americans have traditionally been characterized by a unifying political belief in an "American Creed" emphasizing consent of the governed, liberty, equality under the law, democracy, social equality, property rights, and a preference for limited government. The U.S. has acquired significant hard and soft power through its diplomatic influence, economic power, military alliances, and cultural exports such as American movies, music, video games, sports, and food. The influence that the United States exerts on other countries through soft power is referred to as Americanization. 
Nearly all present Americans or their ancestors came from Europe, Africa, or Asia (the "Old World") within the past five centuries. Mainstream American culture is a Western culture largely derived from the traditions of European immigrants with influences from many other sources, such as traditions brought by slaves from Africa. More recent immigration from Asia and especially Latin America has added to a cultural mix that has been described as a homogenizing melting pot, and a heterogeneous salad bowl, with immigrants contributing to, and often assimilating into, mainstream American culture. Under the First Amendment to the Constitution, the United States is considered to have the strongest protections of free speech of any country. Flag desecration, hate speech, blasphemy, and lese majesty are all forms of protected expression. A 2016 Pew Research Center poll found that Americans were the most supportive of free expression of any polity measured. Additionally, they are the "most supportive of freedom of the press and the right to use the Internet without government censorship". The U.S. is a socially progressive country with permissive attitudes surrounding human sexuality. LGBTQ rights in the United States are among the most advanced by global standards. The American Dream, or the perception that Americans enjoy high levels of social mobility, plays a key role in attracting immigrants. Whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a classless society, scholars identify significant differences between the country's social classes, affecting socialization, language, and values. Americans tend to greatly value socioeconomic achievement, but being ordinary or average is promoted by some as a noble condition as well. The National Foundation on the Arts and the Humanities is an agency of the United States federal government that was established in 1965 with the purpose to "develop and promote a broadly conceived national policy of support for the humanities and the arts in the United States, and for institutions which preserve the cultural heritage of the United States." It is composed of four sub-agencies, among them the National Endowment for the Arts and the National Endowment for the Humanities. Colonial American authors were influenced by John Locke and other Enlightenment philosophers. The American Revolutionary Period (1765–1783) is notable for the political writings of Benjamin Franklin, Alexander Hamilton, Thomas Paine, and Thomas Jefferson. Shortly before and after the Revolutionary War, the newspaper rose to prominence, filling a demand for anti-British national literature. An early novel is William Hill Brown's The Power of Sympathy, published in 1789. Writer and critic John Neal in the early- to mid-19th century helped advance America toward a unique literature and culture by criticizing predecessors such as Washington Irving for imitating their British counterparts, and by influencing writers such as Edgar Allan Poe, who took American poetry and short fiction in new directions. Ralph Waldo Emerson and Margaret Fuller pioneered the influential Transcendentalism movement; Henry David Thoreau, author of Walden, was influenced by this movement. The conflict surrounding abolitionism inspired writers like Harriet Beecher Stowe and authors of slave narratives such as Frederick Douglass. Nathaniel Hawthorne's The Scarlet Letter (1850) explored the dark side of American history, as did Herman Melville's Moby-Dick (1851). 
Major American poets of the 19th century American Renaissance include Walt Whitman, Melville, and Emily Dickinson. Mark Twain was the first major American writer to be born in the West. Henry James achieved international recognition with novels like The Portrait of a Lady (1881). As literacy rates rose, periodicals published more stories centered around industrial workers, women, and the rural poor. Naturalism, regionalism, and realism were the major literary movements of the period. While modernism generally took on an international character, modernist authors working within the United States more often rooted their work in specific regions, peoples, and cultures. Following the Great Migration to northern cities, African-American and black West Indian authors of the Harlem Renaissance developed an independent tradition of literature that rebuked a history of inequality and celebrated black culture. An important cultural export during the Jazz Age, these writings were a key influence on Négritude, a philosophy emerging in the 1930s among francophone writers of the African diaspora. In the 1950s, an ideal of homogeneity led many authors to attempt to write the Great American Novel, while the Beat Generation rejected this conformity, using styles that elevated the impact of the spoken word over mechanics to describe drug use, sexuality, and the failings of society. Contemporary literature is more pluralistic than in previous eras, with the closest thing to a unifying feature being a trend toward self-conscious experiments with language. Twelve American laureates have won the Nobel Prize in Literature. Media in the United States is broadly uncensored, with the First Amendment providing significant protections, as reiterated in New York Times Co. v. United States. The four major broadcasters in the U.S. are the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), American Broadcasting Company (ABC), and Fox Broadcasting Company (Fox). The four major broadcast television networks are all commercial entities. The U.S. cable television system offers hundreds of channels catering to a variety of niches. In 2021, about 83% of Americans over age 12 listened to broadcast radio, while about 40% listened to podcasts. In the prior year, there were 15,460 licensed full-power radio stations in the U.S. according to the Federal Communications Commission (FCC). Much of the public radio broadcasting is supplied by National Public Radio (NPR), incorporated in February 1970 under the Public Broadcasting Act of 1967. U.S. newspapers with a global reach and reputation include The Wall Street Journal, The New York Times, The Washington Post, and USA Today. About 800 publications are produced in Spanish. With few exceptions, newspapers are privately owned, either by large chains such as Gannett or McClatchy, which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in an increasingly rare situation, by individuals or families. Major cities often have alternative newspapers to complement the mainstream daily papers, such as The Village Voice in New York City and LA Weekly in Los Angeles. The five most-visited websites in the world are Google, YouTube, Facebook, Instagram, and ChatGPT—all of them American-owned. Other popular platforms used include X (formerly Twitter) and Amazon. In 2025, the U.S. was the world's second-largest video game market by revenue (after China). In 2015, the U.S. 
video game industry consisted of 2,457 companies that supported around 220,000 jobs and generated $30.4 billion in revenue. There are 444 game publishers, developers, and hardware companies in California alone. According to the Game Developers Conference (GDC), the U.S. is the top location for video game development, with 58% of the world's game developers based there in 2025. The United States is well known for its theater. Mainstream theater in the United States derives from the old European theatrical tradition and has been heavily influenced by the British theater. By the middle of the 19th century, America had created new distinct dramatic forms in the Tom Shows, the showboat theater and the minstrel show. The central hub of the American theater scene is the Theater District in Manhattan, with its divisions of Broadway, off-Broadway, and off-off-Broadway. Many movie and television celebrities have gotten their big break working in New York productions. Outside New York City, many cities have professional regional or resident theater companies that produce their own seasons. The biggest-budget theatrical productions are musicals. U.S. theater has an active community theater culture. The Tony Awards recognize excellence in live Broadway theater and are presented at an annual ceremony in Manhattan. The awards are given for Broadway productions and performances. One is also given for regional theater. Several discretionary non-competitive awards are given as well, including a Special Tony Award, the Tony Honors for Excellence in Theatre, and the Isabelle Stevenson Award. Folk art in colonial America grew out of artisanal craftsmanship in communities that allowed commonly trained people to individually express themselves. It was distinct from Europe's tradition of high art, which was less accessible and generally less relevant to early American settlers. Cultural movements in art and craftsmanship in colonial America generally lagged behind those of Western Europe. For example, the prevailing medieval style of woodworking and primitive sculpture became integral to early American folk art, despite the emergence of Renaissance styles in England in the late 16th and early 17th centuries. The new English styles would have been early enough to make a considerable impact on American folk art, but American styles and forms had already been firmly adopted. Not only did styles change slowly in early America, but there was a tendency for rural artisans there to continue their traditional forms longer than their urban counterparts did—and far longer than those in Western Europe. The Hudson River School was a mid-19th-century movement in the visual arts tradition of European naturalism. The 1913 Armory Show in New York City, an exhibition of European modernist art, shocked the public and transformed the U.S. art scene. American Realism and American Regionalism sought to reflect and give America new ways of looking at itself. Georgia O'Keeffe, Marsden Hartley, and others experimented with new and individualistic styles, which would become known as American modernism. Major artistic movements such as the abstract expressionism of Jackson Pollock and Willem de Kooning and the pop art of Andy Warhol and Roy Lichtenstein developed largely in the United States. Major photographers include Alfred Stieglitz, Edward Steichen, Dorothea Lange, Edward Weston, James Van Der Zee, Ansel Adams, and Gordon Parks. 
The tide of modernism and then postmodernism has brought global fame to American architects, including Frank Lloyd Wright, Philip Johnson, and Frank Gehry. The Metropolitan Museum of Art in Manhattan is the largest art museum in the United States and the fourth-largest in the world. American folk music encompasses numerous music genres, variously known as traditional music, traditional folk music, contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and sometimes trace back to such origins as the British Isles, mainland Europe, or Africa. The rhythmic and lyrical styles of African-American music in particular have influenced American music. Banjos were brought to America through the slave trade. Minstrel shows incorporating the instrument into their acts led to its increased popularity and widespread production in the 19th century. The electric guitar, first invented in the 1930s, and mass-produced by the 1940s, had an enormous influence on popular music, in particular due to the development of rock and roll. The synthesizer, turntablism, and electronic music were also largely developed in the U.S. Elements from folk idioms such as the blues and old-time music were adopted and transformed into popular genres with global audiences. Jazz grew from blues and ragtime in the early 20th century, developing from the innovations and recordings of composers such as W.C. Handy and Jelly Roll Morton. Louis Armstrong and Duke Ellington increased its popularity early in the 20th century. Country music developed in the 1920s, bluegrass and rhythm and blues in the 1940s, and rock and roll in the 1950s. In the 1960s, Bob Dylan emerged from the folk revival to become one of the country's most celebrated songwriters. The musical forms of punk and hip hop both originated in the United States in the 1970s. The United States has the world's largest music market, with a total retail value of $15.9 billion in 2022. Most of the world's major record companies are based in the U.S.; they are represented by the Recording Industry Association of America (RIAA). Mid-20th-century American pop stars, such as Frank Sinatra and Elvis Presley, became global celebrities and best-selling music artists, as have artists of the late 20th century, such as Michael Jackson, Madonna, Whitney Houston, and Mariah Carey, and of the early 21st century, such as Eminem, Britney Spears, Lady Gaga, Katy Perry, Taylor Swift and Beyoncé. The United States has the world's largest apparel market by revenue. Apart from professional business attire, American fashion is eclectic and predominantly informal. Americans' diverse cultural roots are reflected in their clothing; however, sneakers, jeans, T-shirts, and baseball caps are emblematic of American styles. New York, with its Fashion Week, is considered to be one of the "Big Four" global fashion capitals, along with Paris, Milan, and London. A study demonstrated that general proximity to Manhattan's Garment District has been synonymous with American fashion since its inception in the early 20th century. A number of well-known designer labels, among them Tommy Hilfiger, Ralph Lauren, Tom Ford and Calvin Klein, are headquartered in Manhattan. Labels cater to niche markets, such as preteens. New York Fashion Week is one of the most influential fashion shows in the world, and is held twice each year in Manhattan; the annual Met Gala, also in Manhattan, has been called the fashion world's "biggest night". The U.S. 
film industry has a worldwide influence and following. Hollywood, a district in central Los Angeles, the nation's second-most populous city, is also used as a metonym for the American filmmaking industry. The major film studios of the United States are the primary source of the most commercially successful movies in the world, as measured by tickets sold. Largely centered in the New York City region from its beginnings in the late 19th century through the first decades of the 20th century, the U.S. film industry has since been primarily based in and around Hollywood. Nonetheless, American film companies have been subject to the forces of globalization in the 21st century, and an increasing number of films are made elsewhere. The Academy Awards, popularly known as "the Oscars", have been held annually by the Academy of Motion Picture Arts and Sciences since 1929, and the Golden Globe Awards have been held annually since January 1944. The industry peaked in what is commonly referred to as the "Golden Age of Hollywood", from the early sound period until the early 1960s, with screen actors such as John Wayne and Marilyn Monroe becoming iconic figures. In the 1970s, "New Hollywood", or the "Hollywood Renaissance", was defined by grittier films influenced by French and Italian realist pictures of the post-war period. The 21st century has been marked by the rise of American streaming platforms, which came to rival traditional cinema. Early settlers were introduced by Native Americans to foods such as turkey, sweet potatoes, corn, squash, and maple syrup. Among the most enduring and pervasive examples are variations of the native dish called succotash. Early settlers and later immigrants combined these with foods they were familiar with, such as wheat flour, beef, and milk, to create a distinctive American cuisine. New World crops, especially pumpkin, corn, potatoes, and turkey as the main course, are part of a shared national menu on Thanksgiving, when many Americans prepare or purchase traditional dishes to celebrate the occasion. Characteristic American dishes such as apple pie, fried chicken, doughnuts, french fries, macaroni and cheese, ice cream, hamburgers, hot dogs, and American pizza derive from the recipes of various immigrant groups. Mexican dishes such as burritos and tacos preexisted the United States in areas later annexed from Mexico, and adaptations of Chinese cuisine as well as pasta dishes freely adapted from Italian sources are all widely consumed. American chefs have had a significant impact on society both domestically and internationally. In 1946, the Culinary Institute of America was founded by Katharine Angell and Frances Roth. This would become the United States' most prestigious culinary school, where many of the most talented American chefs would study prior to successful careers. The United States restaurant industry was projected at $899 billion in sales for 2020, and employed more than 15 million people, representing 10% of the nation's workforce directly. It is the country's second-largest private employer and the third-largest employer overall. The United States is home to over 220 Michelin star-rated restaurants, 70 of which are in New York City. Wine has been produced in what is now the United States since the 1500s, with the first widespread production beginning in what is now New Mexico in 1628. In the modern U.S., wine production is undertaken in all fifty states, with California producing 84 percent of all U.S. wine. 
With more than 1,100,000 acres (4,500 km2) under vine, the United States is the fourth-largest wine-producing country in the world, after Italy, Spain, and France. The classic American diner, a casual restaurant type originally intended for the working class, emerged during the 19th century from converted railroad dining cars made stationary. The diner soon evolved into purpose-built structures whose number expanded greatly in the 20th century. The American fast-food industry developed alongside the nation's car culture. American restaurants developed the drive-in format in the 1920s, which they began to replace with the drive-through format by the 1940s. American fast-food restaurant chains, such as McDonald's, Burger King, Chick-fil-A, Kentucky Fried Chicken, Dunkin' Donuts and many others, have numerous outlets around the world. The most popular spectator sports in the U.S. are American football, basketball, baseball, soccer, and ice hockey. Their premier leagues are, respectively, the National Football League, the National Basketball Association, Major League Baseball, Major League Soccer, and the National Hockey League. All of these leagues enjoy wide-ranging domestic media coverage and, except for MLS, all are considered the preeminent leagues in their respective sports in the world. While most major U.S. sports such as baseball and American football have evolved out of European practices, basketball, volleyball, skateboarding, and snowboarding are American inventions, many of which have become popular worldwide. Lacrosse and surfing arose from Native American and Native Hawaiian activities that predate European contact. The market for professional sports in the United States was approximately $69 billion in July 2013, roughly 50% larger than that of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States. Although American football does not have a substantial following in other nations, the NFL does have the highest average attendance (67,254) of any professional sports league in the world. In 2024, the NFL generated over $23 billion in revenue, making it the most valuable professional sports league in the United States and the world. Baseball has been regarded as the U.S. "national sport" since the late 19th century. The most-watched individual sports in the U.S. are golf and auto racing, particularly NASCAR and IndyCar. On the collegiate level, earnings for member institutions exceed $1 billion annually, and college football and basketball attract large audiences, as the NCAA March Madness tournament and the College Football Playoff are among the most watched national sporting events. In the U.S., the intercollegiate sports level serves as the main feeder system for professional and Olympic sports, with significant exceptions such as Minor League Baseball. This differs greatly from practices in nearly all other countries, where publicly and privately funded sports organizations serve this function. Eight Olympic Games have taken place in the United States. The 1904 Summer Olympics in St. Louis, Missouri, were the first-ever Olympic Games held outside of Europe. The Olympic Games will be held in the U.S. for a ninth time when Los Angeles hosts the 2028 Summer Olympics. U.S. athletes have won a total of 2,968 medals (1,179 gold) at the Olympic Games, the most of any country. 
In other international competition, the United States is the home of a number of prestigious events, including the America's Cup, World Baseball Classic, the U.S. Open, and the Masters Tournament. The U.S. men's national soccer team has qualified for eleven World Cups, while the women's national team has won the FIFA Women's World Cup and Olympic soccer tournament four and five times, respectively. The 1999 FIFA Women's World Cup was hosted by the United States. Its final match was attended by 90,185, setting the world record for largest women's sporting event crowd at the time. The United States hosted the 1994 FIFA World Cup and will co-host, along with Canada and Mexico, the 2026 FIFA World Cup. |
======================================== |
[SOURCE: https://www.theverge.com/podcast] | [TOKENS: 1802] |
Podcasts from The VergeThe Verge’s award-winning podcasts put you right in the center of the intersection of technology, business, and culture. From in-depth conversations with bleeding-edge tech decision-makers to news and opinion on the latest gadgets, trends, and headlines, Verge podcasts always keep you slightly ahead of the times.Listen and follow wherever you get your podcasts, and subscribe to The Verge for a superior ad-free experience.THE VERGECASTThe Vergecast is the flagship podcast from The Verge about small gadgets, Big Tech, and everything in between. Every Friday, hosts Nilay Patel and David Pierce hang out and make sense of the week’s most important technology news. And every Tuesday, David leads a selection of The Verge’s expert staffers in an exploration of how gadgets and software affect our lives — and which ones you should bring into yours.Subscribe on Apple Podcasts | Google Podcasts | Spotify | Pocket Casts | TuneIn | RSS DECODER WITH NILAY PATELDecoder is a new show from The Verge about big ideas — and other problems. Verge editor-in-chief Nilay Patel talks to a diverse cast of innovators and policymakers at the frontiers of business and technology to reveal how they’re navigating an ever-changing landscape, what keeps them up at night, and what it all means for our shared future.Subscribe on Apple Podcasts | Spotify | Google Podcasts | Pocket Casts | Stitcher VERSION HISTORYVersion History is a show about the best gadgets ever. And the worst ones. And the ones that might have changed the world, if they ever actually shipped. Every week, your favorite people from The Verge and beyond hang out to tell and debate the story of a gadget, app, website, or any other tech product, and try to determine the item’s true legacy. Because not every product is a hit, but every product has a story. And the ones that really matter aren’t always the ones you think.Subscribe everywhere you get your podcasts! PODCAST ARCHIVELAND OF THE GIANTSBig tech is transforming every aspect of our world. But how? And at what cost? In season 6 of Land of the Giants: The Facebook/ Meta Disruption, Shirin Ghaffary and Alex Heath brought us inside the company that’s determined how the world interacts and communicates online. How has Meta shaped our relationships, and what’s in store for us as the company undergoes an unprecedented transition?Listen on Apple Podcasts | Google Podcasts | Spotify | Pocket Casts | TuneIn | RSS. WHY’D YOU PUSH THAT BUTTON?Hosted by Ashley Carman and Kaitlyn Tiffany, Why’d You Push That Button explored the choices technology forces us to make, featuring interviews with consumers, developers, friends, and strangers. It asked the hard, weird, occasionally dumb questions about how your tiny tech decisions impact your social life, like swiping on dating apps, leaving negative restaurant reviews, or indiscriminately liking celebrity photos on Instagram. Listen to the full show archive now.Listen on Apple Podcasts | Google Podcasts | Spotify | Pocket Casts | TuneIn | RSS VERGE EXTRASAn experimental podcast from The Verge that explored new, fun, and interesting ways to entertain with audio. Featuring mini-series shows such as Pirate Radio, Better Worlds, and more.Listen on Apple Podcasts | Google Podcasts | Spotify | Pocket Casts | TuneIn | RSS CONVERGE WITH CASEY NEWTONListen to the archives of Converge, a conversational game show hosted by Casey Newton. 
In each episode, one of the tech industry’s most fascinating entrepreneurs stepped into the hot seat to play a series of tailor-made games that are funny and revealing.Listen on Apple Podcasts | Google Podcasts | Spotify | Pocket Casts | TuneIn | RSS CTRL-WALT-DELETEEnjoy the archives of this series featuring legendary technology columnist Walt Mossberg and The Verge’s editor-in-chief Nilay Patel. The series finale aired on June 13th, 2017, shortly before Walt’s well-deserved retirement.Listen on Apple Podcasts | Google Podcasts | Spotify | Pocket Casts | TuneIn | RSS WHAT’S TECH?Enjoy the archives of this award-winning series from Christopher Thomas Plante and The Verge that explained technology bit by bit. The series finale aired December 6th, 2016, shortly before Chris re-joined Polygon as its executive editor.Listen on Apple Podcasts | Google Podcasts | Spotify | Pocket Casts | TuneIn | RSS
======================================== |
[SOURCE: https://www.wired.com/video/watch/tech-support-collectibles-expert-answers-collectibles-questions] | [TOKENS: 5412] |
Collectibles Expert Answers Collectibles Questions Released on 02/17/2026 Stay away from my Spider-Man #1! I'm Vincent Zurzolo, President of Metropolis Collectibles. Let's answer your questions from the internet. This is Collectibles Support. [upbeat music] Joseph Nicklo asks, What's the most expensive comic book you've ever held in your hands? Well, it's this one. This is Action Comics #1, CGC-graded 9.0, the Nicolas Cage copy. And we just sold this comic book for what is now a world record $15 million. Nicolas Cage was a huge comic book collector and our best customer. This was his copy and in 2000, it was stolen from his house in Los Angeles. We recovered it 11 years later and sold it for Nick for 2.2 million, which was a record at the time. This comic book has been in the same collection since 2011. And only recently, we were able to find a buyer at $15 million. This is the Holy Grail of Holy Grails, the most important comic book, and this happens to be the highest graded copy in the world. Before Action Comics #1, we did have comic books, comic books like this one with detectives, funny animals, teen humor, but there was nothing else like it until 1938, Action Comics #1. Action Comics #1 is the first appearance of Superman and the first appearance of any superhero. People immediately fell in love with Superman and they wanted more. Without this book, we wouldn't have the cottage industry that grew out of it. Batman, Wonder Woman, Captain America, and the list goes on and on. It all started here. @John_Connor_ asks, What's the most expensive Pokemon card and the rarest? Well, back in 2021, Logan Paul, the influencer, purchased a Pikachu Illustrator PSA 10.0 for a record then, $5.3 million. Why is this card so expensive? Well, only 40 copies were ever manufactured, and in 1998, they were given away at a Japanese fan contest. And this happens to be the highest graded copy known at a 10. While I don't have a PSA 10 Illustrator, I do have a PSA 10 Pikachu Van Gogh card. And what makes this really cool is it was given out at the Van Gogh Museum in 2023 and it caused a major disturbance that day. This card is worth approximately $2,500. Very rare, very tough to get and super cool. Electronic_Round_389 asks, How to grade comics and what does grading even do? To give you an idea on how to grade a comic book, I picked three different copies of Batman 259. A low grade, a mid grade, and a high grade copy. As we go higher in the numeric grading scale, from 1 to 10, there are less imperfections on the comic book. For instance, this 4.0 you can see has a ton of wrinkles and creases and tears. Quality is still decent and the page quality is good, but you can see that there's a lot of little imperfections on the spine, on the corners, paper missing, and this gets graded a 4.0 or very good. Next up, we have a mid-grade comic book, which is a 6.0. Here we see a lot less imperfections and defects, but you still have some. Overall though, a really nice looking book with great eye appeal. And last but not least, we have the high-grade comic book. This one, graded a 9.0, is almost perfect. It looks just fantastic and has a great spine, very square bound, beautiful pages, and a beautiful back cover too. To get a clearer picture of the differences between a low grade and a high grade book, I'm gonna show them to you together. Here you can see really, really easily that there are a lot more defects on this copy than there are on this copy. 
If you wanted to get your comic books graded, you could ship them to my company, Metropolis, or you could send them down to CGC, or you can bring them to your local comic shop. In all three cases, an expert will be able to look at your comic books and tell you what the grade is. If you get your comic book graded by CGC, one thing you should know is that it's not completely sealed. We have to allow for gases that are in the comic book holder to come out and for fresh air to get in. So there are built into this system ways for that to happen. Vin-zzz asks, Has anyone ever bought a prop from the set of a movie? Is this even possible? It's not only possible, but there's a whole world of prop collectors out there. I happen to be one of them, and I love props. Being one inch closer to the movie is just such an incredible feeling. I'm gonna show you my favorite prop. This is the Cosmic Cube from Captain America, The First Avenger. And I bought this at auction for $3,500. I've been offered 15,000, and I just will never sell this piece. I love this piece because it's from a pivotal point in the movie where the Red Skull is trying to get his hands on the Cosmic Cube. The auction took place at a comic book convention in Chicago. I remember taking one of my clients with me to the auction. I won this piece and he stayed behind and ended up bidding on the Captain America costume. He won it for what I thought was an insane $250,000. Years later, I got to auction that very piece off for him. Before we shipped it out, I asked my shipping guys, bring it into my office, and I got to try on Chris Evans' Captain America costume. It was amazing. Tangthattangerine asks, Any way to authenticate movie props after they're in-hand? One way is to screen match them. This is a sword from Conan the Destroyer. This is the sequel to Conan the Barbarian. This movie was made in 1984, and yes, this was held by none other than Arnold Schwarzenegger. And the way you can tell if this is a real prop is by screen matching it. You can tell by the little imperfections and dings on the sword and by looking at the film if it actually is the same sword. Collectors, auction houses, and dealers, they sit down and watch the movie frame by frame until they can tell that this is the actual piece. And there's another way to test the authenticity of a prop, it's called provenance. For example, these are the gloves from the opening scene of Enter the Dragon where Bruce Lee is fighting against Sammo Hung in the Shaolin Temple. I saw these online on social media. They were being auctioned and I knew I could not miss my chance to own such iconic props from one of my all-time favorite movies. Once the bidding started, it was hot and heavy, but I kept bidding. I was incredibly confident because I've been collecting Bruce Lee for over 30 years. And I kept bidding with incredible confidence because I knew these were authentic because of the provenance. Here we have a certificate of authenticity signed by Bruce Lee's student and friend, Taky Kimura. So this is a prop where you know the provenance. This was a friend and student of Bruce Lee. NotoriousAmish asks, Why exactly are misprints such a big deal? Well, let's take this book for example. Venom Lethal Protector #1 had a red foil cover, except for several copies that got out to the public with just black. So the foil never made it onto the cover, making these extremely rare and extremely collectible. Why do people love misprints? People love imperfections and rarities. 
These pieces were not meant to get out to the public. Much like the Inverted Jenny stamp, these were never meant to be held by human hands. The Inverted Jenny stamp has an airplane that's put upside down on the stamp. It's incredibly rare, and it is the holy grail of stamp collecting. How valuable is the Inverted Jenny? Well, in 2023, one fetched a whopping price of $2 million. Horusttheweebmaster asks, Can someone explain what the different ages of comics each mean? Well, let's start off with the Golden Age of comics. Boom! Marvel Comics #1. This is the comic book that started the entire Marvel Universe. And it has the first appearance of the Human Torch. Golden Age comic books run from the 1930s to the 1950s and feature a lot of our favorite characters like Superman, Batman, Wonder Woman, and Captain America. And this CGC-graded 4.5 copy of Marvel Comics #1 is worth an astounding $400,000. Next up is The Silver Age. And here we have it from the late '50s to the '60s. This is the period that launched the Marvel universe of comic book superheroes that we all know and love today. Here we have Amazing Fantasy 15, CGC-graded 8.0. This is the first appearance ever of Spider-Man in a comic book. And this copy here is worth close to $350,000. What made these Marvel superheroes stand out was each one of them had a tragic flaw, something that was going wrong in their lives. If DC Comics was the establishment, Marvel Comics was the counterculture. And that's what makes the Silver Age so special. Next up is the Bronze Age of comics, the 1970s, with some of your favorite characters being brought to life for the very first time, including Wolverine in Incredible Hulk 181. This is actually his first full appearance in a comic book. This CGC-graded 9.8 copy is worth a whopping $65,000. This is the Holy Grail of the '70s. What action comics is to the '30s, Wolverine's first appearance is to the '70s. For me, what makes the '70s so special is you have a lot of offbeat characters. The mutants from X-Men are becoming incredibly popular. You've got characters like Swamp Thing and DeathLok, and here we go with one of my favorite periods in comics, the Copper Age of Comics, which brought to life so many great characters including Teenage Mutant Ninja Turtles. This particular issue is graded 9.2 by CGC and is worth approximately $15,000. What makes it so special is this is the introduction of the Teenage Mutant Ninja Turtles. They've never appeared in a comic book before. What really makes the 1980s stand out from any other period in time in comics is the dark turn that they took, from Watchmen to the Turtles to The Dark Knight Returns by Frank Miller. These comic books really set the stage for what will end up being the modern age of comics. Mussygirl89 asks, Do people still watch or collect VHS tapes? You know, in the last 10 years, VHS has become popular again after going pretty much obsolete. People are buying VHS tapes, watching them, and they're getting so popular, just like vinyl records, that manufacturers are starting to make new versions of VHS. VHS collecting has gotten so popular that there are grading companies like this one, VHS DNA. VHS tapes are graded just like comics or any other collectible. You look for imperfections, you see if there are any creases, any dents, any wrinkles in the plastic wrap, and then you grade them. Why would anybody collect VHS tapes? 
Well, in our modern age where people are being bombarded by tons of content on social media and streaming, this takes you back to a simpler day where you could focus on just the movie. Another thing, a movie like Aliens, when you watch it in VHS, it's muddier, it's inkier, and you don't get all that detail. And that's what makes it so special. It simulates what you saw the first time you saw it in the movie theater. And finally, many of these movies were never digitized and the only way you can watch them is on VHS. You might've heard that Disney VHS is the place to be if you're a vintage VHS collector, but don't buy into the hype. The rare and limited print runs of old horror movies from the '70s and '80s, that's where it's at. Yrguiltyconscience asks, What's the most expensive or rare Star Wars collectible and what does it go for? The rocket-firing Boba Fett is the rarest Star Wars collectible, and one sold for a record-breaking $1.3 million. Kenner manufactured the Star Wars line, the original line, including this Chewbacca. And what was cool about the original Boba Fett prototype was it shot out the rocket from his back. They quickly realized that kids might swallow this and get hurt, so they never went to market with it, and that's what makes it so darn rare. Since it was a prototype, only a few exist in the world, and that's what makes it so valuable. And by the way, since we're on the topic of Star Wars, about a year ago, I got to sell one of the rarest Star Wars collectibles. It was the helmet used by a Sand Trooper in the original Star Wars. Only six were manufactured, and only three are known to exist today, and this was one of them. I ended up selling it for a fantastic six-figure number. Pickle121201 asks, How many X-Men first issues from 1963 do you think still exist? Well, back in the '60s, they were printing a ton of these. They used really cheap paper, so it was very inexpensive. I guess they printed originally a half a million copies of X-Men #1. Only a fraction of those still exist today, probably in the mid to high thousands. And what I can show you right now are four examples. What I'm holding in my hand is probably well over $100,000 worth of X-Men #1s. When Stan Lee first came up with this comic book concept, it was about tolerance. This was during the Civil Rights Movement. And what he wanted to show was that people, even if they're different, should be loved. The original title for this comic book was The Mutants, but his publisher at the time thought it sounded like evil, bad guys, the mutants. So he turned it into The X-Men, named after Professor Charles Xavier, the leader of the X-Men. And since we're on the topic of Silver Age #1s, I wanna share something else with you. Here we have a stack of Amazing Spider-Man #1. This is his very first issue, the first appearance of J. Jonah Jameson, the first appearance of the villain, the Chameleon, and it's the first time he ever meets the Fantastic Four. Spider-Man always appealed to me as a little kid because he didn't have big muscles, he wasn't a grownup. He was a kid like me, and he had all the problems that every one of us had when we were kids. There was always somebody bigger picking on us, and you never had any money in your pocket. Maybe the girls weren't always going after you. So a lot of us could relate to poor Peter Parker's problems. These six copies of Amazing Spider-Man #1 are worth in total over $150,000. Here's a question from Quora. What aged vintage toys are worth a lot? 
Toys from the '70s, '80s, and '90s are incredibly popular and highly collected, from Star Wars to Aliens. You name it, and people love it from this time period. Now what happens is as people get older, they hit a certain point where they can start taking discretionary income and buying things that they remember from their childhood. So collectibles, toys, go in a 30-year cycle. Right now, toys from the early 2000s, like Teenage Mutant Ninja Turtles and Tamagotchi, are becoming increasingly popular. So don't throw those out. And by the way, I bought a great toy from my childhood. Check this out. This is from the '70s, and I remember getting this Christmas morning, Dragun from the Shogun Warriors. Pew, pew, pew, pew. [Kids] Mission accomplished! @AedraRising asks, Like, are video game collectors supposed to not play the video games they love so much? Well, there's no hard and fast rule. Some people like to play video games and some people like to collect and invest in them. If you wanna collect and invest in a game, you get 'em certified, like this Centipede game from Wata. How do you grade a video game? It's very similar to comics, cards, and VHS. You're grading the imperfections or lack thereof on the holder, on the box, and also looking at how the wrapper is so tight, so perfect, without any wrinkles, without any tears. This Centipede video game is factory sealed from 1983 in mint condition. Graded by Wata, you can see there are no imperfections on it, the box is perfect, the wrapper is perfect, and it's great. But even if you have just a cartridge, you can also get that graded as well. If you wanna start collecting video games, you can collect NES, SNES, and if you wanna hit that sweet spot of that 30-year-old, nostalgic resurgence of interest, go with PlayStation 1 and Xbox. By the way, I've got a great video game story. Check this out. Couple years ago, I meet Mike Tyson at an event. He's busy, hundreds of people around him, and I finally corner him. I say, Mike, could you please sign my Punch-Out? Grabs his pen, starts to sign it, and just as he is, somebody calls, Hey, Mike, and he looks up and he signed half of it on the spot where we cut it out and half of it on the plastic. I'll never sell this, and it makes a great conversation piece, and it's mine. @jpalmiotti asks, and that's a friend of mine, super comic creator Jimmy Palmiotti, We are seeing super high prices on original comic art. Now I know for a fact more people than ever are collecting, but why do you think the prices are rising so quickly? I think what makes comic art so darn special is each page is one of a kind. This is a production piece of art from X-Men #1 by none other than Jack The King Kirby, one of the greatest comic book creators of all time. He basically invented the Marvel way of storytelling. Super bombastic fight scenes, action like you've never seen before. Comic art appreciation harkens back to several different golden rules. People are buying what they love. It's unique. It's rare. It was basically thrown away for the most part. So anything surviving is really, really valuable and really cool. You may be asking yourself, why are these in black and white and comics are in color? After this process of penciling and inking, copies were made of it and the colorist would color those copies and the editors would adjust the colors until they got it just right and then sent it off to the publishers. And yes, it's true, the prices are going up, up and away. 
I bought this piece of art about 15 years ago for probably around $35,000 and I've already turned down offers of a quarter of a million. Thirsty's New and Used asks, What is the rarest comic of all time? Is it this one? And there's a picture of New Adventure Comics 26, which is an incredibly rare book from 1938. And yes, probably the toughest DC comic to find. New Adventure 26 was printed in a very, very low print run. So there are very few survivors that have made it close to 100 years later. But on the other hand, we also have from Marvel, or Timely at the time, Motion Picture Funnies Weekly #1, which was a giveaway with the first appearance of the first Marvel superhero, Sub-Mariner. In my career of 40 years, I've only bought and sold maybe two or three copies. If somebody found a complete copy of this, at minimal, I think it would be worth about $50,000. Here's a question from the comic book collecting subreddit. What's the first comic book you ever bought that started your love of collecting? That's an awesome question, and I've got the book right here. When I was a little boy, my big brothers bought me this comic book, Astonishing Tales 31. And I didn't know who this guy was, but I wanted to figure it out. I remember going through the pages and seeing the colors and the costumes, and I could not put it down. And this comic book is such a big part of why I fell in love with the whole art form and with comic books in general. The cover is by Ed Hannigan and Bernie Wrightson, who I later got to meet in person as a professional in comic books, but this is the book that started it all. OfficialKnockout wants to know, my uncle gave me a ton of his comics he has been collecting since the '90s to try and sell. I know nothing about them. Where would I start to really understand how to look them up and value them? One thing you can do is pick up a copy of the Overstreet Price Guide. This is the Bible of comic books and has every single comic book ever made. Another way to find out the value of your comic books is to go to either a local comic store, our office is here in Midtown Manhattan, or a great comic book convention in your location. When you go to these conventions, you're surrounded by dealers from all over the country, and they can give you an idea as to what your comic books are worth. LiPerezRey asks, Metropolis Comic, I found my old POG collection from the '90s. How many millions of dollars is it worth? Ready to cash in? Well, since he's a member of the staff here, we're going to ask him to bring on his collection, and I'm gonna do a free appraisal. Wow, that's a lot of POGs. Let's take a look at them. In the early 90s, POGs became really popular. They moved their way west, actually from Hawaii, through the United States mainland. And I remember I was at a San Diego Comic-Con and these appeared out of nowhere. They're cardboard discs that feature anybody from Pocahontas to Michael Jordan. They became incredibly popular in the early to mid-90s. POGs are generally worthless, unless maybe they're in sealed packs like this. And especially, there was a set made for Jurassic Park with holographic dinosaurs on the POGs and the slammer, and these are collectible. In fact, a set sold recently for $2,500. If you wanna know what these are worth, I actually could trade you for some Beanie Babies, another fad collectible from the '90s. Here you go. The Beanie Babies craze was incredible. People couldn't get enough of these. 
You'd buy them in every type of shop you could imagine, from comic book and collectible stores to your local pharmacies. They were everywhere, and some of them were thought to be incredibly scarce. Unfortunately, the interest in Beanie Babies died down and the collectible aspect of them disappeared. An oversaturation of the market for Beanie Babies really caused the downfall and the lack of interest later on over the years. The moral of the story is try to collect things that have true scarcity levels to them, not manufactured collectibles. And also, most importantly, buy what you love. I remember when these showed up at San Diego Comic-Con, it was probably like 1994, and I was like, What the [beep] are these? And why are people collecting little cardboard circles? Dealers were coming up, Do you wanna buy some POGs? I was like, No. [beep] wave asks, What is your favorite, rare, or valuable book and why? Well, I happen to have right here one of my favorite pulps of all time. This is from October 1933. It is Weird Tales. And this cover is painted by none other than the legendary pulp painter, Margaret Brundage. They showcased some of the greatest science fiction, horror, and fantasy stories ever written. These pulps were made with pulp paper, hence the name. It was very inexpensive, and they could produce tons of these. This one was 25 cents. Pulps were widely read by the general public and were incredibly popular. These are the predecessors of comic books and many of the great science fiction, fantasy, and horror stories started in the pulps. For example, Conan started in the pulps. H.P. Lovecraft's Cthulhu Mythos started in the pulps. Those are all the questions for today. Thanks for watching Collectibles Support. [upbeat music fading] |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Mars#cite_ref-jgr107_E6_68-0] | [TOKENS: 11899] |
Contents Mars Mars is the fourth planet from the Sun. It is also known as the "Red Planet", for its orange-red appearance. Mars is a desert-like rocky planet with a tenuous atmosphere that is primarily carbon dioxide (CO2). At the average surface level the atmospheric pressure is a few thousandths of Earth's, atmospheric temperature ranges from −153 to 20 °C (−243 to 68 °F), and cosmic radiation is high. Mars retains some water, in the ground as well as thinly in the atmosphere, forming cirrus clouds, fog, frost, larger polar regions of permafrost and ice caps (with seasonal CO2 snow), but no bodies of liquid surface water. Its surface gravity is roughly a third of Earth's or double that of the Moon. Its diameter, 6,779 km (4,212 mi), is about half the Earth's, or twice the Moon's, and its surface area is the size of all the dry land of Earth. Fine dust is prevalent across the surface and the atmosphere, being picked up and spread at the low Martian gravity even by the weak wind of the tenuous atmosphere. The terrain of Mars roughly follows a north-south divide, the Martian dichotomy, with the northern hemisphere mainly consisting of relatively flat, low lying plains, and the southern hemisphere of cratered highlands. Geologically, the planet is fairly active with marsquakes trembling underneath the ground, but also hosts many enormous volcanoes that are extinct (the tallest is Olympus Mons, 21.9 km or 13.6 mi tall), as well as one of the largest canyons in the Solar System (Valles Marineris, 4,000 km or 2,500 mi long). Mars has two natural satellites that are small and irregular in shape: Phobos and Deimos. With a significant axial tilt of 25 degrees, Mars experiences seasons, like Earth (which has an axial tilt of 23.5 degrees). A Martian solar year is equal to 1.88 Earth years (687 Earth days), a Martian solar day (sol) is equal to 24.6 hours. Mars formed along with the other planets approximately 4.5 billion years ago. During the martian Noachian period (4.5 to 3.5 billion years ago), its surface was marked by meteor impacts, valley formation, erosion, the possible presence of water oceans and the loss of its magnetosphere. The Hesperian period (beginning 3.5 billion years ago and ending 3.3–2.9 billion years ago) was dominated by widespread volcanic activity and flooding that carved immense outflow channels. The Amazonian period, which continues to the present, is the currently dominating and remaining influence on geological processes. Because of Mars's geological history, the possibility of past or present life on Mars remains an area of active scientific investigation, with some possible traces needing further examination. Being visible with the naked eye in Earth's sky as a red wandering star, Mars has been observed throughout history, acquiring diverse associations in different cultures. In 1963 the first flight to Mars took place with Mars 1, but communication was lost en route. The first successful flyby exploration of Mars was conducted in 1965 with Mariner 4. In 1971 Mariner 9 entered orbit around Mars, being the first spacecraft to orbit any body other than the Moon, Sun or Earth; following in the same year were the first uncontrolled impact (Mars 2) and first successful landing (Mars 3) on Mars. Probes have been active on Mars continuously since 1997. At times, more than ten probes have simultaneously operated in orbit or on the surface, more than at any other planet beyond Earth. 
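The lead's figures for size and gravity are mutually consistent and can be checked with the standard relation g = GM/R². A minimal sketch in Python, assuming standard mass values for Mars and Earth and Earth's mean diameter (none of which are quoted in this article) together with the 6,779 km diameter given above:

# Rough consistency check: surface gravity scales as g = G*M/R^2, so
# g_mars / g_earth = (M_mars / M_earth) / (R_mars / R_earth)^2.
# The masses and Earth's diameter are assumed standard values, not figures from this article.
mass_mars = 6.42e23        # kg (assumed)
mass_earth = 5.97e24       # kg (assumed)
diameter_mars = 6779.0     # km, quoted above
diameter_earth = 12742.0   # km, Earth's mean diameter (assumed)

gravity_ratio = (mass_mars / mass_earth) / (diameter_mars / diameter_earth) ** 2
print(f"Mars surface gravity is about {gravity_ratio:.2f} of Earth's")  # ~0.38

The result, about 0.38, matches the "roughly a third of Earth's" figure in the lead and the 38% value given later in the article.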
Mars is an often proposed target for future crewed exploration missions, though no such mission is currently planned. Natural history Scientists have theorized that during the Solar System's formation, Mars was created as the result of a random process of run-away accretion of material from the protoplanetary disk that orbited the Sun. Mars has many distinctive chemical features caused by its position in the Solar System. Elements with comparatively low boiling points, such as chlorine, phosphorus, and sulfur, are much more common on Mars than on Earth; these elements were probably pushed outward by the young Sun's energetic solar wind. After the formation of the planets, the inner Solar System may have been subjected to the so-called Late Heavy Bombardment. About 60% of the surface of Mars shows a record of impacts from that era, whereas much of the remaining surface is probably underlain by immense impact basins caused by those events. However, more recent modeling has disputed the existence of the Late Heavy Bombardment. There is evidence of an enormous impact basin in the Northern Hemisphere of Mars, spanning 10,600 by 8,500 kilometres (6,600 by 5,300 mi), or roughly four times the size of the Moon's South Pole–Aitken basin, which would be the largest impact basin yet discovered if confirmed. It has been hypothesized that the basin was formed when Mars was struck by a Pluto-sized body about four billion years ago. The event, thought to be the cause of the Martian hemispheric dichotomy, created the smooth Borealis basin that covers 40% of the planet. A 2023 study shows evidence, based on the orbital inclination of Deimos (a small moon of Mars), that Mars may once have had a ring system 3.5 to 4 billion years ago. This ring system may have been formed from a moon, 20 times more massive than Phobos, orbiting Mars billions of years ago; Phobos would be a remnant of that ring. Epochs: The geological history of Mars can be split into many periods, but three primary periods are generally distinguished: the Noachian, the Hesperian, and the Amazonian, described above. Geological activity is still taking place on Mars. The Athabasca Valles is home to sheet-like lava flows created about 200 million years ago. Water flows in the grabens called the Cerberus Fossae occurred less than 20 million years ago, indicating equally recent volcanic intrusions. The Mars Reconnaissance Orbiter has captured images of avalanches. Physical characteristics Mars is approximately half the diameter of Earth or twice that of the Moon, with a surface area only slightly less than the total area of Earth's dry land. Mars is less dense than Earth, having about 15% of Earth's volume and 11% of Earth's mass, resulting in about 38% of Earth's surface gravity. Mars is the only presently known example of a desert planet, a rocky planet with a surface akin to that of Earth's deserts. The red-orange appearance of the Martian surface is caused by iron(III) oxide (nanophase Fe2O3) and the iron(III) oxide-hydroxide mineral goethite. It can look like butterscotch; other common surface colors include golden, brown, tan, and greenish, depending on the minerals present. Like Earth, Mars is differentiated into a dense metallic core overlaid by less dense rocky layers. The outermost layer is the crust, which is on average about 42–56 kilometres (26–35 mi) thick, with a minimum thickness of 6 kilometres (3.7 mi) in Isidis Planitia, and a maximum thickness of 117 kilometres (73 mi) in the southern Tharsis plateau. For comparison, Earth's crust averages 27.3 ± 4.8 km in thickness.
The most abundant elements in the Martian crust are silicon, oxygen, iron, magnesium, aluminum, calcium, and potassium. Mars is confirmed to be seismically active; in 2019, it was reported that InSight had detected and recorded over 450 marsquakes and related events. Beneath the crust is a silicate mantle responsible for many of the tectonic and volcanic features on the planet's surface. The upper Martian mantle is a low-velocity zone, where the velocity of seismic waves is lower than surrounding depth intervals. The mantle appears to be rigid down to the depth of about 250 km, giving Mars a very thick lithosphere compared to Earth. Below this the mantle gradually becomes more ductile, and the seismic wave velocity starts to grow again. The Martian mantle does not appear to have a thermally insulating layer analogous to Earth's lower mantle; instead, below 1050 km in depth, it becomes mineralogically similar to Earth's transition zone. At the bottom of the mantle lies a basal liquid silicate layer approximately 150–180 km thick. The Martian mantle appears to be highly heterogenous, with dense fragments up to 4 km across, likely injected deep into the planet by colossal impacts ~4.5 billion years ago; high-frequency waves from eight marsquakes slowed as they passed these localized regions, and modeling indicates the heterogeneities are compositionally distinct debris preserved because Mars lacks plate tectonics and has a sluggishly convecting interior that prevents complete homogenization. Mars's iron and nickel core is at least partially molten, and may have a solid inner core. It is around half of Mars's radius, approximately 1650–1675 km, and is enriched in light elements such as sulfur, oxygen, carbon, and hydrogen. The temperature of the core is estimated to be 2000–2400 K, compared to 5400–6230 K for Earth's solid inner core. In 2025, based on data from the InSight lander, a group of researchers reported the detection of a solid inner core 613 kilometres (381 mi) ± 67 kilometres (42 mi) in radius. Mars is a terrestrial planet with a surface that consists of minerals containing silicon and oxygen, metals, and other elements that typically make up rock. The Martian surface is primarily composed of tholeiitic basalt, although parts are more silica-rich than typical basalt and may be similar to andesitic rocks on Earth, or silica glass. Regions of low albedo suggest concentrations of plagioclase feldspar, with northern low albedo regions displaying higher than normal concentrations of sheet silicates and high-silicon glass. Parts of the southern highlands include detectable amounts of high-calcium pyroxenes. Localized concentrations of hematite and olivine have been found. Much of the surface is deeply covered by finely grained iron(III) oxide dust. The Phoenix lander returned data showing Martian soil to be slightly alkaline and containing elements such as magnesium, sodium, potassium and chlorine. These nutrients are found in soils on Earth, and are necessary for plant growth. Experiments performed by the lander showed that the Martian soil has a basic pH of 7.7, and contains 0.6% perchlorate by weight, concentrations that are toxic to humans. Streaks are common across Mars and new ones appear frequently on steep slopes of craters, troughs, and valleys. The streaks are dark at first and get lighter with age. The streaks can start in a tiny area, then spread out for hundreds of metres. They have been seen to follow the edges of boulders and other obstacles in their path. 
The commonly accepted hypotheses include that they are dark underlying layers of soil revealed after avalanches of bright dust or dust devils. Several other explanations have been put forward, including those that involve water or even the growth of organisms. Environmental radiation levels on the surface are on average 0.64 millisieverts of radiation per day, and significantly less than the radiation of 1.84 millisieverts per day or 22 millirads per day during the flight to and from Mars. For comparison the radiation levels in low Earth orbit, where Earth's space stations orbit, are around 0.5 millisieverts of radiation per day. Hellas Planitia has the lowest surface radiation at about 0.342 millisieverts per day, featuring lava tubes southwest of Hadriacus Mons with potentially levels as low as 0.064 millisieverts per day, comparable to radiation levels during flights on Earth. Although Mars has no evidence of a structured global magnetic field, observations show that parts of the planet's crust have been magnetized, suggesting that alternating polarity reversals of its dipole field have occurred in the past. This paleomagnetism of magnetically susceptible minerals is similar to the alternating bands found on Earth's ocean floors. One hypothesis, published in 1999 and re-examined in October 2005 (with the help of the Mars Global Surveyor), is that these bands suggest plate tectonic activity on Mars four billion years ago, before the planetary dynamo ceased to function and the planet's magnetic field faded. Geography and features Although better remembered for mapping the Moon, Johann Heinrich von Mädler and Wilhelm Beer were the first areographers. They began by establishing that most of Mars's surface features were permanent and by more precisely determining the planet's rotation period. In 1840, Mädler combined ten years of observations and drew the first map of Mars. Features on Mars are named from a variety of sources. Albedo features are named for classical mythology. Craters larger than roughly 50 km are named for deceased scientists and writers and others who have contributed to the study of Mars. Smaller craters are named for towns and villages of the world with populations of less than 100,000. Large valleys are named for the word "Mars" or "star" in various languages; smaller valleys are named for rivers. Large albedo features retain many of the older names but are often updated to reflect new knowledge of the nature of the features. For example, Nix Olympica (the snows of Olympus) has become Olympus Mons (Mount Olympus). The surface of Mars as seen from Earth is divided into two kinds of areas, with differing albedo. The paler plains covered with dust and sand rich in reddish iron oxides were once thought of as Martian "continents" and given names like Arabia Terra (land of Arabia) or Amazonis Planitia (Amazonian plain). The dark features were thought to be seas, hence their names Mare Erythraeum, Mare Sirenum and Aurorae Sinus. The largest dark feature seen from Earth is Syrtis Major Planum. The permanent northern polar ice cap is named Planum Boreum. The southern cap is called Planum Australe. Mars's equator is defined by its rotation, but the location of its Prime Meridian was specified, as was Earth's (at Greenwich), by choice of an arbitrary point; Mädler and Beer selected a line for their first maps of Mars in 1830. 
After the spacecraft Mariner 9 provided extensive imagery of Mars in 1972, a small crater (later called Airy-0), located in the Sinus Meridiani ("Middle Bay" or "Meridian Bay"), was chosen by Merton E. Davies, Harold Masursky, and Gérard de Vaucouleurs for the definition of 0.0° longitude to coincide with the original selection. Because Mars has no oceans, and hence no "sea level", a zero-elevation surface had to be selected as a reference level; this is called the areoid of Mars, analogous to the terrestrial geoid. Zero altitude was defined by the height at which there is 610.5 Pa (6.105 mbar) of atmospheric pressure. This pressure corresponds to the triple point of water, and it is about 0.6% of the sea level surface pressure on Earth (0.006 atm). For mapping purposes, the United States Geological Survey divides the surface of Mars into thirty cartographic quadrangles, each named for a classical albedo feature it contains. In April 2023, The New York Times reported an updated global map of Mars based on images from the Hope spacecraft. A related, but much more detailed, global Mars map was released by NASA on 16 April 2023. The vast upland region Tharsis contains several massive volcanoes, which include the shield volcano Olympus Mons. The edifice is over 600 km (370 mi) wide. Because the mountain is so large, with complex structure at its edges, giving a definite height to it is difficult. Its local relief, from the foot of the cliffs which form its northwest margin to its peak, is over 21 km (13 mi), a little over twice the height of Mauna Kea as measured from its base on the ocean floor. The total elevation change from the plains of Amazonis Planitia, over 1,000 km (620 mi) to the northwest, to the summit approaches 26 km (16 mi), roughly three times the height of Mount Everest, which in comparison stands at just over 8.8 kilometres (5.5 mi). Consequently, Olympus Mons is either the tallest or second-tallest mountain in the Solar System; the only known mountain which might be taller is the Rheasilvia peak on the asteroid Vesta, at 20–25 km (12–16 mi). The dichotomy of Martian topography is striking: northern plains flattened by lava flows contrast with the southern highlands, pitted and cratered by ancient impacts. It is possible that, four billion years ago, the Northern Hemisphere of Mars was struck by an object one-tenth to two-thirds the size of Earth's Moon. If this is the case, the Northern Hemisphere of Mars would be the site of an impact crater 10,600 by 8,500 kilometres (6,600 by 5,300 mi) in size, or roughly the area of Europe, Asia, and Australia combined, surpassing Utopia Planitia and the Moon's South Pole–Aitken basin as the largest impact crater in the Solar System. Mars is scarred by 43,000 impact craters with a diameter of 5 kilometres (3.1 mi) or greater. The largest exposed crater is Hellas, which is 2,300 kilometres (1,400 mi) wide and 7,000 metres (23,000 ft) deep, and is a light albedo feature clearly visible from Earth. There are other notable impact features, such as Argyre, which is around 1,800 kilometres (1,100 mi) in diameter, and Isidis, which is around 1,500 kilometres (930 mi) in diameter. Due to the smaller mass and size of Mars, the probability of an object colliding with the planet is about half that of Earth. Mars is located closer to the asteroid belt, so it has an increased chance of being struck by materials from that source. Mars is more likely to be struck by short-period comets, i.e., those that lie within the orbit of Jupiter. 
Martian craters can have a morphology that suggests the ground became wet after the meteor impact. The large canyon, Valles Marineris (Latin for 'Mariner Valleys', also known as Agathodaemon in the old canal maps), has a length of 4,000 kilometres (2,500 mi) and a depth of up to 7 kilometres (4.3 mi). The length of Valles Marineris is equivalent to the length of Europe and extends across one-fifth the circumference of Mars. By comparison, the Grand Canyon on Earth is only 446 kilometres (277 mi) long and nearly 2 kilometres (1.2 mi) deep. Valles Marineris was formed due to the swelling of the Tharsis area, which caused the crust in the area of Valles Marineris to collapse. In 2012, it was proposed that Valles Marineris is not just a graben, but a plate boundary where 150 kilometres (93 mi) of transverse motion has occurred, making Mars possibly a planet with a two-plate tectonic arrangement. Images from the Thermal Emission Imaging System (THEMIS) aboard NASA's Mars Odyssey orbiter have revealed seven possible cave entrances on the flanks of the volcano Arsia Mons. The caves, named after loved ones of their discoverers, are collectively known as the "seven sisters". Cave entrances measure from 100 to 252 metres (328 to 827 ft) wide and they are estimated to be at least 73 to 96 metres (240 to 315 ft) deep. Because light does not reach the floor of most of the caves, they may extend much deeper than these lower estimates and widen below the surface. "Dena" is the only exception; its floor is visible and was measured to be 130 metres (430 ft) deep. The interiors of these caverns may be protected from micrometeoroids, UV radiation, solar flares and high energy particles that bombard the planet's surface. Martian geysers (or CO2 jets) are putative sites of small gas and dust eruptions that occur in the south polar region of Mars during the spring thaw. "Dark dune spots" and "spiders" – or araneiforms – are the two most visible types of features ascribed to these eruptions. Dust of a given size settles out of the thinner Martian atmosphere sooner than it would on Earth. For example, the dust suspended by the 2001 global dust storms on Mars only remained in the Martian atmosphere for 0.6 years, while the dust from Mount Pinatubo took about two years to settle. However, under current Martian conditions, the mass movements involved are generally much smaller than on Earth. Even the 2001 global dust storms on Mars moved only the equivalent of a very thin dust layer – about 3 μm thick if deposited with uniform thickness between 58° north and south of the equator. Dust deposition at the two rover sites has proceeded at a rate of about the thickness of a grain every 100 sols. Atmosphere Mars lost its magnetosphere 4 billion years ago, possibly because of numerous asteroid strikes, so the solar wind interacts directly with the Martian ionosphere, lowering the atmospheric density by stripping away atoms from the outer layer. Both Mars Global Surveyor and Mars Express have detected ionized atmospheric particles trailing off into space behind Mars, and this atmospheric loss is being studied by the MAVEN orbiter. Compared to Earth, the atmosphere of Mars is quite rarefied. Atmospheric pressure on the surface today ranges from a low of 30 Pa (0.0044 psi) on Olympus Mons to over 1,155 Pa (0.1675 psi) in Hellas Planitia, with a mean pressure at the surface level of 600 Pa (0.087 psi). The highest atmospheric density on Mars is equal to that found 35 kilometres (22 mi) above Earth's surface.
The resulting mean surface pressure is only 0.6% of Earth's 101.3 kPa (14.69 psi). The scale height of the atmosphere is about 10.8 kilometres (6.7 mi), which is higher than Earth's 6 kilometres (3.7 mi), because the surface gravity of Mars is only about 38% of Earth's. The atmosphere of Mars consists of about 96% carbon dioxide, 1.93% argon and 1.89% nitrogen, along with traces of oxygen and water. The atmosphere is quite dusty, containing particulates about 1.5 μm in diameter which give the Martian sky a tawny color when seen from the surface. It may take on a pink hue due to iron oxide particles suspended in it. Despite repeated detections of methane on Mars, there is no scientific consensus as to its origin. One suggestion is that methane exists on Mars and that its concentration fluctuates seasonally. The methane could be produced by non-biological processes such as serpentinization, involving water, carbon dioxide, and the mineral olivine, which is known to be common on Mars, or by Martian life. Mars's higher concentration of atmospheric CO2 and lower surface pressure, compared to Earth, may be why sound is attenuated more there; natural sound sources are rare apart from the wind. Using acoustic recordings collected by the Perseverance rover, researchers concluded that the speed of sound there is approximately 240 m/s for frequencies below 240 Hz, and 250 m/s for those above. Auroras have been detected on Mars. Because Mars lacks a global magnetic field, the types and distribution of auroras there differ from those on Earth; rather than being mostly restricted to polar regions as is the case on Earth, a Martian aurora can encompass the planet. In September 2017, NASA reported that radiation levels on the surface of Mars were temporarily doubled, and were associated with an aurora 25 times brighter than any observed earlier, due to a massive, and unexpected, solar storm in the middle of the month. Like Earth, Mars has seasons, which alternate between its northern and southern hemispheres. Additionally, the orbit of Mars is considerably more eccentric than Earth's; the planet reaches perihelion when it is summer in its southern hemisphere and winter in its northern, and aphelion when it is winter in its southern hemisphere and summer in its northern. As a result, the seasons in the southern hemisphere are more extreme and the seasons in the northern are milder than would otherwise be the case. The summer temperatures in the south can be warmer than the equivalent summer temperatures in the north by up to 30 °C (54 °F). Martian surface temperatures vary from lows of about −110 °C (−166 °F) to highs of up to 35 °C (95 °F) in equatorial summer. The wide range in temperatures is due to the thin atmosphere, which cannot store much solar heat, the low atmospheric pressure (about 1% that of the atmosphere of Earth), and the low thermal inertia of Martian soil. The planet is 1.52 times as far from the Sun as Earth, resulting in just 43% of the amount of sunlight. Mars has the largest dust storms in the Solar System, with wind speeds reaching over 160 km/h (100 mph). These can vary from a storm over a small area to gigantic storms that cover the entire planet. They tend to occur when Mars is closest to the Sun, and have been shown to increase the global temperature. Seasonally, carbon dioxide frost (dry ice) also accumulates on the polar ice caps. Hydrology Mars holds water in substantial amounts, but most of it is dust-covered water ice at the Martian polar ice caps.
The volume of water ice in the south polar ice cap, if melted, would be enough to cover most of the surface of the planet with a depth of 11 metres (36 ft). Water in its liquid form cannot persist on the surface due to Mars's low atmospheric pressure, which is less than 1% that of Earth. Only at the lowest of elevations are the pressure and temperature high enough for liquid water to exist for short periods. Although little water is present in the atmosphere, there is enough to produce clouds of water ice and different cases of snow and frost, often mixed with snow of carbon dioxide dry ice. Landforms visible on Mars strongly suggest that liquid water has existed on the planet's surface. Huge linear swathes of scoured ground, known as outflow channels, cut across the surface in about 25 places. These are thought to be a record of erosion caused by the catastrophic release of water from subsurface aquifers, though some of these structures have been hypothesized to result from the action of glaciers or lava. One of the larger examples, Ma'adim Vallis, is 700 kilometres (430 mi) long, much greater than the Grand Canyon, with a width of 20 kilometres (12 mi) and a depth of 2 kilometres (1.2 mi) in places. It is thought to have been carved by flowing water early in Mars's history. The youngest of these channels is thought to have formed only a few million years ago. Elsewhere, particularly on the oldest areas of the Martian surface, finer-scale, dendritic networks of valleys are spread across significant proportions of the landscape. Features of these valleys and their distribution strongly imply that they were carved by runoff resulting from precipitation in early Mars history. Subsurface water flow and groundwater sapping may play important subsidiary roles in some networks, but precipitation was probably the root cause of the incision in almost all cases. Along craters and canyon walls, there are thousands of features that appear similar to terrestrial gullies. The gullies tend to be in the highlands of the Southern Hemisphere and face the Equator; all are poleward of 30° latitude. A number of authors have suggested that their formation process involves liquid water, probably from melting ice, although others have argued for formation mechanisms involving carbon dioxide frost or the movement of dry dust. No partially degraded gullies have formed by weathering and no superimposed impact craters have been observed, indicating that these are young features, possibly still active. Other geological features, such as deltas and alluvial fans preserved in craters, are further evidence for warmer, wetter conditions at an interval or intervals in earlier Mars history. Such conditions necessarily require the widespread presence of crater lakes across a large proportion of the surface, for which there is independent mineralogical, sedimentological and geomorphological evidence. Further evidence that liquid water once existed on the surface of Mars comes from the detection of specific minerals such as hematite and goethite, both of which sometimes form in the presence of water. The chemical signature of water vapor on Mars was first unequivocally demonstrated in 1963 by spectroscopy using an Earth-based telescope. In 2004, Opportunity detected the mineral jarosite. This forms only in the presence of acidic water, showing that water once existed on Mars. 
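As a rough plausibility check on the south polar cap figure quoted at the start of this section (enough ice to cover the planet to about 11 metres if melted), the implied ice volume can be estimated from Mars's surface area. A minimal sketch, using only the 6,779 km diameter quoted earlier; the roughly 1.6 million km³ result is an illustrative estimate, not a number stated in the text:

import math

# Implied ice volume for an 11 m global layer: volume = surface area * depth.
# Ignores the small density difference between ice and liquid water.
radius_m = 6779e3 / 2                         # Mars's mean radius, from the quoted diameter
surface_area_m2 = 4 * math.pi * radius_m**2   # ~1.44e14 m^2
volume_km3 = surface_area_m2 * 11 / 1e9       # convert m^3 to km^3
print(f"Implied ice volume: about {volume_km3:,.0f} km^3")  # ~1.6 million km^3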
The Spirit rover found concentrated deposits of silica in 2007 that indicated wet conditions in the past, and in December 2011, the mineral gypsum, which also forms in the presence of water, was found on the surface by NASA's Mars rover Opportunity. It is estimated that the amount of water in the upper mantle of Mars, represented by hydroxyl ions contained within Martian minerals, is equal to or greater than that of Earth, at 50–300 parts per million of water, which is enough to cover the entire planet to a depth of 200–1,000 metres (660–3,280 ft). On 18 March 2013, NASA reported evidence from instruments on the Curiosity rover of mineral hydration, likely hydrated calcium sulfate, in several rock samples, including the broken fragments of "Tintina" rock and "Sutton Inlier" rock, as well as in veins and nodules in other rocks like "Knorr" rock and "Wernicke" rock. Analysis using the rover's DAN instrument provided evidence of subsurface water, amounting to as much as 4% water content, down to a depth of 60 centimetres (24 in), during the rover's traverse from the Bradbury Landing site to the Yellowknife Bay area in the Glenelg terrain. In September 2015, NASA announced that it had found strong evidence of hydrated brine flows in recurring slope lineae, based on spectrometer readings of the darkened areas of slopes. These streaks flow downhill in Martian summer, when the temperature is above −23 °C, and freeze at lower temperatures. These observations supported earlier hypotheses, based on timing of formation and their rate of growth, that these dark streaks resulted from water flowing just below the surface. However, later work suggested that the lineae may be dry, granular flows instead, with at most a limited role for water in initiating the process. A definitive conclusion about the presence, extent, and role of liquid water on the Martian surface remains elusive. Researchers suspect that much of the low northern plains of the planet were covered with an ocean hundreds of metres deep, though this theory remains controversial. In March 2015, scientists stated that such an ocean might have been the size of Earth's Arctic Ocean. This finding was derived from the ratio of deuterium to protium (D/H) in the modern Martian atmosphere compared to that ratio on Earth. The amount of Martian deuterium (D/H = (9.3 ± 1.7) × 10⁻⁴) is five to seven times the amount on Earth (D/H = 1.56 × 10⁻⁴), suggesting that ancient Mars had significantly higher levels of water. Results from the Curiosity rover had previously found a high ratio of deuterium in Gale Crater, though not significantly high enough to suggest the former presence of an ocean. Other scientists caution that these results have not been confirmed, and point out that Martian climate models have not yet shown that the planet was warm enough in the past to support bodies of liquid water. Near the northern polar cap is the 81.4 kilometres (50.6 mi) wide Korolev Crater, which the Mars Express orbiter found to be filled with approximately 2,200 cubic kilometres (530 cu mi) of water ice. In November 2016, NASA reported finding a large amount of underground ice in the Utopia Planitia region. The volume of water detected has been estimated to be equivalent to the volume of water in Lake Superior (about 12,100 cubic kilometres). During observations from 2018 through 2021, the ExoMars Trace Gas Orbiter spotted indications of water, probably subsurface ice, in the Valles Marineris canyon system.
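The "five to seven times" enrichment quoted above follows directly from the two D/H values and their stated uncertainty. A minimal check:

# D/H enrichment of the Martian atmosphere relative to Earth, using the values
# quoted above: (9.3 ± 1.7) x 10^-4 for Mars versus 1.56 x 10^-4 for Earth.
mars_dh, uncertainty = 9.3e-4, 1.7e-4
earth_dh = 1.56e-4

low = (mars_dh - uncertainty) / earth_dh
high = (mars_dh + uncertainty) / earth_dh
print(f"Enrichment factor: {low:.1f} to {high:.1f}")  # ~4.9 to 7.1, i.e. roughly five to seven times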
Orbital motion Mars's average distance from the Sun is roughly 230 million km (143 million mi), and its orbital period is 687 (Earth) days. The solar day (or sol) on Mars is only slightly longer than an Earth day: 24 hours, 39 minutes, and 35.244 seconds. A Martian year is equal to 1.8809 Earth years, or 1 year, 320 days, and 18.2 hours. The gravitational potential difference and thus the delta-v needed to transfer between Mars and Earth is the second lowest for Earth. The axial tilt of Mars is 25.19° relative to its orbital plane, which is similar to the axial tilt of Earth. As a result, Mars has seasons like Earth, though on Mars they are nearly twice as long because its orbital period is that much longer. In the present day, the orientation of the north pole of Mars is close to the star Deneb. Mars has a relatively pronounced orbital eccentricity of about 0.09; of the seven other planets in the Solar System, only Mercury has a larger orbital eccentricity. It is known that in the past, Mars has had a much more circular orbit. At one point, 1.35 million Earth years ago, Mars had an eccentricity of roughly 0.002, much less than that of Earth today. Mars's cycle of eccentricity is 96,000 Earth years compared to Earth's cycle of 100,000 years. Mars has its closest approach to Earth (opposition) in a synodic period of 779.94 days. It should not be confused with Mars conjunction, where the Earth and Mars are at opposite sides of the Solar System and form a straight line crossing the Sun. The average time between the successive oppositions of Mars, its synodic period, is 780 days; but the number of days between successive oppositions can range from 764 to 812. The distance at close approach varies between about 54 and 103 million km (34 and 64 million mi) due to the planets' elliptical orbits, which causes comparable variation in angular size. At their furthest Mars and Earth can be as far as 401 million km (249 million mi) apart. Mars comes into opposition from Earth every 2.1 years. The planets come into opposition near Mars's perihelion in 2003, 2018 and 2035, with the 2020 and 2033 events being particularly close to perihelic opposition. The mean apparent magnitude of Mars is +0.71 with a standard deviation of 1.05. Because the orbit of Mars is eccentric, the magnitude at opposition from the Sun can range from about −3.0 to −1.4. The minimum brightness is magnitude +1.86 when the planet is near aphelion and in conjunction with the Sun. At its brightest, Mars (along with Jupiter) is second only to Venus in apparent brightness. Mars usually appears distinctly yellow, orange, or red. When farthest away from Earth, it is more than seven times farther away than when it is closest. Mars is usually close enough for particularly good viewing once or twice at 15-year or 17-year intervals. Optical ground-based telescopes are typically limited to resolving features about 300 kilometres (190 mi) across when Earth and Mars are closest because of Earth's atmosphere. As Mars approaches opposition, it begins a period of retrograde motion, which means it will appear to move backwards in a looping curve with respect to the background stars. This retrograde motion lasts for about 72 days, and Mars reaches its peak apparent brightness in the middle of this interval. Moons Mars has two relatively small (compared to Earth's) natural moons, Phobos (about 22 km (14 mi) in diameter) and Deimos (about 12 km (7.5 mi) in diameter), which orbit at 9,376 km (5,826 mi) and 23,460 km (14,580 mi) around the planet. 
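The orbital distances just quoted also determine how fast the moons circle Mars, via Kepler's third law, T = 2π√(a³/μ). A minimal sketch, assuming the standard gravitational parameter of Mars (μ ≈ 4.283 × 10¹³ m³/s², not given in this article); it shows why Phobos, as described below, rises in the west:

import math

# Kepler's third law: orbital period T = 2*pi*sqrt(a^3 / mu).
# mu (G times Mars's mass) is an assumed standard value, not from this article.
mu_mars = 4.283e13                 # m^3/s^2 (assumed)
a_phobos = 9.376e6                 # m, Phobos's orbital radius quoted above

period_hours = 2 * math.pi * math.sqrt(a_phobos**3 / mu_mars) / 3600
print(f"Phobos orbital period: about {period_hours:.1f} hours")  # ~7.7 h

Because roughly 7.7 hours is much shorter than the 24.6-hour Martian sol, Phobos overtakes the planet's rotation and appears to rise in the west, consistent with the description that follows.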
The origin of both moons is unclear, although a popular theory states that they were asteroids captured into Martian orbit. Both satellites were discovered in 1877 by Asaph Hall and were named after the characters Phobos (the deity of panic and fear) and Deimos (the deity of terror and dread), twins from Greek mythology who accompanied their father Ares, god of war, into battle. Mars was the Roman equivalent to Ares. In modern Greek, the planet retains its ancient name Ares (Aris: Άρης). From the surface of Mars, the motions of Phobos and Deimos appear different from that of the Earth's satellite, the Moon. Phobos rises in the west, sets in the east, and rises again in just 11 hours. Deimos, being only just outside synchronous orbit – where the orbital period would match the planet's period of rotation – rises as expected in the east, but slowly. Because the orbit of Phobos is below a synchronous altitude, tidal forces from Mars are gradually lowering its orbit. In about 50 million years, it could either crash into Mars's surface or break up into a ring structure around the planet. The origin of the two satellites is not well understood. Their low albedo and carbonaceous chondrite composition have been regarded as similar to asteroids, supporting a capture theory. The unstable orbit of Phobos would seem to point toward a relatively recent capture. But both have circular orbits near the equator, which is unusual for captured objects, and the required capture dynamics are complex. Accretion early in the history of Mars is plausible, but would not account for a composition resembling asteroids rather than Mars itself, if that is confirmed. Mars may have yet-undiscovered moons, smaller than 50 to 100 metres (160 to 330 ft) in diameter, and a dust ring is predicted to exist between Phobos and Deimos. A third possibility for their origin as satellites of Mars is the involvement of a third body or a type of impact disruption. More-recent lines of evidence for Phobos having a highly porous interior, and suggesting a composition containing mainly phyllosilicates and other minerals known from Mars, point toward an origin of Phobos from material ejected by an impact on Mars that reaccreted in Martian orbit, similar to the prevailing theory for the origin of Earth's satellite. Although the visible and near-infrared (VNIR) spectra of the moons of Mars resemble those of outer-belt asteroids, the thermal infrared spectra of Phobos are reported to be inconsistent with chondrites of any class. It is also possible that Phobos and Deimos were fragments of an older moon, formed by debris from a large impact on Mars, and then destroyed by a more recent impact upon the satellite. More recently, a study conducted by a team of researchers from multiple countries suggests that a lost moon, at least fifteen times the size of Phobos, may have existed in the past. By analyzing rocks which point to tidal processes on the planet, it is possible that these tides may have been regulated by a past moon. Human observations and exploration The history of observations of Mars is marked by oppositions of Mars when the planet is closest to Earth and hence is most easily visible, which occur every couple of years. Even more notable are the perihelic oppositions of Mars, which are distinguished because Mars is close to perihelion, making it even closer to Earth. The ancient Sumerians named Mars Nergal, the god of war and plague. 
During Sumerian times, Nergal was a minor deity of little significance, but, during later times, his main cult center was the city of Nineveh. In Mesopotamian texts, Mars is referred to as the "star of judgement of the fate of the dead". The existence of Mars as a wandering object in the night sky was also recorded by the ancient Egyptian astronomers and, by 1534 BCE, they were familiar with the retrograde motion of the planet. By the period of the Neo-Babylonian Empire, the Babylonian astronomers were making regular records of the positions of the planets and systematic observations of their behavior. For Mars, they knew that the planet made 37 synodic periods, or 42 circuits of the zodiac, every 79 years. They invented arithmetic methods for making minor corrections to the predicted positions of the planets. In Ancient Greece, the planet was known as Πυρόεις (Pyroeis, "the fiery one"); more commonly, the Greeks called the planet Ares, after their god of war. It was the Romans who named the planet Mars, for their god of war, often represented by the sword and shield of the planet's namesake. In the fourth century BCE, Aristotle noted that Mars disappeared behind the Moon during an occultation, indicating that the planet was farther away than the Moon. Ptolemy, a Greek living in Alexandria, attempted to address the problem of the orbital motion of Mars. Ptolemy's model and his collective work on astronomy were presented in the multi-volume collection later called the Almagest (from the Arabic for "greatest"), which became the authoritative treatise on Western astronomy for the next fourteen centuries. Literature from ancient China confirms that Mars was known by Chinese astronomers by no later than the fourth century BCE. In the East Asian cultures, Mars is traditionally referred to as the "fire star" (火星) based on the Wuxing system. In 1609, Johannes Kepler published a ten-year study of the Martian orbit, using the diurnal parallax of Mars, measured by Tycho Brahe, to make a preliminary calculation of the relative distance to the planet. From Brahe's observations of Mars, Kepler deduced that the planet orbited the Sun not in a circle, but in an ellipse. Moreover, Kepler showed that Mars sped up as it approached the Sun and slowed down as it moved farther away, in a manner that later physicists would explain as a consequence of the conservation of angular momentum. In 1610, the first use of a telescope for astronomical observation, including of Mars, was made by the Italian astronomer Galileo Galilei. The diurnal parallax of Mars was later measured with telescopes in an effort to determine the Sun–Earth distance; this was first done by Giovanni Domenico Cassini in 1672. The early parallax measurements were hampered by the quality of the instruments. The only observed occultation of Mars by Venus was that of 13 October 1590, seen by Michael Maestlin at Heidelberg. By the 19th century, the resolution of telescopes reached a level sufficient for surface features to be identified. On 5 September 1877, a perihelic opposition of Mars occurred. The Italian astronomer Giovanni Schiaparelli used a 22-centimetre (8.7 in) telescope in Milan to help produce the first detailed map of Mars. These maps notably contained features he called canali, which, with the possible exception of the natural canyon Valles Marineris, were later shown to be an optical illusion. These canali were supposedly long, straight lines on the surface of Mars, to which he gave the names of famous rivers on Earth.
His term, which means "channels" or "grooves", was popularly mistranslated in English as "canals". Influenced by these observations, the orientalist Percival Lowell founded an observatory which had 30- and 45-centimetre (12- and 18-in) telescopes. The observatory was used for the exploration of Mars during the last good opportunity in 1894 and the following, less favorable, oppositions. He published several books on Mars and life on the planet, which had a great influence on the public. The canali were independently observed by other astronomers, like Henri Joseph Perrotin and Louis Thollon in Nice, using one of the largest telescopes of that time. The seasonal changes (consisting of the diminishing of the polar caps and the dark areas formed during Martian summers), in combination with the canals, led to speculation about life on Mars, and it was a long-held belief that Mars contained vast seas and vegetation. As bigger telescopes were used, fewer long, straight canali were observed. During observations in 1909 by Antoniadi with an 84-centimetre (33 in) telescope, irregular patterns were observed, but no canali were seen. The first spacecraft from Earth to visit Mars was Mars 1 of the Soviet Union, which flew by in 1963, but contact was lost en route. NASA's Mariner 4 followed and became the first spacecraft to successfully return data from Mars; launched on 28 November 1964, it made its closest approach to the planet on 15 July 1965. Mariner 4 detected the weak Martian radiation belt, measured at about 0.1% that of Earth, and captured the first images of another planet from deep space. Once spacecraft visited the planet during the 1960s and 1970s, many earlier conceptions of Mars were overturned. After the results of the Viking life-detection experiments, the hypothesis of a dead planet was generally accepted. The data from Mariner 9 and Viking allowed better maps of Mars to be made. Between the shutdown of Viking 1 in 1982 and 1997, Mars was visited only by three unsuccessful probes: Phobos 1 (1988) and Mars Observer (1993), which both lost contact before reaching the planet, and Phobos 2 (1989), which malfunctioned in orbit before reaching its destination, the moon Phobos. In 1997, Mars Pathfinder became the first successful rover mission beyond the Moon and, together with Mars Global Surveyor (operated until late 2006), began an uninterrupted active robotic presence at Mars that has lasted to this day. Mars Global Surveyor produced complete, extremely detailed maps of the Martian topography, magnetic field and surface minerals. Starting with these missions, a range of new, improved uncrewed spacecraft, including orbiters, landers, and rovers, has been sent to Mars, with successful missions by NASA (United States), JAXA (Japan), ESA, the United Kingdom, ISRO (India), Roscosmos (Russia), the United Arab Emirates, and CNSA (China) to study the planet's surface, climate, and geology, uncovering the history and dynamics of the Martian hydrosphere and possible traces of ancient life. As of 2023, Mars is host to ten functioning spacecraft. Eight are in orbit: 2001 Mars Odyssey, Mars Express, Mars Reconnaissance Orbiter, MAVEN, ExoMars Trace Gas Orbiter, the Hope orbiter, and the Tianwen-1 orbiter. Another two are on the surface: the Mars Science Laboratory Curiosity rover and the Perseverance rover. Collected maps are available online at websites including Google Mars.
NASA provides two online tools: Mars Trek, which provides visualizations of the planet using data from 50 years of exploration, and Experience Curiosity, which simulates traveling on Mars in 3-D with Curiosity. Several further missions to Mars are planned. As of February 2024, debris on Mars from these missions has reached over seven tons. Most of it consists of crashed and inactive spacecraft as well as discarded components. In April 2024, NASA selected several companies to begin studies on providing commercial services to further enable robotic science on Mars. Key areas include establishing telecommunications, payload delivery and surface imaging. Habitability and habitation During the late 19th century, it was widely accepted in the astronomical community that Mars had life-supporting qualities, including the presence of oxygen and water. However, in 1894 W. W. Campbell at Lick Observatory observed the planet and found that "if water vapor or oxygen occur in the atmosphere of Mars it is in quantities too small to be detected by spectroscopes then available". That observation contradicted many of the measurements of the time and was not widely accepted. Campbell and V. M. Slipher repeated the study in 1909 using better instruments, but with the same results. It was not until the findings were confirmed by W. S. Adams in 1925 that the myth of the Earth-like habitability of Mars was finally broken. However, even in the 1960s, articles were still published on Martian biology, setting aside explanations other than life for the seasonal changes on Mars. The current understanding of planetary habitability – the ability of a world to develop environmental conditions favorable to the emergence of life – favors planets that have liquid water on their surface. Most often this requires the orbit of a planet to lie within the habitable zone, which for the Sun is estimated to extend from within the orbit of Earth to about that of Mars. During perihelion, Mars dips inside this region, but Mars's thin (low-pressure) atmosphere prevents liquid water from existing over large regions for extended periods. The past flow of liquid water demonstrates the planet's potential for habitability. Recent evidence has suggested that any water on the Martian surface may have been too salty and acidic to support regular terrestrial life. The environmental conditions on Mars are a challenge to sustaining organic life: the planet has little heat transfer across its surface, poor insulation against bombardment by the solar wind because it lacks a magnetosphere, and insufficient atmospheric pressure to retain water in liquid form (water instead sublimes to a gaseous state). Mars is nearly, or perhaps totally, geologically dead; the end of volcanic activity has apparently stopped the recycling of chemicals and minerals between the surface and interior of the planet. Evidence suggests that the planet was once significantly more habitable than it is today, but whether living organisms ever existed there remains unknown. The Viking probes of the mid-1970s carried experiments designed to detect microorganisms in Martian soil at their respective landing sites and had positive results, including a temporary increase in CO2 production on exposure to water and nutrients. This sign of life was later disputed by scientists, resulting in a continuing debate, with NASA scientist Gilbert Levin asserting that Viking may have found life.
A 2014 analysis of Martian meteorite EETA79001 found chlorate, perchlorate, and nitrate ions in sufficiently high concentrations to suggest that they are widespread on Mars. UV and X-ray radiation would turn chlorate and perchlorate ions into other, highly reactive oxychlorines, indicating that any organic molecules would have to be buried under the surface to survive. Small quantities of methane and formaldehyde detected by Mars orbiters are both claimed to be possible evidence for life, as these chemical compounds would quickly break down in the Martian atmosphere. Alternatively, these compounds may instead be replenished by volcanic or other geological means, such as serpentinization. Impact glass, formed by meteor impacts, can preserve signs of life on Earth; such glass has also been found on the surfaces of impact craters on Mars and could likewise have preserved signs of life there, if life existed at those sites. The Cheyava Falls rock, discovered on Mars in June 2024, has been designated by NASA as a "potential biosignature" and was core sampled by the Perseverance rover for possible return to Earth and further examination. Although highly intriguing, the rock's biological or abiotic origin cannot be definitively determined with the data currently available. Several plans for a human mission to Mars have been proposed, but none have come to fruition. The NASA Authorization Act of 2017 directed NASA to study the feasibility of a crewed Mars mission in the early 2030s; the resulting report concluded that this would not be feasible. In 2021, China announced plans for a crewed Mars mission in 2033. Privately held companies such as SpaceX have also proposed plans to send humans to Mars, with the eventual goal of settling the planet. As of 2024, SpaceX has proceeded with the development of the Starship launch vehicle with the goal of Mars colonization. In plans shared with the company in April 2024, Elon Musk envisions the beginning of a Mars colony within the next twenty years. This would be enabled by the planned mass manufacturing of Starship and sustained initially by resupply from Earth and by in-situ resource utilization on Mars, until the colony reaches full self-sustainability. Any future human mission to Mars will likely take place within the optimal Mars launch window, which occurs every 26 months. The moon Phobos has been proposed as an anchor point for a space elevator. Besides national space agencies and space companies, groups such as the Mars Society and The Planetary Society advocate for human missions to Mars. In culture Mars is named after the Roman god of war (Greek Ares), but was also associated with the demi-god Heracles (Roman Hercules) by ancient Greek astronomers, as detailed by Aristotle. This association between Mars and war dates back at least to Babylonian astronomy, in which the planet was named for the god Nergal, deity of war and destruction. It persisted into modern times, as exemplified by Gustav Holst's orchestral suite The Planets, whose famous first movement labels Mars "The Bringer of War". The planet's symbol, a circle with a spear pointing out to the upper right, is also used as a symbol for the male gender. The symbol dates from at least the 11th century, though a possible predecessor has been found in the Greek Oxyrhynchus Papyri. The idea that Mars was populated by intelligent Martians became widespread in the late 19th century.
Schiaparelli's "canali" observations combined with Percival Lowell's books on the subject put forward the standard notion of a planet that was a drying, cooling, dying world with ancient civilizations constructing irrigation works. Many other observations and proclamations by notable personalities added to what has been termed "Mars Fever". In the present day, high-resolution mapping of the surface of Mars has revealed no artifacts of habitation, but pseudoscientific speculation about intelligent life on Mars still continues. Reminiscent of the canali observations, these speculations are based on small scale features perceived in the spacecraft images, such as "pyramids" and the "Face on Mars". In his book Cosmos, planetary astronomer Carl Sagan wrote: "Mars has become a kind of mythic arena onto which we have projected our Earthly hopes and fears." The depiction of Mars in fiction has been stimulated by its dramatic red color and by nineteenth-century scientific speculations that its surface conditions might support not just life but intelligent life. This gave way to many science fiction stories involving these concepts, such as H. G. Wells's The War of the Worlds, in which Martians seek to escape their dying planet by invading Earth; Ray Bradbury's The Martian Chronicles, in which human explorers accidentally destroy a Martian civilization; as well as Edgar Rice Burroughs's series Barsoom, C. S. Lewis's novel Out of the Silent Planet (1938), and a number of Robert A. Heinlein stories before the mid-sixties. Since then, depictions of Martians have also extended to animation. A comic figure of an intelligent Martian, Marvin the Martian, appeared in Haredevil Hare (1948) as a character in the Looney Tunes animated cartoons of Warner Brothers, and has continued as part of popular culture to the present. After the Mariner and Viking spacecraft had returned pictures of Mars as a lifeless and canal-less world, these ideas about Mars were abandoned; for many science-fiction authors, the new discoveries initially seemed like a constraint, but eventually the post-Viking knowledge of Mars became itself a source of inspiration for works like Kim Stanley Robinson's Mars trilogy. See also Notes References Further reading External links Solar System → Local Interstellar Cloud → Local Bubble → Gould Belt → Orion Arm → Milky Way → Milky Way subgroup → Local Group → Local Sheet → Local Volume → Virgo Supercluster → Laniakea Supercluster → Pisces–Cetus Supercluster Complex → Local Hole → Observable universe → UniverseEach arrow (→) may be read as "within" or "part of". |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Computer_Conservation_Society] | [TOKENS: 253] |
Contents Computer Conservation Society The Computer Conservation Society (CCS) is a British organisation, founded in 1989. It is under the joint umbrella of the British Computer Society (BCS), the London Science Museum and the Manchester Museum of Science and Industry. Overview The CCS is interested in the history of computing in general and the conservation and preservation of early British historical computers in particular. The society runs a series of monthly public lectures between September and May each year in both London and Manchester. The events are detailed on the society's website. The CCS publishes a quarterly journal, Resurrection. The society celebrated its 25th anniversary in 2014. Dr Doron Swade, formerly the curator of the computing collection at the London Science Museum, was a founding committee member and, as of 2021, is the current chair of the society. David Morriss, Rachel Burnett, and Roger Johnson are previous chairs, all of whom are also past presidents of the BCS. Projects The society organises a number of projects to reconstruct and maintain early computers and to conserve early software. Locations Its projects are hosted at the London Science Museum, the Museum of Science and Industry in Manchester, The National Museum of Computing, and the Bletchley Park Trust; some are currently not on public display.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_ref-55] | [TOKENS: 10515] |
Contents Elon Musk Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026,[update] Forbes estimates his net worth to be around US$852 billion. Born into a wealthy family in Pretoria, South Africa, Musk emigrated in 1989 to Canada; he has Canadian citizenship since his mother was born there. He received bachelor's degrees in 1997 from the University of Pennsylvania before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002. Musk also became an American citizen in 2002. In 2002, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research, but later left; growing discontent with the organization's direction and their leadership in the AI boom in the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes, and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, a Tesla pay package worth $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Musk was the largest donor in the 2024 U.S. presidential election, where he supported Donald Trump. After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published between 2025–26 and became a topic of worldwide debate. Early life Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa. Musk therefore holds both South African and Canadian citizenship from birth. 
His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer, who partly owned a rental lodge at Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa. Despite both Elon and Errol previously stating that Errol was a part owner of a Zambian emerald mine, in 2023, Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party and has said that his children shared their father's dislike of apartheid. After his parents divorced in 1979, Elon, aged around 9, chose to live with his father because Errol Musk had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies" where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten severely, leading to him being hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?" Elon was an enthusiastic reader of books, and had attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself how to program from the VIC-20 user manual. At age twelve, Elon sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, where he graduated. Musk was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification. Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, as well as to ease his path to immigration to the United States. While waiting for his application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill. In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997 – a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School. 
He reportedly hosted large, ticketed house parties to help pay for tuition, and wrote a business plan for an electronic book-scanning service similar to Google Books. In 1994, Musk held two internships in Silicon Valley: one at energy storage startup Pinnacle Research Institute, which investigated electrolytic supercapacitors for energy storage, and another at Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided to join the Internet boom of the 1990s, applying for a job at Netscape, to which he reportedly never received a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa transitioned to an H1-B. According to numerous former business associates and shareholders, Musk said he was on a student visa at the time. Business career In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors. They housed the venture at a small rented office in Palo Alto. Replying to Rolling Stone, Musk denounced the notion that they started their company with funds borrowed from Errol Musk, but in a tweet, he recognized that his father contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune, and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590,000,000 in 2025), and Musk received $22 million (equivalent to $43,000,000 in 2025) for his 7-percent share. In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and, in its initial months of operation, over 200,000 customers joined the service. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with online bank Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign. Due to resulting technological issues and lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000.[b] Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2,700,000,000 in 2025) in stock, of which Musk—the largest shareholder with 11.72% of shares—received $175.8 million (equivalent to $320,000,000 in 2025). 
In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value. In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth-chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune, (equivalent to $180,000,000 in 2025) Musk founded SpaceX in May 2002 and became the company's CEO and Chief Engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, it was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2,400,000,000 in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, in 2015 SpaceX successfully landed the first stage of a Falcon 9 on a land platform. Later landings were achieved on autonomous spaceport drone ships, an ocean-based recovery platform. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, the Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million (equivalent to $865,000,000 in 2025) contract to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan. In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large constellation was deployed in May 2019. As of May 2025[update], over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 to be $10 billion (equivalent to $12,000,000,000 in 2025).[c] During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440,000,000 in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response. Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement. 
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11,000,000 in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm.[page needed] Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass production all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several well-selling electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In May 2020, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over him tweeting that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in summer 2020, and it entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1,200,000,000,000 in 2025), the sixth company in U.S. history to do so. Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second largest provider of solar power systems in the United States. In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2,700,000,000 in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor. In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence (AI) by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions like spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials. 
Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trials, which have caused the deaths of some monkeys, have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink.[needs update] In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018. It used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021. Local officials have approved further expansions of the tunnel system. In early 2017, Musk expressed interest in buying Twitter, having questioned the platform's commitment to freedom of speech. By 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder.[d] Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter. By the end of April, Musk had successfully concluded his bid for approximately $44 billion. This included approximately $12.5 billion in loans and $21 billion in equity financing. After initially attempting to back out of the deal, Musk bought the company on October 27, 2022. Immediately after the acquisition, Musk fired several top Twitter executives, including CEO Parag Agrawal, and became CEO himself. Under Musk, Twitter instituted monthly subscriptions for a "blue check" and laid off a significant portion of the company's staff. Musk reduced content moderation, and hate speech increased on the platform after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of the Hunter Biden laptop controversy in the lead-up to the 2020 presidential election. Following a Twitter poll, Musk promised to step down as CEO; five months later, he did so, transitioning to executive chairman and chief technology officer (CTO). Despite Musk stepping down as CEO, X continues to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence some of his critics, such as Twitch streamer Asmongold, who criticized him during one of his streams. Musk has been accused of removing such critics' blue checkmarks, which hinders visibility and is considered a form of shadow banning, or of suspending their accounts without justification. 
Other activities In August 2013, Musk announced plans for a version of a vactrain and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances. In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence, intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company, and initially gave $50 million. In 2018, Musk left the OpenAI board. Since 2018, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings like OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI. Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets and the consequent fossil fuel usage have received criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter and temporarily banned the accounts of journalists who posted stories regarding the incident, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept. In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content but framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example, the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies. Politics Musk is an outlier among business leaders, who typically avoid partisan political advocacy. Musk was a registered independent voter when he lived in California. Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 Texas's 34th congressional district special election. In 2024, he started supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income; he also endorsed Kanye West's 2020 presidential campaign. 
In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign, and hosted DeSantis's campaign announcement on a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips. In October 2025, former vice president Kamala Harris commented that it was a mistake on the Democratic side not to invite Musk to a White House electric vehicle event organized in August 2021 and featuring executives from General Motors, Ford and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space." Fortune remarked that the exclusion was a nod to the United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation, and guessed that the non-invitation had affected Musk's perspective. Fortune noted that, at the time, Musk said, "Yeah, seems odd that Tesla wasn't invited." A month later, he criticized the Biden White House as "not the friendliest administration." Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen and other capitalists actually flourished under Biden, but that the tech leaders chose Trump for their common ground on cultural issues. By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying, "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a campaign rally, and promoted conspiracy theories and falsehoods about Democrats, election fraud and immigration in support of Trump. Musk was the largest individual donor of the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers. In 2023, Musk said he shunned the World Economic Forum because it was boring; the organization commented that it had not invited him since 2015. He has, however, participated in Dialog, an event dubbed "Tech Bilderberg" and organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain and the United Kingdom, particularly due to his position in the U.S. government as well as his ownership of X. An NBC News analysis found he had boosted far-right political movements to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023. 
During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or a fascist Roman salute.[e] He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided on partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries. The concept of DOGE emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, with Musk accepting the offer. In November and December 2024, Musk suggested that the organization could help to cut the U.S. federal budget, consolidate the number of federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role was not clearly defined. In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE. A federal judge has ruled that Musk acted as the de facto leader of DOGE. Musk's role in the second Trump administration, particularly in response to DOGE, has attracted public backlash. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He has prioritized secrecy within the organization and has accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by this time, most of them of children. By November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when his 130-day term as a special government employee expired, with a White House official confirming that Musk's offboarding from the Trump administration was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025. 
After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit. A feud began between Musk and Trump, its most notable event being Musk alleging on X (formerly Twitter) on June 5, 2025, that Trump had ties to the sex offender Jeffrey Epstein, posting that "@realDonaldTrump is in the Epstein files. That is the real reason they have not been made public." Trump responded on Truth Social, stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away, and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk, and added that Musk would face "very serious consequences" if he funded Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have received criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates up until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth taxes, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration, and regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially the colonization of Mars. He has repeatedly pushed for humanity to colonize Mars in order to become an interplanetary species and lower the risk of human extinction. Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. While he describes himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdown restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and was described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024. 
Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was subsequently attacked as being responsible for spreading misinformation and amplifying the far-right. He has also voiced his support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S. Securities and Exchange Commission (SEC) for a tweet stating that funding had been secured for potentially taking Tesla private.[f] The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were fined $20 million each, and Musk was forced to step down for three years as Tesla chairman but was able to remain as CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the previous agreement details, including a list of topics about which Musk needed preclearance. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla stock price ("too high imo") violated the agreement. Freedom of Information Act (FOIA)-released records showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting regarding "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded. McCormick called the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger's syndrome in a TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ... 
but then that turned out to be wrong — [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated he uses doctor-prescribed ketamine for occasional depression and that he doses "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine and other drugs. Musk at first refused to comment on his alleged drug use, before responding that he had not tested positive for drugs, and that if drugs somehow improved his productivity, "I would definitely take them!". The New York Times' investigations revealed Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concerns from close associates who have become troubled by his public behavior as he became more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he stated has a "'restoring effect' that helps his 'mental calibration'". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings. Musk has justified the boosting by claiming that all top accounts do it so he has to as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its creator Ubisoft for "woke" content. Musk posted to X that "DEI kills art" and specified the inclusion of the historical figure Yasuke in the Assassin's Creed game as offensive; he also called the game "terrible". Ubisoft responded by saying that Musk's comments were "just feeding hatred" and that they were focused on producing a game not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest that the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000. In 2002, their first child Nevada Musk died of sudden infant death syndrome at the age of 10 weeks. After his death, the couple used in vitro fertilization (IVF) to continue their family; they had twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and have shared custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year. 
After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters that are not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the pregnancy, Musk confirmed reports that the couple were "semi-separated" in September 2021; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Elon Musk has taken X Æ A-Xii to multiple official events in Washington, D.C. during Trump's second term in office. Also in July 2022, The Wall Street Journal reported that Musk allegedly had an affair with Nicole Shanahan, the wife of Google co-founder Sergey Brin, in 2021, leading to their divorce the following year. Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported Musk bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born in 2024 via surrogacy and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, which media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St Clair had filed for sole custody of her five-month-old son and for Musk to be recognised as the child's father. On March 31, 2025, Musk wrote that, while he was unsure if he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] Later reporting from the Wall Street Journal indicated that $1 million of these payments to St. Clair were structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island. In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place after Epstein later cancelled the plans.[k] On Christmas day in 2012, Musk emailed Epstein asking "Do you have any parties planned? 
I’ve been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I’m looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable, to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein sent an email asking Musk if he had any plans to come to New York for the opening of the United Nations General Assembly, where many "interesting people" would be coming to his house, to which Musk responded that "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein responded by stating "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 if he introduced Epstein to Mark Zuckerberg, Musk responded: "I don’t recall introducing Epstein to anyone, as I don’t know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred. Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l] Wealth Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. Musk was first listed on the Forbes Billionaires List in 2012; around 75% of his wealth was derived from Tesla stock in November 2020, although he describes himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021; $400 billion in December 2024; $500 billion in October 2025; $600 billion in mid-December 2025; $700 billion later that month; and $800 billion in February 2026. In November 2025, a Tesla pay package worth potentially $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Public image Although his ventures have been highly influential within their separate industries starting in the 2000s, Musk only became a public figure in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions and often makes controversial statements, in contrast to other billionaires who prefer reclusiveness in order to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance described people's opinions of Musk as polarized due to his "part philosopher, part troll" persona on Twitter. He has drawn condemnation for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to matters like British survivors of grooming gangs. 
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president" or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021. Musk was selected as Time's "Person of the Year" for 2021. Then-Time editor-in-chief Edward Felsenthal wrote that "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too." |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Greater_Central_Asia] | [TOKENS: 856] |
Contents Greater Central Asia Greater Central Asia (GCA) is a variously defined region encompassing the area in and around Central Asia, by one definition including Pakistan, Iran, Turkey, Xinjiang (in China), and Afghanistan, and by a more expansive definition, excluding Turkey but including Mongolia and parts of India and Russia. The region was historically interconnected religiously, economically, and otherwise, being important as part of the Silk Road trading network until the 15th century; the competition between Soviet, British, and Chinese spheres of influence split the region apart in the 20th century. In the 21st century, it has been contested by a number of major powers, such as the United States, China, and Russia. The region is defined to a significant extent by its many tribal/clan alliances and histories. History In ancient times, GCA was involved in the Silk Road, and was greatly influenced by Buddhism as it transmitted through the region to East Asia. The region was important in an intellectual sense, coming up with many new ideas and connecting the intellectual spheres of neighboring Eurasian regions. Alexander the Great's conquests throughout the region, culminating in northwest India, Hellenized the region and left Greek kingdoms such as the Greco-Bactrian Kingdom in their wake. The Kushan Empire was one of the first empires to unite most of GCA. The Mongol conquest of Central Asia in the 13th century increased the economic connectivity of the region. The Islamization of GCA was ongoing during this time period; Arab conquests of the region from the 7th century onward had surpassed the conquests of the region from the previous millennium in bringing cultural and religious change, with the southern regions of GCA having converted to Islam within the first Islamic century, while the northern parts of Central Asia took closer to a millennium; Central Asia then went on to be a core contributor to the Islamic Golden Age. However, non-Muslim areas of GCA such as Mongolia still share common religious heritage with neighboring areas through elements such as Tengrism. Central Asian conquests of India in the first half of the second millennium, primarily by Timur and later Babur, then resulted in the spread of a Turco-Persian tradition throughout GCA and through northwestern South Asia into the rest of South Asia. By the 17th century, the importance of the Silk Road had declined due to the rise of maritime trade. The 18th- to mid 20th-century British rule of India disconnected South Asians from their centuries-long ties to GCA at the same time that the Soviet Union and Chinese Qing dynasty were conquering parts of the region. Afghanistan became a buffer state between the British Empire and the Soviet Union in what was referred to as the "Great Game". After India's independence in 1947, it was able to build closer ties with Soviet Central Asia as part of its overall close relations with the Soviet Union during the Cold War, in contrast to Pakistan. The Soviet invasion of Afghanistan of the 1980s prompted a greater level of Western interest in the GCA concept, as a way of understanding contemporary events in the context of historical Eurasian geopolitics. By 1991, the Soviet Union had ended and the five modern Central Asian nations became independent. 
Important events in the early 2020s, such as America's chaotic pullout from Afghanistan, along with Russia's full-scale invasion of Ukraine, have reduced Central Asia's chances of creating land routes to the sea for trade, and have created fears in the region of being invaded again. China's involvement in GCA, which includes over $100 billion in investment, is argued to be aimed at protecting its Xinjiang region from neighboring terrorist groups, as well as securing natural resources and curbing the local influence of America and India. India is interested in engaging with GCA, though its difficult relationship with Pakistan and the instability of Afghanistan reduce the potential for such engagement for the time being. India also lacks the direct borders with Central Asia and the economic heft to provide a Belt and Road Initiative-type project to the region that China has, factors that favor China's influence in the region. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Inner_Asia] | [TOKENS: 1312] |
Contents Inner Asia Inner Asia refers to the northern and landlocked regions spanning North, Central, and East Asia. It includes parts of western and northeast China, as well as southern Siberia. The area overlaps with some definitions of "Central Asia", mostly the historical ones, but certain regions that are often included in Inner Asia, such as Manchuria, are not a part of Central Asia by any of its definitions. Inner Asia may be regarded as the western and northern "frontier" of China proper and as being bounded by East Asia proper, which consists of China proper, Japan, and Korea. The extent of Inner Asia has been understood differently in different periods. "Inner Asia" is sometimes contrasted to "China proper", that is, the territories originally unified under the Qin dynasty with majority Han populations. By the year 1800, Chinese Inner Asia consisted of four main areas, namely Manchuria (modern Northeast China and Outer Manchuria), the Mongolian Plateau (Inner Mongolia and Outer Mongolia), Xinjiang (Chinese Turkestan or East Turkestan), and Tibet. Many of these areas had been only recently conquered by the Qing dynasty of China and, during most of the Qing period, they were governed through administrative structures different from those of the older Chinese provinces. A Qing government agency, the Lifan Yuan, supervised the empire's Inner Asian regions, also known as Chinese Tartary. The frontier regions of China proper—Gansu, Qinghai, Sichuan, and Yunnan—are also sometimes included as part of Inner Asia. Definition and usage "Inner Asia" today has a range of definitions and usages. Denis Sinor, for example, used "Inner Asia" in contrast to agricultural civilizations, noting its changing borders, such as when a Roman province was taken by the Huns, areas of North China were occupied by the Mongols, or Anatolia came under Turkish influence, eradicating Hellenistic culture. Scholars or historians of the Qing dynasty, such as those who compiled the New Qing History, often use the term "Inner Asia" when studying Qing interests or reigns outside China proper, although previous Chinese dynasties like the Han, Tang, and Ming dynasties also expanded their realms and influences into Inner Asia. According to Morris Rossabi, Inner Asia is composed not only of the five Central Asian countries, which includes Turkmenistan, Uzbekistan, Tajikistan, Kyrgyzstan, and Kazakhstan, but also includes Afghanistan, Xinjiang, Mongolia, Manchuria, and parts of Iran. The Committee on Inner Asian and Altaic Studies of Harvard University defines Inner Asia as a region consisting of Russian Turkestan, Xinjiang, Eastern Iran, Northern Pakistan, Afghanistan, Tibet, Qinghai, Sichuan, Gansu, and northwestern Yunnan. The Mongolia and Inner Asia Studies Unit at the University of Cambridge defines Inner Asia as "an area centred on Mongolia and extending across the region of the great steppes to the Himalayas", including Kazakhstan, Kyrgyzstan, Tajikistan, Uzbekistan, Xinjiang, Tibet, Qinghai, Gansu, Sichuan, Yunnan, Nepal, Sikkim, Bhutan, Inner Mongolia, Liaoning, Jilin, Heilongjiang, Altai, Tuva, Buryatia, and Chita. In other languages In French, Asie centrale can mean either "Central Asia" or "Inner Asia", while Mongolia and Tibet are grouped as Haute Asie (lit. 'Upper Asia'). The terms meaning "Inner Asia" in the languages of Inner Asia itself are all modern translations of terms in European languages, mostly Russian. 
Related terms "Central Asia" normally denotes the western part of Inner Asia; that is, Kazakhstan, Kyrgyzstan, Tajikistan, Turkmenistan, and Uzbekistan, with Afghanistan sometimes also included as part of Central Asia. However, the Library of Congress' subject classification system treats "Central Asia" and Inner Asia as synonymous. Historian Morris Rossabi posits that "Inner Asia" is the established term for the area in the relevant literature. Historian Denis Sinor believed the term was deficient, particularly as it implies an "Outer Asia" that in fact has no agreed-upon meaning or common usage. As an alternative, Sinor proposed the neologism "Central Eurasia" to emphasize the region's history in transcontinental exchange, e.g., as territories of the Silk Road. According to Sinor: The definition that can be given of Central Eurasia in space is negative. It is that part of the continent of Eurasia that lies beyond the borders of the great sedentary civilizations.... Although the area of Central Eurasia is subject to fluctuations, the general trend is that of diminution. With the territorial growth of the sedentary civilizations, their borderline extends and offers a larger surface on which new layers of barbarians will be deposited. Origin of Inner Asian studies Central Europe is the birthplace of Inner Asian studies in the West. Hungarian explorers and scholars of the early 19th century traveled to Inner Asia in an attempt to uncover their own Magyar prehistory. The linguist Sándor Kőrösi Csoma (1784–1842) was the first among these explorers; he later became a founder of Tibetology. Count Béla Széchenyi led a scientific expedition to Inner Asia in 1877–1880; he later founded the Hungarian journal Turán in 1913. The term "Inner Asian studies" (Hungarian: belső-ázsiai kutatások; German: innerasiatische Studien) first appeared in the masthead of Turán. The periodical's name refers to the historical region in Central Asia known as Turan. In the first two decades of the 20th century, the Hungarian-British archaeologist Aurel Stein made important discoveries over the course of his four expeditions to Inner Asia. In 1928, Stein published Innermost Asia: Detailed Report of Explorations in Central Asia, Kan-su and Eastern Iran, Carried Out and Described under the Orders of H.M. Indian Government in four volumes. In 1940, the first academic chair for Inner Asian studies was established by the Hungarian Orientalist and linguist Lajos Ligeti at the University of Budapest. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/21st_Century_Fox] | [TOKENS: 3287] |
Contents 21st Century Fox Twenty-First Century Fox, Inc. (did business as 21st Century Fox), abbreviated as 21CF, was an American multinational mass media and entertainment conglomerate based in Midtown Manhattan, New York City. It was formed on June 28, 2013, as the legal successor to News Corporation, while the second News Corporation was formed the same day as a spin-off. 21st Century Fox was the legal successor to News Corporation dealing primarily in the film and television industries. It was the United States' fourth-largest media conglomerate by revenue, up until its acquisition by the Walt Disney Company in 2019. The second News Corporation, which is doing business as News Corp, was spun off from the first News Corporation and holds Rupert Murdoch's print interests and other media assets in Australia (both owned by him and his family via a family trust with 39% interest in each). Murdoch was co-executive chairman, while his sons Lachlan Murdoch and James Murdoch were co-executive chairman and CEO, respectively. 21st Century Fox's assets included the Fox Entertainment Group—owners of the 20th Century Fox film studio (the company's partial namesake), the Fox television network, and a 73% stake in National Geographic Partners—the commercial media arm of the National Geographic Society, among other assets. It also had significant foreign operations, including the prominent Indian television channel operator Star India. The company ranked No. 109 in the 2018 Fortune 500 list of the largest United States corporations by total revenue. On December 14, 2017, The Walt Disney Company agreed to acquire 21st Century Fox for $52.4 billion in stock. After Comcast mounted an all-cash bid of $65 billion, Disney increased its offer to $71.3 billion in cash and stock. Comcast dropped its bid on July 19, 2018, to instead acquire Sky plc, a British media group in which 21CF held a 39% stake. On July 27, 2018, Disney's offer was approved by shareholders of both companies. The sale covered the majority of 21CF's entertainment assets, including 20th Century Fox, FX Networks, and National Geographic Partners among others; while the sale also included 21CF's regional Fox Sports Networks, Disney was required to sell them within 90 days of the closure of the acquisition to comply with antitrust rulings. The remaining assets, consisting primarily of the Fox and MyNetworkTV networks, and 21CF's local station, news and national sports assets, were spun out into a new company named Fox Corporation, which began trading on March 19, 2019. Disney's acquisition of 21st Century Fox closed on March 20 of the same year. History 21st Century Fox was formed by the splitting of entertainment and media properties from News Corporation. In February 2012, Natalie Ravitz accepted a position to become Rupert Murdoch's Chief of Staff at News Corporation. News Corporation's board approved the split on May 24, 2013, while shareholders approved the split on June 11, 2013; the company completed the split on June 28 and formally started trading on NASDAQ on July 1. Plans for the split were originally announced on June 28, 2012, while additional details and the working name of the new company were unveiled on December 3, 2012. Murdoch stated that performing this split would "unlock the true value of both companies and their distinct assets, enabling investors to benefit from the separate strategic opportunities resulting from more focused management of each division." 
The move also came in the wake of a series of scandals that had damaged the reputation of the company's publishing operations in the United Kingdom. The split was structured so that the old News Corporation would change its name to 21st Century Fox and spin off its publishing assets into a "new" News Corporation. While the company was originally announced as the Fox Group, on April 16, 2013, Murdoch announced the new name as a way to leverage the already established 20th Century Fox brand name. Its logo was officially unveiled on May 9, 2013, featuring a modernized version of the iconic Fox searchlights designed by Pentagram. However, the 21st Century Fox brand did not extend to the existing 20th Century Fox division (which remained under its original name). The formation of 21st Century Fox was officially finalized on June 28, 2013. It formally began trading on NASDAQ and the Australian Securities Exchange on July 1, 2013. Rupert Murdoch served as chairman and chief executive officer (CEO) of the company, while Chase Carey took the posts of president and chief operating officer; Co-chairman and Co-CEO positions were created in 2014 and later filled by Lachlan Murdoch and James Murdoch, respectively, both sons of Rupert Murdoch. On January 8, 2014, Rupert Murdoch announced plans to delist 21st Century Fox's shares from the Australian Securities Exchange, in favor of solely trading on the NASDAQ. Its listing in Australia was a holdover from its period as News Corporation, and 21st Century Fox had relatively little presence in Australia, unlike News Corp. Murdoch stated that the changes, which were expected to be complete by June 2014, would "simplify the capital and operating structure" of 21st Century Fox and provide "improved liquidity" to shareholders. Also that month, the company acquired a majority ownership in YES Network, a New York regional sports network founded by the New York Yankees. In June 2014, 21st Century Fox made a bid to acquire Time Warner, which had similarly spun off its publishing assets, for $80 billion in a cash and stock deal. The deal, which was rejected by Time Warner's board of directors in July 2014, would have also involved the sale of CNN to ease antitrust issues. On August 5, 2014, 21st Century Fox announced it had withdrawn its bid for Time Warner. The company's stock had fallen sharply since the bid was announced, prompting directors to announce 21st Century Fox would buy back $6 billion of its shares over the following 12 months. On July 25, 2014, 21st Century Fox announced the sale of Sky Italia and Sky Deutschland to BSkyB for $9 billion, subject to regulatory and shareholder approval. Fox would use the money from the sale, along with $25 billion it received from Goldman Sachs, to attempt another bid for Time Warner. In December 2014, Fox-owned television studio Shine Group merged with Apollo Global Management's Endemol and Core Media Group to form Endemol Shine Group, which was jointly owned by 21st Century Fox and Apollo. On July 1, 2015, Lachlan Murdoch was elevated to Co-Executive Chairman alongside his father, and James Murdoch replaced his father as CEO of the company. Former COO Chase Carey became Executive Vice-chairman. 
On September 9, 2015, 21st Century Fox announced a for-profit joint venture with the National Geographic Society, called National Geographic Partners, which took ownership of all of National Geographic's media and consumer businesses, including National Geographic magazine and the National Geographic-branded television channels that were already run as a joint venture with Fox. 21st Century Fox held a 73% stake in the company. On December 9, 2016, 21st Century Fox announced it had made an offer to acquire the 61% share of Sky plc that it did not already own. The company was valued at £18.5 billion. The deal was approved by the European Commission on April 7, 2017, followed by Ireland's Minister for Communications, Climate Action and Environment on June 27. However, the deal became subject to scrutiny and an extended regulatory review in the United Kingdom, over concerns surrounding the plurality of British news media that would be owned by the Murdoch family post-merger (counting Sky News, as well as News Corp's newspapers and recent acquisition of radio station operator Wireless Group), and violations of British news broadcasting regulations connected to Sky's former carriage of Fox News Channel in the country. A bidding war ultimately ensued over the company; in September 2018, Comcast won a regulator-mandated auction with a bid of £17.28 per share. On September 26, 2018, 21st Century Fox subsequently announced its intent to sell all of its shares in Sky plc to Comcast for £12 billion. On October 4, 2018, 21st Century Fox completed the sale of its stake to Comcast, giving the latter a 76.8% controlling stake. The Kingdom Holding Company, owned by Prince Al-Waleed bin Talal, sold its minority stake in 21st Century Fox during the fiscal quarter ending September 2017. It previously held a 6% stake, which had been reduced to around 5% in 2015. The valuation of the shares, or who they were sold to, is unknown; Al-Waleed had been the company's largest single shareholder behind the Murdoch family. The sale was reported after Al-Waleed was arrested in early November 2017 as part of an anti-corruption probe by the Saudi government. On December 14, 2017, after rumors of such a sale that had been circulating since November 6 following a CNBC report, The Walt Disney Company began its acquisition of 21st Century Fox for $52.4 billion after the spin-off of certain businesses, pending regulatory approval. 21st Century Fox president Peter Rice stated that he expected the sale to be completed by mid-2019. Under the terms of the deal, 21st Century Fox spun off an entity that was initially referred to as "New Fox", consisting of the Fox Broadcasting Company, Fox News, Fox Business Network, and the national operations of Fox Sports (such as Fox Sports 1, Fox Sports 2, and Big Ten Network, but excluding its regional sports networks), and Disney acquired the remainder of 21st Century Fox. This included key entertainment assets such as the 20th Century Fox film studio and its subsidiaries; a stake in Hulu; the U.S. pay television subsidiaries FX Networks, Fox Sports Networks and National Geographic Partners; and international operations of Fox Networks Group as well as Star India. The acquisition was primarily intended to bolster two over-the-top content endeavors—ESPN+ and Disney+. Disney would lease the 20th Century Fox backlot in Century City, Los Angeles for seven years. 
The proposed transaction raised antitrust issues, due to concerns that it could have led to a tangible loss in competition in the film and sports broadcasting industries. Several legal experts and industry analysts expressed the opinion that the transaction was likely to receive regulatory approval, but would be scrutinized by regulators. In February 2018, the Wall Street Journal reported that Comcast, the owner of NBCUniversal, was considering a counter-offer. Comcast had initially bid $60 billion, but Fox had rejected the offer due to the possibility of antitrust concerns. On May 5, 2018, it was reported that Comcast was preparing to make an unsolicited, all-cash counteroffer to acquire the 21st Century Fox assets Disney had offered to purchase, contingent on the outcome of the antitrust lawsuit over AT&T's acquisition of Time Warner. Comcast confirmed on May 23, 2018, that it was "considering, and is in advanced stages of preparing, an offer for the businesses that Fox has agreed to sell to Disney." A shareholder vote on the sale was scheduled for special shareholder meetings by Fox and Disney on July 10, 2018, at the New York Hilton Midtown and New Amsterdam Theatre respectively, although Fox warned that it might "postpone or adjourn" the meeting if Comcast were to follow through with its intent to make a counter-offer. It was also reported that Disney was preparing an all-cash offer of its own to counter Comcast's bid. On June 13, 2018, the day after AT&T was given approval to merge with Time Warner, Comcast officially announced a $65 billion all-cash counter-offer to acquire the 21st Century Fox assets Disney had offered to purchase. However, on June 20, 2018, Disney agreed to increase its bid to a $71.3 billion cash-and-stock offer. Fox agreed to the new offer. The proposed purchase was given antitrust approval by the Department of Justice on June 27, 2018, under the condition that Disney divest all of Fox's regional sports networks. The networks could either be divested to third parties, or retained by "New Fox". On July 19, 2018, Comcast announced it was dropping its bid for Fox in order to focus on its bid for Sky. On July 27, 2018, it was reported that Fox and Disney shareholders had "overwhelmingly" approved the proposed purchase. The acquisition was expected to be completed by late January 2019, after remaining regulatory approvals were granted in China and the European Union. In October 2018, it was reported that the new, post-merger organizational structure of "New Fox" would be implemented by January 1, 2019, ahead of the closure of the Disney sale (which was still expected to occur in early March). On November 6, 2018, the European Commission approved the sale, pursuant to the divestment of A&E Networks properties in Europe deemed to overlap with those of Fox. At a shareholders' meeting the following week, it was revealed that the new company would simply be known as "Fox". On November 19, 2018, the deal was approved unconditionally by Chinese regulators. On January 7, 2019, 21st Century Fox filed the registration statement for "New Fox", under the name Fox Corporation, with the U.S. Securities and Exchange Commission. In an SEC filing, Fox stated that it did not intend to bid for its former regional sports networks. 
On February 27, 2019, it was reported by Bloomberg that Disney had also planned to divest the international Fox Sports operations in Brazil and Latin America to secure antitrust clearance in Brazil and Mexico, as they both compete with ESPN International properties in their respective regions. On February 27, 2019, the sale was approved by Brazil's Administrative Council for Economic Defense (CADE), with Disney having agreed to the expected divestiture of Fox Sports Latin America. CADE coordinated with regulators in Mexico and Chile in evaluating the transaction. Mexico approved the sale on March 12, 2019, with similar concessions. Clearance in Brazil and Mexico was reported to be the last major hurdles for the sale. On March 12, 2019, Disney officially announced that the sale would be completed on March 20, 2019. On March 19, 2019, preliminary trading for the new Fox Corporation on the S&P 500 started in preparation for the formal merger that was finalized on the next day. Under the terms of acquisition, Disney would phase out Fox brand usage by 2024. Lachlan Murdoch, James Murdoch, their sister Elisabeth Murdoch, and half-sister Prudence MacLeod, each benefited by approximately $2 billion as a result of the Disney transaction. Final holdings 21st Century Fox primarily consisted of the media and broadcasting properties that were owned by its predecessor, such as the Fox Entertainment Group and Star India. News Corporation's broadcasting properties in Australia, such as Foxtel and Fox Sports Australia, remained a part of the newly renamed News Corp Australia—which was spun off with the current incarnation of News Corp and was not a part of 21st Century Fox. These units were transferred to the Fox Corporation, not Disney. Cable TV channels owned (in whole or part) and operated by 21st Century Fox include: References External links |
======================================== |
[SOURCE: https://www.wired.com/video/watch/study-of-buddhist-monks-finds-meditation-alters-brain-activity] | [TOKENS: 364] |
Study of Buddhist Monks Finds Meditation Alters Brain Activity Released on 02/13/2026 You think meditation simply rest the brain? Think again. A new international study has concluded that meditation is actually a state of heightened cerebral activity that profoundly alters brain dynamics. Researchers from Canada and Italy recruited 12 mocks from a Buddhist monastery outside of Rome and analyze their brain activity using magnetoencephalography, a technology that records the brain's electrical signals. The study focused on two forms of meditation, Samatha, which focuses on steady attention to calm and stabilize the mind. And vipasana, which involves observing sensations, thought, and emotions with equanimity to achieve mental clarity and insight. In the experiment, researchers used a high resolution MEG scanner to record the monk's brain activity as they switched between the two meditation practices. So recordings were then analyzed using advanced signal analysis and machine learning methods to identify patterns in neural complexity and brain dynamics. Results found that although through different dynamic configurations, both meditation styles increase brain signal complexity, meaning that the brain becomes more dynamically engaged rather than relaxed. This means that meditation enhances wellbeing and reduces stress, anxiety, and depression, not by switching the brain off, but actually by actively engaging it. So... Trending video Collectibles Expert Answers Collectibles Questions Olympian Answers Figure Skating Questions Paralympian Answers Paralympics Questions I Escaped Chinese Mafia Crypto Slavery Professor Answers Olympic History Questions © 2026 Condé Nast. All rights reserved. WIRED may earn a portion of sales from products that are purchased through our site as part of our Affiliate Partnerships with retailers. The material on this site may not be reproduced, distributed, transmitted, cached or otherwise used, except with the prior written permission of Condé Nast. Ad Choices |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Bilad_a-Sham] | [TOKENS: 4236] |
Contents Bilad al-Sham Bilad al-Sham (Arabic: بِلَاد الشَّام, romanized: Bilād al-Shām), often referred to as Islamic Syria or simply Syria in English-language sources, was a province of the Rashidun, Umayyad, Abbasid, and Fatimid caliphates. It roughly corresponded with the Byzantine Diocese of the East, conquered by the Muslims in 634–647. Under the Umayyads (661–750), Bilad al-Sham was the metropolitan province of the Caliphate, and different localities throughout the province served as the seats of the Umayyad caliphs and princes. Bilad al-Sham remains the Arabic name for the Levant region. Bilad al-Sham was first organized into the four ajnad (military districts; singular jund) of Dimashq (Damascus), Hims (Homs), al-Urdunn (Jordan), and Filastin (Palestine), between 637 and 640 by Caliph Umar following the Muslim conquest. The jund of Qinnasrin was created out of the northern part of Hims by caliphs Mu'awiya I (r. 661–680) or Yazid I (r. 680–683). The Jazira (Upper Mesopotamia) was made an independent province from the Mesopotamian part of Qinnasrin by Caliph Abd al-Malik in 692. In 786, the ajnad of al-Awasim and al-Thughur were established from the northern frontier region of Qinnasrin by Caliph Harun al-Rashid. As centralized Abbasid rule over Bilad al-Sham collapsed in the 10th century, control over the region was divided among several potentates and the ajnad only represented nominal divisions. The Abbasids and the Egypt-based Fatimid Caliphate continued to officially recognize the province and its ajnad until the Crusader invasions of the coastal regions in 1099. Name The name Bilad al-Sham in Arabic translates as "the left-hand region". It was so named from the perspective of the people of the Hejaz (western Arabia): facing the rising sun, they saw the Syrian region positioned to their left, while to their right was al-Yaman ("the right-hand region"). Geography Bilad al-Sham comprised the area of the region of Syria, spanning the modern countries of Syria, Lebanon, Jordan, and Palestine, as well as the regions of Hatay, Gaziantep, and Diyarbakir in modern Turkey. It was bounded by the Mediterranean Sea in the west and the Syrian Desert in the east toward Iraq. The western, Mediterranean coastal ranges were characterized by rolling hills in Palestine in the south, rising to their highest points in Mount Lebanon in the center before becoming considerably lower in the Jabal Ansariya range in the north. Eastward from the coastal range, the ridges of inland Syria become gradually lower, with the exception of Mount Hermon north of the Golan, and include the ranges of the Anti-Lebanon, Jabal al-Ruwaq, and Jabal Bishri. With the termination of the inland ridges begins the mostly level Syrian steppe. History Following the consolidation of Islamic hegemony over Arabia and its nomadic Arab tribes in the Ridda wars of 632–633, the caliph (leader of the Muslim community) Abu Bakr (r. 632–634) turned the nascent Muslim state's goals toward the conquest of Syria. The conquest unfolded in three main phases, according to the historian Fred Donner. In the first phase, Abu Bakr dispatched four armies from Medina in late 633 led by the commanders Amr ibn al-As, Yazid ibn Abi Sufyan, Shurahbil ibn Hasana, all veterans of the Ridda wars, and Abu Ubayda ibn al-Jarrah, a leading companion of Muhammad. Abu Ubayda may not have been dispatched until 636. 
Each commander was assigned to a different zone, with Amr entrusted with Palestine, Yazid to the Balqa (central Transjordan), Shurahbil to southern Transjordan, and Abu Ubayda to the Ghassanid stomping grounds of the Golan Heights. The Muslim commanders mainly engaged in small-scale skirmishes in the southern Syrian countryside with local garrisons. The goal of the Muslims at the start of the conquest was likely bringing the Arabic-speaking nomadic, semi-nomadic, and settled tribesmen of the southern Syrian desert fringes under their control. The second phase began with the arrival of Khalid ibn al-Walid and his troops in Syria in 634. Under Khalid's supreme command, the Muslim armies besieged and captured the southern Syrian urban centers of Bosra, Damascus, Beisan (Scythopolis), Pella, Gaza, and temporarily, Homs (Emesa) and Baalbek (Heliopolis). Heraclius responded by deploying successive imperial armies against the Muslims. The Byzantines were decisively defeated in the resulting major battles of Ajnadayn in Palestine and Fahl and Yarmouk in Transjordan, all occurring in 634–636. The Muslim battlefield victories effectively ended organized resistance by the Byzantines. In the third phase, beginning about 637, the Muslim armies quickly occupied the northern Syrian countryside, while steadily conquering individual towns throughout the region whose garrisons held out alone following the breakdown of the imperial defense. Among the towns, a number of which held out until 637 or 638, were Aleppo (Beroea) and Qinnasrin (Chalcis) in the north, Hama, Homs and Baalbek (the latter two possibly for the second time), Damascus possibly for the second time, and Jerusalem. Within the next few years, the Mediterranean coastal towns of Beirut, Sidon, Tyre, Caesarea, Antioch, Tripoli and Ascalon were captured by Muslim forces. Umar had appointed Abu Ubayda ibn al-Jarrah commander of the Muslim troops in Syria in c. 636 and supreme governor of the conquered regions. Abu Ubayda died in the plague of Amwas, which devastated the Muslims at their camp near Jerusalem and caused significant loss of life throughout Syria. Umar replaced him with Yazid ibn Abi Sufyan in the southern districts of Syria and Iyad ibn Ghanm in the northern districts. Yazid died from the plague soon after and was replaced by his brother Mu'awiya. Umar's successor, Caliph Uthman (r. 644–656), gradually expanded Mu'awiya's governorship to span all of Syria. As governor, Mu'awiya forged strong ties with the old-established Arab tribes of Syria, which, by dint of their long service under the Byzantines, were more politically experienced than the tribesmen of Arabia, who filled the ranks of the Muslim armies. Among the Syrian tribes, the powerful Banu Kalb and their Quda'a confederacy gained the preeminent position in Mu'awiya's government. He also accommodated Arab newcomers, most prominently the Kinda from South Arabia. The tribes and commanders of Syria backed Mu'awiya in his confrontation with Caliph Ali at the Battle of Siffin in 657, which ended in a stalemate and an agreement to arbitrate their dispute. The arbitration talks collapsed and Mu'awiya's Syrian supporters recognized him as caliph in a ceremony in Jerusalem in 660. Ali was murdered the following year, paving the way for Mu'awiya to gain control of the rest of the Caliphate. Syria became the metropolitan province of the Umayyad Caliphate, which Mu'awiya founded and whose capital was at Damascus. 
Syria's history under Umayyad rule was "essentially the history of the Umayyad dynasty", according to the historians Henri Lammens and Clifford Edmund Bosworth. Mu'awiya had his son Yazid I, the son of a Kalbi woman, recognized as his successor. Yazid I (r. 680–683) was opposed by the people of the Hejaz, whose revolt against him was crushed by Syria's troops at the Battle of al-Harra. The Syrians proceeded to besiege Mecca in 683, but withdrew to Syria after Yazid I died. The Meccan leader of the revolt, Ibn al-Zubayr, was recognized as caliph across much of the Muslim empire, while Yazid I's son and successor, Mu'awiya II, succumbed to the plague. The Kalb and other loyalist tribes elected another Umayyad, Marwan I, as caliph and he moved to secure the dynasty's Syrian heartland. With these tribes' support, he defeated the Qays tribes and other supporters of Ibn al-Zubayr at the Battle of Marj Rahit, north of Damascus, in 684. Under his son and successor, Abd al-Malik (r. 685–705), Syrian troops reconquered the rest of the Caliphate and killed Ibn al-Zubayr in a second siege of Mecca. A standing army composed of the Syrian tribal soldiery was established under this caliph and his sons and successors. Abd al-Malik inaugurated a more Arab–Islamic government in Syria by changing the language of its bureaucracy from Greek to Arabic, switching from Byzantine coinage to a strictly Islamic currency, and building the Dome of the Rock in Jerusalem, which he may have promoted as an additional center of Muslim pilgrimage to Mecca. Abd al-Malik's son and successor, al-Walid I (r. 705–715), ruled with autocratic tendencies and less tolerance for the non-Muslims in Syria and the empire in general, which reached its greatest territorial extent during his reign. He largely demolished the Christian basilica of St. John in Damascus and built in its place the landmark Great Umayyad Mosque. He achieved great popularity among the Syrian Arabs. During his rule and that of his successors, Damascus retained its role as the administrative capital of the empire, but the caliphs increasingly resided in their country estates in the Syrian steppe. After a period of stagnation, the caliph Hisham (r. 724–743) restored the prestige of the Umayyad Caliphate through his administrative reforms, state-building and austerity, though the conquests ground to a halt. His successor, al-Walid II, was assassinated, sparking the Third Muslim Civil War. The next caliph, Yazid III, died after a few months, followed by the weak rule of Ibrahim. Marwan II took control in late 744, crushed his Syrian tribal opponents, and shifted the capital to Harran, outside of Syria, which increased Syrian opposition to his rule. Bilad al-Sham became much less important under the Abbasid Caliphate, which succeeded the Umayyads in 750. The Abbasids moved the capital first to Kufa, and then to Baghdad and Samarra, all of which were in Iraq, which consequently became their most important province. The mainly Arab Syrians were marginalized by Iranian and Turkish forces who rose to power under the Abbasids, a trend which also expressed itself on a cultural level. From 878 until 905, Syria came under the effective control of the Tulunids of Egypt, but direct Abbasid control was re-established soon thereafter. It lasted until the 940s, when the province was partitioned between the Hamdanid Emirate of Aleppo in the north and Ikhshidid-controlled Egypt in the south. 
In the 960s the Byzantine Empire under Nikephoros II Phokas conquered much of northern Syria, and Aleppo became a Byzantine tributary, while the southern provinces passed to the Fatimid Caliphate after its conquest of Egypt in 969. The division of Syria into northern and southern parts would persist, despite political changes, until the Mamluk conquest in the late 13th century.[citation needed] Administrative history The ajnad were an adaptation of the preexisting administrative system of the Diocese of the East (Byzantine Syria) to suit the nascent Muslim state's needs. The Byzantine system, in turn, had been based on that instituted by its Roman predecessor in the aftermath of the First Jewish Revolt in 70 CE and the Bar Kokhba Revolt in 135 CE. To establish closer control over the broadly spread population of Syria following the revolts, the region was subdivided into smaller units centered around an urban center which policed and collected taxes from the surrounding hinterland. By 400 the southern half of Syria was divided between the three Palestines (Palaestina Prima, Palaestina Secunda, and Palaestina Tertia), Phoenice and Arabia. Following the decisive Muslim victory at Yarmouk in 636, and the occupation of most of the Mediterranean coast and northern Syria in the next two years, the Muslims began to militarily and administratively organize the region for their needs. Caliph Umar, who ruled from Medina, visited the Muslim army's principal camp at Jabiya, the former Ghassanid capital, at least once between 637 and 639. From there he personally oversaw the distribution of allowances (ata) and rations (rizq) to the Muslim soldiery, tax collection from the conquered population, and the appointments to military command. There may have been initial Muslim intentions to establish Jabiya as the permanent, central garrison town of Syria along the lines of those later established in the conquered regions of Iraq (Kufa and Basra), Egypt (Fustat), and Ifriqiya (Kairouan). Those garrison cities developed into major urban centers of the Caliphate. During one of his visits, or by 640 at the latest, the central army camp at Jabiya was disbanded by Umar. Instead, as a result of several factors, "a self-supporting, more flexible" military-administrative system was established, according to the historian Alan Walmsley. Unlike Iraq and Egypt where settlement was concentrated along the major rivers of those provinces, Syrian settlement was distributed over an extensive area of mountains, valleys, and plains. The complex geography slowed communications and army movements in the region, necessitating multiple regional centers for efficient administration and defense; according to Walmsley, this was "a principle confirmed by over 500 years of Roman and Byzantine administration". The change of Muslim military objectives following Yarmouk, when focus shifted to the northern Syrian and Mediterranean fronts, also necessitated the establishment of additional army headquarters and garrisons, such as Homs, diminishing Jabiya's centrality. Further reducing troop numbers in Jabiya was the Plague of Amwas in 639, which reduced the garrison there from 24,000 to 4,000. The decrease was likely due to factors in addition to the plague. In late 639 or early 640, a significant number of Muslim troops also left Syria for the conquest of Egypt under Amr's command. Troop numbers in Jabiya could not be restored in the aftermath of the plague and the departure of Muslim troops to other fronts. 
Unlike in Iraq where there were high levels of Arab tribal immigration, similar immigration into Syria was restricted by the Qurayshite elite in a bid to preserve their pre-established interests in the region. Syria had a substantial, long-standing Arab population, both in the tribes who dominated the steppe and formerly served Byzantium and in the urban Arab communities, particularly those of Damascus and Homs. Not long after Yarmouk, the Arab tribes of Syria were incorporated into the nascent Muslim military structure there. The native tribes had a preference for the established urban centers with which they were long familiarized. Muslim settlement in the urban centers was facilitated by the wide availability of property in the cities in the wake of the conquests, as a result of the exodus of pro-Byzantine, Greek-speaking residents or in property transfers to the Muslims secured in capitulation agreements. Muslim settlement in the hinterland, on the other hand, was limited as the Aramaic-speaking peasantry remained in their villages. Umar divided Syria into the four ajnad of Filastin, al-Urdunn, Dimashq, and Hims. The new garrisons were assigned to the urban centers of Lydda, Tiberias, Damascus, and Homs, respectively. In effect, Umar gave his sanction of the existing military situation in Syria, where different army units operated independently on the different fronts. By establishing the ajnad, Umar transformed the military structures into provincial governments concerned with the taxation of the local populations and the distribution of collected money and supplies for the troops. During the caliphate of Umar's successor Uthman (r. 644–656), supplemental garrisons were established in the respective ajnad, especially in the coastal cities. During the reign of Mu'awiya I or Yazid I, Qinnasrin (northern Syria) and the Jazira (Upper Mesopotamia) were separated from Jund Hims and became Jund Qinnasrin. The separation may have been a response to the influx of northern Arab (Qays and Mudar) immigrant tribesmen to Qinnasrin and the Jazira during Mu'awiya's governorship and caliphate. In 692 Caliph Abd al-Malik separated the Jazira from Jund Qinnasrin, and it became the independent province of the Jazira. According to Blankinship, this change of status may have been related to the peace settlement reached with the Qays in 691 after the Qays had rebelled against the Umayyads during the Second Muslim Civil War. According to the historian Hugh N. Kennedy, the separation was done at the request of Muhammad ibn Marwan, Abd al-Malik's brother and his commander responsible for the Jazira. In 786 Caliph Harun al-Rashid established Jund al-Awasim out of the northern part of Jund Qinnasrin. It spanned the frontier zone with the Byzantine Empire, extending from the areas immediately south of Antioch, Aleppo, and Manbij and eastward to the Euphrates. Manbij and later Antioch became the capitals of the new jund. Jund al-Awasim served as the second defensive line behind the actual frontier zone, the Thughur, which encompassed the far northern Syrian towns of Baghras, Bayas, Duluk, Alexandretta, Cyrrhus, Ra'ban and Tizin. The Thughur was subdivided into the Cilician or Syrian al-Thughur al-Sha'miya and the Jaziran or Mesopotamian al-Thughur al-Jaziriya sectors, roughly separated by the Amanus mountains. 
Tarsus and Malatya were the most important towns in the Syrian and the Mesopotamian sectors respectively, though the two districts did not have administrative capitals and were sometimes under the administrative control of Jund al-Awasim. By the 10th century, the terms Thughur and al-Awasim were often used interchangeably in the sources. The governors of the provinces were called wali or amir. As direct Abbasid rule over the Levant faltered and eventually collapsed in the 10th century, different parts of the region were controlled by several different rulerships. The ajnad became nominal divisions with no practical relevance. The administrative system continued to be officially recognized by the Abbasid and Fatimid governments until the Crusader conquests of the western parts of Bilad al-Sham, beginning in 1099. As a geographic expression, "Bilad al-Sham" continued to be used by Arabic-speaking Muslims into the late 19th century, when Suriyya, the Arabic word for "Syria", generally replaced the term in common usage. Leading up to that point, Suriyya had been increasingly used in 19th-century Arabic Christian literature and among Europeans. See also References Bibliography |
======================================== |
[SOURCE: https://www.theverge.com/podcast/881222/fcc-colbert-talarico-brendan-carr-vergecast] | [TOKENS: 2052] |
The speech police came for Colbert On The Vergecast: the FCC's chilling effect, Apple's AI gadgets, and Tesla's robotaxi record. by David Pierce, Editor-at-Large Feb 19, 2026, 3:08 PM UTC David Pierce is editor-at-large and Vergecast co-host with over a decade of experience covering consumer tech. Previously, at Protocol, The Wall Street Journal, and Wired. Generally speaking, arcane and mostly unenforced FCC rules are not the province of late night talk shows. FCC Commissioner Brendan Carr seems intent on changing that, though; not long after causing a ruckus that briefly took Jimmy Kimmel off the air, his vague threats appear to have been enough to convince CBS to tell Stephen Colbert not to air an interview. Which, of course, became a whole thing. On this episode of The Vergecast (which we recorded and published a day early both because of the news and because Nilay has a vacation to go on), David and Nilay open the show with an extra-large installment of Brendan Carr is a Dummy. We talk through the timeline of the Colbert / CBS back-and-forth, once again attempt to explain how the equal time rule actually works, and wonder exactly how far Carr's chilling effect will be allowed to go. After that, we turn to some gadgets. Nilay has always wanted exactly the facial-recognition feature Meta appears to be getting ready to launch for its smart glasses, but we're not sure it should exist. And it appears Meta knows that. Apple is also gearing up for potentially a series of launches in early March, which could include new iPads, new Macs, and more. It almost certainly won't include any of the AI gadgets Apple is reportedly working on, but we talk about those anyway. Finally, in the lightning round, we discuss Tesla's rough self-driving record, Samsung's next phones, an astonishing robovac security problem, and more. Subscribe: Spotify | Apple Podcasts | Overcast | Pocket Casts | More If you want to know more about everything we discuss in this episode, here are some links to get you started, first in FCC news: The first Colbert video: Why CBS Didn't Broadcast Stephen Colbert's Interview With James Talarico Stephen Colbert says CBS banned him from airing this James Talarico interview The second Colbert video: Why Everyone's Talking About Stephen Colbert, CBS, The FCC And James Talarico From Public Knowledge: Equal Time, Unequal Enforcement: The Latest Move to Weaponize the FCC Against Trump Critics And in gadgets: Meta reportedly wants to add face recognition to smart glasses while privacy advocates are distracted Apple's doing something on March 4th Apple is reportedly planning to launch AI-powered glasses, a pendant, and AirPods Meta's new deal with Nvidia buys up millions of AI chips Switch 2 pricing and next PlayStation release could be impacted by memory shortage Pixel 10A hands-on: More like a slightly better Pixel 9A And in the lightning round: Tesla's robotaxis have crashed 14 times in 9 months. Tesla won't use the term 'Autopilot' in California anymore Samsung ad confirms rumors of a useful S26 'privacy display' The DJI Romo robovac had security so poor, this man remotely accessed thousands of them Warner Bros. Discovery gives Paramount one week to present its 'best and final' offer Why are Epstein's emails full of equals signs? 4chan's creator says 'Epstein had nothing to do' with creating infamous far-right board /pol/ WordPress' new AI assistant will let users edit their sites with prompts |
======================================== |
[SOURCE: https://www.ynet.co.il/economy/article/skbkeksd11l] | [TOKENS: 245] |
How much did an apartment cost at the end of 2025? The most expensive and cheapest cities in Israel. Which city was the most expensive to live in during the last quarter of 2025, by how much did apartment prices climb in the major cities compared with the corresponding quarter of 2024, where were declines in housing prices recorded, and where can you still buy a property for less than a million shekels? |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/China_proper] | [TOKENS: 3452] |
Contents China proper China proper, also called Inner China, are terms used primarily in the Western world in reference to the traditional "core" regions of Chinese civilization centered around the Yellow River and Yangtze River valleys. There is no fixed definition for China proper as many administrative, cultural and territorial shifts have occurred throughout history. One definition refers to the original heartland regions of the Chinese civilization, the Central Plain (southern North China Plain around the lower Yellow River valley) as well as the historical Nine Provinces; another to the Eighteen Provinces inside Shanhai Pass[note 1] designated by the Qing regime. In contrast, Outer China is a term that usually includes the peripheral marchland regions such as the Gobi Desert,[note 2] the Tarim Basin, Northeast China, Dzungaria, the Tibetan Plateau and the Yungui Plateau, which were historically autonomous regions with unstable allegiance to the authority of Chinese monarchs. The term was first used by Europeans during the 17th century to distinguish the historical "Han lands" (Chinese: 漢地, i.e. regions long dominated by the majority Han Chinese population) from "frontier" regions of China where Han populations intermix with other indigenous ethnicities (e.g. Turkic peoples such as Uyghurs, Kazakhs and Uzbeks, Mongolic peoples, and Tibeto-Burmese peoples such as Tibetans, Yi and Bai) and newer foreign immigrants (e.g. Slavic colonists such as Russians and Ukrainian Cossacks), sometimes known as "Outer China". There was no direct translation for "China proper" in the Chinese language at the time due to differences in terminology used by the Qing regime to refer to the regions. Etymology According to Harry Harding, the concept can date back to 1827. But as early as 1795, William Winterbotham adopted this concept in his book. When describing the Chinese Empire under the Qing dynasty, Winterbotham divided it into three parts: China proper, Chinese Tartary, and the states tributary to China. He adopted the opinions of Du Halde and Grosier and suspected that the name of "China" came from the Qin dynasty. He then said: "China, properly so called,... comprehends from north to south eighteen degrees; its extent from east to west being somewhat less..." The concept of "China proper" also appeared before this 1795 book. It can be found in The Gentleman's Magazine, published in 1790, and The Monthly Review, published in 1749. In the nineteenth century, the term "China proper" was sometimes used by Chinese officials when they were communicating in foreign languages. For instance, the Qing ambassador to Britain Zeng Jize used it in an English-language article, which he published in 1887. "Dulimbai Gurun" is the Manchu name for China (中國, Zhongguo; "Middle Kingdom"). After conquering the Ming, the Qing identified their state as "China" (Zhongguo), and referred to it as "Dulimbai Gurun" in the Manchu language. 
The Qing emperors equated the lands of the Qing state (including both "China proper" and present day Manchuria, Xinjiang, Mongolia, Tibet and other areas) as "China" in both the Chinese and Manchu languages, defining China as a multiethnic state, rejecting the idea that China only meant Han-populated areas in "China proper", proclaiming that both Han and non-Han peoples were part of "China", using "China" to refer to the Qing in official documents, international treaties, and foreign affairs, and the "Chinese language" (Dulimbai gurun i bithe) referred to Chinese, Manchu, and Mongol languages, and the term "Chinese people" (中國人, Zhongguo ren; Manchu: Dulimbai gurun i niyalma) referred to all Han, Manchu, and Mongol subjects of the Qing. When the Qing conquered Dzungaria in 1759, they proclaimed that the new land was absorbed into "China" (Dulimbai Gurun) in a Manchu language memorial. The Qing expounded on their ideology that they were bringing together the "outer" non-Han peoples like the Manchus, Mongols, Uighurs and Tibetans together with the "inner" Han people, into "one family" united under the Qing state, showing that the diverse subjects of the Qing were all part of one family, the Qing used the phrase "Zhong Wai Yi Jia" (中外一家) or "Nei Wai Yi Jia" (內外一家, "interior and exterior as one family"), to convey this idea of "unification" of the different peoples. A Manchu language version of a treaty with the Russian Empire concerning criminal jurisdiction over bandits called people from the Qing as "people of the Central Kingdom (Dulimbai Gurun)". In the Manchu official Tulisen's Manchu language account of his meeting with the Torghut Mongol leader Ayuki Khan, it was mentioned that while the Torghuts were unlike the Russians, the "people of the Central Kingdom" (dulimba-i gurun; 中國, Zhongguo) were like the Torghut Mongols, and the "people of the Central Kingdom" referred to the Manchus. While the Qing dynasty used "China" (Zhongguo) to describe non-Han areas, some Han scholar-officials opposed the Qing emperor's use of Zhongguo to refer to non-Han areas, using instead Zhongguo to mark a distinction between the culturally Han areas and the territories newly acquired by the Qing empire. In the early 19th century, Wei Yuan's Shengwuji (Military History of the Qing Dynasty) calls the Inner Asian polities guo, while the seventeen provinces of the traditional heartland, that is, "China proper", and three eastern provinces of Manchuria are called "Zhongguo". Some Ming loyalists of Han ethnicity refused to use Zhongguo to refer to areas outside the borders of Ming China, in effect refusing to acknowledge the legitimacy of the Qing dynasty. Han Chinese intellectuals gradually embraced the new meaning of "China" and began to recognize it as their homeland. Political use In the early 20th century, a series of Sino-Japanese conflicts had raised Chinese people's concern for national unity, and the concept of a unified, undivided Chinese nation became more popular among Chinese scholars. On Jan 1, 1939, Gu Jiegang published his article "The term 'China proper' should be abolished immediately", which argued that the widely accepted area covered by "China proper" is not the actual territory of any of the Chinese dynasties. 
Gu further theorized that "中国本部", the Chinese and Japanese term equivalent to "China proper" at the time, actually originated in Japan and was translated into "China proper"; hence, the concept of "China proper" was developed by Japanese people and had become a tool to divide the Chinese people, making way for the Japanese invasion of Mongolia, Manchuria, and other parts of China. Gu's article sparked a heated debate on the definition and origin of "Zhonghua minzu" (Chinese nation), which contributed to unifying the Chinese people in the Second Sino-Japanese War, and to an extent shaped the later established concept of Zhonghua minzu. Modern Today, China proper is a controversial concept in China itself, since the current official paradigm does not contrast the core and the periphery of China. There is no single widely used term corresponding to it in the Chinese language. The separation of China into a "China proper" dominated by Han people and other states for ethnic minorities such as East Turkestan for the Uyghurs impugns the legitimacy of China's current territorial borders, which are based on the succession of states principle. According to sinologist Colin Mackerras, foreign governments have generally accepted Chinese claims over its ethnic minority areas, because to redefine a country's territory every time it underwent a change of regime would cause endless instability and warfare. Also, he asks, "if the boundaries of the Qing were considered illegitimate, why should it go back to the much smaller Ming in preference to the quite extensive Tang dynasty boundaries?" Extent There is no fixed geographical extent for China proper, as it is used to express the contrast between the core and frontier regions of China from multiple perspectives: historical, administrative, cultural, and linguistic. The Great Wall of China is often used as an approximate boundary between Han Chinese-dominated core regions and other frontier regions, which roughly corresponds to the so-called "400 mm (16 in) annual precipitation line" that delineates arid/semi-arid regions largely unsuitable for agricultural activities from those with more rainfall and thus more adaptable to agrarian societies (such as those of the Han Chinese). One way of thinking about China proper is to refer to the long-standing territories held by dynasties of China founded by the Han people. Chinese civilization developed from a core region in the North China Plain, and expanded outwards over several millennia, conquering and assimilating surrounding peoples, or being conquered and influenced in turn. Some dynasties, such as the Han and Tang dynasties, were particularly expansionist, extending far into Inner Asia, while others, such as the Jin and Song dynasties, were forced to relinquish the North China Plain itself to rival regimes founded by peoples from the north. The Ming dynasty was the last orthodox Chinese dynasty of ethnic Han origin and the second-last imperial dynasty of China. It governed fifteen administrative entities, which included thirteen provinces (Chinese: 布政使司; pinyin: Bùzhèngshǐ Sī) and two "directly-governed" areas. After the Manchu-led Qing dynasty succeeded the Ming dynasty in China proper, the Qing court decided to continue to use the Ming administrative system to rule over former Ming lands, without applying it to other domains under Qing rule, namely Manchuria, Mongolia, Xinjiang, Taiwan and Tibet. 
The 15 administrative units of the Ming dynasty underwent minor reforms to become the "Eighteen Provinces" (一十八行省; Yīshíbā Xíngshěng, or 十八省; Shíbā Shěng) of China proper under the Qing dynasty. It was these eighteen provinces that early Western sources referred to as China proper. There are some minor differences between the extent of Ming China and the extent of the eighteen provinces of Qing China: for example, some parts of Manchuria were Ming possessions belonging to the province of Liaodong (now Liaoning), which is inside the Ming Great Wall; however, the Qing conquered it before entering the Central Plain and did not administer as part of a regular province of China proper. On the other hand, Taiwan was a new acquisition of the Qing dynasty, and it was placed under the administration of Fujian, one of the provinces of China proper. Eastern Kham in Greater Tibet was added to Sichuan, while much of what now constitutes northern Burma was added to Yunnan. Near the end of the Qing dynasty, there was an effort to extend the province system of China proper to the rest of the empire. Taiwan was converted into a separate province in 1885, but was ceded to Japan in 1895. Xinjiang was reorganized into a province in 1884. Manchuria was split into the three provinces of Fengtian, Jilin and Heilongjiang in 1907. There was discussion to do the same in Tibet, Qinghai (Kokonor), Inner Mongolia, and Outer Mongolia, but these proposals were not put to practice, and these areas were outside the provincial system of China proper when the Qing dynasty fell in 1912. The Provinces of the Qing Dynasty were: Some of the revolutionaries who sought to overthrow Qing rule desired to establish a state independent of the Qing dynasty within the bounds of the Eighteen Provinces, as evinced by their Eighteen-Star Flag. Others favoured the replacement of the entire Qing dynasty by a new republic, as evinced by their Five-Striped Flag. Some revolutionaries, such as Zou Rong, used the term Zhongguo Benbu (中国本部) which roughly identifies the Eighteen Provinces. When the Qing dynasty fell, the abdication decree of the Xuantong Emperor bequeathed all the territories of the Qing dynasty to the new Republic of China, and the latter idea was therefore adopted by the new republic as the principle of Five Races Under One Union, with Five Races referring to the Han, Manchus, Mongols, Muslims (Uyghurs, Hui etc.) and Tibetans. The Five-Striped Flag was adopted as the national flag, and the Republic of China viewed itself as a single unified state encompassing all five regions handed down by the Qing dynasty. The People's Republic of China, which was founded in 1949 and replaced the Republic of China on the Chinese mainland, has continued to claim essentially the same borders, with the only major exception being the recognition of an independent Mongolia. As a result, the concept of China proper fell out of favour in China. The Eighteen Provinces of the Qing dynasty still largely exist, but their boundaries have changed. Beijing and Tianjin were eventually split from Hebei (renamed from Zhili), Shanghai from Jiangsu, Chongqing from Sichuan, Ningxia autonomous region from Gansu, and Hainan from Guangdong. Guangxi is now an autonomous region. The provinces that the late Qing dynasty set up have also been kept: Xinjiang became an autonomous region under the People's Republic of China, while the three provinces of Manchuria now have somewhat different borders, with Fengtian renamed as Liaoning. 
When the Qing dynasty fell, Republican Chinese control of Qing territories, including those generally considered to be in "China proper", was tenuous, and from 1922 it was non-existent in Tibet and Outer Mongolia (later the Mongolian People's Republic), which were controlled by governments that had declared independence from China. The Republic of China subdivided Inner Mongolia in its time on the mainland, although the People's Republic of China later joined Mongol-inhabited territories into a single autonomous region. The PRC joined the Qamdo area into the Tibet area (later the Tibet Autonomous Region). The Republic of China officially recognized the independence of Mongolia in 1946, and the PRC government has also acknowledged it since its founding in 1949. China proper is often associated with the Han people, the majority ethnic group of China, and with the extent of the Chinese languages, an important unifying element of the Han ethnicity. However, Han regions in the present day do not correspond well to the Eighteen Provinces of the Qing dynasty. Much of southwestern China, such as areas in the provinces of Yunnan, Guangxi, and Guizhou, was part of successive dynasties of ethnic Han origin, including the Ming dynasty and the Eighteen Provinces of the Qing dynasty. However, these areas were and continue to be populated by various non-Han minority groups, such as the Zhuang, the Miao people, and the Bouyei. Conversely, Han people form the majority in most of Manchuria, much of Inner Mongolia, many areas in Xinjiang and scattered parts of Tibet today, not least due to the expansion of Han settlement encouraged by the late Qing dynasty, the Republic of China, and the People's Republic of China. Being ethnically Han is not synonymous with speaking the Chinese language. Many non-Han ethnicities, such as the Hui and Manchu, are essentially monolingual in Chinese but do not identify as ethnic Han. The Chinese language itself is also a complex entity, and should be described as a family of related languages rather than a single language if the criterion of mutual intelligibility is used to classify its subdivisions. In polls, the majority of the people of Taiwan call themselves "Taiwanese" only, with the rest identifying as "Taiwanese and Chinese" or "Chinese" only. Most of the people of Taiwan are descendants of immigrants from mainland China since the 1600s, but the inclusion of Taiwan in the definition of China proper is still a controversial subject. See History of Taiwan and Political status of Taiwan for more information. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/PlayStation_(console)#cite_ref-FOOTNOTEWilliams199720_178-0] | [TOKENS: 10728] |
Contents PlayStation (console) The PlayStation[a] (codenamed PSX, abbreviated as PS, and retroactively PS1 or PS one) is a home video game console developed and marketed by Sony Computer Entertainment. It was released in Japan on 3 December 1994, followed by North America on 9 September 1995, Europe on 29 September 1995, and other regions following thereafter. As a fifth-generation console, the PlayStation primarily competed with the Nintendo 64 and the Sega Saturn. Sony began developing the PlayStation after a failed venture with Nintendo to create a CD-ROM peripheral for the Super Nintendo Entertainment System in the early 1990s. The console was primarily designed by Ken Kutaragi and Sony Computer Entertainment in Japan, while additional development was outsourced in the United Kingdom. An emphasis on 3D polygon graphics was placed at the forefront of the console's design. PlayStation game production was designed to be streamlined and inclusive, enticing the support of many third party developers. The console proved popular for its extensive game library, popular franchises, low retail price, and aggressive youth marketing which advertised it as the preferable console for adolescents and adults. Critically acclaimed games that defined the console include Gran Turismo, Crash Bandicoot, Spyro the Dragon, Tomb Raider, Resident Evil, Metal Gear Solid, Tekken 3, and Final Fantasy VII. Sony ceased production of the PlayStation on 23 March 2006—over eleven years after it had been released, and in the same year the PlayStation 3 debuted. More than 4,000 PlayStation games were released, with cumulative sales of 962 million units. The PlayStation signaled Sony's rise to power in the video game industry. It received acclaim and sold strongly; in less than a decade, it became the first computer entertainment platform to ship over 100 million units. Its use of compact discs heralded the game industry's transition from cartridges. The PlayStation's success led to a line of successors, beginning with the PlayStation 2 in 2000. In the same year, Sony released a smaller and cheaper model, the PS one. History The PlayStation was conceived by Ken Kutaragi, a Sony executive who managed a hardware engineering division and was later dubbed "the Father of the PlayStation". Kutaragi's interest in working with video games stemmed from seeing his daughter play games on Nintendo's Famicom. Kutaragi convinced Nintendo to use his SPC-700 sound processor in the Super Nintendo Entertainment System (SNES) through a demonstration of the processor's capabilities. His willingness to work with Nintendo was derived from both his admiration of the Famicom and conviction in video game consoles becoming the main home-use entertainment systems. Although Kutaragi was nearly fired because he worked with Nintendo without Sony's knowledge, president Norio Ohga recognised the potential in Kutaragi's chip and decided to keep him as a protégé. The inception of the PlayStation dates back to a 1988 joint venture between Nintendo and Sony. Nintendo had produced floppy disk technology to complement cartridges in the form of the Family Computer Disk System, and wanted to continue this complementary storage strategy for the SNES. Since Sony was already contracted to produce the SPC-700 sound processor for the SNES, Nintendo contracted Sony to develop a CD-ROM add-on, tentatively titled the "Play Station" or "SNES-CD". 
The PlayStation name had already been trademarked by Yamaha, but Nobuyuki Idei liked it so much that he agreed to acquire it for an undisclosed sum rather than search for an alternative. Sony was keen to obtain a foothold in the rapidly expanding video game market. Having been the primary manufacturer of the MSX home computer format, Sony had wanted to use their experience in consumer electronics to produce their own video game hardware. Although the initial agreement between Nintendo and Sony was about producing a CD-ROM drive add-on, Sony had also planned to develop a SNES-compatible Sony-branded console. This iteration was intended to be more of a home entertainment system, playing both SNES cartridges and a new CD format named the "Super Disc", which Sony would design. Under the agreement, Sony would retain sole international rights to every Super Disc game, giving them a large degree of control despite Nintendo's leading position in the video game market. Furthermore, Sony would also be the sole benefactor of licensing related to music and film software that it had been aggressively pursuing as a secondary application. The Play Station was to be announced at the 1991 Consumer Electronics Show (CES) in Las Vegas. However, Nintendo president Hiroshi Yamauchi was wary of Sony's increasing leverage at this point and deemed the original 1988 contract unacceptable upon realising it essentially handed Sony control over all games written on the SNES CD-ROM format. Although Nintendo was dominant in the video game market, Sony possessed a superior research and development department. Wanting to protect Nintendo's existing licensing structure, Yamauchi cancelled all plans for the joint Nintendo–Sony SNES CD attachment without telling Sony. He sent Nintendo of America president Minoru Arakawa (his son-in-law) and chairman Howard Lincoln to Amsterdam to form a more favourable contract with Dutch conglomerate Philips, Sony's rival. This contract would give Nintendo total control over their licences on all Philips-produced machines. Kutaragi and Nobuyuki Idei, Sony's director of public relations at the time, learned of Nintendo's actions two days before the CES was due to begin. Kutaragi telephoned numerous contacts, including Philips, to no avail. On the first day of the CES, Sony announced their partnership with Nintendo and their new console, the Play Station. At 9 am on the next day, in what has been called "the greatest ever betrayal" in the industry, Howard Lincoln stepped onto the stage and revealed that Nintendo was now allied with Philips and would abandon their work with Sony. Incensed by Nintendo's renouncement, Ohga and Kutaragi decided that Sony would develop their own console. Nintendo's contract-breaking was met with consternation in the Japanese business community, as they had broken an "unwritten law" of native companies not turning against each other in favour of foreign ones. Sony's American branch considered allying with Sega to produce a CD-ROM-based machine called the Sega Multimedia Entertainment System, but the Sega board of directors in Tokyo vetoed the idea when Sega of America CEO Tom Kalinske presented them the proposal. Kalinske recalled them saying: "That's a stupid idea, Sony doesn't know how to make hardware. They don't know how to make software either. Why would we want to do this?" Sony halted their research, but decided to develop what it had developed with Nintendo and Sega into a console based on the SNES. 
Despite the tumultuous events at the 1991 CES, negotiations between Nintendo and Sony were still ongoing. A deal was proposed: the Play Station would still have a port for SNES games, on the condition that it would still use Kutaragi's audio chip and that Nintendo would own the rights and receive the bulk of the profits. Roughly two hundred prototype machines were created, and some software entered development. Many within Sony were still opposed to their involvement in the video game industry, with some resenting Kutaragi for jeopardising the company. Kutaragi remained adamant that Sony not retreat from the growing industry and that a deal with Nintendo would never work. Knowing that they had to take decisive action, Sony severed all ties with Nintendo on 4 May 1992. To determine the fate of the PlayStation project, Ohga chaired a meeting in June 1992, consisting of Kutaragi and several senior Sony board members. Kutaragi unveiled a proprietary CD-ROM-based system he had been secretly working on which played games with immersive 3D graphics. Kutaragi was confident that his LSI chip could accommodate one million logic gates, which exceeded the capabilities of Sony's semiconductor division at the time. Despite gaining Ohga's enthusiasm, there remained opposition from a majority present at the meeting. Older Sony executives also opposed it, who saw Nintendo and Sega as "toy" manufacturers. The opposers felt the game industry was too culturally offbeat and asserted that Sony should remain a central player in the audiovisual industry, where companies were familiar with one another and could conduct "civili[s]ed" business negotiations. After Kutaragi reminded him of the humiliation he suffered from Nintendo, Ohga retained the project and became one of Kutaragi's most staunch supporters. Ohga shifted Kutaragi and nine of his team from Sony's main headquarters to Sony Music Entertainment Japan (SMEJ), a subsidiary of the main Sony group, so as to retain the project and maintain relationships with Philips for the MMCD development project. The involvement of SMEJ proved crucial to the PlayStation's early development as the process of manufacturing games on CD-ROM format was similar to that used for audio CDs, with which Sony's music division had considerable experience. While at SMEJ, Kutaragi worked with Epic/Sony Records founder Shigeo Maruyama and Akira Sato; both later became vice-presidents of the division that ran the PlayStation business. Sony Computer Entertainment (SCE) was jointly established by Sony and SMEJ to handle the company's ventures into the video game industry. On 27 October 1993, Sony publicly announced that it was entering the game console market with the PlayStation. According to Maruyama, there was uncertainty over whether the console should primarily focus on 2D, sprite-based graphics or 3D polygon graphics. After Sony witnessed the success of Sega's Virtua Fighter (1993) in Japanese arcades, the direction of the PlayStation became "instantly clear" and 3D polygon graphics became the console's primary focus. SCE president Teruhisa Tokunaka expressed gratitude for Sega's timely release of Virtua Fighter as it proved "just at the right time" that making games with 3D imagery was possible. Maruyama claimed that Sony further wanted to emphasise the new console's ability to utilise redbook audio from the CD-ROM format in its games alongside high quality visuals and gameplay. 
Wishing to distance the project from the failed enterprise with Nintendo, Sony initially branded the PlayStation the "PlayStation X" (PSX). Sony formed their European division and North American division, known as Sony Computer Entertainment Europe (SCEE) and Sony Computer Entertainment America (SCEA), in January and May 1995. The divisions planned to market the new console under the alternative branding "PSX" following the negative feedback regarding "PlayStation" in focus group studies. Early advertising prior to the console's launch in North America referenced PSX, but the term was scrapped before launch. The console was not marketed with Sony's name in contrast to Nintendo's consoles. According to Phil Harrison, much of Sony's upper management feared that the Sony brand would be tarnished if associated with the console, which they considered a "toy". Since Sony had no experience in game development, it had to rely on the support of third-party game developers. This was in contrast to Sega and Nintendo, which had versatile and well-equipped in-house software divisions for their arcade games and could easily port successful games to their home consoles. Recent consoles like the Atari Jaguar and 3DO suffered low sales due to a lack of developer support, prompting Sony to redouble their efforts in gaining the endorsement of arcade-savvy developers. A team from Epic Sony visited more than a hundred companies throughout Japan in May 1993 in hopes of attracting game creators with the PlayStation's technological appeal. Sony found that many disliked Nintendo's practices, such as favouring their own games over others. Through a series of negotiations, Sony acquired initial support from Namco, Konami, and Williams Entertainment, as well as 250 other development teams in Japan alone. Namco in particular was interested in developing for PlayStation since Namco rivalled Sega in the arcade market. Attaining these companies secured influential games such as Ridge Racer (1993) and Mortal Kombat 3 (1995), Ridge Racer being one of the most popular arcade games at the time, and it was already confirmed behind closed doors that it would be the PlayStation's first game by December 1993, despite Namco being a longstanding Nintendo developer. Namco's research managing director Shegeichi Nakamura met with Kutaragi in 1993 to discuss the preliminary PlayStation specifications, with Namco subsequently basing the Namco System 11 arcade board on PlayStation hardware and developing Tekken to compete with Virtua Fighter. The System 11 launched in arcades several months before the PlayStation's release, with the arcade release of Tekken in September 1994. Despite securing the support of various Japanese studios, Sony had no developers of their own by the time the PlayStation was in development. This changed in 1993 when Sony acquired the Liverpudlian company Psygnosis (later renamed SCE Liverpool) for US$48 million, securing their first in-house development team. The acquisition meant that Sony could have more launch games ready for the PlayStation's release in Europe and North America. Ian Hetherington, Psygnosis' co-founder, was disappointed after receiving early builds of the PlayStation and recalled that the console "was not fit for purpose" until his team got involved with it. Hetherington frequently clashed with Sony executives over broader ideas; at one point it was suggested that a television with a built-in PlayStation be produced. 
In the months leading up to the PlayStation's launch, Psygnosis had around 500 full-time staff working on games and assisting with software development. The purchase of Psygnosis marked another turning point for the PlayStation as it played a vital role in creating the console's development kits. While Sony had provided MIPS R4000-based Sony NEWS workstations for PlayStation development, Psygnosis employees disliked the thought of developing on these expensive workstations and asked Bristol-based SN Systems to create an alternative PC-based development system. Andy Beveridge and Martin Day, owners of SN Systems, had previously supplied development hardware for other consoles such as the Mega Drive, Atari ST, and the SNES. When Psygnosis arranged an audience for SN Systems with Sony's Japanese executives at the January 1994 CES in Las Vegas, Beveridge and Day presented their prototype of the condensed development kit, which could run on an ordinary personal computer with two extension boards. Impressed, Sony decided to abandon their plans for a workstation-based development system in favour of SN Systems's, thus securing a cheaper and more efficient method for designing software. An order of over 600 systems followed, and SN Systems supplied Sony with additional software such as an assembler, linker, and a debugger. SN Systems produced development kits for future PlayStation systems, including the PlayStation 2 and was bought out by Sony in 2005. Sony strived to make game production as streamlined and inclusive as possible, in contrast to the relatively isolated approach of Sega and Nintendo. Phil Harrison, representative director of SCEE, believed that Sony's emphasis on developer assistance reduced most time-consuming aspects of development. As well as providing programming libraries, SCE headquarters in London, California, and Tokyo housed technical support teams that could work closely with third-party developers if needed. Sony did not favour their own over non-Sony products, unlike Nintendo; Peter Molyneux of Bullfrog Productions admired Sony's open-handed approach to software developers and lauded their decision to use PCs as a development platform, remarking that "[it was] like being released from jail in terms of the freedom you have". Another strategy that helped attract software developers was the PlayStation's use of the CD-ROM format instead of traditional cartridges. Nintendo cartridges were expensive to manufacture, and the company controlled all production, prioritising their own games, while inexpensive compact disc manufacturing occurred at dozens of locations around the world. The PlayStation's architecture and interconnectability with PCs was beneficial to many software developers. The use of the programming language C proved useful, as it safeguarded future compatibility of the machine should developers decide to make further hardware revisions. Despite the inherent flexibility, some developers found themselves restricted due to the console's lack of RAM. While working on beta builds of the PlayStation, Molyneux observed that its MIPS processor was not "quite as bullish" compared to that of a fast PC and said that it took his team two weeks to port their PC code to the PlayStation development kits and another fortnight to achieve a four-fold speed increase. An engineer from Ocean Software, one of Europe's largest game developers at the time, thought that allocating RAM was a challenging aspect given the 3.5 megabyte restriction. 
Kutaragi said that while it would have been easy to double the amount of RAM for the PlayStation, the development team refrained from doing so to keep the retail cost down. Kutaragi saw the biggest challenge in developing the system to be balancing the conflicting goals of high performance, low cost, and being easy to program for, and felt he and his team were successful in this regard. Its technical specifications were finalised in 1993 and its design during 1994. The PlayStation name and its final design were confirmed during a press conference on 10 May 1994, although the price and release dates had not yet been disclosed. Sony released the PlayStation in Japan on 3 December 1994, a week after the release of the Sega Saturn, at a price of ¥39,800. Sales in Japan began with "stunning" success, with long queues in shops. Ohga later recalled that he realised how important the PlayStation had become for Sony when friends and relatives begged for consoles for their children. The PlayStation sold 100,000 units on the first day and two million units within six months, although the Saturn outsold the PlayStation in the first few weeks due to the success of Virtua Fighter. By the end of 1994, 300,000 PlayStation units were sold in Japan compared to 500,000 Saturn units. A grey market emerged for PlayStations shipped from Japan to North America and Europe, with buyers of such consoles paying up to £700. Before the release in North America, Sega and Sony presented their consoles at the first Electronic Entertainment Expo (E3) in Los Angeles on 11 May 1995. At their keynote presentation, Sega of America CEO Tom Kalinske revealed that their Saturn console would be released immediately to select retailers at a price of $399. Next came Sony's turn: Olaf Olafsson, the head of SCEA, summoned Steve Race, the head of development, to the conference stage; Race simply said "$299" and left the stage to a round of applause. The attention to the Sony conference was further bolstered by the surprise appearance of Michael Jackson and the showcase of highly anticipated games, including Wipeout (1995), Ridge Racer and Tekken (1994). In addition, Sony announced that no games would be bundled with the console. Although the Saturn had released early in the United States to gain an advantage over the PlayStation, the surprise launch upset many retailers who were not informed in time, harming sales. Some retailers such as KB Toys responded by dropping the Saturn entirely. The PlayStation went on sale in North America on 9 September 1995. It sold more units within two days than the Saturn had in five months, with almost all of the initial shipment of 100,000 units sold in advance and shops across the country running out of consoles and accessories. One contemporary account recalled: "When September 1995 arrived and Sony's Playstation roared out of the gate, things immediately felt different than [sic] they did with the Saturn launch earlier that year. Sega dropped the Saturn $100 to match the Playstation's $299 debut price, but sales weren't even close—Playstations flew out the door as fast as we could get them in stock." The well-received Ridge Racer contributed to the PlayStation's early success—with some critics considering it superior to Sega's arcade counterpart Daytona USA (1994)—as did Battle Arena Toshinden (1995). There were over 100,000 pre-orders placed and 17 games available on the market by the time of the PlayStation's American launch, in comparison to the Saturn's six launch games.
The PlayStation released in Europe on 29 September 1995 and in Australia on 15 November 1995. By November it had already outsold the Saturn by three to one in the United Kingdom, where Sony had allocated a £20 million marketing budget during the Christmas season compared to Sega's £4 million. Sony found early success in the United Kingdom by securing listings with independent shop owners as well as prominent High Street chains such as Comet and Argos. Within its first year, the PlayStation secured over 20% of the entire American video game market. From September to the end of 1995, sales in the United States amounted to 800,000 units, giving the PlayStation a commanding lead over the other fifth-generation consoles,[b] though the SNES and Mega Drive from the fourth generation still outsold it. Sony reported that the attach rate of sold games and consoles was four to one. To meet increasing demand, Sony chartered jumbo jets and ramped up production in Europe and North America. By early 1996, the PlayStation had grossed $2 billion (equivalent to $4.106 billion in 2025) from worldwide hardware and software sales. By late 1996, sales in Europe totalled 2.2 million units, including 700,000 in the UK. Approximately 400 PlayStation games were in development, compared to around 200 games being developed for the Saturn and 60 for the Nintendo 64. In India, the PlayStation was launched as a test market during 1999–2000 across Sony showrooms, selling 100 units. Sony finally launched the console (PS One model) countrywide on 24 January 2002 at a price of Rs 7,990, with 26 games available from the start. The PlayStation also did well in markets where it was never officially released. In Brazil, for example, the console could not be released officially because the trademark had been registered by another company; the market was initially dominated by the officially distributed Sega Saturn, but as the Sega console withdrew, PlayStation imports and widespread piracy increased. In China, the most popular 32-bit console was the Sega Saturn, but after it left the market the PlayStation's user base grew to around 300,000 by January 2000, even though Sony China had no plans to release it. The PlayStation was backed by a successful marketing campaign, allowing Sony to gain an early foothold in Europe and North America. Initially, PlayStation demographics were skewed towards adults, but the audience broadened after the first price drop. While the Saturn was positioned towards 18- to 34-year-olds, the PlayStation was initially marketed exclusively towards teenagers. Executives from both Sony and Sega reasoned that because younger players typically looked up to older, more experienced players, advertising targeted at teens and adults would draw them in too. Additionally, Sony found that adults reacted best to advertising aimed at teenagers; Lee Clow surmised that people who started to grow into adulthood regressed and became "17 again" when they played video games. The console was marketed with advertising slogans stylised with the controller's button symbols in place of certain letters, such as "LIVE IN YUR WRLD. PLY IN URS" (Live in Your World. Play in Ours.) and "U R NOT E" (with a red "E", so the line reads "you are not ready"). The four geometric shapes were derived from the symbols for the four buttons on the controller. Clow thought that by invoking such provocative statements, gamers would respond to the contrary and say "'Bullshit.
Let me show you how ready I am.'" As the console's appeal grew, Sony's marketing efforts broadened from their earlier focus on mature players to specifically target younger children as well. Shortly after the PlayStation's release in Europe, Sony tasked marketing manager Geoff Glendenning with assessing the desires of a new target audience. Sceptical of Nintendo and Sega's reliance on television campaigns, Glendenning theorised that young adults transitioning from fourth-generation consoles would feel neglected by marketing directed at children and teenagers. Recognising the influence early 1990s underground clubbing and rave culture had on young people, especially in the United Kingdom, Glendenning felt that the culture had become mainstream enough to help cultivate PlayStation's emerging identity. Sony partnered with prominent nightclubs such as Ministry of Sound and with festival promoters to organise dedicated PlayStation areas where demonstrations of select games could be tested. Sheffield-based graphic design studio The Designers Republic was contracted by Sony to produce promotional materials aimed at a fashionable, club-going audience. Psygnosis' Wipeout in particular became associated with nightclub culture as it was widely featured in venues. By 1997, there were 52 nightclubs in the United Kingdom with dedicated PlayStation rooms. Glendenning recalled that he had discreetly used at least £100,000 a year in slush fund money to invest in impromptu marketing. In 1996, Sony expanded their CD production facilities in the United States due to the high demand for PlayStation games, increasing their monthly output from 4 million discs to 6.5 million discs. This was necessary because PlayStation sales were running at twice the rate of Saturn sales, and its lead dramatically increased when both consoles dropped in price to $199 that year. The PlayStation also outsold the Saturn at a similar ratio in Europe during 1996, with 2.2 million consoles sold in the region by the end of the year. Sales figures for PlayStation hardware and software only increased following the launch of the Nintendo 64. Tokunaka speculated that the Nintendo 64 launch had actually helped PlayStation sales by raising public awareness of the gaming market through Nintendo's added marketing efforts. Despite this, the PlayStation took longer to achieve dominance in Japan. Tokunaka said that, even after the PlayStation and Saturn had been on the market for nearly two years, the competition between them was still "very close", and neither console had led in sales for any meaningful length of time. By 1998, Sega, spurred by their declining market share and significant financial losses, launched the Dreamcast as a last-ditch attempt to stay in the industry. Although its launch was successful, the technically superior 128-bit console was unable to subdue Sony's dominance in the industry. Sony still held 60% of the overall video game market share in North America at the end of 1999. Sega's initial confidence in their new console was undermined when Japanese sales were lower than expected, with disgruntled Japanese consumers reportedly returning their Dreamcasts in exchange for PlayStation software. On 2 March 1999, Sony officially revealed details of the PlayStation 2, which Kutaragi announced would feature a graphics processor designed to push more raw polygons than any console in history, effectively rivalling most supercomputers.
The PlayStation continued to sell strongly at the turn of the new millennium: in June 2000, Sony released the PS One, a smaller, redesigned variant which went on to outsell all other consoles in that year, including the PlayStation 2. In 2005, the PlayStation became the first console to ship 100 million units, with the PlayStation 2 later achieving this milestone even faster than its predecessor. The combined successes of both PlayStation consoles led to Sega retiring the Dreamcast in 2001, and abandoning the console business entirely. The PlayStation was eventually discontinued on 23 March 2006—over eleven years after its release, and less than a year before the debut of the PlayStation 3. Hardware The main microprocessor is an R3000 CPU made by LSI Logic operating at a clock rate of 33.8688 MHz and delivering around 30 MIPS. This 32-bit CPU relies heavily on the "cop2" 3D and matrix math coprocessor on the same die to provide the necessary speed to render complex 3D graphics. The role of the separate GPU chip is to draw 2D polygons and apply shading and textures to them: the rasterisation stage of the graphics pipeline. Sony's custom 16-bit sound chip supports ADPCM sources with up to 24 sound channels and offers a sampling rate of up to 44.1 kHz and music sequencing. The console features 2 MB of main RAM, with an additional 1 MB of video RAM. The PlayStation has a maximum colour depth of 16.7 million true colours with 32 levels of transparency and unlimited colour look-up tables. The PlayStation can output composite, S-Video or RGB video signals through its AV Multi connector (with older models also having RCA connectors for composite), displaying resolutions from 256×224 to 640×480 pixels. Different games can use different resolutions. Earlier models also had proprietary parallel and serial ports that could be used to connect accessories or multiple consoles together; these were later removed due to a lack of usage. The PlayStation uses a proprietary video compression unit, MDEC, which is integrated into the CPU and allows for the presentation of full motion video at a higher quality than other consoles of its generation. Unusual for the time, the PlayStation lacks a dedicated 2D graphics processor; 2D elements are instead calculated as polygons by the Geometry Transfer Engine (GTE) so that they can be processed and displayed on screen by the GPU. The GPU can draw 4,000 sprites and up to 180,000 texture-mapped, light-sourced polygons per second, in addition to 360,000 flat-shaded polygons per second. The PlayStation went through a number of variants during its production run. Externally, the most notable change was the gradual reduction in the number of external connectors from the rear of the unit. This started with the original Japanese launch units; the SCPH-1000, released on 3 December 1994, was the only model that had an S-Video port, as it was removed from the next model. Subsequent models saw a reduction in the number of parallel ports, with the final version retaining only one serial port. Sony marketed a development kit for amateur developers known as the Net Yaroze (meaning "Let's do it together" in Japanese). It was launched in June 1996 in Japan, and following public interest, was released the next year in other countries. The Net Yaroze allowed hobbyists to create their own games and upload them via an online forum run by Sony. The console was available only through an ordering service and came with the documentation and software needed to program PlayStation games and applications in C.
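To make the hardware division of labour described above more concrete, the following minimal C sketch shows the kind of fixed-point perspective projection a GTE-style geometry coprocessor accelerated, after which the GPU's only remaining job is to rasterise 2D primitives. Everything here is illustrative and assumed: the 4.12 fixed-point format, the 320×240 framebuffer, and all names belong to this sketch, not to Sony's SDK.

```c
#include <stdint.h>
#include <stdio.h>

/* Illustrative 4.12 fixed-point format (1.0 == 4096), similar in spirit to
 * the fixed-point arithmetic the PlayStation's geometry coprocessor worked
 * with. Names and layout are invented for this sketch, not Sony's SDK. */
typedef int32_t fix12;
#define FIX_ONE 4096

typedef struct { fix12 x, y, z; } Vec3;

/* Perspective-project a camera-space vertex onto a 320x240 screen.
 * h is the projection distance ("focal length") expressed in pixels. */
static void project(Vec3 v, int h, int *sx, int *sy)
{
    *sx = (int)((int64_t)v.x * h / v.z) + 160;  /* centre horizontally */
    *sy = (int)((int64_t)v.y * h / v.z) + 120;  /* centre vertically   */
}

int main(void)
{
    Vec3 v = { 2 * FIX_ONE, 1 * FIX_ONE, 8 * FIX_ONE };  /* point at (2, 1, 8) */
    int sx, sy;
    project(v, 256, &sx, &sy);
    printf("screen position: (%d, %d)\n", sx, sy);       /* prints (224, 152) */
    return 0;
}
```

In a real title the transformed coordinates would then be packaged into drawing commands for the GPU, which works purely in screen space; the sketch only illustrates that all 3D arithmetic happens before rasterisation.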
On 7 July 2000, Sony released the PS One (stylised as "PS one" or "PSone"), a smaller, redesigned version of the original PlayStation. It was the highest-selling console through the end of the year, outselling all other consoles—including the PlayStation 2. In 2002, Sony released a 5-inch (130 mm) LCD screen add-on for the PS One, referred to as the "Combo pack". It also included a car cigarette lighter adaptor, adding an extra layer of portability. Production of the LCD "Combo Pack" ceased in 2004, when the popularity of the PlayStation began to wane in markets outside Japan. A total of 28.15 million PS One units had been sold by the time it was discontinued in March 2006. Three iterations of the PlayStation's controller were released over the console's lifespan. The first controller, the PlayStation controller, was released alongside the PlayStation in December 1994. It features four individual directional buttons (as opposed to a conventional D-pad), a pair of shoulder buttons on both sides, Start and Select buttons in the centre, and four face buttons consisting of simple geometric shapes: a green triangle, red circle, blue cross, and a pink square (△, ○, ✕, □). Rather than labelling its buttons with the letters or numbers traditionally used, the PlayStation controller established a set of symbols that became a trademark incorporated heavily into the PlayStation brand. Teiyu Goto, the designer of the original PlayStation controller, said that the circle and cross represent "yes" and "no", respectively (though this mapping is reversed in Western versions); the triangle symbolises a point of view and the square is equated to a sheet of paper to be used to access menus. The European and North American models of the original PlayStation controller are roughly 10% larger than the Japanese variant, to account for the fact that the average person in those regions has larger hands than the average Japanese person. Sony's first analogue controller, the PlayStation Analog Joystick (often erroneously referred to as the "Sony Flightstick"), was first released in Japan in April 1996. Featuring two parallel joysticks, it uses potentiometer technology previously used on consoles such as the Vectrex; instead of relying on binary eight-way switches, the controller detects minute angular changes through the entire range of motion. The right joystick also features a thumb-operated digital hat switch, corresponding to the traditional D-pad and used when simple digital movements are necessary. The Analog Joystick sold poorly in Japan due to its high cost and cumbersome size. The increasing popularity of 3D games prompted Sony to add analogue sticks to its controller design to give users more freedom over their movements in virtual 3D environments. The first official analogue controller, the Dual Analog Controller, was revealed to the public in a small glass booth at the 1996 PlayStation Expo in Japan, and released in April 1997 to coincide with the Japanese releases of analogue-capable games Tobal 2 and Bushido Blade. In addition to the two analogue sticks (which also introduced two new buttons mapped to clicking in the analogue sticks), the Dual Analog controller features an "Analog" button and LED beneath the "Start" and "Select" buttons which toggles analogue functionality on or off. The controller also features rumble support, though Sony decided that haptic feedback would be removed from all overseas iterations before the United States release.
A Sony spokesman stated that the feature was removed for "manufacturing reasons", although rumours circulated that Nintendo had attempted to legally block the release of the controller outside Japan due to similarities with the Nintendo 64 controller's Rumble Pak. However, a Nintendo spokesman denied that Nintendo took legal action. Next Generation's Chris Charla theorised that Sony dropped vibration feedback to keep the price of the controller down. In November 1997, Sony introduced the DualShock controller. Its name derives from its use of two (dual) vibration motors (shock). Unlike its predecessor, the DualShock features analogue sticks with textured rubber grips, longer handles, slightly different shoulder buttons, and rumble feedback included as standard on all versions. The DualShock later replaced its predecessors as the default controller. Sony released a series of peripherals to add extra layers of functionality to the PlayStation. Such peripherals include memory cards, the PlayStation Mouse, the PlayStation Link Cable, the Multiplayer Adapter (a four-player multitap), the Memory Drive (a disk drive for 3.5-inch floppy disks), the GunCon (a light gun), and the Glasstron (a monoscopic head-mounted display). Released exclusively in Japan, the PocketStation is a memory card peripheral which acts as a miniature personal digital assistant. The device features a monochrome liquid crystal display (LCD), infrared communication capability, a real-time clock, built-in flash memory, and sound capability. Sharing similarities with the Dreamcast's VMU peripheral, the PocketStation was typically distributed with certain PlayStation games, enhancing them with added features. The PocketStation proved popular in Japan, selling over five million units. Sony planned to release the peripheral outside Japan but the release was cancelled, despite it having received promotion in Europe and North America. In addition to playing games, most PlayStation models are equipped to play CD-Audio. The Asian model SCPH-5903 can also play Video CDs. Like most CD players, the PlayStation can play songs in a programmed order, shuffle the playback order of the disc and repeat one song or the entire disc. Later PlayStation models include a music visualisation function called SoundScope. This function, as well as a memory card manager, is accessed by starting the console either without inserting a game or without closing the CD tray, thereby accessing a graphical user interface (GUI) for the PlayStation BIOS. The GUIs for the PlayStation and PS One differ depending on the firmware version: the original PlayStation GUI had a dark blue background with rainbow graffiti used as buttons, while the early PAL PlayStation and PS One GUI had a grey blocked background with two icons in the middle. PlayStation emulation is versatile and can be run on numerous modern devices. Bleem! was a commercial emulator which was released for IBM-compatible PCs and the Dreamcast in 1999. It was notable for being aggressively marketed during the PlayStation's lifetime, and was the centre of multiple controversial lawsuits filed by Sony. Bleem! was programmed in assembly language, which allowed it to emulate PlayStation games with improved visual fidelity, enhanced resolutions, and filtered textures that were not possible on original hardware. Sony sued Bleem! two days after its release, citing copyright infringement and accusing the company of engaging in unfair competition and patent infringement by allowing use of PlayStation BIOSes on a Sega console. Bleem!
was subsequently forced to shut down in November 2001. Sony was aware that using CDs for game distribution could have left games vulnerable to piracy, due to the growing popularity of CD-R discs and optical disc drives with burning capability. To preclude illegal copying, a proprietary process for PlayStation disc manufacturing was developed that, in conjunction with an augmented optical drive in Tiger H/E assembly, prevented burned copies of games from booting on an unmodified console. Specifically, all genuine PlayStation discs were printed with a small section of deliberate irregular data, which the PlayStation's optical pick-up was capable of detecting and decoding. Consoles would not boot game discs without a specific wobble frequency contained in the data of the disc pregap sector (the same system was also used to encode discs' regional lockouts). This signal was within Red Book CD tolerances, so PlayStation discs' actual content could still be read by a conventional disc drive; however, the disc drive could not detect the wobble frequency (and duplicated discs therefore omitted it), since the laser pick-up system of any optical disc drive would interpret this wobble as an oscillation of the disc surface and compensate for it in the reading process. A schematic sketch of this boot-time check is given below. Early PlayStations, particularly early 1000 models, experience skipping full-motion video or physical "ticking" noises from the unit. The problems stem from poorly placed vents leading to overheating in some environments, causing the plastic mouldings inside the console to warp slightly and create knock-on effects with the laser assembly. The solution is to sit the console on a surface which dissipates heat efficiently in a well-ventilated area, or to raise the unit slightly from its resting surface. Sony representatives also recommended unplugging the PlayStation when it is not in use, as the system draws a small amount of power (and therefore heat) even when turned off. The first batch of PlayStations use a KSM-440AAM laser unit, whose case and movable parts are all built out of plastic. Over time, the plastic lens sled rail wears out—usually unevenly—due to friction. The placement of the laser unit close to the power supply accelerates wear, due to the additional heat, which makes the plastic more vulnerable to friction. Eventually, one side of the lens sled will become so worn that the laser can tilt, no longer pointing directly at the CD; after this, games will no longer load due to data read errors. Sony fixed the problem by making the sled out of die-cast metal and placing the laser unit further away from the power supply on later PlayStation models. Due to an engineering oversight, the PlayStation does not produce a proper signal on several older models of televisions, causing the display to flicker or bounce around the screen. Sony decided not to change the console design, since only a small percentage of PlayStation owners used such televisions, and instead gave consumers the option of sending their PlayStation unit to a Sony service centre to have an official modchip installed, allowing play on older televisions. Game library The PlayStation featured a diverse game library which grew to appeal to all types of players. Critically acclaimed PlayStation games included Final Fantasy VII (1997), Crash Bandicoot (1996), Spyro the Dragon (1998), and Metal Gear Solid (1998), all of which became established franchises.
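The boot-time check referred to in the copy-protection discussion above can be pictured with the following toy C model. It is purely illustrative, not actual PlayStation firmware: the assumption is that the drive mechanism decodes a short region string from the wobble-modulated pregap, that a burned copy yields nothing because ordinary burners cannot reproduce the wobble, and that the commonly documented region strings are SCEI (Japan), SCEA (North America) and SCEE (Europe).

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdio.h>
#include <string.h>

/* Toy model of the wobble-based boot check described above.
 * Not actual firmware; names and structure are invented for illustration. */

static const char *kRegionCodes[] = { "SCEI", "SCEA", "SCEE" };

/* wobble_string: whatever the pick-up decoded from the pregap wobble,
 * or NULL/empty if no valid wobble signal was found (e.g. a CD-R copy).
 * console_region: the single region string this console accepts. */
static bool disc_may_boot(const char *wobble_string, const char *console_region)
{
    if (wobble_string == NULL || wobble_string[0] == '\0')
        return false;                                    /* copy: no wobble data */
    for (size_t i = 0; i < sizeof kRegionCodes / sizeof kRegionCodes[0]; i++)
        if (strcmp(wobble_string, kRegionCodes[i]) == 0)
            return strcmp(wobble_string, console_region) == 0;  /* region lock */
    return false;                                        /* unrecognised code */
}

int main(void)
{
    printf("%d\n", disc_may_boot("SCEA", "SCEA"));  /* 1: genuine disc, matching region */
    printf("%d\n", disc_may_boot("SCEE", "SCEA"));  /* 0: genuine disc, wrong region    */
    printf("%d\n", disc_may_boot(NULL,   "SCEA"));  /* 0: burned copy, no wobble string */
    return 0;
}
```

The point of the sketch is that a single mechanism serves both as copy protection and as regional lockout, matching the description above.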
Final Fantasy VII is credited with allowing role-playing games to gain mass-market appeal outside Japan, and is considered one of the most influential and greatest video games ever made. The PlayStation's bestselling game is Gran Turismo (1997), which sold 10.85 million units. After the PlayStation's discontinuation in 2006, the cumulative software shipment was 962 million units. Following its 1994 launch in Japan, early games included Ridge Racer, Crime Crackers, King's Field, Motor Toon Grand Prix, Toh Shin Den (i.e. Battle Arena Toshinden), and Kileak: The Blood. The first two games available at its later North American launch were Jumping Flash! (1995) and Ridge Racer, with Jumping Flash! heralded as an ancestor of 3D graphics in console gaming. Wipeout, Air Combat, Twisted Metal, Warhawk and Destruction Derby were among the popular first-year games, and the first to be reissued as part of Sony's Greatest Hits or Platinum range. At the time of the PlayStation's first Christmas season, Psygnosis had produced around 70% of its launch catalogue; their breakthrough racing game Wipeout was acclaimed for its techno soundtrack and helped raise awareness of Britain's underground music community. Eidos Interactive's action-adventure game Tomb Raider contributed substantially to the success of the console in 1996, with its main protagonist Lara Croft becoming an early gaming icon and garnering unprecedented media promotion. Licensed tie-in video games of popular films were also prevalent; Argonaut Games' 2001 adaptation of Harry Potter and the Philosopher's Stone went on to sell over eight million copies late in the console's lifespan. Third-party developers committed largely to the console's wide-ranging game catalogue even after the launch of the PlayStation 2; some of the notable exclusives in this era include Harry Potter and the Philosopher's Stone, Fear Effect 2: Retro Helix, Syphon Filter 3, C-12: Final Resistance, Dance Dance Revolution Konamix and Digimon World 3.[c] Sony assisted with game reprints as late as 2008 with Metal Gear Solid: The Essential Collection, which was the last PlayStation game officially released and licensed by Sony. Initially, in the United States, PlayStation games were packaged in long cardboard boxes, similar to non-Japanese 3DO and Saturn games. Sony later switched to the jewel case format typically used for audio CDs and Japanese video games, as this format took up less retailer shelf space (which was at a premium due to the large number of PlayStation games being released), and focus testing showed that most consumers preferred this format. Reception The PlayStation was mostly well received upon release. Critics in the West generally welcomed the new console; the staff of Next Generation reviewed the PlayStation a few weeks after its North American launch, where they commented that, while the CPU is "fairly average", the supplementary custom hardware, such as the GPU and sound processor, is stunningly powerful. They praised the PlayStation's focus on 3D, and complimented the comfort of its controller and the convenience of its memory cards. Giving the system 4½ out of 5 stars, they concluded, "To succeed in this extremely cut-throat market, you need a combination of great hardware, great games, and great marketing. Whether by skill, luck, or just deep pockets, Sony has scored three out of three in the first salvo of this war." Albert Kim from Entertainment Weekly praised the PlayStation as a technological marvel rivalling those of Sega and Nintendo.
Famicom Tsūshin scored the console a 19 out of 40, lower than the Saturn's 24 out of 40, in May 1995. In a 1997 year-end review, a team of five Electronic Gaming Monthly editors gave the PlayStation scores of 9.5, 8.5, 9.0, 9.0, and 9.5—for all five editors, the highest score they gave to any of the five consoles reviewed in the issue. They lauded the breadth and quality of the games library, saying it had vastly improved over previous years due to developers mastering the system's capabilities in addition to Sony revising their stance on 2D and role-playing games. They also complimented the low price point of the games compared to the Nintendo 64's, and noted that it was the only console on the market that could be relied upon to deliver a solid stream of games for the coming year, primarily due to third-party developers almost unanimously favouring it over its competitors. Legacy SCE was an upstart in the video game industry in late 1994, as the video game market in the early 1990s was dominated by Nintendo and Sega. Nintendo had been the clear leader in the industry since the introduction of the Nintendo Entertainment System in 1985 and the Nintendo 64 was initially expected to maintain this position. The PlayStation's target audience included the generation which was the first to grow up with mainstream video games, along with 18- to 29-year-olds who were not the primary focus of Nintendo. By the late 1990s, Sony became a highly regarded console brand due to the PlayStation, with a significant lead over second-place Nintendo, while Sega was relegated to a distant third. The PlayStation became the first "computer entertainment platform" to ship over 100 million units worldwide, with many critics attributing the console's success to third-party developers. It remains the sixth best-selling console of all time as of 2025, with a total of 102.49 million units sold. Around 7,900 individual games were published for the console during its 11-year life span, the second-most games ever produced for a console. Its success resulted in a significant financial boon for Sony, as profits from their video game division grew to contribute 23% of the company's total profits. Sony's next-generation PlayStation 2, which is backward compatible with the PlayStation's DualShock controller and games, was announced in 1999 and launched in 2000. The PlayStation's lead in installed base and developer support paved the way for the success of its successor, which overcame the earlier launch of Sega's Dreamcast and then fended off competition from Microsoft's newcomer Xbox and Nintendo's GameCube. The PlayStation 2's immense success and the failure of the Dreamcast were among the main factors which led to Sega abandoning the console market. To date, five PlayStation home consoles have been released, which have continued the same numbering scheme, as well as two portable systems. The PlayStation 3 also maintained backward compatibility with original PlayStation discs. Hundreds of PlayStation games have been digitally re-released on the PlayStation Portable, PlayStation 3, PlayStation Vita, PlayStation 4, and PlayStation 5. The PlayStation has often ranked among the best video game consoles. In 2018, Retro Gamer named it the third best console, crediting its sophisticated 3D capabilities as a key factor in its mass success, and lauding it as a "game-changer in every sense possible".
In 2009, IGN ranked the PlayStation the seventh best console in their list, noting its appeal to older audiences as a crucial factor in propelling the video game industry, as well as its role in transitioning the game industry to the CD-ROM format. Keith Stuart from The Guardian likewise named it as the seventh best console in 2020, declaring that its success was so profound it "ruled the 1990s". In January 2025, Lorentio Brodesco announced the nsOne project, an attempt to reverse engineer the PlayStation's motherboard. Brodesco stated that "detailed documentation on the original motherboard was either incomplete or entirely unavailable". The project was successfully crowdfunded via Kickstarter. In June, Brodesco manufactured the first working motherboard, promising to bring a fully routed version with multilayer routing as well as documentation and design files in the near future. The success of the PlayStation contributed to the demise of cartridge-based home consoles. While not the first system to use an optical disc format, it was the first highly successful one, and ended up going head-to-head with the proprietary, cartridge-based Nintendo 64,[d] which the industry had expected to use CDs like the PlayStation. After the demise of the Sega Saturn, Nintendo was left as Sony's main competitor in Western markets. Nintendo chose not to use CDs for the Nintendo 64; they were likely concerned with the proprietary cartridge format's ability to help enforce copy protection, given their substantial reliance on licensing and exclusive games for their revenue. Besides their larger capacity, CD-ROMs could be produced in bulk quantities at a much faster rate than ROM cartridges, a week compared to two to three months. Further, the cost of production per unit was far cheaper, allowing Sony to offer games at about 40% lower cost to the user than ROM cartridges while still making the same amount of net revenue. In Japan, Sony published fewer copies of a wide variety of games for the PlayStation as a risk-limiting step, a model that had been used by Sony Music for CD audio discs. The production flexibility of CD-ROMs meant that Sony could produce larger volumes of popular games to get onto the market quickly, something that could not be done with cartridges due to their manufacturing lead time. The lower production costs of CD-ROMs also allowed publishers an additional source of profit: budget-priced reissues of games which had already recouped their development costs. Tokunaka remarked in 1996: "Choosing CD-ROM is one of the most important decisions that we made. As I'm sure you understand, PlayStation could just as easily have worked with masked ROM [cartridges]. The 3D engine and everything—the whole PlayStation format—is independent of the media. But for various reasons (including the economies for the consumer, the ease of the manufacturing, inventory control for the trade, and also the software publishers) we deduced that CD-ROM would be the best media for PlayStation." The increasing complexity of developing games pushed cartridges to their storage limits and gradually discouraged some third-party developers. Part of the CD format's appeal to publishers was that discs could be produced at a significantly lower cost and offered more production flexibility to meet demand.
As a result, some third-party developers switched to the PlayStation, including Square and Enix, whose Final Fantasy VII and Dragon Quest VII respectively had been planned for the Nintendo 64 (both companies later merged to form Square Enix). Other developers released fewer games for the Nintendo 64; Konami, for example, released only thirteen N64 games but over fifty for the PlayStation. Nintendo 64 game releases were less frequent than the PlayStation's, with many being developed by either Nintendo themselves or second parties such as Rare. The PlayStation Classic is a dedicated video game console made by Sony Interactive Entertainment that emulates PlayStation games. It was announced in September 2018 at the Tokyo Game Show, and released on 3 December 2018, the 24th anniversary of the release of the original console. As a dedicated console, the PlayStation Classic features 20 pre-installed games; the games run on the open-source emulator PCSX. The console is bundled with two replica wired PlayStation controllers (those without analogue sticks), an HDMI cable, and a USB Type-A cable. Internally, the console uses a MediaTek MT8167a Quad A35 system on a chip with four central processing cores clocked at 1.5 GHz and a PowerVR GE8300 graphics processing unit. It includes 16 GB of eMMC flash storage and 1 GB of DDR3 SDRAM. The PlayStation Classic is 45% smaller than the original console. The PlayStation Classic received negative reviews from critics and was compared unfavourably to Nintendo's rival Nintendo Entertainment System Classic Edition and Super Nintendo Entertainment System Classic Edition. Criticism was directed at its meagre game library, user interface, emulation quality, use of PAL versions for certain games, use of the original controller, and high retail price, though the console's design received praise. The console sold poorly. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Exilarch] | [TOKENS: 8774] |
Contents Exilarch The exilarch[a] was the leader of the Jewish community in Mesopotamia (modern-day Iraq) during the Parthian and Sasanian Empires and Abbasid Caliphate up until the 1258 CE Mongol invasion of Baghdad, with intermittent gaps due to ongoing political developments. The exilarch was regarded by the Jewish community as the royal heir of the Davidic line and held prominence as both a rabbinical authority and a noble within the Persian and Arab courts. Within the Sasanian Empire, the exilarch was the political equivalent of the Catholicos of the Christian Church of the East and was thus responsible for community-specific organizational tasks such as running the rabbinical courts, collecting taxes from Jewish communities, supervising and providing financing for the Talmudic academies in Babylonia, and the charitable re-distribution and financial assistance to needy members of the exile community. The position of exilarch was hereditary, held in continuity by a family that traced its patrilineal descent from antiquity stemming from King David. The first historical documents referring to it date from the time when Babylonia was part of the late Parthian Empire. The office first appears in the 2nd century and continues through the middle of the 6th century under different Persian dynasties (the Parthians and Sassanids). In the late 5th and early 6th centuries, Mar-Zutra II briefly ruled a politically independent state from Mahoza for about 7 years. He was eventually defeated by Kavadh I, King of Persia, and the office of the exilarch was diminished for some time thereafter. The position was restored to prominence in the 7th century, under the rule of the Arab Caliphate, and the office of exilarch continued to be appointed by Arab authorities through the 11th century. The exilarch's authority came under considerable challenge in 825 during the reign of al-Ma'mun who issued a decree permitting a group of ten men from any religious community to organize separately, which allowed the Gaon of the Talmudic academies of Sura and Pumbedita to compete with the exilarch for power and influence, later contributing to the wider schism between Karaites and Rabbinic Jewry. Title The word exilarch is a Greco-Latin calque of the Hebrew Rosh HaGola (ראש הגולה), literally meaning 'head of the exile'. The position was similarly called in Aramaic (Imperial Aramaic: ריש גלותא, romanized: Reysh Galuta or Resh Galvata) and Arabic (Arabic: رأس الجالوت, romanized: Raas al-Galut). It was translated into Persian as سر جالوت. The Jewish people in exile were referred to as golah (as in, e.g., Jeremiah 28:6 and Jeremiah 29:1) or galut. The contemporary Greek term that was used was Aechmalotarches (Αἰχμαλωτάρχης), literally meaning the 'leader of the captives'. The Greek term has continued to be applied to the office, notwithstanding changes to the position over time, which were largely titular. Development and organization Although there is no mention of the office before the 2nd century, the Seder Olam Zutta alleges that the office of exilarch was established following the deportation of King Jeconiah and his court into exile in Babylon after the first fall of Jerusalem in 597 BCE and augmented after the further deportations following the destruction of the kingdom of Judah in 587. The history of the Babylonian exilarchate falls into two separate identifiable periods: before and after the beginning of the Arabic rule of Babylonia. 
Nothing is known about the office before the 2nd century, when it is first referenced in the Talmud, including any details about its origins. It can merely be said in general that the golah ("diaspora"), the Jews living in compact masses in various parts of Babylon, tended gradually to unite and create an organization, and that this tendency, together with the high regard in which the descendants of the house of David living in Babylon were held, brought it about that a member of this house was recognized as "head of the golah." The dignity became hereditary in this house and was finally recognized by the state, and hence became an established political institution, first of the Arsacid Empire and then the Sassanid. Such was the exilarchate as it appears in Talmudic literature, the chief source for its history during the first period, and which provides our only information regarding the rights and functions of the exilarchate. For the second, Arabic period, there is a very important and trustworthy description of the institution of the exilarchate (See the sections Installation ceremonies and Income and privileges). This description is also important for the first period as many of the details may be regarded as having persisted from it. In Baghdad, the privilege of using seals was limited to the exilarch and geonim. Serving under the authority of the caliph, they were extremely powerful as the highest authority for the Jewish people in the Caliphate. The use of seals was not limited to internal matters; their authority was recognized by Muslims as well. Based on the account of Benjamin of Tudela: "At the head of them all [the Jews under the Baghdad caliphate] is Daniel the son of Hisdai, who is styled 'Our Lord the Head of the Captivity of all Israel.' ... he has been invested with authority over all the congregations of Israel at the hands of the Emir al Muminim, the Lord of Islam." Holders of the office The following are exilarchs mentioned in the Seder Olam Zutta; most are likely legendary figures and have parallels in the text of 1 Chronicles 3: Probably-historical exilarchs listed in the Seder Olam Zutta or otherwise noted in the Talmud: The following is a list of Karaite exilarchs beginning in the 8th century, after the end of the tenure of the exilarch David I: History The Seder Olam Zuta states that the first exilarch was Jehoiachin, the king of Judah who was carried off to captivity in Babylonia in 597 BCE, wherein he established his residence at the city of Nehardea in Babylonia. This chronicle, written c. 800, presents a legendary origin to the early history of the house of the Babylonian exilarch. The captive king's advancement at Evil-Merodach's court—with which the narrative of the Second Book of Kings closes (2 Kings 25:27)—was regarded by the author of the Seder 'Olam Zuta as the origin of the office, and the basis for the exilarch's authority. A list of generations of the descendants of the king is given in the text, which closely parallels the names found in I Chronicles 3:17 et seq. A commentary to the Chronicles dating from the school of Saadia Gaon quotes Judah ibn Kuraish to the effect that the genealogical list of the descendants of David was added to the book at the end of the period of the Second Temple, a view which was shared by the author of the list of Babylonian exilarchs in Seder 'Olam Zuta. This list attempts to bridge the 700-year gap between Jehoiachin and the first exilarch mentioned in written sources, Nahum. 
It grants some specific hallmarks chronologically connecting personalities with the history of the Second Temple, such as Shechaniah, who is mentioned as having lived at the time of the Temple's destruction. The following are enumerated as his predecessors in office: Salathiel, Zerubbabel, Meshullam, Hananiah, Berechiah, Hasadiah, Jesaiah, Obadiah, and Shemaiah, Shecaniah, and Hezekiah. All of these names are also found in I Chron. 3., albeit in a confabulated order. This list cannot be historical given the limited number of generations presented. The name Akkub is also found at the end of the Davidic list in the Seder Olam Zuta, which is followed by Nahum, with whom the historic portion of the list begins, and who may be roughly assigned to the time of the destruction of Jerusalem (135). This is the period in which the first allusions in rabbinical literature are found to the office of the exilarch. In the account referring to the attempt of a teacher of the Law from the land of Israel, Hananiah, nephew of Joshua ben Hananiah, to render the Babylonian Jews independent of the Sanhedrin, the religious and political authority residing in the land of Judea, a man named 'Ahijah' is mentioned as the temporal head of the Babylonian Jews, possibly, one of the first historic exilarchs. Another rabbinical source substitutes the name Nehunyon for Ahijah. It is likely that this 'Nehunyon' is identical with the Nahum mentioned in the list. The political danger threatening the Sanhedrin eventually passed. At about this same time, Rabbi Nathan, a member of the house of exilarch, came to Galilee, where the Sanhedrin met and where the Nasi resided following the Jewish expulsion from Jerusalem. By virtue of his rabbinical scholarship, he was soon classed among the foremost tannaim of the post-Hadrianic epoch. His supposed Davidic genealogical origins suggested to Rabbi Meïr the plan of making the Babylonian scholar nasi (prince) in place of the Hillelite Simon ben Gamaliel. However, the conspiracy against the reigning Nasi failed. Rabbi Nathan was subsequently among the confidants of the Hillelite patriarchal house and the teacher of Simon ben Gamaliel's son, Judah I (also known as Judah haNasi). Rabbi Meïr's attempt, however, seems to have led Judah I to fear that the Babylonian exilarch might come to Judea to claim the office from Hillel the Elder's descendant. He discussed the subject with the Babylonian scholar Hiyya, a prominent member of his school, saying that he would pay due honor to the exilarch should the latter come but that he would not renounce the office of nasi in his favor. When the body of the exilarch Huna, who was the first incumbent of that office explicitly mentioned as such in Talmudic literature, was brought to Judea during the time of Judah I, Hiyya drew upon himself Judah's deep resentment by announcing the fact to him with the words "Huna is here". A tannaitic exposition of Genesis 49:10 which contrasts the Babylonian exilarchs, ruling by force, with Hillel's descendants, teaching in public, evidently intends to cast a negative reflection on the former. However, Judah I had to listen at his own table to the statement of the youthful sons of the Hiyya above about the same tannaitic exposition, that "the Messiah can not appear until the exilarchate at Babylon and the patriarchate at Jerusalem shall have ceased". According to the Seder Olam Zutta Nahum was followed by his brother Johanan, both of whom are called sons of Akkub in the text. 
Johanan's son Shaphat is listed next, who was succeeded by Anan, his son. Given the chronological similarities, the identification of the exilarch Anan with the Huna of the Talmud account is very likely. At the time of Anan's successor Nathan Ukban I, according to the Seder Olam Zuta, occurred the fall of the Parthian Empire and the founding of the Sassanid dynasty in CE 226, which is noted as follows in Seder Olam Zutta: "In the year 166 after the destruction of the Temple (c. CE 234) the Persian Empire advanced upon the Romans" (on the historical value of this statement. Nathan 'Ukban, also known as Mar 'Ukban, was the contemporary of Rav and Samuel, who also occupied a prominent position among the scholars of Babylon' and, according to Sherira Gaon, was also exilarch. As 'Ukban's successor is mentioned in the list his son (Huna II), whose chief advisers were Rav (died 247) and Samuel (died 254), and in whose time Papa ben Nazor destroyed Nehardea. Huna's son and successor, Nathan, whose chief advisers were Judah ben Ezekiel (died 299) and Shesheth, was called, like his grandfather, "Mar 'Ukban", and it is he, the second exilarch of this name, whose curious correspondence with Eleazar ben Pedat is referred to in the Talmud. He was succeeded by his brother (not his son, as stated in Seder Olam Zutta); his leading adviser was Shezbi. The "exilarch Nehemiah" is also mentioned in the Talmud; he is the same person as "Rabbanu Nehemiah," and he and his brother "Rabbeinu 'Ukban" (Mar Ukban II) are several times mentioned in the Talmud as sons of Rav's daughter (hence Huna II was Rav's son-in-law) and members of the house of the exilarchs. According to Seder Olam Zutta, in Nehemiah's time, the 245th year after the destruction of the Temple (313 CE), there took place a great religious persecution by the Persians, of which, however, no details are known. Nehemiah was succeeded by his son Mar 'Ukban III, whose chief advisers were Rabbah ben Nahmani (died 323) and Adda. He is mentioned as "'Ukban ben Nehemiah, resh galuta," in the Talmud. This Mar 'Ukban, the third exilarch of that name, was also called "Nathan," as were the first two, and has been made the hero of a legend under the name of "Nathan de-Ẓuẓita". The conquest of Armenia (337) by Shapur (Sapor) II is mentioned in the chronicle as a historical event occurring during the time of Nathan Ukban III. He was succeeded by his brother Huna Mar (Huna III), whose chief advisers were Abaye (died 338) and Raba; then followed Mar Ukban's son Abba, whose chief advisers were Raba (died 352) and Rabina. During Abba's time King Sapor conquered Nisibis. The designation of a certain Isaac as resh galuta in the time of Abaye and Raba is due to a clerical error [Brüll's Jahrbuch, vii. 115], and is therefore omitted from lists. Abba was succeeded first by his son Nathan and then by another son, Kahana I. The latter's son Huna is then mentioned as successor, being the fourth exilarch of that name; he died in 441, according to a trustworthy source, the "Seder Tannaim wa-Amoraim." Hence he was a contemporary of Rav Ashi, the great master of Sura, who died in 427. In the Talmud, however, Huna ben Nathan is mentioned as Ashi's contemporary, and according to Sherira it was he who was Mar Kahana's successor, a statement which is also confirmed by the Talmud. The statement of Seder Olam Zutta ought perhaps to be emended, since Huna was probably not the son of Mar Kahana, but the son of the latter's elder brother Nathan. 
Huna was succeeded by his brother Mar Zutra, whose chief adviser was Ahai of Diphti, the same who was defeated in 455 by Ashi's son Tabyomi (Mar) at the election for director of the school of Sura. Mar Zutra was succeeded by his son Kahana (Kahana II), whose chief adviser was Rabina, the editor of the Babylonian Talmud (died 499). Then followed two exilarchs by the same name: another son of Mar Zutra, Huna V, and a grandson of Mar Zutra, Huna VI, the son of Kahana. Huna V fell a victim to the persecutions under King Peroz (Firuz) of Persia, being executed, according to Sherira, in 470; Huna VI was not installed in office until some time later, the exilarchate being vacant during the persecutions under Peroz; he died in 508 [Sherira]. The Seder Olam Zutta connects with the birth of his son Mar Zutra the legend that is elsewhere told in connection with Bostanai's birth. Mar Zutra II, who came into office at the age of fifteen, took advantage of the confusion into which Mazdak's communistic attempts had plunged Persia to obtain by force of arms, for a short time, a sort of political independence for the Jews of Babylon. King Kobad, however, punished him by crucifying him on the bridge of Mahuza (c. 502). A son was born to him on the day of his death, who was also named "Mar Zutra." The latter did not attain to the office of exilarch, but went to the land of Israel, where he became head of the Academy of Tiberias under the title of "Resh Pirka" (Ἀρχιφερεκίτης), several generations of his descendants succeeding him in this office. After Mar Zutra's death the exilarchate of Babylon remained unoccupied for some time. Mar Ahunai lived in the period succeeding Mar Zutra II, but for almost fifty years after the catastrophe he did not dare to appear in public, and it is not known whether even then (c. 550) he really acted as exilarch. At any rate the chain of succession of those who inherited the office was not broken. The names of Kafnai and his son Haninai, who were exilarchs in the second half of the 6th century, have been preserved. Haninai's posthumous son Bostanai was the first of the exilarchs under Arabic rule. Bostanai was the ancestor of the exilarchs who were in office from the time when the Persian empire was conquered by the Arabs, in 642, down to the 11th century. Through him, the splendor of the office was renewed and its political position made secure. His tomb in Pumbedita was a place of worship as late as the 12th century, according to Benjamin of Tudela. Not much is known regarding Bostanai's successors down to the time of Saadia except their names; even the name of Bostanai's son is not known. The list of the exilarchs down to the end of the 9th century is given as follows in an old document: "Bostanai, Hanina ben Adoi, Hasdai I, Solomon, Isaac Iskawi I, Judah Zakkai (Babawai), Moses, Isaac Iskawi II, David ben Judah, Hasdai II." Hasdai I was probably Bostanai's grandson. The latter's son Solomon had a deciding voice in the appointments to the gaonate of Sura in the years 733 and 759 [Sherira]. Isaac Iskawi I died very soon after Solomon. In the dispute between David's sons Anan and Hananiah regarding the succession, the latter was the victor; Anan then proclaimed himself anti-exilarch, was imprisoned, and founded the sect of the Karaites. So says the Jewish Encyclopedia of 1906; the origin of the Karaites is not uncontroversial. His descendants were regarded by the Karaites as the true exilarchs.
The following list of Karaite exilarchs, father being succeeded always by son, is given in the genealogy of one of these "Karaite princes": Anan, Saul, Josiah, Boaz, Jehoshaphat, David, Solomon, Hezekiah, Hasdai, Solomon II. Anan's brother Hananiah is not mentioned in this list. Judah Zakkai, who is called "Zakkai ben Ahunai" by Sherira, had as rival candidate Natronai ben Habibai, who, however, was defeated and sent West in banishment; this Natronai was a great scholar, and, according to tradition, while in Spain wrote the Talmud from memory. David ben Judah also had to contend with an anti-exilarch, Daniel by name. The fact that the decision in this dispute rested with the calif Al-Ma'mun (825) indicates a decline in the power of the exilarchate. David ben Judah, who carried off the victory, appointed Isaac ben Hiyya as Gaon at Pumbedita in 833. Preceding Hasdai II's name in the list that of his father Natronai must be inserted. Both are designated as exilarchs in a geonic responsum. Ukban IV is mentioned as exilarch immediately following the death of Hasdai II; he was deposed at the instigation of Kohen-Zedek, Gaon of Pumbedita, but was reinstated in 918 on account of some Arabic verses with which he greeted the caliph al-Muqtadir. He was deposed again soon afterwards, and fled to Kairwan, where he was treated with great honor by the Jewish community there. 'Ukba's nephew, David II, became exilarch; but he had to contend for nearly two years with Kohen-Zedek before he was finally confirmed in his power (921). In consequence of Saadia's call to the gaonate of Sura and his controversy with David, the latter has become one of the best-known personages of Jewish history. Saadia had David's brother Josiah (Al-Hasan) elected anti-exilarch in 930, but the latter was defeated and banished to Chorasan. David ben Zakkai was the last exilarch to play an important part in history. He died a few years before Saadia; his son Judah died seven months afterward. Judah left a son (whose name is not mentioned) twelve years of age, whom Saadia took into his house and educated. His generous treatment of the grandson of his former adversary was continued until Saadia's death in 942. When Gaon Hai died in 1038, nearly a century after Saadia's death, the members of his academy could not find a more worthy successor than the exilarch Hezekiah, a great-grandson of David ben Zakkai, who thereafter filled both offices. But two years later, in 1040, Hezekiah, who was the last exilarch and also the last Gaon, fell a victim to calumny by a peer. He was imprisoned and tortured to death. Two of his sons fled to Spain, where they found refuge with Joseph, the son and successor of Samuel ha-Nagid. Alternatively, Jewish Quarterly Review mentions that Hezekiah was liberated from prison, and became head of the academy, and is mentioned as such by a contemporary in 1046. The title of exilarch is found occasionally even after the Babylonian exilarchate had ceased. Abraham ibn Ezra speaks of the "Davidic house" at Baghdad (before 1140), calling its members the "heads of the Exile." Benjamin of Tudela in 1170 mentions the Exilarch Hasdai, among whose pupils was the subsequent pseudo-Messiah David Alroy, and Hasdai's son, the Exilarch Daniel. Pethahiah of Regensburg also refers to the latter, but under the name of "Daniel ben Solomon"; hence it must be assumed that Hasdai was also called "Solomon". Yehuda Alharizi (after 1216) met at Mosul a descendant of the house of David, whom he calls "David, the head of the Exile." 
A long time previously, a descendant of the ancient house of exilarchs had attempted to revive in Fatimid Egypt the dignity of exilarch, which had become extinct in Babylon. This was David ben Daniel; he came to Egypt at the age of twenty, in 1081, and was proclaimed exilarch by the learned Jewish authorities of that country, who wished to divert to Egypt the leadership formerly enjoyed by Babylon. A contemporary document, the Megillah of the gaon Abiathar from the land of Israel, gives an authentic account of this episode of the Egyptian exilarchate, which ended with the downfall of David ben Daniel in 1094. Descendants of the house of exilarchs were living in various places long after the office became extinct. A descendant of Hezekiah, Hiyya al-Daudi, Gaon of Andalucia, died in 1154 in Castile, according to Abraham ibn Daud. Several families, as late as the 14th century, traced their descent back to Josiah, the brother of David ben Zakkai who had been banished to Chorasan. The descendants of the Karaite exilarchs have been referred to above. Character of the exilarchate before Arab expansion In accordance with the character of Talmudic tradition, it is the relation of the exilarchs to the heads and members of the schools that is especially referred to in Talmudic literature. The Seder 'Olam Zuta, the chronicle of the exilarchs that is the most important and in many cases the only source of information concerning their succession, has also preserved chiefly the names of those scholars who had certain official relations with the respective exilarchs. The phrase used in this connection ("hakamim debaruhu", "the scholars directed him") is the stereotyped phrase used also in connection with the fictitious exilarchs of the century of the Second Temple; in the latter case, however, it occurs without the specific mention of names—a fact in favor of the historicalness of those names that are given for the succeeding centuries. The authenticity of the names of the amoraim designated as the scholars "guiding" the several exilarchs is, in the case of those passages in which the text is beyond dispute, supported by internal chronological evidence also. Some of the Babylonian amoraim were closely related to the house of the exilarchs, as, for example, Rabba ben Abuha, whom Gaon Sherira, claiming Davidian descent, named as his ancestor. Nahman ben Jacob (died 320) also became closely connected with the house of the exilarchs through his marriage with Rabba ben Abuha's daughter, the proud Yaltha, and he owed to this connection perhaps his office of chief judge of the Babylonian Jews. Huna, the head of the school of Sura, recognized Nahman ben Jacob's superior knowledge of the Law by saying that Nahman was very close to the "gate of the exilarch" ("baba di resh galuta"), where many cases were decided. The term "dayyanei di baba" ("judges of the gate"), which was applied in the post-Talmudic time to the members of the court of the exilarch, is derived from the phrase just quoted. Two details of Nahman ben Jacob's life cast light on his position at the court of the exilarch: he received the two scholars Rav Chisda and Rabba b. Huna, who had come to pay their respects to the exilarch; and when the exilarch was building a new house he asked Nahman to take charge of the placing of the mezuzah according to the Law. The scholars who formed part of the retinue of the exilarch were called "scholars of the house of the exilarch" ("rabbanan di-be resh galuta").
A remark of Samuel, the head of the school of Nehardea, shows that they wore certain badges on their garments to indicate their position. Once a woman came to Nahman ben Jacob, complaining that the exilarch and the scholars of his court sat at the festival in a stolen booth, the material for it having been taken from her. There are many anecdotes of the annoyances and indignities the scholars had to suffer at the hands of the exilarchs' servants, such as the case of Amram the Pious, of Hiyya of Parwa, and of Abba ben Marta. The modification of ritual requirements granted to the exilarchs and their households in certain concrete cases is characteristic of their relation to the religious law. Once when certain preparations which the exilarch was making in his park for alleviating the strictness of the Sabbath law were interrupted by Raba and his pupils, he exclaimed, in the words of Jeremiah 4:22, "They are wise to do evil, but to do good they have no knowledge". There are frequent references to questions, partly halakhic and exegetical in nature, which the exilarch laid before his scholars. Details are sometimes given of lectures that were delivered "at the entrance to the house of the exilarch" These lectures were probably delivered at the time of the assemblies, which brought many representatives of Babylonian Judaism to the court of the exilarch after the autumnal festivals. The luxurious banquets at the court of the exilarch were well known. An old anecdote was repeated in the land of Israel concerning a splendid feast which the exilarch once gave to the tanna Judah ben Bathyra at Nisibis on the eve of Tisha Beav. though in the more exact S. Buber's edition, the feast was given by the chief of the synagogue. Another story told in the land of Israel relates that an exilarch had music in his house morning and evening, and that Mar 'Ukba, who subsequently became exilarch, sent him as a warning this verse from Hosea: "Rejoice not, O Israel, for joy, as other people." The exilarch Nehemiah is said to have dressed entirely in silk. The Talmud says almost nothing in regard to the personal relations of the exilarchs to the royal court. One passage relates merely that Huna ben Nathan appeared before Yazdegerd I, who with his own hands girded him with the belt which was the sign of the exilarch's office. There are also two allusions dating from an earlier time, one by Hiyya, a Babylonian living in the land of Israel, and the other by Adda ben Ahaba, one of Rav's earlier pupils, from which it seems that the exilarch occupied a foremost position among the high dignitaries of the state when he appeared at the court first of the Arsacids, then of the Sassanids. An Arabic writer of the 9th century records the fact that the exilarch presented a gift of 4,000 dirhems on the Persian feast of Nauruz. Regarding the functions of the exilarch as the chief tax-collector for the Jewish population, there is the curious statement, preserved only in the Jerusalem Talmud, that once, in the time of Huna, the head of the school of Sura, the exilarch was commanded to furnish as much grain as would fill a room of 40 square ells. The most important function of the exilarch was the appointment of the judge. Both Rav and Samuel said that the judge who did not wish to be held personally responsible in case of an error of judgment, would have to accept his appointment from the house of the exilarch. When Rav went from the land of Israel to Nehardea he was appointed overseer of the market by the exilarch. 
The exilarch had jurisdiction in criminal cases also. Aha b. Jacob, a contemporary of Rav, was commissioned by the exilarch to take charge of a murder case. The story found in Bava Kamma 59a is an interesting example of the police jurisdiction exercised by the followers of the exilarch in the time of Samuel. From the same time dates a curious dispute regarding the etiquette of precedence among the scholars greeting the exilarch. The exilarch had certain privileges regarding real property. It is a specially noteworthy fact that in certain cases the exilarch judged according to the Persian law; and it was the exilarch 'Ukba b. Nehemiah who communicated to the head of the school of Pumbedita, Rabbah ben Nahmai, three Persian statutes which Samuel recognized as binding. A synagogal prerogative of the exilarch was mentioned in the land of Israel as a curiosity: The Torah roll was carried to the exilarch, while every one else had to go to the Torah to read from it. This prerogative is referred to also in the account of the installation of the exilarch in the Arabic period, and this gives color to the assumption that the ceremonies, as recounted in this document, were based in part on usages taken over from the Persian time. The account of the installation of the exilarch is supplemented by further details in regard to the exilarchate which are of great historical value; see the following section. Character of the exilarchate in the Arabic era Upon their conquest of Iraq, the Caliphate confirmed the authority of exilarch on Bustanai son of Haninai, and the continuation of his governance over the Jewish community. For his political services to the Arab authorities during the Islamic conquests, he was given the daughter of the former Sassanid Emperor as a slave. Muslim authorities regarded the office of exilarch with profound respect as they viewed its incumbent as a direct descendant of the ancient prophet David. The subsequent fragmentation of the authority of the Abbasids resulted in the waning of the authority of the exilarch beyond the former Abbasid realm. Additionally, the struggle for leadership between the Geonim of the rabbinical academies and exilarchs saw the slow diminishment of centralized power. Rabbinical decentralization favored the Geonim, but remained an office of reverence to which Muslim authorities showed respect. The following is a translation of a portion of an account of the exilarchy in the Arabic period, written by Nathan ha-Babli in the early 10th century, and included in Abraham Zacuto's "Yuhasin" and in Neubauer's "Mediaeval Jewish Chronicles,": The members of the two academies [Sura and Pumbedita], led by the two heads [the geonim] as well as by the leaders of the community, assemble in the house of an especially prominent man before the Sabbath on which the installation of the exilarch is to take place. The first homage is paid on Thursday in the synagogue, the event being announced by trumpets, and every one sends presents to the exilarch according to his means. The leaders of the community and the wealthy send handsome garments, jewelry, and gold and silver vessels. On Thursday and Friday the exilarch gives great banquets. On the morning of the Sabbath the nobles of the community call for him and accompany him to the synagogue. Here a wooden platform covered entirely with costly cloth has been erected, under which a picked choir of sweet-voiced youths well versed in the liturgy has been placed. 
This choir responds to the leader in prayer, who begins the service with 'Baruk she-amar.' After the morning prayer the exilarch, who until now has been standing in a covered place, appears; the whole congregation rises and remains standing until he has taken his place on the platform, and the two geonim, the one from Sura preceding, have taken seats to his right and left, each making an obeisance. A costly canopy has been erected over the seat of the exilarch. Then the leader in prayer steps in front of the platform and, in a low voice audible only to those close by, and accompanied by the 'Amen' of the choir, addresses the exilarch with a benediction, prepared long beforehand. Then the exilarch delivers a sermon on the text of the week or commissions the gaon of Sura to do so. After the discourse the leader in prayer recites the kaddish, and when he reaches the words 'during your life and in your days,' he adds the words 'and during the life of our prince, the exilarch.' After the kaddish he blesses the exilarch, the two heads of the schools, and the several provinces that contribute to the support of the academies, as well as the individuals who have been of especial service in this direction. Then the Torah is read. When the 'Kohen' and 'Levi' have finished reading, the leader in prayer carries the Torah roll to the exilarch, the whole congregation rising; the exilarch takes the roll in his hands and reads from it while standing. The two heads of the schools also rise, and the gaon of Sura recites the targum to the passage read by the exilarch. When the reading of the Torah is completed, a blessing is pronounced upon the exilarch. After the 'Musaf' prayer the exilarch leaves the synagogue, and all, singing, accompany him to his house. After that the exilarch rarely goes beyond the gate of his house, where services for the community are held on the Sabbaths and feastdays. When it becomes necessary for him to leave his house, he does so only in a carriage of state, accompanied by a large retinue. If the exilarch desires to pay his respects to the king, he first asks permission to do so. As he enters the palace the king's servants hasten to meet him, among whom he liberally distributes gold coin, for which provision has been made beforehand. When led before the king his seat is assigned to him. The king then asks what he desires. He begins with carefully prepared words of praise and blessing, reminds the king of the customs of his fathers, gains the favor of the king with appropriate words, and receives written consent to his demands; thereupon, rejoiced, he takes leave of the king." In regard to Nathan ha-Babli's additional account as to the income and the functions of the exilarch (which refers, however, only to the time of the narrator), it may be noted that he received taxes, amounting altogether to 700 gold denarii a year, chiefly from the provinces Nahrawan, Farsistan, and Holwan. The Muslim author of the 9th century, Al-Jahiz, who has been referred to above, makes special mention of the shofar, the wind-instrument which was used when the exilarch (ras al-jalut) excommunicated any one. The punishment of excommunication is the only ecclesiastical power the exilarch of the Jews and the Catholicos of the Christians may pronounce, for they are deprived of the right of inflicting punishment by imprisonment or flogging. 
Another Muslim author reports a conversation that took place in the 8th century between a follower of Islam and the exilarch, in which the latter boasted; "Seventy generations have passed between me and King David, yet the Jews still recognize the prerogatives of my royal descent, and regard it as their duty to protect me; but you have slain the grandson Husain of your prophet after one single generation". The son of a previous exilarch said to yet another Muslim author: "I formerly never rode by Karbala, the place where Husain was martyred, without spurring on my horse, for an old tradition said that on this spot the descendant of a prophet would be killed; only since Husain has been slain there and the prophecy has thus been fulfilled do I pass leisurely by the place". This last story indicates that the exilarch had by the Arab period become the subject of Muslim legend. That the person of the exilarch was familiar to Muslim circles is also shown by the fact that the Rabbinite Jews were called Jaluti, that is, those belonging to the exilarch, in contradistinction to the Karaites. In the first quarter of the 11th century, not long before the extinction of the exilarchate, Ibn Hazm made the following remark in regard to the dignity: "The ras al-jalut has no power whatever over the Jews or over other persons; he has merely a title, to which is attached neither authority nor prerogatives of any kind". To this day, the exilarchs are still mentioned in the Sabbath services of the Ashkenazi ritual. The Aramaic prayer "Yekum Purkan", which was used once in Babylon in pronouncing the blessing upon the leaders there, including the "reshe galwata" (the exilarchs), is still recited in most synagogues. The Jews of the Sephardic ritual have not preserved this anachronism, nor was it retained in most of the Reform synagogues. See also Notes References This article incorporates text from a publication now in the public domain: Singer, Isidore; et al., eds. (1901–1906). "Exilarch". The Jewish Encyclopedia. New York: Funk & Wagnalls. External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Sombrero_Galaxy] | [TOKENS: 2084] |
Contents Sombrero Galaxy The Sombrero Galaxy (also known as Messier Object 104, M104 or NGC 4594) is a peculiar galaxy of unclear classification in the constellation borders of Virgo and Corvus, being about 9.55 megaparsecs (31.1 million light-years) from the Milky Way galaxy. It is a member of the Virgo II Groups, a series of galaxies and galaxy clusters strung out from the southern edge of the Virgo Supercluster. It has an isophotal diameter of approximately 29.09 to 32.32 kiloparsecs (94,900 to 105,000 light-years), making it slightly larger than the Milky Way. It has a bright nucleus, an unusually large central bulge, and a prominent dust lane in its outer disk, which from Earth is viewed almost edge-on. The dark dust lane and the bulge give it the appearance of a sombrero hat (thus the name). Astronomers initially thought the halo was small and light, indicative of a spiral galaxy; but the Spitzer Space Telescope found that the halo was significantly larger and more massive than previously thought, indicative of a giant elliptical galaxy. The galaxy has an apparent magnitude of +8.0, making it easily visible with amateur telescopes, and is considered by some authors to be the galaxy with the highest absolute magnitude within a radius of 10 megaparsecs of the Milky Way. Its large bulge, central supermassive black hole, and dust lane all attract the attention of professional astronomers. Observation history The Sombrero Galaxy was discovered on May 11, 1781 by Pierre Méchain, who described the object in a May 1783 letter to J. Bernoulli that was later published in the Berliner Astronomisches Jahrbuch. Charles Messier made a handwritten note about this and five other objects (now collectively recognized as M104 – M109) to his personal list of objects now known as the Messier Catalogue, but it was not "officially" included until 1921. William Herschel independently discovered the object in 1784 and additionally noted the presence of a "dark stratum" in the galaxy's disc, what is now called a dust lane. Later astronomers were able to connect Méchain's and Herschel's observations. In 1921, Camille Flammarion found Messier's personal list of the Messier objects including the hand-written notes about the Sombrero Galaxy. This was identified with object 4594 in the New General Catalogue, and Flammarion declared that it should be included in the Messier Catalogue. Since this time, the Sombrero Galaxy has been known as M104. Dust ring As noted above, this galaxy's most striking feature is the dust lane that crosses in front of the bulge of the galaxy. This dust lane is actually a symmetrical ring that encloses the bulge of the galaxy. Most of the cold atomic hydrogen gas and the dust lie within this ring. The ring might also contain most of the Sombrero Galaxy's cold molecular gas, although this is an inference based on observations with low resolution and weak detections. Additional observations are needed to confirm that the Sombrero galaxy's molecular gas is constrained to the ring. Based on infrared spectroscopy, the dust ring is the primary site of star formation within this galaxy. Nucleus The nucleus of the Sombrero Galaxy is classified as a low-ionization nuclear emission-line region (LINER). These are nuclear regions where ionized gas is present, but the ions are only weakly ionized (i.e. the atoms are missing relatively few electrons). The source of energy for ionizing the gas in LINERs has been debated extensively. 
Some LINER nuclei may be powered by hot, young stars found in star formation regions, whereas other LINER nuclei may be powered by active galactic nuclei (highly energetic regions that contain supermassive black holes). Infrared spectroscopy observations have demonstrated that the nucleus of the Sombrero Galaxy is probably devoid of any significant star formation activity. However, a supermassive black hole has been identified in the nucleus (as discussed in the subsection below), so this active galactic nucleus is probably the energy source that weakly ionizes the gas in the Sombrero Galaxy. In the 1990s, a research group led by John Kormendy demonstrated that a supermassive black hole is present within the Sombrero Galaxy. Using spectroscopy data from both the CFHT and the Hubble Space Telescope, the group showed that the speed of revolution of the stars within the center of the galaxy could not be maintained unless a mass of 1 billion times that of the Sun (10⁹ M☉) were present in the center. This is among the most massive black holes measured in any nearby galaxy, and is the nearest billion-solar-mass black hole to Earth. At radio and X-ray wavelengths, the nucleus is a strong source of synchrotron radiation. Synchrotron radiation is produced when high-velocity electrons oscillate as they pass through regions with strong magnetic fields. This emission is quite common for active galactic nuclei. Although radio synchrotron radiation may vary over time for some active galactic nuclei, the luminosity of the radio emission from the Sombrero Galaxy varies by only 10–20%. In 2006, two groups published measurements of the terahertz radiation from the nucleus of the Sombrero Galaxy at a wavelength of 850 μm. This terahertz radiation was found not to originate from the thermal emission from dust (which is commonly seen at infrared and submillimeter wavelengths), synchrotron radiation (which is commonly seen at radio wavelengths), bremsstrahlung emission from hot gas (which is uncommonly seen at millimeter wavelengths), or molecular gas (which commonly produces submillimeter spectral lines). The source of the terahertz radiation remains unidentified. Globular clusters The Sombrero Galaxy has a relatively large number of globular clusters, observational studies of which have produced population estimates in the range of 1,200 to 2,000. The ratio of globular clusters to the galaxy's total luminosity is high compared to the Milky Way and similar galaxies with small bulges, but comparable to other galaxies with large bulges. These results have often been used to argue that the number of a galaxy's globular clusters is related to the size of its bulge. The surface density of the globular clusters generally follows the bulge's light profile, except near the galaxy's center. Distance, mass and brightness At least two methods have been used to measure the distance to the Sombrero Galaxy. The first method relies on comparing the measured fluxes from the galaxy's planetary nebulae to the known luminosity of planetary nebulae in the Milky Way. This method gave the distance to the Sombrero Galaxy as 29 ± 2 Mly (8,890 ± 610 kpc). The second method is the surface brightness fluctuations method, which uses the grainy appearance of the galaxy's bulge to estimate the distance to it. Nearby galaxy bulges appear very grainy, while more distant bulges appear smooth. Early measurements using this technique gave distances of 30.6 ± 1.3 Mly (9,380 ± 400 kpc).
Later, after some refinement of the technique, a distance of 32 ± 3 Mly (9,810 ± 920 kpc) was measured. This was even further refined in 2003 to 29.6 ± 2.5 Mly (9,080 ± 770 kpc). The average distance measured through these two techniques is 29.3 ± 1.6 Mly (8,980 ± 490 kpc).[a] The mass of M104 is estimated to be 800 billion solar masses. The galaxy's absolute magnitude (in the blue) is estimated as −21.9 at 30.6 Mly (9,400 kpc), or −21.8 at the average distance given above, which, as stated above, makes it the brightest galaxy in a radius of 32.6 Mly (10,000 kpc) around the Milky Way. A 2016 report used the Hubble Space Telescope to measure the distance to M104 based on the tip of the red-giant branch method, yielding 9.55 ± 0.13 ± 0.31 Mpc. Nearby galaxies and galaxy group information The Sombrero Galaxy lies within a complex, filament-like cloud of galaxies that extends to the south of the Virgo Cluster. However, it is unclear whether it is part of a formal galaxy group. Hierarchical methods for identifying groups, which determine group membership by considering whether individual galaxies belong to a larger aggregate of galaxies, typically produce results showing that the Sombrero Galaxy is part of a group that includes NGC 4487, NGC 4504, NGC 4802, UGCA 289, and possibly a few other galaxies. However, results that rely on the percolation method (also known as the friends-of-friends method), which links individual galaxies together to determine group membership, indicate that either the Sombrero Galaxy is not in a group or that it may be only part of a galaxy pair with UGCA 287. Besides that, M104 is also accompanied by an ultra-compact dwarf galaxy, discovered in 2009, with an absolute magnitude of −12.3, an effective radius of just 47.9 ly (3.03 million astronomical units), and a mass of 3.3×10⁷ M☉. Amateur astronomy The Sombrero Galaxy is 11.5° west of Spica and 5.5° north-east of Eta Corvi. Although it is visible with 7×35 binoculars or a 4-inch (100 mm) amateur telescope, an 8-inch (200 mm) telescope is needed to distinguish the bulge from the disk, and a 10- or 12-inch (250 or 300 mm) telescope to see the dark dust lane. |
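The combined figure of 29.3 ± 1.6 Mly quoted above is consistent with a plain average of the most recent result from each technique, with the two uncertainties added in quadrature; that pairing is an inference from the numbers rather than a method stated in the source. The short Python sketch below reproduces the arithmetic and also cross-checks the parsec-to-light-year conversion for the 2016 tip-of-the-red-giant-branch measurement.

```python
import math

LY_PER_PC = 3.26156  # light-years per parsec (standard conversion)

def mean_with_quadrature(measurements):
    """Plain mean of independent measurements; uncertainties combined in quadrature."""
    values = [v for v, _ in measurements]
    errors = [e for _, e in measurements]
    n = len(values)
    mean = sum(values) / n
    err = math.sqrt(sum(e * e for e in errors)) / n
    return mean, err

# Most recent distance from each technique quoted above, in Mly:
#   planetary nebula luminosity function: 29 +/- 2
#   surface brightness fluctuations (2003): 29.6 +/- 2.5
mean, err = mean_with_quadrature([(29.0, 2.0), (29.6, 2.5)])
print(f"combined distance: {mean:.1f} +/- {err:.1f} Mly")  # 29.3 +/- 1.6 Mly, as quoted

# Cross-check of the parsec/light-year conversion for the 2016 TRGB result:
print(f"9.55 Mpc = {9.55 * LY_PER_PC:.1f} Mly")            # ~31.1 Mly
```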
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Computer#cite_ref-71] | [TOKENS: 10628] |
Contents Computer A computer is a machine that can be programmed to automatically carry out sequences of arithmetic or logical operations (computation). Modern digital electronic computers can perform generic sets of operations known as programs, which enable computers to perform a wide range of tasks. The term computer system may refer to a nominally complete computer that includes the hardware, operating system, software, and peripheral equipment needed and used for full operation, or to a group of computers that are linked and function together, such as a computer network or computer cluster. A broad range of industrial and consumer products use computers as control systems, including simple special-purpose devices like microwave ovens and remote controls, and factory devices like industrial robots. Computers are at the core of general-purpose devices such as personal computers and mobile devices such as smartphones. Computers power the Internet, which links billions of computers and users. Early computers were meant to be used only for calculations. Simple manual instruments like the abacus have aided people in doing calculations since ancient times. Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century. The first digital electronic calculating machines were developed during World War II, both electromechanical and using thermionic valves. The first semiconductor transistors in the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and monolithic integrated circuit chip technologies in the late 1950s, leading to the microprocessor and the microcomputer revolution in the 1970s. The speed, power, and versatility of computers have been increasing dramatically ever since then, with transistor counts increasing at a rapid pace (Moore's law noted that counts doubled every two years), leading to the Digital Revolution during the late 20th and early 21st centuries. Conventionally, a modern computer consists of at least one processing element, typically a central processing unit (CPU) in the form of a microprocessor, together with some type of computer memory, typically semiconductor memory chips. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joysticks, etc.), output devices (monitors, printers, etc.), and input/output devices that perform both functions (e.g. touchscreens). Peripheral devices allow information to be retrieved from an external source, and they enable the results of operations to be saved and retrieved. Etymology It was not until the mid-20th century that the word acquired its modern definition; according to the Oxford English Dictionary, the first known use of the word computer was in a different sense, in a 1613 book called The Yong Mans Gleanings by the English writer Richard Brathwait: "I haue [sic] read the truest computer of Times, and the best Arithmetician that euer [sic] breathed, and he reduceth thy dayes into a short number." This usage of the term referred to a human computer, a person who carried out calculations or computations. The word continued to have the same meaning until the middle of the 20th century. 
During the latter part of this period, women were often hired as computers because they could be paid less than their male counterparts. By 1943, most human computers were women. The Online Etymology Dictionary gives the first attested use of computer in the 1640s, meaning 'one who calculates'; this is an "agent noun from compute (v.)". The Online Etymology Dictionary states that the use of the term to mean "'calculating machine' (of any type) is from 1897." The Online Etymology Dictionary indicates that the "modern use" of the term, to mean 'programmable digital electronic computer' dates from "1945 under this name; [in a] theoretical [sense] from 1937, as Turing machine". The name has remained, although modern computers are capable of many higher-level functions. History Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was most likely a form of tally stick. Later record keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented counts of items, likely livestock or grains, sealed in hollow unbaked clay containers.[a] The use of counting rods is one example. The abacus was initially used for arithmetic tasks. The Roman abacus was developed from devices used in Babylonia as early as 2400 BCE. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money. The Antikythera mechanism is believed to be the earliest known mechanical analog computer, according to Derek J. de Solla Price. It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to approximately c. 100 BCE. Devices of comparable complexity to the Antikythera mechanism would not reappear until the fourteenth century. Many mechanical aids to calculation and measurement were constructed for astronomical and navigation use. The planisphere was a star chart invented by Abū Rayhān al-Bīrūnī in the early 11th century. The astrolabe was invented in the Hellenistic world in either the 1st or 2nd centuries BCE and is often attributed to Hipparchus. A combination of the planisphere and dioptra, the astrolabe was effectively an analog computer capable of working out several different kinds of problems in spherical astronomy. An astrolabe incorporating a mechanical calendar computer and gear-wheels was invented by Abi Bakr of Isfahan, Persia in 1235. Abū Rayhān al-Bīrūnī invented the first mechanical geared lunisolar calendar astrolabe, an early fixed-wired knowledge processing machine with a gear train and gear-wheels, c. 1000 AD. The sector, a calculating instrument used for solving problems in proportion, trigonometry, multiplication and division, and for various functions, such as squares and cube roots, was developed in the late 16th century and found application in gunnery, surveying and navigation. The planimeter was a manual instrument to calculate the area of a closed figure by tracing over it with a mechanical linkage. The slide rule was invented around 1620–1630, by the English clergyman William Oughtred, shortly after the publication of the concept of the logarithm. It is a hand-operated analog computer for doing multiplication and division. 
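The slide rule's multiplication trick rests on the logarithm identity log(a) + log(b) = log(a·b): sliding one logarithmic scale along another adds lengths proportional to the logarithms of the two factors, and the product is read off where they meet. A minimal sketch of that principle in Python (not a model of any particular slide rule):

```python
import math

def slide_rule_multiply(a: float, b: float) -> float:
    """Multiply by adding logarithmic 'lengths', as a slide rule does mechanically.

    On a real rule the two lengths are set physically and the answer is read off
    a scale; here we simply add the base-10 logs and exponentiate the sum.
    """
    length_a = math.log10(a)   # position of 'a' on a logarithmic scale
    length_b = math.log10(b)   # offset contributed by sliding the second scale
    return 10 ** (length_a + length_b)

print(slide_rule_multiply(2.0, 3.0))   # ~6.0 (limited precision, much like a real rule)
```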
As slide rule development progressed, added scales provided reciprocals, squares and square roots, cubes and cube roots, as well as transcendental functions such as logarithms and exponentials, circular and hyperbolic trigonometry and other functions. Slide rules with special scales are still used for quick performance of routine calculations, such as the E6B circular slide rule used for time and distance calculations on light aircraft. In the 1770s, Pierre Jaquet-Droz, a Swiss watchmaker, built a mechanical doll (automaton) that could write holding a quill pen. By switching the number and order of its internal wheels, different letters, and hence different messages, could be produced. In effect, it could be mechanically "programmed" to read instructions. Along with two other complex machines, the doll is at the Musée d'Art et d'Histoire of Neuchâtel, Switzerland, and still operates. In 1831–1835, the mathematician and engineer Giovanni Plana devised a Perpetual Calendar machine, which through a system of pulleys and cylinders could predict the perpetual calendar for every year from 0 CE (that is, 1 BCE) to 4000 CE, keeping track of leap years and varying day length. The tide-predicting machine invented by the Scottish scientist Sir William Thomson in 1872 was of great utility to navigation in shallow waters. It used a system of pulleys and wires to automatically calculate predicted tide levels for a set period at a particular location. The differential analyser, a mechanical analog computer designed to solve differential equations by integration, used wheel-and-disc mechanisms to perform the integration. In 1876, Sir William Thomson had already discussed the possible construction of such calculators, but he had been stymied by the limited output torque of the ball-and-disk integrators. In a differential analyzer, the output of one integrator drove the input of the next integrator, or a graphing output. The torque amplifier was the advance that allowed these machines to work. Starting in the 1920s, Vannevar Bush and others developed mechanical differential analyzers. In the 1890s, the Spanish engineer Leonardo Torres Quevedo began to develop a series of advanced analog machines that could solve real and complex roots of polynomials, which were published in 1901 by the Paris Academy of Sciences. Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer. Considered the "father of the computer", he conceptualized and invented the first mechanical computer in the early 19th century. After working on his difference engine, designed to aid in navigational calculations, he announced his invention in 1822 in a paper to the Royal Astronomical Society, titled "Note on the application of machinery to the computation of astronomical and mathematical tables". In 1833 he realized that a much more general design, an analytical engine, was possible. The input of programs and data was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. The engine would incorporate an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.
The machine was about a century ahead of its time. All the parts for his machine had to be made by hand – this was a major problem for a device with thousands of parts. Eventually, the project was dissolved with the decision of the British Government to cease funding. Babbage's failure to complete the analytical engine can be chiefly attributed to political and financial difficulties as well as his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow. Nevertheless, his son, Henry Babbage, completed a simplified version of the analytical engine's computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in 1906. In his work Essays on Automatics, published in 1914, Leonardo Torres Quevedo wrote a brief history of Babbage's efforts at constructing a mechanical Difference Engine and Analytical Engine. The paper describes the design of a machine capable of calculating formulas such as a^x(y − z)^2 for a sequence of sets of values. The whole machine was to be controlled by a read-only program, which was complete with provisions for conditional branching. He also introduced the idea of floating-point arithmetic. In 1920, to celebrate the 100th anniversary of the invention of the arithmometer, Torres presented in Paris the Electromechanical Arithmometer, which allowed a user to input arithmetic problems through a keyboard, and computed and printed the results, demonstrating the feasibility of an electromechanical analytical engine. During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers. The first modern analog computer was a tide-predicting machine, invented by Sir William Thomson (later to become Lord Kelvin) in 1872. The differential analyser, a mechanical analog computer designed to solve differential equations by integration using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson, the elder brother of the more famous Sir William Thomson. The art of mechanical analog computing reached its zenith with the differential analyzer, completed in 1931 by Vannevar Bush at MIT. By the 1950s, the success of digital electronic computers had spelled the end for most analog computing machines, but analog computers remained in use in some specialized applications such as education (the slide rule) and aircraft (control systems). Claude Shannon's 1937 master's thesis laid the foundations of digital computing, with his insight of applying Boolean algebra to the analysis and synthesis of switching circuits being the basic concept which underlies all electronic digital computers. By 1938, the United States Navy had developed the Torpedo Data Computer, an electromechanical analog computer for submarines that used trigonometry to solve the problem of firing a torpedo at a moving target. During World War II, similar devices were developed in other countries. Early digital computers were electromechanical; electric switches drove mechanical relays to perform the calculation. These devices had a low operating speed and were eventually superseded by much faster all-electric computers, originally using vacuum tubes.
The Z2, created by German engineer Konrad Zuse in 1939 in Berlin, was one of the earliest examples of an electromechanical relay computer. In 1941, Zuse followed his earlier machine up with the Z3, the world's first working electromechanical programmable, fully automatic digital computer. The Z3 was built with 2000 relays, implementing a 22-bit word length that operated at a clock frequency of about 5–10 Hz. Program code was supplied on punched film while data could be stored in 64 words of memory or supplied from the keyboard. It was quite similar to modern machines in some respects, pioneering numerous advances such as floating-point numbers. Rather than the harder-to-implement decimal system (used in Charles Babbage's earlier design), using a binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time. The Z3 was not itself a universal computer but could be extended to be Turing complete. Zuse's next computer, the Z4, became the world's first commercial computer; after initial delay due to the Second World War, it was completed in 1950 and delivered to the ETH Zurich. The computer was manufactured by Zuse's own company, Zuse KG, which was founded in 1941 as the first company with the sole purpose of developing computers in Berlin. The Z4 served as the inspiration for the construction of the ERMETH, the first Swiss computer and one of the first in Europe. Purely electronic circuit elements soon replaced their mechanical and electromechanical equivalents, at the same time that digital calculation replaced analog. The engineer Tommy Flowers, working at the Post Office Research Station in London in the 1930s, began to explore the possible use of electronics for the telephone exchange. Experimental equipment that he built in 1934 went into operation five years later, converting a portion of the telephone exchange network into an electronic data processing system, using thousands of vacuum tubes. In the US, John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed and tested the Atanasoff–Berry Computer (ABC) in 1942, the first "automatic electronic digital computer". This design was also all-electronic and used about 300 vacuum tubes, with capacitors fixed in a mechanically rotating drum for memory. During World War II, the British code-breakers at Bletchley Park achieved a number of successes at breaking encrypted German military communications. The German encryption machine, Enigma, was first attacked with the help of the electro-mechanical bombes which were often run by women. To crack the more sophisticated German Lorenz SZ 40/42 machine, used for high-level Army communications, Max Newman and his colleagues commissioned Flowers to build the Colossus. He spent eleven months from early February 1943 designing and building the first Colossus. After a functional test in December 1943, Colossus was shipped to Bletchley Park, where it was delivered on 18 January 1944 and attacked its first message on 5 February. Colossus was the world's first electronic digital programmable computer. It used a large number of valves (vacuum tubes). It had paper-tape input and was capable of being configured to perform a variety of boolean logical operations on its data, but it was not Turing-complete. Nine Mk II Colossi were built (The Mk I was converted to a Mk II making ten machines in total). 
Colossus Mark I contained 1,500 thermionic valves (tubes), but Mark II, with 2,400 valves, was both five times faster and simpler to operate than Mark I, greatly speeding the decoding process. The ENIAC (Electronic Numerical Integrator and Computer) was the first electronic programmable computer built in the U.S. Although the ENIAC was similar to the Colossus, it was much faster, more flexible, and it was Turing-complete. Like the Colossus, a "program" on the ENIAC was defined by the states of its patch cables and switches, a far cry from the stored program electronic machines that came later. Once a program was written, it had to be mechanically set into the machine with manual resetting of plugs and switches. The programmers of the ENIAC were six women, often known collectively as the "ENIAC girls". It combined the high speed of electronics with the ability to be programmed for many complex problems. It could add or subtract 5000 times a second, a thousand times faster than any other machine. It also had modules to multiply, divide, and take square roots. High-speed memory was limited to 20 words (about 80 bytes). Built under the direction of John Mauchly and J. Presper Eckert at the University of Pennsylvania, ENIAC's development and construction lasted from 1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons, using 200 kilowatts of electric power and containing over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors. The principle of the modern computer was proposed by Alan Turing in his seminal 1936 paper, On Computable Numbers. Turing proposed a simple device that he called the "Universal Computing machine" and that is now known as a universal Turing machine. He proved that such a machine is capable of computing anything that is computable by executing instructions (a program) stored on tape, allowing the machine to be programmable. The fundamental concept of Turing's design is the stored program, where all the instructions for computing are stored in memory. Von Neumann acknowledged that the central concept of the modern computer was due to this paper. Turing machines are to this day a central object of study in the theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine. Early computing machines had fixed programs; changing such a machine's function required re-wiring and re-structuring it. With the proposal of the stored-program computer this changed. A stored-program computer includes by design an instruction set and can store in memory a set of instructions (a program) that details the computation. The theoretical basis for the stored-program computer was laid out by Alan Turing in his 1936 paper. In 1945, Turing joined the National Physical Laboratory and began work on developing an electronic stored-program digital computer. His 1945 report "Proposed Electronic Calculator" was the first specification for such a device. John von Neumann at the University of Pennsylvania also circulated his First Draft of a Report on the EDVAC in 1945. The Manchester Baby was the world's first stored-program computer. It was built at the University of Manchester in England by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948.
It was designed as a testbed for the Williams tube, the first random-access digital storage device. Although the computer was described as "small and primitive" by a 1998 retrospective, it was the first working machine to contain all of the elements essential to a modern electronic computer. As soon as the Baby had demonstrated the feasibility of its design, a project began at the university to develop it into a practically useful computer, the Manchester Mark 1. The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the world's first commercially available general-purpose computer. Built by Ferranti, it was delivered to the University of Manchester in February 1951. At least seven of these later machines were delivered between 1953 and 1957, one of them to Shell labs in Amsterdam. In October 1947, the directors of British catering company J. Lyons & Company decided to take an active role in promoting the commercial development of computers. Lyons's LEO I computer, modelled closely on the Cambridge EDSAC of 1949, became operational in April 1951 and ran the world's first routine office computer job. The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John Bardeen and Walter Brattain, while working under William Shockley at Bell Labs, built the first working transistor, the point-contact transistor, in 1947, which was followed by Shockley's bipolar junction transistor in 1948. From 1955 onwards, transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Compared to vacuum tubes, transistors have many advantages: they are smaller and require less power, so they give off less heat. Junction transistors were much more reliable than vacuum tubes and had a longer, effectively indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialized applications. At the University of Manchester, a team under the leadership of Tom Kilburn designed and built a machine using the newly developed transistors instead of valves. Their first transistorized computer, and the first in the world, was operational by 1953, and a second version was completed there in April 1955. However, the machine did make use of valves to generate its 125 kHz clock waveforms and in the circuitry to read and write on its magnetic drum memory, so it was not the first completely transistorized computer. That distinction goes to the Harwell CADET of 1955, built by the electronics division of the Atomic Energy Research Establishment at Harwell. The metal–oxide–silicon field-effect transistor (MOSFET), also known as the MOS transistor, was invented at Bell Labs between 1955 and 1960 and was the first truly compact transistor that could be miniaturized and mass-produced for a wide range of uses. With its high scalability, much lower power consumption, and higher density than bipolar junction transistors, the MOSFET made it possible to build high-density integrated circuits. In addition to data processing, it also enabled the practical use of MOS transistors as memory cell storage elements, leading to the development of MOS semiconductor memory, which replaced earlier magnetic-core memory in computers.
The MOSFET led to the microcomputer revolution, and became the driving force behind the computer revolution. The MOSFET is the most widely used transistor in computers, and is the fundamental building block of digital electronics. The next great advance in computing power came with the advent of the integrated circuit (IC). The idea of the integrated circuit was first conceived by a radar scientist working for the Royal Radar Establishment of the Ministry of Defence, Geoffrey W.A. Dummer. Dummer presented the first public description of an integrated circuit at the Symposium on Progress in Quality Electronic Components in Washington, D.C., on 7 May 1952. The first working ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor. Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958. In his patent application of 6 February 1959, Kilby described his new device as "a body of semiconductor material ... wherein all the components of the electronic circuit are completely integrated". However, Kilby's invention was a hybrid integrated circuit (hybrid IC), rather than a monolithic integrated circuit (IC) chip. Kilby's IC had external wire connections, which made it difficult to mass-produce. Noyce also came up with his own idea of an integrated circuit, half a year later than Kilby. Noyce's invention was the first true monolithic IC chip. His chip solved many practical problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium. Noyce's monolithic IC was fabricated using the planar process, developed by his colleague Jean Hoerni in early 1959. In turn, the planar process was based on Carl Frosch and Lincoln Derick's work on semiconductor surface passivation by silicon dioxide. Modern monolithic ICs are predominantly MOS (metal–oxide–semiconductor) integrated circuits, built from MOSFETs (MOS transistors). The earliest experimental MOS IC to be fabricated was a 16-transistor chip built by Fred Heiman and Steven Hofstein at RCA in 1962. General Microelectronics later introduced the first commercial MOS IC in 1964, developed by Robert Norman. Following the development of the self-aligned gate (silicon-gate) MOS transistor by Robert Kerwin, Donald Klein and John Sarace at Bell Labs in 1967, the first silicon-gate MOS IC with self-aligned gates was developed by Federico Faggin at Fairchild Semiconductor in 1968. The MOSFET has since become the most critical device component in modern ICs. The development of the MOS integrated circuit led to the invention of the microprocessor, and heralded an explosion in the commercial and personal use of computers. While the subject of exactly which device was the first microprocessor is contentious, partly due to lack of agreement on the exact definition of the term "microprocessor", it is largely undisputed that the first single-chip microprocessor was the Intel 4004, designed and realized by Federico Faggin with his silicon-gate MOS IC technology, along with Ted Hoff, Masatoshi Shima and Stanley Mazor at Intel.[b] In the early 1970s, MOS IC technology enabled the integration of more than 10,000 transistors on a single chip. Systems on a chip (SoCs) are complete computers on a microchip (or chip) the size of a coin. They may or may not have integrated RAM and flash memory.
If not integrated, the RAM is usually placed directly above (known as Package on package) or below (on the opposite side of the circuit board) the SoC, and the flash memory is usually placed right next to the SoC. This is done to improve data transfer speeds, as the data signals do not have to travel long distances. Since ENIAC in 1945, computers have advanced enormously, with modern SoCs (such as the Snapdragon 865) being the size of a coin while also being hundreds of thousands of times more powerful than ENIAC, integrating billions of transistors, and consuming only a few watts of power. The first mobile computers were heavy and ran from mains power. The 50 lb (23 kg) IBM 5100 was an early example. Later portables such as the Osborne 1 and Compaq Portable were considerably lighter but still needed to be plugged in. The first laptops, such as the Grid Compass, removed this requirement by incorporating batteries – and with the continued miniaturization of computing resources and advancements in portable battery life, portable computers grew in popularity in the 2000s. The same developments allowed manufacturers to integrate computing resources into cellular mobile phones by the early 2000s. These smartphones and tablets run on a variety of operating systems and recently became the dominant computing devices on the market. They are powered by systems on a chip (SoCs), which are complete computers on a microchip the size of a coin. Types Computers can be classified in a number of different ways. A computer does not need to be electronic, nor even have a processor, nor RAM, nor even a hard disk. While popular usage of the word "computer" is synonymous with a personal electronic computer,[c] a typical modern definition of a computer is: "A device that computes, especially a programmable [usually] electronic machine that performs high-speed mathematical or logical operations or that assembles, stores, correlates, or otherwise processes information." According to this definition, any device that processes information qualifies as a computer. Hardware The term hardware covers all of those parts of a computer that are tangible physical objects. Circuits, computer chips, graphics cards, sound cards, memory (RAM), motherboards, displays, power supplies, cables, keyboards, printers and "mice" input devices are all hardware. A general-purpose computer has four main components: the arithmetic logic unit (ALU), the control unit, the memory, and the input and output devices (collectively termed I/O). These parts are interconnected by buses, often made of groups of wires. Inside each of these parts are thousands to trillions of small electrical circuits which can be turned off or on by means of an electronic switch. Each circuit represents a bit (binary digit) of information so that when the circuit is on it represents a "1", and when off it represents a "0" (in positive logic representation). The circuits are arranged in logic gates so that one or more of the circuits may control the state of one or more of the other circuits. Input devices are the means by which the operations of a computer are controlled and data is provided to it; examples include the keyboard and the mouse. Output devices are the means by which a computer provides the results of its calculations in a human-accessible form.
Examples include the display and the printer. The control unit (often called a control system or central controller) manages the computer's various components; it reads and interprets (decodes) the program instructions, transforming them into control signals that activate other parts of the computer.[e] Control systems in advanced computers may change the order of execution of some instructions to improve performance. A key component common to all CPUs is the program counter, a special memory cell (a register) that keeps track of which location in memory the next instruction is to be read from.[f] The control system's function is as follows (this is a simplified description, and some of these steps may be performed concurrently or in a different order depending on the type of CPU): it reads the instruction from the memory location indicated by the program counter, decodes it into control signals, fetches any data the instruction needs from memory or registers, directs the ALU or other components to carry out the operation, writes any result back to memory or a register, and then updates the program counter so the cycle can repeat with the next instruction. Since the program counter is (conceptually) just another set of memory cells, it can be changed by calculations done in the ALU. Adding 100 to the program counter would cause the next instruction to be read from a place 100 locations further down the program. Instructions that modify the program counter are often known as "jumps" and allow for loops (instructions that are repeated by the computer) and often conditional instruction execution (both examples of control flow). The sequence of operations that the control unit goes through to process an instruction is in itself like a short computer program, and indeed, in some more complex CPU designs, there is another yet smaller computer called a microsequencer, which runs a microcode program that causes all of these events to happen. The control unit, ALU, and registers are collectively known as a central processing unit (CPU). Early CPUs were composed of many separate components. Since the 1970s, CPUs have typically been constructed on a single MOS integrated circuit chip called a microprocessor. The ALU is capable of performing two classes of operations: arithmetic and logic. The set of arithmetic operations that a particular ALU supports may be limited to addition and subtraction, or might include multiplication, division, trigonometric functions such as sine and cosine, and square roots. Some can operate only on whole numbers (integers) while others use floating point to represent real numbers, albeit with limited precision. However, any computer that is capable of performing just the simplest operations can be programmed to break down the more complex operations into simple steps that it can perform. Therefore, any computer can be programmed to perform any arithmetic operation, although it will take more time to do so if its ALU does not directly support the operation. An ALU may also compare numbers and return Boolean truth values (true or false) depending on whether one is equal to, greater than or less than the other ("is 64 greater than 65?"). Logic operations involve Boolean logic: AND, OR, XOR, and NOT. These can be useful for creating complicated conditional statements and processing Boolean logic. Superscalar computers may contain multiple ALUs, allowing them to process several instructions simultaneously. Graphics processors and computers with SIMD and MIMD features often contain ALUs that can perform arithmetic on vectors and matrices. A computer's memory can be viewed as a list of cells into which numbers can be placed or read. Each cell has a numbered "address" and can store a single number.
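The interplay just described, between numbered memory cells, the program counter, jump instructions, and the ALU, can be made concrete with a toy simulation. The following Python sketch uses an invented five-instruction machine; the opcode names, the accumulator, and the memory layout are assumptions made purely for illustration and do not correspond to any real CPU:

    # A toy illustration of how a control unit steps through numbered memory
    # cells using a program counter, and how a "jump" is nothing more than
    # overwriting that counter.
    memory = [0] * 16          # sixteen numbered cells, addresses 0..15
    memory[0:6] = [
        ("LOAD", 10),          # cell 0: copy the number in cell 10 into the accumulator
        ("ADD", 11),           # cell 1: add the number in cell 11 to the accumulator
        ("STORE", 12),         # cell 2: write the accumulator back to cell 12
        ("JUMP", 5),           # cell 3: set the program counter to 5
        ("HALT", 0),           # cell 4: skipped because of the jump above
        ("HALT", 0),           # cell 5: stop
    ]
    memory[10], memory[11] = 2, 3   # the data the program works on

    pc = 0                     # program counter: address of the next instruction
    accumulator = 0
    while True:
        opcode, operand = memory[pc]   # fetch and decode
        pc += 1                        # normally move on to the next cell
        if opcode == "LOAD":
            accumulator = memory[operand]
        elif opcode == "ADD":
            accumulator += memory[operand]
        elif opcode == "STORE":
            memory[operand] = accumulator
        elif opcode == "JUMP":
            pc = operand               # a jump simply changes the program counter
        elif opcode == "HALT":
            break

    print(memory[12])          # prints 5 (2 + 3)

The essential point of the sketch is that the JUMP instruction does nothing more than overwrite the program counter, which is exactly how loops and conditional execution are achieved on real machines.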
The computer can be instructed to "put the number 123 into the cell numbered 1357" or to "add the number that is in cell 1357 to the number that is in cell 2468 and put the answer into cell 1595." The information stored in memory may represent practically anything. Letters, numbers, even computer instructions can be placed into memory with equal ease. Since the CPU does not differentiate between different types of information, it is the software's responsibility to give significance to what the memory sees as nothing but a series of numbers. In almost all modern computers, each memory cell is set up to store binary numbers in groups of eight bits (called a byte). Each byte is able to represent 256 different numbers (2^8 = 256); either from 0 to 255 or −128 to +127. To store larger numbers, several consecutive bytes may be used (typically, two, four or eight). When negative numbers are required, they are usually stored in two's complement notation. Other arrangements are possible, but are usually not seen outside of specialized applications or historical contexts. A computer can store any kind of information in memory if it can be represented numerically. Modern computers have billions or even trillions of bytes of memory. The CPU contains a special set of memory cells called registers that can be read and written to much more rapidly than the main memory area. There are typically between two and one hundred registers depending on the type of CPU. Registers are used for the most frequently needed data items to avoid having to access main memory every time data is needed. As data is constantly being worked on, reducing the need to access main memory (which is often slow compared to the ALU and control units) greatly increases the computer's speed. Computer main memory comes in two principal varieties: random-access memory (RAM) and read-only memory (ROM). RAM can be read and written to anytime the CPU commands it, but ROM is preloaded with data and software that never changes, so the CPU can only read from it. ROM is typically used to store the computer's initial start-up instructions. In general, the contents of RAM are erased when the power to the computer is turned off, but ROM retains its data indefinitely. In a PC, the ROM contains a specialized program called the BIOS that orchestrates loading the computer's operating system from the hard disk drive into RAM whenever the computer is turned on or reset. In embedded computers, which frequently do not have disk drives, all of the required software may be stored in ROM. Software stored in ROM is often called firmware, because it is notionally more like hardware than software. Flash memory blurs the distinction between ROM and RAM, as it retains its data when turned off but is also rewritable. It is typically much slower than conventional ROM and RAM, however, so its use is restricted to applications where high speed is unnecessary.[g] In more sophisticated computers there may be one or more RAM cache memories, which are slower than registers but faster than main memory. Generally, computers with this sort of cache are designed to move frequently needed data into the cache automatically, often without the need for any intervention on the programmer's part. I/O is the means by which a computer exchanges information with the outside world. Devices that provide input or output to the computer are called peripherals. On a typical personal computer, peripherals include input devices like the keyboard and mouse, and output devices such as the display and printer.
Hard disk drives, floppy disk drives and optical disc drives serve as both input and output devices. Computer networking is another form of I/O. I/O devices are often complex computers in their own right, with their own CPU and memory. A graphics processing unit might contain fifty or more tiny computers that perform the calculations necessary to display 3D graphics. Modern desktop computers contain many smaller computers that assist the main CPU in performing I/O. A 2016-era flat screen display contains its own computer circuitry. While a computer may be viewed as running one gigantic program stored in its main memory, in some systems it is necessary to give the appearance of running several programs simultaneously. This is achieved by multitasking, i.e. having the computer switch rapidly between running each program in turn. One means by which this is done is with a special signal called an interrupt, which can periodically cause the computer to stop executing instructions where it was and do something else instead. By remembering where it was executing prior to the interrupt, the computer can return to that task later. If several programs are running "at the same time", then the interrupt generator might be causing several hundred interrupts per second, causing a program switch each time. Since modern computers typically execute instructions several orders of magnitude faster than human perception, it may appear that many programs are running at the same time, even though only one is ever executing in any given instant. This method of multitasking is sometimes termed "time-sharing" since each program is allocated a "slice" of time in turn; a toy illustration of this round-robin switching appears at the end of this passage. Before the era of inexpensive computers, the principal use for multitasking was to allow many people to share the same computer. It might seem that multitasking would cause a computer that is switching between several programs to run more slowly, in direct proportion to the number of programs it is running, but most programs spend much of their time waiting for slow input/output devices to complete their tasks. If a program is waiting for the user to click on the mouse or press a key on the keyboard, then it will not take a "time slice" until the event it is waiting for has occurred. This frees up time for other programs to execute so that many programs may be run simultaneously without unacceptable speed loss. Some computers are designed to distribute their work across several CPUs in a multiprocessing configuration, a technique once employed in only large and powerful machines such as supercomputers, mainframe computers and servers. Multiprocessor and multi-core (multiple CPUs on a single integrated circuit) personal and laptop computers are now widely available, and are being increasingly used in lower-end markets as a result. Supercomputers in particular often have highly distinctive architectures that differ significantly from the basic stored-program architecture and from general-purpose computers.[h] They often feature thousands of CPUs, customized high-speed interconnects, and specialized computing hardware. Such designs tend to be useful for only specialized tasks due to the large scale of program organization required to use most of the available resources at once. Supercomputers usually see usage in large-scale simulation, graphics rendering, and cryptography applications, as well as with other so-called "embarrassingly parallel" tasks.
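The round-robin switching mentioned above can be imitated in miniature with cooperative switching. In the following Python sketch, each "program" is a generator that does one step of work and then gives up control, and a trivial scheduler hands out turns in rotation; this is only a sketch of the time-slicing idea, since a real operating system preempts programs with hardware interrupts rather than waiting for them to yield:

    def count_program(name, limit):
        """A toy 'program' that does one step of work per time slice."""
        total = 0
        for i in range(1, limit + 1):
            total += i
            yield f"{name}: partial sum {total}"   # give up the CPU after each step

    def scheduler(programs):
        """Hand out time slices in rotation until every program has finished."""
        while programs:
            program = programs.pop(0)      # take the program at the front of the queue
            try:
                print(next(program))       # run it for one "slice"
                programs.append(program)   # not finished: send it to the back of the queue
            except StopIteration:
                pass                       # finished: drop it from the queue

    scheduler([count_program("A", 3), count_program("B", 2)])
    # Output interleaves A and B:
    # A: partial sum 1, B: partial sum 1, A: partial sum 3, B: partial sum 3, A: partial sum 6

Because the output of the two toy programs is interleaved, they appear to make progress simultaneously even though only one of them is running at any instant.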
Software Software is the part of a computer system that consists of the encoded information that determines the computer's operation, such as data or instructions on how to process the data. In contrast to the physical hardware from which the system is built, software is immaterial. Software includes computer programs, libraries and related non-executable data, such as online documentation or digital media. It is often divided into system software and application software. Computer hardware and software require each other and neither is useful on its own. When software is stored in hardware that cannot easily be modified, such as with BIOS ROM in an IBM PC compatible computer, it is sometimes called "firmware". The defining feature of modern computers which distinguishes them from all other machines is that they can be programmed. That is to say that some type of instructions (the program) can be given to the computer, and it will process them. Modern computers based on the von Neumann architecture often have machine code in the form of an imperative programming language. In practical terms, a computer program may be just a few instructions or extend to many millions of instructions, as do the programs for word processors and web browsers for example. A typical modern computer can execute billions of instructions per second (gigaflops) and rarely makes a mistake over many years of operation. Large computer programs consisting of several million instructions may take teams of programmers years to write, and due to the complexity of the task almost certainly contain errors. This section applies to most common RAM machine–based computers. In most cases, computer instructions are simple: add one number to another, move some data from one location to another, send a message to some external device, etc. These instructions are read from the computer's memory and are generally carried out (executed) in the order they were given. However, there are usually specialized instructions to tell the computer to jump ahead or backwards to some other place in the program and to carry on executing from there. These are called "jump" instructions (or branches). Furthermore, jump instructions may be made to happen conditionally so that different sequences of instructions may be used depending on the result of some previous calculation or some external event. Many computers directly support subroutines by providing a type of jump that "remembers" the location it jumped from and another instruction to return to the instruction following that jump instruction. Program execution might be likened to reading a book. While a person will normally read each word and line in sequence, they may at times jump back to an earlier place in the text or skip sections that are not of interest. Similarly, a computer may sometimes go back and repeat the instructions in some section of the program over and over again until some internal condition is met. This is called the flow of control within the program and it is what allows the computer to perform tasks repeatedly without human intervention. Comparatively, a person using a pocket calculator can perform a basic arithmetic operation such as adding two numbers with just a few button presses. But to add together all of the numbers from 1 to 1,000 would take thousands of button presses and a lot of time, with a near certainty of making a mistake. On the other hand, a computer may be programmed to do this with just a few simple instructions. 
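One such program can be written in the MIPS assembly language. The listing below is a minimal sketch that sums the integers from 1 to 1,000; the register choices, labels, and use of assembler pseudo-instructions such as li, ble and move are illustrative assumptions rather than a canonical version of the example:

            .text
    main:
            li   $t0, 0            # running total, initialised to zero
            li   $t1, 1            # the current number to add, starting at 1
    loop:
            add  $t0, $t0, $t1     # total = total + current number
            addi $t1, $t1, 1       # move on to the next number
            ble  $t1, 1000, loop   # repeat while the current number is still <= 1000
            move $v0, $t0          # leave the finished sum (500500) in register $v0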
Once told to run this program, the computer will perform the repetitive addition task without further human intervention. It will almost never make a mistake and a modern PC can complete the task in a fraction of a second. In most computers, individual instructions are stored as machine code with each instruction being given a unique number (its operation code or opcode for short). The command to add two numbers together would have one opcode; the command to multiply them would have a different opcode, and so on. The simplest computers are able to perform any of a handful of different instructions; the more complex computers have several hundred to choose from, each with a unique numerical code. Since the computer's memory is able to store numbers, it can also store the instruction codes. This leads to the important fact that entire programs (which are just lists of these instructions) can be represented as lists of numbers and can themselves be manipulated inside the computer in the same way as numeric data. The fundamental concept of storing programs in the computer's memory alongside the data they operate on is the crux of the von Neumann, or stored program, architecture. In some cases, a computer might store some or all of its program in memory that is kept separate from the data it operates on. This is called the Harvard architecture after the Harvard Mark I computer. Modern von Neumann computers display some traits of the Harvard architecture in their designs, such as in CPU caches. While it is possible to write computer programs as long lists of numbers (machine language) and while this technique was used with many early computers,[i] it is extremely tedious and potentially error-prone to do so in practice, especially for complicated programs. Instead, each basic instruction can be given a short name that is indicative of its function and easy to remember – a mnemonic such as ADD, SUB, MULT or JUMP. These mnemonics are collectively known as a computer's assembly language. Converting programs written in assembly language into something the computer can actually understand (machine language) is usually done by a computer program called an assembler; a small sketch of this correspondence between mnemonics and numeric opcodes appears at the end of this passage. A programming language is a notation system for writing the source code from which a computer program is produced. Programming languages provide various ways of specifying programs for computers to run. Unlike natural languages, programming languages are designed to permit no ambiguity and to be concise. They are purely written languages and are often difficult to read aloud. They are generally either translated into machine code by a compiler or an assembler before being run, or translated directly at run time by an interpreter. Sometimes programs are executed by a hybrid method of the two techniques. There are thousands of programming languages – some intended for general purpose programming, others useful for only highly specialized applications. Machine languages and the assembly languages that represent them (collectively termed low-level programming languages) are generally unique to the particular architecture of a computer's central processing unit (CPU).
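Because opcodes are themselves just numbers, the translation an assembler performs can be shown in a few lines. The Python sketch below reuses an invented instruction set; the opcode numbers and mnemonic names are assumptions made for illustration and do not belong to any real CPU or assembler:

    # Numeric opcodes for a hypothetical machine, and mnemonics for them.
    OPCODES = {"LOAD": 1, "ADD": 2, "STORE": 3, "JUMP": 4, "HALT": 5}

    # The same tiny program written with mnemonics...
    assembly = [("LOAD", 10), ("ADD", 11), ("STORE", 12), ("HALT", 0)]

    # ...and "assembled" into the list of numbers the machine would actually store.
    machine_code = []
    for mnemonic, operand in assembly:
        machine_code.append(OPCODES[mnemonic])   # the operation code
        machine_code.append(operand)             # the operand (here, a memory address)

    print(machine_code)   # [1, 10, 2, 11, 3, 12, 5, 0] - the program is just numbers

Each CPU family defines its own numbering of this kind, which is why machine code written for one architecture cannot be executed directly on another.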
For instance, an ARM architecture CPU (such as may be found in a smartphone or a hand-held videogame) cannot understand the machine language of an x86 CPU that might be in a PC.[j] Historically a significant number of other CPU architectures were created and saw extensive use, notably including the MOS Technology 6502 and 6510 in addition to the Zilog Z80. Although considerably easier than in machine language, writing long programs in assembly language is often difficult and is also error-prone. Therefore, most practical programs are written in more abstract high-level programming languages that are able to express the needs of the programmer more conveniently (and thereby help reduce programmer error). High-level languages are usually "compiled" into machine language (or sometimes into assembly language and then into machine language) using another computer program called a compiler.[k] High-level languages are less related to the workings of the target computer than assembly language, and more related to the language and structure of the problem(s) to be solved by the final program. It is therefore often possible to use different compilers to translate the same high-level language program into the machine language of many different types of computer. This is part of the means by which software like video games may be made available for different computer architectures such as personal computers and various video game consoles. Program design of small programs is relatively simple and involves the analysis of the problem, collection of inputs, using the programming constructs within languages, devising or using established procedures and algorithms, providing data for output devices and solutions to the problem as applicable. As problems become larger and more complex, features such as subprograms, modules, formal documentation, and new paradigms such as object-oriented programming are encountered. Large programs involving thousands of lines of code and more require formal software methodologies. The task of developing large software systems presents a significant intellectual challenge. Producing software with an acceptably high reliability within a predictable schedule and budget has historically been difficult; the academic and professional discipline of software engineering concentrates specifically on this challenge. Errors in computer programs are called "bugs". They may be benign and not affect the usefulness of the program, or have only subtle effects. However, in some cases they may cause the program or the entire system to "hang", becoming unresponsive to input such as mouse clicks or keystrokes, to completely fail, or to crash. Otherwise benign bugs may sometimes be harnessed for malicious intent by an unscrupulous user writing an exploit, code designed to take advantage of a bug and disrupt a computer's proper execution. Bugs are usually not the fault of the computer. Since computers merely execute the instructions they are given, bugs are nearly always the result of programmer error or an oversight made in the program's design.[l] Admiral Grace Hopper, an American computer scientist and developer of the first compiler, is credited with having first used the term "bugs" in computing after a dead moth was found shorting a relay in the Harvard Mark II computer in September 1947. Networking and the Internet Computers have been used to coordinate information between multiple physical locations since the 1950s. The U.S.
military's SAGE system was the first large-scale example of such a system, which led to a number of special-purpose commercial systems such as Sabre. In the 1970s, computer engineers at research institutions throughout the United States began to link their computers together using telecommunications technology. The effort was funded by ARPA (now DARPA), and the computer network that resulted was called the ARPANET. Logic gates are a common abstraction which can apply to most of the above digital or analog paradigms. The ability to store and execute lists of instructions called programs makes computers extremely versatile, distinguishing them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a minimum capability (being Turing-complete) is, in principle, capable of performing the same tasks that any other computer can perform. Therefore, any type of computer (netbook, supercomputer, cellular automaton, etc.) is able to perform the same computational tasks, given enough time and storage capacity. In the 20th century, artificial intelligence systems were predominantly symbolic: they executed code that was explicitly programmed by software developers. Machine learning models, however, have a set of parameters that are adjusted throughout training, so that the model learns to accomplish a task based on the provided data. The efficiency of machine learning (and in particular of neural networks) has rapidly improved with progress in hardware for parallel computing, mainly graphics processing units (GPUs). Some large language models are able to control computers or robots. AI progress may lead to the creation of artificial general intelligence (AGI), a type of AI that could accomplish virtually any intellectual task at least as well as humans. Professions and organizations As the use of computers has spread throughout society, there are an increasing number of careers involving computers. The need for computers to work well together and to be able to exchange information has spawned the need for many standards organizations, clubs and societies of both a formal and informal nature.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/BBC_News#cite_note-18] | [TOKENS: 8810] |
BBC News BBC News is an operational business division of the British Broadcasting Corporation (BBC) responsible for the gathering and broadcasting of news and current affairs in the UK and around the world. The department is the world's largest broadcast news organisation and generates about 120 hours of radio and television output each day, as well as online news coverage. The service has over 5,500 journalists working across its output, including in 50 foreign news bureaus where more than 250 foreign correspondents are stationed. Deborah Turness has been the CEO of news and current affairs since September 2022. A 2019 Ofcom report found that the BBC spent £136m on news during the period from April 2018 to March 2019. BBC News' domestic, global and online news divisions are housed within the largest live newsroom in Europe, in Broadcasting House in central London. Parliamentary coverage is produced and broadcast from studios in London. Through BBC English Regions, the BBC also has regional centres across England and national news centres in Northern Ireland, Scotland and Wales. All nations and English regions produce their own local news programmes and other current affairs and sport programmes. The BBC is a quasi-autonomous corporation authorised by royal charter, making it operationally independent of the government. As of 2024, the BBC reaches an average of 450 million people per week, with the BBC World Service accounting for 320 million people. History This is London calling – 2LO calling. Here is the first general news bulletin, copyright by Reuters, Press Association, Exchange Telegraph and Central News. — BBC news programme opening during the 1920s The British Broadcasting Company broadcast its first radio bulletin from radio station 2LO on 14 November 1922. Wishing to avoid competition, newspaper publishers persuaded the government to ban the BBC from broadcasting news before 7 pm, and to force it to use wire service copy instead of reporting on its own. The BBC gradually gained the right to edit the copy and, in 1934, created its own news operation. However, it could not broadcast news before 6 pm until World War II. In addition to news, Gaumont British and Movietone cinema newsreels had been broadcast on the TV service since 1936, with the BBC producing its own equivalent Television Newsreel programme from January 1948. A weekly Children's Newsreel was inaugurated on 23 April 1950, to around 350,000 receivers. The network began simulcasting its radio news on television in 1946, with a still picture of Big Ben. Televised bulletins began on 5 July 1954, broadcast from leased studios within Alexandra Palace in London. The public's interest in television and live events was stimulated by Elizabeth II's coronation in 1953. It is estimated that up to 27 million people viewed the programme in the UK, overtaking radio's audience of 12 million for the first time. Those live pictures were fed from 21 cameras in central London to Alexandra Palace for transmission, and then on to other UK transmitters opened in time for the event. That year, there were around two million TV licences held in the UK, rising to over three million the following year, and four and a half million by 1955. Television news, although physically separate from its radio counterpart, was still firmly under radio news' control in the 1950s.
Correspondents provided reports for both outlets, and the first televised bulletin, shown on 5 July 1954 on the then BBC television service and presented by Richard Baker, involved his providing narration off-screen while stills were shown. This was then followed by the customary Television Newsreel with a recorded commentary by John Snagge (and on other occasions by Andrew Timothy). On-screen newsreaders were introduced a year later in 1955 – Kenneth Kendall (the first to appear in vision), Robert Dougall, and Richard Baker—three weeks before ITN's launch on 21 September 1955. Mainstream television production had started to move out of Alexandra Palace in 1950 to larger premises – mainly at Lime Grove Studios in Shepherd's Bush, west London – taking Current Affairs (then known as Talks Department) with it. It was from here that the first Panorama, a new documentary programme, was transmitted on 11 November 1953, with Richard Dimbleby becoming anchor in 1955. In 1958, Hugh Carleton Greene became head of News and Current Affairs. On 1 January 1960, Greene became Director-General. Greene made changes that were aimed at making BBC reporting more similar to its competitor ITN, which had been highly rated by study groups held by Greene. A newsroom was created at Alexandra Palace, television reporters were recruited and given the opportunity to write and voice their own scripts, without having to cover stories for radio too. On 20 June 1960, Nan Winton, the first female BBC network newsreader, appeared in vision. 19 September 1960 saw the start of the radio news and current affairs programme The Ten O'clock News. BBC2 started transmission on 20 April 1964 and began broadcasting a new show, Newsroom. The World at One, a lunchtime news programme, began on 4 October 1965 on the then Home Service, and the year before News Review had started on television. News Review was a summary of the week's news, first broadcast on Sunday, 26 April 1964 on BBC 2 and harking back to the weekly Newsreel Review of the Week, produced from 1951, to open programming on Sunday evenings–the difference being that this incarnation had subtitles for the deaf and hard-of-hearing. As this was the decade before electronic caption generation, each superimposition ("super") had to be produced on paper or card, synchronised manually to studio and news footage, committed to tape during the afternoon, and broadcast early evening. Thus Sundays were no longer a quiet day for news at Alexandra Palace. The programme ran until the 1980s – by then using electronic captions, known as Anchor – to be superseded by Ceefax subtitling (a similar Teletext format), and the signing of such programmes as See Hear (from 1981). On Sunday 17 September 1967, The World This Weekend, a weekly news and current affairs programme, launched on what was then Home Service, but soon-to-be Radio 4. Preparations for colour began in the autumn of 1967 and on Thursday 7 March 1968 Newsroom on BBC2 moved to an early evening slot, becoming the first UK news programme to be transmitted in colour – from Studio A at Alexandra Palace. News Review and Westminster (the latter a weekly review of Parliamentary happenings) were "colourised" shortly after. However, much of the insert material was still in black and white, as initially only a part of the film coverage shot in and around London was on colour reversal film stock, and all regional and many international contributions were still in black and white. 
Colour facilities at Alexandra Palace were technically very limited for the next eighteen months, as it had only one RCA colour Quadruplex videotape machine and, eventually, two Pye plumbicon colour telecines – although the news colour service started with just one. Black and white national bulletins on BBC 1 continued to originate from Studio B on weekdays, along with Town and Around, the London regional "opt out" programme broadcast throughout the 1960s (and the BBC's first regional news programme for the South East), until it started to be replaced by Nationwide on Tuesday to Thursday from Lime Grove Studios early in September 1969. Town and Around was never to make the move to Television Centre – instead it became London This Week, which aired on Mondays and Fridays only, from the new TVC studios. The BBC moved news production out of Alexandra Palace in 1969, and BBC Television News resumed operations the next day with a lunchtime bulletin on BBC1 – in black and white – from Television Centre, where it remained until March 2013. This move to a smaller studio with better technical facilities allowed Newsroom and News Review to replace back projection with colour-separation overlay. During the 1960s, satellite communication became possible; however, it was some years before digital line-store conversion was able to undertake the process seamlessly. On 14 September 1970, the first Nine O'Clock News was broadcast on television. Robert Dougall presented the first week from studio N1 – described by The Guardian as "a sort of polystyrene padded cell" – the bulletin having been moved from the earlier time of 20.50 as a response to the ratings achieved by ITN's News at Ten, introduced three years earlier on the rival ITV. Richard Baker and Kenneth Kendall presented subsequent weeks, thus echoing those first television bulletins of the mid-1950s. Angela Rippon became the first female news presenter of the Nine O'Clock News in 1975. Her work outside the news was controversial at the time; she appeared on The Morecambe and Wise Christmas Show in 1976, singing and dancing. The first edition of John Craven's Newsround, initially intended only as a short series and later renamed just Newsround, came from studio N3 on 4 April 1972. Afternoon television news bulletins during the mid to late 1970s were broadcast from the BBC newsroom itself, rather than one of the three news studios. The newsreader would present to camera while sitting on the edge of a desk; behind him staff would be seen working busily at their desks. This period coincided with the Nine O'Clock News getting its next makeover, which used a CSO background of the newsroom from that very same camera each weekday evening. Also in the mid-1970s, the late night news on BBC2 was briefly renamed Newsnight, but this was neither to last nor to be the same programme as the one known today, which was launched in 1980; it soon reverted to being just a news summary, while the early evening BBC2 news was expanded to become Newsday. News on radio also changed in the 1970s, and on Radio 4 in particular, brought about by the arrival of new editor Peter Woon from television news and the implementation of the Broadcasting in the Seventies report. The changes included the introduction of correspondents into news bulletins, where previously only a newsreader would present, as well as the inclusion of content gathered in the preparation process.
New programmes, PM and The World Tonight, were also added to the daily schedule as part of the plan for the station to become a "wholly speech network". Newsbeat launched as the news service on Radio 1 on 10 September 1973. On 23 September 1974, a teletext system was launched to bring text-only news content to television screens. Engineers originally began developing such a system to bring news to deaf viewers, but the system was expanded. The Ceefax service became much more diverse before it ceased on 23 October 2012: it not only had subtitling for all channels, it also gave information such as weather, flight times and film reviews. By the end of the decade, the practice of shooting on film for inserts in news broadcasts was declining, with the introduction of ENG technology into the UK. The equipment would gradually become less cumbersome – the BBC's first attempts, in the latter half of the decade, had used a Philips colour camera with a backpack base station and a separate portable Sony U-matic recorder. In 1980, the Iranian Embassy Siege was shot electronically by the BBC Television News outside broadcast team, and the work of reporter Kate Adie, broadcasting live from Prince's Gate, was nominated for the BAFTA for actuality coverage, but was beaten by ITN for the 1980 award. Newsnight, the news and current affairs programme, was due to go on air on 23 January 1980, although trade union disagreements meant that its launch from Lime Grove was postponed by a week. On 27 August 1981 Moira Stuart became the first African Caribbean female newsreader to appear on British television. By 1982, ENG technology had become sufficiently reliable for Bernard Hesketh to use an Ikegami camera to cover the Falklands War, coverage for which he won the "Royal Television Society Cameraman of the Year" award and a BAFTA nomination – the first time that BBC News had relied upon an electronic camera, rather than film, in a conflict zone. BBC News won the BAFTA for its actuality coverage; however, the event is remembered in television terms for Brian Hanrahan's reporting, in which he coined the phrase "I'm not allowed to say how many planes joined the raid, but I counted them all out and I counted them all back" to circumvent restrictions, and which has been cited as an example of good reporting under pressure. The first BBC breakfast television programme, Breakfast Time, also launched during the 1980s, on 17 January 1983, from Lime Grove Studio E, two weeks before its ITV rival TV-am. Frank Bough, Selina Scott, and Nick Ross helped to wake viewers with a relaxed style of presenting. The Six O'Clock News first aired on 3 September 1984, eventually becoming the most watched news programme in the UK (however, since 2006 it has been overtaken by the BBC News at Ten). In October 1984, images of millions of people starving to death in the Ethiopian famine were shown in Michael Buerk's Six O'Clock News reports. The BBC News crew were the first to document the famine, with Buerk's report on 23 October describing it as "a biblical famine in the 20th century" and "the closest thing to hell on Earth". The BBC News report shocked Britain, motivating its citizens to inundate relief agencies, such as Save the Children, with donations, and to bring global attention to the crisis in Ethiopia. The news report was also watched by Bob Geldof, who would organise the charity single "Do They Know It's Christmas?" to raise money for famine relief followed by the Live Aid concert in July 1985.
Starting in 1981, the BBC gave a common theme to its main news bulletins with new electronic titles – a set of computer-animated "stripes" forming a circle on a red background with a "BBC News" typescript appearing below the circle graphics, and a theme tune consisting of brass and keyboards. The Nine used a similar (striped) number 9. The red background was replaced by a blue one from 1985 until 1987. By 1987, the BBC had decided to re-brand its bulletins and established individual styles again for each one with differing titles and music, with the weekend and holiday bulletins branded in a similar style to the Nine, although the "stripes" introduction continued to be used until 1989 on occasions where a news bulletin was screened out of the running order of the schedule. In 1987, John Birt resurrected the practice of correspondents working for both TV and radio with the introduction of bi-media journalism. During the 1990s, a wider range of services began to be offered by BBC News, with the split of BBC World Service Television to become BBC World (news and current affairs), and BBC Prime (light entertainment). Content for a 24-hour news channel was thus required, followed in 1997 by the launch of the domestic equivalent, BBC News 24. Rather than set bulletins, ongoing reports and coverage were needed to keep both channels functioning, which meant a greater emphasis on budgeting for both. In 1998, after 66 years at Broadcasting House, the BBC Radio News operation moved to BBC Television Centre. New technology, provided by Silicon Graphics, came into use in 1993 for a re-launch of the main BBC 1 bulletins, creating a virtual set which appeared to be much larger than it was physically. The relaunch also brought all bulletins into the same style of set with only small changes in colouring, titles, and music to differentiate each. A computer-generated cut-glass sculpture of the BBC coat of arms was the centrepiece of the programme titles until the large-scale corporate rebranding of news services in 1999. In November 1997, BBC News Online was launched, following individual webpages for major news events such as the 1996 Olympic Games, 1997 general election, and the death of Princess Diana. In 1999, the biggest relaunch occurred, with BBC One bulletins, BBC World, BBC News 24, and BBC News Online all adopting a common style. One of the most significant changes was the gradual adoption of the corporate image by the BBC regional news programmes, giving a common style across local, national and international BBC television news. This also included Newyddion, the main news programme of Welsh language channel S4C, produced by BBC News Wales. Following the relaunch of BBC News in 1999, regional headlines were included at the start of the BBC One news bulletins in 2000. The English regions did however lose five minutes at the end of their bulletins, due to a new headline round-up at 18:55. 2000 also saw the Nine O'Clock News moved to the later time of 22:00. This was in response to ITN, which had just moved its popular News at Ten programme to 23:00. ITN briefly brought back News at Ten, but following poor ratings when head-to-head against the BBC's Ten O'Clock News, the ITN bulletin was moved to 22:30, where it remained until 14 January 2008. The departure of Peter Sissons and Michael Buerk from the Ten O'Clock News led to changes in the BBC One bulletin presenting team on 20 January 2003. 
The Six O'Clock News became double-headed with George Alagiah and Sophie Raworth after Huw Edwards and Fiona Bruce moved to present the Ten. A new set design featuring a projected fictional newsroom backdrop was introduced, followed on 16 February 2004 by new programme titles to match those of BBC News 24. BBC News 24 and BBC World introduced a new style of presentation in December 2003, which was slightly altered on 5 July 2004 to mark 50 years of BBC Television News. On 7 March 2005 director general Mark Thompson launched the "Creative Futures" project to restructure the organisation. The individual positions of editor of the One and Six O'Clock News were replaced by a new daytime position in November 2005. Kevin Bakhurst became the first Controller of BBC News 24, replacing the position of editor. Amanda Farnsworth became daytime editor while Craig Oliver was later named editor of the Ten O'Clock News. Bulletins received new titles and a new set design in May 2006, to allow for Breakfast to move into the main studio for the first time since 1997. The new set featured Barco videowall screens with a background of the London skyline used for main bulletins and originally an image of cirrus clouds against a blue sky for Breakfast. This was later replaced following viewer criticism. The studio bore similarities to that of the ITN-produced ITV News from 2004, though ITN uses a CSO virtual studio rather than the actual screens at BBC News. BBC News became part of a new BBC Journalism group in November 2006 as part of a restructuring of the BBC. The then-Director of BBC News, Helen Boaden, reported to the then-Deputy Director-General and head of the journalism group, Mark Byford, until he was made redundant in 2010. On 18 October 2007, Director-General Mark Thompson announced a six-year plan, "Delivering Creative Futures" (based on his project begun in March 2005), merging the television current affairs department into a new "News Programmes" division. The plan, announced in response to a £2 billion shortfall in funding, would, he said, deliver "a smaller but fitter BBC" in the digital age, by cutting its payroll and, in 2013, selling Television Centre. The various separate newsrooms for television, radio and online operations were merged into a single multimedia newsroom. Programme-making within the newsrooms was brought together to form a multimedia programme-making department. BBC World Service director Peter Horrocks said that the changes would achieve efficiency at a time of cost-cutting at the BBC. In his blog, he wrote that using the same resources across the various broadcast media would mean either that fewer stories could be covered or, if more stories were followed, that there would be fewer ways to broadcast them. A new graphics and video playout system was introduced for production of television bulletins in January 2007. This coincided with a new structure to BBC World News bulletins, with editors favouring a section devoted to analysing the news stories reported on. The first new BBC News bulletin since the Six O'Clock News was announced in July 2007 following a successful trial in the Midlands. The summary, lasting 90 seconds, has been broadcast at 20:00 on weekdays since December 2007 and bears similarities to 60 Seconds on BBC Three, but also includes headlines from the various BBC regions and a weather summary. 
As part of a long-term cost-cutting programme, bulletins were renamed the BBC News at One, Six and Ten respectively in April 2008, while BBC News 24 was renamed BBC News and moved into the same studio as the bulletins at BBC Television Centre. BBC World was renamed BBC World News and regional news programmes were also updated with the new presentation style, designed by Lambie-Nairn. 2008 also saw tri-media introduced across TV, radio, and online. The studio moves also meant that Studio N9, previously used for BBC World, was closed, and operations moved to the previous studio of BBC News 24. Studio N9 was later refitted to match the new branding, and was used for the BBC's UK local elections and European elections coverage in early June 2009. A strategy review of the BBC in March 2010 confirmed that having "the best journalism in the world" would form one of five key editorial policies, as part of changes subject to public consultation and BBC Trust approval. After a period of suspension in late 2012, Helen Boaden ceased to be the Director of BBC News. On 16 April 2013, incoming BBC Director-General Tony Hall named James Harding, a former editor of The Times of London, as Director of News and Current Affairs. From August 2012 to March 2013, all news operations moved from Television Centre to new facilities in the refurbished and extended Broadcasting House, in Portland Place. The move also included the BBC World Service, which left Bush House following the expiry of the BBC's lease. This new extension to the north and east, referred to as "New Broadcasting House", includes several new state-of-the-art radio and television studios centred around an 11-storey atrium. The move began with the domestic programme The Andrew Marr Show on 2 September 2012, and concluded with the move of the BBC News channel and domestic news bulletins on 18 March 2013. The newsroom houses all domestic bulletins and programmes on both television and radio, as well as the BBC World Service international radio networks and the BBC World News international television channel. BBC News and CBS News established an editorial and newsgathering partnership in 2017, replacing an earlier long-standing partnership between BBC News and ABC News. In an October 2018 Simmons Research survey of 38 news organisations, BBC News was ranked the fourth most trusted news organisation by Americans, behind CBS News, ABC News and The Wall Street Journal. In January 2020 the BBC announced a BBC News savings target of £80 million per year by 2022, involving about 450 staff reductions from its staff of around 6,000 at the time. BBC director of news and current affairs Fran Unsworth said there would be further moves toward digital broadcasting, in part to attract back a youth audience, and more pooling of reporters to stop separate teams covering the same news. A further 70 staff reductions were announced in July 2020. BBC Three began airing the news programme The Catch Up in February 2022. It is presented by Levi Jouavel, Kirsty Grant, and Callum Tulley and aims to help the channel's target audience (16 to 34-year-olds) make sense of the world around them while also highlighting optimistic stories. Compared to its predecessor 60 Seconds, The Catch Up is three times longer, running for about three minutes, and does not air at weekends. According to its annual report, as of December 2021 India has the largest number of people using BBC services in the world. 
In May 2025, following the earthquake that hit Myanmar and Thailand, the Burmese service began broadcasting a television news bulletin (BBC News Myanmar) using a vacated Voice of America satellite frequency. Programming and reporting In November 2023, BBC News joined with the International Consortium of Investigative Journalists, Paper Trail Media [de] and 69 media partners including Distributed Denial of Secrets and the Organised Crime and Corruption Reporting Project (OCCRP), and more than 270 journalists in 55 countries and territories, to produce the 'Cyprus Confidential' report on the financial network which supports the regime of Vladimir Putin, mostly with connections to Cyprus; the report showed Cyprus to have strong links with high-ranking figures in the Kremlin, some of whom have been sanctioned. Government officials including Cyprus president Nikos Christodoulides and European lawmakers began responding to the investigation's findings in less than 24 hours, calling for reforms and launching probes. BBC News is responsible for the news programmes and documentary content on the BBC's general television channels, as well as the news coverage on the BBC News Channel in the UK, and 22 hours of programming for the corporation's international BBC World News channel. Coverage for BBC Parliament is carried out on behalf of the BBC at Millbank Studios, though BBC News provides editorial and journalistic content. BBC News content is also output onto the BBC's digital interactive television services under the BBC Red Button brand, and until 2012, on the Ceefax teletext system. The music used on all BBC television news programmes was composed by David Lowe and introduced as part of the re-branding which commenced in 1999; it features the 'BBC pips'. The general theme was used on bulletins on BBC One, News 24, BBC World and local news programmes in the BBC's Nations and Regions. Lowe was also responsible for the music on Radio One's Newsbeat. The theme has had several changes since 1999, the latest in March 2013. The BBC Arabic Television news channel launched on 11 March 2008, and a Persian-language channel followed on 14 January 2009, broadcasting from the Peel wing of Broadcasting House; both include news, analysis, interviews, sports and cultural programmes and are run by the BBC World Service and funded from a grant-in-aid from the British Foreign Office (and not the television licence fee). The BBC Verify service was launched in 2023 to fact-check news stories, followed by BBC Verify Live in 2025. BBC Radio News produces bulletins for the BBC's national radio stations and provides content for local BBC radio stations via the General News Service (GNS), a BBC-internal news distribution service. BBC News does not produce the BBC's regional news bulletins, which are produced individually by the BBC nations and regions themselves. The BBC World Service broadcasts to some 150 million people in English as well as in 27 other languages across the globe. BBC Radio News is a patron of the Radio Academy. BBC News Online is the BBC's news website. Launched in November 1997, it is one of the most popular news websites, with 1.2 billion website visits in April 2021, as well as being used by 60% of the UK's internet users for news. The website contains international news coverage as well as entertainment, sport, science, and political news. Mobile apps for Android, iOS and Windows Phone systems have been provided since 2010. 
Many television and radio programmes are also available to view on the BBC iPlayer and BBC Sounds services. The BBC News channel is also available to view 24 hours a day, while video and radio clips are also available within online news articles. In October 2019, BBC News Online launched a mirror on the dark web anonymity network Tor in an effort to circumvent censorship. Criticism The BBC is required by its charter to be free from both political and commercial influence and answers only to its viewers and listeners. This political objectivity is sometimes questioned. For instance, The Daily Telegraph (3 August 2005) carried a letter from the KGB defector Oleg Gordievsky, referring to it as "The Red Service". Books have been written on the subject, including anti-BBC works like Truth Betrayed by W J West and The Truth Twisters by Richard Deacon. The BBC has been accused of bias by Conservative MPs. The BBC's Editorial Guidelines on Politics and Public Policy state that while "the voices and opinions of opposition parties must be routinely aired and challenged", "the government of the day will often be the primary source of news". The BBC is regularly accused by the government of the day of bias in favour of the opposition and, by the opposition, of bias in favour of the government. Similarly, during times of war, the BBC is often accused by the UK government, or by strong supporters of British military campaigns, of being overly sympathetic to the view of the enemy. An edition of Newsnight at the start of the Falklands War in 1982 was described as "almost treasonable" by John Page, MP, who objected to Peter Snow saying "if we believe the British". During the first Gulf War, critics of the BBC took to using the satirical name "Baghdad Broadcasting Corporation". During the Kosovo War, the BBC were labelled the "Belgrade Broadcasting Corporation" (suggesting favouritism towards the FR Yugoslavia government over ethnic Albanian rebels) by British ministers, although Slobodan Milosević (then FRY president) claimed that the BBC's coverage had been biased against his nation. Conversely, some of those who style themselves anti-establishment in the United Kingdom or who oppose foreign wars have accused the BBC of pro-establishment bias or of refusing to give an outlet to "anti-war" voices. Following the 2003 invasion of Iraq, a study by the Cardiff University School of Journalism of the reporting of the war found that nine out of 10 references to weapons of mass destruction during the war assumed that Iraq possessed them, and only one in 10 questioned this assumption. It also found that, out of the main British broadcasters covering the war, the BBC was the most likely to use the British government and military as its source. It was also the least likely to use independent sources, like the Red Cross, who were more critical of the war. When it came to reporting Iraqi casualties, the study found fewer reports on the BBC than on the other three main channels. The report's author, Justin Lewis, wrote "Far from revealing an anti-war BBC, our findings tend to give credence to those who criticised the BBC for being too sympathetic to the government in its war coverage. Either way, it is clear that the accusation of BBC anti-war bias fails to stand up to any serious or sustained analysis." Prominent BBC appointments are constantly assessed by the British media and political establishment for signs of political bias. 
The appointment of Greg Dyke as Director-General was highlighted by press sources because Dyke was a Labour Party member and former activist, as well as a friend of Tony Blair. The BBC's former Political Editor, Nick Robinson, was some years ago a chairman of the Young Conservatives and did, as a result, attract informal criticism from the former Labour government, but his predecessor Andrew Marr faced similar claims from the right because he was editor of The Independent, a liberal-leaning newspaper, before his appointment in 2000. Mark Thompson, former Director-General of the BBC, admitted the organisation had been biased "towards the left" in the past. He said, "In the BBC I joined 30 years ago, there was, in much of current affairs, in terms of people's personal politics, which were quite vocal, a massive bias to the left". He then added, "The organization did struggle then with impartiality. Now it is a completely different generation. There is much less overt tribalism among the young journalists who work for the BBC." Following the EU referendum in 2016, some critics suggested that the BBC was biased in favour of leaving the EU. For instance, in 2018, the BBC received complaints from people who took issue with the BBC not sufficiently covering anti-Brexit marches while giving smaller-scale events hosted by former UKIP leader Nigel Farage more airtime. On the other hand, a poll released by YouGov showed that 45% of people who voted to leave the EU thought that the BBC was 'actively anti-Brexit', compared to 13% of the same voters who thought it was pro-Brexit. In 2008, BBC Hindi was criticised by some Indian outlets for referring to the terrorists who carried out the 2008 Mumbai attacks as "gunmen". The response to this added to prior criticism from some Indian commentators suggesting that the BBC may have an Indophobic bias. In March 2015, the BBC was criticised for a BBC Storyville documentary, "India's Daughter", which interviewed one of the men convicted in the 2012 Delhi gang rape case. In spite of a broadcast ban ordered by an Indian court, the BBC still aired the documentary outside India. BBC News was at the centre of a political controversy following the 2003 invasion of Iraq. Three BBC News reports (Andrew Gilligan's on Today, Gavin Hewitt's on The Ten O'Clock News and another on Newsnight) quoted an anonymous source that stated the British government (particularly the Prime Minister's office) had embellished the September Dossier with misleading exaggerations of Iraq's weapons of mass destruction capabilities. The government denounced the reports and accused the corporation of poor journalism. In subsequent weeks the corporation stood by the report, saying that it had a reliable source. Following intense media speculation, David Kelly was named in the press as the source for Gilligan's story on 9 July 2003. Kelly was found dead, by suicide, in a field close to his home early on 18 July. An inquiry led by Lord Hutton was announced by the British government the following day to investigate the circumstances leading to Kelly's death; the inquiry concluded that "Dr. Kelly took his own life." In his report on 28 January 2004, Lord Hutton concluded that Gilligan's original accusation was "unfounded" and the BBC's editorial and management processes were "defective". In particular, the report criticised the chain of management that caused the BBC to defend its story. 
The BBC Director of News, Richard Sambrook, the report said, had accepted Gilligan's word that his story was accurate in spite of his notes being incomplete. Davies had then told the BBC Board of Governors that he was happy with the story and told the Prime Minister that a satisfactory internal inquiry had taken place. The Board of Governors, under the guidance of its chairman, Gavyn Davies, accepted that further investigation of the Government's complaints was unnecessary. Because of the criticism in the Hutton report, Davies resigned on the day of publication. BBC News faced an important test, reporting on itself with the publication of the report, but by common consent (of the Board of Governors) managed this "independently, impartially and honestly". Davies' resignation was followed by the resignation of Director-General Greg Dyke the following day, and the resignation of Gilligan on 30 January. While the affair was undoubtedly a traumatic experience for the corporation, an ICM poll in April 2003 indicated that it had sustained its position as the best and most trusted provider of news. The BBC has faced accusations of holding both anti-Israel and anti-Palestinian bias. Douglas Davis, the London correspondent of The Jerusalem Post, has described the BBC's coverage of the Arab–Israeli conflict as "a relentless, one-dimensional portrayal of Israel as a demonic, criminal state and Israelis as brutal oppressors [which] bears all the hallmarks of a concerted campaign of vilification that, wittingly or not, has the effect of delegitimising the Jewish state and pumping oxygen into a dark old European hatred that dared not speak its name for the past half-century." However, two large independent studies, one conducted by Loughborough University and the other by Glasgow University's Media Group, concluded that Israeli perspectives are given greater coverage. Critics of the BBC argue that the Balen Report proves systematic bias against Israel in headline news programming. The Daily Mail and The Daily Telegraph criticised the BBC for spending hundreds of thousands of British taxpayers' pounds on preventing the report from being released to the public. Jeremy Bowen, the BBC's Middle East editor, was singled out specifically for bias by the BBC Trust, which concluded that he had violated "BBC guidelines on accuracy and impartiality." An independent panel appointed by the BBC Trust was set up in 2006 to review the impartiality of the BBC's coverage of the Israeli–Palestinian conflict. The panel's assessment was that "apart from individual lapses, there was little to suggest deliberate or systematic bias." While noting a "commitment to be fair accurate and impartial" and praising much of the BBC's coverage, the independent panel concluded "that BBC output does not consistently give a full and fair account of the conflict. In some ways the picture is incomplete and, in that sense, misleading." It noted that "the failure to convey adequately the disparity in the Israeli and Palestinian experience, [reflects] the fact that one side is in control and the other lives under occupation". Writing in the Financial Times, Philip Stephens, one of the panellists, later accused the BBC's director-general, Mark Thompson, of misrepresenting the panel's conclusions. He further opined: "My sense is that BBC news reporting has also lost a once iron-clad commitment to objectivity and a necessary respect for the democratic process. If I am right, the BBC, too, is lost". Mark Thompson published a rebuttal in the FT the next day. 
The description by one BBC correspondent reporting on the funeral of Yasser Arafat that she had been left with tears in her eyes led to further questions of impartiality, particularly from Martin Walker, who, in a guest opinion piece in The Times, picked out the apparent case of Fayad Abu Shamala, the BBC Arabic Service correspondent, who told a Hamas rally on 6 May 2001 that journalists in Gaza were "waging the campaign shoulder to shoulder together with the Palestinian people". Walker argues that the independent inquiry was flawed for two reasons. Firstly, because the time period over which it was conducted (August 2005 to January 2006) surrounded the Israeli withdrawal from Gaza and Ariel Sharon's stroke, which produced more positive coverage than usual. Furthermore, he wrote, the inquiry only looked at the BBC's domestic coverage, and excluded output on the BBC World Service and BBC World. Tom Gross accused the BBC of glorifying Hamas suicide bombers, and condemned its policy of inviting guests such as Jenny Tonge and Tom Paulin, who have compared Israeli soldiers to Nazis. Writing for the BBC, Paulin said Israeli soldiers should be "shot dead" like Hitler's SS, and said he could "understand how suicide bombers feel". The BBC also faced criticism for not airing a Disasters Emergency Committee aid appeal for Palestinians who suffered in Gaza during the 22-day war there between late 2008 and early 2009. Most other major UK broadcasters did air this appeal, but rival Sky News did not. British journalist Julie Burchill has accused the BBC of creating a "climate of fear" for British Jews over its "excessive coverage" of Israel compared to other nations. In light of the Gaza war, the BBC suspended seven Arab journalists over allegations of expressing support for Hamas via social media. BBC News and ABC News previously shared video segments and reporters as needed in producing their newscasts, with the BBC showing ABC World News Tonight with David Muir in the UK. However, in July 2017, the BBC announced a new partnership with CBS News that allows both organisations to share video, editorial content, and additional newsgathering resources in New York, London, Washington and around the world. BBC News subscribes to wire services from leading international agencies including PA Media (formerly Press Association), Reuters, and Agence France-Presse. In April 2017, the BBC dropped Associated Press in favour of an enhanced service from AFP. BBC News reporters and broadcasts have in the past been, and in some cases still are, banned in several countries, primarily for reporting which has been unfavourable to the ruling government. For example, correspondents were banned by the former apartheid regime of South Africa. The BBC was banned in Zimbabwe under Mugabe for eight years, branded a terrorist organisation, until being allowed to operate again over a year after the 2008 elections. The BBC was banned in Burma (officially Myanmar) after its coverage and commentary on anti-government protests there in September 2007. The ban was lifted four years later in September 2011. Other cases have included Uzbekistan, China, and Pakistan. BBC Persian, the BBC's Persian-language news site, was blocked from the Iranian internet in 2006. The BBC News website was made available in China again in March 2008, but as of October 2014 was blocked again. 
In June 2015, the Rwandan government placed an indefinite ban on BBC broadcasts following the airing of a controversial documentary regarding the 1994 Rwandan genocide, Rwanda's Untold Story, broadcast on BBC2 on 1 October 2014. The UK's Foreign Office recognised "the hurt caused in Rwanda by some parts of the documentary". In February 2017, reporters from the BBC (as well as the Daily Mail, The New York Times, Politico, CNN, and others) were denied access to a United States White House briefing. In 2017, BBC India was banned for a period of five years from covering all national parks and sanctuaries in India. Following the withdrawal of CGTN's UK broadcaster licence on 4 February 2021 by Ofcom, China banned BBC News from airing in China. See also References External links |
======================================== |
[SOURCE: https://www.wired.com/video/watch/inside-the-new-york-city-date-night-for-ai-lovers] | [TOKENS: 370] |
Inside the New York City Date Night for AI Lovers Released on 02/13/2026 Would you take your AI lover out on a date? Ahead of Valentine's Day, Eva AI, an AI companion company, set up a popup cafe in New York City allowing people to bring their AI partners and chill in a romantic setting. Visitors could also try speed dating with some of Eva AI's 100 AI companions. The app allows you to chat or even live video call with your AI companion. According to a 2025 study from the Kinsey Institute, a leading sex research center, about 16% of respondents said that they had experimented with AI partners, but there weren't that many actual people at the cafe who were there with their AI partners. One of the people we did talk to says he talks to Eva AI's companions to improve his communication. Some studies have also shown that people feel a certain amount of shame or social stigma from having AI relationships, which could explain why there weren't that many people there. While this may seem fringe for now, with AI relationships on the rise, it may not be long before we see more people taking their AI partners out for a date night. So, where would you take your AI boo? |
======================================== |
[SOURCE: https://www.theverge.com/podcast/880778/ai-talent-war-hiring-frenzy-openai-anthropic-ipo] | [TOKENS: 2640] |
Money no longer matters to AI’s top talent. The AI industry is rife with defections, FOMO, and radical mission statements. It’s about to get supercharged. By Nilay Patel, Editor-in-Chief, Feb 19, 2026, 3:00 PM UTC. Nilay Patel is editor-in-chief of The Verge, host of the Decoder podcast, and co-host of The Vergecast. Today on Decoder we’re going to talk about the war for AI talent. Right now, the hottest job market on the planet is for AI researchers. The vast majority of these people are concentrated into a small number of hugely valuable, extremely fast-growing companies in the San Francisco Bay Area. Nowadays, such companies are paying some of the highest salaries in the history of the tech industry to poach researchers from one another. It feels like every time one of these AI researchers leaves one company for another, they tell us exactly why. Sometimes they’re simply resigning to go be a poet. Sometimes they’re chasing a mission. Sometimes they’re worried that AI is going to imperil humanity, destroy all jobs, and plunge the world into chaos. They’re really saying these things. They’re publishing these notes on X, in blog posts, or in the case of one former OpenAI safety researcher by writing a full New York Times op-ed. I’ve been dying to really dig in and try to unpack what’s going on with all these talent moves in AI. So my guest today is Verge senior AI reporter Hayden Field, who’s been covering the revolving door of the AI industry really closely and also the broader culture that’s motivating the AI workers to jump ship and the companies that are ruthlessly trying to hire them. Those motivations vary. Sure, all these people are paid extravagant salaries, but as you’ll hear Hayden say, a stronger motivating force is ideology and mission. The people working on AI, by and large, believe that what they’re doing is going to radically change the world, and they’re not really in desperate need of more money. So that really changes the incentive structures that might push people to leave, say, OpenAI for Anthropic, or to quit Elon Musk’s xAI now that it’s been acquired by SpaceX. At the same time, the incentives of the AI companies themselves are going from raising money to making money. Reporting suggests OpenAI and maybe even Anthropic could go public this year, and doing so would create a historic amount of wealth. It would also put new kinds of pressure on these companies to be more transparent about how they spend money and to be much more accountable for returning on the huge investments that they’ve raised so far. There’s a lot in this conversation. The AI industry right now is full of drama. There’s big characters, bitter rivalries, lots of money, and really, really long blog posts about the end of the world. If you’d like to read more about what we discussed in this episode, check out these links: What’s behind the mass exodus at xAI? | The Verge; OpenClaw founder Peter Steinberger is joining OpenAI | The Verge; Two more xAI co-founders leave after the SpaceX merger | The Verge; AI safety leader says ‘world is in peril’ and quits to study poetry | BBC; OpenAI is making the mistakes Facebook made. I quit. | The New York Times; Anthropic’s chief on AI: ‘We don’t know if the models are conscious’ | The New York Times; Meet the one woman Anthropic trusts to teach AI morals | The Wall Street Journal; OpenAI plans fourth-quarter IPO in race to beat Anthropic to market | The Wall Street Journal. Questions or comments about this episode? Hit us up at decoder@theverge.com. We really do read every email! Decoder with Nilay Patel: a podcast from The Verge about big ideas and other problems. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Facebook_onion_address] | [TOKENS: 496] |
Contents Facebook onion address The Facebook onion address, located at https://www.facebookwkhpilnemxj7asaniu7vnjjbiltxjqhye3mhbshg7kx5tfyd.onion/ (formerly facebookcorewwwi.onion), is a site that allows access to Facebook through the Tor protocol, using its .onion top-level domain. Purported benefits Prior to the release of an official .onion domain, accessing Facebook through Tor would sometimes lead to error messages and an inability to access the website. ProPublica explicitly referenced the existence of Facebook's .onion site when they started their own onion service. The site also makes it easier for Facebook to differentiate between accounts that have been caught up in a botnet and those that legitimately access Facebook through Tor. As of its 2014 release, the site was still in early stages, with much work remaining to polish the code for Tor access. It has been speculated that other companies will follow suit and release their own Tor-accessible sites. History In October 2014, Facebook announced that users could connect to the website through a Tor onion service using the privacy-protecting Tor browser and encrypted using HTTPS. Announcing the feature, Alec Muffett said, "Facebook's onion address provides a way to access Facebook through Tor without losing the cryptographic protections provided by the Tor cloud. ... it provides end-to-end communication, from your browser directly into a Facebook datacentre." The network address it used at the time – facebookcorewwwi.onion – is a backronym that stands for Facebook's Core WWW Infrastructure. In April 2016, it had been used by over 1 million people monthly, up from 525,000 in 2015. Google does not operate sites through Tor, and Facebook has been applauded for allowing such access, which makes it available in countries that actively try to block Facebook. In May 2021 it was updated to a version 3 onion address at facebookwkhpilnemxj7asaniu7vnjjbiltxjqhye3mhbshg7kx5tfyd.onion. This was due to the Tor Project's planned July 2021 deprecation of v2 addresses, owing to their inherent crackability using brute-force attacks on modern hardware that did not exist at the time of their introduction (many different private keys are known to map to the same v2 address due to hash collisions). References External links |
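The v2 weakness mentioned above follows directly from how those legacy addresses were built: the 16-character label is simply the first 80 bits of the SHA-1 digest of the service's public key, base32-encoded. Anyone generating keys in a loop can therefore search for a memorable prefix – which is how a vanity name like facebookcorewwwi could be found – and, with enough modern hardware, search for different keys that map to the same label. The Python sketch below is only a minimal illustration of that derivation under the stated assumptions: the public key is assumed to already be available as DER bytes, the function name is hypothetical, and the random stand-in key is not a real key; production tooling (and the ed25519-based v3 scheme Facebook moved to in 2021) involves considerably more than this.

```python
import base64
import hashlib
import os

def v2_onion_label(public_key_der: bytes) -> str:
    """Derive a legacy v2 .onion label from a DER-encoded RSA public key.

    The label is the first 80 bits (10 bytes) of SHA-1 over the key bytes,
    base32-encoded and lowercased. Because only 80 bits of the digest
    survive, many distinct keys can map to one label, which is why
    brute-force vanity searches (and eventual collisions) were feasible.
    """
    digest = hashlib.sha1(public_key_der).digest()  # 160-bit SHA-1 digest
    truncated = digest[:10]                         # keep only the first 80 bits
    return base64.b32encode(truncated).decode("ascii").lower()

# Illustration only: random bytes stand in for a real DER-encoded key,
# so the resulting label is meaningless; it just shows the shape
# (16 base32 characters, like "facebookcorewwwi").
fake_key = os.urandom(140)
print(v2_onion_label(fake_key) + ".onion")
```

By contrast, a v3 address such as the 56-character one Facebook now uses encodes a full 32-byte ed25519 public key plus a checksum and version byte, so it is no longer a truncated hash and the same brute-force shortcut does not apply.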
======================================== |
[SOURCE: https://www.mako.co.il/food-feed/2026-m02_w03/shorts-1c659a69f407c91027.htm] | [TOKENS: 637] |
Heading north? Meet the Italian gem hidden inside Kibbutz HaGoshrim. In the heart of Kibbutz HaGoshrim hides La Luna - a family-run Italian restaurant by chef Itai Tamir, with crisp sourdough pizza, handmade pastas and seating on the lawn facing a Galilee sunset. This is what the culinary awakening of the Upper Galilee looks like, and we are loving it (we just don't understand how we missed it until now). Lin Levy, 18.02.2026 |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/The_Upshot] | [TOKENS: 282] |
Contents The Upshot The Upshot is a website published by The New York Times which releases articles combining data visualization with conventional journalistic analysis of news. History The Upshot was first announced in March 2014 and was officially launched on April 22, 2014. Steve Duenes, a graphics director at the New York Times, won a newsroom contest by coming up with the name "The Upshot". The site started with fifteen full-time staff, including founding editor David Leonhardt. Because The Upshot was launched soon after Nate Silver and FiveThirtyEight left the Times, it was widely described as a planned replacement for FiveThirtyEight and Silver. However, Leonhardt stated in an April 2014 interview that The Upshot was not intended to replace Silver. In 2014, The Upshot produced two of the twenty most-read stories on the Times' website, and it was responsible for 5% of the paper's web traffic in October of that year. Also in 2014, the site was a finalist for an Online Journalism Award in the category "Online Commentary, Large Newsroom", but it lost to NPR's Code Switch. In 2016, Amanda Cox, who had been a founding member of The Upshot, replaced Leonhardt as its editor. References External links |
======================================== |