[SOURCE: https://en.wikipedia.org/wiki/DNS_root_zone]
DNS root zone

The DNS root zone is the top-level DNS zone in the hierarchical namespace of the Domain Name System (DNS) of the Internet. The root zone is overseen by the Internet Corporation for Assigned Names and Numbers (ICANN), which delegates its management to a subsidiary acting as the Internet Assigned Numbers Authority (IANA). Distribution services are provided by Verisign. Prior to October 1, 2016, ICANN performed this management responsibility under the oversight of the National Telecommunications and Information Administration (NTIA), an agency of the United States Department of Commerce. Oversight responsibility has since transitioned to the global stakeholder community represented within ICANN's governance structures. A combination of limits in the DNS definition and in certain protocols, namely the practical size of unfragmented User Datagram Protocol (UDP) packets, resulted in a practical maximum of 13 root name server addresses that can be accommodated in DNS name query responses. However, as of 2007, the root zone was serviced by several hundred servers at over 130 locations in many countries.

Initialization of DNS service

The DNS root zone is served by thirteen root server clusters which are authoritative for queries to the top-level domains of the Internet. Thus, every name resolution either starts with a query to a root server or uses information that was once obtained from a root server. The root server clusters have the official names a.root-servers.net to m.root-servers.net. To resolve these names into addresses, a DNS resolver must first find an authoritative server for the net zone. To avoid this circular dependency, the address of at least one root server must be known for bootstrapping access to the DNS. For this purpose, operating systems, DNS servers, and resolver software packages typically include a file with the addresses of all the DNS root servers.
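The bootstrap file mentioned above is a plain zone-file fragment listing NS records for the root and glue addresses for each server. A shortened, illustrative excerpt in the style of InterNIC's published hints file (only two of the thirteen servers shown; consult the current official file for authoritative addresses):

```
.                        3600000      NS    A.ROOT-SERVERS.NET.
A.ROOT-SERVERS.NET.      3600000      A     198.41.0.4
A.ROOT-SERVERS.NET.      3600000      AAAA  2001:503:ba3e::2:30
.                        3600000      NS    M.ROOT-SERVERS.NET.
M.ROOT-SERVERS.NET.      3600000      A     202.12.27.33
```

A resolver reads these addresses at startup and uses them only for its first queries; everything else is learned from the responses.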
Even if the IP addresses of some root servers change, at least one is needed to retrieve the current list of all name servers. This address file is called named.cache in the BIND name server reference implementation. The current official version is distributed by ICANN's InterNIC. With the address of a single functioning root server, all other DNS information may be discovered recursively, and information about any domain name may be found.

Redundancy and diversity

The root DNS servers are essential to the function of the Internet, as most Internet services, such as the World Wide Web and email, are based on domain names. The DNS servers are potential points of failure for the entire Internet, and for this reason multiple root servers are distributed worldwide. The original DNS packet size of 512 octets limited a DNS response to thirteen addresses, until protocol extensions (see Extension Mechanisms for DNS) lifted this restriction. While it is possible to fit more entries into a packet of this size when using label compression, thirteen was chosen as a reliable limit. Since the introduction of IPv6, the successor Internet Protocol to IPv4, previous practices have been modified and the extra space is filled with IPv6 name servers. The root name servers are hosted in multiple secure sites with high-bandwidth access to accommodate the traffic load. At first, all of these installations were located in the United States; however, the distribution has shifted and this is no longer the case. Usually each DNS server installation at a given site is a cluster of computers with load-balancing routers. A comprehensive list of servers, their locations, and properties is available at https://root-servers.org/. As of 24 June 2023, there were 1708 root server instances worldwide. The modern trend is to use anycast addressing and routing to provide resilience and load balancing across a wide geographic area.
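The 512-octet arithmetic above can be sketched in a few lines of Python. This is a back-of-the-envelope model assuming full label compression and IPv4-only glue (real priming responses also carry AAAA records and EDNS data), not an official derivation:

```python
# Rough wire-format size of a root "priming" response under the classic
# 512-octet DNS/UDP limit, using field sizes from the DNS message format
# (RFC 1035). Illustrative arithmetic only.

HEADER = 12                    # fixed DNS message header
QUESTION = 1 + 2 + 2           # root name "." (one zero octet) + QTYPE + QCLASS
PER_RR_FIXED = 2 + 2 + 4 + 2   # TYPE + CLASS + TTL + RDLENGTH per record

def priming_response_size(n_servers: int) -> int:
    """Octets needed to list n root servers with IPv4 glue, fully compressed."""
    size = HEADER + QUESTION
    # NS records: the owner "." is a single zero octet. The first RDATA
    # spells out "a.root-servers.net." in full (20 octets); later RDATAs
    # compress to a one-letter label plus a 2-octet pointer (4 octets).
    size += 1 + PER_RR_FIXED + 20
    size += (n_servers - 1) * (1 + PER_RR_FIXED + 4)
    # Glue A records: the owner compresses to a 2-octet pointer and the
    # RDATA is a 4-octet IPv4 address.
    size += n_servers * (2 + PER_RR_FIXED + 4)
    return size

print(priming_response_size(13))  # 436 octets, under the 512-octet limit
```

Under these assumptions a fourteenth or fifteenth server would still fit, which matches the text's note that thirteen was chosen as a reliable limit rather than the absolute maximum the packet could hold.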
For example, the j.root-servers.net server, maintained by Verisign, was represented (as of January 2016) by 104 individual server systems located around the world, which can be queried using anycast addressing.

Management

The content of the Internet root zone file is coordinated by a subsidiary of ICANN which performs the Internet Assigned Numbers Authority (IANA) functions. Verisign generates and distributes the zone file to the various root server operators. In 1997, when the Internet was transferred from U.S. government control to private hands, NTIA exercised stewardship over the root zone. A 1998 Commerce Department document stated the agency was "committed to a transition that will allow the private sector to take leadership for DNS management" by the year 2000; however, no steps to make the transition happen were taken. In March 2014, NTIA announced it would transition its stewardship to a "global stakeholder community". According to the Assistant Secretary of Commerce for Communications and Information, Lawrence E. Strickling, March 2014 was the right time to start a transition of the role to the global Internet community. The move came after pressure in the fallout of revelations that the United States and its allies had engaged in surveillance. The chairman of the board of ICANN denied that the two were connected, however, and said the transition process had been ongoing for a long time. ICANN president Fadi Chehadé called the move historic and said that ICANN would move toward multi-stakeholder control. Various prominent figures in Internet history not affiliated with ICANN also applauded the move. NTIA's announcement did not immediately affect how ICANN performs its role. On March 11, 2016, NTIA announced that it had received a proposed plan to transition its stewardship role over the root zone, and would review it in the next 90 days.
The proposal was adopted, and ICANN's renewed contract to perform the IANA functions lapsed on September 30, 2016, resulting in the transition of oversight responsibility to the global stakeholder community represented within ICANN's governance structures. As a component of the transition plan, ICANN created a new subsidiary called Public Technical Identifiers (PTI) to perform the IANA functions, which include managing the DNS root zone.

Data protection of the root zone

Since July 2010, the root zone has been signed with a DNSSEC signature, providing a single trust anchor for the Domain Name System that can in turn be used to provide a trust anchor for other public key infrastructure (PKI). The root zone DNSKEY section is re-signed periodically with the root zone key signing key, in a verifiable manner in front of witnesses, at a key signing ceremony. The KSK-2017 key, with ID 20326, is valid as of 2020. While the root zone file is signed with DNSSEC, some DNS records, such as NS records, are not covered by DNSSEC signatures. To address this weakness, a new DNS resource record called ZONEMD was introduced in RFC 8976. ZONEMD does not replace DNSSEC; the two must be used together to ensure full protection of the DNS root zone file. The ZONEMD deployment for the DNS root zone was completed on December 6, 2023. The B-Root DNS servers offer experimental support for DNS over TLS (DoT) on port 853.
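The ZONEMD record described above is an ordinary resource record at the zone apex carrying a serial number, a scheme, a hash algorithm, and a digest computed over the zone's records. A schematic example for the root zone (the serial and digest shown are placeholders, not real values; per RFC 8976 the root uses scheme 1, SIMPLE, and hash algorithm 1, SHA-384):

```
.   86400   IN   ZONEMD   2023120600 1 1 (
        <96-hex-digit SHA-384 digest of the canonically ordered zone data> )
```

A validating consumer recomputes the digest over its copy of the zone file and compares it with this record, whose own authenticity is in turn guaranteed by the zone's DNSSEC signatures.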
========================================
[SOURCE: https://en.wikipedia.org/wiki/Nazi_Germany]
Nazi Germany

Nazi Germany,[i] officially the German Reich[j] and later the Greater German Reich,[k] was the German state between 1933 and 1945, when Adolf Hitler and the Nazi Party controlled the country, transforming it into a totalitarian dictatorship. The Third Reich,[l] meaning "Third Realm" or "Third Empire", referred to the Nazi claim that Nazi Germany was the successor to the Holy Roman Empire (800–1806) and the German Empire (1871–1918). The Third Reich, which the Nazis referred to as the Thousand-Year Reich,[m] ended in May 1945, after 12 years, when the Allies defeated Germany and entered the capital, Berlin, ending World War II in Europe. After Hitler was appointed Chancellor of Germany in 1933, the Nazi Party began to eliminate political opposition and consolidate power. A 1934 referendum confirmed Hitler as sole Führer (leader). Power was centralised in Hitler's person, and his word became the highest law. The government was not a coordinated, cooperating body, but rather a collection of factions struggling to amass power. To address the Great Depression, the Nazis used heavy military spending, extensive public works projects, including the Autobahnen (motorways), and a massive secret rearmament programme, forming the Wehrmacht (armed forces), all financed by deficit spending. The return to economic stability and end of mass unemployment boosted the regime's popularity. Hitler made increasingly aggressive territorial demands, seizing Austria in the Anschluss of 1938 and the Sudetenland region of Czechoslovakia. Germany signed a non-aggression pact with the Soviet Union and invaded Poland in 1939, launching World War II in Europe. In alliance with Fascist Italy and other Axis powers, Germany conquered most of Europe by 1940, but failed to subdue Britain. Racism, Nazi eugenics, anti-Slavism, and especially antisemitism were central ideological features of the regime.
The Nazis considered Germanic peoples to be the "master race", the purest branch of the Aryan race. Jews, Romani people, Slavs, homosexuals, liberals, socialists, communists, other political opponents, Jehovah's Witnesses, Freemasons, those who refused to work and other "undesirables" were imprisoned, deported or murdered. Christian churches and citizens that opposed Hitler's rule were oppressed and leaders imprisoned. Education focused on racial biology, population policy and fitness for military service. Career and educational opportunities for women were curtailed. The Nazi Propaganda Ministry disseminated films and antisemitic canards, and organised mass rallies, fostering a pervasive cult of personality around Hitler to influence public opinion. The government controlled artistic expression, promoting specific art forms and banning or discouraging others. Genocide, mass murder and large-scale forced labour became hallmarks of the regime; the implementation of its racial policies culminated in the Holocaust. After invading the Soviet Union in 1941, Nazi Germany implemented the Generalplan Ost and Hunger Plan, as part of its war of extermination in Eastern Europe. The Soviet resurgence and entry of the United States into the war meant Germany lost the initiative in 1943, and by late 1944, had been pushed back to the 1939 border. Large-scale aerial bombing of Germany escalated, and the Axis powers were driven back in Eastern and Southern Europe. Germany was conquered by the Soviet Union from the east and the other allies from the west, and capitulated in 1945. Hitler's refusal to admit defeat led to massive destruction of German infrastructure and additional war-related deaths in the closing months of the war. The Allies subsequently initiated a policy of denazification and put many of the surviving Nazi leadership on trial for war crimes at the Nuremberg trials. 
Name

Common English terms for the German state in the Nazi era are "Nazi Germany" and the "Third Reich", which Hitler and the Nazis also referred to as the "Thousand-Year Reich" (Tausendjähriges Reich). The latter, a translation of the Nazi propaganda term Drittes Reich, was first used in Das Dritte Reich, a 1923 book by Arthur Moeller van den Bruck. The book counted the Holy Roman Empire as the first Reich and the German Empire as the second.

Background

Severe setbacks to the German economy began after World War I ended, partly because of reparations payments required under the 1919 Treaty of Versailles. The government printed money to make the payments and to repay the country's war debt, but the resulting hyperinflation led to inflated prices, economic chaos, and food riots. When the government defaulted on its reparations payments in January 1923, French troops occupied German industrial areas along the Ruhr and widespread civil unrest followed. The National Socialist German Workers' Party, commonly known as the Nazi Party, was founded in 1920. The Nazi party platform included destruction of the Weimar Republic, rejection of the Treaty of Versailles, radical antisemitism, and anti-Bolshevism. They promised a strong central government, increased Lebensraum ("living space") for Germanic peoples, formation of a national community based on race, and racial cleansing via the active suppression of Jews, who would be stripped of their citizenship and civil rights. The Nazis proposed national and cultural renewal based upon the Völkisch movement. The party, especially its paramilitary organisation Sturmabteilung (SA; Storm Detachment), or Brownshirts, used physical violence to advance their political position, disrupting the meetings of rival organisations and attacking their members as well as Jewish people on the streets. Such far-right armed groups were common in Bavaria, and were tolerated by the sympathetic far-right state government of Gustav Ritter von Kahr.
When the stock market in the United States crashed in 1929, the effect in Germany was dire. Millions were thrown out of work and several major banks collapsed. Hitler and the Nazis prepared to take advantage of the emergency to gain support for their party. They promised to strengthen the economy and provide jobs. Many voters decided the Nazi Party was capable of restoring order, quelling civil unrest, and improving Germany's international reputation. After the July 1932 federal election, the party was the largest in the Reichstag, holding 230 seats with 37.4 per cent of the popular vote.

History

Although the Nazis won the greatest share of the popular vote in the two Reichstag general elections of 1932, they did not have a majority. Hitler refused to participate in a coalition government unless he was its leader. Under pressure from politicians, industrialists, and the business community, President Paul von Hindenburg appointed Hitler Chancellor of Germany on 30 January 1933. This event is known as the Machtergreifung ("seizure of power"). On the night of 27 February 1933, the Reichstag building was set afire. A Dutch communist named Marinus van der Lubbe was found guilty of starting the blaze. Hitler proclaimed that the arson marked the start of a communist uprising. The Reichstag Fire Decree, imposed on 28 February 1933, rescinded most civil liberties, including rights of assembly and freedom of the press. The decree also allowed the police to detain people indefinitely without charges. The legislation was accompanied by a propaganda campaign that led to public support for the measure. Violent suppression of communists by the SA was undertaken nationwide and 4,000 members of the Communist Party of Germany were arrested. On 23 March 1933 the Enabling Act, an amendment to the Weimar Constitution, passed in the Reichstag by a vote of 444 to 94.
This amendment allowed Hitler and his cabinet to pass laws—even laws that violated the constitution—without the consent of the president or the Reichstag. As the bill required a two-thirds majority to pass, the Nazis used intimidation tactics as well as the provisions of the Reichstag Fire Decree to keep several Social Democratic deputies from attending, and the Communists had already been banned. The Enabling Act would subsequently serve as the legal foundation for the dictatorship the Nazis established. On 10 May the government seized the assets of the Social Democrats, and they were banned on 22 June. On 21 June the SA raided the offices of the German National People's Party – their former coalition partners – which then disbanded on 29 June. The remaining major political parties followed suit. On 14 July 1933 Germany became a one-party state with the passage of the Law Against the Formation of Parties, decreeing the Nazi Party to be the sole legal party in Germany. The founding of new parties was also made illegal, and all remaining political parties which had not already been dissolved were banned. Further elections in November 1933, 1936 and 1938 were Nazi-controlled, with only members of the Party and a small number of independents elected. All civilian organisations had their leadership replaced with Nazi sympathisers or party members, and either merged with the Nazi Party or faced dissolution. The Nazi government declared a "Day of National Labour" for May Day 1933, and invited many trade union delegates to Berlin for celebrations. The day after, SA stormtroopers demolished union offices around the country; all trade unions were forced to dissolve and their leaders were arrested. The Law for the Restoration of the Professional Civil Service, passed in April, removed from their jobs all teachers, professors, judges, magistrates, and government officials who were Jewish or whose commitment to the party was suspect. 
This meant the only non-political institutions not under control of the Nazis were the churches. The Nazi regime abolished the symbols of the Weimar Republic—including the black, red, and gold tricolour flag—and adopted reworked symbolism. The previous imperial black, white, and red tricolour was restored as one of Germany's two official flags; the second was the swastika flag of the Nazi Party, which became the sole national flag in September 1935. The Party anthem "Horst-Wessel-Lied" ("Horst Wessel Song") became a second national anthem. Germany was still in a dire economic situation, as six million people were unemployed and the balance of trade deficit was daunting. Using deficit spending, public works projects were undertaken beginning in 1934, creating 1.7 million new jobs by the end of that year alone. Average wages began to rise. The SA leadership continued to apply pressure for greater political and military power. In response, Hitler used the Schutzstaffel (SS) and the Gestapo—the secret police—to purge the entire SA leadership. Hitler targeted SA Stabschef (Chief of Staff) Ernst Röhm and other SA leaders who—along with a number of Hitler's political adversaries (such as Gregor Strasser and the former chancellor Kurt von Schleicher)—were arrested and shot. Up to 200 people were killed from 30 June to 2 July 1934 in an event that became known as the Night of the Long Knives. On 2 August 1934 Hindenburg died. The previous day, the cabinet had enacted the "Law Concerning the Head of State of the German Reich", which stated that upon Hindenburg's death the office of Reich President would be abolished and its powers merged with those of Reich Chancellor. Hitler thus became head of state as well as head of government and was formally named Führer und Reichskanzler ("Leader and Chancellor"), although eventually Reichskanzler was dropped. Germany was now a totalitarian state with Hitler at its head. 
As head of state, Hitler became Supreme Commander of the armed forces. The new law provided an altered loyalty oath for servicemen so that they affirmed loyalty to Hitler personally rather than the office of supreme commander or the state. On 19 August the merger of the presidency with the chancellorship was approved by 90 per cent of the electorate in a plebiscite. Most Germans were relieved that the conflicts and street fighting of the Weimar era had ended. They were deluged with propaganda orchestrated by the Minister of Public Enlightenment and Propaganda, Joseph Goebbels, who promised peace and plenty for all in a united, Marxist-free country without the constraints of the Treaty of Versailles. The Nazi Party obtained and legitimised power through its initial revolutionary activities, then through manipulation of legal mechanisms, the use of police powers, and by taking control of the state and federal institutions. The first major Nazi concentration camp, initially for political prisoners, was opened at Dachau in 1933. Hundreds of camps of varying size and function were created by the end of the war. Beginning in April 1933, scores of measures defining the status of Jews and their rights were instituted. These measures culminated in the establishment of the Nuremberg Laws of 1935, which stripped them of their basic rights. The Nazis would take from the Jews their wealth, their right to intermarry with non-Jews, and their right to occupy many fields of labour (such as law, medicine, or education). Eventually the Nazis declared the Jews as undesirable to remain among German citizens and society. As early as February 1933, Hitler announced that rearmament must begin, albeit clandestinely at first, as to do so was in violation of the Versailles Treaty. On 17 May 1933 Hitler gave a speech before the Reichstag outlining his desire for world peace and accepted an offer from American President Franklin D. 
Roosevelt for military disarmament, provided the other nations of Europe did the same. When the other European powers failed to accept this offer, Hitler pulled Germany out of the World Disarmament Conference and the League of Nations in October, claiming its disarmament clauses were unfair if they applied only to Germany. In a referendum held in November, 95 per cent of voters supported Germany's withdrawal. In 1934 Hitler told his military leaders that rearmament needed to be complete by 1942, as by then the German people would require more living space and resources, so Germany would have to start a war of conquest to obtain more territory. The Saarland, which had been placed under League of Nations supervision for 15 years at the end of World War I, voted in January 1935 to become part of Germany. In March 1935, Hitler announced the creation of an air force, and that the Reichswehr would be increased to 550,000 men. Britain agreed to Germany building a naval fleet with the signing of the Anglo-German Naval Agreement on 18 June 1935. When the Italian invasion of Ethiopia led to only mild protests by the British and French governments, on 7 March 1936 Hitler used the Franco-Soviet Treaty of Mutual Assistance as a pretext to order the army to march 3,000 troops into the demilitarised zone in the Rhineland in violation of the Versailles Treaty. As the territory was part of Germany, the British and French governments did not feel that attempting to enforce the treaty was worth the risk of war. In the one-party election held on 29 March, the Nazis received 98.9 per cent support. In 1936 Hitler signed an Anti-Comintern Pact with the Empire of Japan and a non-aggression agreement with the dictator of Fascist Italy, Benito Mussolini, who was soon referring to a "Rome-Berlin Axis". Hitler sent military supplies and assistance to the Nationalist forces of General Francisco Franco in the Spanish Civil War, which began in July 1936. 
The German Condor Legion included a range of aircraft and their crews, as well as a tank contingent. The aircraft of the Legion destroyed the city of Guernica in 1937. The Nationalists were victorious in 1939 and became an informal ally of Nazi Germany. In February 1938 Hitler emphasised to Austrian Chancellor Kurt Schuschnigg the need for Germany to secure its frontiers. Schuschnigg scheduled a plebiscite regarding Austrian independence for 13 March, but Hitler sent an ultimatum to Schuschnigg on 11 March demanding that he hand over all power to the Austrian Nazi Party or face an invasion. German troops entered Austria the next day, to be greeted with enthusiasm by the populace. The Republic of Czechoslovakia was home to a substantial minority of Germans, who lived mostly in the Sudetenland. Under pressure from separatist groups within the Sudeten German Party, the Czechoslovak government offered economic concessions to the region. Hitler decided not just to incorporate the Sudetenland into the Reich, but to destroy the country of Czechoslovakia entirely. The Nazis undertook a propaganda campaign to try to generate support for an invasion. Top German military leaders opposed the plan, as Germany was not yet ready for war. The crisis led to war preparations by Britain, Czechoslovakia, and France (Czechoslovakia's ally). Attempting to avoid war, British Prime Minister Neville Chamberlain arranged a series of meetings, the result of which was the Munich Agreement, signed on 29 September 1938. The Czechoslovak government was forced to accept the Sudetenland's annexation into Germany. Chamberlain was greeted with cheers when he landed in London, saying the agreement brought "peace for our time". Austrian and Czech foreign exchange reserves were seized by the Nazis, as were stockpiles of raw materials such as metals and completed goods such as weaponry and aircraft, which were shipped to Germany. 
The Reichswerke Hermann Göring industrial conglomerate took control of steel and coal production facilities in both countries. In January 1934 Germany signed a non-aggression pact with Poland. In March 1939 Hitler demanded the return of the Free City of Danzig and the Polish Corridor, a strip of land that separated East Prussia from the rest of Germany. The British announced they would come to the aid of Poland if it was attacked. Hitler, believing the British would not take action, ordered that an invasion plan be readied for September 1939. On 23 May Hitler described to his generals his overall plan of not only seizing the Polish Corridor but greatly expanding German territory eastward at the expense of Poland. He expected that this time they would be met by force. The Germans reaffirmed their alliance with Italy and signed non-aggression pacts with Denmark, Estonia, and Latvia whilst trade links were formalised with Romania, Norway, and Sweden. Foreign Minister Joachim von Ribbentrop arranged in negotiations with the Soviet Union a non-aggression pact, the Molotov–Ribbentrop Pact, signed in August 1939. The treaty also contained secret protocols dividing Poland and the Baltic states into German and Soviet spheres of influence. Germany's wartime foreign policy involved the creation of allied governments controlled directly or indirectly from Berlin. They intended to obtain soldiers from allies such as Italy and Hungary and workers and food supplies from allies such as Vichy France. Hungary was the fourth nation to join the Axis, signing the Tripartite Pact on 27 September 1940. Bulgaria signed the pact on 17 November. German efforts to secure oil included negotiating a supply from their new ally, Romania, which signed the Pact on 23 November, alongside the Slovak Republic. By late 1942 there were 24 divisions from Romania on the Eastern Front, 10 from Italy, and 10 from Hungary. Germany assumed full control in France in 1942, Italy in 1943, and Hungary in 1944.
Although Japan was a powerful ally, the relationship was distant, with little co-ordination or co-operation. For example, Germany refused to share their formula for synthetic oil from coal until late in the war. Germany invaded Poland and captured the Free City of Danzig on 1 September 1939, beginning World War II in Europe. Honouring their treaty obligations, Britain and France declared war on Germany two days later. Poland fell quickly, as the Soviet Union attacked from the east on 17 September. Reinhard Heydrich, chief of the Sicherheitspolizei (SiPo; Security Police) and Sicherheitsdienst (SD; Security Service), ordered on 21 September that Polish Jews should be rounded up and concentrated into cities with good rail links. Initially the intention was to deport them farther east, or possibly to Madagascar. Using lists prepared in advance, some 65,000 Polish intelligentsia, noblemen, clergy, and teachers were murdered by the end of 1939 in an attempt to destroy Poland's identity as a nation. Soviet forces advanced into Finland in the Winter War, and German forces saw action at sea. But little other activity occurred until May, so the period became known as the "Phoney War". From the start of the war, a British blockade on shipments to Germany affected its economy. Germany was particularly dependent on foreign supplies of oil, coal, and grain. Thanks to trade embargoes and the blockade, imports into Germany declined by 80 per cent. To safeguard Swedish iron ore shipments to Germany, Hitler ordered the invasion of Denmark and Norway, which began on 9 April. Denmark fell after less than a day, while most of Norway followed by the end of the month. By early June, Germany occupied all of Norway. Against the advice of many of his senior military officers, in May 1940 Hitler ordered an attack on France and the Low Countries. 
They quickly conquered Luxembourg and the Netherlands and outmanoeuvred the Allies in Belgium, forcing the evacuation of many British and French troops at Dunkirk. France fell as well, surrendering to Germany on 22 June. The victory in France resulted in an upswing in Hitler's popularity and an upsurge in war fever in Germany. In violation of the provisions of the Hague Convention, industrial firms in the Netherlands, France, and Belgium were put to work producing war materiel for Germany. The Nazis seized from the French thousands of locomotives and rolling stock, stockpiles of weapons, and raw materials such as copper, tin, oil, and nickel. Payments for occupation costs were levied upon France, Belgium, and Norway. Barriers to trade led to hoarding, black markets, and uncertainty about the future. Food supplies were precarious; production dropped in most of Europe. Famine was experienced in many occupied countries. Hitler's peace overtures to the new British Prime Minister Winston Churchill were rejected in July 1940. Grand Admiral Erich Raeder had advised Hitler in June that air superiority was a pre-condition for a successful invasion of Britain, so Hitler ordered a series of aerial attacks on Royal Air Force (RAF) airbases and radar stations, as well as nightly air raids on British cities, including London, Plymouth, and Coventry. The German Luftwaffe failed to defeat the RAF in what became known as the Battle of Britain, and by the end of October, Hitler realised that air superiority would not be achieved. He permanently postponed the invasion, a plan which the commanders of the German army had never taken entirely seriously.[n] Several historians, including Andrew Gordon, believe the primary reason for the failure of the invasion plan was the superiority of the Royal Navy, not the actions of the RAF. In February 1941 the German Afrika Korps arrived in Libya to aid the Italians in the North African Campaign. 
On 6 April Germany launched an invasion of Yugoslavia and Greece. All of Yugoslavia and parts of Greece were subsequently divided between Germany, Hungary, Italy, and Bulgaria. On 22 June 1941, contravening the Molotov–Ribbentrop Pact, about 3.8 million Axis troops attacked the Soviet Union. In addition to Hitler's stated purpose of acquiring Lebensraum, this large-scale offensive—codenamed Operation Barbarossa—was intended to destroy the Soviet Union and seize its natural resources for subsequent aggression against the Western powers. The reaction among Germans was one of surprise and trepidation as many were concerned about how much longer the war would continue or suspected that Germany could not win a war fought on two fronts. The invasion conquered a huge area, including the Baltic states, Belarus, and west Ukraine. After the successful Battle of Smolensk in September 1941, Hitler ordered Army Group Centre to halt its advance to Moscow and temporarily divert its Panzer groups to aid in the encirclement of Leningrad and Kiev. This pause provided the Red Army with an opportunity to mobilise fresh reserves. The Moscow offensive, which resumed in October 1941, ended disastrously in December. On 7 December 1941 Japan attacked Pearl Harbor, Hawaii. Four days later, Germany declared war on the United States. Food was in short supply in the conquered areas of the Soviet Union and Poland, as the retreating armies had burned the crops in some areas, and much of the remainder was sent back to the Reich. In Germany, rations were cut in 1942. In his role as Plenipotentiary of the Four Year Plan, Hermann Göring demanded increased shipments of grain from France and fish from Norway. The 1942 harvest was good, and food supplies remained adequate in Western Europe. Germany and Europe as a whole were almost totally dependent on foreign oil imports. 
In an attempt to resolve the shortage, in June 1942 Germany launched Fall Blau ("Case Blue"), an offensive against the Caucasian oilfields. The Red Army launched a counter-offensive on 19 November and encircled the Axis forces, who were trapped in Stalingrad on 23 November. Göring assured Hitler that the 6th Army could be supplied by air, but this turned out to be infeasible. Hitler's refusal to allow a retreat led to the deaths of 200,000 German and Romanian soldiers; of the 91,000 men who surrendered in the city on 31 January 1943, only 6,000 survivors returned to Germany after the war. Losses continued to mount after Stalingrad, leading to a sharp reduction in the popularity of the Nazi Party and deteriorating morale. Soviet forces continued to push westward after the failed German offensive at the Battle of Kursk in the summer of 1943. By the end of 1943, the Germans had lost most of their eastern territorial gains. In Egypt, Field Marshal Erwin Rommel's Afrika Korps were defeated by British forces under Field Marshal Bernard Montgomery in October 1942. The Allies landed in Sicily in July 1943 and were on the Italian peninsula by September. Meanwhile, American and British bomber fleets based in Britain began operations against Germany. Many sorties were intentionally given civilian targets in an effort to destroy German morale. The bombing of aircraft factories, as well as of the Peenemünde Army Research Center, where V-1 and V-2 rockets were being developed and produced, was also deemed particularly important. German aircraft production could not keep pace with losses, and without air cover the Allied bombing campaign became even more devastating. By targeting oil refineries and factories, the Allies crippled the German war effort by late 1944. On 6 June 1944, American, British, and Canadian forces established a front in France with the D-Day landings in Normandy. On 20 July 1944, Hitler survived an assassination attempt. 
He ordered brutal reprisals, resulting in 7,000 arrests and the execution of more than 4,900 people. The failed Ardennes Offensive (16 December 1944 – 25 January 1945) was the last major German offensive on the western front, and Soviet forces entered Germany on 27 January. Hitler's refusal to admit defeat and his insistence that the war be fought to the last man led to unnecessary death and destruction in the war's closing months. Through his Justice Minister Otto Georg Thierack, Hitler ordered that anyone who was not prepared to fight should be court-martialed, and thousands of people were executed. In many areas, people surrendered to the approaching Allies in spite of exhortations of local leaders to continue to fight. Hitler ordered the destruction of transport, bridges, industries, and other infrastructure—a scorched earth decree—but Armaments Minister Albert Speer prevented this order from being fully carried out. During the Battle of Berlin (16 April – 2 May 1945), Hitler and his staff lived in the underground Führerbunker while the Red Army approached. On 30 April, when Soviet troops were within two blocks of the Reich Chancellery, Hitler and his wife, Eva Braun, committed suicide. On 2 May, General Helmuth Weidling unconditionally surrendered Berlin to Soviet General Vasily Chuikov. Hitler was succeeded by Grand Admiral Karl Dönitz as Reich President and Goebbels as Reich Chancellor. Goebbels and his wife Magda committed suicide the next day after murdering their six children. Between 4 and 8 May 1945, most of the remaining German armed forces unconditionally surrendered. The German Instrument of Surrender was signed 8 May, marking the end of the Nazi regime and the end of World War II in Europe. Popular support for Hitler almost completely disappeared as the war drew to a close. Suicide rates in Germany increased, particularly in areas where the Red Army was advancing. 
Among soldiers and party personnel, suicide was often deemed an honourable and heroic alternative to surrender. First-hand accounts and propaganda about the uncivilised behaviour of the advancing Soviet troops caused panic among civilians on the Eastern Front, especially women, who feared being raped. More than a thousand people (out of a population of around 16,000) committed suicide in Demmin around 1 May 1945 as the 65th Army of the 2nd Belorussian Front first broke into a distillery and then rampaged through the town, committing mass rapes, arbitrarily executing civilians, and setting fire to buildings. High numbers of suicides took place in many other locations, including Neubrandenburg (600 dead), Stolp in Pommern (1,000 dead), and Berlin, where at least 7,057 people committed suicide in 1945. Estimates of the total German war dead range from 5.5 to 6.9 million persons. A study by the historian Rüdiger Overmans puts the number of German military dead and missing at 5.3 million, including 900,000 men conscripted from outside of Germany's 1937 borders. Richard Overy estimated in 2014 that about 353,000 civilians were killed in Allied air raids. Other civilian deaths include 300,000 Germans (including Jews) who were victims of Nazi political, racial, and religious persecution and 200,000 who were murdered in the Nazi euthanasia program. Political courts called Sondergerichte sentenced some 12,000 members of the German resistance to death, and civil courts sentenced an additional 40,000 Germans. Mass rapes of German women also took place. Geography As a result of their defeat in World War I and the resulting Treaty of Versailles, Germany lost Alsace-Lorraine, Northern Schleswig, and Memel. 
The Saarland became a protectorate of France under the condition that its residents would later decide by referendum which country to join. Poland became a separate nation and was given access to the sea by the creation of the Polish Corridor, which separated East Prussia from the rest of Germany, while Danzig was made a free city. Germany regained control of the Saarland through a referendum held in 1935 and annexed Austria in the Anschluss of 1938. The Munich Agreement of 1938 gave Germany control of the Sudetenland, and they seized the remainder of Czechoslovakia six months later. Under threat of invasion by sea, Lithuania surrendered the Memel district in March 1939. Between 1939 and 1941, German forces invaded Poland, Denmark, Norway, France, Luxembourg, the Netherlands, Belgium, Yugoslavia, Greece, and the Soviet Union. Germany annexed parts of northern Yugoslavia in April 1941, while Mussolini ceded Trieste, South Tyrol, and Istria to Germany in 1943. Some of the conquered territories were incorporated into Germany as part of Hitler's long-term goal of creating a Greater Germanic Reich. Several areas, such as Alsace-Lorraine, were placed under the authority of an adjacent Gau (regional district). The Reichskommissariate (Reich Commissariats), quasi-colonial regimes, were established in some occupied countries. Areas placed under German administration included the Protectorate of Bohemia and Moravia, Reichskommissariat Ostland (encompassing the Baltic states and Belarus), and Reichskommissariat Ukraine. Conquered areas of Belgium and France were placed under control of the Military Administration in Belgium and Northern France. Belgian Eupen-Malmedy, which had been part of Germany until 1919, was annexed. Part of Poland was incorporated into the Reich, and the General Government was established in occupied central Poland. 
The governments of Denmark, Norway (Reichskommissariat Norwegen), and the Netherlands (Reichskommissariat Niederlande) were placed under civilian administrations staffed largely by natives.[o] Hitler intended to eventually incorporate many of these areas into the Reich. Germany occupied the Italian protectorate of Albania and the Italian governorate of Montenegro in 1943 and installed a puppet government in occupied Serbia in 1941. Politics The Nazi Party was a far-right fascist political party that arose during the social and financial upheavals that occurred following the end of World War I. The Party remained small and marginalised, receiving 2.6% of the federal vote in 1928, prior to the onset of the Great Depression in 1929. By 1930 the Party won 18.3% of the federal vote, making it the Reichstag's second largest political party. While in prison after the failed Beer Hall Putsch of 1923, Hitler wrote Mein Kampf, which laid out his plan for transforming German society into one based on race. Nazi ideology brought together elements of antisemitism, racial hygiene, and eugenics, and combined them with pan-Germanism and territorial expansionism with the goal of obtaining more Lebensraum for the Germanic people. The regime attempted to obtain this new territory by attacking Poland and the Soviet Union, intending to mass-murder or deport the Jews and Slavs living there, whom it viewed as inferior to the Aryan master race and part of a Jewish-Bolshevik conspiracy. The Nazi regime believed that only Germany could defeat the forces of Bolshevism and save humanity from world domination by International Jewry. Other people deemed "life unworthy of life" by the Nazis included the mentally and physically disabled, Romani people, homosexuals, Jehovah's Witnesses, and social misfits. Additionally, Freemasons were heavily monitored and persecuted. 
Influenced by the Völkisch movement, the regime was against cultural modernism and supported the development of an extensive military at the expense of intellectualism. Creativity and art were stifled, except where they could serve as propaganda media. The party used symbols such as the Blood Flag and rituals such as the Nazi Party rallies to foster unity and bolster the regime's popularity. Hitler ruled Germany autocratically by asserting the Führerprinzip ("leader principle"), which called for absolute obedience by all subordinates. He viewed the government structure as a pyramid, with himself—the infallible leader—at the apex. Party rank was not determined by elections, and positions were filled through appointment by those of higher rank. The party used propaganda to develop a cult of personality around Hitler. Historians such as Kershaw emphasise the psychological impact of Hitler's skill as an orator. Roger Gill states: "His moving speeches captured the minds and hearts of a vast number of the German people: he virtually hypnotized his audiences". While top officials reported to Hitler and followed his policies, they had considerable autonomy. He expected officials to "work towards the Führer" – to take the initiative in promoting policies and actions in line with party goals and Hitler's wishes, without his involvement in day-to-day decision-making. The government was a disorganised collection of factions led by the party elite, who struggled to amass power and gain the Führer's favour. Hitler's leadership style was to give contradictory orders to his subordinates and to place them in positions where their duties and responsibilities overlapped. In this way he fostered distrust, competition, and infighting among his subordinates to consolidate and maximise his own power. 
Successive Reichsstatthalter decrees between 1933 and 1935 abolished the existing Länder (constituent states) of Germany and replaced them with new administrative divisions, the Gaue, governed by Nazi leaders (Gauleiters). The change was never fully implemented, as the Länder were still used as administrative divisions for some government departments such as education. This led to a bureaucratic tangle of overlapping jurisdictions and responsibilities typical of the administrative style of the Nazi regime. Jewish civil servants lost their jobs in 1933, except for those who had seen military service in World War I. Members of the Party or party supporters were appointed in their place. As part of the process of Gleichschaltung, the Reich Local Government Law of 1935 abolished local elections, and mayors were appointed by the Ministry of the Interior. In August 1934, civil servants and members of the military were required to swear an oath of unconditional obedience to Hitler. These laws became the basis of the Führerprinzip, the concept that Hitler's word overrode all existing laws. Any acts that were sanctioned by Hitler—even murder—thus became legal. All legislation proposed by cabinet ministers had to be approved by the office of Deputy Führer Rudolf Hess, who could also veto top civil service appointments. Most of the judicial system and legal codes of the Weimar Republic remained in place to deal with non-political crimes. The courts issued and carried out far more death sentences than before the Nazis took power. People who were convicted of three or more offences—even petty ones—could be deemed habitual offenders and jailed indefinitely. People such as prostitutes and pickpockets were judged to be inherently criminal and a threat to the community. Thousands were arrested and confined indefinitely without trial. A new type of court, the Volksgerichtshof ("People's Court"), was established in 1934 to deal with political cases. 
This court handed out over 5,000 death sentences until its dissolution in 1945. The death penalty could be issued for offences such as being a communist, printing seditious leaflets, or even making jokes about Hitler or other officials. The Gestapo was in charge of investigative policing to enforce Nazi ideology as they located and confined political offenders, Jews, and others deemed undesirable. Political offenders who were released from prison were often immediately re-arrested by the Gestapo and confined in a concentration camp. The Nazis used propaganda to promulgate the concept of Rassenschande ("race defilement") to justify the need for racial laws. In September 1935, the Nuremberg Laws were enacted. These laws initially prohibited sexual relations and marriages between Aryans and Jews and were later extended to include "Gypsies, Negroes or their bastard offspring". The law also forbade the employment of German women under the age of 45 as domestic servants in Jewish households. The Reich Citizenship Law stated that only those of "German or related blood" could be citizens. Thus Jews and other non-Aryans were stripped of their German citizenship. The law also permitted the Nazis to deny citizenship to anyone who was not supportive enough of the regime. A supplementary decree issued in November defined as Jewish anyone with three Jewish grandparents, or two grandparents if the Jewish faith was followed. Military and paramilitary The unified armed forces of Germany from 1935 to 1945 were called the Wehrmacht (defence force). This included the Heer (army), Kriegsmarine (navy), and the Luftwaffe (air force). From 2 August 1934, members of the armed forces were required to pledge an oath of unconditional obedience to Hitler personally. 
In contrast to the previous oath, which required allegiance to the constitution of the country and its lawful establishments, this new oath required members of the military to obey Hitler even if they were being ordered to do something illegal. Hitler decreed that the army would have to tolerate and even offer logistical support to the Einsatzgruppen—the mobile death squads responsible for millions of murders in Eastern Europe—when it was tactically possible to do so. Wehrmacht troops also participated directly in the Holocaust by shooting civilians or committing genocide under the guise of anti-partisan operations. The party line was that the Jews were the instigators of the partisan struggle and therefore needed to be eliminated. On 8 July 1941 Heydrich announced that all Jews in the eastern conquered territories were to be regarded as partisans and gave the order for all male Jews between the ages of 15 and 45 to be shot. By August, this was extended to include the entire Jewish population. In spite of efforts to prepare the country militarily, the economy could not sustain a lengthy war of attrition. A strategy was developed based on the tactic of Blitzkrieg ("lightning war"), which involved using quick coordinated assaults that avoided enemy strong points. Attacks began with artillery bombardment, followed by bombing and strafing runs. Next the tanks would attack and finally the infantry would move in to secure the captured area. Victories continued through mid-1940, but the failure to defeat Britain was the first major turning point in the war. The decision to attack the Soviet Union and the decisive defeat at Stalingrad led to the retreat of the German armies and the eventual loss of the war. The total number of soldiers who served in the Wehrmacht from 1935 to 1945 was around 18.2 million, of whom 5.3 million died. 
The Sturmabteilung (SA; Storm Detachment), or Brownshirts, founded in 1921, was the first paramilitary wing of the Nazi Party; their initial assignment was to protect Nazi leaders at rallies and assemblies. They also took part in street battles against the forces of rival political parties and violent actions against Jews and others. Under Ernst Röhm's leadership the SA grew by 1934 to over half a million members—4.5 million including reserves—at a time when the regular army was still limited to 100,000 men by the Versailles Treaty. Röhm hoped to assume command of the army and absorb it into the ranks of the SA. Hindenburg and Defence Minister Werner von Blomberg threatened to impose martial law if the activities of the SA were not curtailed. Therefore, less than a year and a half after seizing power, Hitler ordered the deaths of the SA leadership, including Röhm. After the purge of 1934, the SA was no longer a major force. Initially a small bodyguard unit under the auspices of the SA, the Schutzstaffel (SS; Protection Squadron) grew to become one of the largest and most powerful groups in Nazi Germany. Led by Reichsführer-SS Heinrich Himmler from 1929, the SS had over a quarter million members by 1938. Himmler initially envisioned the SS as being an elite group of guards, Hitler's last line of defence. The Waffen-SS, the military branch of the SS, evolved into a second army. It was dependent on the regular army for heavy weaponry and equipment, and most units were under tactical control of the High Command of the Armed Forces (OKW). By the end of 1942, the stringent selection and racial requirements that had initially been in place were no longer followed. With recruitment and conscription based only on expansion, by 1943 the Waffen-SS could no longer claim to be an elite fighting force. SS formations committed many war crimes against civilians and Allied servicemen. 
From 1935 onward, the SS spearheaded the persecution of Jews, who were rounded up into ghettos and concentration camps. With the outbreak of World War II, the SS Einsatzgruppen units followed the army into Poland and the Soviet Union, where from 1941 to 1945 they murdered more than two million people, including 1.3 million Jews. A third of the Einsatzgruppen members were recruited from Waffen-SS personnel. The SS-Totenkopfverbände (death's head units) ran the concentration camps and extermination camps, where millions more were murdered. Up to 60,000 Waffen-SS men served in the camps. In 1931 Himmler organised an SS intelligence service which became known as the Sicherheitsdienst (SD; Security Service) under his deputy, Heydrich. This organisation was tasked with locating and arresting communists and other political opponents. Himmler established the beginnings of a parallel economy under the auspices of the SS Economy and Administration Head Office. This holding company owned housing corporations, factories, and publishing houses. Economy The most pressing economic matter the Nazis initially faced was the 30 per cent national unemployment rate. The economist Hjalmar Schacht, President of the Reichsbank and Minister of Economics, created a scheme for deficit financing in May 1933. Capital projects were paid for with the issuance of promissory notes called Mefo bills. When the notes were presented for payment, the Reichsbank printed money. Hitler and his economic team expected that the upcoming territorial expansion would provide the means of repaying the soaring national debt. Schacht's administration achieved a rapid decline in the unemployment rate, the largest of any country during the Great Depression. Economic recovery was uneven, with reduced hours of work and erratic availability of necessities, leading to disenchantment with the regime as early as 1934. In October 1933 the Junkers Aircraft Works was expropriated. 
In concert with other aircraft manufacturers and under the direction of Aviation Minister Göring, production was ramped up. From a workforce of 3,200 people producing 100 units per year in 1932, the industry grew to employ a quarter of a million workers manufacturing over 10,000 technically advanced aircraft annually less than ten years later. An elaborate bureaucracy was created to regulate imports of raw materials and finished goods with the intention of eliminating foreign competition in the German marketplace and improving the nation's balance of payments. The Nazis encouraged the development of synthetic replacements for materials such as oil and textiles. As the market was experiencing a glut and prices for petroleum were low, in 1933 the Nazi government made a profit-sharing agreement with IG Farben, guaranteeing them a 5 per cent return on capital invested in their synthetic oil plant at Leuna. Any profits in excess of that amount would be turned over to the Reich. By 1936, Farben regretted making the deal, as excess profits were by then being generated. In another attempt to secure an adequate wartime supply of petroleum, Germany intimidated Romania into signing a trade agreement in March 1939. Major public works projects financed with deficit spending included the construction of a network of Autobahnen and providing funding for programmes initiated by the previous government for housing and agricultural improvements. To stimulate the construction industry, credit was offered to private businesses and subsidies were made available for home purchases and repairs. On the condition that the wife would leave the workforce, a loan of up to 1,000 Reichsmarks could be accessed by young couples of Aryan descent who intended to marry, and the amount that had to be repaid was reduced by 25 per cent for each child born. The caveat that the woman had to remain unemployed outside the home was dropped by 1937 due to a shortage of skilled labourers. 
Envisioning widespread car ownership as part of the new Germany, Hitler arranged for designer Ferdinand Porsche to draw up plans for the KdF-wagen (Strength Through Joy car), intended to be an automobile that everyone could afford. A prototype was displayed at the International Motor Show in Berlin on 17 February 1939. With the outbreak of World War II, the factory was converted to produce military vehicles. None were sold until after the war, when the vehicle was renamed the Volkswagen (people's car). Six million people were unemployed when the Nazis took power in 1933 and by 1937 there were fewer than a million. This was in part due to the removal of women from the workforce. Real wages dropped by 25 per cent between 1933 and 1938. After the dissolution of the trade unions in May 1933, their funds were seized and their leadership arrested, including those who attempted to co-operate with the Nazis. A new organisation, the German Labour Front, was created and placed under the Nazi Party functionary Robert Ley. Many unemployed people were forcibly drafted into this organisation, where they were given uniforms and tools and put to work. As a result, unemployed people disappeared from the streets, contributing to the perception that the Nazis were improving economic conditions. The average work week was 43 hours in 1933; by 1939 this increased to 47 hours. By early 1934, the focus shifted towards rearmament. By 1935, military expenditures accounted for 73 per cent of the government's purchases of goods and services. On 18 October 1936, Hitler named Göring as Plenipotentiary of the Four Year Plan, intended to speed up rearmament. In addition to calling for the rapid construction of steel mills, synthetic rubber plants, and other factories, Göring instituted wage and price controls and restricted the issuance of stock dividends. Large expenditures were made on rearmament in spite of growing deficits. 
Plans unveiled in late 1938 for massive increases to the navy and air force were impossible to fulfil, as Germany lacked the finances and material resources to build the planned units, as well as the necessary fuel required to keep them running. With the introduction of compulsory military service in 1935, the Reichswehr, which had been limited to 100,000 by the terms of the Versailles Treaty, expanded to 750,000 on active service at the start of World War II, with a million more in the reserve. By January 1939, unemployment was down to 301,800 and it dropped to only 77,500 by September. The Nazi war economy was a mixed economy that combined a free market with central planning. The historian Richard Overy describes it as being somewhere in between the command economy of the Soviet Union and the capitalist system of the United States. In 1942, after the death of Armaments Minister Fritz Todt, Hitler appointed Albert Speer as his replacement. Wartime rationing of consumer goods led to an increase in personal savings, funds which were in turn lent to the government to support the war effort. By 1944, the war was consuming 75 per cent of Germany's gross domestic product, compared to 60 per cent in the Soviet Union and 55 per cent in Britain. Speer improved production by centralising planning and control, reducing production of consumer goods, and using forced labour and slavery. The wartime economy eventually relied heavily upon the large-scale employment of slave labour. Germany imported and enslaved some 12 million people from 20 European countries to work in factories and on farms. Approximately 75 per cent were Eastern European. Many were casualties of Allied bombing, as they received poor air raid protection. Poor living conditions led to high rates of sickness, injury, and death, as well as sabotage and criminal activity. 
The wartime economy also relied upon large-scale robbery, initially through the state seizing the property of Jewish citizens and later by plundering the resources of occupied territories. Foreign workers brought into Germany were put into four classifications: guest workers, military internees, civilian workers, and Eastern workers. Each group was subject to different regulations. The Nazis issued a ban on sexual relations between Germans and foreign workers. By 1944 over a half million women served as auxiliaries in the German armed forces. The number of women in paid employment only increased by 271,000 (1.8 per cent) from 1939 to 1944. As the production of consumer goods had been cut back, women left those industries for employment in the war economy. They also took jobs formerly held by men, especially on farms and in family-owned shops. Very heavy strategic bombing by the Allies targeted refineries producing synthetic oil and gasoline, as well as the German transportation system, especially rail yards and canals. The armaments industry began to break down by September 1944. By November, fuel coal was no longer reaching its destinations and the production of new armaments was no longer possible. Overy argues that the bombing strained the German war economy and forced it to divert up to one-fourth of its manpower and industry into anti-aircraft resources, which very likely shortened the war. Racial policy and eugenics Racism and antisemitism were basic tenets of the Nazi Party and the Nazi regime. Nazi Germany's racial policy was based on their belief in the existence of a superior master race. The Nazis postulated the existence of a racial conflict between the Aryan master race and inferior races, particularly Jews, who were viewed as a mixed race that had infiltrated society and were responsible for the exploitation and repression of the Aryan race. Discrimination against Jews began immediately after the seizure of power. 
Following a month-long series of attacks by members of the SA on Jewish businesses and synagogues, on 1 April 1933 Hitler declared a national boycott of Jewish businesses. The Law for the Restoration of the Professional Civil Service passed on 7 April forced all non-Aryan civil servants to retire from the legal profession and civil service. Similar legislation soon deprived other Jewish professionals of their right to practise, and on 11 April a decree was promulgated that stated anyone who had even one Jewish parent or grandparent was considered non-Aryan. As part of the drive to remove Jewish influence from cultural life, members of the National Socialist German Students' League removed from libraries any books considered un-German, and a nationwide book burning was held on 10 May. The regime used violence and economic pressure to encourage Jews to leave the country voluntarily. Jewish businesses were denied access to markets, forbidden to advertise, and deprived of access to government contracts. Citizens were harassed and subjected to violent attacks. Many towns posted signs forbidding entry to Jews. On 7 November 1938 a young Jewish man, Herschel Grynszpan, shot and killed Ernst vom Rath, a legation secretary at the German embassy in Paris, to protest against his family's treatment in Germany. This incident provided the pretext for a pogrom the Nazis incited against the Jews two days later. Members of the SA damaged or destroyed synagogues and Jewish property throughout Germany. At least 91 German Jews were murdered during this pogrom, later called Kristallnacht, the Night of Broken Glass. Further restrictions were imposed on Jews in the coming months – they were forbidden to own businesses or work in retail shops, drive cars, go to the cinema, visit the library, or own weapons, and Jewish pupils were removed from schools. 
The Jewish community was fined one billion marks to pay for the damage caused by Kristallnacht and told that any insurance settlements would be confiscated. By 1939 around 250,000 of Germany's 437,000 Jews had emigrated to the United States, Argentina, the United Kingdom, Palestine, and other countries. Many chose to stay in continental Europe. Emigrants to Palestine were allowed to transfer property there under the terms of the Haavara Agreement, but those moving to other countries had to leave virtually all their property behind, and it was seized by the government. Like the Jews, the Romani were subjected to persecution from the early days of the regime. The Romani were forbidden to marry people of German extraction. They were shipped to concentration camps starting in 1935 and many were murdered. Following the invasion of Poland, 2,500 Roma and Sinti people were deported from Germany to the General Government, where they were imprisoned in labour camps. The survivors were likely exterminated at Bełżec, Sobibor, or Treblinka. A further 5,000 Sinti and Austrian Lalleri people were deported to the Łódź Ghetto in late 1941, where half were estimated to have died. The Romani survivors of the ghetto were subsequently moved to the Chełmno extermination camp in early 1942. The Nazis intended to deport all Romani people from Germany, and confined them to Zigeunerlager (Gypsy camps) for this purpose. Himmler ordered their deportation from Germany in December 1942, with few exceptions. A total of 23,000 Romani were deported to Auschwitz concentration camp, of whom 19,000 died. Outside of Germany, the Romani people were regularly used for forced labour, though many were murdered outright. In the Baltic states and the Soviet Union, 30,000 Romani were murdered by the SS, the German Army, and Einsatzgruppen. In occupied Serbia, 1,000 to 12,000 Romani were murdered, while nearly all 25,000 Romani living in the Independent State of Croatia were murdered. 
Estimates at the end of the war put the total number of Romani victims at around 220,000, which equalled approximately 25 per cent of the Romani population in Europe. Action T4 was a programme of systematic murder of the physically and mentally handicapped and patients in psychiatric hospitals that took place mainly from 1939 to 1941, and continued until the end of the war. Initially the victims were shot by the Einsatzgruppen and others; gas chambers and gas vans using carbon monoxide were used by early 1940. Under the Law for the Prevention of Hereditarily Diseased Offspring, enacted on 14 July 1933, over 400,000 individuals underwent compulsory sterilisation. Over half were those considered mentally deficient, which included not only people who scored poorly on intelligence tests, but also those who deviated from expected standards of behaviour regarding thrift, sexual behaviour, and cleanliness. Most of the victims came from disadvantaged groups such as prostitutes, the poor, the homeless, and criminals. Other groups persecuted and murdered included Jehovah's Witnesses, homosexuals, social misfits, and members of the political and religious opposition. Germany's war in the East was based on Hitler's long-standing view that Jews were the great enemy of the German people and that Lebensraum was needed for Germany's expansion. Hitler focused his attention on Eastern Europe, aiming to conquer Poland and the Soviet Union. Hitler's belief in the racial inferiority of Russians, as well as Slavs in general, had convinced him that a German conquest of Russia was inevitable. After the occupation of Poland in 1939, all Jews living in the General Government were confined to ghettos, and those who were physically fit were required to perform compulsory labour. In 1941 Hitler decided to destroy the Polish nation completely; within 15 to 20 years the General Government was to be cleared of ethnic Poles and resettled by German colonists.
About 3.8 to 4 million Poles would remain as slaves, part of a slave labour force of 14 million the Nazis intended to create using citizens of conquered nations. To determine who should be killed, Himmler created the Volksliste, a system of classification of people deemed to be of German blood. He ordered that those of Germanic descent who refused to be classified as ethnic Germans should be deported to concentration camps, have their children taken away, or be assigned to forced labour. The plan also included the kidnapping of children deemed to have Aryan-Nordic traits, who were presumed to be of German descent. The goal was to implement Generalplan Ost after the conquest of the Soviet Union, but when the invasion failed Hitler had to consider other options. One suggestion was a mass forced deportation of Jews to Poland, Palestine, or Madagascar. In addition, the Nazis planned to reduce the population of the conquered territories by 30 million people through starvation in an action called the Hunger Plan. Food supplies would be diverted to the German army and German civilians. Cities would be razed and the land allowed to return to forest or resettled by German colonists. Together, the Hunger Plan and Generalplan Ost would have led to the starvation of 80 million people in the Soviet Union. These partially fulfilled plans resulted in the democidal deaths of an estimated 19.3 million civilians and prisoners of war (POWs) throughout the USSR and elsewhere in Europe. During the course of the war, the Soviet Union lost a total of 27 million people; less than nine million of these were combat deaths. One in four of the Soviet population were killed or wounded. Around the time of the failed offensive against Moscow in December 1941, Hitler resolved that the Jews of Europe were to be exterminated immediately. 
While the murder of Jewish civilians had been ongoing in the occupied territories of Poland and the Soviet Union, plans for the total eradication of the Jewish population of Europe—eleven million people—were formalised at the Wannsee Conference on 20 January 1942. Some would be worked to death and the rest would be murdered in the implementation of the Final Solution to the Jewish Question. Initially the victims were murdered by Einsatzgruppen firing squads, then by stationary gas chambers or by gas vans, but these methods proved impractical for an operation of this scale. By 1942 extermination camps equipped with gas chambers were established at Auschwitz, Chełmno, Sobibor, Treblinka, and elsewhere. The total number of Jews murdered is estimated at 5.5 to six million, including over a million children. The Allies received information about the murders from the Polish government-in-exile and Polish leadership in Warsaw, based mostly on intelligence from the Polish underground. German citizens had access to information about what was happening, as soldiers returning from the occupied territories reported on what they had seen and done. Historian Richard J. Evans states that most German citizens disapproved of the genocide.[p] Poles were viewed by Nazis as subhuman non-Aryans, and during the German occupation of Poland 2.7 million ethnic Poles died. Polish civilians were subject to forced labour in German industry, internment, wholesale expulsions to make way for German colonists, and mass executions. The German authorities engaged in a systematic effort to destroy Polish culture and national identity. During operation AB-Aktion, many university professors and members of the Polish intelligentsia were arrested, transported to concentration camps, or executed. 
During the war, Poland lost an estimated 39 to 45 per cent of its physicians and dentists, 26 to 57 per cent of its lawyers, 15 to 30 per cent of its teachers, 30 to 40 per cent of its scientists and university professors, and 18 to 28 per cent of its clergy. The Nazis captured 5.75 million Soviet prisoners of war, more than they took from all the other Allied powers combined. Of these, they killed an estimated 3.3 million, with 2.8 million of them being killed between June 1941 and January 1942. Many POWs starved to death or resorted to cannibalism while being held in open-air pens at Auschwitz and elsewhere. From 1942 onward, Soviet POWs were viewed as a source of forced labour, and received better treatment so they could work. By December 1944, 750,000 Soviet POWs were working, including in German armaments factories (in violation of the Hague and Geneva conventions), mines, and farms.

Society

Antisemitic legislation passed in 1933 led to the removal of all Jewish teachers, professors, and officials from the education system. Most teachers were required to belong to the Nationalsozialistischer Lehrerbund (NSLB; National Socialist Teachers League) and university professors were required to join the National Socialist German Lecturers League. Teachers had to take an oath of loyalty and obedience to Hitler, and those who failed to show sufficient conformity to party ideals were often reported by students or fellow teachers and dismissed. Lack of funding for salaries led to many teachers leaving the profession. The average class size increased from 37 in 1927 to 43 in 1938 due to the resulting teacher shortage. Frequent and often contradictory directives were issued by Interior Minister Wilhelm Frick, Bernhard Rust of the Reich Ministry of Science, Education and Culture, and other agencies regarding content of lessons and acceptable textbooks for use in primary and secondary schools. Books deemed unacceptable to the regime were removed from school libraries.
Indoctrination in Nazi ideology was made compulsory in January 1934. Students selected as future members of the party elite were indoctrinated from the age of 12 at Adolf Hitler Schools for primary education and National Political Institutes of Education for secondary education. Detailed indoctrination of future holders of elite military rank was undertaken at Order Castles. Primary and secondary education focused on racial biology, population policy, culture, geography, and physical fitness. The curriculum in most subjects, including biology, geography, and even arithmetic, was altered to change the focus to race. Military education became the central component of physical education, and education in physics was oriented toward subjects with military applications, such as ballistics and aerodynamics. Students were required to watch all films prepared by the school division of the Reich Ministry of Public Enlightenment and Propaganda. At universities, appointments to top posts were the subject of power struggles between the education ministry, the university boards, and the National Socialist German Students' League. In spite of pressure from the League and various government ministries, most university professors did not make changes to their lectures or syllabus during the Nazi period. This was especially true of universities located in predominantly Catholic regions. Enrolment at German universities declined from 104,000 students in 1931 to 41,000 in 1939, but enrolment in medical schools rose sharply as Jewish doctors had been forced to leave the profession, so medical graduates had good job prospects. From 1934, university students were required to attend frequent and time-consuming military training sessions run by the SA. First-year students also had to serve six months in a labour camp for the Reich Labour Service; an additional ten weeks of service was required of second-year students. Women were a cornerstone of Nazi social policy.
The Nazis opposed the feminist movement, claiming that it was the creation of Jewish intellectuals, instead advocating a patriarchal society in which the German woman would recognise that her "world is her husband, her family, her children, and her home". Feminist groups were shut down or incorporated into the National Socialist Women's League, which coordinated groups throughout the country to promote motherhood and household activities. Courses were offered on childrearing, sewing, and cooking. Prominent feminists, including Anita Augspurg, Lida Gustava Heymann, and Helene Stöcker, felt forced to live in exile. The League published the NS-Frauen-Warte, the only Nazi-approved women's magazine in Nazi Germany; despite some propaganda aspects, it was predominantly an ordinary woman's magazine. Women were encouraged to leave the workforce, and the creation of large families by racially suitable women was promoted through propaganda campaigns. Women received a bronze award—known as the Ehrenkreuz der Deutschen Mutter (Cross of Honour of the German Mother)—for giving birth to four children, silver for six, and gold for eight or more. Large families received subsidies to help with expenses. Though the measures led to increases in the birth rate, the number of families having four or more children declined by five per cent between 1935 and 1940. Removing women from the workforce did not have the intended effect of freeing up jobs for men, as women were for the most part employed as domestic servants, weavers, or in the food and drink industries—jobs that were not of interest to men. Nazi philosophy prevented large numbers of women from being hired to work in munitions factories in the build-up to the war, so foreign labourers were brought in. After the war started, slave labourers were extensively used. In January 1943, Hitler signed a decree requiring all women under the age of fifty to report for work assignments to help the war effort. 
Thereafter women were funnelled into agricultural and industrial jobs, and by September 1944 14.9 million women were working in munitions production. Nazi leaders endorsed the idea that rational and theoretical work was alien to a woman's nature, and as such discouraged women from seeking higher education. A law passed in April 1933 limited the number of women admitted to university to ten per cent of the number of men. This resulted in female enrolment in secondary schools dropping from 437,000 in 1926 to 205,000 in 1937. The number of women enrolled in post-secondary schools dropped from 128,000 in 1933 to 51,000 in 1938. However, with the requirement that men be enlisted into the armed forces during the war, women comprised half of the enrolment in the post-secondary system by 1944. Women were expected to be strong, healthy, and vital. The sturdy peasant woman who worked the land and bore strong children was considered ideal, and women were praised for being athletic and tanned from working outdoors. Organisations were created for the indoctrination of Nazi values. From 25 March 1939 membership in the Hitler Youth was made compulsory for all children over the age of ten. The Jungmädelbund (Young Girls League) section of the Hitler Youth was for girls age 10 to 14, and the Bund Deutscher Mädel (BDM; League of German Girls) for young women age 14 to 18. The BDM's activities focused on physical education. The Nazi regime promoted a liberal code of conduct regarding sexual matters and was sympathetic to women who bore children out of wedlock. Promiscuity increased as the war progressed, with unmarried soldiers often intimately involved with several women simultaneously. Soldiers' wives were frequently involved in extramarital relationships. Sex was sometimes used as a commodity to obtain better work from a foreign labourer. Pamphlets enjoined German women to avoid sexual relations with foreign workers as a danger to their blood. 
With Hitler's approval, Himmler intended that the new society of the Nazi regime should destigmatise illegitimate births, particularly of children fathered by members of the SS, who were vetted for racial purity. His hope was that each SS family would have between four and six children. The Lebensborn (Fountain of Life) association, founded by Himmler in 1935, created a series of maternity homes to accommodate single mothers during their pregnancies. Both parents were examined for racial suitability before acceptance. The resulting children were often adopted into SS families. The homes were also made available to the wives of SS and Nazi Party members, who quickly filled over half the available spots. Existing laws banning abortion except for medical reasons were strictly enforced by the Nazi regime. The number of abortions declined from 35,000 per year at the start of the 1930s to fewer than 2,000 per year at the end of the decade, though in 1935 a law was passed allowing abortions for eugenics reasons. Nazi Germany had a strong anti-tobacco movement, as pioneering research by Franz H. Müller in 1939 demonstrated a causal link between smoking and lung cancer. The Reich Health Office took measures to try to limit smoking, including producing lectures and pamphlets. Smoking was banned in many workplaces, on trains, and among on-duty members of the military. Government agencies also worked to control other carcinogenic substances such as asbestos and pesticides. As part of a general public health campaign, water supplies were cleaned up, lead and mercury were removed from consumer products, and women were urged to undergo regular screenings for breast cancer. Government-run health care insurance plans were available, but Jews were denied coverage starting in 1933. That same year, Jewish doctors were forbidden to treat government-insured patients. 
In 1937, Jewish doctors were forbidden to treat non-Jewish patients, and in 1938 their right to practice medicine was removed entirely. Medical experiments, many of them pseudoscientific, were performed on concentration camp inmates beginning in 1941. The most notorious doctor to perform medical experiments was SS-Hauptsturmführer Josef Mengele, camp doctor at Auschwitz. Many of his victims died. Concentration camp inmates were made available for purchase by pharmaceutical companies for drug testing and other experiments. Nazi society had elements supportive of animal rights and many people were fond of zoos and wildlife. The government took several measures to ensure the protection of animals and the environment. In 1933 the Nazis enacted a stringent animal-protection law that affected what was allowed for medical research. The law was only loosely enforced, and in spite of a ban on vivisection, the Ministry of the Interior readily handed out permits for experiments on animals. The Reich Forestry Office under Göring enforced regulations that required foresters to plant a variety of trees to ensure suitable habitat for wildlife, and a new Reich Animal Protection Act became law in 1933. The regime enacted the Reich Nature Protection Act in 1935 to protect the natural landscape from excessive economic development. It allowed for the expropriation of privately owned land to create nature preserves and aided in long-range planning. Perfunctory efforts were made to curb air pollution, but little enforcement was undertaken once the war began. When the Nazis seized power in 1933, roughly 67 per cent of the population of Germany was Protestant, 33 per cent was Roman Catholic, while Jews made up less than 1 per cent. 
According to the 1939 census, taken following the annexation of Austria, 54 per cent of the population considered themselves Protestant, 40 per cent Roman Catholic, 3.5 per cent Gottgläubig (God-believing; a Nazi religious movement) and 1.5 per cent nonreligious. Nazi Germany extensively employed Christian imagery and instituted a variety of new Christian celebrations, such as a massive celebration marking the 1200th anniversary of the birth of the Frankish emperor Charlemagne, who Christianised neighbouring continental Germanic peoples by force. Nazi propaganda stylised Hitler as a Christ-like messiah, a "figure of redemption according to the Christian model", "who would liberate the world from the Antichrist". Under the Gleichschaltung process, Hitler attempted to create a unified Protestant Reich Church from Germany's 28 existing Protestant state churches. The pro-Nazi Ludwig Müller was installed as Reich Bishop and the pro-Nazi pressure group German Christians gained control of the new church. They objected to the Old Testament because of its Jewish origins and demanded that converted Jews be barred from their church. Pastor Martin Niemöller responded with the formation of the Confessing Church, from which some clergymen opposed the Nazi regime. When in 1935 the Confessing Church synod protested the Nazi policy on religion, 700 of their pastors were arrested. Müller resigned and Hitler appointed Hanns Kerrl as Minister for Church Affairs to continue efforts to control Protestantism. In 1936, a Confessing Church envoy protested to Hitler against the religious persecutions and human rights abuses. Hundreds more pastors were arrested. The church continued to resist and by early 1937 Hitler abandoned his hope of uniting the Protestant churches. Niemöller was arrested on 1 July 1937 and spent most of the next seven years in Sachsenhausen concentration camp and Dachau. 
Theological universities were closed and pastors and theologians of other Protestant denominations were also arrested. Persecution of the Catholic Church in Germany followed the Nazi takeover. Hitler moved quickly to eliminate political Catholicism, rounding up functionaries of the Catholic-aligned Bavarian People's Party and Catholic Centre Party, which along with all other non-Nazi political parties ceased to exist by July. The Reichskonkordat (Reich Concordat) treaty with the Vatican was signed in 1933, amid continuing harassment of the church in Germany. The treaty required the regime to honour the independence of Catholic institutions and prohibited clergy from involvement in politics. Hitler routinely disregarded the Concordat, closing all Catholic institutions whose functions were not strictly religious. Clergy, nuns and lay leaders were targeted, with thousands of arrests over the ensuing years, often on trumped-up charges of currency smuggling or immorality. Several Catholic leaders were targeted in the 1934 Night of the Long Knives assassinations. Most Catholic youth groups refused to dissolve themselves and Hitler Youth leader Baldur von Schirach encouraged members to attack Catholic boys in the streets. Propaganda campaigns claimed the church was corrupt, restrictions were placed on public meetings and Catholic publications faced censorship. Catholic schools were required to reduce religious instruction and crucifixes were removed from state buildings. Pope Pius XI had the "Mit brennender Sorge" ("With Burning Concern") encyclical smuggled into Germany for Passion Sunday 1937 and read from every pulpit as it denounced the systematic hostility of the regime toward the church. In response, Goebbels renewed the regime's crackdown and propaganda against Catholics. Enrolment in denominational schools dropped sharply and by 1939 all such schools were disbanded or converted to public facilities. 
Later Catholic protests included the 22 March 1942 pastoral letter by the German bishops on "The Struggle against Christianity and the Church". About 30 per cent of Catholic priests were disciplined by police during the Nazi era. A vast security network spied on clergy and priests were frequently denounced, arrested or sent to concentration camps – many to the dedicated clergy barracks at Dachau. In the areas of Poland annexed in 1939, the Nazis instigated a brutal suppression and systematic dismantling of the Catholic Church. Alfred Rosenberg, head of the Nazi Party Office of Foreign Affairs and Hitler's appointed cultural and educational leader for Nazi Germany, considered Catholicism to be among the Nazis' chief enemies. He planned the "extermination of the foreign Christian faiths imported into Germany", and for the Bible and Christian cross to be replaced in all churches, cathedrals, and chapels with copies of Mein Kampf and the swastika. Other sects of Christianity were also targeted, with Chief of the Nazi Party Chancellery Martin Bormann publicly proclaiming in 1941, "National Socialism and Christianity are irreconcilable."

Culture

"If the experience of the Third Reich teaches us anything, it is that a love of great music, great art and great literature does not provide people with any kind of moral or political immunization against violence, atrocity, or subservience to dictatorship."

The regime promoted the concept of Volksgemeinschaft, a national German ethnic community. The goal was to build a classless society based on racial purity and the perceived need to prepare for warfare, conquest and a struggle against Marxism. The German Labour Front founded the Kraft durch Freude (KdF; Strength Through Joy) organisation in 1933. As well as taking control of tens of thousands of privately run recreational clubs, it offered highly regimented holidays and entertainment such as cruises, vacation destinations and concerts.
The Reichskulturkammer (Reich Chamber of Culture) was organised under the control of the Propaganda Ministry in September 1933. Sub-chambers were set up to control aspects of cultural life such as film, radio, newspapers, fine arts, music, theatre and literature. Members of these professions were required to join their respective organisation. Jews and people considered politically unreliable were prevented from working in the arts, and many emigrated. Books and scripts had to be approved by the Propaganda Ministry prior to publication. Standards deteriorated as the regime sought to use cultural outlets exclusively as propaganda. Radio became popular in Germany during the 1930s; over 70 per cent of households owned a receiver by 1939, more than any other country. By July 1933, radio station staffs were purged of leftists and others deemed undesirable. Propaganda and speeches were typical radio fare immediately after the seizure of power, but as time went on Goebbels insisted more music should be played so that listeners would not turn to foreign broadcasters for entertainment. Newspapers, like other media, were controlled by the state; the Reich Press Chamber shut down or bought newspapers and publishing houses. By 1939 over two-thirds of the newspapers and magazines were directly owned by the Propaganda Ministry. The Nazi Party daily newspaper, the Völkischer Beobachter ("Ethnic Observer"), was edited by Rosenberg. Goebbels controlled the wire services and insisted that all newspapers in Germany should only publish content favourable to the regime. Under Goebbels, the Propaganda Ministry issued two dozen directives every week on exactly what news should be published and what angles to use; the typical newspaper followed the directives closely, especially regarding what to omit. Newspaper readership plummeted, partly because of the perceived decreased quality of the content and partly because of the surge in popularity of radio. 
Propaganda became less effective towards the end of the war, as people were able to obtain information outside official channels. Many authors left the country and some wrote material critical of the regime while in exile. Goebbels recommended that the remaining authors should concentrate on books themed on Germanic myths and the concept of blood and soil. By the end of 1933 over a thousand books—most of them by Jewish authors or featuring Jewish characters—had been banned by the Nazi regime. Nazi book burnings took place; nineteen such events were held on the night of 10 May 1933. Tens of thousands of books from dozens of figures, including Albert Einstein, Sigmund Freud, Helen Keller, Alfred Kerr, Marcel Proust, Erich Maria Remarque, Upton Sinclair, Jakob Wassermann, H. G. Wells, and Émile Zola were publicly burnt. Pacifist works, and literature espousing liberal, democratic values were targeted for destruction, as well as any writings supporting the Weimar Republic or those written by Jewish authors. Hitler took a personal interest in architecture and worked closely with the state architects Paul Troost and Albert Speer to create public buildings in a neoclassical style based on Roman architecture. Speer constructed imposing structures such as the Nazi party rally grounds in Nuremberg and a new Reich Chancellery building in Berlin. Hitler's plans for rebuilding Berlin included a gigantic dome based on the Pantheon in Rome and a triumphal arch more than double the height of the Arc de Triomphe in Paris. Neither structure was built. Hitler's belief that abstract, Dadaist, expressionist and modern art were decadent became the basis for policy. Many art museum directors lost their posts in 1933 and were replaced by party members. Some 6,500 modern works of art were removed from museums and replaced with works chosen by a Nazi jury. Exhibitions of the rejected pieces, under titles such as "Decadence in Art", were launched in sixteen different cities by 1935. 
The Degenerate Art Exhibition, organised by Goebbels, ran in Munich from July to November 1937. The exhibition proved wildly popular, attracting over two million visitors. The composer Richard Strauss was appointed president of the Reichsmusikkammer (Reich Music Chamber) on its founding in November 1933. As was the case with other art forms, the Nazis ostracised musicians who were deemed racially unacceptable and for the most part disapproved of music that was too modern or atonal. Jazz was considered especially inappropriate and foreign jazz musicians left the country or were expelled. Hitler favoured the music of Richard Wagner, especially pieces based on Germanic myths and heroic stories, and attended the Bayreuth Festival each year from 1933 to 1942. Movies were popular in Germany in the 1930s and 1940s, with admissions of over a billion people in 1942, 1943, and 1944. By 1934, German regulations restricting currency exports made it impossible for US filmmakers to take their profits back to America, so the major film studios closed their German branches. Exports of German films plummeted, as their antisemitic content made them impossible to show in other countries. The two largest film companies, Universum Film AG and Tobis, were purchased by the Propaganda Ministry, which by 1939 was producing most German films. The productions were not always overtly propagandistic, but generally had a political subtext and followed party lines regarding themes and content. Scripts were pre-censored. Leni Riefenstahl's Triumph of the Will (1935)—documenting the 1934 Nuremberg Rally—and Olympia (1938)—covering the 1936 Summer Olympics—pioneered techniques of camera movement and editing that influenced later films. New techniques such as telephoto lenses and cameras mounted on tracks were employed. Both films remain controversial, as their aesthetic merit is inseparable from their propagandising of Nazi ideals. 
Legacy

The Allied powers organised war crimes trials, beginning with the Nuremberg trials, held from November 1945 to October 1946, of 23 top Nazi officials. They were charged with conspiracy to commit crimes, crimes against peace, war crimes and crimes against humanity. All but three were found guilty and twelve were sentenced to death. Twelve subsequent Nuremberg trials of 184 defendants were held between 1946 and 1949. Between 1946 and 1949, the Allies investigated 3,887 cases, of which 489 were brought to trial. The result was convictions of 1,426 people; 297 of these were sentenced to death and 279 to life in prison, with the remainder receiving lesser sentences. About 65 per cent of the death sentences were carried out. Poland was more active than other nations in investigating war crimes, for example prosecuting 673 of the total 789 Auschwitz staff brought to trial. The political programme espoused by Hitler and the Nazis brought about a world war, leaving behind a devastated and impoverished Europe. Germany itself suffered wholesale destruction, characterised as Stunde Null (Zero Hour). The number of civilians killed during the Second World War was unprecedented in the history of warfare. As a result, Nazi ideology and the actions taken by the regime are almost universally regarded as gravely immoral. Historians, philosophers, and politicians often use the word "evil" to describe Hitler and the Nazi regime. Interest in Nazi Germany continues in the media and the academic world. While Evans remarks that the era "exerts an almost universal appeal because its murderous racism stands as a warning to the whole of humanity", young neo-Nazis enjoy the shock value that Nazi symbols or slogans provide. The display or use of Nazi symbolism is illegal in Germany and Austria. Nazi Germany was succeeded by three states: West Germany (the Federal Republic of Germany or "FRG"), East Germany (the German Democratic Republic or "GDR"), and Austria.
The process of denazification initiated by the Allies was only partially successful, as the need for experts in such fields as medicine and engineering was too great. However, expression of Nazi views was frowned upon, and those who expressed such views were frequently dismissed from their jobs. From the immediate post-war period through the 1950s, Germans kept quiet about their wartime experiences and felt a sense of communal guilt. The trial of Adolf Eichmann in 1961 and the broadcast of the television miniseries Holocaust in 1978 brought the process of Vergangenheitsbewältigung (coping with the past) to the forefront for many Germans. Once study of Nazi Germany was introduced into the school curriculum starting in the 1970s, people began researching the experiences of their family members. Study of the era and a willingness to critically examine its mistakes has led to the development of a strong democracy in Germany, but with lingering undercurrents of antisemitism and neo-Nazi thought. In 2017, a Körber Foundation survey found that 47 per cent of 14- to 16-year-olds polled knew what Auschwitz was. The journalist Alan Posener attributed the country's "growing historical amnesia" in part to a failure by German film and television to reflect the country's history accurately.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/PlayStation_(console)#cite_note-210]
PlayStation (console)
The PlayStation (codenamed PSX, abbreviated as PS, and retroactively PS1 or PS one) is a home video game console developed and marketed by Sony Computer Entertainment. It was released in Japan on 3 December 1994, followed by North America on 9 September 1995 and Europe on 29 September 1995, with other regions following thereafter. As a fifth-generation console, the PlayStation primarily competed with the Nintendo 64 and the Sega Saturn. Sony began developing the PlayStation after a failed venture with Nintendo to create a CD-ROM peripheral for the Super Nintendo Entertainment System in the early 1990s. The console was primarily designed by Ken Kutaragi and Sony Computer Entertainment in Japan, while additional development was outsourced to the United Kingdom. An emphasis on 3D polygon graphics was placed at the forefront of the console's design. PlayStation game production was designed to be streamlined and inclusive, enticing the support of many third-party developers. The console proved popular for its extensive game library, popular franchises, low retail price, and aggressive youth marketing which advertised it as the preferable console for adolescents and adults. Critically acclaimed games that defined the console include Gran Turismo, Crash Bandicoot, Spyro the Dragon, Tomb Raider, Resident Evil, Metal Gear Solid, Tekken 3, and Final Fantasy VII. Sony ceased production of the PlayStation on 23 March 2006, over eleven years after it had been released and in the same year that the PlayStation 3 debuted. More than 4,000 PlayStation games were released, with cumulative sales of 962 million units. The PlayStation signalled Sony's rise to power in the video game industry. It received acclaim and sold strongly; in less than a decade, it became the first computer entertainment platform to ship over 100 million units. Its use of compact discs heralded the game industry's transition away from cartridges.
The PlayStation's success led to a line of successors, beginning with the PlayStation 2 in 2000. In the same year, Sony released a smaller and cheaper model, the PS one.
History
The PlayStation was conceived by Ken Kutaragi, a Sony executive who managed a hardware engineering division and was later dubbed "the Father of the PlayStation". Kutaragi's interest in working with video games stemmed from seeing his daughter play games on Nintendo's Famicom. Kutaragi convinced Nintendo to use his SPC-700 sound processor in the Super Nintendo Entertainment System (SNES) through a demonstration of the processor's capabilities. His willingness to work with Nintendo derived from both his admiration of the Famicom and his conviction that video game consoles would become the main home-use entertainment systems. Although Kutaragi was nearly fired because he had worked with Nintendo without Sony's knowledge, president Norio Ohga recognised the potential in Kutaragi's chip and decided to keep him as a protégé. The inception of the PlayStation dates back to a 1988 joint venture between Nintendo and Sony. Nintendo had produced floppy disk technology to complement cartridges in the form of the Family Computer Disk System, and wanted to continue this complementary storage strategy for the SNES. Since Sony was already contracted to produce the SPC-700 sound processor for the SNES, Nintendo contracted Sony to develop a CD-ROM add-on, tentatively titled the "Play Station" or "SNES-CD". The PlayStation name had already been trademarked by Yamaha, but Nobuyuki Idei liked it so much that he agreed to acquire it for an undisclosed sum rather than search for an alternative. Sony was keen to obtain a foothold in the rapidly expanding video game market. Having been the primary manufacturer of the MSX home computer format, Sony wanted to use its experience in consumer electronics to produce its own video game hardware.
Although the initial agreement between Nintendo and Sony was about producing a CD-ROM drive add-on, Sony had also planned to develop a SNES-compatible Sony-branded console. This iteration was intended to be more of a home entertainment system, playing both SNES cartridges and a new CD format named the "Super Disc", which Sony would design. Under the agreement, Sony would retain sole international rights to every Super Disc game, giving them a large degree of control despite Nintendo's leading position in the video game market. Furthermore, Sony would be the sole beneficiary of licensing related to music and film software, which it had been aggressively pursuing as a secondary application. The Play Station was to be announced at the 1991 Consumer Electronics Show (CES) in Las Vegas. However, Nintendo president Hiroshi Yamauchi was wary of Sony's increasing leverage at this point and deemed the original 1988 contract unacceptable upon realising it essentially handed Sony control over all games written on the SNES CD-ROM format. Although Nintendo was dominant in the video game market, Sony possessed a superior research and development department. Wanting to protect Nintendo's existing licensing structure, Yamauchi cancelled all plans for the joint Nintendo–Sony SNES CD attachment without telling Sony. He sent Nintendo of America president Minoru Arakawa (his son-in-law) and chairman Howard Lincoln to Amsterdam to form a more favourable contract with the Dutch conglomerate Philips, Sony's rival. This contract would give Nintendo total control over its licences on all Philips-produced machines. Kutaragi and Nobuyuki Idei, Sony's director of public relations at the time, learned of Nintendo's actions two days before the CES was due to begin. Kutaragi telephoned numerous contacts, including Philips, to no avail. On the first day of the CES, Sony announced their partnership with Nintendo and their new console, the Play Station.
At 9 am the next day, in what has been called "the greatest ever betrayal" in the industry, Howard Lincoln stepped onto the stage and revealed that Nintendo was now allied with Philips and would abandon its work with Sony. Incensed by Nintendo's reversal, Ohga and Kutaragi decided that Sony would develop its own console. Nintendo's contract-breaking was met with consternation in the Japanese business community, as it had broken an "unwritten law" of Japanese companies not turning against one another in favour of foreign ones. Sony's American branch considered allying with Sega to produce a CD-ROM-based machine called the Sega Multimedia Entertainment System, but the Sega board of directors in Tokyo vetoed the idea when Sega of America CEO Tom Kalinske presented them the proposal. Kalinske recalled them saying: "That's a stupid idea, Sony doesn't know how to make hardware. They don't know how to make software either. Why would we want to do this?" Sony halted its research, but decided to turn the work it had done with Nintendo and Sega into a console based on the SNES. Despite the tumultuous events at the 1991 CES, negotiations between Nintendo and Sony were still ongoing. A deal was proposed: the Play Station would still have a port for SNES games, on the condition that it would still use Kutaragi's audio chip and that Nintendo would own the rights and receive the bulk of the profits. Roughly two hundred prototype machines were created, and some software entered development. Many within Sony were still opposed to the company's involvement in the video game industry, with some resenting Kutaragi for jeopardising the company. Kutaragi remained adamant that Sony not retreat from the growing industry and that a deal with Nintendo would never work. Knowing that it had to take decisive action, Sony severed all ties with Nintendo on 4 May 1992.
To determine the fate of the PlayStation project, Ohga chaired a meeting in June 1992, consisting of Kutaragi and several senior Sony board members. Kutaragi unveiled a proprietary CD-ROM-based system he had been secretly working on, which played games with immersive 3D graphics. Kutaragi was confident that his LSI chip could accommodate one million logic gates, which exceeded the capabilities of Sony's semiconductor division at the time. Although the proposal won Ohga's enthusiasm, the majority of those present at the meeting remained opposed, as did older Sony executives, who saw Nintendo and Sega as "toy" manufacturers. The opponents felt the game industry was too culturally offbeat and asserted that Sony should remain a central player in the audiovisual industry, where companies were familiar with one another and could conduct "civili[s]ed" business negotiations. After Kutaragi reminded Ohga of the humiliation Sony had suffered from Nintendo, Ohga retained the project and became one of Kutaragi's staunchest supporters. Ohga shifted Kutaragi and nine of his team from Sony's main headquarters to Sony Music Entertainment Japan (SMEJ), a subsidiary of the main Sony group, so as to retain the project and maintain relationships with Philips for the MMCD development project. The involvement of SMEJ proved crucial to the PlayStation's early development, as the process of manufacturing games on CD-ROM was similar to that used for audio CDs, with which Sony's music division had considerable experience. While at SMEJ, Kutaragi worked with Epic/Sony Records founder Shigeo Maruyama and Akira Sato; both later became vice-presidents of the division that ran the PlayStation business. Sony Computer Entertainment (SCE) was jointly established by Sony and SMEJ to handle the company's ventures into the video game industry. On 27 October 1993, Sony publicly announced that it was entering the game console market with the PlayStation.
According to Maruyama, there was uncertainty over whether the console should primarily focus on 2D, sprite-based graphics or 3D polygon graphics. After Sony witnessed the success of Sega's Virtua Fighter (1993) in Japanese arcades, the direction of the PlayStation became "instantly clear" and 3D polygon graphics became the console's primary focus. SCE president Teruhisa Tokunaka expressed gratitude for Sega's timely release of Virtua Fighter, as it proved "just at the right time" that making games with 3D imagery was possible. Maruyama claimed that Sony further wanted to emphasise the new console's ability to use Red Book CD audio in its games alongside high-quality visuals and gameplay. Wishing to distance the project from the failed enterprise with Nintendo, Sony initially branded the PlayStation the "PlayStation X" (PSX). Sony formed its European and North American divisions, known as Sony Computer Entertainment Europe (SCEE) and Sony Computer Entertainment America (SCEA), in January and May 1995 respectively. The divisions planned to market the new console under the alternative branding "PSX" following negative feedback regarding "PlayStation" in focus group studies. Early advertising prior to the console's launch in North America referenced PSX, but the term was scrapped before launch. In contrast to Nintendo's consoles, the console was not marketed under the Sony name. According to Phil Harrison, much of Sony's upper management feared that the Sony brand would be tarnished if associated with the console, which they considered a "toy". Since Sony had no experience in game development, it had to rely on the support of third-party game developers. This was in contrast to Sega and Nintendo, which had versatile and well-equipped in-house software divisions for their arcade games and could easily port successful games to their home consoles.
Recent consoles like the Atari Jaguar and 3DO had suffered low sales due to a lack of developer support, prompting Sony to redouble its efforts to gain the endorsement of arcade-savvy developers. A team from Epic Sony visited more than a hundred companies throughout Japan in May 1993 in hopes of attracting game creators with the PlayStation's technological appeal. Sony found that many disliked Nintendo's practices, such as favouring its own games over others. Through a series of negotiations, Sony acquired initial support from Namco, Konami, and Williams Entertainment, as well as 250 other development teams in Japan alone. Namco in particular was interested in developing for the PlayStation, since it rivalled Sega in the arcade market. Signing these companies secured influential games such as Ridge Racer (1993) and Mortal Kombat 3 (1995). Ridge Racer was one of the most popular arcade games at the time, and by December 1993 it had already been confirmed behind closed doors as the PlayStation's first game, despite Namco being a longstanding Nintendo developer. Namco's research managing director Shegeichi Nakamura met with Kutaragi in 1993 to discuss the preliminary PlayStation specifications, with Namco subsequently basing the Namco System 11 arcade board on PlayStation hardware and developing Tekken to compete with Virtua Fighter. The System 11 launched in arcades several months before the PlayStation's release, with the arcade release of Tekken in September 1994. Despite securing the support of various Japanese studios, Sony had no developers of its own by the time the PlayStation was in development. This changed in 1993, when Sony acquired the Liverpudlian company Psygnosis (later renamed SCE Liverpool) for US$48 million, securing its first in-house development team. The acquisition meant that Sony could have more launch games ready for the PlayStation's release in Europe and North America.
Ian Hetherington, Psygnosis' co-founder, was disappointed after receiving early builds of the PlayStation and recalled that the console "was not fit for purpose" until his team got involved with it. Hetherington frequently clashed with Sony executives over broader ideas; at one point it was suggested that a television with a built-in PlayStation be produced. In the months leading up to the PlayStation's launch, Psygnosis had around 500 full-time staff working on games and assisting with software development. The purchase of Psygnosis marked another turning point for the PlayStation, as the company played a vital role in creating the console's development kits. While Sony had provided MIPS R4000-based Sony NEWS workstations for PlayStation development, Psygnosis employees disliked the thought of developing on these expensive workstations and asked Bristol-based SN Systems to create an alternative PC-based development system. Andy Beveridge and Martin Day, owners of SN Systems, had previously supplied development hardware for other systems such as the Mega Drive, Atari ST, and the SNES. When Psygnosis arranged an audience for SN Systems with Sony's Japanese executives at the January 1994 CES in Las Vegas, Beveridge and Day presented their prototype of a condensed development kit, which could run on an ordinary personal computer with two extension boards. Impressed, Sony decided to abandon its plans for a workstation-based development system in favour of SN Systems's, thus securing a cheaper and more efficient method for designing software. An order of over 600 systems followed, and SN Systems supplied Sony with additional software such as an assembler, linker, and debugger. SN Systems produced development kits for future PlayStation systems, including the PlayStation 2, and was bought by Sony in 2005. Sony strove to make game production as streamlined and inclusive as possible, in contrast to the relatively isolated approach of Sega and Nintendo.
Phil Harrison, representative director of SCEE, believed that Sony's emphasis on developer assistance reduced the most time-consuming aspects of development. As well as providing programming libraries, SCE headquarters in London, California, and Tokyo housed technical support teams that could work closely with third-party developers if needed. Unlike Nintendo, Sony did not favour its own products over those of other developers; Peter Molyneux of Bullfrog Productions admired Sony's open-handed approach to software developers and lauded its decision to use PCs as a development platform, remarking that "[it was] like being released from jail in terms of the freedom you have". Another strategy that helped attract software developers was the PlayStation's use of the CD-ROM format instead of traditional cartridges. Nintendo cartridges were expensive to manufacture, and the company controlled all production, prioritising its own games, whereas inexpensive compact disc manufacturing was available at dozens of locations around the world. The PlayStation's architecture and interconnectability with PCs were beneficial to many software developers. The use of the C programming language also proved useful, as it safeguarded compatibility should further hardware revisions be made. Despite this inherent flexibility, some developers found themselves restricted by the console's lack of RAM. While working on beta builds of the PlayStation, Molyneux observed that its MIPS processor was not "quite as bullish" as that of a fast PC, and said that it took his team two weeks to port their PC code to the PlayStation development kits and another fortnight to achieve a four-fold speed increase. An engineer from Ocean Software, one of Europe's largest game developers at the time, thought that allocating RAM was a challenging aspect given the 3.5 megabyte restriction.
Kutaragi said that while it would have been easy to double the amount of RAM for the PlayStation, the development team refrained from doing so to keep the retail cost down. Kutaragi saw the biggest challenge in developing the system as balancing the conflicting goals of high performance, low cost, and ease of programming, and felt that he and his team were successful in this regard. The console's technical specifications were finalised in 1993 and its design during 1994. The PlayStation name and the final design were confirmed during a press conference on 10 May 1994, although the price and release dates had not yet been disclosed. Sony released the PlayStation in Japan on 3 December 1994, a week after the release of the Sega Saturn, at a price of ¥39,800. Sales in Japan began with "stunning" success, with long queues in shops. Ohga later recalled that he realised how important the PlayStation had become for Sony when friends and relatives begged him for consoles for their children. The PlayStation sold 100,000 units on the first day and two million units within six months, although the Saturn outsold the PlayStation in the first few weeks due to the success of Virtua Fighter. By the end of 1994, 300,000 PlayStation units had been sold in Japan, compared to 500,000 Saturn units. A grey market emerged for PlayStations shipped from Japan to North America and Europe, with buyers of such consoles paying up to £700. One retailer later recalled: "When September 1995 arrived and Sony's Playstation roared out of the gate, things immediately felt different than [sic] they did with the Saturn launch earlier that year. Sega dropped the Saturn $100 to match the Playstation's $299 debut price, but sales weren't even close—Playstations flew out the door as fast as we could get them in stock." Before the release in North America, Sega and Sony presented their consoles at the first Electronic Entertainment Expo (E3) in Los Angeles on 11 May 1995.
At their keynote presentation, Sega of America CEO Tom Kalinske revealed that the Saturn would be released immediately to select retailers at a price of $399. Then came Sony's turn: Olaf Olafsson, the head of SCEA, summoned Steve Race, the head of development, to the conference stage; Race simply said "$299" and left the stage to a round of applause. Attention on the Sony conference was further bolstered by the surprise appearance of Michael Jackson and the showcase of highly anticipated games, including Wipeout (1995), Ridge Racer and Tekken (1994). In addition, Sony announced that no games would be bundled with the console. Although the Saturn had been released early in the United States to gain an advantage over the PlayStation, the surprise launch upset many retailers who were not informed in time, harming sales. Some retailers, such as KB Toys, responded by dropping the Saturn entirely. The PlayStation went on sale in North America on 9 September 1995. It sold more units within two days than the Saturn had in five months, with almost all of the initial shipment of 100,000 units sold in advance and shops across the country running out of consoles and accessories. The well-received Ridge Racer, which some critics considered superior to Sega's arcade counterpart Daytona USA (1994), contributed to the PlayStation's early success, as did Battle Arena Toshinden (1995). There were over 100,000 pre-orders placed and 17 games available on the market by the time of the PlayStation's American launch, compared to the Saturn's six launch games. The PlayStation was released in Europe on 29 September 1995 and in Australia on 15 November 1995. By November it had already outsold the Saturn by three to one in the United Kingdom, where Sony had allocated a £20 million marketing budget for the Christmas season, compared to Sega's £4 million.
Sony found early success in the United Kingdom by securing listings with independent shop owners as well as prominent High Street chains such as Comet and Argos. Within its first year, the PlayStation secured over 20% of the entire American video game market. From September to the end of 1995, sales in the United States amounted to 800,000 units, giving the PlayStation a commanding lead over the other fifth-generation consoles, though the SNES and Mega Drive from the fourth generation still outsold it. Sony reported that the attach rate of games to consoles sold was four to one. To meet increasing demand, Sony chartered jumbo jets and ramped up production in Europe and North America. By early 1996, the PlayStation had grossed $2 billion (equivalent to $4.106 billion in 2025) from worldwide hardware and software sales. By late 1996, sales in Europe totalled 2.2 million units, including 700,000 in the UK. Approximately 400 PlayStation games were in development, compared to around 200 games for the Saturn and 60 for the Nintendo 64. In India, the PlayStation was launched as a test during 1999–2000 through Sony showrooms, selling 100 units. Sony finally launched the console (in its PS one model) countrywide on 24 January 2002 at a price of Rs 7,990, with 26 games available from the start. The PlayStation also did well in markets where it was never officially released. In Brazil, the trademark had been registered by a third company, so the console could not be released officially; the officially distributed Sega Saturn initially took over the market, but as the Sega console withdrew, PlayStation imports and large-scale piracy increased. In China, the most popular 32-bit console was the Sega Saturn, but after it left the market the PlayStation grew to a base of 300,000 users by January 2000, even though Sony China had no plans to release it.
The PlayStation was backed by a successful marketing campaign, allowing Sony to gain an early foothold in Europe and North America. Initially, PlayStation demographics were skewed towards adults, but the audience broadened after the first price drop. While the Saturn was positioned towards 18- to 34-year-olds, the PlayStation was initially marketed exclusively towards teenagers. Executives from both Sony and Sega reasoned that because younger players typically looked up to older, more experienced players, advertising targeted at teens and adults would draw them in too. Additionally, Sony found that adults reacted best to advertising aimed at teenagers; Lee Clow surmised that people who started to grow into adulthood regressed and became "17 again" when they played video games. The console was marketed with the advertising slogans "Live in Your World. Play in Ours." and "U R Not E" (red E), stylised with some letters replaced by the four geometric shapes that serve as the symbols for the four buttons on the controller. Clow thought that by invoking such provocative statements, gamers would respond to the contrary and say "'Bullshit. Let me show you how ready I am.'" As the console's appeal broadened, Sony's marketing efforts expanded from their earlier focus on mature players to specifically target younger children as well. Shortly after the PlayStation's release in Europe, Sony tasked marketing manager Geoff Glendenning with assessing the desires of a new target audience. Sceptical of Nintendo and Sega's reliance on television campaigns, Glendenning theorised that young adults transitioning from fourth-generation consoles would feel neglected by marketing directed at children and teenagers. Recognising the influence that early-1990s underground clubbing and rave culture had on young people, especially in the United Kingdom, Glendenning felt that the culture had become mainstream enough to help cultivate the PlayStation's emerging identity.
Sony partnered with prominent nightclubs such as Ministry of Sound and with festival promoters to organise dedicated PlayStation areas where select games could be demonstrated. The Sheffield-based graphic design studio The Designers Republic was contracted by Sony to produce promotional materials aimed at a fashionable, club-going audience. Psygnosis' Wipeout in particular became associated with nightclub culture, as it was widely featured in venues. By 1997, there were 52 nightclubs in the United Kingdom with dedicated PlayStation rooms. Glendenning recalled that he had discreetly used at least £100,000 a year in slush fund money to invest in impromptu marketing. In 1996, Sony expanded its CD production facilities in the United States due to the high demand for PlayStation games, increasing monthly output from 4 million discs to 6.5 million discs. This was necessary because PlayStation sales were running at twice the rate of Saturn sales, and its lead dramatically increased when both consoles dropped in price to $199 that year. The PlayStation also outsold the Saturn at a similar ratio in Europe during 1996. Sales figures for PlayStation hardware and software only increased following the launch of the Nintendo 64. Tokunaka speculated that the Nintendo 64 launch had actually helped PlayStation sales by raising public awareness of the gaming market through Nintendo's added marketing efforts. Despite this, the PlayStation took longer to achieve dominance in Japan. Tokunaka said that, even after the PlayStation and Saturn had been on the market for nearly two years, the competition between them was still "very close", and neither console had led in sales for any meaningful length of time. In 1998, Sega, spurred by its declining market share and significant financial losses, launched the Dreamcast in a last-ditch attempt to stay in the industry.
Although its launch was successful, the technically superior 128-bit console was unable to subdue Sony's dominance in the industry; Sony still held 60% of the overall video game market in North America at the end of 1999. Sega's initial confidence in its new console was undermined when Japanese sales were lower than expected, with disgruntled Japanese consumers reportedly returning their Dreamcasts in exchange for PlayStation software. On 2 March 1999, Sony officially revealed details of the PlayStation 2, which Kutaragi announced would feature a graphics processor designed to push more raw polygons than any console in history, effectively rivalling most supercomputers. The PlayStation continued to sell strongly at the turn of the new millennium: in July 2000, Sony released the PS one, a smaller, redesigned variant which went on to outsell all other consoles that year, including the PlayStation 2. In 2005, the PlayStation became the first console to ship 100 million units, with the PlayStation 2 later achieving this faster than its predecessor. The combined successes of both PlayStation consoles led to Sega retiring the Dreamcast in 2001 and abandoning the console business entirely. The PlayStation was eventually discontinued on 23 March 2006, over eleven years after its release and less than a year before the debut of the PlayStation 3.
Hardware
The main microprocessor is an R3000 CPU made by LSI Logic, operating at a clock rate of 33.8688 MHz and delivering around 30 MIPS. This 32-bit CPU relies heavily on the "cop2" 3D and matrix math coprocessor on the same die to provide the speed needed to render complex 3D graphics. The role of the separate GPU chip is to draw 2D polygons and apply shading and textures to them: the rasterisation stage of the graphics pipeline. Sony's custom 16-bit sound chip supports ADPCM sources with up to 24 sound channels, offers sampling rates of up to 44.1 kHz, and supports music sequencing.
The PlayStation features 2 MB of main RAM, with an additional 1 MB of video RAM. It has a maximum colour depth of 16.7 million true colours, with 32 levels of transparency and unlimited colour look-up tables. The PlayStation can output composite, S-Video or RGB video signals through its AV Multi connector (older models also have RCA connectors for composite), displaying resolutions from 256×224 to 640×480 pixels. Different games can use different resolutions. Earlier models also had proprietary parallel and serial ports that could be used to connect accessories or multiple consoles together; these were later removed due to lack of use. The PlayStation uses a proprietary video compression unit, the MDEC, which is integrated into the CPU and allows the presentation of full-motion video at a higher quality than other consoles of its generation. Unusually for the time, the PlayStation lacks a dedicated 2D graphics processor; 2D elements are instead calculated as polygons by the Geometry Transfer Engine (GTE) so that they can be processed and displayed on screen by the GPU. The GPU can generate a total of 4,000 sprites and 180,000 texture-mapped polygons per second, in addition to 360,000 flat-shaded polygons per second. The PlayStation went through a number of variants during its production run. Externally, the most notable change was the gradual reduction in the number of external connectors on the rear of the unit. This started with the original Japanese launch units; the SCPH-1000, released on 3 December 1994, was the only model with an S-Video port, which was removed from the next model onwards. Subsequent models saw a reduction in the number of parallel ports, with the final version retaining only one serial port. Sony also marketed a development kit for amateur developers known as the Net Yaroze (meaning "Let's do it together" in Japanese). It was launched in June 1996 in Japan and, following public interest, was released the next year in other countries.
The Net Yaroze allowed hobbyists to create their own games and upload them via an online forum run by Sony. The console was available only through an ordering service and came with the documentation and software needed to program PlayStation games and applications using C compilers. On 7 July 2000, Sony released the PS one (also stylised as "PSone"), a smaller, redesigned version of the original PlayStation. In 2002, Sony released a 5-inch (130 mm) LCD screen add-on for the PS one, referred to as the "Combo Pack". It also included a car cigarette-lighter adaptor, adding an extra layer of portability. Production of the LCD Combo Pack ceased in 2004, when the popularity of the PlayStation began to wane in markets outside Japan. A total of 28.15 million PS one units had been sold by the time it was discontinued in March 2006. Three iterations of the PlayStation's controller were released over the console's lifespan. The first, the PlayStation controller, was released alongside the PlayStation in December 1994. It features four individual directional buttons (as opposed to a conventional D-pad), a pair of shoulder buttons on each side, Start and Select buttons in the centre, and four face buttons consisting of simple geometric shapes: a green triangle, red circle, blue cross, and pink square (△, ○, ✕, □). Rather than depicting traditionally used letters or numbers on its buttons, the PlayStation controller established a visual trademark that would be incorporated heavily into the PlayStation brand. Teiyu Goto, the designer of the original PlayStation controller, said that the circle and cross represent "yes" and "no", respectively (though this layout is reversed in Western versions); the triangle symbolises a point of view, and the square is equated to a sheet of paper to be used to access menus.
The European and North American models of the original PlayStation controller are roughly 10% larger than their Japanese counterpart, to account for the fact that the average person in those regions has larger hands than the average Japanese person. Sony's first analogue gamepad, the PlayStation Analog Joystick (often erroneously referred to as the "Sony Flightstick"), was first released in Japan in April 1996. Featuring two parallel joysticks, it uses potentiometer technology previously used on consoles such as the Vectrex; instead of relying on binary eight-way switches, the controller detects minute angular changes through the entire range of motion. The stick also features a thumb-operated digital hat switch on the right joystick, corresponding to the traditional D-pad and used when simple digital movements were necessary. The Analog Joystick sold poorly in Japan due to its high cost and cumbersome size. The increasing popularity of 3D games prompted Sony to add analogue sticks to its controller design to give users more freedom over their movements in virtual 3D environments. The first official analogue controller, the Dual Analog Controller, was revealed to the public in a small glass booth at the 1996 PlayStation Expo in Japan, and released in April 1997 to coincide with the Japanese releases of the analogue-capable games Tobal 2 and Bushido Blade. In addition to the two analogue sticks (which also introduced two new buttons mapped to clicking in the analogue sticks), the Dual Analog Controller features an "Analog" button and LED beneath the "Start" and "Select" buttons which toggles analogue functionality on or off. The controller also features rumble support, though Sony decided that haptic feedback would be removed from all overseas iterations before the United States release.
A Sony spokesman stated that the feature was removed for "manufacturing reasons", although rumours circulated that Nintendo had attempted to legally block the release of the controller outside Japan due to similarities with the Nintendo 64 controller's Rumble Pak. However, a Nintendo spokesman denied that Nintendo took legal action. Next Generation's Chris Charla theorised that Sony dropped vibration feedback to keep the price of the controller down. In November 1997, Sony introduced the DualShock controller. Its name derives from its use of two (dual) vibration motors (shock). Unlike its predecessor, the DualShock features analogue sticks with textured rubber grips, longer handles and slightly different shoulder buttons, and includes rumble feedback as standard on all versions. The DualShock later replaced its predecessors as the default controller. Sony released a series of peripherals to add extra layers of functionality to the PlayStation. Such peripherals include memory cards, the PlayStation Mouse, the PlayStation Link Cable, the Multiplayer Adapter (a four-player multitap), the Memory Drive (a disk drive for 3.5-inch floppy disks), the GunCon (a light gun), and the Glasstron (a monoscopic head-mounted display). Released exclusively in Japan, the PocketStation is a memory card peripheral which acts as a miniature personal digital assistant. The device features a monochrome liquid crystal display (LCD), infrared communication capability, a real-time clock, built-in flash memory, and sound capability. Sharing similarities with the Dreamcast's VMU peripheral, the PocketStation was typically distributed with certain PlayStation games, enhancing them with added features. The PocketStation proved popular in Japan, selling over five million units. Sony planned to release the peripheral outside Japan but the release was cancelled, despite receiving promotion in Europe and North America. In addition to playing games, most PlayStation models are equipped to play CD-Audio.
The Asian model SCPH-5903 can also play Video CDs. Like most CD players, the PlayStation can play songs in a programmed order, shuffle the playback order of the disc and repeat one song or the entire disc. Later PlayStation models use a music visualisation function called SoundScope. This function, as well as a memory card manager, is accessed by starting the console without inserting a game or with the CD tray open, which brings up a graphical user interface (GUI) for the PlayStation BIOS. The GUIs of the PlayStation and PS One differ depending on the firmware version: the original PlayStation GUI had a dark blue background with rainbow graffiti used as buttons, while the early PAL PlayStation and PS One GUI had a grey blocked background with two icons in the middle. PlayStation emulation is versatile and can be run on numerous modern devices. Bleem! was a commercial emulator which was released for IBM-compatible PCs and the Dreamcast in 1999. It was notable for being aggressively marketed during the PlayStation's lifetime, and was the centre of multiple controversial lawsuits filed by Sony. Bleem! was programmed in assembly language, which allowed it to emulate PlayStation games with improved visual fidelity, enhanced resolutions, and filtered textures that were not possible on original hardware. Sony sued Bleem! two days after its release, citing copyright infringement and accusing the company of engaging in unfair competition and patent infringement by allowing use of the PlayStation BIOS on a Sega console. Bleem! was subsequently forced to shut down in November 2001. Sony was aware that using CDs for game distribution could have left games vulnerable to piracy, due to the growing popularity of CD-R and optical disc drives with burning capability.
To preclude illegal copying, a proprietary process for PlayStation disc manufacturing was developed that, in conjunction with an augmented optical drive assembly, prevented burned copies of games from booting on an unmodified console. Specifically, all genuine PlayStation discs were printed with a small section of deliberately irregular data, which the PlayStation's optical pick-up was capable of detecting and decoding. Consoles would not boot game discs without a specific wobble frequency contained in the data of the disc's pregap sector (the same system was also used to encode discs' regional lockouts). This signal was within Red Book CD tolerances, so PlayStation discs' actual content could still be read by a conventional disc drive; however, such a drive could not detect the wobble frequency, and therefore duplicated discs omitted it, since the laser pick-up system of any optical disc drive interprets this wobble as an oscillation of the disc surface and compensates for it in the reading process. Early PlayStations, particularly early 1000-series models, are prone to skipping full-motion video or emitting physical "ticking" noises from the unit. The problems stem from poorly placed vents leading to overheating in some environments, causing the plastic mouldings inside the console to warp slightly and create knock-on effects with the laser assembly. The solution is to sit the console on a surface which dissipates heat efficiently, in a well-ventilated area, or to raise the unit slightly from its resting surface. Sony representatives also recommended unplugging the PlayStation when it is not in use, as the system draws a small amount of power (and therefore generates heat) even when turned off. The first batch of PlayStations use a KSM-440AAM laser unit, whose case and movable parts are all built out of plastic. Over time, the plastic lens sled rail wears out—usually unevenly—due to friction.
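The disc-authentication scheme described above can be sketched conceptually. Everything below is invented for illustration (the region-code table, the disc model, and all function names); the real check is performed by the console's drive electronics, not software:

```python
# Conceptual model of the pregap-wobble check: pressed discs carry a
# wobble-modulated code, which a conventional drive's tracking servo
# filters out, so burned copies never contain it.
REGION_WOBBLE = {"SCEI": "Japan", "SCEA": "North America", "SCEE": "Europe"}

def read_pregap(disc, *, compensate_wobble):
    """Model of an optical pick-up reading a disc's pregap.

    A conventional CD drive (compensate_wobble=True) treats the wobble
    as a physical oscillation of the disc surface and removes it; the
    PlayStation's pick-up (compensate_wobble=False) decodes it instead.
    """
    if compensate_wobble:
        return None                     # wobble filtered out by the servo
    return disc.get("wobble_code")      # e.g. "SCEA" on a pressed disc

def console_boots(disc, console_region):
    """Boot only if the wobble code is present and matches the region."""
    code = read_pregap(disc, compensate_wobble=False)
    return code is not None and REGION_WOBBLE.get(code) == console_region

pressed = {"wobble_code": "SCEA", "data": "game data"}
burned = {"data": pressed["data"]}      # a copy keeps the data, not the wobble

assert console_boots(pressed, "North America")
assert not console_boots(burned, "North America")
```

Because the wobble doubles as a region code, the same mechanism that rejects a burned copy also rejects a pressed disc from the wrong region.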
The placement of the laser unit close to the power supply accelerates wear, due to the additional heat, which makes the plastic more vulnerable to friction. Eventually, one side of the lens sled will become so worn that the laser can tilt, no longer pointing directly at the CD; after this, games will no longer load due to data read errors. Sony fixed the problem by making the sled out of die-cast metal and placing the laser unit further away from the power supply on later PlayStation models. Due to an engineering oversight, the PlayStation does not produce a proper signal on several older models of televisions, causing the display to flicker or bounce around the screen. Sony decided not to change the console design, since only a small percentage of PlayStation owners used such televisions, and instead gave consumers the option of sending their PlayStation unit to a Sony service centre to have an official modchip installed, allowing play on older televisions. Game library The PlayStation featured a diverse game library which grew to appeal to all types of players. Critically acclaimed PlayStation games included Final Fantasy VII (1997), Crash Bandicoot (1996), Spyro the Dragon (1998), and Metal Gear Solid (1998), all of which became established franchises. Final Fantasy VII is credited with allowing role-playing games to gain mass-market appeal outside Japan, and is considered one of the most influential and greatest video games ever made. The PlayStation's bestselling game is Gran Turismo (1997), which sold 10.85 million units. After the PlayStation's discontinuation in 2006, the cumulative software shipment was 962 million units. Following its 1994 launch in Japan, early games included Ridge Racer, Crime Crackers, King's Field, Motor Toon Grand Prix, Toh Shin Den (i.e. Battle Arena Toshinden), and Kileak: The Blood. The first two games available at its later North American launch were Jumping Flash! (1995) and Ridge Racer, with Jumping Flash!
heralded as a forerunner of 3D graphics in console gaming. Wipeout, Air Combat, Twisted Metal, Warhawk and Destruction Derby were among the popular first-year games, and the first to be reissued as part of Sony's Greatest Hits or Platinum range. At the time of the PlayStation's first Christmas season, Psygnosis had produced around 70% of its launch catalogue; their breakthrough racing game Wipeout was acclaimed for its techno soundtrack and helped raise awareness of Britain's underground music community. Eidos Interactive's action-adventure game Tomb Raider contributed substantially to the success of the console in 1996, with its main protagonist Lara Croft becoming an early gaming icon and garnering unprecedented media promotion. Licensed tie-in video games of popular films were also prevalent; Argonaut Games' 2001 adaptation of Harry Potter and the Philosopher's Stone went on to sell over eight million copies late in the console's lifespan. Third-party developers committed largely to the console's wide-ranging game catalogue even after the launch of the PlayStation 2; some of the notable exclusives in this era include Harry Potter and the Philosopher's Stone, Fear Effect 2: Retro Helix, Syphon Filter 3, C-12: Final Resistance, Dance Dance Revolution Konamix and Digimon World 3.[c] Sony assisted with game reprints as late as 2008 with Metal Gear Solid: The Essential Collection, the last PlayStation game officially released and licensed by Sony. Initially, in the United States, PlayStation games were packaged in long cardboard boxes, similar to non-Japanese 3DO and Saturn games. Sony later switched to the jewel case format typically used for audio CDs and Japanese video games, as this format took up less retailer shelf space (which was at a premium due to the large number of PlayStation games being released), and focus testing showed that most consumers preferred this format. Reception The PlayStation was mostly well received upon release.
Critics in the West generally welcomed the new console; the staff of Next Generation reviewed the PlayStation a few weeks after its North American launch, commenting that, while the CPU is "fairly average", the supplementary custom hardware, such as the GPU and sound processor, is stunningly powerful. They praised the PlayStation's focus on 3D, and complimented the comfort of its controller and the convenience of its memory cards. Giving the system 4½ out of 5 stars, they concluded, "To succeed in this extremely cut-throat market, you need a combination of great hardware, great games, and great marketing. Whether by skill, luck, or just deep pockets, Sony has scored three out of three in the first salvo of this war." Albert Kim from Entertainment Weekly praised the PlayStation as a technological marvel rivalling the hardware of Sega and Nintendo. Famicom Tsūshin scored the console a 19 out of 40, lower than the Saturn's 24 out of 40, in May 1995. In a 1997 year-end review, a team of five Electronic Gaming Monthly editors gave the PlayStation scores of 9.5, 8.5, 9.0, 9.0, and 9.5—for all five editors, the highest score they gave to any of the five consoles reviewed in the issue. They lauded the breadth and quality of the games library, saying it had vastly improved over previous years due to developers mastering the system's capabilities in addition to Sony revising their stance on 2D and role-playing games. They also complimented the low price point of the games compared to the Nintendo 64's, and noted that it was the only console on the market that could be relied upon to deliver a solid stream of games for the coming year, primarily due to third-party developers almost unanimously favouring it over its competitors. Legacy SCE was an upstart in the video game industry in late 1994, as the video game market in the early 1990s was dominated by Nintendo and Sega.
Nintendo had been the clear leader in the industry since the introduction of the Nintendo Entertainment System in 1985 and the Nintendo 64 was initially expected to maintain this position. The PlayStation's target audience included the generation which was the first to grow up with mainstream video games, along with 18- to 29-year-olds who were not the primary focus of Nintendo. By the late 1990s, Sony became a highly regarded console brand due to the PlayStation, with a significant lead over second-place Nintendo, while Sega was relegated to a distant third. The PlayStation became the first "computer entertainment platform" to ship over 100 million units worldwide, with many critics attributing the console's success to third-party developers. It remains the sixth best-selling console of all time as of 2025, with a total of 102.49 million units sold. Around 7,900 individual games were published for the console during its 11-year life span, the second-most games ever produced for a console. Its success was a significant financial boon for Sony, with its video game division contributing 23% of the company's operating profits. Sony's next-generation PlayStation 2, which is backward compatible with the PlayStation's DualShock controller and games, was announced in 1999 and launched in 2000. The PlayStation's lead in installed base and developer support paved the way for the success of its successor, which overcame the earlier launch of Sega's Dreamcast and then fended off competition from Microsoft's newcomer Xbox and Nintendo's GameCube. The PlayStation 2's immense success and the failure of the Dreamcast were among the main factors which led to Sega abandoning the console market. To date, five PlayStation home consoles have been released, which have continued the same numbering scheme, as well as two portable systems. The PlayStation 3 also maintained backward compatibility with original PlayStation discs.
Hundreds of PlayStation games have been digitally re-released on the PlayStation Portable, PlayStation 3, PlayStation Vita, PlayStation 4, and PlayStation 5. The PlayStation has often ranked among the best video game consoles. In 2018, Retro Gamer named it the third best console, crediting its sophisticated 3D capabilities as one of the key factors in its mass success, and lauding it as a "game-changer in every sense possible". In 2009, IGN ranked the PlayStation the seventh best console in their list, noting its appeal towards older audiences to be a crucial factor in propelling the video game industry, as well as its role in transitioning the game industry to the CD-ROM format. Keith Stuart from The Guardian likewise named it as the seventh best console in 2020, declaring that its success was so profound it "ruled the 1990s". In January 2025, Lorentio Brodesco announced the nsOne project, an attempt to reverse engineer the PlayStation's motherboard. Brodesco stated that "detailed documentation on the original motherboard was either incomplete or entirely unavailable". The project was successfully crowdfunded via Kickstarter. In June, Brodesco manufactured the first working motherboard, promising a fully routed version with multilayer routing, as well as documentation and design files, in the near future. The success of the PlayStation contributed to the demise of cartridge-based home consoles. While not the first system to use an optical disc format, it was the first highly successful one, and ended up going head-to-head with the cartridge-based Nintendo 64,[d] which the industry had expected to use CDs like the PlayStation. After the demise of the Sega Saturn, Nintendo was left as Sony's main competitor in Western markets.
Nintendo chose not to use CDs for the Nintendo 64; they were likely concerned with the proprietary cartridge format's ability to help enforce copy protection, given their substantial reliance on licensing and exclusive games for their revenue. Besides their larger capacity, CD-ROMs could be produced in bulk quantities at a much faster rate than ROM cartridges, a week compared to two to three months. Further, the cost of production per unit was far lower, allowing Sony to offer games at about 40% lower cost to the user than ROM cartridges while still making the same amount of net revenue. In Japan, Sony published fewer copies of a wide variety of games for the PlayStation as a risk-limiting step, a model that had been used by Sony Music for CD audio discs. The production flexibility of CD-ROMs meant that Sony could produce larger volumes of popular games to get onto the market quickly, something that could not be done with cartridges due to their manufacturing lead time. The lower production costs of CD-ROMs also allowed publishers an additional source of profit: budget-priced reissues of games which had already recouped their development costs. Tokunaka remarked in 1996: Choosing CD-ROM is one of the most important decisions that we made. As I'm sure you understand, PlayStation could just as easily have worked with masked ROM [cartridges]. The 3D engine and everything—the whole PlayStation format—is independent of the media. But for various reasons (including the economies for the consumer, the ease of the manufacturing, inventory control for the trade, and also the software publishers) we deduced that CD-ROM would be the best media for PlayStation. The increasing complexity of developing games pushed cartridges to their storage limits and gradually discouraged some third-party developers. Part of the CD format's appeal to publishers was that discs could be produced at a significantly lower cost and offered more production flexibility to meet demand.
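The cartridge-versus-CD economics above can be illustrated with a toy calculation. Every dollar figure and the retail-margin assumption below are invented for the example, not taken from the source; only the "about 40% lower cost to the user" relationship comes from the text:

```python
# Hypothetical figures, chosen only to show how a CD could retail ~40%
# below a cartridge while leaving the publisher's per-unit net intact.
CART_RETAIL = 70.00              # assumed cartridge shelf price
CD_RETAIL = CART_RETAIL * 0.6    # "about 40% lower cost to the user"

CART_MEDIA_COST = 25.00          # assumed per-unit ROM cartridge cost
CD_MEDIA_COST = 1.00             # assumed per-unit pressed-CD cost
RETAIL_MARGIN = 0.30             # assumed retailer share of the shelf price

def publisher_net(retail, media_cost):
    """What remains to the publisher after the retailer's cut and media cost."""
    return retail * (1 - RETAIL_MARGIN) - media_cost

cart_net = publisher_net(CART_RETAIL, CART_MEDIA_COST)  # ≈ 24.0
cd_net = publisher_net(CD_RETAIL, CD_MEDIA_COST)        # ≈ 28.4

# Despite the 40% lower shelf price, the CD nets at least as much per unit.
assert cd_net >= cart_net
```

Under these assumed numbers, the cheap media more than absorbs the price cut, which is the mechanism the text describes.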
As a result, some third-party developers switched to the PlayStation, including Square and Enix, whose Final Fantasy VII and Dragon Quest VII respectively had been planned for the Nintendo 64 (both companies later merged to form Square Enix). Other developers released fewer games for the Nintendo 64; Konami, for example, released only thirteen N64 games but over fifty on the PlayStation. Nintendo 64 game releases were less frequent than the PlayStation's, with many being developed by either Nintendo themselves or second parties such as Rare. The PlayStation Classic is a dedicated video game console made by Sony Interactive Entertainment that emulates PlayStation games. It was announced in September 2018 at the Tokyo Game Show, and released on 3 December 2018, the 24th anniversary of the release of the original console. As a dedicated console, the PlayStation Classic features 20 pre-installed games; the games run on the open-source emulator PCSX. The console is bundled with two replica wired PlayStation controllers (those without analogue sticks), an HDMI cable, and a USB Type-A cable. Internally, the console uses a MediaTek MT8167a Quad A35 system on a chip with four central processing cores clocked at 1.5 GHz and a PowerVR GE8300 graphics processing unit. It includes 16 GB of eMMC flash storage and 1 GB of DDR3 SDRAM. The PlayStation Classic is 45% smaller than the original console. The PlayStation Classic received negative reviews from critics and was compared unfavourably to Nintendo's rival Nintendo Entertainment System Classic Edition and Super Nintendo Entertainment System Classic Edition. Criticism was directed at its meagre game library, user interface, emulation quality, use of PAL versions for certain games, use of the original controller, and high retail price, though the console's design received praise. The console sold poorly.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/File:Jean-Baptiste_de_Lamarck.jpg] | [TOKENS: 138] |
File:Jean-Baptiste de Lamarck.jpg Summary Licensing The author died in 1838, so this work is in the public domain in its country of origin and other countries and areas where the copyright term is the author's life plus 100 years or fewer. This work is in the public domain in the United States because it was published (or registered with the U.S. Copyright Office) before January 1, 1931.
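The two public-domain tests quoted in the licensing text above can be written out as a small sketch; the function names and signatures are invented for illustration, and only the cut-off years come from the text:

```python
from datetime import date

def pd_life_plus_100(author_death_year, current_year=None):
    """PD where the copyright term is the author's life plus 100 years or fewer."""
    if current_year is None:
        current_year = date.today().year
    return current_year - author_death_year > 100

def pd_in_united_states(publication_year):
    """PD in the US if published (or registered) before January 1, 1931."""
    return publication_year < 1931

# Lamarck died in 1838, so the life-plus-100 term has long since expired,
# and a pre-1931 publication also satisfies the US rule.
assert pd_life_plus_100(1838)
assert pd_in_united_states(1930) and not pd_in_united_states(1931)
```

Note the two tests are independent: a work can pass one and fail the other, which is why the page states both rationales separately.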
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Shas] | [TOKENS: 2988] |
Contents Shas Shas (Hebrew: ש״ס) is a Haredi religious political party in Israel. Founded in 1984 by Rabbi Ovadia Yosef, a former Israeli Sephardi chief rabbi, who remained its spiritual leader until his death in October 2013, it primarily represents the interests of Sephardic and Mizrahi Haredi Jews. Shas is the third-largest party in the Knesset as of 2024. Since 1984, it has been part of most governing coalitions, whether the ruling party was Labor or Likud. Name The party was originally called Shom'rei Torah ("Guardians of the Torah"), with the acronym ש״ת, pronounced "Shat" or "Shas". However, Israeli election law requires a party wishing to use letters for their acronym that already appear in the acronym of an existing party to first obtain permission from that party, and the Israeli Labor Party, whose letters are אמת, refused to grant Shas permission to use the ת. Instead, it was named ש״ס, Shas, an acronym for Shomrei S'farad, meaning "Sephardic Guardians". The name is also a reference to the six orders (Shisha S'darim) of the Mishnah and the Talmud, both of which are often referred to by the same acronym, "Shas". The party's legal name is "Hit'akhdut ha-S'pharadim ha-Olamit Shom'rei Torah" (התאחדות הספרדים העולמית שומרי תורה), meaning "International Union of the Sepharadim, Guardians of the Torah". History Shas was founded in 1984, prior to the elections to the eleventh Knesset in the same year, in protest against the small representation of Sephardim in the largely Ashkenazi Agudat Yisrael, through the merger of regional lists which were compiled in 1983. It was originally known as the Worldwide Sephardic Association of Torah Keepers (Hebrew: הִתְאַחֲדוּת הַסְּפָרַדִּים הָעוֹלָמִית שׁוֹמְרֵי תּוֹרָה, Hitahdut HaSfaradim HaOlamit Shomrei Torah).
The party was formed under the leadership of former Israeli Chief Sephardi Rabbi Ovadia Yosef, who established a four-member (including himself) Council of Torah Sages and remained the party's spiritual leader until his death. In founding the party, Yosef received strategic help and guidance from Rabbi Elazar Shach, leader of Israel's non-Hasidic Haredi Ashkenazi Jews. Yosef founded the party in 1984 on the platform of a return to religion and as a counter to an establishment dominated by Ashkenazi Jews of European extraction. Not all Shas voters are ultra-Orthodox Jews. Many of its voters are Modern Orthodox and traditional Mizrahi and Sephardi Jews, due to its alignment with the promotion of an "authentic Middle Eastern" Israeli culture, which fits with traditional Zionist beliefs of a revival of authentic, non-Europeanized Jewish culture. However, it still represents the Sephardi and Mizrahi Haredi Jewish sectors in the Knesset. Shas has at times been able to exert disproportionate influence by gaining control of the balance of power in the Knesset within the context of the traditionally narrow margin between Israel's large parties. Like its Labor Zionist counterparts (i.e., Labor and Meretz) that gain votes from the kibbutz movement, Shas gains votes and support from moshavim that are inhabited by Mizrahi and Sephardi Jews, either Orthodox or non-Orthodox. Also, since it became a member of the World Zionist Organization, it gains votes from Orthodox settlers in the West Bank. In the elections to the eleventh Knesset in 1984, Shas won four seats. Following Aryeh Deri's conviction on corruption charges in 1999, Shas gained 17 seats in the 1999 elections, its strongest showing since its formation.
Although 26 seats were projected for Shas had an election been held in 2001, it was reduced to 11 seats in the 2003 election after the two-ballot system was amended. In the 2006 elections, it gained one more seat, after running what the BBC called "an aggressive campaign that targeted the neo-conservative economic policies of the previous government", and joined Ehud Olmert's coalition government, alongside Kadima, Labor, Gil and, between October 2006 and January 2008, Yisrael Beiteinu. In the government, Shas party leader Yishai was Minister of Industry, Trade and Labor and Deputy Prime Minister, while Ariel Atias was Minister of Communications and Meshulam Nahari and Yitzhak Cohen were Ministers without Portfolio. Following the 2009 elections, in which Shas won eleven seats, it joined Benjamin Netanyahu's coalition government and held four cabinet posts. Eli Yishai, who led the party at that time, was one of four Deputy Prime Ministers and Minister of Internal Affairs. On 4 December 2011, Shas launched its United States affiliate, American Friends of Shas, based in Brooklyn, New York. Shas won 11 seats in the 2013 elections, but chose to form part of the Labor opposition to Netanyahu's new government. Yair Lapid of the Yesh Atid party and Naftali Bennett of The Jewish Home, who had won more seats and joined the coalition, both favored conscription of the previously exempt Haredi men into Israel's national service and a reduction in state financial support for Haredi families, policies Shas opposes. In December 2014, Eli Yishai left the Shas party, which he had led for more than a decade. He said he would lead a new religious party in the election scheduled for March 2015. His departure from Shas, after Aryeh Deri returned to the party leadership, did not come as a surprise. The party that he formed, Yachad, failed to pass the election threshold. In the ensuing election, Shas was accused of tampering with Yachad ballots.
They were also accused of creating a straw party with the symbols of Otzma Yehudit, which was running on a joint list with Yachad during the election. Shas won 7 seats in the election. In 2017, opinion polling showed that Shas was falling below the election threshold of 3.25%. In response, Shas leaders said that there was a coup attempt in the party. In the same year, a tape of the party's former spiritual leader criticizing Jerusalem Chief Rabbi Shlomo Amar was leaked. The party won 8 seats in the April 2019 election, and 9 seats in September 2019. Both elections were inconclusive, and resulted in a third election in 2020, in which Shas won 9 seats. After the election, a senior Likud minister anonymously told Al-Monitor that Deri was mediating political coalition talks between Netanyahu and Blue and White leader Benny Gantz. It was also reported that Deri "might even be open to a new alliance with Blue and White" after the anti-clerical Yesh Atid split from the alliance. Negotiations subsequently resulted in the formation of a rotation government between Netanyahu and Gantz, which included Shas. The government collapsed in December 2020 due to failed budget negotiations, resulting in another snap election in 2021. Shas won 9 seats in the 2021 election, but remained out of government for the first time since 2013. Later that year, Deri resigned from the Knesset as part of a plea bargain for alleged tax evasion. The new government collapsed after a year, and in the ensuing 2022 elections Shas won 11 seats. Shas returned to government, with Deri becoming Interior Minister, Health Minister and Vice Prime Minister. In early 2023, Deri was required to relinquish his ministerial posts due to the plea bargain after a ruling by the Supreme Court. He remained a member of the Knesset and the leader of Shas.
Shas's ministers resigned from the government in July 2025 over its failure to pass a law exempting ultra-Orthodox youth from the draft, but the party remained part of the coalition. Ideology The stated purpose of the party is to "return the crown to its former glory", meaning to protect the religious and cultural heritage of Sephardic Jewry and rectify what it sees as the "continued economic and social discrimination against the Sephardic population of Israel". Focusing on the needs of Sephardic Orthodox Israelis, Shas established its own government-funded religious education system, MaAyan HaHinuch HaTorani, which became popular in poor Sephardic towns, increasing the party's popular support. Shas advocates for the increased influence of Halakha, the Jewish religious law, in Israeli society, and actively engages in the Baal teshuva movement, encouraging non-Orthodox Israelis of Sephardic and Mizrahi-Jewish heritage to adopt an Orthodox Jewish lifestyle. Shas is a Haredi religious party, but it has participated in left-wing governments and is often willing to compromise on both religious and economic issues. At first, Shas followed a moderate policy on the Israeli–Palestinian conflict, after Yosef had declared that lives were more important than territories, but by the 2010s it had moved to the right, opposing any freeze in Israeli settlement activity in the West Bank. In addition, it was skeptical towards the U.S. Obama Administration's intentions regarding the Israeli–Palestinian peace process and began to support a consolidation of Israeli settlement interests, especially regarding yeshivas and Jewish holy sites in the West Bank. It further believes in a "United Jerusalem" and supports the Greater Jerusalem plan. In 2010, Shas joined the World Zionist Organization, having made significant changes to its charter.
One of Shas's demands is a compensation package for Sephardi and Mizrahi Jews who were forced to flee their home countries and leave their property behind. Shas opposes any form of public expression of homosexuality, including Gay Pride parades, especially in Jerusalem. Shas MK Nissim Ze'ev accused the homosexual community of "carrying out the self-destruction of Israeli society and the Jewish people", calling homosexuals "a plague as toxic as bird flu". However, the party condemns any form of violence against gay people. Controversies Several Shas MKs, including Aryeh Deri, Rafael Pinhasi, Yair Levy, Ofer Hugi and Yair Peretz, have been convicted of criminal offenses that include fraud and forgery. In addition, MK Shlomo Benizri was convicted of bribery, conspiring to commit a crime and obstruction of justice on 1 April 2008. In 2010, Ovadia Yosef cursed the Palestinians as "evil, bitter enemies of Israel" and said that, "Abu Mazen and all these evil people should perish from this world. God should strike them with a plague." Saeb Erekat of the PLO said Yosef's remarks were tantamount to a call for "genocide against Palestinians". Yosef later apologized and wrote to Egyptian President Hosni Mubarak: "I support your efforts and praise all the leaders and the peoples — Egyptians, Jordanians and Palestinians — who are partners and wish the success of this important process of achieving peace in our region, and preventing bloodshed. May God grant you longevity and may you succeed in your efforts for peace and may there be peace in our region." Previously, Yosef had called Arabs "vipers" and called for Israel to "annihilate" them. "It is forbidden to be merciful to them. You must send missiles to them and annihilate them. They are evil and damnable." A spokesman later clarified that his comments were only aimed at murderers and terrorists and not the entire Arab world.
In 2020, the party was fined ₪7,500 by the Central Elections Committee for handing out prayer cards at polling stations during the 2020 Knesset elections, which were claimed to cure "Corona and every illness and pestilence". Women's campaign Women activists protested the lack of female representation in Shas by organizing a "No Female Candidate, No Female Vote" campaign. The women said they would not vote for a party that does not include women candidates on its slate and sent an open letter to the Knesset representatives of ultra-Orthodox parties, which was also circulated on social media. Rabbi Mordechai Blau, a senior party member, threatened that women participating in the movement or bucking the party leadership would find their children "banned from Haredi schools" and their employers "boycotted by the community". Shas announced that it would create a women's council within the movement, a step that was welcomed by the campaigners. At the same time, they said: "We will move forward and call on the Haredi factions to enable women to serve as MKs in the Knesset." Eli Yishai said on Israel Radio: "There is nothing in Jewish law that says you can't have a woman as a Knesset member. But our rabbis decide what they decide on every subject and the same goes for this." When a group of ultra-Orthodox women created their own party, U'Bizchutan, Isaac Bezalel, the Shas spokesman, said: "The Haredi public is not yet open to women serving in the Knesset." Knesset members Eleven men serve as members of the Knesset for Shas in the twenty-fifth Knesset. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Data_Protection_Commissioner] | [TOKENS: 930] |
Data Protection Commissioner The Office of the Data Protection Commissioner (Irish: An Coimisinéir Cosanta Sonraí) (DPC), also known as Data Protection Commission, is the independent national authority responsible for upholding the EU fundamental right of individuals to data privacy through the enforcement and monitoring of compliance with data protection legislation in Ireland. It was established in 1989. Role and operations The independent role and powers of the Data Protection Commissioner are as set out in legislation in the Data Protection Acts 1988 and 2003. These Acts transpose the Council of Europe 1981 Data Protection Convention (Convention 108) and the 1995 EU Data Protection Directive (Directive 95/46/EC). However, the latter was then replaced by the EU General Data Protection Regulation (GDPR), which is directly applicable in Member States such as Ireland. Complaints received from individuals who feel that their personal information is not being treated in accordance with the data protection law are investigated under section 10 of the Data Protection Acts. It is the statutory obligation of the Office to seek to amicably resolve complaints in the first instance. Where an amicable resolution cannot be achieved, the Commissioner may make a decision on whether, in her opinion, there has been a breach of the law. If the complainant or the data controller disagrees with the Commissioner's finding, they have the right to appeal the decision to the Circuit Court. The DPC's main priority, if a complaint is upheld, is that the data controller complies with the law and puts right the matter concerned. If an organization does not voluntarily cooperate with an investigation, the DPC has powers of compulsion to require such cooperation. In 2015, the Office received 932 complaints that were opened for investigation. Investigations into 1,015 complaints were concluded. 
In 2018, Martin Meany, editor of Goosed.ie, filed a complaint to the DPC against the Diocese of Ossory stating he wished for his baptismal records to be deleted. The complaint started a subsequent "own volition enquiry" by the DPC into "whether the church's holding of personal data on baptisms and other Catholic sacraments that individuals may have taken falls under the EU's data protection law, the General Data Protection Regulation". In 2022, Meany launched High Court Judicial Review proceedings against the DPC. He claims the DPC has failed to complete an investigation into his complaint against the Catholic Church. In 2021, NOYB (None Of Your Business), an Austrian NGO founded by Max Schrems, filed a complaint against the DPC for corruption under Austrian law after the DPC demanded that the group sign a non-disclosure agreement in order to continue with their long-running complaint against Facebook. NOYB argued that the DPC could not demand favourable media coverage as the price of using its services. In January 2023, the DPC was forced to increase the fine issued to Meta Platforms after a review by the European Data Protection Board found that the initial fine was insufficient. The European Data Protection Board determined that the DPC had failed to perform its enforcement responsibility with "due diligence". Critics have pointed out that 7 out of 8 decisions handed down by the European Data Protection Board went against the Irish DPC, and that the DPC "always choose the most tortuous, lengthy and expensive legal route to a decision rather than a simple application of EU law". Section 10 (1A) of the Acts provides that "the Commissioner may carry out or cause to be carried out such investigations as he or she considers appropriate in order to ensure compliance with the provisions of this Act and to identify any contravention thereof." These investigations often take the form of audits of selected organizations. 
The aim of an audit is to identify any issues of concern about the way the organization under scrutiny manages personal data. In 2015, the DPC carried out 51 audits and inspections of organizations in the public and private sectors. Enforcement All breaches of the Privacy and Electronic Communications (EC Directive) Regulations 2003 for which the Office of the Data Protection Commissioner has responsibility are offences. The offences relate primarily to the sending of unsolicited marketing communications by electronic means. The offences are punishable by fines – up to €5,000 for each unsolicited message on summary conviction and up to €250,000 on conviction on indictment. The Office of the Data Protection Commissioner may bring summary proceedings for an offence under the Regulations. Enforcement responsibility is shared with the Commission for Communications Regulation (ComReg). |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Atomic_bombings_of_Hiroshima_and_Nagasaki] | [TOKENS: 21393] |
Atomic bombings of Hiroshima and Nagasaki On 6 and 9 August 1945, the United States detonated two atomic bombs over the Japanese cities of Hiroshima and Nagasaki, respectively, during World War II. The aerial bombings killed 150,000 to 246,000 people, most of whom were civilians, and remain the first and only uses of nuclear weapons in an armed conflict. Japan announced its surrender to the Allies on 15 August, six days after the bombing of Nagasaki and the Soviet Union's declaration of war against Japan and invasion of Manchuria. The Japanese government signed an instrument of surrender on 2 September, ending the war. In the final year of World War II, the Allies prepared for a costly invasion of the Japanese mainland. This undertaking was preceded by a conventional bombing and firebombing campaign that devastated 64 Japanese cities, including Tokyo. The war in Europe concluded when Germany surrendered on 8 May 1945, and the Allies turned their full attention to the Pacific War. By July 1945, the Allies' Manhattan Project had produced two types of atomic bombs: "Little Boy", an enriched uranium gun-type fission weapon, and "Fat Man", a plutonium implosion-type nuclear weapon. The 509th Composite Group of the U.S. Army Air Forces was trained and equipped with the specialized Silverplate version of the Boeing B-29 Superfortress, and deployed to Tinian in the Mariana Islands. The Allies called for the unconditional surrender of the Imperial Japanese Armed Forces in the Potsdam Declaration on 26 July 1945, the alternative being "prompt and utter destruction". The Japanese government ignored the ultimatum. The consent of the United Kingdom was obtained for the bombing, as was required by the Quebec Agreement, and orders were issued on 25 July by General Thomas T. Handy, the acting chief of staff of the U.S. Army, for atomic bombs to be used on Hiroshima, Kokura, Niigata, and Nagasaki. 
These targets were chosen because they were large urban areas that also held significant military facilities. On 6 August, a Little Boy was dropped on Hiroshima. Three days later, a Fat Man was dropped on Nagasaki. Over the next two to four months, the effects of the atomic bombings killed 90,000 to 166,000 people in Hiroshima and 60,000 to 80,000 people in Nagasaki; roughly half the deaths occurred on the first day. For months afterward, many people continued to die from the effects of burns, radiation sickness, and other injuries, compounded by illness and malnutrition. Despite Hiroshima's sizable military garrison, estimated at 24,000 troops, some 90% of the dead were civilians. Scholars have extensively studied the effects of the bombings on the social and political character of subsequent world history and popular culture, and there is still much debate concerning the ethical and legal justification for the bombings as well as their geopolitical ramifications, especially in the context of the Cold War. Supporters argue that the atomic bombings were necessary to bring an end to the war with minimal casualties and ultimately prevented a greater loss of life on both sides, and also assert that the demonstration of atomic weaponry helped create the Long Peace by instilling a fear of nuclear war. Conversely, critics argue that the bombings were unnecessary for the war's end and were a war crime, raising moral and ethical concerns, and also assert that future use of atomic weaponry is more likely than anticipated and could lead to a nuclear holocaust. Background The discovery of nuclear fission in 1938 made the development of an atomic bomb a theoretical possibility. Fears that a German atomic bomb project would develop atomic weapons first, especially among scientists who were refugees from Nazi Germany and other fascist countries, were expressed in the Einstein–Szilard letter to Roosevelt in 1939. 
This prompted preliminary research in the United States in late 1939. Progress was slow until the arrival of the British MAUD Committee report in late 1941, which indicated that only 5 to 10 kilograms of isotopically-pure uranium-235 were needed for a bomb instead of tons of natural uranium and a neutron moderator like heavy water. Consequently, the work was accelerated, first as a pilot program, and finally in the agreement by Roosevelt to turn the work over to the U.S. Army Corps of Engineers to construct the production facilities necessary to produce uranium-235 and plutonium-239. This work was consolidated within the newly created Manhattan Engineer District, which became better known as the Manhattan Project, eventually under the direction of Major General Leslie R. Groves, Jr.. The work of the Manhattan Project took place at dozens of sites across the United States, and even some outside of its borders. It would ultimately cost over US$2 billion (equivalent to about $28 billion in 2024) and employ over 125,000 people simultaneously at its peak. Groves appointed J. Robert Oppenheimer to organize and head the project's Los Alamos Laboratory in New Mexico, where bomb design work was carried out. Two different types of bombs were eventually developed: a gun-type fission weapon that used uranium-235, called Little Boy, and a more complex implosion-type nuclear weapon that used plutonium-239, called Fat Man. There was a Japanese nuclear weapon program, but it lacked the human, mineral, and financial resources of the Manhattan Project, and never made much progress towards developing an atomic bomb. In 1945, the Pacific War between the Empire of Japan and the Allies entered its fourth year. Most Japanese military units fought fiercely, ensuring that the Allied victory would come at an enormous cost. The 1.25 million battle casualties incurred in total by the United States in World War II included both military personnel killed in action and wounded in action. 
Nearly one million of the casualties occurred during the last year of the war, from June 1944 to June 1945. In December 1944, American battle casualties hit an all-time monthly high of 88,000 as a result of the German Ardennes Offensive. Worried by the losses sustained, President Franklin D. Roosevelt suggested the use of atomic bombs on Germany as soon as possible, but was informed the first usable atomic weapons were still months away. America's reserves of manpower were running out. Deferments for groups such as agricultural workers were tightened, and there was consideration of drafting women. At the same time, the public was becoming war-weary, and demanding that long-serving servicemen be sent home. In the Pacific, the Allies returned to the Philippines, recaptured Burma, and invaded Borneo. Offensives were undertaken to reduce the Japanese forces remaining in Bougainville, New Guinea and the Philippines. In April 1945, American forces landed on Okinawa, where heavy fighting continued until June. Along the way, the ratio of Japanese to American casualties dropped from five to one in the Philippines to two to one on Okinawa. Although some Japanese soldiers were taken prisoner, most fought until they were killed or committed suicide. Nearly 99 percent of the 21,000 defenders of Iwo Jima were killed. Of the 117,000 Okinawan and Japanese troops defending Okinawa in April to June 1945, 94 percent were killed; 7,401 Japanese soldiers surrendered, an unprecedentedly large number. As the Allies advanced towards Japan, conditions became steadily worse for the Japanese people. Japan's merchant fleet declined from 5,250,000 gross register tons in 1941 to 1,560,000 tons in March 1945, and 557,000 tons in August 1945. The lack of raw materials forced the Japanese war economy into a steep decline after the middle of 1944. The civilian economy, which had slowly deteriorated throughout the war, reached disastrous levels by the middle of 1945. 
The loss of shipping also affected the fishing fleet, and the 1945 catch was only 22 percent of that in 1941. The 1945 rice harvest was the worst since 1909, and hunger and malnutrition became widespread. U.S. industrial production was overwhelmingly superior to Japan's. By 1943, the U.S. produced almost 100,000 aircraft a year, compared to Japan's production of 70,000 for the entire war. In February 1945, Prince Fumimaro Konoe advised Emperor Hirohito that defeat was inevitable, and urged him to abdicate. Even before the surrender of Nazi Germany on 8 May 1945, plans were underway for the largest operation of the Pacific War, Operation Downfall, the Allied invasion of Japan. The operation had two parts: set to begin in October 1945, Operation Olympic involved a series of landings by the U.S. Sixth Army intended to capture the southern third of the southernmost main Japanese island, Kyūshū. This was to be followed in March 1946 by Operation Coronet, the capture of the Kantō Plain, near Tokyo on the main Japanese island of Honshu by the U.S. First, Eighth and Tenth Armies, as well as a Commonwealth Corps made up of Australian, British and Canadian divisions. The target date was chosen to allow for Olympic to complete its objectives, for troops to be redeployed from Europe, and the Japanese winter to pass. Japan's geography made this invasion plan obvious to the Japanese; they were able to predict the Allied invasion plans accurately and thus adjust their defensive plan, Operation Ketsugō, accordingly. The Japanese planned an all-out defense of Kyūshū, with little left in reserve. In all, there were 2.3 million Japanese Army troops prepared to defend the home islands, backed by a civilian militia of 28 million. Casualty predictions varied widely, but were extremely high. The Vice Chief of the Imperial Japanese Navy General Staff, Vice Admiral Takijirō Ōnishi, predicted up to 20 million Japanese deaths. 
The Americans were alarmed by the Japanese buildup, which was accurately tracked through Ultra intelligence. On 15 June 1945, a study by the Joint War Plans Committee, drawing on the experience of the Battle of Leyte, estimated that Downfall would result in 132,500 to 220,000 U.S. casualties, with U.S. dead and missing in the range from 27,500 to 50,000. Secretary of War Henry L. Stimson commissioned his own study by Quincy Wright and William Shockley, who estimated the invading Allies would suffer between 1.7 and 4 million casualties, of whom between 400,000 and 800,000 would be dead, while Japanese fatalities would have been around 5 to 10 million. In a meeting with the President and commanders on 18 June 1945, General George C. Marshall stated that "there was reason to believe" casualties for the first 30 days would not exceed the price paid for Luzon. Additionally, with the Japanese position rendered "hopeless" by an invasion of their mainland, Marshall speculated that Soviet entry into the war might be "the decisive action" needed to finally "[leverage] them into capitulation." Marshall began contemplating the use of a weapon that was "readily available and which assuredly can decrease the cost in American lives:" poison gas. Quantities of phosgene, mustard gas, tear gas and cyanogen chloride were moved to Luzon from stockpiles in Australia and New Guinea in preparation for Operation Olympic, and MacArthur ensured that Chemical Warfare Service units were trained in their use. Consideration was also given to using biological weapons. While the United States had developed plans for an air campaign against Japan prior to the Pacific War, the capture of Allied bases in the western Pacific in the first weeks of the conflict meant that this offensive did not begin until mid-1944 when the long-ranged Boeing B-29 Superfortress became ready for use in combat. 
Operation Matterhorn involved India-based B-29s staging through bases around Chengdu in China to make a series of raids on strategic targets in Japan. This effort failed to achieve the strategic objectives that its planners had intended, largely because of logistical problems, the bomber's mechanical difficulties, the vulnerability of Chinese staging bases, and the extreme range required to reach key Japanese cities. Brigadier General Haywood S. Hansell determined that Guam, Tinian, and Saipan in the Mariana Islands would better serve as B-29 bases, but they were in Japanese hands. Strategies were shifted to accommodate the air war, and the islands were captured between June and August 1944. Air bases were developed, and B-29 operations commenced from the Marianas in October 1944. The XXI Bomber Command began missions against Japan on 18 November 1944. The early attempts to bomb Japan from the Marianas proved just as ineffective as the China-based B-29s had been. Hansell continued the practice of conducting so-called high-altitude precision bombing, aimed at key industries and transportation networks, even after these tactics had not produced acceptable results. These efforts proved unsuccessful due to logistical difficulties with the remote location, technical problems with the new and advanced aircraft, unfavorable weather conditions, and enemy action. Hansell's successor, Major General Curtis LeMay, assumed command in January 1945 and initially continued to use the same precision bombing tactics, with equally unsatisfactory results. The attacks initially targeted key industrial facilities but much of the Japanese manufacturing process was carried out in small workshops and private homes. 
Under pressure from United States Army Air Forces (USAAF) headquarters in Washington, LeMay changed tactics and decided that low-level incendiary raids against Japanese cities were the only way to destroy their production capabilities, shifting from precision bombing to area bombardment with incendiaries. Like most strategic bombing during World War II, the aim of the air offensive against Japan was to destroy the enemy's war industries, kill or disable civilian employees of these industries, and undermine civilian morale. Over the next six months, the XXI Bomber Command under LeMay firebombed 64 Japanese cities. The firebombing of Tokyo, codenamed Operation Meetinghouse, on 9–10 March, killed an estimated 100,000 people and destroyed 41 km2 (16 sq mi) of the city and 267,000 buildings in a single night. It was the deadliest bombing raid of the war, at a cost of 20 B-29s shot down by flak and fighters. By May, 75 percent of bombs dropped were incendiaries designed to burn down Japan's "paper cities". By mid-June, Japan's six largest cities had been devastated. The end of the fighting on Okinawa that month provided airfields even closer to the Japanese mainland, allowing the bombing campaign to be further escalated. Aircraft flying from Allied aircraft carriers and the Ryukyu Islands also regularly struck targets in Japan during 1945 in preparation for Operation Downfall. Firebombing switched to smaller cities, with populations ranging from 60,000 to 350,000. According to Yuki Tanaka, the U.S. fire-bombed over a hundred Japanese towns and cities. The Japanese military was unable to stop the Allied attacks and the country's civil defense preparations proved inadequate. Japanese fighters and anti-aircraft guns had difficulty engaging bombers flying at high altitude. From April 1945, the Japanese interceptors also had to face American fighter escorts based on Iwo Jima and Okinawa. 
That month, the Imperial Japanese Army Air Service and Imperial Japanese Navy Air Service stopped attempting to intercept the air raids to preserve fighter aircraft to counter the expected invasion. By mid-1945 the Japanese only occasionally scrambled aircraft to intercept individual B-29s conducting reconnaissance sorties over the country, to conserve supplies of fuel. In July 1945, the Japanese had 138 thousand cubic metres (1,156,000 US barrels) of avgas stockpiled for the invasion of Japan. About 72 thousand cubic metres (604,000 US barrels) had been consumed in the home islands area in April, May and June 1945. While the Japanese military decided to resume attacks on Allied bombers from late June, by this time there were too few operational fighters available for this change of tactics to hinder the Allied air raids. Preparations The 509th Composite Group was constituted on 9 December 1944, and activated on 17 December 1944, at Wendover Army Air Field, Utah, commanded by Colonel Paul Tibbets. Tibbets was assigned to organize and command a combat group to develop the means of delivering an atomic weapon against targets in Germany and Japan. Because the flying squadrons of the group consisted of both bomber and transport aircraft, the group was designated as a "composite" rather than a "bombardment" unit. Due to its remoteness, Tibbets selected Wendover for his training base over Great Bend, Kansas and Mountain Home, Idaho. Each bombardier completed at least 50 practice drops of inert or conventional explosive pumpkin bombs, targeting islands around Tinian and later the Japanese home islands, until as late as 14 August 1945. Some of the missions over Japan were flown by single unescorted bombers with a single payload to accustom the Japanese to this pattern. They also simulated actual atomic bombing runs, including the directions of ingress and egress with respect to the wind. 
Tibbets himself was barred from flying most missions over Japan for fear that he might be captured and interrogated. On 5 April 1945, the code name Operation Centerboard was assigned. The officer responsible for its allocation in the War Department's Operations Division was not cleared to know any details of it. The first bombing was later codenamed Operation Centerboard I, and the second, Operation Centerboard II. The 509th Composite Group had an authorized strength of 225 officers and 1,542 enlisted men, almost all of whom eventually deployed to Tinian. In addition to its authorized strength, the 509th had attached to it on Tinian 51 civilian and military personnel from Project Alberta, known as the 1st Technical Detachment. The 509th Composite Group's 393rd Bombardment Squadron was equipped with 15 Silverplate B-29s. These aircraft were specially adapted to carry nuclear weapons, and were equipped with fuel-injected engines, Curtiss Electric reversible-pitch propellers, pneumatic actuators for rapid opening and closing of bomb bay doors and other improvements. The ground support echelon of the 509th Composite Group moved by rail on 26 April 1945, to its port of embarkation at Seattle, Washington. On 6 May the support elements sailed on the SS Cape Victory for the Marianas, while group materiel was shipped on the SS Emile Berliner. The Cape Victory made brief port calls at Honolulu and Eniwetok but the passengers were not permitted to leave the dock area. An advance party of the air echelon, consisting of 29 officers and 61 enlisted men, flew by C-54 to North Field on Tinian, between 15 and 22 May. There were also two representatives from Washington, D.C., Brigadier General Thomas Farrell, the deputy commander of the Manhattan Project, and Rear Admiral William R. Purnell of the Military Policy Committee, who were on hand to decide higher policy matters on the spot. Along with Captain William S. 
Parsons, the commander of Project Alberta, they became known as the "Tinian Joint Chiefs". In April 1945, Marshall asked Groves to nominate specific targets for bombing for final approval by himself and Stimson. Groves formed a Target Committee, chaired by himself, that included Farrell, Major John A. Derry, Colonel William P. Fisher, Joyce C. Stearns and David M. Dennison from the USAAF; and scientists John von Neumann, Robert R. Wilson and William Penney from the Manhattan Project. The Target Committee met in Washington on 27 April; at Los Alamos on 10 May, where it was able to talk to the scientists and technicians there; and finally in Washington on 28 May, where it was briefed by Tibbets and Commander Frederick Ashworth from Project Alberta, and the Manhattan Project's scientific advisor, Richard C. Tolman. The Target Committee nominated five targets: Kokura (now Kitakyushu), the site of one of Japan's largest munitions plants; Hiroshima, an embarkation port and industrial center that was the site of a major military headquarters; Yokohama, an urban center for aircraft manufacture, machine tools, docks, electrical equipment and oil refineries; Niigata, a port with industrial facilities including steel and aluminum plants and an oil refinery; and Kyoto, a major industrial center. The target selection was subject to several criteria. These cities were largely untouched during the nightly bombing raids, and the Army Air Forces agreed to leave them off the target list so accurate assessment of the damage caused by the atomic bombs could be made. Hiroshima was described as "an important army depot and port of embarkation in the middle of an urban industrial area. It is a good radar target and it is such a size that a large part of the city could be extensively damaged. There are adjacent hills which are likely to produce a focusing effect which would considerably increase the blast damage. Due to rivers it is not a good incendiary target." 
The Target Committee stated that "It was agreed that psychological factors in the target selection were of great importance. Two aspects of this are (1) obtaining the greatest psychological effect against Japan and (2) making the initial use sufficiently spectacular for the importance of the weapon to be internationally recognized when publicity on it is released. ... Kyoto has the advantage of the people being more highly intelligent and hence better able to appreciate the significance of the weapon. Hiroshima has the advantage of being such a size and with possible focussing from nearby mountains that a large fraction of the city may be destroyed. The Emperor's palace in Tokyo has a greater fame than any other target but is of least strategic value." Edwin O. Reischauer, a Japan expert for the U.S. Army Intelligence Service, was incorrectly said to have prevented the bombing of Kyoto. In his autobiography, Reischauer specifically refuted this claim: ... the only person deserving credit for saving Kyoto from destruction is Henry L. Stimson, the Secretary of War at the time, who had known and admired Kyoto ever since his honeymoon there several decades earlier. Extant sources show that while Stimson was personally familiar with Kyoto, this was the result of a visit decades after his marriage when he was governor-general of the Philippines, not because he honeymooned there. On 30 May, Stimson asked Groves to remove Kyoto from the target list due to its historical, religious and cultural significance, but Groves pointed to its military and industrial significance. Stimson then approached President Harry S. Truman about the matter. Truman agreed with Stimson, and Kyoto was temporarily removed from the target list. Groves attempted to restore Kyoto to the target list in July, but Stimson remained adamant. On 25 July, Nagasaki was put on the target list in place of Kyoto. 
It was a major military port, one of Japan's largest shipbuilding and repair centers, and an important producer of naval ordnance. In early May 1945, the Interim Committee was created by Stimson at the urging of leaders of the Manhattan Project and with the approval of Truman to advise on matters pertaining to nuclear technology. They agreed that the atomic bomb was to be used (1) against Japan at the earliest opportunity, (2) without special warning, and (3) on a "dual target" of military installation surrounded by other buildings susceptible to damage. During the meetings on 31 May and 1 June, scientist Ernest Lawrence had suggested giving the Japanese a non-combat demonstration. Arthur Compton later recalled that: It was evident that everyone would suspect trickery. If a bomb were exploded in Japan with previous notice, the Japanese air power was still adequate to give serious interference. An atomic bomb was an intricate device, still in the developmental stage. Its operation would be far from routine. If during the final adjustments of the bomb the Japanese defenders should attack, a faulty move might easily result in some kind of failure. Such an end to an advertised demonstration of power would be much worse than if the attempt had not been made. It was now evident that when the time came for the bombs to be used we should have only one of them available, followed afterwards by others at all-too-long intervals. We could not afford the chance that one of them might be a dud. If the test were made on some neutral territory, it was hard to believe that Japan's determined and fanatical military men would be impressed. If such an open test were made first and failed to bring surrender, the chance would be gone to give the shock of surprise that proved so effective. On the contrary, it would make the Japanese ready to interfere with an atomic attack if they could. 
Though the possibility of a demonstration that would not destroy human lives was attractive, no one could suggest a way in which it could be made so convincing that it would be likely to stop the war. The possibility of a demonstration was raised again in the Franck Report issued by physicist James Franck on 11 June and the Scientific Advisory Panel rejected his report on 16 June, saying that "we can propose no technical demonstration likely to bring an end to the war; we see no acceptable alternative to direct military use." Franck then took the report to Washington, D.C., where the Interim Committee met on 21 June to re-examine its earlier conclusions; but it reaffirmed that there was no alternative to the use of the bomb on a military target. Like Compton, many U.S. officials and scientists argued that a demonstration would sacrifice the shock value of the atomic attack, and the Japanese could deny the atomic bomb was lethal, making the mission less likely to produce surrender. Allied prisoners of war might be moved to the demonstration site and be killed by the bomb. They also worried that the bomb might be a failure, as the Trinity test was that of a stationary device, not an air-dropped bomb. In addition, although more bombs were in production, only two would be available at the start of August, and they cost billions of dollars, so using one for a demonstration would be expensive. For several months, the U.S. had warned civilians of potential air raids by dropping more than 63 million leaflets across Japan. Many Japanese cities suffered terrible damage from aerial bombings; some were as much as 97 percent destroyed. LeMay thought that leaflets would increase the psychological impact of bombing, and reduce the international stigma of area-bombing cities. Even with the warnings, Japanese opposition to the war remained ineffective. In general, the Japanese regarded the leaflet messages as truthful, with many Japanese choosing to leave major cities. 
The leaflets caused such concern that the government ordered the arrest of anyone caught in possession of a leaflet. Leaflet texts were prepared by recent Japanese prisoners of war because they were thought to be the best choice "to appeal to their compatriots". In preparation for dropping an atomic bomb on Hiroshima, the Oppenheimer-led Scientific Panel of the Interim Committee decided against a demonstration bomb and against a special leaflet warning. Those decisions were implemented because of the uncertainty of a successful detonation and also because of the wish to maximize shock in the leadership. No warning was given to Hiroshima that a new and much more destructive bomb was going to be dropped. Various sources gave conflicting information about when the last leaflets were dropped on Hiroshima prior to the atomic bomb. Robert Jay Lifton wrote that it was 27 July, and Theodore H. McNelly wrote that it was 30 July. The USAAF history noted that eleven cities were targeted with leaflets on 27 July, but Hiroshima was not one of them, and there were no leaflet sorties on 30 July. Leaflet sorties were undertaken on 1 and 4 August. Hiroshima may have been leafleted in late July or early August, as survivor accounts talk about a delivery of leaflets a few days before the atomic bomb was dropped. Three versions of a leaflet were printed, each listing 11 or 12 of a total of 33 cities targeted for firebombing. The leaflet's Japanese text read in part "... we cannot promise that only these cities will be among those attacked ..."; Hiroshima was not listed. In 1943, the United States and the United Kingdom signed the Quebec Agreement, which stipulated that nuclear weapons would not be used against another country without mutual consent. Stimson therefore had to obtain British permission. A meeting of the Combined Policy Committee, which included one Canadian representative, was held at the Pentagon on 4 July 1945.
Field Marshal Sir Henry Maitland Wilson announced that the British government concurred with the use of nuclear weapons against Japan, which would be officially recorded as a decision of the Combined Policy Committee. As the release of information to third parties was also controlled by the Quebec Agreement, discussion then turned to what scientific details would be revealed in the press announcement of the bombing. The meeting also considered what Truman could reveal to Joseph Stalin, the leader of the Soviet Union, at the upcoming Potsdam Conference, as this also required British concurrence. Orders for the attack were issued to General Carl Spaatz on 25 July under the signature of General Thomas T. Handy, the acting chief of staff, since Marshall was at the Potsdam Conference with Truman. That day, Truman noted in his diary: This weapon is to be used against Japan between now and August 10th. I have told the Sec. of War, Mr. Stimson, to use it so that military objectives and soldiers and sailors are the target and not women and children. Even if the Japs are savages, ruthless, merciless and fanatic, we as the leader of the world for the common welfare cannot drop that terrible bomb on the old capital [Kyoto] or the new [Tokyo]. He and I are in accord. The target will be a purely military one. The 16 July success of the Trinity test in the New Mexico desert exceeded expectations. On 26 July, Allied leaders issued the Potsdam Declaration, which outlined the terms of surrender for Japan. The declaration was presented as an ultimatum and stated that without a surrender, the Allies would attack Japan, resulting in "the inevitable and complete destruction of the Japanese armed forces and just as inevitably the utter devastation of the Japanese homeland". The atomic bomb was not mentioned in the communiqué. On 28 July, Japanese papers reported that the declaration had been rejected by the Japanese government.
That afternoon, Prime Minister Kantarō Suzuki declared at a press conference that the Potsdam Declaration was no more than a rehash (yakinaoshi) of the Cairo Declaration, that the government intended to ignore it (mokusatsu, "kill by silence"), and that Japan would fight to the end. The statement was taken by both Japanese and foreign papers as a clear rejection of the declaration. Emperor Hirohito, who was waiting for a Soviet reply to non-committal Japanese peace feelers, made no move to change the government position. Japan's willingness to surrender remained conditional on the preservation of the kokutai (Imperial institution and national polity), assumption by the Imperial Headquarters of responsibility for disarmament and demobilization, no occupation of the Japanese Home Islands, Korea or Formosa, and delegation of the punishment of war criminals to the Japanese government. At Potsdam, Truman agreed to a request from Winston Churchill that Britain be represented when the atomic bomb was dropped. William Penney and Group Captain Leonard Cheshire were sent to Tinian, but LeMay would not let them accompany the Hiroshima mission. They would subsequently take part in the Nagasaki mission as observers. The Little Boy bomb, except for the uranium payload, was ready at the beginning of May 1945. There were two uranium-235 components, a hollow cylindrical projectile and a cylindrical target insert. The projectile was completed on 15 June, and the target insert on 24 July. The projectile and eight bomb pre-assemblies (partly assembled bombs without the powder charge and fissile components) left Hunters Point Naval Shipyard, California, on 16 July aboard the cruiser USS Indianapolis, and arrived on Tinian on 26 July. The target insert followed by air on 30 July, accompanied by Commander Francis Birch from Project Alberta. 
The bomb was later described by physicist Harold Agnew, who flew in an accompanying aircraft, as "completely unsafe". Arming the bomb, an exceptionally delicate task, would normally be done on the ground before takeoff. Responding to concerns expressed by the 509th Composite Group about the possibility of a B-29 crashing on takeoff with an armed bomb on board, Birch modified the Little Boy design to incorporate a removable breech plug that would permit the bomb to be armed in flight. The first plutonium core, along with its polonium-beryllium urchin initiator, was transported in the custody of Project Alberta courier Raemer Schreiber in a magnesium field carrying case designed for the purpose by Philip Morrison. Magnesium was chosen because it does not act as a neutron reflector. The core departed from Kirtland Army Air Field on a C-54 transport aircraft of the 509th Composite Group's 320th Troop Carrier Squadron on 26 July, and arrived at North Field on 28 July. Three Fat Man high-explosive pre-assemblies, designated F31, F32, and F33, were picked up at Kirtland on 28 July by three B-29s, two from the 393rd Bombardment Squadron plus one from the 216th Army Air Force Base Unit, and transported to North Field, arriving on 2 August. Hiroshima At the time of its bombing, Hiroshima was a city of industrial and military significance. A number of military units were located nearby, the most important of which was the headquarters of Field Marshal Shunroku Hata's Second General Army, which commanded the defense of all of southern Japan, and was located in Hiroshima Castle. Hata's command consisted of some 400,000 men, most of whom were on Kyushu where an Allied invasion was correctly anticipated. Also present in Hiroshima were the headquarters of the 59th Army, the 5th Division and the 224th Division, a recently formed mobile unit.
The city was defended by five batteries of 70 mm and 80 mm (2.8 and 3.1 inch) anti-aircraft guns of the 3rd Anti-Aircraft Division, including units from the 121st and 122nd Anti-Aircraft Regiments and the 22nd and 45th Separate Anti-Aircraft Battalions. In total, an estimated 40,000 Japanese military personnel were stationed in the city. Hiroshima was a supply and logistics base for the Japanese military. The city was a communications center, a key port for shipping, and an assembly area for troops. It supported a large war industry, manufacturing parts for planes and boats, for bombs, rifles, and handguns. The center of the city contained several reinforced concrete buildings. Outside the center, the area was congested by a dense collection of small timber workshops set among Japanese houses. A few larger industrial plants lay near the outskirts of the city. The houses were constructed of timber with tile roofs, and many of the industrial buildings were also built around timber frames. The city as a whole was highly susceptible to fire damage. It was the second largest city in Japan after Kyoto that was still undamaged by air raids, primarily because it lacked the aircraft manufacturing industry that was the XXI Bomber Command's priority target. On 3 July, the Joint Chiefs of Staff placed it off limits to bombers, along with Kokura, Niigata and Kyoto. The population of Hiroshima had reached a peak of over 381,000 earlier in the war but prior to the atomic bombing, the population had steadily decreased because of a systematic evacuation ordered by the Japanese government. At the time of the attack, the population was approximately 340,000–350,000. Residents wondered why Hiroshima had been spared destruction by firebombing. Some speculated that the city was to be saved for U.S. occupation headquarters, others thought perhaps their relatives in Hawaii and California had petitioned the U.S. government to avoid bombing Hiroshima. 
More realistic city officials had ordered buildings torn down to create long, straight firebreaks. These continued to be expanded and extended up to the morning of 6 August 1945. Hiroshima was the primary target of the first atomic bombing mission on 6 August, with Kokura and Nagasaki as alternative targets. The 393rd Bombardment Squadron B-29 Enola Gay, named after Tibbets's mother and piloted by Tibbets, took off from North Field, Tinian, about six hours' flight time from Japan, at 02:45 local time. Enola Gay was accompanied by two other B-29s: The Great Artiste, commanded by Major Charles Sweeney, which carried instrumentation, and a then-nameless aircraft later called Necessary Evil, commanded by Captain George Marquardt. Necessary Evil was the photography aircraft. After leaving Tinian, the aircraft made their way separately to Iwo Jima to rendezvous with Sweeney and Marquardt at 05:55 at 2,800 meters (9,200 ft), and set course for Japan. The aircraft arrived over Hiroshima in clear visibility at 9,470 meters (31,060 ft). Parsons, who was in command of the mission, armed the bomb in flight to minimize the risks during takeoff. He had witnessed four B-29s crash and burn at takeoff, and feared that a nuclear explosion would occur if a B-29 crashed with an armed Little Boy on board. His assistant, Second Lieutenant Morris R. Jeppson, removed the safety devices 30 minutes before reaching the target area. During the night of 5–6 August, Japanese early warning radar detected the approach of numerous American aircraft headed for the southern part of Japan. Radar detected 65 bombers headed for Saga, 102 bound for Maebashi, 261 en route to Nishinomiya, 111 headed for Ube and 66 bound for Imabari. An alert was given and radio broadcasting stopped in many cities, among them Hiroshima. The all-clear was sounded in Hiroshima at 00:05. About an hour before the bombing, the air raid alert was sounded again, as Straight Flush flew over the city. 
It broadcast a short message, which was picked up by Enola Gay. It read: "Cloud cover less than 3/10th at all altitudes. Advice: bomb primary." The all-clear was sounded over Hiroshima again at 07:09. At 08:09, Tibbets started his bomb run and handed control over to his bombardier, Major Thomas Ferebee. The release at 08:15 (Hiroshima time) went as planned, and the Little Boy containing about 64 kg (141 lb) of uranium-235 took 44.4 seconds to fall from the aircraft flying at about 9,400 meters (31,000 ft) to a detonation height of about 580 meters (1,900 ft) above the city. Enola Gay was 18.5 km (11.5 mi) away before it felt the shock waves from the blast. Due to crosswind, the bomb missed the aiming point, the Aioi Bridge, by approximately 240 m (800 ft) and detonated directly over Shima Surgical Clinic. It released energy equivalent to 16 ± 2 kilotons of TNT (66.9 ± 8.4 TJ)—four times the tonnage of conventional bombs that had wiped out the city of Dresden in Germany. The weapon was very inefficient, with only 1.7 percent of its material fissioning. The radius of total destruction was about 1.6 kilometers (1 mi), with resulting fires across 11 km2 (4.4 sq mi). Enola Gay stayed over the target area for two minutes and was 16 kilometers (10 mi) away when the bomb detonated. Only Tibbets, Parsons, and Ferebee knew of the nature of the weapon; the others on the bomber were only told to expect a blinding flash and given protective goggles. "It was hard to believe what we saw", Tibbets told reporters, while Parsons said "the whole thing was tremendous and awe-inspiring ... the men aboard with me gasped 'My God'." He and Tibbets compared the shockwave to "a close burst of ack-ack fire". Necessary Evil's cameras all failed; the only film of the attack was made by a 16 mm cine camera smuggled onto the aircraft by a crew member. 
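The yield and fall-time figures quoted above can be sanity-checked with elementary physics. The sketch below is purely illustrative: it assumes the standard convention of 4.184 TJ per kiloton of TNT and a drag-free fall, not the method used in any actual yield analysis; air resistance lengthens the real fall somewhat, which is consistent with the reported 44.4 seconds.

```python
import math

KT_TO_TJ = 4.184  # standard convention: 1 kiloton of TNT = 4.184 terajoules

def kilotons_to_terajoules(kt: float) -> float:
    """Convert an explosive yield in kilotons of TNT to terajoules."""
    return kt * KT_TO_TJ

# Little Boy's 16 kt yield corresponds to roughly 66.9 TJ, matching the text.
print(f"16 kt = {kilotons_to_terajoules(16):.1f} TJ")  # 66.9 TJ

# Drag-free fall time from release altitude (~9,400 m) to the detonation
# height (~580 m); the true fall with drag was reported as 44.4 seconds.
g = 9.81                # gravitational acceleration, m/s^2
drop = 9400 - 580       # metres fallen before detonation
t = math.sqrt(2 * drop / g)
print(f"drag-free fall time: {t:.1f} s")  # 42.4 s, slightly under 44.4 s
```

The same conversion reproduces the Nagasaki figure given later (21 kt ≈ 87.9 TJ).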
Enola Gay's crew received a heroes' welcome on landing at Tinian, with hundreds of cheering welcomers and "more generals than I'd ever seen in my life. We wondered what the hell they were doing there" according to navigator Theodore "Dutch" Van Kirk. When Tibbets stepped from his B-29 the Distinguished Service Cross was unexpectedly pinned to his chest while he was still holding his pipe. People on the ground reported a pika (ピカ)—a brilliant flash of light—followed by a don (ドン)—a loud booming sound. The experiences of survivors in the city varied depending on their location and circumstances, but a common factor in survivor accounts was a sense that a conventional weapon (sometimes cited as a magnesium bomb, which has a distinctively bright white flash) had happened to go off immediately in their vicinity, causing tremendous damage (throwing people across rooms, breaking glass, crushing buildings). After emerging from the ruins, the survivors gradually understood that the entire city had been attacked at the same instant. Survivor accounts frequently feature walking through the ruins of the city without a clear sense of where to go, and encountering the cries of people trapped within crushed structures, or people with horrific burns. As the numerous small fires created by the blast began to grow, they merged into a firestorm that moved quickly throughout the ruins, killing many who had been trapped, and causing people to jump into Hiroshima's rivers in search of sanctuary (many of whom drowned). The photographer Yoshito Matsushige took the only photographs of Hiroshima immediately after the bombing. He described in a later interview that, immediately after the bombing, "everywhere there was dust; it made a grayish darkness over everything." He took five photographs in total before he could not continue: "It was really a terrible scene. It was just like something out of hell." 
Survivor accounts also prominently feature cases of survivors who appeared uninjured, but who would succumb within hours or days to what would later be identified as radiation sickness. Estimating the number of people killed by the blast, firestorm, and radiation effects of the bombing has been hampered by imprecise record-keeping during the war, the chaos caused by the attack, uncertainty about the number of people in the city on the morning of the attack, and variations in methodology. Reports by the Manhattan Project in 1946 and the U.S. occupation–led Joint Commission for the Investigation of the Atomic Bomb in Japan in 1951 estimated 66,000 dead and 69,000 injured, and 64,500 dead and 72,000 injured, respectively, while Japanese-led reconsiderations of the death toll in the 1970s estimated 140,000 dead in Hiroshima by the end of the year. Estimates also vary on the number of Japanese military personnel killed. The United States Strategic Bombing Survey estimated in 1946 that there were 24,158 soldiers present in Hiroshima at the time of the attack, and that 6,789 were killed or missing as a result; the 1970s reconsiderations estimated about 10,000 military dead. A modern estimate by the Radiation Effects Research Foundation (RERF) estimates a city population of 340,000 to 350,000 at the time of the bombing, of which 90,000 to 166,000 died by the end of the year. U.S. surveys estimated that 12 km2 (4.7 sq mi) of the city were destroyed. Japanese officials determined that 69 percent of Hiroshima's buildings were destroyed and another 6 to 7 percent damaged. Some of the reinforced concrete buildings in Hiroshima had been very strongly constructed because of the earthquake danger in Japan, and their framework did not collapse even though they were fairly close to the blast center. 
Since the bomb detonated in the air, the blast was directed more downward than sideways, which was largely responsible for the survival of the Prefectural Industrial Promotional Hall, now commonly known as the Genbaku (A-bomb) dome, which was only 150 m (490 ft) from ground zero (the hypocenter). The ruin was named Hiroshima Peace Memorial and was made a UNESCO World Heritage Site in 1996 over the objections of the United States and China, which expressed reservations on the grounds that other Asian nations were the ones who suffered the greatest loss of life and property, and a focus on Japan lacked historical perspective. The air raid warning had been cleared at 07:31, and many people were outside, going about their activities. Eizō Nomura was the closest known survivor, being in the basement of a reinforced concrete building (it remained as the Rest House after the war) only 170 meters (560 ft) from ground zero at the time of the attack. He died in 1982, aged 84. Akiko Takakura was among the closest survivors to the hypocenter of the blast. She was in the solidly built Bank of Hiroshima only 300 meters (980 ft) from ground zero at the time of the attack. Over 90 percent of the doctors and 93 percent of the nurses in Hiroshima were killed or injured—most had been in the downtown area which received the greatest damage. The hospitals were destroyed or heavily damaged. Only one doctor, Terufumi Sasaki, remained on duty at the Red Cross Hospital. Nonetheless, by early afternoon the police and volunteers had established evacuation centers at hospitals, schools, and tram stations, and a morgue was established in the Asano library. Survivors of the blast gathered for medical treatment, but many would die before receiving any help, leaving behind rings of corpses around hospitals. Most elements of the Japanese Second General Army headquarters were undergoing physical training on the grounds of Hiroshima Castle, barely 820 meters (900 yd) from the hypocenter.
The attack killed 3,243 troops on the parade ground. The communications room of Chugoku Military District Headquarters that was responsible for issuing and lifting air raid warnings was located in a semi-basement in the castle. Yoshie Oka, a Hijiyama Girls High School student who had been mobilized to serve as a communications officer, had just sent a message that the alarm had been issued for Hiroshima and neighboring Yamaguchi when the bomb exploded. She used a special phone to inform Fukuyama Headquarters (some 100 kilometers (62 mi) away) that "Hiroshima has been attacked by a new type of bomb. The city is in a state of near-total destruction." Since Mayor Senkichi Awaya had been killed at the mayoral residence, Field Marshal Shunroku Hata, who was only slightly wounded, took over the administration of the city and coordinated relief efforts. Many of his staff had been killed or fatally wounded, including Lieutenant Colonel Yi U, a prince of the Korean imperial family who was serving as a General Staff Officer. Hata's senior surviving staff officer was the wounded Colonel Kumao Imoto, who acted as his chief of staff. Soldiers from the undamaged Hiroshima Ujina Harbor used Shin'yō-class suicide motorboats, intended to repel the American invasion, to collect the wounded and take them down the rivers to the military hospital at Ujina. Trucks and trains brought in relief supplies and evacuated survivors from the city. Twelve American airmen were imprisoned at the Chugoku Military Police Headquarters, about 400 meters (1,300 ft) from the hypocenter of the blast. Most died instantly, although two were reported to have been executed by their captors, and two prisoners badly injured by the bombing were left next to the Aioi Bridge by the Kempeitai, where they were stoned to death. Eight U.S. 
prisoners of war killed as part of the medical experiments program at Kyushu University were falsely reported by Japanese authorities as having been killed in the atomic blast as part of an attempted cover up. The fires created by the atomic bomb detonation carried large amounts of ash into the clouds in the atmosphere. One to two hours after the explosion, a "black rain" fell as a tarry combination of ash, radioactive fallout, and water, causing severe radiation burns in some cases. The Tokyo control operator of the Japan Broadcasting Corporation noticed that the Hiroshima station had gone off the air. He tried to re-establish his program by using another telephone line, but it too had failed. About 20 minutes later the Tokyo railroad telegraph center realized that the main line telegraph had stopped working just north of Hiroshima. From some small railway stops within 16 km (10 mi) of the city came unofficial and confused reports of a terrible explosion in Hiroshima. All these reports were transmitted to the headquarters of the Imperial Japanese Army General Staff. Military bases repeatedly tried to call the Army Control Station in Hiroshima. The complete silence from that city puzzled the General Staff; they knew that no large enemy raid had occurred and that no sizable store of explosives was in Hiroshima at that time. A young officer was instructed to fly immediately to Hiroshima, to land, survey the damage, and return to Tokyo with reliable information for the staff. It was felt that nothing serious had taken place and that the explosion was just a rumor. The staff officer went to the airport and took off for the southwest. After flying for about three hours, while still nearly 160 km (100 mi) from Hiroshima, he and his pilot saw a great cloud of smoke from the firestorm created by the bomb. After circling the city to survey the damage they landed south of the city, where the staff officer, after reporting to Tokyo, began to organize relief measures. 
Tokyo learned that the city had been destroyed by a new type of bomb from President Truman's announcement of the strike, sixteen hours later. The official White House announcement was delayed by congestion on circuits from Tinian, which meant that Groves could not initially confirm the extent of the damage to Hiroshima. Events of 7–9 August Sixteen hours after the Hiroshima bombing, a statement was issued by the White House staff in Truman's name announcing the use of the new weapon. It said that the United States and its allies had "spent two billion dollars on the greatest scientific gamble in history—and won," and warned Japan: "If they do not now accept our terms, they may expect a rain of ruin from the air, the like of which has never been seen on this earth. Behind this air attack will follow sea and land forces in such numbers and power as they have not yet seen and with the fighting skill of which they are already well aware." This was a widely broadcast speech picked up by Japanese news agencies. The 50,000-watt standard wave station on Saipan, the OWI radio station, broadcast a similar message to Japan every 15 minutes about Hiroshima, stating that more Japanese cities would face a similar fate in the absence of immediate acceptance of the terms of the Potsdam Declaration, and emphatically urged civilians to evacuate major cities. Radio Japan, which continued to extol victory for Japan and vowed never to surrender, informed the Japanese of the destruction of Hiroshima by a single bomb.
Soviet Foreign Minister Vyacheslav Molotov had informed Tokyo of the Soviet Union's unilateral abrogation of the Soviet–Japanese Neutrality Pact on 5 April. At two minutes past midnight on 9 August, Tokyo time, Soviet infantry, armor, and air forces launched the Manchurian Strategic Offensive Operation. Four hours later, word reached Tokyo of the Soviet Union's official declaration of war. The senior leadership of the Japanese Army began preparations to impose martial law on the nation, with the support of Minister of War Korechika Anami, to stop anyone attempting to make peace. On 7 August, a day after Hiroshima was destroyed, Yoshio Nishina and other atomic physicists arrived at the city, and carefully examined the damage. They then went back to Tokyo and told the cabinet that Hiroshima was indeed destroyed by a nuclear weapon. Admiral Soemu Toyoda, the Chief of the Naval General Staff, estimated that no more than one or two additional bombs could be readied, so the cabinet decided to endure the remaining attacks, acknowledging "there would be more destruction but the war would go on". American Magic codebreakers intercepted the cabinet's messages. Purnell, Parsons, Tibbets, Spaatz, and LeMay met on Guam that same day to discuss what should be done next. Since there was no indication of Japan surrendering, they decided to proceed with dropping another bomb. Parsons said that Project Alberta would have it ready by 11 August, but Tibbets pointed to weather reports indicating poor flying conditions on that day due to a storm, and asked if the bomb could be readied by 9 August. Parsons agreed to try to do so. Nagasaki The city of Nagasaki had been one of the largest seaports in southern Japan, and was of great wartime importance because of its wide-ranging industrial activity, including the production of ordnance, ships, military equipment, and other war materials.
The four largest companies in the city were Mitsubishi Shipyards, Electrical Shipyards, Arms Plant, and Steel and Arms Works, which employed about 90 percent of the city's labor force, and accounted for 90 percent of the city's industry. Although an important industrial city, Nagasaki had been spared from firebombing because its geography made it difficult to locate at night with AN/APQ-13 radar. Unlike the other target cities, Nagasaki had not been placed off limits to bombers by the Joint Chiefs of Staff's 3 July directive, and was bombed on a small scale five times. During one of these raids on 1 August, a number of conventional high-explosive bombs were dropped on the city. A few hit the shipyards and dock areas in the southwest portion of the city, and several hit the Mitsubishi Steel and Arms Works. By early August, the city was defended by the 134th Anti-Aircraft Regiment of the 4th Anti-Aircraft Division with four batteries of 7 cm (2.8 in) anti-aircraft guns and two searchlight batteries. In contrast to Hiroshima, almost all of the buildings were of old-fashioned Japanese construction, consisting of timber or timber-framed buildings with timber walls (with or without plaster) and tile roofs. Many of the smaller industries and business establishments were also situated in buildings of timber or other materials not designed to withstand explosions. Nagasaki had been permitted to grow for many years without conforming to any definite city zoning plan; residences were erected adjacent to factory buildings and to each other almost as closely as possible throughout the entire industrial valley. On the day of the bombing, an estimated 263,000 people were in Nagasaki, including 240,000 Japanese residents, 10,000 Korean residents, 2,500 conscripted Korean workers, 9,000 Japanese soldiers, 600 conscripted Chinese workers, and 400 Allied prisoners of war in a camp to the north of Nagasaki. 
According to a crew member, Hiroshima had been "the perfect mission" where everything went right, whereas in the Nagasaki mission, almost everything would go wrong. Responsibility for the timing of the second bombing was delegated to Tibbets. Scheduled for 11 August, the raid was moved earlier by two days to avoid a five-day period of bad weather forecast to begin on 10 August. Three bomb pre-assemblies had been transported to Tinian, labeled F-31, F-32, and F-33 on their exteriors. On 8 August, a dress rehearsal was conducted off Tinian by Sweeney using Bockscar as the drop airplane. Assembly F-33 was expended testing the components and F-31 was designated for the 9 August mission. At 03:47 Tinian time (GMT+10), 02:47 Japanese time, on the morning of 9 August 1945, Bockscar, flown by Sweeney's crew, lifted off from Tinian island with the Fat Man, with Kokura as the primary target and Nagasaki the secondary target. The mission plan for the second attack was nearly identical to that of the Hiroshima mission, with two B-29s flying an hour ahead as weather scouts and two additional B-29s in Sweeney's flight for instrumentation and photographic support of the mission. Sweeney took off with his weapon partially armed, but with the electrical safety plugs still engaged; arming was completed a few minutes after takeoff. During pre-flight inspection of Bockscar, the flight engineer notified Sweeney that an inoperative fuel transfer pump made it impossible to use 2,400 liters (640 U.S. gal) of fuel carried in a reserve tank. This fuel would still have to be carried all the way to Japan and back, consuming still more fuel. Replacing the pump would take hours; moving the Fat Man to another aircraft might take just as long and was dangerous as well, as the bomb was live. Tibbets and Sweeney therefore elected to have Bockscar continue the mission as the reserve fuel was not expected to be needed. 
This time Penney and Cheshire were allowed to accompany the mission, flying as observers on the third plane, Big Stink, flown by the group's operations officer, Major James I. Hopkins, Jr. Observers aboard the weather planes reported both targets clear. The aircraft ran into thunderstorms with the fully armed bomb on board. Unexpectedly, a white light on the bomb control panel, which normally illuminates when the bomb is about to be dropped, came on, making the crew fear the bomb might detonate; the fault was traced to a misplaced switch. When Sweeney's aircraft arrived at the assembly point for his flight off the coast of Japan, Big Stink failed to make the rendezvous. According to Cheshire, Hopkins was at varying heights including 2,700 meters (9,000 ft) higher than he should have been, and was not flying tight circles over Yakushima as previously agreed with Sweeney and Captain Frederick C. Bock, who was piloting the support B-29 The Great Artiste. Instead, Hopkins was flying 64-kilometer (40 mi) dogleg patterns. Though ordered not to circle longer than fifteen minutes, Sweeney continued to wait for Big Stink for forty minutes. Before leaving the rendezvous point, Sweeney consulted Ashworth, who was in charge of the bomb. As commander of the aircraft, Sweeney made the decision to proceed to the primary, the city of Kokura. After exceeding the original departure time limit by nearly a half-hour, Bockscar, accompanied by The Great Artiste, proceeded to Kokura, thirty minutes away. The delay at the rendezvous had resulted in clouds and drifting smoke over Kokura from fires started by a major firebombing raid by 224 B-29s on nearby Yahata the previous day. Additionally, the Yahata Steel Works intentionally burned coal tar to produce black smoke. The clouds and smoke resulted in 70 percent of the area over Kokura being covered, obscuring the aiming point.
Three bomb runs were made over the next 50 minutes, burning fuel and exposing the aircraft repeatedly to the heavy defenses around Kokura, but the bombardier was unable to drop visually. By the time of the third bomb run, Japanese anti-aircraft fire was getting close, and Second Lieutenant Jacob Beser, who was monitoring Japanese communications, reported activity on the Japanese fighter direction radio bands. With fuel running low because of the delays and the bad weather, and with reserve fuel unavailable due to the failed pump, Bockscar and The Great Artiste headed for their secondary target, Nagasaki. Fuel consumption calculations made en route indicated that Bockscar had insufficient fuel to reach Iwo Jima and would be forced to divert to Okinawa, which had become entirely Allied-occupied territory only six weeks earlier. After initially deciding that if Nagasaki were obscured on their arrival the crew would carry the bomb to Okinawa and dispose of it in the ocean if necessary, Ashworth agreed with Sweeney's suggestion that a radar approach would be used if the target was obscured. At about 07:50 Japanese time (GMT+9), an air raid alert was sounded in Nagasaki, but the "all clear" signal was given at 08:30. When only two B-29 Superfortresses were sighted at 10:53, no further alarm was given; the Japanese apparently assumed that the planes were only conducting reconnaissance. A few minutes later, at 11:00 Japanese time, The Great Artiste dropped instruments attached to three parachutes. These instruments also contained an unsigned letter to Ryokichi Sagane, a physicist at the University of Tokyo who had studied with three of the scientists responsible for the atomic bomb at the University of California, Berkeley, urging him to tell the public about the danger involved with these weapons of mass destruction. The messages were found by military authorities but not turned over to Sagane until a month later.
In 1949, one of the authors of the letter, Luis Alvarez, met with Sagane and signed the letter. At 11:01, a last-minute break in the clouds over Nagasaki allowed Bockscar's bombardier, Captain Kermit Beahan, to visually sight the target as ordered. The Fat Man weapon, containing a core of about 5 kg (11 lb) of plutonium, was dropped over the city's industrial valley. It exploded 47 seconds later at 11:02 at 503 ± 10 m (1,650 ± 33 ft), above a tennis court, halfway between the Mitsubishi Steel and Arms Works in the south and the Nagasaki Arsenal in the north, nearly 3 km (1.9 mi) northwest of the planned hypocenter. By a bizarre coincidence, Fat Man detonated almost directly over the factory that had made the torpedoes used in the Japanese attack on Pearl Harbor. The blast was confined to the Urakami Valley, and a major portion of the city was protected by the intervening hills. The resulting explosion released the equivalent energy of 21 ± 2 kt (87.9 ± 8.4 TJ). Big Stink spotted the explosion from 160 kilometers (100 mi) away, and flew over to observe. Bockscar flew on to Okinawa, arriving unannounced with only sufficient fuel for a single approach. Sweeney tried repeatedly to contact the control tower for landing clearance, but received no answer. He could see heavy air traffic landing and taking off from Yontan Airfield. The number two engine died from fuel starvation as he began the final approach. Firing off every flare on board to alert the field to his emergency landing, Bockscar came in fast, landing at 230 km/h (140 mph) instead of the normal 190 km/h (120 mph). Touching down on only three engines midway down the landing strip, Bockscar bounced up into the air again for about 7.6 meters (25 ft) before slamming back down hard. The heavy B-29 slewed left and towards a row of parked B-24 bombers before the pilots managed to regain control.
Its reversible propellers were insufficient to slow the aircraft adequately, and with both pilots standing on the brakes, Bockscar made a swerving 90-degree turn at the end of the runway to avoid running off it. A second engine died from fuel exhaustion before the plane came to a stop. Unlike the heroes' welcome for Enola Gay, there was nobody to greet them; nobody knew they were coming. Following the mission, there was confusion over the identification of the plane. The first eyewitness account by war correspondent William L. Laurence of The New York Times, who accompanied the mission aboard the aircraft piloted by Bock, reported that Sweeney was leading the mission in The Great Artiste. He also noted its "Victor" number as 77, which was that of Bockscar. Laurence had interviewed Sweeney and his crew, and was aware that they referred to their airplane as The Great Artiste. Except for Enola Gay, none of the 393rd's B-29s had yet had names painted on the noses, a fact which Laurence himself noted in his account. Unaware of the switch in aircraft, Laurence assumed Victor 77 was The Great Artiste, which was, in fact, Victor 89. Although the bomb was more powerful than the one used on Hiroshima, its effects were confined by hillsides to the narrow Urakami Valley. Of 7,500 Japanese employees who worked inside the Mitsubishi Munitions plant, including "mobilized" students and regular workers, 6,200 were killed. Some 17,000–22,000 others who worked in other war plants and factories in the city died as well. The 1946 Manhattan Project report estimated 39,000 dead and 25,000 injured, and the 1951 U.S.-led Joint Commission report estimated 39,214 dead and 25,153 injured; Japanese-led reconsiderations in the 1970s estimated 70,000 dead in Nagasaki by the end of the year. A modern estimate by the Radiation Effects Research Foundation (RERF) puts the city's population at 250,000 to 270,000 at the time of the bombing, of which 60,000 to 80,000 died by the end of the year.
Unlike at Hiroshima, the military death toll was small: only 150 Japanese soldiers were killed instantly, including 36 from the 134th AAA Regiment of the 4th AAA Division. At least eight Allied prisoners of war (POWs)—British Royal Air Force Corporal Ronald Shaw, and seven Dutch POWs—were killed, and as many as thirteen may have died. American POW Joe Kieyoomia survived, reportedly having been shielded from the effects of the bomb by the concrete walls of his cell. The 24 Australian POWs in Nagasaki survived. The radius of total destruction was about 1.6 km (1 mi), followed by fires across the northern portion of the city to 3.2 km (2 mi) south of the bomb. The Nagasaki Arsenal was destroyed in the blast. About 58 percent of the Mitsubishi Arms Plant and about 78 percent of the Mitsubishi Steel Works were damaged. The Mitsubishi Electric Works, on the border of the main destruction zone, suffered only 10 percent structural damage. Many fires burnt after the bombing, but no firestorm developed, unlike at Hiroshima, as the damaged areas had insufficient fuel density. Instead, ambient wind pushed the fire to spread along the valley. Had the bomb been dropped at the intended target, in the heart of Nagasaki's downtown historic district, the destruction of medical and administrative infrastructure would have been greater. As in Hiroshima, the bombing badly damaged the city's medical facilities. A makeshift hospital was established at the Shinkozen Primary School, which served as the main medical center. The trains kept running and evacuated many victims to hospitals in nearby towns. A medical team from a naval hospital reached the city in the evening. Fire-fighting brigades from the neighboring towns assisted in fighting the fires. Takashi Nagai, a seriously injured doctor working in the radiology department of Nagasaki Medical College Hospital, and the rest of the surviving medical staff treated bombing victims.
The atomic bomb explosion generated a windstorm several kilometers wide that carried ash, dust, and debris over the mountain ranges surrounding Nagasaki. Approximately 20 minutes after the bombing, a black rain fell with the consistency of mud or oil. The rain carried radioactive material and continued for one to two hours.

Plans for more atomic attacks on Japan

There were plans for further attacks on Japan following Hiroshima and Nagasaki. Groves expected to have another (plutonium-239-based) "Fat Man" atomic bomb ready for use on 19 August, with three more in September and a further three in October. A second Little Boy bomb (using uranium-235) would not be available until December 1945. On 10 August, he sent a memorandum to Marshall in which he wrote that "the next bomb ... should be ready for delivery on the first suitable weather after 17 or 18 August." The memo today bears a hand-written comment by Marshall: "It is not to be released over Japan without express authority from the President." At the cabinet meeting that morning, Truman discussed these actions. James Forrestal paraphrased Truman as saying "there will be no further dropping of the atomic bomb," while Henry A. Wallace recorded in his diary that: "Truman said he had given orders to stop atomic bombing. He said the thought of wiping out another 100,000 people was too horrific. He didn't like the idea of killing, as he said, 'all those kids.'" The previous order that the target cities were to be attacked with atomic bombs "as made ready" was thus modified. There was already discussion in the War Department about conserving the bombs then in production for Operation Downfall, and Marshall suggested to Stimson that the remaining cities on the target list be spared attack with atomic bombs. On 13 August, General John E.
Hull and Colonel Lyle Seeman discussed the use of the upcoming weapons, favoring "tactical use" in support of an invasion, dropped two to three days prior to US troop capture or amphibious landing, as opposed to continuation of strategic attacks. Two more Fat Man assemblies were readied, and scheduled to leave Kirtland Field for Tinian on 11 and 14 August, and Tibbets was ordered by LeMay to return to Albuquerque, New Mexico, to collect them. At Los Alamos, technicians worked 24 hours straight to cast another plutonium core. Although cast, it still needed to be pressed and coated, which would take until 16 August; it therefore could have been ready for use on 19 August. Unable to reach Marshall, Groves ordered on his own authority on 13 August that the core should not be shipped.

Surrender of Japan and subsequent occupation

Until 9 August, Japan's war council still insisted on its four conditions for surrender. The full cabinet met at 14:30 on 9 August, and spent most of the day debating surrender. Anami conceded that victory was unlikely, but argued in favor of continuing the war. The meeting ended at 17:30, with no decision having been reached. Suzuki went to the palace to report on the outcome of the meeting, where he met with Kōichi Kido, the Lord Keeper of the Privy Seal of Japan. Kido informed him that the emperor had agreed to hold an imperial conference, and gave a strong indication that the emperor would consent to surrender on condition that kokutai be preserved. A second cabinet meeting was held at 18:00. Only four ministers supported Anami's position of adhering to the four conditions, but since cabinet decisions had to be unanimous, no decision was reached before it ended at 22:00.
Calling an imperial conference required the signatures of the prime minister and the two service chiefs, but the Chief Cabinet Secretary Hisatsune Sakomizu had already obtained signatures from Toyoda and General Yoshijirō Umezu in advance, and he reneged on his promise to inform them if a meeting was to be held. The meeting commenced at 23:50. No consensus had emerged by 02:00 on 10 August, but the emperor gave his "sacred decision", authorizing the Foreign Minister, Shigenori Tōgō, to notify the Allies that Japan would accept their terms on one condition, that the declaration "does not comprise any demand which prejudices the prerogatives of His Majesty as a Sovereign ruler." On 12 August, the Emperor informed the imperial family of his decision to surrender. One of his uncles, Prince Asaka, asked whether the war would be continued if the kokutai could not be preserved. Hirohito simply replied, "Of course." As the Allied terms seemed to leave intact the principle of the preservation of the Throne, Hirohito recorded his capitulation announcement on 14 August; it was broadcast to the Japanese nation the next day despite an attempted military coup d'état by militarists opposed to the surrender. In the fifth paragraph of his declaration, Hirohito mentions only the duration of the conflict and does not explicitly cite the Soviets as a factor for surrender: But now the war has lasted for nearly four years. Despite the best that has been done by every one—the gallant fighting of military and naval forces, the diligence and assiduity of Our servants of the State and the devoted service of Our one hundred million people, the war situation has developed not necessarily to Japan's advantage, while the general trends of the world have all turned against her interest.
In the sixth paragraph, Hirohito specifically mentions the new bomb and the unprecedented damage it caused: Moreover, the enemy has begun to employ a new and most cruel bomb, the power of which to do damage is, indeed, incalculable, taking the toll of many innocent lives. Should we continue to fight, not only would it result in an ultimate collapse and obliteration of the Japanese nation, but also it would lead to the total extinction of human civilization. The seventh paragraph gives the reason for the ending of hostilities against the Allies: Such being the case, how are we to save the millions of our subjects, or to atone ourselves before the hallowed spirits of our imperial ancestors? This is the reason why we have ordered the acceptance of the provisions of the joint declaration of the powers. In his "Rescript to the Soldiers and Sailors" delivered on 17 August, Hirohito did not refer to the atomic bombs or possible human extinction, and instead described the Soviet declaration of war as "endangering the very foundation of the Empire's existence." Three weeks after the bombings, Tibbets decided to visit Hiroshima with some of his team, but landed at Nagasaki instead as Hiroshima's airfield was unusable. They arrived before any American troops and drove into the city, which they found eerie. They were amazed at all the destruction from just one bomb—"It scares the hell out of you".

Reportage

On 10 August 1945, the day after the Nagasaki bombing, military photographer Yōsuke Yamahata, correspondent Higashi, and artist Yamada arrived in the city with instructions to record the destruction for propaganda purposes. Yamahata took scores of photographs, and on 21 August, they appeared in Mainichi Shimbun, a popular Japanese newspaper. After Japan's surrender and the arrival of American forces, copies of his photographs were seized amid the ensuing censorship, but some records have survived.
Leslie Nakashima, a former United Press (UP) journalist, filed the first personal account of the scene to appear in American newspapers. He observed that large numbers of survivors continued to die from what later became recognized as radiation poisoning. On 31 August, The New York Times published an abbreviated version of his 27 August UP article. Nearly all references to uranium poisoning were omitted. An editor's note was added to say that, according to American scientists, "the atomic bomb will not have any lingering after-effects." Wilfred Burchett was also one of the first Western journalists to visit Hiroshima after the bombing. He arrived alone by train from Tokyo on 2 September, defying the traveling ban put in place on Western correspondents. Burchett's dispatch, "The Atomic Plague", was printed by the Daily Express newspaper in London on 5 September 1945. The reports from Nakashima and Burchett informed the public for the first time of the gruesome effects of radiation and nuclear fallout—radiation burns and radiation poisoning, sometimes lasting more than thirty days after the blast. Burchett especially noted that people were dying "horribly" after bleeding from orifices, and their flesh would rot away from the injection holes where vitamin A was administered, to no avail. The New York Times then apparently reversed course and ran a front-page story by Bill Lawrence confirming the existence of a terrifying affliction in Hiroshima, where many had symptoms such as hair loss and vomiting blood before dying. Lawrence had gained access to the city as part of a press junket promoting the U.S. Army Air Force. Some reporters were horrified by the scene, however, referring to what they saw as a "death laboratory" littered with "human guinea pigs". General MacArthur found the reporting to have turned from good PR into bad PR and threatened to court martial the entire group. 
He withdrew Burchett's press accreditation and expelled the journalist from the occupation zones. The authorities also accused him of being under the sway of Japanese propaganda and later suppressed another story, on the Nagasaki bombing, by George Weller of the Chicago Daily News. Less than a week after his New York Times story was published, Lawrence also backtracked and dismissed the reports on radiation sickness as Japanese efforts to undermine American morale. A member of the United States Strategic Bombing Survey, Lieutenant Daniel A. McGovern, arrived in September 1945 to document the effects of the bombing of Japan. He used a film crew to record those effects in early 1946; the crew shot 27,000 m (90,000 ft) of film, resulting in a three-hour documentary titled The Effects of the Atomic Bombs Against Hiroshima and Nagasaki. The documentary included images from hospitals, burned-out buildings and cars, and rows of skulls and bones on the ground. It was classified "secret" for the next 22 years. Motion picture company Nippon Eigasha started sending cameramen to Nagasaki and Hiroshima in September 1945. On 24 October 1945, a U.S. military policeman stopped a Nippon Eigasha cameraman from continuing to film in Nagasaki. All of Nippon Eigasha's reels were confiscated by the American authorities, but they were later requested by the Japanese government and declassified. The public release of film footage of the city post-attack, and some research about the effects of the attack, was restricted during the occupation of Japan, but the Hiroshima-based magazine, Chugoku Bunka, in its first issue published on 10 March 1946, devoted itself to detailing the damage from the bombing. The book Hiroshima, written by Pulitzer Prize winner John Hersey and originally published in article form in The New Yorker, is reported to have reached Tokyo in English by January 1947, and the translated version was released in Japan in 1949.
It narrated the stories of the lives of six bomb survivors from immediately prior to, and months after, the dropping of the Little Boy bomb. Beginning in 1974, drawings and artwork made by survivors of the bombings were compiled, with completion in 1977; published in both book and exhibition format, the collection was titled The Unforgettable Fire. The bombing amazed Otto Hahn and other German atomic scientists, whom the British held at Farm Hall in Operation Epsilon. Hahn stated that he had not believed an atomic weapon "would be possible for another twenty years"; Werner Heisenberg did not believe the news at first. Carl Friedrich von Weizsäcker said "I think it's dreadful of the Americans to have done it. I think it is madness on their part", but Heisenberg replied, "One could equally well say 'That's the quickest way of ending the war'". Hahn was grateful that the German project had not succeeded in developing "such an inhumane weapon"; Karl Wirtz observed that even if it had, "we would have obliterated London but would still not have conquered the world, and then they would have dropped them on us". Hahn told the others, "Once I wanted to suggest that all uranium should be sunk to the bottom of the ocean". The Vatican agreed; L'Osservatore Romano expressed regret that the bomb's inventors did not destroy the weapon for the benefit of humanity. Rev. Cuthbert Thicknesse, the dean of St Albans, prohibited using St Albans Abbey for a thanksgiving service for the war's end, calling the use of atomic weapons "an act of wholesale, indiscriminate massacre". News of the atomic bombing was greeted more positively in the U.S.; a poll in Fortune magazine in late 1945 showed a significant minority of Americans (23 percent) wishing that more atomic bombs could have been dropped on Japan.
The initial positive response was supported by the imagery presented to the public (mainly the powerful images of the mushroom cloud); at this time it was usual in the US not to use graphic images of death.

Post-attack casualties

An estimated 90,000 to 166,000 people in Hiroshima (between 26 and 49 percent of its population) and 60,000 to 80,000 people in Nagasaki (between 22 and 32 percent of its population) died in 1945, of which a majority in each case were killed on the days of the bombings, due to the force and heat of the blasts themselves. Nearly all of the remaining victims died within two to four months, due to radiation exposure and resulting complications. One Atomic Bomb Casualty Commission (ABCC) report discusses 6,882 people examined in Hiroshima and 6,621 people examined in Nagasaki, largely within 2,000 meters (6,600 ft) of the hypocenter, who suffered injuries from the blast and heat but died from complications frequently compounded by acute radiation syndrome (ARS), all within about 20 to 30 days. Many people not injured by the blast eventually died within that timeframe as well after suffering from ARS. At the time, the doctors had no idea what the cause was and were unable to effectively treat the condition. Midori Naka was the first death officially certified to be the result of radiation poisoning or, as it was referred to by many, the "atomic bomb disease". She was some 650 meters (2,130 ft) from the hypocenter at Hiroshima and died on 24 August 1945 after traveling to Tokyo. Although unappreciated at the time, the average radiation dose that would kill approximately 50 percent of adults (the LD50) was roughly halved when the individual also experienced concurrent blast or burn polytraumatic injuries; that is, such polytrauma made smaller doses more lethal.
Conventional skin injuries that cover a large area frequently result in bacterial infection; the risk of sepsis and death is increased when a usually non-lethal radiation dose moderately suppresses the white blood cell count. In the spring of 1948, the ABCC was established in accordance with a presidential directive from Truman to the National Academy of Sciences–National Research Council to conduct investigations of the late effects of radiation among the survivors in Hiroshima and Nagasaki. In 1956, the ABCC published The Effect of Exposure to the Atomic Bombs on Pregnancy Termination in Hiroshima and Nagasaki. The ABCC became the Radiation Effects Research Foundation (RERF) on 1 April 1975. A binational organization run by both the United States and Japan, the RERF is still in operation today. Cancers do not immediately emerge after exposure to radiation; instead, radiation-induced cancer has a minimum latency period of some five years or more, and leukemia of some two years or more, peaking around six to eight years later. Jarrett Foley published the first major reports on the significantly increased incidence of the latter among survivors. Almost all cases of leukemia over the following 50 years were in people exposed to more than 1 Gy. In the 1987 Life Span Study, conducted by the Radiation Effects Research Foundation, a statistical excess of 507 cancers of undefined lethality, strictly dependent on distance from the hypocenter, was observed in 79,972 hibakusha who had still been living between 1958 and 1987 and who took part in the study. As the epidemiology study continues with time, the RERF estimates that, from 1950 to 2000, 46 percent of leukemia deaths (which may include Sadako Sasaki) and 11 percent of solid cancers of unspecified lethality were likely due to radiation from the bombs or some other post-attack city effects, with the statistical excess being 200 leukemia deaths and 1,700 solid cancers of undeclared lethality.
Both of these statistics are derived from the observation of approximately half of the total survivors, strictly those who took part in the study. A meta-analysis from 2016 found that radiation exposure increases cancer risk, but also that the average lifespan of survivors was reduced by only a few months compared to those not exposed to radiation. During the preimplantation period, that is, one to ten days following conception, intrauterine radiation exposure of "at least 0.2 Gy" can cause complications of implantation and death of the human embryo. One of the early studies conducted by the ABCC was on the outcome of pregnancies occurring in Hiroshima and Nagasaki, and in a control city, Kure, located 29 km (18 mi) south of Hiroshima, to discern the conditions and outcomes related to radiation exposure. James V. Neel led the study, which found that the overall number of birth defects was not significantly higher among the children of survivors who were pregnant at the time of the bombings. He also studied the longevity of the children who survived the bombings of Hiroshima and Nagasaki, reporting that between 90 and 95 percent were still living 50 years later. While the National Academy of Sciences raised the possibility that Neel's procedure did not filter the Kure population for possible radiation exposure, which could bias the results, overall, a statistically insignificant increase in birth defects occurred directly after the bombings of Nagasaki and Hiroshima when the cities were taken as wholes, in terms of distance from the hypocenters. However, Neel and others noted that in approximately 50 humans who were of an early gestational age at the time of the bombing and who were all within about 1 kilometer (0.62 mi) of the hypocenter, an increase in microencephaly and anencephaly was observed upon birth, with the incidence of these two particular malformations being nearly three times what was to be expected when compared to the control group in Kure.
In 1985, Johns Hopkins University geneticist James F. Crow examined Neel's research and confirmed that the number of birth defects was not significantly higher in Hiroshima and Nagasaki. Many members of the ABCC and its successor, the Radiation Effects Research Foundation (RERF), were still looking for possible birth defects among the survivors decades later, but found no evidence that they were significantly common among the survivors or inherited in the children of survivors. The sample is small: only 1,600 to 1,800 persons came forth as prenatally exposed at the time of the bombings who were both close enough to the two hypocenters to absorb a substantial dose of radiation in utero and able to survive that dose and the malnourished post-attack environment. Nonetheless, data from this cohort support an increased risk of severe mental retardation (SMR), which was observed in some 30 individuals and is a common outcome of the aforementioned microencephaly. While the limited statistics, with just 30 affected individuals out of 1,800, prevent a definitive determination of a threshold point, the data collected suggest a threshold intrauterine or fetal dose for SMR, at the most radiosensitive period of cognitive development, when there is the largest number of undifferentiated neural cells (8 to 15 weeks post-conception), beginning at approximately "0.09" to "0.15" Gy, with the risk then increasing linearly to a 43-percent rate of SMR when exposed to a fetal dose of 1 Gy at any point during these weeks of rapid neurogenesis. On either side of this radiosensitive window, however, none of those prenatally exposed to the bombings at a gestational age of less than 8 weeks, that is, prior to synaptogenesis, or of more than 26 weeks, "were observed to be mentally retarded"; the condition was therefore isolated to those exposed at 8–26 weeks of gestation who absorbed more than approximately "0.09" to "0.15" Gy of prompt radiation energy.
Examination of the prenatally exposed in terms of IQ performance and school records determined a statistically significant reduction in both, beginning at exposures greater than 0.1 to 0.5 gray during the same gestational period of 8–25 weeks. However, outside this period, at less than 8 weeks and greater than 26 after conception, "there is no evidence of a radiation-related effect on scholastic performance." The reporting of doses in terms of absorbed energy, in units of grays and rads, rather than in the biologically weighted sievert, is typical of both the SMR and cognitive performance data. The reported threshold dose variance between the two cities is suggested to be a manifestation of the difference between X-ray and neutron absorption: Little Boy emitted substantially more neutron flux, whereas the Baratol that surrounded the core of Fat Man filtered or shifted the absorbed neutron-radiation profile, so that the dose of radiation energy received in Nagasaki was mostly that from exposure to X-rays/gamma rays. Contrast this with the environment within 1,500 meters of the hypocenter at Hiroshima, where the in-utero dose depended more on the absorption of neutrons, which have a higher biological effect per unit of energy absorbed. From the radiation dose reconstruction work, the estimated dosimetry at Hiroshima still has the largest uncertainty, as the Little Boy bomb design was never tested before deployment or afterward; as such, the estimated radiation profile absorbed by individuals at Hiroshima required greater reliance on calculations than on the Japanese soil, concrete, and roof-tile measurements, which began to reach accurate levels, and thereby inform researchers, in the 1990s. Many other investigations into cognitive outcomes, such as schizophrenia as a result of prenatal exposure, have been conducted, with "no statistically significant linear relationship seen".
There is a suggestion that in the most extremely exposed, those who survived within a kilometer or so of the hypocenters, a trend emerges akin to that seen in SMR, though the sample size is too small to determine with any significance.

Hibakusha

The survivors of the bombings are called hibakusha (被爆者; pronounced [çibaꜜkɯ̥ɕa] or [çibakɯ̥ꜜɕa]), a Japanese word that translates to "explosion-affected people". The Japanese government has recognized about 650,000 people as hibakusha. As of 31 March 2025, 99,130 were still alive, mostly in Japan. The government of Japan recognizes about one percent of these as having illnesses caused by radiation. The memorials in Hiroshima and Nagasaki contain lists of the names of the hibakusha who are known to have died since the bombings. Updated annually on the anniversaries of the bombings, as of August 2025 the memorials record the names of more than 550,000 hibakusha: 349,246 in Hiroshima and 201,942 in Nagasaki. Hibakusha and their children, if they disclose their background, were (and still are) victims of fear-based discrimination and exclusion in marriage or work due to public ignorance; much of the public persists in the belief that the hibakusha carry some hereditary or even contagious disease. This is despite the fact that no statistically demonstrable increase in birth defects or congenital malformations was found among the later-conceived children born to survivors of the nuclear weapons used at Hiroshima and Nagasaki, or has been found in the later-conceived children of cancer survivors who had previously received radiotherapy. Surviving women of Hiroshima and Nagasaki who could conceive and who were exposed to substantial amounts of radiation had children with no higher incidence of abnormalities or birth defects than the rate observed in the Japanese average.
A study of the long-term psychological effects of the bombings on the survivors found that even 17–20 years after the bombings had occurred, survivors showed a higher prevalence of anxiety and somatization symptoms. Perhaps as many as 200 people from Hiroshima sought refuge in Nagasaki. The 2006 documentary Twice Survived: The Doubly Atomic Bombed of Hiroshima and Nagasaki documented 165 nijū hibakusha (lit. double explosion-affected people), nine of whom claimed to have been in the blast zone in both cities. On 24 March 2009, the Japanese government officially recognized Tsutomu Yamaguchi as a double hibakusha. He was confirmed to have been 3 km (1.9 mi) from ground zero in Hiroshima on a business trip when the bomb was detonated. He was seriously burnt on his left side and spent the night in Hiroshima. He arrived at his home city of Nagasaki on 8 August, the day before the bombing, and was exposed to residual radiation while searching for his relatives. He was the first officially recognized survivor of both bombings. He died in 2010 of stomach cancer. During the war, Japan brought as many as 670,000 Korean conscripts to Japan to work as forced labor. About 5,000–8,000 Koreans were killed in Hiroshima and 1,500–2,000 in Nagasaki. According to the South Korea Atomic Bomb Victims Association, about 70,000 Koreans were exposed to the bomb. By the end of 1945, some 40,000 (57.1%) had died. The overall rate was about 33.7%. According to a study by the Gyeonggi Welfare Foundation, some survivors were forced to clear rubble and recover bodies (which were all radioactive). Koreans without local ties remained in the city, exposed to the radioactive fallout and with limited access to medical care, while Japanese evacuees fled to relatives. Korean survivors had a difficult time fighting for the same recognition as hibakusha that was afforded to all Japanese survivors, a situation which resulted in the denial of free health benefits to them in Japan.
Most issues were eventually addressed in 2008 through lawsuits. Memorials Hiroshima was subsequently struck by Typhoon Ida on 17 September 1945. More than half the bridges were destroyed, and the roads and railroads were damaged, further devastating the city. The population increased from 83,000 soon after the bombing to 146,000 in February 1946. The city was rebuilt after the war, with help from the national government through the Hiroshima Peace Memorial City Construction Law passed in 1949. It provided financial assistance for reconstruction, along with donated land that had previously been owned by the national government and used for military purposes. In 1949, a design was selected for the Hiroshima Peace Memorial Park. The Hiroshima Prefectural Industrial Promotion Hall, the closest surviving building to the location of the bomb's detonation, was designated the Hiroshima Peace Memorial. The Hiroshima Peace Memorial Museum was opened in 1955 in the Peace Park. Hiroshima also contains a Peace Pagoda, built in 1966 by Nipponzan-Myōhōji-Daisanga. On 27 January 1981, the Atomic Bombing Relic Selecting Committee of Hiroshima announced plans to install commemorative plaques at nine historical sites related to the bombing that year. The Genbaku Dome, Shima Hospital (the hypocenter), and Motoyasu Bridge all unveiled plaques with historical photographs and descriptions. The remaining planned sites included the Hondō Shopping Street, the Motomachi No. 2 Army Hospital site, Hiroshima Red Cross Hospital, Fukuromachi Elementary School, Hiroshima City Hall and Hiroshima Station. The committee also planned to establish 30 commemorative plaques within three years. Nagasaki was rebuilt and dramatically changed form after the war. The pace of reconstruction was initially slow, and the first simple emergency dwellings were not provided until 1946. The focus of redevelopment was the replacement of war industries with foreign trade, shipbuilding and fishing.
This was formally declared when the Nagasaki International Culture City Reconstruction Law was passed in May 1949. New temples were built, as well as new churches, owing to an increase in the presence of Christianity. The Nagasaki Atomic Bomb Museum opened in the mid-1990s. Some of the rubble was left as a memorial, such as a torii at Sannō Shrine and an arch near ground zero. In 2013, four locations were designated Registered Monuments to provide legal protection against future development. These four sites, together with "ground zero" (the hypocenter of the atomic bomb explosion), were collectively designated a National Historic Site in 2016. Debate over bombings The role of the bombings in Japan's surrender, and the ethical, legal, and military controversies surrounding the United States' justification for them, have been the subject of scholarly and popular debate. On one hand, it has been argued that the bombings caused the Japanese surrender, thereby preventing casualties that an invasion of Japan would have involved. Stimson talked of saving one million casualties. The naval blockade might have starved the Japanese into submission without an invasion, but this would also have resulted in many more Japanese deaths. However, critics of the bombings have asserted that atomic weapons are fundamentally immoral, that the bombings were war crimes, and that they constituted state terrorism. The Japanese may have surrendered without the bombings, but only an unconditional surrender would have satisfied the Allies. Others, such as historian Tsuyoshi Hasegawa, argued that the entry of the Soviet Union into the war against Japan "played a much greater role than the atomic bombs in inducing Japan to surrender because it dashed any hope that Japan could terminate the war through Moscow's mediation".
A view among critics of the bombings, popularized by American historian Gar Alperovitz in 1965, is that the United States used nuclear weapons to intimidate the Soviet Union in the early stages of the Cold War. James Orr wrote that this idea became the accepted position in Japan and that it may have played some part in the decision-making of the US government. Legal considerations The Hague Conventions of 1899 and 1907, which address the codes of wartime conduct on land and at sea, were adopted before the rise of air power. Despite repeated diplomatic attempts to update international humanitarian law to include aerial warfare, it was not updated before World War II. The absence of specific international humanitarian law did not mean aerial warfare was not covered under the laws of war, but rather that there was no general agreement on how to interpret those laws. This means that the aerial bombardment of civilian areas in enemy territory by all major belligerents during World War II was not prohibited by positive or specific customary international humanitarian law. In 1963, the bombings were subjected to judicial review in Ryuichi Shimoda v. The State. The District Court of Tokyo ruled that the use of nuclear weapons in warfare was not illegal, but held in its obiter dictum that the atomic bombings of both Hiroshima and Nagasaki were illegal under the international law of that time, as an indiscriminate bombardment of undefended cities. The court denied the appellants compensation on the grounds that the Japanese government had waived the right to reparations from the U.S. government under the Treaty of San Francisco. Legacy By 30 June 1946, there were components for nine atomic bombs in the US arsenal, all Fat Man devices identical to the one used at Nagasaki. The nuclear weapons were handmade devices, and a great deal of work remained to improve their ease of assembly, safety, reliability and storage before they were ready for production.
There were also many improvements to their performance that had been suggested or recommended, but that had not been possible under the pressure of wartime development. The Chairman of the Joint Chiefs of Staff, Fleet Admiral William D. Leahy, decried the use of the atomic bombs as adopting "an ethical standard common to the barbarians of the Dark Ages", but in October 1947 he reported a military requirement for 400 bombs. The American monopoly on nuclear weapons lasted four years before the Soviet Union detonated an atomic bomb in September 1949. The United States responded with the development of the hydrogen bomb, a thousand times as powerful as the bombs that devastated Hiroshima and Nagasaki. Such ordinary fission bombs would henceforth be regarded as small tactical nuclear weapons. By 1986, the United States had 23,317 nuclear weapons and the Soviet Union had 40,159. In early 2019, more than 90% of the world's 13,865 nuclear weapons were owned by the United States and Russia. By 2020, nine nations had nuclear weapons. Japan, not one of them, reluctantly signed the Treaty on the Non-Proliferation of Nuclear Weapons in February 1970, but remained protected by the United States in the arrangement known as the nuclear umbrella. American nuclear weapons were stored on Okinawa, and sometimes in Japan itself, albeit in contravention of agreements between the two nations. Lacking the resources to fight the Soviet Union using conventional forces, NATO came to depend on the use of nuclear weapons to defend itself during the Cold War, a policy that became known in the 1950s as the New Look. In the decades after Hiroshima and Nagasaki, the United States threatened many times to use its nuclear weapons. On 7 July 2017, more than 120 countries voted to adopt the UN Treaty on the Prohibition of Nuclear Weapons. Elayne Whyte Gómez, president of the UN negotiations, said, "the world has been waiting for this legal norm for 70 years".
As of 2024, Japan has not signed the treaty.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_note-rbb24.de-287] | [TOKENS: 10515] |
Elon Musk Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026, Forbes estimates his net worth to be around US$852 billion. Born into a wealthy family in Pretoria, South Africa, Musk emigrated to Canada in 1989; he held Canadian citizenship through his Canadian-born mother. He received bachelor's degrees in 1997 from the University of Pennsylvania before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002. Musk also became an American citizen in 2002. That year, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research, but later left; growing discontent with the organization's direction and its leadership during the AI boom of the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, Tesla shareholders approved a pay package for Musk worth $1 trillion, which he is to receive over 10 years if he meets specific goals. Musk was the largest donor in the 2024 U.S. presidential election, where he supported Donald Trump.
After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for spreading COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published in 2025 and 2026 and became a topic of worldwide debate. Early life Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa. Musk therefore holds both South African and Canadian citizenship from birth. His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer, who partly owned a rental lodge at the Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa.
Despite both Elon and Errol previously stating that Errol was a part owner of a Zambian emerald mine, in 2023 Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party, and has said that his children shared their father's dislike of apartheid. After his parents divorced in 1979, Elon, aged around nine, chose to live with his father because Errol had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies", where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten so severely that he was hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?" Elon was an enthusiastic reader of books and has attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself how to program from the VIC-20 user manual. At age twelve, Elon sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, from which he graduated. Musk was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification.
Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, as well as to ease his path to immigration to the United States. While waiting for his application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill. In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997 – a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School. He reportedly hosted large, ticketed house parties to help pay for tuition, and wrote a business plan for an electronic book-scanning service similar to Google Books. In 1994, Musk held two internships in Silicon Valley: one at energy storage startup Pinnacle Research Institute, which investigated electrolytic supercapacitors for energy storage, and another at Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided to join the Internet boom of the 1990s, applying for a job at Netscape, to which he reportedly never received a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa transitioned to an H1-B. According to numerous former business associates and shareholders, Musk said he was on a student visa at the time. 
Business career In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors. They housed the venture in a small rented office in Palo Alto. In an interview with Rolling Stone, Musk denied the notion that they started the company with funds borrowed from Errol Musk, though in a tweet he acknowledged that his father contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune, and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590,000,000 in 2025), and Musk received $22 million (equivalent to $43,000,000 in 2025) for his 7-percent share. In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and, in its initial months of operation, over 200,000 customers joined the service. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with online bank Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign.
Due to the resulting technological issues and the lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000. Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2,700,000,000 in 2025) in stock, of which Musk—the largest shareholder with 11.72% of shares—received $175.8 million (equivalent to $320,000,000 in 2025). In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value. In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth-chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from the Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune (equivalent to $180,000,000 in 2025), Musk founded SpaceX in May 2002 and became the company's CEO and Chief Engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, the company was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2,400,000,000 in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, in 2015 SpaceX successfully landed the first stage of a Falcon 9 on a land platform.
Later landings were achieved on autonomous spaceport drone ships, ocean-based recovery platforms. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million (equivalent to $865,000,000 in 2025) contract to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan. In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large constellation was deployed in May 2019. As of May 2025, over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 to be $10 billion (equivalent to $12,000,000,000 in 2025). During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440,000,000 in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response. Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement.
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11,000,000 in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm. Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass production all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several commercially successful electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In 2018, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over his tweet that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in summer 2020, and it entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1,200,000,000,000 in 2025), the sixth company in U.S. history to do so. Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second largest provider of solar power systems in the United States.
In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2,700,000,000 in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor. In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence (AI) by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions like spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials. Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trials—which have caused the deaths of some monkeys—have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. 
Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink. In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018. It used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021. Local officials have approved further expansions of the tunnel system. In early 2017, Musk expressed interest in buying Twitter and questioned the platform's commitment to freedom of speech. By 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder. Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter. By the end of April, Musk had successfully concluded his bid for approximately $44 billion, which included approximately $12.5 billion in loans and $21 billion in equity financing. After attempting to back out of the deal, Musk completed the purchase of the company on October 27, 2022.
Immediately after the acquisition, Musk fired several top Twitter executives, including CEO Parag Agrawal, and became CEO himself. Under Musk, Twitter instituted monthly subscriptions for a "blue check" and laid off a significant portion of the company's staff. Musk loosened content moderation, and hate speech increased on the platform after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of the Hunter Biden laptop controversy in the lead-up to the 2020 presidential election. Musk also promised to step down as CEO after a Twitter poll, and five months later he did so, transitioning to the roles of executive chairman and chief technology officer (CTO). Despite Musk stepping down as CEO, X continues to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence some of his critics, such as Twitch streamer Asmongold, who criticized him during one of his streams. Musk has been accused of removing critics' blue checkmarks, which hinders visibility and is considered a form of shadow banning, or of suspending their accounts without justification. Other activities In August 2013, Musk announced plans for a version of a vactrain and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances. In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company, and initially gave $50 million. In 2018, Musk left the OpenAI board.
Since 2018, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings like OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI. Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets, and the consequent fossil fuel usage, have drawn criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter and temporarily banned the accounts of journalists who posted stories about the incident, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept. In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content while framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies. Politics Musk is an outlier among business leaders, who typically avoid partisan political advocacy. Musk was a registered independent voter when he lived in California.
Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 Texas's 34th congressional district special election. In 2024, he started supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income; he also endorsed Kanye West's 2020 presidential campaign. In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign, and hosted DeSantis's campaign announcement in a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips. In October 2025, former vice president Kamala Harris commented that it was a mistake on the Democratic side not to invite Musk to a White House electric vehicle event organized in August 2021 and featuring executives from General Motors, Ford, and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space." 
Fortune remarked that this was a nod to United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation, and guessed that the non-invitation affected Musk's perspective. Fortune noted that, at the time, Musk said, "Yeah, seems odd that Tesla wasn't invited." A month later, he criticized Biden as "not the friendliest administration." Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen, and other capitalists actually flourished under Biden, but that these tech leaders chose Trump for their common ground on cultural issues. By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying, "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a campaign rally and promoted conspiracy theories and falsehoods about Democrats, election fraud, and immigration in support of Trump. Musk was the largest individual donor of the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers. In 2023, Musk said he shunned the World Economic Forum because it was boring; the organization commented that it had not invited him since 2015. He has, however, participated in Dialog, a gathering dubbed "Tech Bilderberg" and organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain, and the United Kingdom, particularly due to his position in the U.S. government as well as his ownership of X. 
An NBC News analysis found he had boosted far-right political movements seeking to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023. During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or fascist Roman salute.[e] He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided along partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries. The concept of DOGE emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, which Musk accepted. In November and December 2024, Musk suggested that the organization could help cut the U.S. federal budget, consolidate the number of federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role was never clearly defined. 
In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE. A federal judge has ruled that Musk acted as the de facto leader of DOGE. Musk's role in the second Trump administration, particularly in relation to DOGE, has attracted public backlash. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He has prioritized secrecy within the organization and has accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by this time, most of them of children. By November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when his 130-day term as a special government employee expired, with a White House official confirming that Musk's offboarding from the Trump administration was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025. After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit. 
A feud began between Musk and Trump, most notably when Musk alleged on X (formerly Twitter) on June 5, 2025, that Trump had ties to sex offender Jeffrey Epstein. Trump responded on Truth Social, stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away, and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk, and added that Musk would face "very serious consequences" if he funded Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have received criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates up until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth taxes, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration, and regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially the colonization of Mars, repeatedly pushing for humanity to colonize the planet in order to become an interplanetary species and lower the risk of human extinction. 
Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. While describing himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdown restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and has been described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024. Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was subsequently criticized as responsible for spreading misinformation and amplifying the far right. He has also voiced support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S. 
Securities and Exchange Commission (SEC) over a tweet stating that funding had been secured for potentially taking Tesla private.[f] The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were each fined $20 million, and Musk was forced to step down as Tesla chairman for three years but was able to remain CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the previous agreement's details, including a list of topics about which Musk needed preclearance. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla's stock price ("too high imo") violated the agreement. Records released under the Freedom of Information Act (FOIA) showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting about "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded. 
McCormick called the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger's syndrome in a TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ... but then that turned out to be wrong — [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated he uses doctor-prescribed ketamine for occasional depression and that he doses "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine and other drugs. Musk at first refused to comment on his alleged drug use, before responding that he had not tested positive for drugs, and that if drugs somehow improved his productivity, "I would definitely take them!". 
The New York Times' investigations revealed Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concerns from close associates who became troubled by his public behavior as he grew more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label, Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he has said have a "restoring effect" that helps his "mental calibration". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings. Musk has justified the boosting by claiming that all top accounts do it, so he has to as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its creator Ubisoft for "woke" content. Musk posted to X that "DEI kills art" and singled out the inclusion of the historical figure Yasuke in the game as offensive; he also called the game "terrible". Ubisoft responded by saying that Musk's comments were "just feeding hatred" and that it was focused on producing a game, not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000. 
In 2002, their first child Nevada Musk died of sudden infant death syndrome at the age of 10 weeks. After his death, the couple used in vitro fertilization (IVF) to continue their family; they had twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and have shared custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year. After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters that are not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the pregnancy, Musk confirmed reports that the couple were "semi-separated" in September 2021; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Elon Musk has taken X Æ A-Xii to multiple official events in Washington, D.C. during Trump's second term in office. 
In July 2022, The Wall Street Journal reported that Musk allegedly had an affair with Nicole Shanahan, the wife of Google co-founder Sergey Brin, in 2021, leading to their divorce the following year. Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported that Musk had bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born in 2024 via surrogacy, and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, which media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St. Clair had filed for sole custody of her five-month-old son and for Musk to be recognized as the child's father. On March 31, 2025, Musk wrote that, while he was unsure if he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] Later reporting from The Wall Street Journal indicated that $1 million of these payments to St. Clair was structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island. 
In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place after Epstein later cancelled the plans.[k] On Christmas Day in 2012, Musk emailed Epstein asking, "Do you have any parties planned? I’ve been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I’m looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable, to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein sent an email asking Musk if he had any plans to come to New York for the opening of the United Nations General Assembly, where many "interesting people" would be coming to his house, to which Musk responded that "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein responded by stating, "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 if he had introduced Epstein to Mark Zuckerberg, Musk responded: "I don’t recall introducing Epstein to anyone, as I don’t know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred. 
Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l] Wealth Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. Musk was first listed on the Forbes Billionaires List in 2012; by November 2020, around 75% of his wealth was derived from Tesla stock, although he describes himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021; $400 billion in December 2024; $500 billion in October 2025; $600 billion in mid-December 2025; $700 billion later that month; and $800 billion in February 2026. In November 2025, a Tesla pay package worth potentially $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Public image Although his ventures have been highly influential within their separate industries since the 2000s, Musk only became a public figure in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions and often makes controversial statements, unlike other billionaires, who prefer reclusiveness to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance described people's opinions of Musk as polarized due to his "part philosopher, part troll" persona on Twitter. He has been denounced for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to matters like British survivors of grooming gangs. 
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president", or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021. Musk was selected as Time's "Person of the Year" for 2021. Then-Time editor-in-chief Edward Felsenthal wrote that "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too." |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/OpenAI#cite_note-sf-standard-253] | [TOKENS: 8773] |
Contents OpenAI OpenAI is an American artificial intelligence research organization comprising both a non-profit foundation and a controlled for-profit public benefit corporation (PBC), headquartered in San Francisco. It aims to develop "safe and beneficial" artificial general intelligence (AGI), which it defines as "highly autonomous systems that outperform humans at most economically valuable work". OpenAI is widely recognized for its development of the GPT family of large language models, the DALL-E series of text-to-image models, and the Sora series of text-to-video models, which have influenced industry research and commercial applications. Its release of ChatGPT in November 2022 has been credited with catalyzing widespread interest in generative AI. The organization was founded in 2015 in Delaware but evolved a complex corporate structure. As of October 2025, following restructuring approved by California and Delaware regulators, the non-profit OpenAI Foundation holds 26% of the for-profit OpenAI Group PBC, with Microsoft holding 27% and employees/other investors holding 47%. Under its governance arrangements, the OpenAI Foundation holds the authority to appoint the board of the for-profit OpenAI Group PBC, a mechanism designed to align the entity’s strategic direction with the Foundation’s charter. Microsoft previously invested over $13 billion into OpenAI, and provides Azure cloud computing resources. In October 2025, OpenAI conducted a $6.6 billion share sale that valued the company at $500 billion. In 2023 and 2024, OpenAI faced multiple lawsuits for alleged copyright infringement against authors and media companies whose work was used to train some of OpenAI's products. In November 2023, OpenAI's board removed Sam Altman as CEO, citing a lack of confidence in him, but reinstated him five days later following a reconstruction of the board. 
Throughout 2024, roughly half of then-employed AI safety researchers left OpenAI, citing the company's prominent role in an industry-wide problem. Founding In December 2015, OpenAI was founded as a not-for-profit organization by Sam Altman, Elon Musk, Ilya Sutskever, Greg Brockman, Trevor Blackwell, Vicki Cheung, Andrej Karpathy, Durk Kingma, John Schulman, Pamela Vagata, and Wojciech Zaremba, with Altman and Musk as co-chairs. A total of $1 billion in capital was pledged by Sam Altman, Greg Brockman, Elon Musk, Reid Hoffman, Jessica Livingston, Peter Thiel, Amazon Web Services (AWS), and Infosys. However, the capital actually collected significantly lagged the pledges; according to company disclosures, only $130 million had been received by 2019. In its founding charter, OpenAI stated an intention to collaborate openly with other institutions by making certain patents and research publicly available, but it later restricted access to its most capable models, citing competitive and safety concerns. OpenAI was initially run from Brockman's living room and was later headquartered at the Pioneer Building in the Mission District, San Francisco. According to OpenAI's charter, its founding mission is "to ensure that artificial general intelligence (AGI)—by which we mean highly autonomous systems that outperform humans at most economically valuable work—benefits all of humanity." Musk and Altman stated in 2015 that they were partly motivated by concerns about AI safety and existential risk from artificial general intelligence. OpenAI stated that "it's hard to fathom how much human-level AI could benefit society", and that it is equally difficult to comprehend "how much it could damage society if built or used incorrectly". 
The startup also wrote that AI "should be an extension of individual human wills and, in the spirit of liberty, as broadly and evenly distributed as possible", and that "because of AI's surprising history, it's hard to predict when human-level AI might come within reach. When it does, it'll be important to have a leading research institution which can prioritize a good outcome for all over its own self-interest." Co-chair Sam Altman expected a decades-long project that would eventually surpass human intelligence. Brockman met with Yoshua Bengio, one of the "founding fathers" of deep learning, and drew up a list of great AI researchers; he was able to hire nine of them as the first employees in December 2015. OpenAI did not pay AI researchers salaries comparable to those of Facebook or Google, nor did it offer the stock options that AI researchers typically receive. Nevertheless, OpenAI spent $7 million on its first 52 employees in 2016. OpenAI's potential and mission drew these researchers to the firm; a Google employee said he was willing to leave Google for OpenAI "partly because of the very strong group of people and, to a very large extent, because of its mission." OpenAI co-founder Wojciech Zaremba stated that he turned down "borderline crazy" offers of two to three times his market value to join OpenAI instead. In April 2016, OpenAI released a public beta of "OpenAI Gym", its platform for reinforcement learning research. Nvidia gifted its first DGX-1 supercomputer to OpenAI in August 2016 to help it train larger and more complex AI models, with the capability of reducing processing time from six days to two hours. In December 2016, OpenAI released "Universe", a software platform for measuring and training an AI's general intelligence across the world's supply of games, websites, and other applications. Corporate structure In 2019, OpenAI transitioned from non-profit to "capped" for-profit, with profit capped at 100 times any investment. 
According to OpenAI, the capped-profit model allows OpenAI Global, LLC to legally attract investment from venture funds and, in addition, to grant employees stakes in the company. Many top researchers work for Google Brain, DeepMind, or Facebook, which offer equity that a nonprofit would be unable to match. Before the transition, OpenAI was legally required to publicly disclose the compensation of its top employees. The company then distributed equity to its employees and partnered with Microsoft, which announced a $1 billion investment in the company. Since then, OpenAI systems have run on an Azure-based supercomputing platform from Microsoft. OpenAI Global, LLC then announced its intention to commercially license its technologies. It planned to spend the $1 billion "within five years, and possibly much faster". Altman stated that even a billion dollars may turn out to be insufficient, and that the lab may ultimately need "more capital than any non-profit has ever raised" to achieve artificial general intelligence. The nonprofit, OpenAI, Inc., is the sole controlling shareholder of OpenAI Global, LLC, which, despite being a for-profit company, retains a formal fiduciary responsibility to OpenAI, Inc.'s nonprofit charter. A majority of OpenAI, Inc.'s board is barred from having financial stakes in OpenAI Global, LLC, and minority members with a stake in OpenAI Global, LLC are barred from certain votes due to conflict of interest. Some researchers have argued that OpenAI Global, LLC's switch to for-profit status is inconsistent with OpenAI's claims to be "democratizing" AI. On February 29, 2024, Elon Musk filed a lawsuit against OpenAI and CEO Sam Altman, accusing them of shifting focus from public benefit to profit maximization, a case OpenAI dismissed as "incoherent" and "frivolous"; Musk later revived legal action against Altman and others in August 2024.
On April 9, 2024, OpenAI countersued Musk in federal court, alleging that he had engaged in "bad-faith tactics" to slow the company's progress and seize its innovations for his personal benefit. OpenAI also argued that Musk had previously supported the creation of a for-profit structure and had expressed interest in controlling OpenAI himself. The countersuit seeks damages and legal measures to prevent further alleged interference. On February 10, 2025, a consortium of investors led by Elon Musk submitted a $97.4 billion unsolicited bid to buy the nonprofit that controls OpenAI, declaring willingness to match or exceed any better offer. The offer was rejected on February 14, 2025, with OpenAI stating that it was not for sale, but the bid complicated Altman's restructuring plan by suggesting a lower bar for how much the nonprofit should be valued. OpenAI, Inc. was originally designed as a nonprofit in order to ensure that AGI "benefits all of humanity" rather than "the private gain of any person". In 2019, it created OpenAI Global, LLC, a capped-profit subsidiary controlled by the nonprofit. In December 2024, OpenAI proposed a restructuring plan to convert the capped-profit subsidiary into a Delaware-based public benefit corporation (PBC) and release it from the control of the nonprofit. The nonprofit would sell its control and other assets, receiving equity in return, and would use the proceeds to fund and pursue separate charitable projects, including in science and education. OpenAI's leadership described the change as necessary to secure additional investment, and claimed that the nonprofit's founding mission to ensure AGI "benefits all of humanity" would be better fulfilled. The plan was criticized by former employees. An open letter titled "Not For Private Gain" asked the attorneys general of California and Delaware to intervene, arguing that the restructuring was illegal and would strip governance safeguards from the nonprofit and remove oversight from the attorneys general.
The letter argues that OpenAI's complex structure was deliberately designed to keep the organization accountable to its mission, free of the conflicting pressure to maximize profits. It contends that the nonprofit is best positioned to advance its mission of ensuring AGI benefits all of humanity by continuing to control OpenAI Global, LLC, whatever amount of equity it could get in exchange. PBCs can choose how they balance their mission with profit-making, and controlling shareholders have a large influence on how closely a PBC sticks to its mission. On October 28, 2025, OpenAI announced that it had adopted the new PBC corporate structure after receiving approval from the attorneys general of California and Delaware. Under the new structure, OpenAI's for-profit branch became a public benefit corporation known as OpenAI Group PBC, while the non-profit was renamed the OpenAI Foundation. The OpenAI Foundation holds a 26% stake in the PBC, while Microsoft holds a 27% stake and the remaining 47% is owned by employees and other investors. All members of the OpenAI Group PBC board of directors will be appointed by the OpenAI Foundation, which can remove them at any time. Members of the Foundation's board will also serve on the for-profit board. The new structure allows the for-profit PBC to raise investor funds like most traditional tech companies, including through an initial public offering, which Altman said was the most likely path forward. In January 2023, OpenAI Global, LLC was in talks for funding that would value the company at $29 billion, double its 2021 value. On January 23, 2023, Microsoft announced a new US$10 billion investment in OpenAI Global, LLC over multiple years, part of which OpenAI reportedly needed to pay for Microsoft's cloud-computing service Azure. From September to December 2023, Microsoft rebranded all variants of its Copilot to Microsoft Copilot, added Copilot to many Windows installations, and released Microsoft Copilot mobile apps.
Following OpenAI's 2025 restructuring, Microsoft owns a 27% stake in the for-profit OpenAI Group PBC, valued at $135 billion. In a deal announced the same day, OpenAI agreed to purchase $250 billion of Azure services, with Microsoft ceding its right of first refusal over OpenAI's future cloud computing purchases. As part of the deal, OpenAI will continue to share 20% of its revenue with Microsoft until it achieves AGI, which must now be verified by an independent panel of experts. The deal also loosened restrictions on both companies working with third parties, allowing Microsoft to pursue AGI independently and allowing OpenAI to develop products with other companies. In 2017, OpenAI spent $7.9 million, a quarter of its functional expenses, on cloud computing alone. In comparison, DeepMind's total expenses in 2017 were $442 million. In the summer of 2018, training OpenAI's Dota 2 bots required renting 128,000 CPUs and 256 GPUs from Google for multiple weeks. In October 2024, OpenAI completed a $6.6 billion capital raise at a $157 billion valuation, including investments from Microsoft, Nvidia, and SoftBank. On January 21, 2025, Donald Trump announced The Stargate Project, a joint venture between OpenAI, Oracle, SoftBank and MGX to build an AI infrastructure system in conjunction with the US government. The project takes its name from OpenAI's existing "Stargate" supercomputer project and is estimated to cost $500 billion, which the partners planned to fund over the following four years. In July 2025, the United States Department of Defense announced that OpenAI had received a $200 million contract for military AI, along with Anthropic, Google, and xAI. In the same month, the company made a deal with the UK Government to use ChatGPT and other AI tools in public services. OpenAI subsequently began a $50 million fund to support nonprofit and community organizations.
In April 2025, OpenAI raised $40 billion at a $300 billion post-money valuation, the highest-value private technology deal in history. The financing round was led by SoftBank, with other participants including Microsoft, Coatue, Altimeter and Thrive. In July 2025, the company reported annualized revenue of $12 billion, up from $3.7 billion in 2024. The growth was driven by ChatGPT subscriptions, which reached 20 million paid subscribers by April 2025, up from 15.5 million at the end of 2024, alongside a rapidly expanding enterprise customer base that grew to five million business users. The company's cash burn remains high because of the intensive computational costs of training and operating large language models; it projects an $8 billion operating loss in 2025. OpenAI reports revised long-term spending projections totaling approximately $115 billion through 2029, with annual expenditures escalating to $17 billion in 2026, $35 billion in 2027, and $45 billion in 2028. These expenditures are primarily allocated to expanding compute infrastructure, developing proprietary AI chips, constructing data centers, and funding intensive model training programs, with more than half of the spending through the end of the decade expected to support research-intensive compute for model training and development. The company's financial strategy prioritizes market expansion and technological advancement over near-term profitability, with OpenAI targeting cash-flow-positive operations by 2029 and projecting revenue of approximately $200 billion by 2030. The spending trajectory reflects both the enormous capital requirements of scaling cutting-edge AI and OpenAI's intent to maintain a leading position in the artificial intelligence industry. In October 2025, OpenAI completed an employee share sale of up to $10 billion to existing investors, valuing the company at $500 billion.
The deal made OpenAI the world's most valuable privately held company, surpassing SpaceX.

On November 17, 2023, Sam Altman was removed as CEO after the board of directors (composed of Helen Toner, Ilya Sutskever, Adam D'Angelo and Tasha McCauley) cited a lack of confidence in him, and Chief Technology Officer Mira Murati took over as interim CEO. Greg Brockman, the president of OpenAI, was also removed as chairman of the board and resigned from the company's presidency shortly thereafter. Three senior OpenAI researchers subsequently resigned: director of research and GPT-4 lead Jakub Pachocki, head of AI risk Aleksander Mądry, and researcher Szymon Sidor. On November 18, 2023, there were reportedly talks of Altman returning as CEO amid pressure placed on the board by investors such as Microsoft and Thrive Capital, who objected to Altman's departure. Although Altman spoke in favor of returning to OpenAI, he has since stated that he considered starting a new company and bringing former OpenAI employees with him if talks to reinstate him failed. The board members agreed "in principle" to resign if Altman returned. On November 19, 2023, negotiations with Altman to return failed and Murati was replaced by Emmett Shear as interim CEO. The board initially contacted Anthropic CEO Dario Amodei (a former OpenAI executive) about replacing Altman, and proposed a merger of the two companies, but both offers were declined. On November 20, 2023, Microsoft CEO Satya Nadella announced Altman and Brockman would be joining Microsoft to lead a new advanced AI research team, but added that they were still committed to OpenAI despite recent events. Before the partnership with Microsoft was finalized, Altman gave the board another opportunity to negotiate with him.
About 738 of OpenAI's 770 employees, including Murati and Sutskever, signed an open letter stating they would quit their jobs and join Microsoft if the board did not rehire Altman and then resign. This prompted OpenAI investors to consider legal action against the board as well. In response, OpenAI management sent an internal memo to employees stating that negotiations with Altman and the board had resumed and would take some time. On November 21, 2023, after continued negotiations, Altman and Brockman returned to the company in their prior roles along with a reconstituted board made up of new members Bret Taylor (as chairman) and Lawrence Summers, with D'Angelo remaining. According to subsequent reporting, shortly before Altman's firing, some employees had raised concerns to the board about how he had handled the safety implications of a recent internal AI capability discovery. On November 29, 2023, OpenAI announced that an anonymous Microsoft employee had joined the board as a non-voting observer of the company's operations; Microsoft gave up the seat in July 2024. In February 2024, the Securities and Exchange Commission subpoenaed OpenAI's internal communications to determine whether Altman's alleged lack of candor had misled investors. In 2024, following the temporary removal of Sam Altman and his return, many employees gradually left OpenAI, including most of the original leadership team and a significant number of AI safety researchers. In August 2023, it was announced that OpenAI had acquired the New York-based start-up Global Illumination, a company that deploys AI to develop digital infrastructure and creative tools. In June 2024, OpenAI acquired Multi, a startup focused on remote collaboration. In March 2025, OpenAI reached a deal with CoreWeave to acquire $350 million worth of CoreWeave shares and access to AI infrastructure, in return for $11.9 billion paid over five years. Microsoft was already CoreWeave's biggest customer in 2024.
Alongside their other business dealings, OpenAI and Microsoft were renegotiating the terms of their partnership to facilitate a potential future initial public offering by OpenAI, while ensuring Microsoft's continued access to advanced AI models. On May 21, 2025, OpenAI announced the $6.5 billion acquisition of io, an AI hardware start-up founded in 2024 by former Apple designer Jony Ive. In September 2025, OpenAI agreed to acquire the product testing startup Statsig for $1.1 billion in an all-stock deal and appointed Statsig's founding CEO Vijaye Raji as OpenAI's chief technology officer of applications. The company also announced development of an AI-driven hiring service designed to rival LinkedIn. In October 2025, OpenAI acquired the personal finance app Roi, as well as Software Applications Incorporated, the developer of Sky, a macOS-based natural language interface designed to operate across desktop applications. The Sky team joined OpenAI, and the company announced plans to integrate Sky's capabilities into ChatGPT. In December 2025, it was announced OpenAI had agreed to acquire Neptune, an AI tooling startup that helps companies track and manage model training, for an undisclosed amount. In January 2026, it was announced OpenAI had acquired healthcare technology startup Torch for approximately $60 million. The acquisition followed the launch of OpenAI's ChatGPT Health product and was intended to strengthen the company's medical data and healthcare artificial intelligence capabilities.

OpenAI has been criticized for outsourcing the annotation of data sets to Sama, a company based in San Francisco that employed workers in Kenya. These annotations were used to train an AI model to detect toxicity, which could then be used to moderate toxic content, notably from ChatGPT's training data and outputs. However, these pieces of text usually contained detailed descriptions of various types of violence, including sexual violence.
A Time investigation uncovered that OpenAI had begun sending snippets of data to Sama as early as November 2021. The four Sama employees interviewed by Time described themselves as mentally scarred. OpenAI paid Sama $12.50 per hour of work, of which Sama passed on the equivalent of between $1.32 and $2.00 per hour post-tax to its annotators. Sama's spokesperson said that the $12.50 also covered other implicit costs, including infrastructure expenses, quality assurance and management. In 2024, OpenAI began collaborating with Broadcom to design a custom AI chip capable of both training and inference, targeted for mass production in 2026 and to be manufactured by TSMC on a 3 nm process node. The initiative was intended to reduce OpenAI's dependence on Nvidia GPUs, which are costly and in high demand. In January 2024, Arizona State University purchased ChatGPT Enterprise in OpenAI's first deal with a university. In June 2024, Apple Inc. signed a contract with OpenAI to integrate ChatGPT features into its products as part of its new Apple Intelligence initiative. In June 2025, OpenAI began renting Google Cloud's Tensor Processing Units (TPUs) to support ChatGPT and related services, marking its first meaningful use of non-Nvidia AI chips. In September 2025, it was revealed that OpenAI had signed a contract with Oracle to purchase $300 billion in computing power over the next five years. That same month, OpenAI and NVIDIA announced a memorandum of understanding that included a potential deployment of at least 10 gigawatts of NVIDIA systems and a $100 billion investment from NVIDIA in OpenAI. OpenAI expected the negotiations to be completed within weeks; as of January 2026, the deal has not been realized, and the two sides are rethinking the future of their partnership. In October 2025, OpenAI announced a multi-billion dollar deal with AMD under which OpenAI committed to purchasing six gigawatts' worth of AMD chips, starting with the MI450.
OpenAI will have the option to buy up to 160 million shares of AMD, about 10% of the company, depending on development, performance and share price targets. In December 2025, Disney said it would make a $1 billion investment in OpenAI, and signed a three-year licensing deal that will let users generate videos using Sora, OpenAI's short-form AI video platform. More than 200 Disney, Marvel, Star Wars and Pixar characters will be available to OpenAI users. In early 2026, Amazon entered advanced discussions to invest up to $50 billion in OpenAI as part of a potential artificial intelligence partnership. Under the proposed agreement, OpenAI's models could be integrated into Amazon's digital assistant Alexa and other internal projects. OpenAI provides LLMs to the Artificial Intelligence Cyber Challenge and to the Advanced Research Projects Agency for Health. In October 2024, The Intercept revealed that OpenAI's tools are considered "essential" for AFRICOM's mission and included in an "Exception to Fair Opportunity" contractual agreement between the United States Department of Defense and Microsoft. In December 2024, OpenAI said it would partner with defense-tech company Anduril to build drone defense technologies for the United States and its allies. In 2025, OpenAI's Chief Product Officer, Kevin Weil, was commissioned a lieutenant colonel in the U.S. Army to join Detachment 201 as a senior advisor. In June 2025, the U.S. Department of Defense awarded OpenAI a $200 million one-year contract to develop AI tools for military and national security applications. OpenAI announced a new program, OpenAI for Government, to give federal, state, and local governments access to its models, including ChatGPT.

Services

In February 2019, GPT-2 was announced, which gained attention for its ability to generate human-like text. In 2020, OpenAI announced GPT-3, a language model trained on large internet datasets.
GPT-3 is aimed at natural-language question answering, but it can also translate between languages and coherently generate improvised text. OpenAI also announced that an associated API, simply named "the API", would form the heart of its first commercial product. Eleven employees left OpenAI, mostly between December 2020 and January 2021, to establish Anthropic. In 2021, OpenAI introduced DALL-E, a specialized deep learning model adept at generating complex digital images from textual descriptions, utilizing a variant of the GPT-3 architecture. In December 2022, OpenAI received widespread media coverage after launching a free preview of ChatGPT, its new AI chatbot based on GPT-3.5. According to OpenAI, the preview received over a million signups within the first five days. According to anonymous sources cited by Reuters in December 2022, OpenAI Global, LLC was projecting $200 million of revenue in 2023 and $1 billion in 2024. After ChatGPT was launched, Google announced a similar chatbot, Bard, amid internal concerns that ChatGPT could threaten Google's position as a primary source of online information. On February 7, 2023, Microsoft announced that it was building AI technology based on the same foundation as ChatGPT into Microsoft Bing, Edge, Microsoft 365 and other products. On March 14, 2023, OpenAI released GPT-4, both as an API (with a waitlist) and as a feature of ChatGPT Plus. On November 6, 2023, OpenAI launched GPTs, allowing individuals to create customized versions of ChatGPT for specific purposes, further expanding the possibilities of AI applications across various industries. On November 14, 2023, OpenAI temporarily suspended new sign-ups for ChatGPT Plus due to high demand; access for new subscribers re-opened a month later, on December 13. In December 2024, the company launched the Sora model. It also launched OpenAI o1, an early reasoning model internally codenamed Strawberry.
Additionally, ChatGPT Pro, a $200/month subscription service offering unlimited o1 access and enhanced voice features, was introduced, and preliminary benchmark results for the upcoming OpenAI o3 models were shared. On January 23, 2025, OpenAI released Operator, an AI agent and web automation tool for accessing websites to execute goals defined by users; the feature was initially available only to Pro users in the United States. Nine days later, OpenAI released its Deep Research agent, which scored 27% accuracy on the benchmark Humanity's Last Exam (HLE). Altman later stated that GPT-4.5 would be the last model without full chain-of-thought reasoning. In July 2025, reports indicated that AI models by both OpenAI and Google DeepMind solved mathematics problems at the level of top-performing students in the International Mathematical Olympiad. OpenAI's large language model achieved gold medal-level performance, reflecting significant progress in AI's reasoning abilities. On October 6, 2025, OpenAI unveiled its Agent Builder platform during the company's DevDay event. The platform includes a visual drag-and-drop interface that lets developers and businesses design, test, and deploy agentic workflows with limited coding. On October 21, 2025, OpenAI introduced ChatGPT Atlas, a browser integrating the ChatGPT assistant directly into web navigation, to compete with existing browsers such as Google Chrome and Apple Safari. On December 11, 2025, OpenAI announced GPT-5.2, saying the model would be better at creating spreadsheets, building presentations, perceiving images, writing code and understanding long context. On January 27, 2026, OpenAI introduced Prism, a LaTeX-native workspace meant to assist scientists with research and writing. The platform uses GPT-5.2 as a backend to automate the drafting of scientific papers, with features for managing citations, formatting complex equations, and real-time collaborative editing.
In March 2023, the company was criticized for disclosing particularly few technical details about products like GPT-4, contradicting its initial commitment to openness and making it harder for independent researchers to replicate its work and develop safeguards. OpenAI cited competitiveness and safety concerns to justify the change. OpenAI's former chief scientist Ilya Sutskever argued in 2023 that open-sourcing increasingly capable models was increasingly risky, and that the safety reasons for not open-sourcing the most potent AI models would become "obvious" in a few years. In September 2025, OpenAI published a study on how people use ChatGPT for everyday tasks. The study found that "non-work tasks" (according to an LLM-based classifier) account for more than 72 percent of all ChatGPT usage, with only a minority of overall usage related to business productivity. In July 2023, OpenAI launched the superalignment project, aiming within four years to determine how to align future superintelligent systems. OpenAI promised to dedicate 20% of its computing resources to the project, although team members later said they received nothing close to that share. OpenAI ended the project in May 2024 after its co-leaders Ilya Sutskever and Jan Leike left the company. In August 2025, OpenAI was criticized after thousands of private ChatGPT conversations were inadvertently exposed to public search engines like Google through an experimental "share with search engines" feature. The opt-in toggle, intended to let users make specific chats discoverable, resulted in some discussions, including personal details such as names, locations, and intimate topics, appearing in search results when users accidentally enabled it while sharing links. OpenAI announced the feature's permanent removal on August 1, 2025, and began coordinating with search providers to remove the exposed content, emphasizing that it was not a security breach but a design flaw that heightened privacy risks.
CEO Sam Altman acknowledged the issue in a podcast, noting that users often treat ChatGPT as a confidant for deeply personal matters, which amplified concerns about AI handling sensitive data.

Management

In 2018, Musk resigned from his board seat, citing "a potential future conflict [of interest]" with his role as CEO of Tesla due to Tesla's AI development for self-driving cars. OpenAI stated that Musk's financial contributions were below $45 million. On March 3, 2023, Reid Hoffman resigned from his board seat, citing a desire to avoid conflicts of interest with his investments in AI companies via Greylock Partners and his co-founding of the AI startup Inflection AI. Hoffman remained on the board of Microsoft, a major investor in OpenAI. In May 2024, Chief Scientist Ilya Sutskever resigned and was succeeded by Jakub Pachocki; co-leader Jan Leike also departed amid concerns over safety and trust. OpenAI subsequently signed content deals with Reddit, News Corp, Axios, and Vox Media, and Paul Nakasone joined its board. In August 2024, cofounder John Schulman left OpenAI to join Anthropic, and OpenAI's president Greg Brockman took extended leave until November. In September 2024, CTO Mira Murati left the company. In November 2025, Lawrence Summers resigned from the board of directors.

Governance and legal issues

In May 2023, Sam Altman, Greg Brockman and Ilya Sutskever posted recommendations for the governance of superintelligence. They stated that superintelligence could arrive within the next 10 years, allowing a "dramatically more prosperous future", and that "given the possibility of existential risk, we can't just be reactive". They proposed creating an international watchdog organization similar to the IAEA to oversee AI systems above a certain capability threshold, while suggesting that relatively weak AI systems below that threshold should not be overly regulated.
They also called for more technical safety research on superintelligences, and asked for more coordination, for example through governments launching a joint project which "many current efforts become part of". In July 2023, the FTC issued a civil investigative demand to OpenAI to investigate whether the company's data security and privacy practices in developing ChatGPT were unfair or harmed consumers (including through reputational harm) in violation of Section 5 of the Federal Trade Commission Act of 1914. Such demands are typically preliminary and nonpublic, but the FTC's document was leaked. The investigation covered allegations that the company had scraped public data and published false and defamatory information; the FTC asked OpenAI for comprehensive information about its technology and privacy safeguards, as well as any steps taken to prevent the recurrence of situations in which its chatbot generated false and derogatory content about people. The agency also raised concerns about "circular" spending arrangements, for example Microsoft extending Azure credits to OpenAI while both companies shared engineering talent, and warned that such structures could negatively affect the public. In September 2024, OpenAI's global affairs chief endorsed the UK's "smart" AI regulation during testimony to a House of Lords committee. In February 2025, OpenAI CEO Sam Altman stated that the company was interested in collaborating with the People's Republic of China, despite regulatory restrictions imposed by the U.S. government. The shift came in response to the growing influence of the Chinese artificial intelligence company DeepSeek, which has disrupted the AI market with open models, including DeepSeek V3 and DeepSeek R1. Following DeepSeek's market emergence, OpenAI enhanced its security protocols to protect proprietary development techniques from industrial espionage.
Some industry observers noted similarities between DeepSeek's model distillation approach and OpenAI's methodology, though no formal intellectual property claim was filed. According to Oliver Roberts, as of March 2025 the United States had 781 state AI bills or laws. OpenAI advocated for preempting state AI laws with federal law. According to Scott Kohler, OpenAI has opposed California's AI legislation and argued that the state bill encroaches on matters better handled at the federal level. Public Citizen opposed federal preemption on AI and pointed to OpenAI's growth and valuation as evidence that existing state laws have not hampered innovation. Before May 2024, OpenAI required departing employees to sign a lifelong non-disparagement agreement forbidding them from criticizing OpenAI or even acknowledging the agreement's existence. Daniel Kokotajlo, a former employee, publicly stated that he forfeited his vested equity in OpenAI in order to leave without signing. Sam Altman stated that he had been unaware of the equity cancellation provision and that OpenAI never enforced it to cancel any employee's vested equity; however, leaked documents and emails contradicted this claim. On May 23, 2024, OpenAI sent a memo releasing former employees from the agreement. OpenAI was sued for copyright infringement by authors Sarah Silverman, Matthew Butterick, Paul Tremblay and Mona Awad in July 2023. In September 2023, 17 authors, including George R. R. Martin, John Grisham, Jodi Picoult and Jonathan Franzen, joined the Authors Guild in filing a class action lawsuit against OpenAI, alleging that the company's technology was illegally using their copyrighted work. The New York Times also sued the company in late December 2023. In May 2024 it was revealed that OpenAI had destroyed its Books1 and Books2 training datasets, which were used in the training of GPT-3 and which the Authors Guild believed to have contained over 100,000 copyrighted books.
In 2021, OpenAI developed a speech recognition tool called Whisper, which it used to transcribe more than one million hours of YouTube videos into text for training GPT-4. The automated transcription raised concerns among OpenAI employees about potential violations of YouTube's terms of service, which prohibit the use of videos for applications independent of the platform, as well as any type of automated access to its videos. Despite these concerns, the project proceeded with notable involvement from OpenAI's president, Greg Brockman, and the resulting dataset proved instrumental in training GPT-4. In February 2024, The Intercept, as well as Raw Story and Alternate Media Inc., filed copyright lawsuits against OpenAI. The lawsuits were said to chart a new legal strategy for digital-only publishers to sue OpenAI. On April 30, 2024, eight newspapers filed a lawsuit in the Southern District of New York against OpenAI and Microsoft, claiming illegal harvesting of their copyrighted articles. The suing publications were The Mercury News, The Denver Post, The Orange County Register, St. Paul Pioneer Press, Chicago Tribune, Orlando Sentinel, Sun Sentinel, and New York Daily News. In June 2023, a lawsuit filed in San Francisco, California, by sixteen anonymous plaintiffs claimed that OpenAI had scraped 300 billion words online without consent and without registering as a data broker. The plaintiffs also claimed that OpenAI and its partner and customer Microsoft continued to unlawfully collect and use personal data from millions of consumers worldwide to train artificial intelligence models. On May 22, 2024, OpenAI entered into an agreement with News Corp to integrate news content from The Wall Street Journal, the New York Post, The Times, and The Sunday Times into its AI platform.
Meanwhile, other publications like The New York Times chose to sue OpenAI and Microsoft for copyright infringement over the use of their content to train AI models. In November 2024, a coalition of Canadian news outlets, including the Toronto Star, Metroland Media, Postmedia, The Globe and Mail, The Canadian Press and CBC, sued OpenAI for using their news articles to train its software without permission. In an October 2024 New York Times interview, Suchir Balaji accused OpenAI of violating copyright law in developing its commercial LLMs, which he had helped engineer. He was a likely witness in a major copyright trial against the AI company, and was one of several of its current or former employees named in court filings as potentially having documents relevant to the case. On November 26, 2024, Balaji died by suicide. His death prompted the circulation of conspiracy theories alleging that he had been deliberately silenced. California Congressman Ro Khanna endorsed calls for an investigation. On April 24, 2025, Ziff Davis sued OpenAI in Delaware federal court for copyright infringement. Ziff Davis is known for publications such as ZDNet, PCMag, CNET, IGN and Lifehacker. In April 2023, the EU's European Data Protection Board (EDPB) formed a dedicated task force on ChatGPT "to foster cooperation and to exchange information on possible enforcement actions conducted by data protection authorities" based on the "enforcement action undertaken by the Italian data protection authority against OpenAI about the ChatGPT service". In late April 2024, NOYB filed a complaint with the Austrian Datenschutzbehörde against OpenAI for violating the European General Data Protection Regulation. A text created with ChatGPT gave a false date of birth for a living person without giving the individual the option to see the personal data used in the process. A request to correct the mistake was denied. 
Additionally, OpenAI claimed that neither the recipients of ChatGPT's output nor the sources used could be made available. OpenAI was criticized for lifting its ban on using ChatGPT for "military and warfare". Up until January 10, 2024, its "usage policies" included a ban on "activity that has high risk of physical harm, including", specifically, "weapons development" and "military and warfare". Its new policies prohibit "[using] our service to harm yourself or others" and using it to "develop or use weapons". In August 2025, the parents of a 16-year-old boy who died by suicide filed a wrongful death lawsuit against OpenAI (and CEO Sam Altman), alleging that months of conversations with ChatGPT about mental health and methods of self-harm contributed to their son's death and that safeguards were inadequate for minors. OpenAI expressed condolences and said it was strengthening protections (including updated crisis response behavior and parental controls). Coverage described it as a first-of-its-kind wrongful death case targeting the company's chatbot. The complaint was filed in California state court in San Francisco. In November 2025, the Social Media Victims Law Center and Tech Justice Law Project filed seven lawsuits against OpenAI, four of which alleged wrongful death. The suits were filed on behalf of Zane Shamblin, 23, of Texas; Amaurie Lacey, 17, of Georgia; Joshua Enneking, 26, of Florida; and Joe Ceccanti, 48, of Oregon, each of whom died by suicide after prolonged ChatGPT usage. In December 2025, Stein-Erik Soelberg, who was 56 years old at the time, allegedly murdered his mother Suzanne Adams. In the months prior, Soelberg, who was paranoid and delusional, had often discussed his ideas with ChatGPT. Adams's estate then sued OpenAI, claiming that the company shared responsibility due to the risk of "chatbot psychosis", although chatbot psychosis is not a recognized medical diagnosis. OpenAI responded that it would make ChatGPT safer for users who are disconnected from reality. 
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Origin_of_life_on_Earth] | [TOKENS: 10785] |
Abiogenesis Abiogenesis or the origin of life (sometimes called biopoesis) is the natural process by which life arises from non-living matter, such as simple organic compounds. The prevailing scientific hypothesis is that the transition from non-living to living entities on Earth was not a single event, but a process of increasing complexity involving the formation of a habitable planet, the prebiotic synthesis of organic molecules, molecular self-replication, self-assembly, autocatalysis, and the emergence of cell membranes. The transition from non-life to life has not been observed experimentally, but many proposals have been made for different stages of the process. The study of abiogenesis aims to determine how pre-life chemical reactions gave rise to life under conditions strikingly different from those on Earth today. It uses tools from biology and chemistry, attempting a synthesis of many sciences. Life functions through the chemistry of carbon and water, and builds on four chemical families: lipids for cell membranes, carbohydrates such as sugars, amino acids for protein metabolism, and the nucleic acids DNA and RNA for heredity. A theory of abiogenesis must explain the origins and interactions of these classes of molecules. Many approaches investigate how self-replicating molecules came into existence. Researchers think that life descends from an RNA world, although other self-replicating and self-catalyzing molecules may have preceded RNA. Other approaches ("metabolism-first" hypotheses) focus on how catalysis on the early Earth might have provided the precursor molecules for self-replication. The 1952 Miller–Urey experiment demonstrated that amino acids can be synthesized from inorganic compounds under conditions like early Earth's. Subsequently, amino acids have been found in meteorites, comets, asteroids, and star-forming regions of space. 
While the last universal common ancestor of all modern organisms (LUCA) existed millions of years after the origin of life, its study can guide research into early universal characteristics. A genomics approach has sought to characterize LUCA by identifying the genes shared by Archaea and Bacteria, major branches of life. It appears there are 60 proteins common to all life and 355 prokaryotic genes that trace to LUCA; their functions imply that LUCA was anaerobic with the Wood–Ljungdahl pathway, deriving energy by chemiosmosis, and used DNA, the genetic code, and ribosomes. Earlier cells might have had a leaky membrane and been powered by a naturally occurring proton gradient near a deep-sea white smoker hydrothermal vent; or, life may have originated inside the continental crust or in water at Earth's surface. Although Earth is the only place known to harbor life, astrobiologists assume that life exists and came into being by similar processes on other planets. Geochemical and fossil evidence informs most studies. The Earth formed about 4.54 Gya, and the earliest evidence of life on Earth, from Western Australia, dates from 3.8 Gya. Fossil micro-organisms may have lived in hydrothermal vent precipitates from Quebec, soon after ocean formation during the Hadean, so the process appears to have been relatively rapid in terms of geological time. Overview Life consists of reproduction with (heritable) variations. NASA defines life as "a self-sustaining chemical system capable of Darwinian evolution." Such a system is complex; the last universal common ancestor (LUCA), presumably a single-celled organism which lived some 4 billion years ago, already had hundreds of genes encoded in the DNA genetic code that is universal today. That in turn implies a suite of cellular machinery including messenger RNA, transfer RNA, and ribosomes to translate the code into proteins. 
Those proteins included enzymes to operate its anaerobic respiration via the Wood–Ljungdahl metabolic pathway, and a DNA polymerase to replicate its genetic material. The challenge for origin of life researchers is to explain how such a complex and tightly interlinked system could develop by evolutionary steps, as at first sight all its parts are necessary to enable it to function. For example, a cell, whether the LUCA or in a modern organism, copies its DNA with the DNA polymerase enzyme, which is itself produced by translating the DNA polymerase gene in the DNA. Neither the enzyme nor the DNA can be produced without the other. The evolutionary process could have started with molecular self-replication, self-assembly such as of cell membranes, and autocatalysis via RNA ribozymes in an RNA world environment. The transition from non-life to life has not been observed experimentally. Some scientists see both life and the origin of life as aspects of the same process. The preconditions to the development of a living cell like the LUCA are known, though disputed in detail: a habitable world is formed with a supply of minerals and liquid water. Prebiotic synthesis creates a range of simple organic compounds, which are assembled into polymers such as proteins and RNA. On the other hand, the process after the LUCA is readily understood: biological evolution caused the development of a wide range of species with varied forms and biochemical capabilities. However, the derivation of the LUCA from simple components is far from understood. Although Earth remains the only place where life is known, the science of astrobiology seeks evidence of life on other planets. The 2015 NASA strategy on the origin of life aimed to solve the puzzle by identifying interactions, intermediary structures and functions, energy sources, and environmental factors that contributed to evolvable macromolecular systems, and mapping the chemical landscape of potential primordial informational polymers. 
The advent of such polymers was most likely a critical step in prebiotic chemical evolution. Those polymers derived, in turn, from simple organic compounds such as nucleobases, amino acids, and sugars, likely formed by reactions in the environment. A successful theory of the origin of life must explain how all these chemicals came into being. Pre-1960s conceptual history One ancient view of the origin of life, from Aristotle until the 19th century, was of spontaneous generation. This held that "lower" animals such as insects were generated by decaying organic substances, and that life arose by chance. This was questioned from the 17th century, in works like Thomas Browne's Pseudodoxia Epidemica. In 1665, Robert Hooke published the first drawings of a microorganism. In 1676, Antonie van Leeuwenhoek drew and described microorganisms, probably protozoa and bacteria. Van Leeuwenhoek disagreed with spontaneous generation, and by the 1680s convinced himself, using experiments ranging from sealed and open meat incubation and the close study of insect reproduction, that the theory was incorrect. In 1668 Francesco Redi showed that no maggots appeared in meat when flies were prevented from laying eggs. By the middle of the 19th century, spontaneous generation was considered disproven. Dating back to Anaxagoras in the 5th century BC, panspermia is the idea that life originated elsewhere in the universe and came to Earth. The modern version of panspermia holds that life may have been distributed to Earth by meteoroids, asteroids, comets or planetoids. This shifts the origin of life to another heavenly body. The advantage is that life is not required to have formed on each planet it occurs on, but in a more limited set of locations, and then spread about the galaxy to other star systems. There is some interest in the possibility that life originated on Mars and later transferred to Earth. 
The idea that life originated from non-living matter in slow stages appeared in Herbert Spencer's 1864–1867 book Principles of Biology, and in William Turner Thiselton-Dyer's 1879 paper "On spontaneous generation and evolution". On 1 February 1871 Charles Darwin wrote about these publications to Joseph Hooker, and set out his own speculation that the original spark of life may have been in a "warm little pond, with all sorts of ammonia and phosphoric salts,—light, heat, electricity &c present, that a protein compound was chemically formed". Darwin explained that "at the present day such matter would be instantly devoured or absorbed, which would not have been the case before living creatures were formed." Alexander Oparin in 1924 and J. B. S. Haldane in 1929 proposed that the earliest cells slowly self-organized from a primordial soup, the Oparin–Haldane hypothesis. Haldane suggested that the Earth's prebiotic oceans consisted of a "hot dilute soup" in which organic compounds could have formed. J. D. Bernal showed that such mechanisms could form most of the necessary molecules for life from inorganic precursors. In 1967, he suggested three "stages": the origin of biological monomers; the origin of biological polymers; and the evolution from molecules to cells. In 1952, Stanley Miller and Harold Urey carried out a chemical experiment to demonstrate how organic molecules could have formed spontaneously from inorganic precursors under prebiotic conditions like those posited by the Oparin–Haldane hypothesis. It used a highly reducing (lacking oxygen) mixture of gases—methane, ammonia, and hydrogen, with water vapor—to form organic monomers such as amino acids. Bernal said of the Miller–Urey experiment that "it is not enough to explain the formation of such molecules, what is necessary, is a physical-chemical explanation of the origins of these molecules that suggests the presence of suitable sources and sinks for free energy." 
However, current scientific consensus describes the primitive atmosphere as weakly reducing or neutral, diminishing the amount and variety of amino acids that could be produced. The addition of iron and carbonate minerals, present in early oceans, produces a diverse array of amino acids. Later work has focused on two other potential reducing environments: outer space and deep-sea hydrothermal vents. Producing a habitable Earth Soon after the Big Bang, roughly 14 Gya, the only chemical elements present in the universe were hydrogen, helium, and lithium, the three lightest atoms in the periodic table. These elements gradually accreted and began orbiting in disks of gas and dust. Gravitational accretion of material at the hot and dense centers of these protoplanetary disks formed stars by the fusion of hydrogen. Early stars were massive and short-lived, producing all the heavier elements by stellar nucleosynthesis. Such element formation proceeds up to the most stable element, iron-56. Heavier elements were formed during supernovae at the end of a star's lifecycle. Carbon, currently the fourth most abundant element in the universe, was formed mainly in white dwarf stars. As these stars reached the end of their lifecycles, they ejected heavier elements, including carbon and oxygen, throughout the universe. These allowed for the formation of rocky planets. According to the nebular hypothesis, the Solar System began to form 4.6 Gya with the gravitational collapse of part of a giant molecular cloud. Most of the collapsing mass collected in the center, forming the Sun, while the rest flattened into a protoplanetary disk out of which the planets formed. The age of the Earth is 4.54 Gya as found by radiometric dating of calcium-aluminium-rich inclusions in carbonaceous chondrite meteorites, the oldest material in the Solar System. Earth, during the Hadean eon (from its formation until 4.031 Gya), was at first inhospitable to life. 
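The radiometric ages quoted here follow from the exponential decay law. As a simplified single parent–daughter sketch (the actual dating of calcium-aluminium-rich inclusions uses lead–lead isochrons built on two uranium decay chains), with N the surviving parent atoms and D the accumulated daughter atoms in a closed mineral system:

```latex
N(t) = N_0 e^{-\lambda t}, \qquad D(t) = N_0 - N(t)
\quad\Longrightarrow\quad
t = \frac{1}{\lambda}\,\ln\!\left(1 + \frac{D}{N}\right),
\qquad \lambda = \frac{\ln 2}{t_{1/2}}
```

Measuring the present-day ratio D/N in such a system thus yields its crystallization age.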
During its formation, the Earth lost much of its initial mass, and so lacked the gravity to hold molecular hydrogen and the bulk of the original inert gases. Soon after initial accretion of Earth at 4.48 Gya, its collision with Theia, a hypothesised impactor, is thought to have created the ejected debris that eventually formed the Moon. This impact removed the Earth's primary atmosphere, leaving behind clouds of viscous silicates and carbon dioxide. This unstable atmosphere was short-lived, soon condensing to form the bulk silicate Earth, leaving behind an atmosphere largely consisting of water vapor, nitrogen, and carbon dioxide, with smaller amounts of carbon monoxide, hydrogen, and sulfur compounds. The solution of carbon dioxide in water is thought to have made the seas slightly acidic, with a pH of about 5.5. Condensation to form liquid oceans is theorised to have occurred as early as the Moon-forming impact. This scenario is supported by the dating of 4.404 Gya zircon crystals with high δ18O values from metamorphosed quartzite of Mount Narryer in Western Australia. The Hadean atmosphere has been characterized as a "gigantic, productive outdoor chemical laboratory," similar to volcanic gases today which still support some abiotic chemistry. Despite the likely increased volcanism from early plate tectonics, the Earth may have been a predominantly water world between 4.4 and 4.3 Gya. It is debated whether crust was exposed above this ocean. Immediately after the Moon-forming impact, Earth likely had little if any continental crust, a turbulent atmosphere, and a hydrosphere subject to intense ultraviolet light from a T Tauri stage Sun. It was also affected by cosmic radiation, and continued asteroid and comet impacts. The Late Heavy Bombardment hypothesis posits that a period of intense impact occurred at 4.1 to 3.8 Gya during the Hadean and early Archean eons. 
Originally it was thought that the Late Heavy Bombardment was a single cataclysmic impact event occurring at 3.9 Gya; this would have had the potential to sterilize Earth by volatilizing liquid oceans and blocking sunlight needed for photosynthesis, delaying the earliest possible emergence of life. More recent research questioned the intensity of the Late Heavy Bombardment and its potential for sterilization. If it was not one giant impact but a period of raised impact rate, it would have had much less destructive power. The 3.9 Gya date arose from dating of Apollo mission sample returns collected mostly near the Imbrium Basin, biasing the age of recorded impacts. Impact modelling of the lunar surface reveals that rather than a cataclysmic event at 3.9 Gya, multiple small-scale, short-lived periods of bombardment likely occurred. Terrestrial data backs this idea by showing multiple periods of ejecta in the rock record both before and after the 3.9 Gya marker, suggesting that the early Earth was subject to continuous smaller impacts with much less potential to extinguish emerging life. If life evolved in the ocean at depths of more than ten meters, it would have been shielded both from late impacts and the then high levels of ultraviolet radiation from the sun. The available energy is maximized at 100–150 °C, the temperatures at which hyperthermophilic bacteria and thermoacidophilic archaea live. Based on evidence from the geologic record, life most likely emerged on Earth between 4.32 and 3.48 Gya. In 2017, the earliest physical evidence of life was reported to consist of microbialites in the Nuvvuagittuq Greenstone Belt of Northern Quebec, in banded iron formation rocks at least 3.77 and possibly as old as 4.32 Gya. The micro-organisms could have lived within hydrothermal vent precipitates, soon after the 4.4 Gya formation of oceans during the Hadean. The microbes resemble modern hydrothermal vent bacteria, supporting the view that abiogenesis began in such an environment. 
Later research disputed this interpretation of the data, stating that the observations may be better explained by abiotic processes in silica-rich waters, "chemical gardens," circulating hydrothermal fluids, or volcanic ejecta. Biogenic graphite has been found in 3.7 Gya metasedimentary rocks from southwestern Greenland and in microbial mat fossils from 3.49 Gya cherts in the Pilbara region of Western Australia. Evidence of early life in rocks from Akilia Island, near the Isua supracrustal belt in southwestern Greenland, dating to 3.7 Gya, has shown biogenic carbon isotopes. In other parts of the Isua supracrustal belt, graphite inclusions trapped within garnet crystals are connected to the other elements of life: oxygen, nitrogen, and possibly phosphorus in the form of phosphate, providing further evidence for life 3.7 Gya. In the Pilbara region of Western Australia, compelling evidence of early life was found in pyrite-bearing sandstone in a fossilized beach, with rounded tubular cells that oxidized sulfur by photosynthesis in the absence of oxygen. Carbon isotope ratios on graphite inclusions from the Jack Hills zircons suggest that life could have existed on Earth from 4.1 Gya. A 2024 study inferred LUCA's age as around 4.2 Gya (4.09–4.33 Gya) by analysing pre-LUCA gene duplicates, with calibration from fossil micro-organisms, much sooner after the origin of life than previously thought. The Pilbara region of Western Australia contains the Dresser Formation with rocks 3.48 Gya, including layered structures called stromatolites. Their modern counterparts are created by photosynthetic micro-organisms including cyanobacteria. These lie within undeformed hydrothermal-sedimentary strata; their texture indicates a biogenic origin. Parts of the Dresser formation preserve hot springs on land, but other regions seem to have been shallow seas. A molecular clock analysis suggests the LUCA emerged prior to 3.9 Gya. 
Producing molecules: prebiotic synthesis All chemical elements derive from stellar nucleosynthesis except for hydrogen and some helium and lithium. Basic chemical ingredients of life – the carbon-hydrogen molecule (CH), the carbon-hydrogen positive ion (CH+) and the carbon ion (C+) – can be produced by ultraviolet light from stars. Complex molecules, including organic molecules, form naturally both in space and on planets. Organic molecules on the early Earth could have had either terrestrial origins, with organic molecule synthesis driven by impact shocks or by other energy sources, such as ultraviolet light, redox coupling, or electrical discharges; or extraterrestrial origins (pseudo-panspermia), with organic molecules formed in interstellar dust clouds raining down on to the planet. An organic compound is a chemical whose molecules contain carbon. Carbon is abundant in the Sun, stars, comets, and in the atmospheres of most planets of the Solar System. Organic compounds are relatively common in space, formed by "factories of complex molecular synthesis" which occur in molecular clouds and circumstellar envelopes, and chemically evolve after reactions are initiated mostly by ionizing radiation. Purine and pyrimidine nucleobases including guanine, adenine, cytosine, uracil, and thymine, as well as sugars, have been found in meteorites. These could have provided the materials for DNA and RNA to form on the early Earth. The amino acid glycine was found in material ejected from comet Wild 2; it had earlier been detected in meteorites. Comets are encrusted with dark material, thought to be a tar-like organic substance formed from simple carbon compounds under ionizing radiation. A rain of material from comets could have brought such complex organic molecules to Earth. During the Late Heavy Bombardment, meteorites may have delivered up to five million tons of organic prebiotic elements to Earth per year. Currently 40,000 tons of cosmic dust falls to Earth each year. 
Polycyclic aromatic hydrocarbons (PAH) are the most common and abundant polyatomic molecules in the observable universe, and are a major store of carbon. They seem to have formed shortly after the Big Bang, and are associated with new stars and exoplanets. They are a likely constituent of Earth's primordial sea. PAHs have been detected in nebulae, and in the interstellar medium, in comets, and in meteorites. A star, HH 46-IR, resembling the sun early in its life, is surrounded by a disk of material which contains molecules including cyanide compounds, hydrocarbons, and carbon monoxide. PAHs in the interstellar medium can be transformed through hydrogenation, oxygenation, and hydroxylation to more complex organic compounds used in living cells. Organic compounds introduced on Earth by interstellar dust particles can help to form complex molecules, thanks to their peculiar surface-catalytic activities. The RNA component uracil and related molecules, including xanthine, in the Murchison meteorite were likely formed extraterrestrially, as suggested by studies of 12C/13C isotopic ratios. NASA studies of meteorites suggest that all four DNA nucleobases (adenine, guanine and related organic molecules) have been formed in outer space. The cosmic dust permeating the universe contains complex organics ("amorphous organic solids with a mixed aromatic–aliphatic structure") that could be created rapidly by stars. Glycolaldehyde, a sugar molecule and RNA precursor, has been detected in regions of space including around protostars and on meteorites. As early as the 1860s, experiments demonstrated that biologically relevant molecules can be produced from interaction of simple carbon sources with abundant inorganic catalysts. The spontaneous formation of complex polymers from abiotically generated monomers under the conditions posited by the "soup" theory is not straightforward. 
Besides the necessary basic organic monomers, compounds that would have prohibited the formation of polymers were also formed in high concentration during the Miller–Urey experiment and Joan Oró experiments. Biology uses essentially 20 amino acids for its coded protein enzymes, representing a very small subset of the structurally possible products. Since life tends to use whatever is available, an explanation is needed for why the set used is so small. Formamide is attractive as a medium that potentially provided a source of amino acid derivatives from simple aldehyde and nitrile feedstocks. Alexander Butlerov showed in 1861 that the formose reaction created sugars including tetroses, pentoses, and hexoses when formaldehyde is heated under basic conditions with divalent metal ions like calcium. R. Breslow proposed that the reaction was autocatalytic in 1959. Nucleobases, such as guanine and adenine, can be synthesized from simple carbon and nitrogen sources, such as hydrogen cyanide (HCN) and ammonia. On early Earth, HCN has been shown in modelling experiments to have likely been supplied via photochemical production in transient, highly reducing atmospheres (see Prebiotic atmosphere) following major impacts. Formamide, produced by the reaction of water and HCN, is ubiquitous and produces all four ribonucleotides when warmed with terrestrial minerals. It can be concentrated by the evaporation of water. HCN is poisonous only to aerobic organisms, which did not exist during the earliest phases of life's origin. It can contribute to chemical processes such as the synthesis of the amino acid glycine. DNA and RNA components including uracil, cytosine and thymine can be synthesized under outer space conditions, using starting chemicals such as pyrimidine found in meteorites. 
Pyrimidine may have been formed in red giant stars, in interstellar dust and gas clouds, or may have been synthesized on Earth via precursors such as cyanoacetylene and other intermediates made available following early asteroid impacts. All four RNA-bases may be synthesized from formamide in high-energy density events like extraterrestrial impacts. Several ribonucleotides for RNA formation have been synthesized in a laboratory environment which replicates prebiotic conditions via autocatalytic formose reaction. Other pathways for synthesizing bases from inorganic materials have been reported. Freezing temperatures assist the synthesis of purines, by concentrating key precursors such as HCN. However, while adenine and guanine require freezing conditions, cytosine and uracil may require boiling temperatures. Seven amino acids and eleven types of nucleobases formed in ice when ammonia and cyanide were left in a freezer for 25 years. S-triazines (alternative nucleobases), pyrimidines including cytosine and uracil, and adenine can be synthesized by subjecting a urea solution to freeze-thaw cycles under a reductive atmosphere with spark discharges. The unusual speed of these low-temperature reactions is due to eutectic freezing, which crowds impurities in microscopic pockets of liquid within the ice. Prebiotic peptide synthesis could have occurred by several routes. Some center on high temperature/concentration conditions in which condensation becomes energetically favorable, while others use plausible prebiotic condensing agents. Experimental evidence for the formation of peptides in uniquely concentrated environments is bolstered by work suggesting that wet-dry cycles and the presence of specific salts can greatly increase spontaneous condensation of glycine into poly-glycine chains. Other work suggests that while mineral surfaces, such as those of pyrite, calcite, and rutile catalyze peptide condensation, they also catalyze their hydrolysis. 
The authors suggest that additional chemical activation or coupling would be necessary to produce peptides at sufficient concentrations. Thus, mineral surface catalysis, while important, is not sufficient alone for peptide synthesis. Many prebiotically plausible condensing/activating agents have been identified, including the following: cyanamide, dicyanamide, dicyandiamide, diaminomaleonitrile, urea, trimetaphosphate, NaCl, CuCl2, (Ni,Fe)S, CO, carbonyl sulfide (COS), carbon disulfide (CS2), SO2, and diammonium phosphate (DAP). A 2024 experiment used a sapphire substrate with a web of thin cracks under a heat flow, mimicking deep-ocean vents, to concentrate prebiotically-relevant building blocks from a dilute mixture by up to three orders of magnitude. This could help to create biopolymers such as peptides. A similar role has been suggested for clays, though this speculation has not been supported through experimental evidence. The prebiotic synthesis of peptides from simpler molecules such as CO, NH3 and C, skipping the step of amino acid formation, is also very efficient. Producing protocells The largest unanswered question in evolution is how simple protocells first arose and differed in reproductive contribution to the following generation, thus initiating evolution. The lipid world theory postulates that the first self-replicating object was lipid-like. Phospholipids form lipid bilayers (as in cell membranes) in water while under agitation. These molecules were not present on early Earth, but other membrane-forming amphiphilic long-chain molecules were. These bodies may expand by insertion of additional lipids, and may spontaneously split into two offspring of similar size and composition. Lipid bodies may have provided sheltering envelopes for information storage, allowing the evolution of information-storing polymers like RNA. Only one or two types of vesicle-forming amphiphiles have been studied. 
There is an enormous number of possible arrangements of lipid bilayer membranes, and those with the best reproductive characteristics would have converged toward a hypercycle reaction, a positive feedback composed of two mutual catalysts represented by a membrane site and a specific compound trapped in the vesicle. Such site/compound pairs are transmissible to the daughter vesicles, leading to the emergence of distinct lineages of vesicles, subject to natural selection. A protocell is a self-organized, self-ordered, spherical collection of lipids proposed as a stepping-stone to life. A functional protocell has (as of 2014) not yet been achieved in a laboratory setting. Self-assembled vesicles are essential components of primitive cells. The theory of classical irreversible thermodynamics treats self-assembly under a generalized chemical potential within the framework of dissipative systems. The second law of thermodynamics requires that overall entropy increases, yet life is distinguished by its great degree of organization. Therefore, a boundary is needed to separate ordered life processes from chaotic non-living matter. Irene Chen and Jack W. Szostak suggest that elementary protocells can give rise to cellular behaviors including primitive forms of differential reproduction, competition, and energy storage. Competition for membrane molecules would favor stabilized membranes, suggesting a selective advantage for cross-linked fatty acids and even modern phospholipids. Such micro-encapsulation would allow for metabolism within the membrane and the exchange of small molecules, while retaining large biomolecules inside. Such a membrane is needed for a cell to create its own electrochemical gradient. 
Fatty acid vesicles in alkaline hydrothermal vent conditions can be stabilized by isoprenoids, synthesized by the formose reaction; the advantages and disadvantages of isoprenoids within the lipid bilayer in different microenvironments might have led to the divergence of the membranes of archaea and bacteria. Vesicles can undergo an evolutionary process under pressure cycling conditions. Simulating the systemic environment in tectonic fault zones within the Earth's crust, pressure cycling forms vesicles periodically, as well as random peptide chains which are selected for ability to integrate into the vesicle membrane. Further selection of vesicles for stability could lead to functional peptide structures, increasing vesicle survival rate. Producing biology Life requires a loss of entropy, or disorder, as molecules organize themselves into living matter. At the same time, the emergence of life is associated with the formation of structures beyond a certain threshold of complexity. The emergence of life with increasing order and complexity does not contradict the second law of thermodynamics, which states that overall entropy never decreases, since a living organism creates order in some places (e.g. its living body) at the expense of an increase of entropy elsewhere (e.g. heat and waste production). Multiple sources of energy were available for chemical reactions on the early Earth. Heat from geothermal processes is a standard energy source for chemistry. Other examples include sunlight, lightning, atmospheric entries of micro-meteorites, and implosion of bubbles in sea and ocean waves. This has been confirmed by experiments and simulations. Unfavorable reactions can be driven by highly favorable ones, as in the case of iron-sulfur chemistry. For example, this was probably important for carbon fixation.[a] Carbon fixation by reaction of CO2 with H2S via iron-sulfur chemistry is favorable, and occurs at neutral pH and 100 °C. 
Iron-sulfur surfaces, which are abundant near hydrothermal vents, can drive the production of small amounts of amino acids and other biomolecules. In 1961, Peter Mitchell proposed chemiosmosis as a cell's primary system of energy conversion. The mechanism, now ubiquitous in living cells, powers energy conversion in micro-organisms and in the mitochondria of eukaryotes, making it a likely candidate for early life. Mitochondria produce adenosine triphosphate (ATP), the energy currency of the cell used to drive cellular processes such as chemical syntheses. The mechanism of ATP synthesis involves a closed membrane in which the ATP synthase enzyme is embedded. The energy required to release strongly bound ATP has its origin in protons that move across the membrane. In modern cells, those proton movements are caused by the pumping of ions across the membrane, maintaining an electrochemical gradient. In the first organisms, the gradient could have been provided by the difference in chemical composition between the flow from a hydrothermal vent and the surrounding seawater, or perhaps meteoric quinones that were conducive to the development of chemiosmotic energy across lipid membranes if at a terrestrial origin. The PAH world hypothesis is a speculative hypothesis that proposes that polycyclic aromatic hydrocarbons (PAHs), known to be abundant in the universe, including in comets, and assumed to be abundant in the primordial soup of the early Earth, played a major role in the origin of life by mediating the synthesis of RNA molecules, leading into the RNA world. However, as yet, the hypothesis is untested. The RNA world hypothesis describes an early Earth with self-replicating and catalytic RNA but no DNA or proteins. It was proposed in 1962 by Alexander Rich; the term was coined by Walter Gilbert in 1986. Many researchers concur that an RNA world must have preceded modern DNA-based life. However, it may not have been the first to exist. 
There may have been over 30 chemical events between the pre-RNA world and near-LUCA, just involving RNA. RNA is central to the translation process. Small RNAs can catalyze all the chemical groups and information transfers required for life. RNA both expresses and maintains genetic information in modern organisms; its components are easily synthesized under early Earth conditions. The structure of the ribosome has been called the "smoking gun", with a central core of RNA and no amino acid side chains within 18 Å of the active site that catalyzes peptide bond formation. RNA replicase can both code and catalyse further RNA replication, i.e. it is autocatalytic. Some catalytic RNAs can link smaller RNA sequences together, enabling self-replication. Natural selection would then favor the proliferation of such autocatalytic sets. Self-assembly of RNA may occur spontaneously in hydrothermal vents. A preliminary form of tRNA could have assembled into a replicator molecule. When this began to replicate, it may have had all three mechanisms of Darwinian selection: heritability, variation, and differential reproduction. Its fitness would have depended on its ability to adapt, determined by its nucleotide sequence, and resource availability. In line with the RNA world hypothesis, much of modern biology's templated protein biosynthesis is done by RNA molecules—namely tRNAs and the ribosome (consisting of both protein and rRNA). The most central reaction of peptide bond synthesis is carried out by base catalysis by the 23S rRNA domain V. Di- and tripeptides can be synthesized with a system consisting of only aminoacyl phosphate adaptors and RNA guides. Aminoacylation ribozymes that can charge tRNAs with their cognate amino acids have been selected in in vitro experimentation. The first proteins had to arise without a fully-fledged system of protein biosynthesis. Random sequence peptides would not have had biological function. 
Thus, significant study has gone into exploring how early functional proteins could have arisen from random sequences. Evidence on hydrolysis rates shows that abiotically plausible peptides likely contained significant "nearest-neighbor" biases. This could have had some effect on early protein sequence diversity. A search found that approximately 1 in 10¹¹ random sequences had ATP binding function. Starting with the work of Carl Woese from 1977, genomics studies have placed the last universal common ancestor (LUCA) of all modern life-forms between Bacteria and a clade formed by Archaea and Eukaryota in the phylogenetic tree of life. It lived over 4 Gya. A minority of studies have placed the LUCA in Bacteria, proposing that Archaea and Eukaryota are evolutionarily derived from within Eubacteria; Thomas Cavalier-Smith suggested in 2006 that the phenotypically diverse bacterial phylum Chloroflexota contained the LUCA. In 2016, a set of 355 genes likely present in the LUCA was identified. A total of 6.1 million prokaryotic genes from Bacteria and Archaea were sequenced, identifying 355 protein clusters from among 286,514 protein clusters that were probably common to the LUCA. The results suggest that the LUCA was anaerobic, thermophilic, and nitrogen- and carbon-fixing, with a Wood–Ljungdahl (reductive acetyl-CoA) pathway. Its cofactors suggest dependence upon an environment rich in hydrogen, carbon dioxide, iron, and transition metals. Its genetic material was probably DNA, requiring the 4-nucleotide genetic code, messenger RNA, transfer RNA, and ribosomes to translate the code into proteins such as enzymes. LUCA likely inhabited an anaerobic hydrothermal vent setting in a geochemically active environment. It was evidently already a complex organism, and must have had precursors; it was not the first living thing. The physiology of LUCA has been in dispute. Previous research identified 60 proteins common to all life. 
Metabolic reactions inferred in LUCA are the incomplete reverse Krebs cycle, gluconeogenesis, the pentose phosphate pathway, glycolysis, reductive amination, and transamination. Suitable geological environments A variety of geologic and environmental settings have been proposed for an origin of life. These theories are often in competition with one another as there are many views of prebiotic compound availability, geophysical setting, and early life characteristics. The first organism on Earth likely differed from LUCA. Between the first appearance of life and where all modern phylogenies began branching, an unknown amount of time passed, with unknown gene transfers, extinctions, and adaptation to environmental niches. Modern phylogenies provide more genetic evidence about LUCA than about its precursors. Early micro-fossils may have come from a hot world of gases such as methane, ammonia, carbon dioxide, and hydrogen sulfide, toxic to much current life. Analysis of the tree of life places thermophilic and hyperthermophilic bacteria and archaea closest to the root, suggesting that life may have evolved in a hot environment. The deep sea or alkaline hydrothermal vent theory posits that life began at submarine hydrothermal vents. William Martin and Michael Russell have suggested that this could have been in metal-sulphide-walled compartments acting as precursors for cell walls. These form where hydrogen-rich fluids emerge from below the sea floor, as a result of serpentinization of ultra-mafic olivine with seawater and a pH interface with carbon dioxide-rich ocean water. The vents form a sustained chemical energy source derived from redox reactions, in which electron donors (molecular hydrogen) react with electron acceptors (carbon dioxide); see iron–sulfur world theory. These are exothermic reactions.[b] Russell demonstrated that alkaline vents create an abiogenic proton motive force chemiosmotic gradient, ideal for abiogenesis. 
Their microscopic compartments "provide a natural means of concentrating organic molecules," and, being composed of iron-sulfur minerals such as mackinawite, endowed these mineral cells with the catalytic properties envisaged by Günter Wächtershäuser. This movement of ions across the membrane depends on two factors: a diffusion force caused by the concentration gradient, and an electrostatic force caused by the electrical potential gradient. These two gradients together can be expressed as an electrochemical gradient, providing energy for abiogenic synthesis. The proton motive force measures the potential energy stored as proton and voltage gradients across a membrane (differences in proton concentration and electrical potential). The surfaces of mineral particles inside deep-ocean hydrothermal vents have catalytic properties similar to those of enzymes, and can create simple organic molecules, such as methanol (CH3OH) and formic, acetic, and pyruvic acids out of the dissolved CO2 in the water, if driven by an applied voltage or by reaction with H2 or H2S. Starting in 1981, researchers proposed that life might have started at hydrothermal vents, that spontaneous chemistry in the Earth's crust driven by rock–water interactions at disequilibrium thermodynamically underpinned life's origin, and that the founding lineages of the archaea and bacteria were H2-dependent autotrophs that used CO2 as their terminal acceptor in energy metabolism. In 2016, Martin suggested that the LUCA "may have depended heavily on the geothermal energy of the vent to survive". That same year, RNA was produced in synthetic alkaline hydrothermal chimneys simulating deep-sea vents. Researchers were able to generate RNA oligomers of up to 4 units in length. This RNA was synthesized using activated ribonucleotides. Additionally, these RNA oligomers could only be synthesized under certain conditions. Pores at deep sea hydrothermal vents are suggested to have been occupied by membrane-bound compartments which promoted biochemical reactions. 
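The proton motive force described above has a standard textbook form; as a sketch (not specific to any particular vent model), it combines the electrical and pH contributions across the membrane:

```latex
% Proton motive force: chemiosmotic potential across a membrane,
% combining the electrical potential difference and the pH difference.
\Delta p \;=\; \Delta\psi \;-\; \frac{2.303\,RT}{F}\,\Delta\mathrm{pH}
```

Here Δψ is the transmembrane electrical potential, ΔpH the proton concentration difference expressed on the pH scale, R the gas constant, T the absolute temperature, and F the Faraday constant; at 25 °C the factor 2.303 RT/F is about 59 mV per pH unit, which is why even a modest pH contrast between vent fluid and seawater represents a usable potential.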
Metabolic intermediates of the Krebs cycle, gluconeogenesis, amino acid biosynthetic pathways, glycolysis, and the pentose phosphate pathway, as well as sugars like ribose and lipid precursors, can form non-enzymatically under conditions relevant to deep-sea alkaline hydrothermal vents. If the deep marine hydrothermal setting was the site, then life could have arisen as early as 4.0–4.2 Gya. If life evolved in the ocean at depths of more than ten meters, it would have been shielded both from impacts and the then high levels of solar ultraviolet radiation. The available energy in hydrothermal vents is maximized at 100–150 °C, the temperatures at which hyperthermophilic bacteria and thermoacidophilic archaea live. Arguments against a hydrothermal origin of life state that hyperthermophily was a result of convergent evolution in bacteria and archaea, and that a mesophilic environment is more likely. Production of prebiotic organic compounds at hydrothermal vents is estimated to be 10⁸ kg/yr. While a large amount of key prebiotic compounds, such as methane, are found at vents, they are in far lower concentrations than in a Miller-Urey experiment environment. Additionally, some organic compounds originally thought to have been formed at vents are now understood to have been formed by other geological processes and later inherited by vents. Methane at alkaline vents, for example, was once thought to have been synthesized catalytically after serpentinization, but is now understood to more likely come from leached fluid inclusions formed deeper in oceanic crust from magmatic carbon. Methane concentrations at vents are 2–4 orders of magnitude lower than those in Miller-Urey experiments. Other counter-arguments include the inability to concentrate prebiotic materials, due to strong dilution by seawater. This open system cycles compounds through vent minerals, leaving little residence time to accumulate. 
All modern cells rely on phosphates and potassium for nucleotide backbone and protein formation respectively, making it likely that the first life forms shared these requirements. These elements were not available in high quantities in the Archaean oceans, as both primarily come from the weathering of continental rocks on land, far from vents, and phosphate is lost into relatively insoluble apatite (calcium phosphate). However, phosphate can be concentrated in lakes, and modern analogs exist, such as the most phosphate-rich natural body of water in the world, Last Chance Lake, Canada. Submarine hydrothermal vents are not conducive to the condensation reactions needed for polymerisation of macromolecules. An older argument was that key polymers were encapsulated in vesicles after condensation, which supposedly would not happen in saltwater. However, while salinity inhibits vesicle formation from low-diversity mixtures of fatty acids, vesicle formation from a broader, more realistic mix of fatty-acid and 1-alkanol species is more resilient. Importantly, no studies to date have been able to experimentally demonstrate de novo synthesis of sugars, amino acids, nucleobases, nucleosides, nucleotides, or membrane-forming fatty acids under plausible vent conditions. Surface bodies of water provide environments that dry out and rewet. Wet-dry cycles concentrate prebiotic compounds and enable condensation reactions to polymerise macromolecules. Moreover, lakes and ponds receive detrital input from weathering of continental apatite-containing rocks, the most common source of phosphates. The amount of exposed continental crust in the Hadean is unknown, but models of early ocean depths and rates of ocean island and continental crust growth make it plausible that there was exposed land. Another line of evidence for a surface start to life is the requirement for ultraviolet (UV) radiation for organism function. 
UV is necessary for the formation of the U+C nucleotide base pair by partial hydrolysis and nucleobase loss. Simultaneously, UV can be harmful and sterilising to life, especially for simple early lifeforms with little ability to repair radiation damage. Radiation levels from a young Sun were likely greater, and, with no ozone layer, harmful shortwave UV rays would reach the surface of Earth. For life to begin, a shielded environment with influx from UV-exposed sources is necessary to both benefit and protect from UV. Shielding under ice, liquid water, mineral surfaces (e.g. clay) or regolith is possible in a range of surface water settings. The deepest-branching lineages in phylogenies are thermophilic or hyperthermophilic, making it possible that LUCA and preceding lifeforms were similarly thermophilic. Hot springs are formed from the heating of groundwater by geothermal activity. This setting allows for influxes of material from deep penetrating waters and from surface runoff that transports eroded continental sediments. Interconnected groundwater systems create a mechanism for the distribution of life to a wider area. Mulkidjanian and co-authors argue that marine environments did not provide the ionic balance and composition universally found in cells, or the ions required by essential proteins and ribozymes, especially with respect to high K+/Na+ ratio, Mn2+, Zn2+ and phosphate concentrations. They argue that the only environments that do this are hot springs similar to those at Kamchatka. Mineral deposits in these environments under an anoxic atmosphere would have suitable pH, contain precipitates of photocatalytic sulfide minerals that absorb harmful ultraviolet radiation, and have wet-dry cycles that concentrate substrate solutions enough for spontaneous formation of biopolymers created both by chemical reactions in the hydrothermal environment, and by exposure to UV light during transport from vents to adjacent pools. 
The hypothesized pre-biotic environments are similar to hydrothermal vents, with additional components that help explain peculiarities of the LUCA. A phylogenomic and geochemical analysis of proteins plausibly traced to the LUCA shows that the ionic composition of its intracellular fluid is identical to that of hot springs. The LUCA likely was dependent upon synthesized organic matter for its growth. Experiments show that RNA-like polymers can be synthesized in wet-dry cycling and UV light exposure. These polymers were encapsulated in vesicles after condensation. Potential sources of organics at hot springs might have been transport by interplanetary dust particles, extraterrestrial projectiles, or atmospheric or geochemical synthesis. Hot springs could have been abundant in volcanic landmasses during the Hadean. The hypothesis of a mesophilic start in surface bodies of water has evolved from Darwin's concept of a 'warm little pond' and the Oparin-Haldane hypothesis. Freshwater bodies under temperate climates can accumulate prebiotic materials while providing suitable environmental conditions conducive to simple life forms. The Archaean climate is uncertain. Atmospheric reconstructions from geochemical proxies and models suggest that sufficient greenhouse gases were present to maintain surface temperatures between 0–40 °C. If so, temperatures were suitable for life to begin. Evidence for mesophily from biomolecular studies includes Galtier's G+C nucleotide thermometer. G+C pairs are more abundant in thermophiles due to the added stability of an additional hydrogen bond not present between A+T nucleotides. rRNA sequencing of modern lifeforms shows that LUCA's reconstructed G+C content was likely representative of moderate temperatures. The diversity of thermophiles today could be a product of convergent evolution and horizontal gene transfer rather than an inherited trait from LUCA. 
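Galtier's G+C thermometer rests on a simple measurable quantity: the fraction of G and C bases in a sequence. A minimal sketch of that calculation (the two sequences below are invented toy examples, not real rRNA data):

```python
def gc_content(seq: str) -> float:
    """Return the fraction of G and C bases in a nucleotide sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

# Hypothetical illustrative sequences: a G+C-poor one (mesophile-like)
# and a G+C-rich one (thermophile-like: more triple-hydrogen-bonded G:C pairs).
mesophile_like = "AUGCAUAUUGCAAUUAAGCU"
thermophile_like = "GCGCCGGAUGCCGGCGCCGG"

print(f"mesophile-like G+C:   {gc_content(mesophile_like):.2f}")    # 0.30
print(f"thermophile-like G+C: {gc_content(thermophile_like):.2f}")  # 0.90
```

The thermometer then reads reconstructed ancestral G+C fractions like these against the G+C-versus-growth-temperature trend observed in modern organisms.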
The reverse gyrase topoisomerase is found exclusively in thermophiles and hyperthermophiles, as it allows for coiling of DNA. This enzyme requires the complex molecule ATP to function. If an origin of life is hypothesised to involve a simple organism that had not yet evolved a membrane, let alone ATP, this would make the existence of reverse gyrase improbable. Moreover, phylogenetic studies show that reverse gyrase originated in archaea, and transferred to bacteria by horizontal gene transfer, implying it was not present in the LUCA. Cold-start theories presuppose large ice-covered regions. Stellar evolution models predict that the Sun's luminosity was ≈25% weaker than it is today. Feulner states that although this significant decrease in solar energy would have formed an icy planet, there is strong evidence for the presence of liquid water, possibly driven by a greenhouse effect. This would mean an early Earth with both liquid oceans and icy poles. Meltwater from ice sheets and glaciers creates freshwater pools, another niche capable of wet-dry cycles. While surface pools would be exposed to intense UV radiation, bodies of water within and under ice would be shielded, while remaining connected to exposed areas through ice cracks. Impact melting would allow freshwater and meteoritic input, creating prebiotic components. Near-seawater levels of sodium chloride destabilize fatty acid membrane self-assembly, making freshwater settings appealing for early membranous life. Icy environments would trade the faster reaction rates that occur in warm environments for increased stability and accumulation of larger polymers. Experiments simulating Europa-like conditions of ≈−20 °C have synthesised amino acids and adenine, showing that Miller-Urey type syntheses can occur at low temperatures. In an RNA world, the ribozyme would have had even more functions than in a later DNA-RNA-protein-world. 
For RNA to function, it must be able to fold, a process hindered by temperatures above 30 °C. While RNA folding in psychrophilic organisms is slower, so is hydrolysis, so folding is more successful. Shorter nucleotides would not suffer from higher temperatures. An alternative geological environment has been proposed by the geologist Ulrich Schreiber and the physical chemist Christian Mayer: the continental crust. Tectonic fault zones could present a stable and well-protected environment for long-term prebiotic evolution. Inside these systems of cracks and cavities, water and carbon dioxide are the bulk solvents. Their phase state could vary between liquid, gaseous and supercritical, depending on pressure and temperature. When forming two separate phases (e.g. liquid water and supercritical carbon dioxide at depths of little more than 1 km), the system provides optimal conditions for phase transfer reactions. Concurrently, the contents of the tectonic fault zones are supplied by a multitude of inorganic educts (e.g. carbon monoxide, hydrogen, ammonia, hydrogen cyanide, nitrogen, and even phosphate from dissolved apatite) and simple organic molecules formed by hydrothermal chemistry (e.g. amino acids, long-chain amines, fatty acids, long-chain aldehydes). At a depth of around 1000 m, the fault zones provide temperature and pressure conditions near the phase transition point of carbon dioxide between the supercritical and the gaseous state. This allows lipophilic organic molecules, which dissolve well in supercritical CO2 but not in its gaseous state, to accumulate and locally precipitate. Periodic pressure variations, such as those caused by geysers or tidal influences, result in periodic phase transitions, keeping the local reaction environment in a constant non-equilibrium state. 
In the presence of amphiphilic compounds (such as the long chain amines and fatty acids), subsequent generations of vesicles are formed that are constantly selected for their stability. Homochirality Homochirality is the uniformity of materials composed of chiral (non-mirror-symmetric) units. Living organisms use molecules with the same chirality: with almost no exceptions, amino acids are left-handed while nucleotides and sugars are right-handed. Chiral molecules can be synthesized, but in the absence of a chiral source or a chiral catalyst, they are formed in a 50/50 (racemic) mixture of both forms. Non-racemic mixtures can arise from racemic materials by asymmetric physical laws such as the electroweak interaction or asymmetric environments such as circularly polarized light. Once established, chirality would be selected for. A small bias in the population can be amplified by asymmetric autocatalysis, as in the Soai reaction, where a chiral molecule catalyzes its own production.
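The amplification step described above can be illustrated with Frank's classic 1953 toy model of autocatalysis with mutual antagonism, in which each enantiomer catalyses its own production and suppresses the other. This is a generic sketch, not a model of the Soai reaction itself; the rate constant, step size, and starting concentrations are arbitrary illustrative choices:

```python
def simulate(l: float, d: float, k: float = 1.0, dt: float = 0.01,
             steps: int = 2000) -> tuple[float, float]:
    """Euler-integrate dL/dt = k*L*(1-D), dD/dt = k*D*(1-L) (Frank model,
    units scaled so the cross-inhibition constant is 1), renormalising each
    step so the total amount of material stays fixed (chemostat-like)."""
    total = l + d
    for _ in range(steps):
        l_new = l + k * l * (1.0 - d) * dt  # self-catalysis minus antagonism
        d_new = d + k * d * (1.0 - l) * dt
        scale = total / (l_new + d_new)
        l, d = l_new * scale, d_new * scale
    return l, d

# Start with a tiny 0.1% excess of the "L" enantiomer.
l, d = simulate(1.001, 0.999)
ee = (l - d) / (l + d)  # enantiomeric excess; grows toward 1 (homochirality)
print(f"final enantiomeric excess ≈ {ee:.3f}")
```

A perfectly racemic start (l == d) stays racemic forever in this model; any asymmetry, however small, grows exponentially at first and then saturates, which is the qualitative point of the amplification argument.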
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/PlayStation_(console)#cite_note-jf-211] | [TOKENS: 10728] |
PlayStation (console) The PlayStation[a] (codenamed PSX, abbreviated as PS, and retroactively PS1 or PS one) is a home video game console developed and marketed by Sony Computer Entertainment. It was released in Japan on 3 December 1994, followed by North America on 9 September 1995, Europe on 29 September 1995, with other regions following thereafter. As a fifth-generation console, the PlayStation primarily competed with the Nintendo 64 and the Sega Saturn. Sony began developing the PlayStation after a failed venture with Nintendo to create a CD-ROM peripheral for the Super Nintendo Entertainment System in the early 1990s. The console was primarily designed by Ken Kutaragi and Sony Computer Entertainment in Japan, while additional development was outsourced in the United Kingdom. An emphasis on 3D polygon graphics was placed at the forefront of the console's design. PlayStation game production was designed to be streamlined and inclusive, enticing the support of many third party developers. The console proved popular for its extensive game library, popular franchises, low retail price, and aggressive youth marketing which advertised it as the preferable console for adolescents and adults. Critically acclaimed games that defined the console include Gran Turismo, Crash Bandicoot, Spyro the Dragon, Tomb Raider, Resident Evil, Metal Gear Solid, Tekken 3, and Final Fantasy VII. Sony ceased production of the PlayStation on 23 March 2006—over eleven years after it had been released, and in the same year the PlayStation 3 debuted. More than 4,000 PlayStation games were released, with cumulative sales of 962 million units. The PlayStation signaled Sony's rise to power in the video game industry. It received acclaim and sold strongly; in less than a decade, it became the first computer entertainment platform to ship over 100 million units. Its use of compact discs heralded the game industry's transition from cartridges. 
The PlayStation's success led to a line of successors, beginning with the PlayStation 2 in 2000. In the same year, Sony released a smaller and cheaper model, the PS one. History The PlayStation was conceived by Ken Kutaragi, a Sony executive who managed a hardware engineering division and was later dubbed "the Father of the PlayStation". Kutaragi's interest in working with video games stemmed from seeing his daughter play games on Nintendo's Famicom. Kutaragi convinced Nintendo to use his SPC-700 sound processor in the Super Nintendo Entertainment System (SNES) through a demonstration of the processor's capabilities. His willingness to work with Nintendo was derived from both his admiration of the Famicom and conviction in video game consoles becoming the main home-use entertainment systems. Although Kutaragi was nearly fired because he worked with Nintendo without Sony's knowledge, president Norio Ohga recognised the potential in Kutaragi's chip and decided to keep him as a protégé. The inception of the PlayStation dates back to a 1988 joint venture between Nintendo and Sony. Nintendo had produced floppy disk technology to complement cartridges in the form of the Family Computer Disk System, and wanted to continue this complementary storage strategy for the SNES. Since Sony was already contracted to produce the SPC-700 sound processor for the SNES, Nintendo contracted Sony to develop a CD-ROM add-on, tentatively titled the "Play Station" or "SNES-CD". The PlayStation name had already been trademarked by Yamaha, but Nobuyuki Idei liked it so much that he agreed to acquire it for an undisclosed sum rather than search for an alternative. Sony was keen to obtain a foothold in the rapidly expanding video game market. Having been the primary manufacturer of the MSX home computer format, Sony had wanted to use their experience in consumer electronics to produce their own video game hardware. 
Although the initial agreement between Nintendo and Sony was about producing a CD-ROM drive add-on, Sony had also planned to develop a SNES-compatible Sony-branded console. This iteration was intended to be more of a home entertainment system, playing both SNES cartridges and a new CD format named the "Super Disc", which Sony would design. Under the agreement, Sony would retain sole international rights to every Super Disc game, giving them a large degree of control despite Nintendo's leading position in the video game market. Furthermore, Sony would also be the sole benefactor of licensing related to music and film software that it had been aggressively pursuing as a secondary application. The Play Station was to be announced at the 1991 Consumer Electronics Show (CES) in Las Vegas. However, Nintendo president Hiroshi Yamauchi was wary of Sony's increasing leverage at this point and deemed the original 1988 contract unacceptable upon realising it essentially handed Sony control over all games written on the SNES CD-ROM format. Although Nintendo was dominant in the video game market, Sony possessed a superior research and development department. Wanting to protect Nintendo's existing licensing structure, Yamauchi cancelled all plans for the joint Nintendo–Sony SNES CD attachment without telling Sony. He sent Nintendo of America president Minoru Arakawa (his son-in-law) and chairman Howard Lincoln to Amsterdam to form a more favourable contract with Dutch conglomerate Philips, Sony's rival. This contract would give Nintendo total control over their licences on all Philips-produced machines. Kutaragi and Nobuyuki Idei, Sony's director of public relations at the time, learned of Nintendo's actions two days before the CES was due to begin. Kutaragi telephoned numerous contacts, including Philips, to no avail. On the first day of the CES, Sony announced their partnership with Nintendo and their new console, the Play Station. 
At 9 am on the next day, in what has been called "the greatest ever betrayal" in the industry, Howard Lincoln stepped onto the stage and revealed that Nintendo was now allied with Philips and would abandon their work with Sony. Incensed by Nintendo's renouncement, Ohga and Kutaragi decided that Sony would develop their own console. Nintendo's contract-breaking was met with consternation in the Japanese business community, as they had broken an "unwritten law" of native companies not turning against each other in favour of foreign ones. Sony's American branch considered allying with Sega to produce a CD-ROM-based machine called the Sega Multimedia Entertainment System, but the Sega board of directors in Tokyo vetoed the idea when Sega of America CEO Tom Kalinske presented them the proposal. Kalinske recalled them saying: "That's a stupid idea, Sony doesn't know how to make hardware. They don't know how to make software either. Why would we want to do this?" Sony halted their research, but ultimately decided to turn what it had developed with Nintendo and Sega into its own console based on the SNES. Despite the tumultuous events at the 1991 CES, negotiations between Nintendo and Sony were still ongoing. A deal was proposed: the Play Station would still have a port for SNES games, on the condition that it would still use Kutaragi's audio chip and that Nintendo would own the rights and receive the bulk of the profits. Roughly two hundred prototype machines were created, and some software entered development. Many within Sony were still opposed to their involvement in the video game industry, with some resenting Kutaragi for jeopardising the company. Kutaragi remained adamant that Sony not retreat from the growing industry and that a deal with Nintendo would never work. Knowing that they had to take decisive action, Sony severed all ties with Nintendo on 4 May 1992. 
To determine the fate of the PlayStation project, Ohga chaired a meeting in June 1992, attended by Kutaragi and several senior Sony board members. Kutaragi unveiled a proprietary CD-ROM-based system he had been secretly working on, which played games with immersive 3D graphics. Kutaragi was confident that his LSI chip could accommodate one million logic gates, which exceeded the capabilities of Sony's semiconductor division at the time. Although the proposal gained Ohga's enthusiasm, a majority of those present remained opposed, among them older Sony executives who saw Nintendo and Sega as "toy" manufacturers. The opponents felt the game industry was too culturally offbeat and asserted that Sony should remain a central player in the audiovisual industry, where companies were familiar with one another and could conduct "civili[s]ed" business negotiations. After Kutaragi reminded Ohga of the humiliation he had suffered from Nintendo, Ohga retained the project and became one of Kutaragi's staunchest supporters. Ohga shifted Kutaragi and nine of his team from Sony's main headquarters to Sony Music Entertainment Japan (SMEJ), a subsidiary of the main Sony group, both to keep the project alive and to maintain relationships with Philips for the MMCD development project. The involvement of SMEJ proved crucial to the PlayStation's early development, as the process of manufacturing games on CD-ROM was similar to that used for audio CDs, with which Sony's music division had considerable experience. While at SMEJ, Kutaragi worked with Epic/Sony Records founder Shigeo Maruyama and Akira Sato; both later became vice-presidents of the division that ran the PlayStation business. Sony Computer Entertainment (SCE) was jointly established by Sony and SMEJ to handle the company's ventures into the video game industry. On 27 October 1993, Sony publicly announced that it was entering the game console market with the PlayStation.
According to Maruyama, there was uncertainty over whether the console should primarily focus on 2D, sprite-based graphics or 3D polygon graphics. After Sony witnessed the success of Sega's Virtua Fighter (1993) in Japanese arcades, the direction of the PlayStation became "instantly clear" and 3D polygon graphics became the console's primary focus. SCE president Teruhisa Tokunaka expressed gratitude for Sega's timely release of Virtua Fighter, as it proved "just at the right time" that making games with 3D imagery was possible. Maruyama claimed that Sony further wanted to emphasise the new console's ability to play Red Book CD audio in its games alongside high-quality visuals and gameplay. Wishing to distance the project from the failed venture with Nintendo, Sony initially branded the PlayStation the "PlayStation X" (PSX). Sony formed its European and North American divisions, Sony Computer Entertainment Europe (SCEE) and Sony Computer Entertainment America (SCEA), in January and May 1995 respectively. The divisions planned to market the new console under the alternative branding "PSX" following negative feedback on the "PlayStation" name in focus group studies. Early advertising prior to the console's launch in North America referenced PSX, but the term was scrapped before launch. In contrast to Nintendo's consoles, the console was not marketed under the Sony name. According to Phil Harrison, much of Sony's upper management feared that the Sony brand would be tarnished if associated with the console, which they considered a "toy". Since Sony had no experience in game development, it had to rely on the support of third-party game developers. This was in contrast to Sega and Nintendo, which had versatile and well-equipped in-house software divisions for their arcade games and could easily port successful games to their home consoles.
Recent consoles like the Atari Jaguar and 3DO had suffered low sales due to a lack of developer support, prompting Sony to redouble its efforts to gain the endorsement of arcade-savvy developers. A team from Epic/Sony visited more than a hundred companies throughout Japan in May 1993 in hopes of attracting game creators with the PlayStation's technological appeal. Sony found that many disliked Nintendo's practices, such as favouring its own games over others. Through a series of negotiations, Sony acquired initial support from Namco, Konami, and Williams Entertainment, as well as 250 other development teams in Japan alone. Namco in particular was interested in developing for the PlayStation since it rivalled Sega in the arcade market. Securing these companies brought influential games such as Ridge Racer (1993) and Mortal Kombat 3 (1995). Ridge Racer was one of the most popular arcade games at the time, and by December 1993 it had already been confirmed behind closed doors as the PlayStation's first game, despite Namco being a longstanding Nintendo developer. Namco's research managing director Shigeichi Nakamura met with Kutaragi in 1993 to discuss the preliminary PlayStation specifications, with Namco subsequently basing the Namco System 11 arcade board on PlayStation hardware and developing Tekken to compete with Virtua Fighter. The System 11 launched in arcades several months before the PlayStation's release, with the arcade release of Tekken in September 1994. Despite securing the support of various Japanese studios, Sony had no developers of its own by the time the PlayStation was in development. This changed in 1993 when Sony acquired the Liverpudlian company Psygnosis (later renamed SCE Liverpool) for US$48 million, securing its first in-house development team. The acquisition meant that Sony could have more launch games ready for the PlayStation's release in Europe and North America.
Ian Hetherington, Psygnosis' co-founder, was disappointed after receiving early builds of the PlayStation and recalled that the console "was not fit for purpose" until his team got involved with it. Hetherington frequently clashed with Sony executives over broader ideas; at one point it was suggested that a television with a built-in PlayStation be produced. In the months leading up to the PlayStation's launch, Psygnosis had around 500 full-time staff working on games and assisting with software development. The purchase of Psygnosis marked another turning point for the PlayStation, as the studio played a vital role in creating the console's development kits. While Sony had provided MIPS R4000-based Sony NEWS workstations for PlayStation development, Psygnosis employees disliked the thought of developing on these expensive workstations and asked Bristol-based SN Systems to create an alternative PC-based development system. Andy Beveridge and Martin Day, owners of SN Systems, had previously supplied development hardware for other systems such as the Mega Drive, Atari ST, and the SNES. When Psygnosis arranged an audience for SN Systems with Sony's Japanese executives at the January 1994 CES in Las Vegas, Beveridge and Day presented their prototype of a condensed development kit, which could run on an ordinary personal computer with two extension boards. Impressed, Sony decided to abandon its plans for a workstation-based development system in favour of SN Systems', thus securing a cheaper and more efficient method of designing software. An order of over 600 systems followed, and SN Systems supplied Sony with additional software such as an assembler, linker, and debugger. SN Systems produced development kits for future PlayStation systems, including the PlayStation 2, and was acquired by Sony in 2005. Sony strove to make game production as streamlined and inclusive as possible, in contrast to the relatively isolated approach of Sega and Nintendo.
Phil Harrison, representative director of SCEE, believed that Sony's emphasis on developer assistance reduced many of the time-consuming aspects of development. As well as providing programming libraries, SCE headquarters in London, California, and Tokyo housed technical support teams that could work closely with third-party developers if needed. Unlike Nintendo, Sony did not favour its own products over non-Sony ones; Peter Molyneux of Bullfrog Productions admired Sony's open-handed approach to software developers and lauded its decision to use PCs as a development platform, remarking that "[it was] like being released from jail in terms of the freedom you have". Another strategy that helped attract software developers was the PlayStation's use of the CD-ROM format instead of traditional cartridges. Nintendo cartridges were expensive to manufacture, and the company controlled all production, prioritising its own games, whereas inexpensive compact disc manufacturing occurred at dozens of locations around the world. The PlayStation's architecture and interconnectivity with PCs was beneficial to many software developers. The use of the programming language C also proved useful, as it safeguarded the future compatibility of software should Sony decide to make further hardware revisions. Despite this inherent flexibility, some developers found themselves restricted by the console's limited RAM. While working on beta builds of the PlayStation, Molyneux observed that its MIPS processor was not "quite as bullish" as that of a fast PC and said that it took his team two weeks to port their PC code to the PlayStation development kits and another fortnight to achieve a four-fold speed increase. An engineer from Ocean Software, one of Europe's largest game developers at the time, thought that allocating RAM was a challenging aspect of development given the 3.5 megabyte restriction.
Kutaragi said that while it would have been easy to double the amount of RAM in the PlayStation, the development team refrained from doing so to keep the retail cost down. Kutaragi regarded balancing the conflicting goals of high performance, low cost, and ease of programming as the biggest challenge in developing the system, and felt that he and his team were successful in this regard. The console's technical specifications were finalised in 1993 and its design during 1994. The PlayStation name and its final design were confirmed at a press conference on 10 May 1994, although the price and release dates had not yet been disclosed. Sony released the PlayStation in Japan on 3 December 1994, a week after the release of the Sega Saturn, at a price of ¥39,800. The Japanese launch was a "stunning" success, with long queues outside shops. Ohga later recalled that he realised how important the PlayStation had become for Sony when friends and relatives begged him for consoles for their children. The PlayStation sold 100,000 units on the first day and two million units within six months, although the Saturn outsold the PlayStation in the first few weeks due to the success of Virtua Fighter. By the end of 1994, 300,000 PlayStation units had been sold in Japan, compared to 500,000 Saturn units. A grey market emerged for PlayStations shipped from Japan to North America and Europe, with buyers of such consoles paying up to £700. Before the release in North America, Sega and Sony presented their consoles at the first Electronic Entertainment Expo (E3) in Los Angeles on 11 May 1995.
At their keynote presentation, Sega of America CEO Tom Kalinske revealed that the Saturn would be released immediately to select retailers at a price of $399. Next came Sony's turn: Olaf Olafsson, the head of SCEA, summoned head of development Steve Race to the conference stage, who simply said "$299" and walked off to a round of applause. Attention to the Sony conference was further bolstered by the surprise appearance of Michael Jackson and the showcase of highly anticipated games, including Wipeout (1995), Ridge Racer and Tekken (1994). In addition, Sony announced that no games would be bundled with the console. Although the Saturn had been released early in the United States to gain an advantage over the PlayStation, the surprise launch upset many retailers who had not been informed in time, harming sales. Some retailers, such as KB Toys, responded by dropping the Saturn entirely. The PlayStation went on sale in North America on 9 September 1995. It sold more units within two days than the Saturn had in five months, with almost all of the initial shipment of 100,000 units sold in advance and shops across the country running out of consoles and accessories. The well-received Ridge Racer, which some critics considered superior to Sega's arcade counterpart Daytona USA (1994), contributed to the PlayStation's early success, as did Battle Arena Toshinden (1995). There were over 100,000 pre-orders placed and 17 games available on the market by the time of the PlayStation's American launch, compared to the Saturn's six launch games. The PlayStation was released in Europe on 29 September 1995 and in Australia on 15 November 1995. By November it had already outsold the Saturn by three to one in the United Kingdom, where Sony had allocated a £20 million marketing budget for the Christmas season compared to Sega's £4 million.
Sony found early success in the United Kingdom by securing listings with independent shop owners as well as prominent high street chains such as Comet and Argos. Within its first year, the PlayStation secured over 20% of the entire American video game market. From September to the end of 1995, sales in the United States amounted to 800,000 units, giving the PlayStation a commanding lead over the other fifth-generation consoles,[b] though the SNES and Mega Drive from the fourth generation still outsold it. Sony reported an attach rate of four games sold for every console. To meet increasing demand, Sony chartered jumbo jets and ramped up production in Europe and North America. By early 1996, the PlayStation had grossed $2 billion (equivalent to $4.106 billion in 2025) from worldwide hardware and software sales. By late 1996, sales in Europe totalled 2.2 million units, including 700,000 in the UK. Approximately 400 PlayStation games were in development, compared to around 200 for the Saturn and 60 for the Nintendo 64. In India, the PlayStation was test-marketed during 1999–2000 through Sony showrooms, selling 100 units. Sony finally launched the console (in its PS One form) countrywide on 24 January 2002 at a price of Rs 7,990, with 26 games available from the start. The PlayStation also did well in markets where it was never officially released. In Brazil, a third party's registration of the trademark prevented an official release, so the officially distributed Sega Saturn initially dominated the market; as the Saturn withdrew, PlayStation imports and widespread piracy increased. In China, the most popular 32-bit console was the Sega Saturn, but after it left the market the PlayStation grew to a base of 300,000 users by January 2000, although Sony China had no plans to release it.
The PlayStation was backed by a successful marketing campaign, allowing Sony to gain an early foothold in Europe and North America. Initially, PlayStation demographics were skewed towards adults, but the audience broadened after the first price drop. While the Saturn was positioned towards 18- to 34-year-olds, the PlayStation was initially marketed exclusively towards teenagers. Executives from both Sony and Sega reasoned that because younger players typically looked up to older, more experienced players, advertising targeted at teens and adults would draw them in too. Additionally, Sony found that adults reacted best to advertising aimed at teenagers; Lee Clow surmised that people entering adulthood regressed and became "17 again" when they played video games. The console was marketed with advertising slogans in which letters were replaced by the controller's button symbols, stylised as "LIVE IN Y◯UR W◯RLD. PL△Y IN ◯URS." (Live in Your World. Play in Ours.) and "U R NOT E" (with a red E, read as "you are not ready"). The four geometric shapes were derived from the symbols for the four buttons on the controller. Clow thought that by invoking such provocative statements, gamers would respond to the contrary and say "'Bullshit. Let me show you how ready I am.'" As the console's appeal broadened, Sony's marketing efforts expanded from their earlier focus on mature players to specifically target younger children as well. Shortly after the PlayStation's release in Europe, Sony tasked marketing manager Geoff Glendenning with assessing the desires of a new target audience. Sceptical of Nintendo and Sega's reliance on television campaigns, Glendenning theorised that young adults transitioning from fourth-generation consoles would feel neglected by marketing directed at children and teenagers. Recognising the influence early 1990s underground clubbing and rave culture had on young people, especially in the United Kingdom, Glendenning felt that the culture had become mainstream enough to help cultivate the PlayStation's emerging identity.
Sony partnered with prominent nightclub owners such as Ministry of Sound and with festival promoters to organise dedicated PlayStation areas where select games could be demonstrated and played. Sheffield-based graphic design studio The Designers Republic was contracted by Sony to produce promotional materials aimed at a fashionable, club-going audience. Psygnosis' Wipeout in particular became associated with nightclub culture as it was widely featured in venues. By 1997, there were 52 nightclubs in the United Kingdom with dedicated PlayStation rooms. Glendenning recalled that he had discreetly used at least £100,000 a year in slush fund money to invest in impromptu marketing. In 1996, Sony expanded its CD production facilities in the United States due to the high demand for PlayStation games, increasing monthly output from 4 million discs to 6.5 million discs. This was necessary because PlayStation sales were running at twice the rate of Saturn sales, and its lead increased dramatically when both consoles dropped in price to $199 that year. The PlayStation also outsold the Saturn at a similar ratio in Europe during 1996, with 2.2 million consoles sold in the region by the end of the year. Sales figures for PlayStation hardware and software only increased following the launch of the Nintendo 64. Tokunaka speculated that the Nintendo 64 launch had actually helped PlayStation sales by raising public awareness of the gaming market through Nintendo's added marketing efforts. Despite this, the PlayStation took longer to achieve dominance in Japan. Tokunaka said that, even after the PlayStation and Saturn had been on the market for nearly two years, the competition between them was still "very close", with neither console leading in sales for any meaningful length of time. In 1998, Sega, spurred by its declining market share and significant financial losses, launched the Dreamcast in a last-ditch attempt to stay in the industry.
Although its launch was successful, the technically superior 128-bit console was unable to overcome Sony's dominance of the industry. Sony still held 60% of the overall video game market share in North America at the end of 1999. Sega's initial confidence in its new console was undermined when Japanese sales were lower than expected, with disgruntled Japanese consumers reportedly returning their Dreamcasts in exchange for PlayStation software. On 2 March 1999, Sony officially revealed details of the PlayStation 2, which Kutaragi announced would feature a graphics processor designed to push more raw polygons than any console in history, effectively rivalling most supercomputers. The PlayStation continued to sell strongly at the turn of the millennium: in June 2000, Sony released the PS One, a smaller, redesigned variant which went on to outsell all other consoles that year, including the PlayStation 2. In 2005, the PlayStation became the first console to ship 100 million units, with the PlayStation 2 later achieving this milestone faster than its predecessor. The combined successes of both PlayStation consoles led to Sega retiring the Dreamcast in 2001 and abandoning the console business entirely. The PlayStation was eventually discontinued on 23 March 2006, over eleven years after its release and less than a year before the debut of the PlayStation 3.
Hardware
The main microprocessor is an R3000 CPU made by LSI Logic, operating at a clock rate of 33.8688 MHz and delivering 30 MIPS. This 32-bit CPU relies heavily on the "cop2" 3D and matrix-maths coprocessor on the same die to provide the speed necessary to render complex 3D graphics. The role of the separate GPU chip is to draw 2D polygons and apply shading and textures to them: the rasterisation stage of the graphics pipeline. Sony's custom 16-bit sound chip supports ADPCM sources with up to 24 sound channels, and offers a sampling rate of up to 44.1 kHz and music sequencing.
It features 2 MB of main RAM, with an additional 1 MB of video RAM. The PlayStation has a maximum colour depth of 16.7 million true colours, with 32 levels of transparency and unlimited colour look-up tables. The PlayStation can output composite, S-Video or RGB video signals through its AV Multi connector (older models also having RCA connectors for composite), displaying resolutions from 256×224 to 640×480 pixels; different games can use different resolutions. Earlier models also had proprietary parallel and serial ports that could be used to connect accessories or link multiple consoles together; these were later removed due to a lack of use. The PlayStation uses a proprietary video decompression unit, the MDEC, which is integrated into the CPU and allows the presentation of full-motion video at a higher quality than other consoles of its generation. Unusually for the time, the PlayStation lacks a dedicated 2D graphics processor; 2D elements are instead calculated as polygons by the Geometry Transfer Engine (GTE) so that they can be processed and displayed on screen by the GPU. The GPU can render up to 180,000 texture-mapped and light-sourced polygons per second, or 360,000 flat-shaded polygons per second, as well as a total of 4,000 sprites. The PlayStation went through a number of variants during its production run. Externally, the most notable change was the gradual reduction in the number of external connectors on the rear of the unit. This started with the original Japanese launch units; the SCPH-1000, released on 3 December 1994, was the only model with an S-Video port, which was removed from the next model. Subsequent models saw a further reduction in connectors, with the final version retaining only one serial port. Sony marketed a development kit for amateur developers known as the Net Yaroze (meaning "Let's do it together" in Japanese). It was launched in June 1996 in Japan and, following public interest, was released the next year in other countries.
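As a rough illustration of how the video memory and display figures above relate, the following sketch sizes a 16-bit framebuffer at the supported resolutions against the 1 MB of video RAM. The function name and the buffering assumptions are illustrative only, not a description of the console's actual memory map (real games also stored textures and look-up tables in the same VRAM):

```python
# Back-of-the-envelope framebuffer arithmetic for 1 MB of video RAM.
# Assumes 16 bits per pixel; purely illustrative, not an exact memory map.

VRAM_BYTES = 1 * 1024 * 1024  # 1 MB of video RAM
BYTES_PER_PIXEL = 2           # 16-bit colour

def framebuffer_bytes(width, height, buffers=1):
    """Bytes needed for `buffers` screen-sized colour buffers."""
    return width * height * BYTES_PER_PIXEL * buffers

# Lowest supported resolution, double-buffered for smooth animation:
low = framebuffer_bytes(256, 224, buffers=2)   # 229,376 bytes

# Highest supported resolution, single-buffered:
high = framebuffer_bytes(640, 480)             # 614,400 bytes

# Note: double-buffering the highest mode (1,228,800 bytes) would
# exceed the 1 MB of VRAM entirely.
for label, used in (("256x224 double-buffered", low),
                    ("640x480 single-buffered", high)):
    print(f"{label}: {used} bytes ({100 * used / VRAM_BYTES:.0f}% of VRAM)")
```

The arithmetic shows why the low-resolution modes left far more VRAM free for textures than the 640×480 mode did.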
The Net Yaroze allowed hobbyists to create their own games and upload them via an online forum run by Sony. The console was available only through an ordering service and came with the documentation and software necessary to program PlayStation games and applications using a C compiler. On 7 July 2000, Sony released the PS One (stylised as "PS one" or "PSone"), a smaller, redesigned version of the original PlayStation. It was the highest-selling console through the end of the year, outselling all other consoles, including the PlayStation 2. In 2002, Sony released a 5-inch (130 mm) LCD screen add-on for the PS One, referred to as the "Combo pack". It also included a car cigarette-lighter adaptor, adding an extra layer of portability. Production of the LCD "Combo pack" ceased in 2004, when the popularity of the PlayStation began to wane in markets outside Japan. A total of 28.15 million PS One units had been sold by the time it was discontinued in March 2006. Three iterations of the PlayStation's controller were released over the console's lifespan. The first, the PlayStation controller, was released alongside the PlayStation in December 1994. It features four individual directional buttons (as opposed to a conventional D-pad), a pair of shoulder buttons on each side, Start and Select buttons in the centre, and four face buttons consisting of simple geometric shapes: a green triangle, red circle, blue cross, and pink square (△, ◯, ✕, □). Rather than marking its buttons with the traditionally used letters or numbers, the PlayStation controller established a visual trademark which would be incorporated heavily into the PlayStation brand. Teiyu Goto, the designer of the original PlayStation controller, said that the circle and cross represent "yes" and "no", respectively (though this layout is reversed in Western versions); the triangle symbolises a point of view, and the square is equated to a sheet of paper, used to access menus.
The European and North American models of the original PlayStation controller are roughly 10% larger than their Japanese counterpart, to account for the fact that the average person in those regions has larger hands than the average Japanese person. Sony's first analogue gamepad, the PlayStation Analog Joystick (often erroneously referred to as the "Sony Flightstick"), was first released in Japan in April 1996. Featuring two parallel joysticks, it uses potentiometer technology previously used on consoles such as the Vectrex; instead of relying on binary eight-way switches, the controller detects minute angular changes through the entire range of motion. The stick also features a thumb-operated digital hat switch on the right joystick, corresponding to the traditional D-pad and used when simple digital movements were necessary. The Analog Joystick sold poorly in Japan due to its high cost and cumbersome size. The increasing popularity of 3D games prompted Sony to add analogue sticks to its controller design to give users more freedom of movement in virtual 3D environments. The first official analogue controller, the Dual Analog Controller, was revealed to the public in a small glass booth at the 1996 PlayStation Expo in Japan, and released in April 1997 to coincide with the Japanese releases of the analogue-capable games Tobal 2 and Bushido Blade. In addition to the two analogue sticks (which also introduced two new buttons mapped to clicking the sticks in), the Dual Analog Controller features an "Analog" button and LED beneath the "Start" and "Select" buttons which toggles analogue functionality on or off. The controller also features rumble support, though Sony decided to remove this haptic feedback from all overseas iterations before the United States release.
A Sony spokesman stated that the feature was removed for "manufacturing reasons", although rumours circulated that Nintendo had attempted to legally block the release of the controller outside Japan due to similarities with the Nintendo 64 controller's Rumble Pak; a Nintendo spokesman denied that Nintendo took legal action. Next Generation's Chris Charla theorised that Sony dropped vibration feedback to keep the price of the controller down. In November 1997, Sony introduced the DualShock controller, whose name derives from its use of two (dual) vibration motors (shock). Unlike its predecessor, the DualShock features textured rubber grips on its analogue sticks, longer handles, slightly different shoulder buttons, and rumble feedback included as standard on all versions. The DualShock later replaced its predecessors as the default controller. Sony released a series of peripherals to add extra layers of functionality to the PlayStation. These peripherals include memory cards, the PlayStation Mouse, the PlayStation Link Cable, the Multiplayer Adapter (a four-player multitap), the Memory Drive (a disk drive for 3.5-inch floppy disks), the GunCon (a light gun), and the Glasstron (a monoscopic head-mounted display). Released exclusively in Japan, the PocketStation is a memory card peripheral which acts as a miniature personal digital assistant. The device features a monochrome liquid crystal display (LCD), infrared communication capability, a real-time clock, built-in flash memory, and sound capability. Sharing similarities with the Dreamcast's VMU peripheral, the PocketStation was typically distributed with certain PlayStation games, enhancing them with added features. The PocketStation proved popular in Japan, selling over five million units. Sony planned to release the peripheral outside Japan, but the release was cancelled despite receiving promotion in Europe and North America. In addition to playing games, most PlayStation models are equipped to play audio CDs.
The Asian model SCPH-5903 can also play Video CDs. Like most CD players, the PlayStation can play songs in a programmed order, shuffle the playback order of the disc, and repeat one song or the entire disc. Later PlayStation models include a music visualisation function called SoundScope. This function, as well as a memory card manager, is accessed by starting the console without a game inserted or with the CD tray open, thereby bringing up a graphical user interface (GUI) for the PlayStation BIOS. The GUI differs between models and firmware versions: the original PlayStation GUI has a dark blue background with rainbow graffiti used as buttons, while the early PAL PlayStation and PS One GUI has a grey blocked background with two icons in the middle. PlayStation emulation is versatile and can be run on numerous modern devices. Bleem! was a commercial emulator released for IBM-compatible PCs and the Dreamcast in 1999. It was notable for being aggressively marketed during the PlayStation's lifetime, and was the centre of multiple controversial lawsuits filed by Sony. Bleem! was programmed in assembly language, which allowed it to emulate PlayStation games with improved visual fidelity, enhanced resolutions, and filtered textures that were not possible on original hardware. Sony sued Bleem! two days after its release, citing copyright infringement and accusing the company of engaging in unfair competition and patent infringement by allowing the use of PlayStation BIOSes on a Sega console. Bleem! was subsequently forced to shut down in November 2001. Sony was aware that using CDs for game distribution could leave games vulnerable to piracy, due to the growing popularity of CD-R discs and optical drives with burning capability.
To preclude illegal copying, a proprietary process for PlayStation disc manufacturing was developed that, in conjunction with an augmented optical drive in the console, prevented burned copies of games from booting on an unmodified console. Specifically, all genuine PlayStation discs were pressed with a small section of deliberately irregular data, which the PlayStation's optical pick-up was capable of detecting and decoding. Consoles would not boot game discs without a specific wobble frequency contained in the data of the disc's pregap sector (the same system was also used to encode discs' regional lockouts). This signal was within Red Book CD tolerances, so the actual content of PlayStation discs could still be read by a conventional disc drive; however, such a drive could not detect the wobble frequency, and copies it produced therefore omitted it, because the laser pick-up system of any ordinary optical drive interprets the wobble as an oscillation of the disc surface and compensates for it while reading. Early PlayStations, particularly early 1000-series models, experience skipping during full-motion video or physical "ticking" noises from the unit. The problems stem from poorly placed vents leading to overheating in some environments, causing the plastic mouldings inside the console to warp slightly and create knock-on effects with the laser assembly. The solution is to sit the console on a surface which dissipates heat efficiently, in a well-ventilated area, or to raise the unit slightly from its resting surface. Sony representatives also recommended unplugging the PlayStation when not in use, as the system draws a small amount of power (and therefore generates heat) even when turned off. The first batch of PlayStations use a KSM-440AAM laser unit, whose case and movable parts are all built out of plastic. Over time, the plastic lens sled rail wears out, usually unevenly, due to friction.
The placement of the laser unit close to the power supply accelerates wear, due to the additional heat, which makes the plastic more vulnerable to friction. Eventually, one side of the lens sled becomes so worn that the laser can tilt, no longer pointing directly at the CD; after this, games will no longer load due to data read errors. Sony fixed the problem by making the sled out of die-cast metal and placing the laser unit further away from the power supply on later PlayStation models.

Due to an engineering oversight, the PlayStation does not produce a proper signal on several older models of televisions, causing the display to flicker or bounce around the screen. Sony decided not to change the console design, since only a small percentage of PlayStation owners used such televisions, and instead gave consumers the option of sending their PlayStation unit to a Sony service centre to have an official modchip installed, allowing play on older televisions.

Game library

The PlayStation featured a diverse game library which grew to appeal to all types of players. Critically acclaimed PlayStation games included Final Fantasy VII (1997), Crash Bandicoot (1996), Spyro the Dragon (1998) and Metal Gear Solid (1998), all of which became established franchises. Final Fantasy VII is credited with allowing role-playing games to gain mass-market appeal outside Japan, and is considered one of the most influential and greatest video games ever made. The PlayStation's bestselling game is Gran Turismo (1997), which sold 10.85 million units. After the PlayStation's discontinuation in 2006, the cumulative software shipment was 962 million units.

Following its 1994 launch in Japan, early games included Ridge Racer, Crime Crackers, King's Field, Motor Toon Grand Prix, Toh Shin Den (i.e. Battle Arena Toshinden), and Kileak: The Blood. The first two games available at its later North American launch were Jumping Flash! (1995) and Ridge Racer, with Jumping Flash!
heralded as a forerunner of 3D graphics in console gaming. Wipeout, Air Combat, Twisted Metal, Warhawk and Destruction Derby were among the popular first-year games, and the first to be reissued as part of Sony's Greatest Hits or Platinum range. At the time of the PlayStation's first Christmas season, Psygnosis had produced around 70% of its launch catalogue; their breakthrough racing game Wipeout was acclaimed for its techno soundtrack and helped raise awareness of Britain's underground music community. Eidos Interactive's action-adventure game Tomb Raider contributed substantially to the success of the console in 1996, with its main protagonist Lara Croft becoming an early gaming icon and garnering unprecedented media promotion. Licensed tie-in video games of popular films were also prevalent; Argonaut Games' 2001 adaptation of Harry Potter and the Philosopher's Stone went on to sell over eight million copies late in the console's lifespan.

Third-party developers remained largely committed to the console's wide-ranging game catalogue even after the launch of the PlayStation 2; notable exclusives in this era include Harry Potter and the Philosopher's Stone, Fear Effect 2: Retro Helix, Syphon Filter 3, C-12: Final Resistance, Dance Dance Revolution Konamix and Digimon World 3.[c] Sony assisted with game reprints as late as 2008 with Metal Gear Solid: The Essential Collection, the last PlayStation game officially released and licensed by Sony.

Initially, in the United States, PlayStation games were packaged in long cardboard boxes, similar to non-Japanese 3DO and Saturn games. Sony later switched to the jewel case format typically used for audio CDs and Japanese video games, as this format took up less retailer shelf space (which was at a premium due to the large number of PlayStation games being released), and focus testing showed that most consumers preferred this format.

Reception

The PlayStation was mostly well received upon release.
Critics in the West generally welcomed the new console; the staff of Next Generation reviewed the PlayStation a few weeks after its North American launch, commenting that, while the CPU is "fairly average", the supplementary custom hardware, such as the GPU and sound processor, is stunningly powerful. They praised the PlayStation's focus on 3D, and complimented the comfort of its controller and the convenience of its memory cards. Giving the system 4½ out of 5 stars, they concluded, "To succeed in this extremely cut-throat market, you need a combination of great hardware, great games, and great marketing. Whether by skill, luck, or just deep pockets, Sony has scored three out of three in the first salvo of this war." Albert Kim from Entertainment Weekly praised the PlayStation as a technological marvel rivalling the hardware of Sega and Nintendo. Famicom Tsūshin scored the console a 19 out of 40 in May 1995, lower than the Saturn's 24 out of 40.

In a 1997 year-end review, a team of five Electronic Gaming Monthly editors gave the PlayStation scores of 9.5, 8.5, 9.0, 9.0, and 9.5—for all five editors, the highest score they gave to any of the five consoles reviewed in the issue. They lauded the breadth and quality of the games library, saying it had vastly improved over previous years as developers mastered the system's capabilities and Sony revised its stance on 2D and role-playing games. They also complimented the low price point of the games compared to the Nintendo 64's, and noted that it was the only console on the market that could be relied upon to deliver a solid stream of games for the coming year, primarily because third-party developers almost unanimously favoured it over its competitors.

Legacy

SCE was an upstart in the video game industry in late 1994, as the video game market in the early 1990s was dominated by Nintendo and Sega.
Nintendo had been the clear leader in the industry since the introduction of the Nintendo Entertainment System in 1985, and the Nintendo 64 was initially expected to maintain this position. The PlayStation's target audience included the generation which was the first to grow up with mainstream video games, along with 18- to 29-year-olds who were not the primary focus of Nintendo. By the late 1990s, Sony had become a highly regarded console brand due to the PlayStation, with a significant lead over second-place Nintendo, while Sega was relegated to a distant third.

The PlayStation became the first "computer entertainment platform" to ship over 100 million units worldwide, with many critics attributing the console's success to third-party developers. It remains the sixth best-selling console of all time as of 2025, with a total of 102.49 million units sold. Around 7,900 individual games were published for the console during its 11-year life span, the second-most games ever produced for a console. Its success was a significant financial boon for Sony, whose video game division came to contribute 23% of the company's operating profits.

Sony's next-generation PlayStation 2, which is backward compatible with the PlayStation's DualShock controller and games, was announced in 1999 and launched in 2000. The PlayStation's lead in installed base and developer support paved the way for the success of its successor, which overcame the earlier launch of Sega's Dreamcast and then fended off competition from Microsoft's newcomer Xbox and Nintendo's GameCube. The PlayStation 2's immense success and the failure of the Dreamcast were among the main factors which led to Sega abandoning the console market. To date, five PlayStation home consoles have been released, which have continued the same numbering scheme, as well as two portable systems. The PlayStation 3 also maintained backward compatibility with original PlayStation discs.
Hundreds of PlayStation games have been digitally re-released on the PlayStation Portable, PlayStation 3, PlayStation Vita, PlayStation 4, and PlayStation 5. The PlayStation has often ranked among the best video game consoles. In 2018, Retro Gamer named it the third best console, crediting its sophisticated 3D capabilities as one of the key factors in its mass success, and lauding it as a "game-changer in every sense possible". In 2009, IGN ranked the PlayStation the seventh best console in its list, noting its appeal to older audiences as a crucial factor in propelling the video game industry, as well as its role in transitioning the game industry to the CD-ROM format. Keith Stuart from The Guardian likewise named it the seventh best console in 2020, declaring that its success was so profound it "ruled the 1990s".

In January 2025, Lorentio Brodesco announced the nsOne project, an attempt to reverse engineer the PlayStation's motherboard. Brodesco stated that "detailed documentation on the original motherboard was either incomplete or entirely unavailable". The project was successfully crowdfunded via Kickstarter. In June, Brodesco manufactured the first working motherboard, promising a fully routed version with multilayer routing, as well as documentation and design files, in the near future.

The success of the PlayStation contributed to the demise of cartridge-based home consoles. While not the first system to use an optical disc format, it was the first highly successful one, and it ended up going head-to-head with the Nintendo 64 and its proprietary cartridges,[d] although the industry had expected Nintendo to use CDs like the PlayStation. After the demise of the Sega Saturn, Nintendo was left as Sony's main competitor in Western markets.
Nintendo chose not to use CDs for the Nintendo 64, likely valuing the proprietary cartridge format's ability to help enforce copy protection, given the company's substantial reliance on licensing and exclusive games for its revenue. Besides their larger capacity, CD-ROMs could be produced in bulk at a much faster rate than ROM cartridges: a week, compared to two to three months. Further, the cost of production per unit was far cheaper, allowing Sony to offer games at about 40% lower cost to the consumer than ROM cartridges while still making the same net revenue. In Japan, Sony published fewer copies of a wide variety of games for the PlayStation as a risk-limiting step, a model that had been used by Sony Music for CD audio discs. The production flexibility of CD-ROMs meant that Sony could quickly produce larger volumes of popular games to get them onto the market, something that could not be done with cartridges due to their manufacturing lead time. The lower production costs of CD-ROMs also gave publishers an additional source of profit: budget-priced reissues of games which had already recouped their development costs. Tokunaka remarked in 1996:

Choosing CD-ROM is one of the most important decisions that we made. As I'm sure you understand, PlayStation could just as easily have worked with masked ROM [cartridges]. The 3D engine and everything—the whole PlayStation format—is independent of the media. But for various reasons (including the economies for the consumer, the ease of the manufacturing, inventory control for the trade, and also the software publishers) we deduced that CD-ROM would be the best media for PlayStation.

The increasing complexity of developing games pushed cartridges to their storage limits and gradually discouraged some third-party developers. Part of the CD format's appeal to publishers was that discs could be produced at a significantly lower cost and offered more production flexibility to meet demand.
As a result, some third-party developers switched to the PlayStation, including Square and Enix, whose Final Fantasy VII and Dragon Quest VII respectively had been planned for the Nintendo 64 (the two companies later merged to form Square Enix). Other developers released fewer games for the Nintendo 64: Konami, for example, released only thirteen N64 games but over fifty on the PlayStation. Nintendo 64 game releases were less frequent than the PlayStation's, with many being developed either by Nintendo itself or by second parties such as Rare.

The PlayStation Classic is a dedicated video game console made by Sony Interactive Entertainment that emulates PlayStation games. It was announced in September 2018 at the Tokyo Game Show, and released on 3 December 2018, the 24th anniversary of the release of the original console. As a dedicated console, the PlayStation Classic features 20 pre-installed games, which run on the open-source emulator PCSX. The console is bundled with two replica wired PlayStation controllers (those without analogue sticks), an HDMI cable, and a USB Type-A cable. Internally, the console uses a MediaTek MT8167a Quad A35 system on a chip with four central processing cores clocked at 1.5 GHz and a PowerVR GE8300 graphics processing unit. It includes 16 GB of eMMC flash storage and 1 GB of DDR3 SDRAM. The PlayStation Classic is 45% smaller than the original console.

The PlayStation Classic received negative reviews from critics and was compared unfavourably to Nintendo's rival Nintendo Entertainment System Classic Edition and Super Nintendo Entertainment System Classic Edition. Criticism was directed at its meagre game library, user interface, emulation quality, use of PAL versions for certain games, use of the original controller, and high retail price, though the console's design received praise. The console sold poorly.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Image_scanner]
Image scanner

An image scanner (often abbreviated to just scanner) is a device that optically scans images, printed text, handwriting, or an object and converts it to a digital image. The most common type of scanner used in the home and the office is the flatbed scanner, where the document is placed on a glass bed. A sheetfed scanner, which moves the page across an image sensor using a series of rollers, may be used to scan one page of a document at a time or multiple pages, as in an automatic document feeder. A handheld scanner is a portable version of an image scanner that can be used on any flat surface. Scans are typically downloaded to the computer that the scanner is connected to, although some scanners are able to store scans on standalone flash media (e.g., memory cards and USB drives). Modern scanners typically use a charge-coupled device (CCD) or a contact image sensor (CIS) as the image sensor, whereas drum scanners, developed earlier and still used for the highest possible image quality, use a photomultiplier tube (PMT) as the image sensor. Document cameras, which use commodity or specialized high-resolution cameras, photograph documents all at once.

History

Image scanners are considered the successors of early facsimile (fax) and wirephoto machines. Unlike scanners, these devices were used to transmit images over long distances rather than for processing and storing images locally. The earliest attempt at a fax machine was patented in 1843 by the Scottish clockmaker Alexander Bain but never put into production. In his design, a metal stylus linked to a pendulum scans across a copper plate with a raised image. When the stylus makes contact with a raised part of the plate, it sends a pulse across a pair of wires to a receiver containing an electrode linked to another pendulum.
A piece of paper impregnated with an electrochemically sensitive solution resides underneath the electrode and changes color whenever a pulse reaches the electrode. A gear advances the copper plate and paper in tandem with each swing of the pendulum; over time, the result is a perfect reproduction of the copper plate. In Bain's system, it is critical that the pendulums of the transceiver and receiver are in perfect step, or else the reproduced image will be distorted.

In 1847, the English physicist Frederick Bakewell developed the first working fax machine. Bakewell's machine was similar to Bain's but used a revolving drum coated in tinfoil, with non-conductive ink painted on the foil and a stylus that scans across the drum and sends a pulse down a pair of wires when it contacts a conductive point on the foil. The receiver contains an electrode that touches a sheet of chemically treated paper, which changes color when the electrode receives a pulse; the result is a reverse contrast (white-on-blue) reproduction of the original image. Bakewell's fax machine was marginally more successful than Bain's but suffered from the same synchronization issues.

In 1862, Giovanni Caselli solved this with the pantelegraph, the first fax machine put into regular service. Largely based on Bain's design, it ensured complete synchronization by flanking the pendulums of both the transceiver and receiver between two magnetic regulators, which become magnetized with each swing of the pendulum and become demagnetized when the pendulum reaches the maxima and minima of each oscillation.

In 1893, the American engineer Elisha Gray introduced the telautograph, the first widely commercially successful fax machine. It used linkage bars to translate x- and y-axis motion at the receiver, scanning a pen across the paper and striking it only when actuated by the stylus moving across the transceiver drum. Because it could use commodity stationery paper, it became popular in business and hospitals.
In 1902, the German engineer Arthur Korn introduced the phototelautograph, a fax machine that used a light-sensitive selenium cell to scan a paper to be copied, instead of relying on a metallic drum and stylus. It was even more commercially successful than Gray's machine and became the basis for wirephoto (also known as telephotography) machines used by newspapers around the world from the early 1900s onward.

Before the advent of digital image processing in the middle of the 20th century, the term scanner originally referred to analog equipment used within offset printing presses. These analog scanners varied in design depending on their purpose: some scanned images stored as color transparency film onto color separation plates that could be used to print the original image en masse, while others were used to convert simple cyan, magenta, and yellow (CMY) plates into cyan, magenta, yellow, and black (CMYK) in order to produce prints with darker, richer colors—a process known then in the trade as color correction (unrelated to the modern, cinematographic sense). Converting from CMY to CMYK used to be a highly manual affair involving techniques such as masking; analog scanners automated this process to a large extent.

Alexander Murray and Richard Morse invented and patented the first analog color scanner at Eastman Kodak in 1937. Their machine was of a drum scanner design that imaged a color transparency mounted in the drum, with a light source placed underneath the film and three photocells with red, green, and blue color filters reading each spot on the transparency to translate the image into three electronic signals. In Murray and Morse's initial design, the drum was connected to three lathes that etched CMY halftone dots onto three offset cylinders directly. The rights to the patent were sold to Printing Developments Incorporated (P.D.I.)
in 1946, who improved on the design by using a photomultiplier tube to image the points on the negative, which produced an amplified signal that was then fed to a single-purpose computer that processed the RGB signals into color-corrected CMYK values. The processed signals are then sent to four lathes that etch CMYK halftone dots onto the offset cylinders.

In 1948, Arthur Hardy of the Interchemical Corporation and F. L. Wurzburg of the Massachusetts Institute of Technology invented the first analog color flatbed image scanner, intended for producing color-corrected lithographic plates from a color negative. In this system, three color-separated plates (of CMY values) are prepared from a color negative via dot etching and placed in the scanner bed. Above each plate are rigidly fixed, equidistant light beam projectors that focus a beam of light onto one corner of the plate. The entire bed with all three plates moves horizontally, back and forth, to reach the opposite corners of the plate; with each horizontal oscillation of the bed, the bed moves down one step to cover the entire vertical area of the plate. While this is happening, the beam of light focused on a given spot on the plate gets reflected and bounced off to a photocell adjacent to the projector. Each photocell connects to an analog image processor, which evaluates the reflectance of the combined CMY values using Neugebauer equations and outputs a signal to a light projector hovering over a fourth, unexposed lithographic plate. This plate receives a color-corrected, continuous-tone dot-etch of either the cyan, magenta, or yellow values. The fourth plate is replaced with another unexposed plate, and the process repeats until three color-corrected plates, of cyan, magenta and yellow, are produced.

In the 1950s, the Radio Corporation of America (RCA) took Hardy and Wurzburg's patent and replaced the projector-and-photocell arrangement with a video camera tube focusing on one spot of the plate.
The first digital imaging system was the Bartlane system, introduced in 1920. Named after the pair who invented it, Harry G. Bartholomew and Maynard D. McFarlane, the Bartlane system used zinc plates etched with an image from a film negative projected at five different exposure levels to correspond to five quantization levels. All five plates are affixed to a long, motor-driven rotating cylinder, with five equidistant contacts scanning over each plate at the same starting position. The Bartlane system was initially used exclusively by telegraph, with the five-bit Baudot code used to transmit the grayscale digital image. In 1921, the system was modified for offline use, with a five-bit paper tape punch punching holes depending on whether its connections to the contacts are bridged or not. The result was a stored digital image with five gray levels. Reproduction of the image was achieved with a lamp passing over the punched holes, exposing five different intensities of light onto a film negative.

The first scanner to store its images digitally onto a computer was a drum scanner built in 1957 at the National Bureau of Standards (NBS, later NIST) by a team led by Russell A. Kirsch. It used a photomultiplier tube to detect light at a given point and produced an amplified signal that a computer could read and store into memory. The computer of choice at the time was the SEAC mainframe; the maximum horizontal resolution that the SEAC was capable of processing was 176 pixels. The first image ever scanned on this machine was a photograph of Kirsch's three-month-old son, Walden.

In 1969, Dacom introduced the 111 fax machine, the first digital fax machine to employ data compression using an on-board computer. It employed a flatbed design with a continuous feed capable of scanning documents up to letter size in 1-bit monochrome (black and white). The first flatbed scanner used for digital image processing was the Autokon line introduced by ECRM Inc. in 1975.
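At its core, the Bartlane pipeline is uniform quantization: each sample of the image is mapped to one of five discrete gray levels, matching the five exposure plates. The Python sketch below shows only that mapping; it ignores the actual Baudot-code transmission format, and the function name and level count parameterization are illustrative.

```python
# Sketch of the Bartlane idea: quantize grayscale samples to five levels,
# one per exposure plate, so each pixel needs only a small fixed code.
def quantize_to_levels(pixels, levels=5, max_value=255):
    """Map each 0..max_value pixel value to one of `levels` discrete levels."""
    step = (max_value + 1) / levels          # width of each quantization bin
    return [min(int(p // step), levels - 1) for p in pixels]

row = [0, 40, 100, 160, 220, 255]
print(quantize_to_levels(row))  # [0, 0, 1, 3, 4, 4]
```

Five levels fit comfortably in the five-bit codes the system transmitted; a finer grayscale would have required either more plates or a different code.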
The inaugural Autokon 8400 used a laser beam to scan pages up to 11 by 14 inches at a maximum resolution of 1000 lines per inch. Although it was only capable of scanning in 1-bit monochrome, the on-board processor was capable of halftoning, unsharp masking, contrast adjustment, and anamorphic distortions, among other features. The Autokon 8400 could either be connected to a film recorder to create a negative for producing plates, or connected to a mainframe or minicomputer for further image processing and digital storage. The Autokon series was expanded over the following two decades and enjoyed widespread use in newspapers and prepress.

In 1977, Raymond Kurzweil's start-up, Kurzweil Computer Products, released the Kurzweil Reading Machine, the first flatbed scanner with a charge-coupled device (CCD) imaging element. The Kurzweil Reading Machine was invented to assist blind people in reading books that had not been translated to braille. It comprised the image scanner and a Data General Nova minicomputer—the latter performing the image processing, optical character recognition (OCR), and speech synthesis.

The first scanners for personal computers appeared in the mid-1980s, starting with ThunderScan for the Macintosh in December 1984. Designed by Andy Hertzfeld and released by Thunderware Inc., the ThunderScan contains a specialized image sensor built into a plastic housing the same shape as the ink ribbon cartridge of Apple's ImageWriter printer. The ThunderScan slots into the ImageWriter's ribbon carrier and connects to both the ImageWriter and the Macintosh simultaneously. The ImageWriter's carriage, controlled by the ThunderScan, moves left-to-right to scan one 200-dpi (dots per inch) line at a time, with the carriage return serving to advance the scanner down the print to be scanned. The ThunderScan was the Macintosh's first scanner and sold well, but it operated very slowly and could only scan prints in 1-bit monochrome.
In 1999, Canon iterated on this idea with the IS-22, a cartridge that fit into their inkjet printers to convert them into sheetfed scanners.

In early 1985, Datacopy released the first flatbed scanner for the IBM PC, the Datacopy Model 700. Based on a CCD imaging element, the Model 700 was capable of scanning letter-sized documents at a maximum resolution of 200 dpi in 1-bit monochrome. The Model 700 came with a special interface card for connecting to the PC; an optional, aftermarket OCR software card and software package were also sold for it.

In April 1985, LaserFAX Inc. introduced a CCD-based color flatbed scanner, the SpectraSCAN 200 (later rebranded the SpectraFAX 200), for the IBM PC. The SpectraSCAN 200 worked by placing color filters over the CCD and taking four passes (one for each of the three primary colors and one for black) per scan to build up a color reproduction. The SpectraSCAN 200 took between two and three minutes to produce a scan of a letter-sized print at 200 dpi; its grayscale counterpart, the DS-200, took only 30 seconds to make a scan at the same size and resolution. The SpectraSCAN was the first flatbed scanner capable of scanning in color.

The first relatively affordable flatbed scanner for personal computers appeared in February 1987 with Hewlett-Packard's ScanJet, which was capable of scanning 4-bit (16-shade) grayscale images at a maximum resolution of 300 dpi. By the beginning of 1988, the ScanJet had accounted for 27 percent of all scanner sales in terms of dollar volume, per Gartner Dataquest. In February 1989, the company introduced the ScanJet Plus, which increased the bit depth to 8 bits (256 shades) while costing only US$200 more than the original ScanJet's $1990 (equivalent to $5,169 in 2025). This led to a massive price drop in grayscale scanners with equivalent or lesser features in the market.
The number of third-party developers producing software and hardware supporting these scanners jumped dramatically in turn, effectively popularizing the scanner for the personal computer user. By 1999, the cost of the average color-capable scanner had dropped to $300 (equivalent to $580 in 2025). That year, Computer Shopper declared 1999 "the year that scanners finally became a mainstream commodity".

Types

A flatbed scanner is a type of scanner that provides a glass bed (platen) on which the object to be scanned lies motionless. The scanning element moves vertically from under the glass, scanning either the entirety of the platen or a predetermined portion. The driver software for most flatbed scanners allows users to prescan their documents—in essence, to take a quick, low-resolution pass at a document in order to judge what area of the document should be scanned (if not the entirety of it), before scanning it at a higher resolution. Some flatbed scanners incorporate sheet-feeding mechanisms called automatic document feeders (ADFs) that use the same scanning element as the flatbed portion. This type of scanner is sometimes called a reflective scanner, because it works by shining white light onto the object to be scanned and reading the intensity and color of light that is reflected from it, usually a line at a time. Flatbed scanners are designed for scanning prints or other flat, opaque materials; some have transparency adapters available, which, for a number of reasons, are in most cases not well suited to scanning film.

A sheetfed scanner, also known as a document feeder, is a type of scanner that uses motor-driven rollers to move a single sheet of paper at a time past a stationary scanning element (two scanning elements, in the case of scanners with duplex functionality). Unlike flatbed scanners, sheetfed scanners are not equipped to scan bound material such as books or magazines, nor are they suitable for any material thicker than plain printer paper.
Some sheetfed scanners, called automatic document feeders (ADFs), are capable of scanning several sheets in one session, although others only accept one page at a time. Some sheetfed scanners are portable, powered by batteries, and have their own storage, eventually transferring stored scans to a computer.

A handheld scanner is a type of scanner that must be manually dragged or glided by hand across the surface of the object to be scanned. Scanning documents in this manner requires a steady hand, as an uneven scanning rate produces distorted images. Some handheld scanners have an indicator light on the scanner for this purpose, actuating if the user is moving the scanner too fast. They typically have at least one button that starts the scan when pressed; it is held by the user for the duration of the scan. Some other handheld scanners have switches to set the optical resolution, as well as a roller, which generates a clock pulse for synchronization with the computer. Older hand scanners were monochrome, and produced light from an array of green LEDs to illuminate the image; later ones scan in monochrome or color, as desired. A hand scanner may also have a small window through which the document being scanned could be viewed. As hand scanners are much narrower than most normal document or book sizes, software (or the end user) needed to combine several narrow "strips" of scanned documents to produce the finished article. Inexpensive, portable, battery-powered or USB-powered wand scanners and pen scanners, typically capable of scanning an area as wide as a normal letter and much longer, remain available as of 2024. Some computer mice can also scan documents.

A drum scanner is a type of scanner that uses a clear, motor-driven rotating cylinder (drum) onto which a print, a film negative, a transparency, or any other flat object is taped or otherwise secured.
A beam of light either projects past, or reflects off, the material to be scanned onto a series of mirrors, which focus the beam onto the drum scanner's photomultiplier tube (PMT). After one revolution, the beam of light moves down a single step. When scanning transparent media, such as negatives, a light beam is directed from within the cylinder onto the media; when scanning opaque items, a light beam from above is reflected off the surface of the media. When only one PMT is present, three passes of the image are required for a full-color RGB scan. When three PMTs are present, only a single pass is required. The photomultiplier tubes of drum scanners offer superior dynamic range to that of CCD sensors. For this reason, drum scanners can extract more detail from very dark shadow areas of a transparency than flatbed scanners using CCD sensors. The smaller dynamic range of CCD sensors can lead to loss of shadow detail, especially when scanning very dense transparency film. Drum scanners are also able to resolve true detail in excess of 10,000 dpi, producing higher-resolution scans than any CCD scanner.

An overhead scanner is a type of scanner that places the scanning element in a housing on top of a vertical post, hovering above the document or object to be scanned, which lies stationary on an open-air bed. Chinon Industries patented a specific type of overhead scanner, which uses a rotating mirror to reflect the contents of the bed onto a linear CCD, in 1987. Although very flexible—allowing users to scan not only two-dimensional prints and documents but any 3D object of any size—the Chinon design required the user to provide uniform illumination of the object to be scanned and was more cumbersome to set up. A more modern type of overhead scanner is a document camera (also known as a video scanner), which uses a digital camera to capture a document all at once.
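The single-PMT scheme described above builds a full-colour scan from three filtered passes, one per primary colour, which are then recombined. The short Python sketch below illustrates only that recombination; the pixel representation and function names are illustrative assumptions, not a model of real drum-scanner hardware.

```python
# Illustrative sketch (not drum-scanner firmware): with a single PMT,
# a full-colour scan is assembled from three successive filtered passes.
def scan_pass(image, channel):
    """Simulate one pass: read only the given channel of each pixel.
    `image` is a list of (r, g, b) tuples."""
    index = {"red": 0, "green": 1, "blue": 2}[channel]
    return [pixel[index] for pixel in image]

def three_pass_scan(image):
    """Recombine three monochrome passes into full-colour pixels, as a
    one-PMT drum scanner does over three sets of drum rotations."""
    r = scan_pass(image, "red")
    g = scan_pass(image, "green")
    b = scan_pass(image, "blue")
    return list(zip(r, g, b))

original = [(255, 0, 0), (10, 200, 30)]
print(three_pass_scan(original) == original)  # True: passes recombine losslessly
```

With three PMTs, all three channel reads happen simultaneously, which is why a three-PMT scanner needs only a single pass.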
Most document cameras output live video of the document and are usually reserved for displaying documents to a live audience, but they may also be used as replacements for image scanners, capturing a single frame of the output as an image file. Document cameras may even use the same APIs as scanners when connected to computers. A planetary scanner is a type of very-high-resolution document camera used for capturing certain fragile documents. A book scanner is another kind of document camera, pairing a digital camera with a scanning area defined by a mat to assist in scanning books. Some more advanced models of book scanners project a laser onto the page for calibration and software skew correction.

A film scanner, also known as a slide scanner or a transparency scanner, is a specialized type of scanner for scanning film negatives and slides. A typical film scanner works by passing a narrowly focused beam of light through the film and reading the intensity and color of the light that emerges. The lowest-cost dedicated film scanners sell for less than $50 and may be sufficient for modest needs; prices climb with quality and advanced features, reaching five figures for high-end models.

Image scanners are usually used in conjunction with a computer which controls the scanner and stores scans. Small portable scanners, either sheetfed or handheld and operated by batteries and with storage capability, are available for use away from a computer; stored scans can be transferred later. Many can scan both small documents such as business cards and till receipts, as well as letter-sized documents.
The higher-resolution cameras fitted to some smartphones can produce reasonable quality document scans by taking a photograph with the phone's camera and post-processing it with a scanning app, a range of which are available for most phone operating systems, to whiten the background of a page, correct perspective distortion so that the shape of a rectangular document is corrected, convert to black-and-white, etc. Many such apps can scan multiple-page documents with successive camera exposures and output them either as a single file or multiple-page files. Some smartphone scanning apps can save documents directly to online storage locations, such as Dropbox and Evernote, send via email, or fax documents via email-to-fax gateways. Smartphone scanner apps can be broadly divided into three categories:

Scanning elements

Scanners equipped with charge-coupled device (CCD) scanning elements require a sophisticated series of mirrors and lenses to reproduce an image, but the result of this complexity is a much higher-quality scan. Because CCDs have a much greater depth of field, they are more forgiving when it comes to scanning documents that are difficult to get perfectly flat against the platen (such as bound books). Scanners equipped with contact image sensor (CIS) scanning elements are designed to be in near-direct contact with the document to be scanned and thus do not require the complex optics of CCD scanners. However, their depth of field is much worse, resulting in blurry scans if the scanned document is not perfectly flush against the platen. Because the sensors require far less power than CCD scanners, CIS scanners can be manufactured at low cost and are typically much lighter in weight and depth than CCD scanners. Scanners equipped with photomultiplier tubes (PMT) are nearly exclusively drum scanners.

Scan quality

Color scanners typically read RGB (red-green-blue) color data from the array.
This data is then processed with some proprietary algorithm to correct for different exposure conditions, and sent to the computer via the device's input/output interface (usually USB; older units used SCSI or a bidirectional parallel port). Color depth varies depending on the scanning array characteristics, but is usually at least 24 bits. High-quality models have 36–48 bits of color depth. Another qualifying parameter for a scanner is its resolution, measured in pixels per inch (ppi), sometimes more accurately referred to as samples per inch (spi). Instead of using the scanner's true optical resolution, the only meaningful parameter, manufacturers like to refer to the interpolated resolution, which is much higher thanks to software interpolation. As of 2009, a high-end flatbed scanner can scan up to 5400 ppi and drum scanners have an optical resolution of between 3000 and 24000 ppi. Effective resolution refers to the true resolution of a scanner, and is determined by using a resolution test chart. The effective resolution of almost all consumer flatbed scanners is considerably lower than the manufacturers' stated optical resolution. Manufacturers often claim interpolated resolutions as high as 19200 ppi; but such numbers carry little meaningful value because the number of possible interpolated pixels is unlimited, and doing so does not increase the level of captured detail. The size of the file created increases with the square of the resolution; doubling the resolution quadruples the file size. A resolution must be chosen that is within the capabilities of the equipment, preserves sufficient detail, and does not produce a file of excessive size. The file size can be reduced for a given resolution by using "lossy" compression methods such as JPEG, at some cost in quality.
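The quadratic growth of file size with resolution can be checked with a short calculation (a minimal sketch; the page dimensions and bit depth chosen here are illustrative):

```python
def scan_size_bytes(width_in, height_in, ppi, bits_per_pixel=24):
    """Uncompressed raster size: pixel count times bytes per pixel."""
    pixels = (width_in * ppi) * (height_in * ppi)
    return pixels * bits_per_pixel / 8

# Doubling the resolution quadruples the pixel count, hence the file size.
letter_300 = scan_size_bytes(8.5, 11.0, 300)   # ~25 MB for a letter page
letter_600 = scan_size_bytes(8.5, 11.0, 600)   # ~101 MB
print(letter_600 / letter_300)  # 4.0
```

The same arithmetic explains why a modest increase in scanning resolution can overwhelm storage and transfer budgets long before it adds visible detail.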
If the best possible quality is required, lossless compression should be used; reduced-quality files of smaller size can be produced from such an image when required (e.g., image designed to be printed on a full page, and a much smaller file to be displayed as part of a fast-loading web page). Purity can be diminished by scanner noise, optical flare, poor analog to digital conversion, scratches, dust, Newton's rings, out-of-focus sensors, improper scanner operation, and poor software. Drum scanners are said to produce the purest digital representations of the film, followed by high-end film scanners that use the larger Kodak Tri-Linear sensors. The third important parameter for a scanner is its dynamic range (also known as density range). A high density range means that the scanner is able to record shadow details and brightness details in one scan. Density of film is measured on a base 10 log scale and varies between 0.0 (transparent) and 5.0, about 16 stops. Density range is the portion of the 0 to 5 scale that a film occupies, and Dmin and Dmax denote the least dense and most dense measurements on a negative or positive film. The density range of negative film is up to 3.6d, while slide film dynamic range is 2.4d. Color negative density range after processing is 2.0d thanks to the compression of the 12 stops into a small density range. Dmax will be densest on slide film for shadows, and densest on negative film for highlights. Some slide films can have a Dmax close to 4.0d with proper exposure, and so can black-and-white negative film. Consumer-level flatbed photo scanners have a dynamic range in the 2.0–3.0 range, which can be inadequate for scanning all types of photographic film, as Dmax can be and often is between 3.0d and 4.0d with traditional black-and-white film. Color film compresses its 12 stops of a possible 16 stops (film latitude) into just 2.0d of space via the process of dye coupling and removal of all silver from the emulsion.
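Since density is a base-10 logarithm, one photographic stop (a doubling of light) corresponds to log10(2), roughly 0.3 density units. The figures above can be cross-checked with a small conversion helper:

```python
import math

STOP = math.log10(2)  # one stop is ~0.301 density units

def density_to_stops(d):
    """Convert a film density value to an equivalent number of stops."""
    return d / STOP

# The full 0.0-5.0 density scale spans roughly 16.6 stops ("about 16 stops"),
# and a 3.6d negative-film density range corresponds to roughly 12 stops.
print(round(density_to_stops(5.0), 1))  # 16.6
print(round(density_to_stops(3.6), 1))  # 12.0
```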
Kodak Vision 3 has 18 stops. So, color-negative film scans the easiest of all film types on the widest range of scanners. Because traditional black-and-white film retains the image-creating silver after processing, its density range can be almost twice that of color film. This makes scanning traditional black-and-white film more difficult and requires a scanner with at least a 3.6d dynamic range and a Dmax between 4.0d and 5.0d. High-end (photo lab) flatbed scanners can reach a dynamic range of 3.7, and Dmax around 4.0d. Dedicated film scanners have a dynamic range of 3.0d–4.0d. Office document scanners can have a dynamic range of less than 2.0d. Drum scanners have a dynamic range of 3.6–4.5.

For scanning film, infrared cleaning is a technique used to remove the effects of dust and scratches on images scanned from film; many modern scanners incorporate this feature. It works by scanning the film with infrared light; the dyes in typical color film emulsions are transparent to infrared light, but dust and scratches are not, and block infrared; scanner software can use the visible and infrared information to detect scratches and process the image to greatly reduce their visibility, considering their position, size, shape, and surroundings. Scanner manufacturers usually have their own names attached to this technique. For example, Epson, Minolta, Nikon, Konica Minolta, Microtek, and others use Digital ICE, while Canon uses its own system, FARE (Film Automatic Retouching and Enhancement). Plustek uses LaserSoft Imaging iSRD. Some independent software developers design infrared cleaning tools.

By combining full-color imagery with 3D models, modern hand-held scanners are able to completely reproduce objects electronically. The addition of 3D color printers enables accurate miniaturization of these objects, with applications across many industries and professions.
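The infrared-cleaning technique described above can be sketched as a mask-and-fill pass over two aligned channels. This is a toy illustration on hypothetical 1-D sample data; real implementations operate on 2-D images and use far more sophisticated inpainting:

```python
def infrared_clean(visible, infrared, threshold=0.5):
    """Where the IR channel is dark (dust or a scratch blocked the infrared
    beam), replace the visible sample with the average of clean neighbours."""
    defect = [ir < threshold for ir in infrared]
    cleaned = list(visible)
    for i, bad in enumerate(defect):
        if bad:
            neighbours = [visible[j] for j in (i - 1, i + 1)
                          if 0 <= j < len(visible) and not defect[j]]
            if neighbours:
                cleaned[i] = sum(neighbours) / len(neighbours)
    return cleaned

# A dust speck at index 2 darkens the IR channel there; the visible value
# at that position is rebuilt from its neighbours (to ~0.80).
visible = [0.80, 0.82, 0.10, 0.78, 0.81]
infrared = [0.95, 0.96, 0.20, 0.94, 0.95]
print(infrared_clean(visible, infrared))
```

The key property the sketch captures is that the IR channel acts as a defect map independent of image content, which is why the technique works for dyed color film but not for silver-based black-and-white film, whose silver also blocks infrared.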
For scanner apps, the scan quality is highly dependent on the quality of the phone camera and on the framing chosen by the user of the app.

Connectivity

Scans must virtually always be transferred from the scanner to a computer or information storage system for further processing or storage. There are two basic issues: (1) how the scanner is physically connected to the computer and (2) how the application retrieves the information from the scanner. The file size of a scan can go up to about 100 MB for a 600 dpi, 23 × 28 cm (slightly larger than A4 paper) uncompressed 24-bit image. Scanned files must be transferred and stored. Scanners can generate this volume of data in a matter of seconds, making a fast connection desirable. Scanners communicate with their host computer using one of the following physical interfaces, listed roughly from slow to fast:

During the early 1990s, professional flatbed scanners were available for sharing over a local computer network. This proved useful to publishers, print shops, etc. This functionality largely fell out of use as the cost of flatbed scanners fell enough to make sharing unnecessary. From 2000, all-in-one multi-purpose devices became available which were suitable for both small offices and consumers, with printing, scanning, copying, and fax capability in a single apparatus that can be made available to all members of a workgroup. Battery-powered portable scanners store scans on internal memory; they can later be transferred to a computer either by direct connection, typically USB, or in some cases a memory card may be removed from the scanner and plugged into the computer. A raster image editor must be able to communicate with a scanner. There are many different scanners, and many of those scanners use different protocols. In order to simplify applications programming, some application programming interfaces (APIs) were developed. The API presents a uniform interface to the scanner.
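The uniform-interface idea can be sketched as follows (hypothetical class and device names; real APIs such as TWAIN and SANE are far richer, negotiating capabilities, color modes, and transfer formats):

```python
from abc import ABC, abstractmethod

class ScannerAPI(ABC):
    """The uniform interface an application codes against."""

    @abstractmethod
    def acquire(self, ppi: int) -> bytes:
        """Return raw 24-bit RGB scan data at the requested resolution."""

class AcmeScanner(ScannerAPI):
    """A vendor-supplied implementation: in reality it would translate
    acquire() into USB or SCSI commands for one specific device."""

    def acquire(self, ppi: int) -> bytes:
        side = ppi // 100              # stand-in for a real scan pass
        return bytes(side * side * 3)  # zeroed RGB pixels

def scan_document(device: ScannerAPI, ppi: int = 300) -> int:
    """Application code: works unchanged with any ScannerAPI subclass."""
    return len(device.acquire(ppi))

print(scan_document(AcmeScanner()))  # 27 (a 3x3 dummy scan, 3 bytes/pixel)
```

The application never touches device specifics; swapping in another vendor's subclass requires no application changes, which is the point of the abstraction.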
This means that the application does not need to know the specific details of the scanner in order to access it directly. For example, Adobe Photoshop supports the TWAIN standard; therefore in theory Photoshop can acquire an image from any scanner that has a TWAIN driver. In practice, there are often problems with an application communicating with a scanner. Either the application or the scanner manufacturer (or both) may have faults in their implementation of the API. Typically, the API is implemented as a dynamically linked library. Each scanner manufacturer provides software that translates the API procedure calls into primitive commands that are issued to a hardware controller (such as the SCSI, USB, or FireWire controller). The manufacturer's part of the API is commonly called a device driver, but that designation is not strictly accurate: the API does not run in kernel mode and does not directly access the device. Rather, the scanner API library translates application requests into hardware requests. Common scanner software APIs include:

Although nothing beyond a scanning utility is required to operate a scanner, many scanners come bundled with additional software. Typically, in addition to the scanning utility, some type of raster image editor (such as Photoshop or GIMP) and optical character recognition (OCR) software are supplied. OCR software converts graphical images of text into standard text that can be edited using common word-processing and text-editing software; accuracy is rarely perfect.

Output data

Some scanners, especially those designed for scanning printed documents, only work in black and white, but most modern scanners work in color. For the latter, the scanned result is a non-compressed RGB image, which can be transferred to a computer's memory. The color output of different scanners is not the same due to the spectral response of their sensing elements, the nature of their light source, and the correction applied by the scanning software.
While most image sensors have a linear response, the output values are usually gamma-compressed. Some scanners compress and clean up the image using embedded firmware. Once on the computer, the image can be processed with a raster graphics editor (such as Photoshop) and saved on a storage device (such as a hard disk). Scans may be stored uncompressed in image file formats such as BMP; stored losslessly compressed in file formats such as TIFF and PNG; stored lossy-compressed in file formats such as JPEG; or stored as embedded images or converted to vector graphics within a PDF. Optical character recognition (OCR) software allows a scanned image of text to be converted into editable text with reasonable accuracy, so long as the text is cleanly printed and in a typeface and size that can be read by the software. OCR capability may be integrated into the scanning software, or the scanned image file can be processed with a separate OCR program.

Specific uses

Document processing requirements differ from those of image scanning. These requirements include scanning speed, automated paper feed, and the ability to automatically scan both the front and the back of a document. On the other hand, image scanning typically requires the ability to handle fragile and/or three-dimensional objects as well as scan at a much higher resolution. Document scanners have document feeders, usually larger than those sometimes found on copiers or all-purpose scanners. Scans are made at high speed, from 20 up to 420 pages per minute, often in grayscale, although many scanners support color. Many scanners can scan both sides of double-sided originals (duplex operation). Sophisticated document scanners have firmware or software that cleans up scans of text as they are produced, eliminating accidental marks and sharpening type; this would be unacceptable for photographic work, where marks cannot reliably be distinguished from desired fine detail. Files created are compressed as they are made.
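The gamma compression of linear sensor values mentioned above can be illustrated with a simple power-law curve (a sketch; real devices use sRGB or vendor-specific transfer curves):

```python
def gamma_encode(linear, gamma=2.2):
    """Map a linear sensor value in [0, 1] to an 8-bit gamma-encoded level."""
    return round(linear ** (1 / gamma) * 255)

# Gamma encoding spends more of the 8-bit code space on dark tones, where
# the eye is most sensitive: a mid-grey linear value lands well above the
# 8-bit midpoint.
print(gamma_encode(0.5))   # 186, not 128
print(gamma_encode(0.0))   # 0
print(gamma_encode(1.0))   # 255
```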
The resolution used is usually from 150 to 300 dpi, although the hardware may be capable of 600 or higher resolution; this produces images of text good enough to read and for OCR, without the higher demands on storage space required by higher-resolution images. Document scans are often processed using OCR technology to create editable and searchable files. Most scanners use ISIS or TWAIN device drivers to scan documents into TIFF format so that the scanned pages can be fed into a document management system that will handle the archiving and retrieval of the scanned pages. Lossy JPEG compression, which is very efficient for pictures, is undesirable for text documents, as slanted straight edges take on a jagged appearance, and solid black (or other color) text on a light background compresses well with lossless compression formats. While paper feeding and scanning can be done automatically and quickly, preparation and indexing are necessary and require much work by humans. Preparation involves manually inspecting the papers to be scanned and making sure that they are in order, unfolded, without staples or anything else that might jam the scanner. Additionally, some industries such as legal and medical may require documents to have Bates Numbering or some other mark giving a document identification number and date/time of the document scan. Indexing involves associating relevant keywords to files so that they can be retrieved by content. This process can sometimes be automated to some extent, but it often requires manual labour performed by data-entry clerks. One common practice is the use of barcode-recognition technology: during preparation, barcode sheets with folder names or index information are inserted into the document files, folders, and document groups. Using automatic batch scanning, the documents are saved into appropriate folders, and an index is created for integration into document management systems. 
A specialized form of document scanning is book scanning. Technical difficulties arise from the books usually being bound and sometimes fragile and irreplaceable, but some manufacturers have developed specialized machinery to deal with this. Often special robotic mechanisms are used to automate the page-turning and scanning process.

Flatbed scanners have been used as digital backs for large-format cameras to create high-resolution digital images of static subjects. A modified flatbed scanner has been used for documentation and quantification of thin layer chromatograms detected by fluorescence quenching on silica gel layers containing an ultraviolet (UV) indicator. The ChromImage is reportedly the first commercial flatbed scanner densitometer. It enables acquisition of TLC plate images and quantification of chromatograms by use of Galaxie-TLC software. Other than being turned into densitometers, flatbed scanners have also been turned into colorimeters using different methods. Trichromatic Color Analyser is reportedly the first distributable system using a flatbed scanner as a tristimulus colorimetric device. Flatbed scanners may also be used to create artwork directly, in a practice known as scanography.

In the biomedical research field, detection devices for DNA microarrays are also referred to as scanners. These scanners are high-resolution systems (up to 1 μm/pixel), similar to microscopes. Detection is performed using CCDs or photomultiplier tubes. In pathology, scanners are used to capture glass slides with tissue from biopsies and other kinds of sampling, allowing for various methods of digital pathology such as telepathology and the application of artificial intelligence for interpretation.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Thirty-seventh_government_of_Israel#cite_note-6] | [TOKENS: 9915] |
Thirty-seventh government of Israel

The thirty-seventh government of Israel is the current cabinet of Israel, formed on 29 December 2022, following the Knesset election the previous month. The coalition government currently consists of five parties — Likud, Shas, Otzma Yehudit, Religious Zionist Party and New Hope — and is led by Benjamin Netanyahu, who took office as the prime minister of Israel for the sixth time. The government is widely regarded as the most right-wing government in the country's history, and includes far-right politicians. Several of the government's policy proposals have led to controversies, both within Israel and abroad, with the government's attempts at reforming the judiciary leading to a wave of demonstrations across the country. Following the outbreak of the Gaza war, opposition leader Yair Lapid initiated discussions with Netanyahu on the formation of an emergency government. On 11 October 2023, National Unity MKs Benny Gantz, Gadi Eisenkot, Gideon Sa'ar, Hili Tropper, and Yifat Shasha-Biton joined the Security Cabinet of Israel to form an emergency national unity government. Their accession to the Security Cabinet and to the government (as ministers without portfolio) was approved by the Knesset the following day. Gantz, Netanyahu, and Defense Minister Yoav Gallant became part of the newly formed Israeli war cabinet, with Eisenkot and Ron Dermer serving as observers. National Unity left the government in June 2024. New Hope rejoined the government in September. Otzma Yehudit announced on 19 January 2025 that it had withdrawn from the government, which took effect on 21 January, following the cabinet's acceptance of the three-phase Gaza war ceasefire proposal, though it rejoined two months later. United Torah Judaism left the government in July 2025 over dissatisfaction with the government's draft conscription law. Shas left the government several days later, though it remains part of the coalition.
Background

The right-wing bloc of parties, led by Benjamin Netanyahu, known in Israel as the national camp, won 64 of the 120 seats in the elections for the Knesset, while the coalition led by the incumbent prime minister Yair Lapid won 51 seats. The new majority has been variously described as the most right-wing government in Israeli history, as well as Israel's most religious government. Shortly after the elections, Lapid conceded to Netanyahu and congratulated him, wishing him luck "for the sake of the Israeli people". On 15 November, the swearing-in ceremony for the newly elected members of the 25th Knesset was held during the opening session. The vote to appoint a new Speaker of the Knesset, which is usually conducted at the opening session, and the swearing-in of cabinet members were postponed, since ongoing coalition negotiations had not yet resulted in agreement on these positions.

Government formation

On 3 November 2022, Netanyahu told his aide Yariv Levin to begin informal coalition talks with allied parties, after 97% of the vote was counted. The leader of the Shas party Aryeh Deri met with Yitzhak Goldknopf, the leader of United Torah Judaism and its Agudat Yisrael faction, on 4 November. The two parties agreed to cooperate as members of the next government. The Degel HaTorah faction of United Torah Judaism stated on 5 November that it would maintain its ideological stance about not seeking any ministerial posts, as per the instruction of its spiritual leader Rabbi Gershon Edelstein, but would seek other senior posts like Knesset committee chairmen and deputy ministers. Netanyahu himself started holding talks on 6 November. He first met with Moshe Gafni, the leader of Degel HaTorah, and then with Goldknopf. Meanwhile, the Religious Zionist Party leader Bezalel Smotrich and the leader of its Otzma Yehudit faction Itamar Ben-Gvir pledged that they would not enter the coalition without the other faction.
Gafni later met with Smotrich for coalition talks. Smotrich then met with Netanyahu. On 7 November, Netanyahu met with Ben-Gvir, who demanded the Ministry of Public Security with expanded powers for himself and the Ministry of Education or Transport and Road Safety for Yitzhak Wasserlauf. A major demand among all of Netanyahu's allies was that the Knesset be allowed to ignore the rulings of the Supreme Court. Netanyahu met with the Noam faction leader and its sole MK Avi Maoz on 8 November after he threatened to boycott the coalition. He demanded complete control of the Western Wall by the Haredi rabbinate and removal of what he considered anti-Zionist and anti-Jewish content in schoolbooks. President Isaac Herzog began consultations with heads of all the political parties on 9 November after the election results were certified. During the consultations, he expressed his reservations about Ben-Gvir becoming a member in the next government. Shas met with Likud for coalition talks on 10 November. By 11 November, Netanyahu had secured recommendations from 64 MKs, which constituted a majority. He was given the mandate to form the thirty-seventh government of Israel by President Herzog on 13 November. Otzma Yehudit and Noam officially split from Religious Zionism on 20 November as per a pre-election agreement. On 25 November, Otzma Yehudit and Likud signed a coalition agreement, under which Ben-Gvir would assume the newly created position of National Security Minister, whose powers would be more expansive than those of the Minister of Public Security, including overseeing the Israel Police and the Israel Border Police in the West Bank, as well as giving powers to authorities to shoot thieves stealing from military bases. Yitzhak Wasserlauf was given the Ministry for the Development of the Negev and the Galilee with expanded powers to regulate new West Bank settlements, while separating it from the "Periphery" portfolio, which was given to Shas.
The deal also includes giving the Ministry of Heritage to Amihai Eliyahu, separating it from the "Jerusalem Affairs" portfolio, the chairmanship of the Knesset's Public Security Committee to Zvika Fogel and that of the Special Committee for the Israeli Citizens' Fund to Limor Son Har-Melech, the post of Deputy Economic Minister to Almog Cohen, establishment of a national guard, and expansion of mobilization of reservists in the Border Police. Netanyahu and Maoz signed a coalition agreement on 27 November, under which the latter would become a deputy minister, would head an agency on Jewish identity in the Prime Minister's Office, and would also head Nativ, which processes the aliyah from the former Soviet Union. The agency for Jewish identity would have authority over educational content taught outside the regular curriculum in schools, in addition to the department of the Ministry of Education overseeing external teaching and partnerships, which would bring nonofficial organisations permitted to teach and lecture at schools under its purview. Likud signed a coalition agreement with the Religious Zionist Party on 1 December. Under the deal, Smotrich would serve as the Minister of Finance in rotation with Aryeh Deri, and the party would receive the post of a minister within the Ministry of Defense with control over the departments administering settlement and open lands under the Coordinator of Government Activities in the Territories, in addition to another post of a deputy minister. The deal also includes giving the post of Minister of Aliyah and Integration to Ofir Sofer, the newly created National Missions Ministry to Orit Strook, and the chairmanship of the Knesset's Constitution, Law and Justice Committee to Simcha Rothman. Likud and United Torah Judaism signed a coalition agreement on 6 December, allowing a request for an extension of the deadline.
Under it, the party would receive the Ministry of Construction and Housing, the chairmanship of the Knesset Finance Committee which will be given to Moshe Gafni, the Ministry of Jerusalem and Tradition (which would replace the Ministry of Jerusalem Affairs and Heritage), in addition to several posts of deputy ministers and chairmanships of Knesset committees. Likud also signed a deal with Shas by 8 December, securing interim coalition agreements with all of their allies. Under the deal, Deri will first serve as the Minister of Interior and Health, before rotating posts with Smotrich after two years. The party will also receive the Ministry of Religious Services and Welfare Ministries, as well as posts of deputy ministers in the Ministry of Education and Interior. The vote to replace then-incumbent Knesset speaker Mickey Levy was scheduled for 13 December, after Likud and its allies secured the necessary number of signatures for it. Yariv Levin of Likud was elected as an interim speaker by 64 votes, while his opponents Merav Ben-Ari of Yesh Atid and Ayman Odeh of Hadash received 45 and five votes respectively. Netanyahu asked Herzog for a 14-day extension after the agreement with Shas to finalise the roles his allied parties would play. Herzog on 9 December extended the deadline to 21 December. On that date, Netanyahu informed Herzog that he had succeeded in forming a coalition, with the new government expected to be sworn in by 2 January 2023. The government was sworn in on 29 December 2022. Timeline Israeli law stated that people convicted of crimes cannot serve in the government. An amendment to that law was made in late 2022, known colloquially as the Deri Law, to allow those who had been convicted without prison time to serve. This allowed Deri to be appointed to the cabinet. Shas leader Aryeh Deri was appointed to be Minister of Health, Minister of the Interior, and Vice Prime Minister in December 2022. 
He was fired in January 2023, following a Supreme Court decision that his appointment was unreasonable, since he had been convicted of fraud, and had promised not to seek government roles through a plea deal. In March 2023, Defence Minister Yoav Gallant called on the government to delay legislation related to the judicial reform. Prime Minister Netanyahu announced that he had been dismissed from his position, leading to the continuation of mass protests across the country (which had started in January in Tel Aviv). Gallant continued to serve as a minister as he had not received formal notice of dismissal, and two weeks later it was announced that Netanyahu had reversed his decision. Public Safety Minister Itamar Ben-Gvir (Otzma Yehudit leader) and Minister of Justice Yariv Levin (Likud) both threatened to resign if the judicial reform was delayed. After the outbreak of the Gaza war, five members of the National Unity party joined the government as ministers without portfolio, with leader Benny Gantz being made a member of the new Israeli war cabinet (along with Netanyahu and Gallant). As the war progressed, minister of national security Itamar Ben-Gvir threatened to leave the government if the war was ended. A month later, in mid-December, he again threatened to leave if the war did not maintain "full strength". Gideon Sa'ar stated on 16 March that his New Hope party would resign from the government and join the opposition if Prime Minister Benjamin Netanyahu did not appoint him to the Israeli war cabinet. Netanyahu did not do so, resulting in Sa'ar's New Hope party leaving the government nine days later, reducing the size of the coalition from 76 MKs to 72. Ben-Gvir and Bezalel Smotrich, of the National Religious Party–Religious Zionism party, have indicated that they will withdraw their parties from the government if the January 2025 Gaza war ceasefire is adopted, which would bring down the government.
Ben-Gvir announced on 5 June that the members of his party would be allowed to vote as they wish, though his party resumed support on 9 June. On 18 May, Gantz set an 8 June deadline for withdrawal from the coalition, which was delayed by a day following the 2024 Nuseirat rescue operation. Gantz and his party left the government on 9 June, giving the government 64 seats in the Knesset. Sa'ar and his New Hope party rejoined the Netanyahu government on 30 September, increasing the number of seats held by the government to 68. The High Court of Justice ruled on 28 March 2024 that yeshiva funds would no longer be available for students who are "eligible for enlistment", effectively allowing ultra-Orthodox Jews to be drafted into the IDF. Attorney general Gali Baharav-Miara indicated on 31 March that the conscription process must begin on 1 April. The court ruled on 25 June that the IDF must begin to draft yeshiva students. Likud announced on 7 July that it would not put forward any legislation after Shas and United Torah Judaism said that they would boycott the plenary session over the lack of legislation dealing with the Haredi draft. The ultra-Orthodox boycott continued for a second day, with UTJ briefly ending its boycott on 9 July to unsuccessfully vote in favor of a bill which would have weakened the Law of Return. Yuli Edelstein, who was replaced by Boaz Bismuth on the Foreign Affairs and Defense Committee in early August, published a draft version of the conscription law shortly before his ouster. Bismuth cancelled the work on the draft law in September 2025, which Edelstein called "a shame." Bismuth released the official version of the draft law in late November 2025. It weakened penalties for draft evaders, with Edelstein saying it was "the exact opposite" of the bill which he attempted to pass. Members of Otzma Yehudit resigned from the government on 19 January 2025 over the January 2025 Gaza war ceasefire; the resignations took effect on 21 January.
The members rejoined in March, following the "resumption" of the war in Gaza. Avi Maoz of the Noam party left the government in March 2025. On 4 June 2025, senior rabbis for United Torah Judaism Dov Lando and Moshe Hillel Hirsch instructed the party's MKs to pass a bill which would dissolve the Knesset. Yesh Atid, Yisrael Beytenu and The Democrats announced that they would "submit a bill" for dissolution on 11 June, with Yesh Atid tabling the bill on 4 June. There were also reports that Shas would vote in favor of Knesset dissolution amid division within the governing coalition over Haredi conscription. This jeopardized the coalition's majority and would have triggered new elections if the bill passed. The following day, Agudat Yisrael, one of the United Torah Judaism factions, confirmed that it would submit a bill to dissolve the Knesset. Asher Medina, a Shas spokesman, indicated on 9 June that the party would vote in favor of a preliminary bill to dissolve the Knesset. The rabbis of Degel HaTorah instructed the party's MKs on 12 June 2025 to oppose the dissolution of the Knesset, after which Yuli Edelstein and the Shas and Degel HaTorah parties announced that a deal had been reached, with "rabbinical leaders" telling their parties to delay the dissolution vote by a week. Shas and Degel HaTorah voted against the dissolution bill, which failed its preliminary reading by 61 votes against to 53 in favor. MKs Ya'akov Tessler and Moshe Roth of Agudat Yisrael voted in favor of dissolution. A new dissolution bill cannot be brought forward for another six months. Had the bill passed its preliminary reading and three further readings, an election would have been held in approximately three months; The Jerusalem Post posited it would have been held in October. 
Degel HaTorah announced on 14 July 2025 that it would leave the government because members of the party were dissatisfied with the draft bill proposed by Yuli Edelstein regarding Haredi exemptions from the Israeli draft. Several hours later, Agudat Yisrael announced that it would also leave the government. Deputy Transportation Minister Uri Maklev; Moshe Gafni, the head of the Knesset Finance Committee; Ya'akov Asher, the head of the Knesset Interior and Environment Protection Committee; and Jerusalem Affairs minister Meir Porush all submitted resignations, which took effect within 48 hours. Sports Minister Ya'akov Tessler and Special Committee for Public Petitions chair Yitzhak Pindrus also submitted resignations. Yisrael Eichler submitted his resignation as head of the Knesset Labor and Welfare Committee the same day. The resignations left Netanyahu's government with a 60-seat majority in the Knesset, as Avi Maoz, of the Noam party, had left the government in March 2025. Despite Edelstein's ouster in August, a spokesman for UTJ head Yitzhak Goldknopf remarked that it would not change the faction's withdrawal from the government. The religious council for Shas, the Moetzet Chachmei HaTorah, instructed the party on 16 July to leave the government but to stay in the coalition. The following day, various cabinet ministers submitted their resignations, including Interior Minister Moshe Arbel, Social Affairs Minister Ya'akov Margi and Religious Services Minister Michael Malchieli. Malchieli reportedly postponed his resignation so that he could attend a 20 July meeting of the panel investigating whether Attorney General Gali Baharav-Miara should be dismissed. Deputy Minister of Agriculture Moshe Abutbul, Minister of Health Uriel Buso and Haim Biton, a minister in the Education Ministry, also submitted their resignation letters, while Arbel retracted his. 
The last cabinet member from the party to submit a resignation was Labor Minister Yoav Ben-Tzur. The ministers who resigned will return to the Knesset, replacing MKs Moshe Roth, Yitzhak Pindrus and Eliyahu Baruchi.

Members of government

Listed below are the current ministers in the government:

Principles and priorities

According to the agreements signed between Likud and each of its coalition partners, and the incoming government's published guideline principles, its stated priorities are to combat the cost of living, further centralize Orthodox control over the state religious services, pass judicial reforms that include legislation to reduce judicial controls on executive and legislative power, expand settlements in the West Bank, and consider an annexation of the West Bank. Before the vote of confidence in his new government in the Knesset, Netanyahu presented three top priorities for the new government: internal security and governance, halting the nuclear program of Iran, and the development of infrastructure, with a focus on further connecting the center of the country with its periphery.

Policies

The government's flagship program, centered on reforms to the judicial branch, drew widespread criticism. Critics said it would have negative effects on the separation of powers, the office of the Attorney General, the economy, public health, women and minorities, workers' rights, scientific research, the overall strength of Israel's democracy, and its foreign relations. After weeks of public protests on Israel's streets, joined by a growing number of military reservists, Minister of Defense Yoav Gallant spoke against the reform on 25 March, calling for a halt to the legislative process "for the sake of Israel's security". The next day, Netanyahu announced that Gallant would be removed from his post, sparking another wave of protests across Israel and ultimately leading to Netanyahu agreeing to pause the legislation. 
On 10 April, Netanyahu announced that Gallant would keep his post. On 27 March 2023, after the public protests and general strikes, Netanyahu announced a pause in the reform process to allow for dialogue with opposition parties. However, negotiations aimed at reaching a compromise collapsed in June, and the government resumed its plans to unilaterally pass parts of the legislation. On 24 July 2023, the Knesset passed a bill curbing the power of the Supreme Court to declare government decisions unreasonable; on 1 January 2024, the Supreme Court struck the bill down. The Knesset passed a "watered-down" version of the judicial reform package in late March 2025 which "changes the composition" of the judicial selection committee. In December 2022, Minister of National Security Itamar Ben-Gvir sought to amend the law that regulates the operations of the Israel Police, such that the ministry would have more direct control of its forces and policies, including its investigative priorities. Attorney General Gali Baharav-Miara objected to the draft proposal, raising concerns that the law would enable the politicization of police work, and the draft was amended to partially address those concerns. Nevertheless, in March 2023 Deputy Attorney General Gil Limon stated that the Attorney General's fears had been realized, referring to several instances of ministerial involvement in the day-to-day work of the otherwise independent police force – statements that were repeated by the Attorney General herself two days later. Separately, Police Commissioner Kobi Shabtai instructed Deputy Commissioners to avoid direct communication with the minister, later stating that "the Israel Police will remain apolitical, and act only according to law". Following appeals by the Association for Civil Rights in Israel and the Movement for Quality Government in Israel, the High Court of Justice instructed Ben-Gvir "to refrain from giving operational directions to the police... 
[especially] as regards to protests and demonstrations against the government." As talk of halting the judicial reform gathered momentum during March 2023, Minister of National Security Itamar Ben-Gvir threatened to resign if the legislation implementing the changes was suspended. To appease Ben-Gvir, Prime Minister Netanyahu announced that the government would promote the creation of a new National Guard, to be headed by Ben-Gvir. On 29 March, thousands of Israelis demonstrated in Tel Aviv, Haifa and Jerusalem against this decision. On 1 April, the New York Times quoted Gadeer Nicola, head of the Arab department at the Association for Civil Rights in Israel, as saying "If this thing passes, it will be an imminent danger to the rights of Arab citizens in this country. This will create two separate systems of applying the law. The regular police which will operate against Jewish citizens — and a militarized militia to deal only with Arab citizens." The same day, while speaking on Israel's Channel 13 about those whom he would like to see enlist in the National Guard, Ben-Gvir specifically mentioned La Familia, the far-right fan club of the Beitar Jerusalem soccer team. On 2 April, Israel's cabinet approved the establishment of a law enforcement body that would operate independently of the police, under Ben-Gvir's authority. According to the decision, the Minister was to establish a committee chaired by the Director General of the Ministry of National Security, with representatives of the ministries of defense, justice and finance, as well as the police and the IDF, to outline the operations of the new organization. The committee's recommendations were to be submitted to the government for consideration. Addressing a conference on 4 April, Police Commissioner Kobi Shabtai said that he was not opposed to the establishment of a security body which would answer to the police, but "a separate body? Absolutely not." 
The police chief said he had warned Ben-Gvir that the establishment of a security body separate from the police is "unnecessary, with extremely high costs that may harm citizens' personal security." During a press conference on 10 April, Prime Minister Netanyahu said, in what some news outlets saw as a concession to the protesters, that "This will not be anyone's militia, it will be a security body, orderly, professional, that will be subordinate to one of the [existing] security bodies." The committee established by the government recommended that the government order the immediate establishment of the National Guard and allocate budgets for it. The National Guard, to be commanded by a police superintendent, will not be subordinate to Ben-Gvir; it will be subordinate to the police commissioner and form part of the Israel Border Police. The Ministries of Defense and Finance opposed the conclusions. The Israeli National Security Council called for further discussion of the matter. The coalition's efforts to expand the purview of Rabbinical courts; force some organizations, such as hospitals, to enforce certain religious practices; amend the Law Prohibiting Discrimination to allow gender segregation and discrimination on the grounds of religious belief; expand funding for religious causes; and put into law the exemption of yeshiva and kolel students from conscription have drawn criticism. According to a Haaretz op-ed of 7 March 2023, "the current coalition is interested... in modifying the public space so it suits the religious lifestyle. The legal coup is meant to castrate anyone who can prevent it, most of all the HCJ." Several banks and institutional investors, including the Israel Discount Bank and AIG, have committed to avoid investing in, or providing credit to, any organization that discriminates against others on grounds of religion, race, gender or sexual orientation. 
A series of technology companies and investment firms, including Wiz, Intel Israel, Salesforce and Microsoft Israel Research and Development, have criticized the proposed changes to the Law Prohibiting Discrimination, with Wiz stating that it will require its suppliers to commit to preventing discrimination. Over sixty prominent law firms pledged that they will neither represent, nor do business with, discriminating individuals and organizations. Insight Partners, a major private equity fund operating in Israel, released a statement warning against intolerance and any attempt to harm personal liberties. Orit Lahav, chief executive of the women's rights organization Mavoi Satum ("Dead End"), said that "the Rabbinical courts are the most discriminatory institution in the State of Israel... Limiting the HCJ while expanding the jurisdiction of the Rabbinical courts would... cause significant harm to women." Anat Thon Ashkenazy, Director of the Center for Democratic Values and Institutions at the Israel Democracy Institute, said that "almost every part of the reform could harm women... the meaning of an override clause is that even if the court says that the law on gender segregation is illegitimate, is harmful, the Knesset could say 'Okay, we say otherwise'". She added that "there is a very broad institutional framework here, after which there will come legislation that harms women's rights and we will have no way of protecting or stopping it." During July 2023, 20 professional medical associations signed a position letter warning against the ramifications to public health that would result from the exclusion of women from the public sphere. They cited, among other effects, a rise in the prevalence of risk factors for cardiovascular disease, pregnancy-related ailments, psychological distress, and the risk of suicide. 
On 30 July the Knesset passed an amendment to the penal law adding sexual offenses to those whose penalty can be doubled if committed on grounds of "nationalistic terrorism, racism or hostility towards a certain community". According to MK Limor Son Har-Melech, the bill is meant to penalize any individual who "[intends to] harm a woman sexually based on her Jewishness". The law was criticized by MK Gilad Kariv as "populist, nationalistic, and dangerous towards the Arab citizens of Israel", and by MK Ahmad Tibi as a "race law", and was objected to by legal advisors at the Ministry of Justice and the Knesset Committee on National Security. Activist Orit Kamir wrote that "the amendment... is neither feminist, equal, nor progressive, but the opposite: it subordinates women's sexuality to the nationalistic, racist patriarchy. It hijacks the Law for Prevention of Sexual Harassment to serve a world view that tags women as sexual objects that personify the nation's honor." Yael Sherer, director of the Lobby to Combat Sexual Violence, criticized the law as being informed by dated ideas about sexual assault, and proposed that MKs "dedicate a session... to give victims of sexual assault an opportunity to come out of the darkness... instead of [submitting] declarative bills that change nothing and are not meant but for grabbing headlines". During 2022, 24 women in Israel "were murdered because they were women", a 50% increase compared to 2021. A law permitting courts to order men subject to a restraining order following domestic violence offenses to wear electronic tags was drafted during the previous Knesset and had passed its first reading unanimously. On 22 March 2023, the Knesset voted to reject the bill. It had been urged to do so by National Security Minister Itamar Ben-Gvir, who said that the bill was unfair to men. Earlier in the week, Ben-Gvir had blocked the measure from advancing in the ministerial legislative committee. 
The MKs voting against the bill included Prime Minister Netanyahu. The Association of Families of Murder Victims said that by rejecting the law, National Security Minister Itamar Ben-Gvir "brings joy to violent men and abandons the women threatened with murder… unsupervised restraining orders endanger women's lives even more. They give women the illusion of being protected, and then they are murdered." MK Pnina Tamano-Shata, chairwoman of the Knesset Committee on the Status of Women and Gender Equality, said that "the coalition proved today that it despises women's lives." The NGO Amutat Bat Melech [he], which assists Orthodox and ultra-Orthodox women who suffer from domestic violence, said that: "Rejecting the electronic bracelet bill is disconnected from the terrible reality of seven femicides since the beginning of the year. This is an effective tool of the first degree that could have saved lives and reduced the threat to women suffering from domestic violence. This is a matter of life and death, whose whole purpose is to provide a solution to defend women." The agreement signed by the coalition parties includes the setting up of a committee to draft changes to the Law of Return. Israeli religious parties have long demanded that the "grandchild clause" of the Law of Return be cancelled. This clause grants citizenship to anyone with at least one Jewish grandparent, as long as they do not practice another religion. If the grandchild clause were to be removed from the Law of Return then around 3 million people who are currently eligible for aliyah would no longer be eligible. 
The heads of the Jewish Agency, the Jewish Federations of North America, the World Zionist Organization and Keren Hayesod sent a joint letter to Prime Minister Netanyahu, expressing their "deep concern" about any changes to the Law of Return, adding that "Any change in the delicate and sensitive status quo on issues such as the Law of Return or conversion could threaten to unravel the ties between us and keep us away from each other." The Executive Council of Australian Jewry and the Zionist Federation of Australia issued a joint statement saying "We… view with deep concern… proposals in relation to religious pluralism and the law of return that risk damaging Israel's… relationship with Diaspora Jewry." On 19 March 2023, Israeli Finance Minister Bezalel Smotrich spoke in Paris at a memorial service for a Likud activist. The lectern at which Smotrich spoke was covered with a flag depicting the 'Greater Land of Israel,' encompassing the whole of Mandatory Palestine, as well as Trans-Jordan. During his speech, Smotrich said that "there's no such thing as Palestinians because there's no such thing as a Palestinian people." He added that the Palestinian people are a fictitious nation invented only to fight the Zionist movement, asking "Is there a Palestinian history or culture? There isn't any." The event received widespread media coverage. On 21 March, a spokesman for the US State Department sharply criticized Smotrich's comments. "The comments, which were delivered at a podium adorned with an inaccurate and provocative map, are offensive, they are deeply concerning, and, candidly, they're dangerous. The Palestinians have a rich history and culture, and the United States greatly values our partnership with the Palestinian people," he said. 
The Jordanian Foreign Ministry also voiced disapproval: "The Israeli Minister of Finance's use, during his participation in an event held yesterday in Paris, of a map of Israel that includes the borders of the Hashemite Kingdom of Jordan and the occupied Palestinian territories represents a reckless inflammatory act, and a violation of international norms and the Jordanian-Israeli peace treaty." Additionally, a map encompassing Mandatory Palestine and Trans-Jordan with a Jordanian flag on it was placed on a central lectern in the Jordanian Parliament. Jordan's parliament voted to expel the Israeli ambassador. Israel's Ministry of Foreign Affairs released a clarification relating to the matter, stating that "Israel is committed to the 1994 peace agreement with Jordan. There has been no change in the position of the State of Israel, which recognizes the territorial integrity of the Hashemite Kingdom of Jordan". Ahead of a Europe Day event due to take place on 9 May 2023, far-right National Security Minister Itamar Ben-Gvir was assigned as a representative of the government and a speaker at the event by the government secretariat, which assigns ministers to attend receptions marking the national days of foreign embassies. The European Union requested that Ben-Gvir not attend, but the government did not change the plan. On 8 May, the European delegation to Israel cancelled the reception, stating that: "The EU Delegation to Israel is looking forward to celebrating Europe Day on May 9, as it does every year. Regrettably, this year we have decided to cancel the diplomatic reception, as we do not want to offer a platform to someone whose views contradict the values the European Union stands for. However, the Europe Day cultural event for the Israeli public will be maintained to celebrate with our friends and partners in Israel the strong and constructive bilateral relationship". 
Israel's Opposition Leader Yair Lapid stated: "Sending Itamar Ben-Gvir to a gathering of EU ambassadors is a serious professional mistake. The government is embarrassing a large group of friendly countries, jeopardizing future votes in international institutions, and damaging our foreign relations. Last year, after a decade of efforts, we succeeded in signing an economic-political agreement with the European Union that will contribute to the Israeli economy and our foreign relations. Why risk it, and for what? Ben-Gvir is not a legitimate person in the international community (and not really in Israel either), and sometimes you have to be both wise and just and simply send someone else". On 23 February 2023, Defense Minister Gallant signed an agreement assigning governmental powers in the West Bank to a body to be headed by Minister Bezalel Smotrich, who would effectively become the governor of the West Bank, controlling almost all areas of life in the area, including planning, building and infrastructure. Israeli governments had hitherto been careful to maintain the occupation as a military government. The temporary holding of power by an occupying military force, pending a negotiated settlement, is a principle of international law – an expression of the prohibition against obtaining sovereignty through conquest that was introduced in the wake of World War II. An editorial in Haaretz noted that the assignment of governmental powers in the West Bank to a civilian governor, alongside the plan to expand the dual justice system so that Israeli law would apply fully to settlers in the West Bank, constitutes de jure annexation of the West Bank. 
On 26 February 2023, following the 2023 Huwara shooting in which two Israelis were killed by an unidentified attacker, hundreds of Israeli settlers attacked the Palestinian town of Huwara and three nearby villages, setting alight hundreds of Palestinian homes (some with people inside), businesses, a school, and numerous vehicles, killing one Palestinian man and injuring 100 others. Bezalel Smotrich subsequently called on Twitter for Huwara to be "wiped out" by the Israeli government. Zvika Fogel MK, of the ultra-nationalist Otzma Yehudit, which forms part of the governing coalition, said that he "looks very favorably upon" the results of the rampage. Members of the coalition proposed an amendment to the Disengagement Law, which would allow Israelis to resettle settlements vacated during the 2005 Israeli disengagement from Gaza and the northern West Bank. Most countries considered the evacuated settlements illegal under international law. The proposal was approved for voting by the Foreign Affairs and Defense Committee on 9 March 2023, while the committee was still waiting for briefing materials from the NSS, IDF, MFA and Shin Bet, and was passed on 21 March. The US requested clarification from Israeli ambassador Michael Herzog. A US State Department spokesman stated that "The U.S. strongly urges Israel to refrain from allowing the return of settlers to the area covered by the legislation, consistent with both former Prime Minister Sharon and the current Israeli Government's commitment to the United States," noting that the actions represented a clear violation of undertakings given by the Sharon government to the Bush administration in 2005, and by Netanyahu's coalition to the Biden administration the previous week. 
Minister of Communication Shlomo Karhi had initially intended to cut the funding of the Israeli Public Broadcasting Corporation (also known by its blanket branding Kan) by 400 million shekels – roughly half of its total budget – close several departments, and privatize content creation. In response, the Director-General of the European Broadcasting Union, Noel Curran, sent two urgent letters to Netanyahu, expressing his concerns and calling on the Israeli government to "safeguard the independence of our Member KAN and ensure it is allowed to operate in a sustainable way, with funding that is both stable, adequate, fair, and transparent." On 25 January 2023, nine journalist organizations representing some of Kan's competitors issued a statement of concern, acknowledging the "important contribution of public broadcasting in creating a worthy, unbiased and non-prejudicial journalistic platform", and noting that "the existence of the [broadcasting] corporation as a substantial public broadcast organization strengthens media as a whole, adding to the competition in the market rather than weakening it." They also expressed their concern that the "real reason" for the proposal was actually "an attempt to silence voices from which... [the Minister] doesn't always draw satisfaction". The same day, hundreds of journalists, actors and filmmakers protested in Tel Aviv. The proposal was eventually put on hold. On 22 February 2023 it was reported that Prime Minister Netanyahu was attempting to appoint his close associate Yossi Shelley as the deputy to the National Statistician – a highly sensitive position in charge of providing accurate data for decision makers. The appointment of Shelley, who did not possess the required qualifications for the role, was withdrawn following publication. 
In its daily editorial, Haaretz tied this attempt with the judicial reform: "once they take control of the judiciary, law enforcement and public media, they wish to control the state's data base, the dry numerical data it uses to plan its future". Netanyahu also proposed Avi Simhon for the role, and eventually froze all appointments at the Israel Central Bureau of Statistics. Also on 22 February 2023, it was revealed that Yoav Kish, the Minister of Education, was promoting a draft government decision changing the composition of the National Library of Israel's board of directors in a way that would grant him more power over the institution. In response, the Hebrew University – which owned the library until 2008 – announced that if the draft were accepted, it would withdraw its collections from the library. The university's collections, which according to the university constitute some 80% of the library's collection, include the Agnon archive, the original manuscript of Hatikvah, and the Rothschild Haggadah, the oldest known Haggadah. A group of 300 authors and poets signed an open letter against the move, further noting their objection to "political takeover" of public broadcasting, as well as to "any legislation that will castrate the judiciary and damage the democratic foundations of the state of Israel". Several days later, it was reported that a series of donors had decided to withhold their donations to the library, totaling some 80 million shekels. On 3 March a petition against the move by 1,500 academics, including Israel Prize laureates, was sent to Kish. The proposal has been seen by some as retribution against Shai Nitzan, the former State Attorney and the library's current rector. On 5 March it was reported that the Legal Advisor to the Ministry of Finance, Asi Messing, was withholding the proposal. According to Messing, the proposal – which was being promoted as part of the Economic Arrangements Law – "was not reviewed... 
by the qualified personnel in the Ministry of Finance, does not align with any of the common goals of the economic plan, was not agreed to by myself and was not approved by the Attorney General." As of February 2023, the government has been debating several proposals that would significantly weaken the Ministry of Environmental Protection, including reducing the environmental regulation of planning and development and of electricity production. One of the main proposals, the transfer of a 3 billion shekel fund meant to finance waste management plants from the Ministry of Environmental Protection to the Ministry of the Interior, was eventually withdrawn. The Minister of Environmental Protection, Idit Silman, has been criticized for meeting with climate change deniers, for wasteful and personally motivated travel at the ministry's expense, for politicizing the role, and for engaging in political activity on the ministry's time. The government has been noted for an unusually high number of dismissals and resignations of senior career civil servants, and for frequent attempts to replace them with candidates with known political associations, who are often less competent. According to sources, Netanyahu and those close to him have been seeking out civil servants who were appointed by the previous government, intent on replacing them with people loyal to him. Governmental nominees for various positions have been criticized for lack of expertise. In addition to the nominee to the position of Deputy National Statistician (see above), the Director General of the Ministry of Finance, Shlomi Heisler; the Director General of the Ministry of Justice, Itamar Donenfeld; and the Director General of the Ministry of Transport, Moshe Ben Zaken, have all been criticized for incompetence, lack of familiarity with their Ministries' subject matter, lack of interest in the job, or lack of experience in managing large organizations. 
It has been reported that in some ministries, senior officials were enacting slowdowns as a means of dealing with the new ministers and directors general. On 28 July the director general of the Ministry of Education resigned, citing the societal "rift" as the reason. Asaf Zalel, a retired Air Force Brigadier General, was appointed in January. When asked about attempts to appoint his personal friend and attorney to the board of directors of a state-owned company, Minister David Amsalem replied: "that is my job, due to my authority to appoint directors. I put forward people that I know and hold in esteem". Under Minister of Transport Miri Regev, the ministry has either dismissed or lost the heads of the National Public Transport Authority, Israel Airports Authority, National Road Safety Authority, Israel Railways, and several officials in Netivei Israel. The current chair of Netivei Israel is Likud member and Regev associate Yigal Amadi, and the legal counsel is Einav Abuhzira, daughter of a former Likud branch chair. Abuhzira was appointed instead of Elad Berdugo, nephew of Netanyahu surrogate Yaakov Bardugo, after Berdugo was disqualified for the role by the Israel Government Companies Authority. In July 2023 the Minister of Communications, Shlomo Karhi, and the minister in charge of the Israel Government Companies Authority, Dudi Amsalem, deposed the chair of the Israel Postal Company, Michael Vaknin. The chair, who was hired to lead the company's financial recovery after years of operational loss and towards privatization, had gained the support of officials at the Authority and at the Ministry of Finance; nevertheless, the ministers claimed that his performance was inadequate, and nominated in his place Yiftah Ron-Tal, who has known ties to Netanyahu and Smotrich. They also nominated four new directors, two of whom have known political associations, and a third who was a witness in Netanyahu's trial. 
The coalition is allowed to spend a portion of the state's budget on a discretionary basis, meant to coax member parties to reach an agreement on the budget. As of May 2023, the government was pushing an allocation of over 13 billion shekels over two years – almost seven times the amount allocated by the previous government. Most of the funds will be allocated for uses associated with the religious, Orthodox and settler communities. The head of the Budget Department at the Ministry of Finance, Yoav Gardos, objected to the allocations, claiming they would exacerbate unemployment in the Orthodox community, which is projected to cost the economy a total of 6.7 trillion shekels in lost output by 2065. At the onset of the Gaza war and the declaration of a state of national emergency, Minister of Finance Bezalel Smotrich instructed government agencies to continue with the planned distribution of discretionary funds.

Corruption

During March 2023, the government was promoting an amendment to the Law on Public Service (Gifts) that would allow Netanyahu to receive donations to fund his legal defense. The amendment followed a decision by the High Court of Justice (HCJ) that forced Netanyahu to refund US$270,000 given to him and his wife by his late cousin, Nathan Mileikowsky, for their legal defense. This stands in contrast to past statements by Minister of Justice Yariv Levin, who had spoken against the possible conflict of interest that can result from such transactions. The bill was opposed by the Attorney General, Gali Baharav-Miara, who stressed that it could "create a real opportunity for governmental corruption", and was eventually withdrawn at the end of March. As of March 2023, the coalition was promoting a bill that would prevent judicial review of ministerial appointments. 
The bill is intended to prevent the HCJ from reviewing the appointment of the twice-convicted chairman of Shas, Aryeh Deri (convicted of bribery, fraud, and breach of trust), to a ministerial position, after his previous appointment was annulled on grounds of unreasonableness. The bill follows on the heels of another amendment, which relaxed the ban on the appointment of convicted criminals so that Deri - who was handed a suspended sentence after his second conviction - could be appointed. The bill is opposed by the Attorney General, as well as by the Knesset Legal Adviser, Sagit Afik. Israeli law allows for declaring a Prime Minister (as well as several other high-ranking public officials) temporarily or permanently incapacitated, but does not specify the conditions that can lead to a declaration of incapacitation. In the case of the Prime Minister, the authority to do so rests with the Attorney General. In March 2023, the coalition advanced a bill that would transfer this authority from the Attorney General to the government, subject to the approval of the Knesset committee, and clarify that incapacitation can result only from medical or mental conditions. On 3 January 2024, the Supreme Court ruled by a majority of 6 out of 11 that the law's entry into force would be postponed to the next Knesset, because in its immediate application the bill is a personal law intended to serve a distinct personal purpose. Later, the court rejected a petition seeking to have Netanyahu declared an incapacitated prime minister due to his ongoing trial and conflicts of interest.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Invertebrates] | [TOKENS: 4159] |
Invertebrate

Invertebrates are animals that neither develop nor retain a vertebral column (commonly known as a spine or backbone), which evolved from the notochord. It is a paraphyletic grouping including all animals excluding the chordate subphylum Vertebrata, i.e. vertebrates. Well-known phyla of invertebrates include arthropods, molluscs, annelids, echinoderms, flatworms, cnidarians, and sponges. The majority of animal species are invertebrates; one estimate puts the figure at 97%. Many invertebrate taxa have a greater number and diversity of species than the entire subphylum of Vertebrata. Invertebrates vary widely in size, from 10 μm (0.0004 in) myxozoans to the 9–10 m (30–33 ft) colossal squid. Some so-called invertebrates, such as the Tunicata and Cephalochordata, are actually sister chordate subphyla to Vertebrata, being more closely related to vertebrates than to other invertebrates. This makes the "invertebrates" paraphyletic, so the term has no significance in taxonomy.

Etymology

The word "invertebrate" comes from the Latin word vertebra, which means a joint in general, and sometimes specifically a joint from the spinal column of a vertebrate. The jointed aspect of vertebra is derived from the concept of turning, expressed in the root verto or vorto, to turn. The prefix in- means "not" or "without".

Taxonomic significance

The term invertebrates does not describe a taxon in the same way that Arthropoda, Vertebrata or Manidae do. Each of those terms describes a valid taxon, phylum, subphylum or family. "Invertebrata" is a term of convenience, not a taxon; it has very little circumscriptional significance except within the Chordata. The Vertebrata as a subphylum comprises such a small proportion of the Metazoa that to speak of the kingdom Animalia in terms of "Vertebrata" and "Invertebrata" has limited practicality.
In the more formal taxonomy of Animalia, other attributes logically should precede the presence or absence of the vertebral column in constructing a cladogram, for example, the presence of a notochord. That would at least circumscribe the Chordata. However, even the notochord would be a less fundamental criterion than aspects of embryological development and symmetry or perhaps Bauplan. Despite this, the concept of invertebrates as a taxon of animals has persisted for over a century among the laity, and within the zoological community and in its literature it remains in use as a term of convenience for animals that are not members of the Vertebrata. The following text reflects earlier scientific understanding of the term and of those animals which have constituted it. According to this understanding, invertebrates do not possess a skeleton of bone, either internal or external. They include hugely varied body plans. Many have fluid-filled, hydrostatic skeletons, like jellyfish or worms. Others have hard exoskeletons, outer shells like those of insects and crustaceans. The most familiar invertebrates include the Protozoa, Porifera, Coelenterata, Platyhelminthes, Nematoda, Annelida, Echinodermata, Mollusca and Arthropoda. Arthropoda include insects, crustaceans and arachnids.

Number of extant species

By far the largest number of described invertebrate species are insects. The following table lists the number of described extant species for major invertebrate groups as estimated in the IUCN Red List of Threatened Species, 2014.3. The IUCN estimates that 66,178 extant vertebrate species have been described, which means that over 95% of the described animal species in the world are invertebrates.

Characteristics

The trait that is common to all invertebrates is the absence of a vertebral column (backbone): this creates a distinction between invertebrates and vertebrates.
The distinction is one of convenience only; it is not based on any clear biologically homologous trait, any more than the common trait of having wings functionally unites insects, bats, and birds, or than not having wings unites tortoises, snails and sponges. Being animals, invertebrates are heterotrophs, and require sustenance in the form of the consumption of other organisms. With a few exceptions, such as the Porifera, invertebrates generally have bodies composed of differentiated tissues. There is also typically a digestive chamber with one or two openings to the exterior. The body plans of most multicellular organisms exhibit some form of symmetry, whether radial, bilateral, or spherical. A minority, however, exhibit no symmetry. One example of asymmetric invertebrates is the gastropods: asymmetry occurs in all gastropod species. This is easily seen in snails and sea snails, which have helical shells. Slugs appear externally symmetrical, but their pneumostome (breathing hole) is located on the right side. Other gastropods develop external asymmetry, such as Glaucus atlanticus, which develops asymmetrical cerata as it matures. The origin of gastropod asymmetry is a subject of scientific debate. Other examples of asymmetry are found in fiddler crabs and hermit crabs. They often have one claw much larger than the other. If a male fiddler loses its large claw, it will grow another on the opposite side after moulting. Further asymmetrical animals include sessile sponges and coral colonies (with the exception of the individual polyps, which exhibit radial symmetry); Alpheidae claws that lack pincers; and some copepods, polyopisthocotyleans, and monogeneans, which parasitize their fish hosts by attachment or residency within the gill chamber. Invertebrate neurons differ from mammalian cells. Invertebrate cells fire in response to stimuli similar to those in mammals, such as tissue trauma, high temperature, or changes in pH.
The first invertebrate in which a neuron cell was identified was the medicinal leech, Hirudo medicinalis. Learning and memory using nociceptors have been described in the sea hare, Aplysia. Mollusk neurons are able to detect increasing pressures and tissue trauma. Neurons have been identified in a wide range of invertebrate species, including annelids, molluscs, nematodes and arthropods. One type of invertebrate respiratory system is the open respiratory system, composed of spiracles, tracheae, and tracheoles, which terrestrial arthropods use to transport metabolic gases to and from tissues. The distribution of spiracles can vary greatly among the many orders of insects, but in general each segment of the body can have only one pair of spiracles, each of which connects to an atrium and has a relatively large tracheal tube behind it. The tracheae are invaginations of the cuticular exoskeleton that branch (anastomose) throughout the body with diameters from only a few micrometres up to 0.8 mm. The smallest tubes, tracheoles, penetrate cells and serve as sites of diffusion for water, oxygen, and carbon dioxide. Gas may be conducted through the respiratory system by means of active ventilation or passive diffusion. Unlike vertebrates, insects do not generally carry oxygen in their haemolymph. A tracheal tube may contain ridge-like circumferential rings of taenidia in various geometries such as loops or helices. In the head, thorax, or abdomen, tracheae may also be connected to air sacs. Many insects, such as grasshoppers and bees, which actively pump the air sacs in their abdomen, are able to control the flow of air through their body. In some aquatic insects, the tracheae exchange gas through the body wall directly, in the form of a gill, or function essentially as normal, via a plastron. Despite being internal, the tracheae of arthropods are shed during moulting (ecdysis).
Only vertebrate animals have ears, though many invertebrates detect sound using other kinds of sense organs. In insects, tympanal organs are used to hear distant sounds. They are located either on the head or elsewhere, depending on the insect family. The tympanal organs of some insects are extremely sensitive, offering acute hearing beyond that of most other animals. The female cricket fly Ormia ochracea has tympanal organs on each side of her abdomen. They are connected by a thin bridge of exoskeleton and they function like a tiny pair of eardrums, but, because they are linked, they provide acute directional information. The fly uses her "ears" to detect the call of her host, a male cricket. Depending on where the song of the cricket is coming from, the fly's hearing organs will reverberate at slightly different frequencies. This difference may be as little as 50 billionths of a second, but it is enough to allow the fly to home in directly on a singing male cricket and parasitise it. Simpler structures allow other arthropods to detect near-field sounds. Spiders and cockroaches, for example, have hairs on their legs, which are used for detecting sound. Caterpillars may also have hairs on their body that perceive vibrations and allow them to respond to sound. Like vertebrates, most invertebrates reproduce at least partly through sexual reproduction. They produce specialized reproductive cells that undergo meiosis to produce smaller, motile spermatozoa or larger, non-motile ova. These fuse to form zygotes, which develop into new individuals. Others are capable of asexual reproduction, or sometimes, both methods of reproduction. Extensive research with model invertebrate species such as Drosophila melanogaster and Caenorhabditis elegans has contributed much to our understanding of meiosis and reproduction. However, beyond the few model systems, the modes of reproduction found in invertebrates show incredible diversity.
In one extreme example, it is estimated that 10% of oribatid mite species have persisted without sexual reproduction and have reproduced asexually for more than 400 million years. Invertebrates have an extremely diverse array of reproductive systems; the only commonality may be that they all lay eggs. Also, aside from cephalopods and arthropods, nearly all other invertebrates exhibit external fertilization. Social behavior is widespread in invertebrates, including in cockroaches, termites, aphids, thrips, ants, bees, Passalidae, Acari, spiders, and more. Social interaction is particularly salient in eusocial species but applies to other invertebrates as well. Insects recognize information transmitted by other insects. The term invertebrates covers several phyla. One of these is the sponges (Porifera). They were long thought to have diverged from other animals early. They lack the complex organization found in most other phyla. Their cells are differentiated, but in most cases not organized into distinct tissues. Sponges typically feed by drawing in water through pores. Some speculate that sponges are not so primitive, but may instead be secondarily simplified. The Ctenophora and the Cnidaria, which includes sea anemones, corals, and jellyfish, are radially symmetric and have digestive chambers with a single opening, which serves as both the mouth and the anus. Both have distinct tissues, but they are not organized into organs. There are only two main germ layers, the ectoderm and endoderm, with only scattered cells between them. As such, they are sometimes called diploblastic. The Echinodermata are radially symmetric and exclusively marine, including starfish (Asteroidea), sea urchins (Echinoidea), brittle stars (Ophiuroidea), sea cucumbers (Holothuroidea) and feather stars (Crinoidea). The largest animal phylum is also included within invertebrates: the Arthropoda, including insects, spiders, crabs, and their kin.
All these organisms have a body divided into repeating segments, typically with paired appendages. In addition, they possess a hardened exoskeleton that is periodically shed during growth. Two smaller phyla, the Onychophora and Tardigrada, are close relatives of the arthropods and share some traits with them, excluding the hardened exoskeleton. The Nematoda, or roundworms, are perhaps the second largest animal phylum, and are also invertebrates. Roundworms are typically microscopic, and occur in nearly every environment where there is water. A number are important parasites. Smaller phyla related to them are the Kinorhyncha, Priapulida, and Loricifera. These groups have a reduced coelom, called a pseudocoelom. Other invertebrates include the Nemertea, or ribbon worms, and the Sipuncula. Another phylum is Platyhelminthes, the flatworms. These were originally considered primitive, but it now appears they developed from more complex ancestors. Flatworms are acoelomates, lacking a body cavity, as are their closest relatives, the microscopic Gastrotricha. The Rotifera, or rotifers, are common in aqueous environments. Invertebrates also include the Acanthocephala, or spiny-headed worms, the Gnathostomulida, Micrognathozoa, and the Cycliophora. Also included are two of the most successful animal phyla, the Mollusca and Annelida. The former, which is the second-largest animal phylum by number of described species, includes animals such as snails, clams, and squids, and the latter comprises the segmented worms, such as earthworms and leeches. These two groups have long been considered close relatives because of the common presence of trochophore larvae, but the annelids were considered closer to the arthropods because they are both segmented. Now, this is generally considered convergent evolution, owing to many morphological and genetic differences between the two phyla. 
Among lesser phyla of invertebrates are the Hemichordata, or acorn worms, and the Chaetognatha, or arrow worms. Other phyla include Acoelomorpha, Brachiopoda, Bryozoa, Entoprocta, Phoronida, and Xenoturbellida.

Classification

Invertebrates can be classified into several main categories, some of which are taxonomically obsolescent or debatable, but still used as terms of convenience. Each, however, appears in its own article. An informal classification of invertebrates into macroinvertebrates, which can be seen with the naked eye, and microinvertebrates, which cannot be seen by the unaided eye, is used for convenience. The line is arbitrarily drawn at a length of about 1 mm.

History

The earliest animal fossils are of invertebrates. 665-million-year-old fossils in the Trezona Formation at Trezona Bore, West Central Flinders, South Australia have been interpreted as being early sponges. Some paleontologists suggest that animals appeared much earlier, possibly as early as 1 billion years ago, though they probably became multicellular in the Tonian. Trace fossils such as tracks and burrows found in the late Neoproterozoic Era indicate the presence of triploblastic worms, roughly as large (about 5 mm wide) and complex as earthworms. Around 453 MYA, animals began diversifying, and many of the important groups of invertebrates diverged from one another. Fossils of invertebrates are found in various types of sediment from the Phanerozoic. Fossils of invertebrates are commonly used in stratigraphy. Carl Linnaeus divided these animals into only two groups, the Insecta and the now-obsolete Vermes (worms).
Jean-Baptiste Lamarck, who was appointed to the position of "Curator of Insecta and Vermes" at the Muséum National d'Histoire Naturelle in 1793, both coined the term "invertebrate" to describe such animals and divided the original two groups into ten, by splitting Arachnida and Crustacea from the Linnean Insecta, and Mollusca, Annelida, Cirripedia, Radiata, Coelenterata and Infusoria from the Linnean Vermes. They are now classified into over 30 phyla, from simple organisms such as sea sponges and flatworms to complex animals such as arthropods and molluscs. Invertebrates are animals without a vertebral column. This has led to the conclusion that invertebrates are a group that deviates from the normal, vertebrates. This has been said to be because researchers in the past, such as Lamarck, viewed vertebrates as a "standard": in Lamarck's theory of evolution, he believed that characteristics acquired through the evolutionary process involved not only survival, but also progression toward a "higher form", to which humans and vertebrates were closer than invertebrates were. Although goal-directed evolution has been abandoned, the distinction of invertebrates and vertebrates persists to this day, even though the grouping has been noted to be "hardly natural or even very sharp." Another reason cited for this continued distinction is that Lamarck created a precedent through his classifications which is now difficult to escape from. It is also possible that some humans believe that, they themselves being vertebrates, the group deserves more attention than invertebrates. In any event, in the 1968 edition of Invertebrate Zoology, it is noted that "division of the Animal Kingdom into vertebrates and invertebrates is artificial and reflects human bias in favor of man's own relatives." The book also points out that the group lumps a vast number of species together, so that no one characteristic describes all invertebrates. 
In addition, some species included are only remotely related to one another, with some more related to vertebrates than other invertebrates (see Paraphyly).

In research

For many centuries, invertebrates were neglected by biologists, in favor of big vertebrates and "useful" or charismatic species. Invertebrate biology was not a major field of study until the work of Linnaeus and Lamarck in the 18th century. During the 20th century, invertebrate zoology became one of the major fields of natural sciences, with prominent discoveries in the fields of medicine, genetics, palaeontology, and ecology. The study of invertebrates has also benefited law enforcement, as arthropods, and especially insects, were discovered to be a source of information for forensic investigators. Two of the most commonly studied model organisms nowadays are invertebrates: the fruit fly Drosophila melanogaster and the nematode Caenorhabditis elegans. They have long been the most intensively studied model organisms, and were among the first life-forms to be genetically sequenced. This was facilitated by the severely reduced state of their genomes, but many genes, introns, and linkages have been lost. Analysis of the starlet sea anemone genome has emphasised the importance of sponges, placozoans, and choanoflagellates, also being sequenced, in explaining the arrival of 1,500 ancestral genes unique to animals. Invertebrates are also used by scientists in the field of aquatic biomonitoring to evaluate the effects of water pollution and climate change.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_note-:12-288] | [TOKENS: 10515] |
Elon Musk

Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026, Forbes estimates his net worth to be around US$852 billion. Born into a wealthy family in Pretoria, South Africa, Musk emigrated to Canada in 1989; he holds Canadian citizenship because his mother was born there. He received bachelor's degrees in 1997 from the University of Pennsylvania before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002. Musk also became an American citizen in 2002. In 2002, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research, but later left; growing discontent with the organization's direction and its leadership in the AI boom of the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, a Tesla pay package worth $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Musk was the largest donor in the 2024 U.S. presidential election, where he supported Donald Trump.
After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published in 2025 and 2026 and became a topic of worldwide debate.

Early life

Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa. Musk therefore holds both South African and Canadian citizenship from birth. His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer, who partly owned a rental lodge at Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa.
Despite both Elon and Errol previously stating that Errol was a part owner of a Zambian emerald mine, in 2023, Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party and has said that his children shared their father's dislike of apartheid. After his parents divorced in 1979, Elon, aged around 9, chose to live with his father because Errol Musk had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies" where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten severely, leading to him being hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?" Elon was an enthusiastic reader of books, and had attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself how to program from the VIC-20 user manual. At age twelve, Elon sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, where he graduated. Musk was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification. 
Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, as well as to ease his path to immigration to the United States. While waiting for his application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill. In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997 – a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School. He reportedly hosted large, ticketed house parties to help pay for tuition, and wrote a business plan for an electronic book-scanning service similar to Google Books. In 1994, Musk held two internships in Silicon Valley: one at energy storage startup Pinnacle Research Institute, which investigated electrolytic supercapacitors for energy storage, and another at Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided to join the Internet boom of the 1990s, applying for a job at Netscape, to which he reportedly never received a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa transitioned to an H1-B. According to numerous former business associates and shareholders, Musk said he was on a student visa at the time. 
Business career

In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors. They housed the venture at a small rented office in Palo Alto. Replying to Rolling Stone, Musk denied the notion that they started their company with funds borrowed from Errol Musk, but in a tweet, he acknowledged that his father contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune, and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590,000,000 in 2025), and Musk received $22 million (equivalent to $43,000,000 in 2025) for his 7-percent share. In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and, in its initial months of operation, over 200,000 customers joined the service. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with online bank Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign.
Due to resulting technological issues and lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000.[b] Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2,700,000,000 in 2025) in stock, of which Musk—the largest shareholder with 11.72% of shares—received $175.8 million (equivalent to $320,000,000 in 2025). In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value. In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth-chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune, (equivalent to $180,000,000 in 2025) Musk founded SpaceX in May 2002 and became the company's CEO and Chief Engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, it was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2,400,000,000 in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, in 2015 SpaceX successfully landed the first stage of a Falcon 9 on a land platform. 
Later landings were achieved on autonomous spaceport drone ships, an ocean-based recovery platform. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, the Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million (equivalent to $865,000,000 in 2025) contract to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan. In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large constellation was deployed in May 2019. As of May 2025[update], over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 to be $10 billion (equivalent to $12,000,000,000 in 2025).[c] During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440,000,000 in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response. Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement. 
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11,000,000 in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm.[page needed] Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass-production all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several well-selling electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In 2018, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over him tweeting that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in summer 2020, and it entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1,200,000,000,000 in 2025), the sixth company in U.S. history to do so. Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second largest provider of solar power systems in the United States.
In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2,700,000,000 in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor. In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence (AI) by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions like spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials. Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trials—which have caused the deaths of some monkeys—have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. 
Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink.[needs update] In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018. It used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021. Local officials have approved further expansions of the tunnel system. In early 2017, Musk expressed interest in buying Twitter and had questioned the platform's commitment to freedom of speech. By 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder.[d] Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter. By the end of April, Musk had successfully concluded his bid for approximately $44 billion, which included approximately $12.5 billion in loans and $21 billion in equity financing. After attempting to back out of the deal, Musk completed the purchase on October 27, 2022.
Immediately after the acquisition, Musk fired several top Twitter executives, including CEO Parag Agrawal, and became CEO himself. Under Musk, Twitter instituted monthly subscriptions for a "blue check" and laid off a significant portion of the company's staff. Musk reduced content moderation, and hate speech increased on the platform after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of the Hunter Biden laptop controversy in the lead-up to the 2020 presidential election. Musk promised to step down as CEO after a Twitter poll, and five months later he did so, transitioning to the roles of executive chairman and chief technology officer (CTO). Despite Musk stepping down as CEO, X continues to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence some of his critics, such as Twitch streamer Asmongold, who criticized him during one of his streams, by removing their accounts' blue checkmarks, which hinders visibility and is considered a form of shadow banning, or by suspending their accounts without justification. Other activities In August 2013, Musk announced plans for a version of a vactrain and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances. In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company, and initially gave $50 million. In 2018, Musk left the OpenAI board.
Since 2018, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings like OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI. Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets, and the consequent fossil fuel usage, has received criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter and temporarily banned the accounts of journalists who posted stories about the incident, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept. In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content while framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies. Politics Musk is an outlier among business leaders, who typically avoid partisan political advocacy. Musk was a registered independent voter when he lived in California.
Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 Texas's 34th congressional district special election. In 2024, he started supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income; he also endorsed Kanye West's 2020 presidential campaign. In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign and hosting DeSantis's campaign announcement on a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips. In October 2025, former vice president Kamala Harris said it was a mistake on the Democratic side not to invite Musk to a White House electric vehicle event organized in August 2021 and featuring executives from General Motors, Ford and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space."
Fortune remarked that this was a nod to the United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation, and suggested that the non-invitation affected Musk's perspective. Fortune noted that, at the time, Musk said, "Yeah, seems odd that Tesla wasn't invited." A month later, he criticized Biden as "not the friendliest administration." Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen and other capitalists actually flourished under Biden, but that the tech leaders chose Trump for their common ground on cultural issues. By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying: "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a campaign rally and promoted conspiracy theories and falsehoods about Democrats, election fraud and immigration in support of Trump. Musk was the largest individual donor of the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers. In 2023, Musk said he shunned the World Economic Forum because it was boring; the organization commented that it had not invited him since 2015. He has, however, participated in Dialog, an event dubbed "Tech Bilderberg" and organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain and the United Kingdom, particularly due to his position in the U.S. government as well as his ownership of X.
An NBC News analysis found he had boosted far-right political movements to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023. During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or a fascist Roman salute.[e] He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided on partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries. The concept of DOGE emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, with Musk accepting the offer. In November and December 2024, Musk suggested that the organization could help to cut the U.S. federal budget, consolidate the number of federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role is not clear. 
In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE. A federal judge has ruled that Musk acted as the de facto leader of DOGE. Musk's role in the second Trump administration, particularly in relation to DOGE, has attracted public backlash. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He has prioritized secrecy within the organization and has accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by this time, most of them of children. By November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when his 130-day term as a special government employee expired, with a White House official confirming that Musk's offboarding from the administration was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025. After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit.
A feud began between Musk and Trump, most notably when Musk alleged on X (formerly Twitter) on June 5, 2025, that Trump had ties to sex offender Jeffrey Epstein. Trump responded on Truth Social, stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away, and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk, and added that Musk would face "very serious consequences" if he funded Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have received criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates up until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth taxes, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration, and he regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and he identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially the colonization of Mars, which he argues would make humanity an interplanetary species and lower the risk of human extinction.
Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. While describing himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdowns restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and was described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024. Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was subsequently attacked as being responsible for spreading misinformation and amplifying the far-right. He has also voiced his support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S. 
Securities and Exchange Commission (SEC) for a tweet stating that funding had been secured for potentially taking Tesla private.[f] The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were fined $20 million each, and Musk was forced to step down for three years as Tesla chairman but was able to remain as CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the previous agreement details, including a list of topics about which Musk needed preclearance. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla stock price ("too high imo") violated the agreement. Freedom of Information Act (FOIA)-released records showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting regarding "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded. 
McCormick called the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger's syndrome in a TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ... but then that turned out to be wrong — [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated he uses doctor-prescribed ketamine for occasional depression and that he doses "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine and other drugs. Musk at first refused to comment on his alleged drug use, before responding that he had not tested positive for drugs, and that if drugs somehow improved his productivity, "I would definitely take them!". 
The New York Times' investigations revealed Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concerns from close associates who became troubled by his public behavior as he grew more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label, Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he has stated have a "restoring effect" that helps his "mental calibration". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players, but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings. Musk has justified the boosting by claiming that all top accounts do it, so he has to as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its creator Ubisoft for "woke" content. Musk posted to X that "DEI kills art" and singled out the inclusion of the historical figure Yasuke in the game as offensive; he also called the game "terrible". Ubisoft responded by saying that Musk's comments were "just feeding hatred" and that it was focused on producing a game, not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000.
In 2002, their first child Nevada Musk died of sudden infant death syndrome at the age of 10 weeks. After his death, the couple used in vitro fertilization (IVF) to continue their family; they had twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and have shared custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year. After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters that are not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the pregnancy, Musk confirmed reports that the couple were "semi-separated" in September 2021; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Elon Musk has taken X Æ A-Xii to multiple official events in Washington, D.C. during Trump's second term in office. 
In July 2022, The Wall Street Journal reported that Musk allegedly had an affair with Nicole Shanahan, the wife of Google co-founder Sergey Brin, in 2021, leading to their divorce the following year. Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported that Musk had bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born in 2024 via surrogacy, and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, which media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St. Clair had filed for sole custody of her five-month-old son and for Musk to be recognized as the child's father. On March 31, 2025, Musk wrote that, while he was unsure whether he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] Later reporting from The Wall Street Journal indicated that $1 million of these payments to St. Clair was structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island.
In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place after Epstein later cancelled the plans.[k] On Christmas day in 2012, Musk emailed Epstein asking "Do you have any parties planned? I’ve been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I’m looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein sent an email asking Musk if he had any plans for coming to New York for the opening of the United Nations General Assembly where many "interesting people" would be coming to his house to which Musk responded that "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein responded by stating "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 if he introduced Epstein to Mark Zuckerberg, Musk responded: "I don’t recall introducing Epstein to anyone, as I don’t know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred. 
Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l] Wealth Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. Musk was first listed on the Forbes Billionaires List in 2012; around 75% of his wealth was derived from Tesla stock in November 2020, although he describes himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021; $400 billion in December 2024; $500 billion in October 2025; $600 billion in mid-December 2025; $700 billion later that month; and $800 billion in February 2026. In November 2025, a Tesla pay package worth potentially $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Public image Although his ventures have been highly influential within their separate industries starting in the 2000s, Musk only became a public figure in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions, while also often making controversial statements, contrary to other billionaires who prefer reclusiveness to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance described people's opinions of Musk as polarized due to his "part philosopher, part troll" persona on Twitter. He has drawn denunciation for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to matters like British survivors of grooming gangs.
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president" or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021. Musk was selected as Time's "Person of the Year" for 2021. Then Time editor-in-chief Edward Felsenthal wrote that, "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too."
========================================
[SOURCE: https://en.wikipedia.org/wiki/HD_40657]
HD 40657 is a single star in the equatorial constellation of Orion, near the constellation border with Monoceros. It has an orange hue and is faintly visible to the naked eye with an apparent visual magnitude of 4.52. The star is located at a distance of approximately 333 light years from the Sun based on parallax. It is drifting further away with a radial velocity of +26 km/s. This is an aging giant star with a stellar classification of K1.5III-IIIb CN-1, where the suffix notation indicates an underabundance of cyanogen in the spectrum. Having exhausted the supply of hydrogen at its core, this star cooled and expanded off the main sequence. At present it has 22 times the radius of the Sun. HD 40657 is a suspected variable star with a brightness that has been measured ranging from magnitude 4.54 down to 4.58. It is an estimated 2.27 billion years old with 1.68 times the mass of the Sun and is spinning with a projected rotational velocity of 2.1 km/s. The star is radiating 171 times the Sun's luminosity from its swollen photosphere at an effective temperature of 4,536 K.
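The quoted radius, temperature, and luminosity can be cross-checked against the Stefan–Boltzmann relation, L/L☉ = (R/R☉)² (T/T☉)⁴. A minimal sketch, assuming the IAU nominal solar effective temperature of 5772 K (a value not stated in the article):

```python
# Cross-check of the quoted luminosity via the Stefan-Boltzmann law:
# L/Lsun = (R/Rsun)**2 * (T/Tsun)**4
T_SUN = 5772.0   # K, nominal solar effective temperature (assumed)
radius = 22.0    # stellar radius in solar radii (from the text)
temp = 4536.0    # effective temperature in K (from the text)

luminosity = radius**2 * (temp / T_SUN)**4  # in solar luminosities
print(round(luminosity))  # ≈ 185
```

The result (~185 L☉) is of the same order as the published 171 L☉; the published figure is derived from measured flux and bolometric corrections rather than this idealized relation.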
========================================
[SOURCE: https://en.wikipedia.org/wiki/Mars#cite_note-191]
Mars is the fourth planet from the Sun. It is also known as the "Red Planet", for its orange-red appearance. Mars is a desert-like rocky planet with a tenuous atmosphere that is primarily carbon dioxide (CO2). At the average surface level the atmospheric pressure is a few thousandths of Earth's, atmospheric temperature ranges from −153 to 20 °C (−243 to 68 °F), and cosmic radiation is high. Mars retains some water, in the ground as well as thinly in the atmosphere, forming cirrus clouds, fog, frost, larger polar regions of permafrost and ice caps (with seasonal CO2 snow), but no bodies of liquid surface water. Its surface gravity is roughly a third of Earth's or double that of the Moon. Its diameter, 6,779 km (4,212 mi), is about half the Earth's, or twice the Moon's, and its surface area is the size of all the dry land of Earth. Fine dust is prevalent across the surface and the atmosphere, being picked up and spread at the low Martian gravity even by the weak wind of the tenuous atmosphere. The terrain of Mars roughly follows a north-south divide, the Martian dichotomy, with the northern hemisphere mainly consisting of relatively flat, low lying plains, and the southern hemisphere of cratered highlands. Geologically, the planet is fairly active with marsquakes trembling underneath the ground, but also hosts many enormous volcanoes that are extinct (the tallest is Olympus Mons, 21.9 km or 13.6 mi tall), as well as one of the largest canyons in the Solar System (Valles Marineris, 4,000 km or 2,500 mi long). Mars has two natural satellites that are small and irregular in shape: Phobos and Deimos. With a significant axial tilt of 25 degrees, Mars experiences seasons, like Earth (which has an axial tilt of 23.5 degrees). A Martian solar year is equal to 1.88 Earth years (687 Earth days), a Martian solar day (sol) is equal to 24.6 hours. Mars formed along with the other planets approximately 4.5 billion years ago.
During the Martian Noachian period (4.5 to 3.5 billion years ago), its surface was marked by meteor impacts, valley formation, erosion, the possible presence of water oceans and the loss of its magnetosphere. The Hesperian period (beginning 3.5 billion years ago and ending 3.3–2.9 billion years ago) was dominated by widespread volcanic activity and flooding that carved immense outflow channels. The Amazonian period, which continues to the present, now dominates the planet's geological processes. Because of Mars's geological history, the possibility of past or present life on Mars remains an area of active scientific investigation, with some possible traces needing further examination. Being visible with the naked eye in Earth's sky as a red wandering star, Mars has been observed throughout history, acquiring diverse associations in different cultures. In 1963 the first flight to Mars took place with Mars 1, but communication was lost en route. The first successful flyby exploration of Mars was conducted in 1965 with Mariner 4. In 1971 Mariner 9 entered orbit around Mars, being the first spacecraft to orbit any body other than the Moon, Sun or Earth; following in the same year were the first uncontrolled impact (Mars 2) and first successful landing (Mars 3) on Mars. Probes have been active on Mars continuously since 1997. At times, more than ten probes have simultaneously operated in orbit or on the surface, more than at any other planet beyond Earth. Mars is an often proposed target for future crewed exploration missions, though no such mission is currently planned. Natural history Scientists have theorized that during the Solar System's formation, Mars was created as the result of a random process of run-away accretion of material from the protoplanetary disk that orbited the Sun. Mars has many distinctive chemical features caused by its position in the Solar System.
Elements with comparatively low boiling points, such as chlorine, phosphorus, and sulfur, are much more common on Mars than on Earth; these elements were probably pushed outward by the young Sun's energetic solar wind. After the formation of the planets, the inner Solar System may have been subjected to the so-called Late Heavy Bombardment. About 60% of the surface of Mars shows a record of impacts from that era, whereas much of the remaining surface is probably underlain by immense impact basins caused by those events. However, more recent modeling has disputed the existence of the Late Heavy Bombardment. There is evidence of an enormous impact basin in the Northern Hemisphere of Mars, spanning 10,600 by 8,500 kilometres (6,600 by 5,300 mi), or roughly four times the size of the Moon's South Pole–Aitken basin, which would be the largest impact basin yet discovered if confirmed. It has been hypothesized that the basin was formed when Mars was struck by a Pluto-sized body about four billion years ago. The event, thought to be the cause of the Martian hemispheric dichotomy, created the smooth Borealis basin that covers 40% of the planet. A 2023 study shows evidence, based on the orbital inclination of Deimos (a small moon of Mars), that Mars may once have had a ring system 3.5 billion years to 4 billion years ago. This ring system may have been formed from a moon, 20 times more massive than Phobos, orbiting Mars billions of years ago; and Phobos would be a remnant of that ring. Epochs: The geological history of Mars can be split into many periods, the three primary ones being the Noachian, Hesperian, and Amazonian described above. Geological activity is still taking place on Mars. The Athabasca Valles is home to sheet-like lava flows created about 200 million years ago. Water flows in the grabens called the Cerberus Fossae occurred less than 20 million years ago, indicating equally recent volcanic intrusions. The Mars Reconnaissance Orbiter has captured images of avalanches.
Physical characteristics Mars is approximately half the diameter of Earth or twice that of the Moon, with a surface area only slightly less than the total area of Earth's dry land. Mars is less dense than Earth, having about 15% of Earth's volume and 11% of Earth's mass, resulting in about 38% of Earth's surface gravity. Mars is the only presently known example of a desert planet, a rocky planet with a surface akin to that of Earth's deserts. The red-orange appearance of the Martian surface is caused by iron(III) oxide (nanophase Fe2O3) and the iron(III) oxide-hydroxide mineral goethite. It can look like butterscotch; other common surface colors include golden, brown, tan, and greenish, depending on the minerals present. Like Earth, Mars is differentiated into a dense metallic core overlaid by less dense rocky layers. The outermost layer is the crust, which is on average about 42–56 kilometres (26–35 mi) thick, with a minimum thickness of 6 kilometres (3.7 mi) in Isidis Planitia, and a maximum thickness of 117 kilometres (73 mi) in the southern Tharsis plateau. For comparison, Earth's crust averages 27.3 ± 4.8 km in thickness. The most abundant elements in the Martian crust are silicon, oxygen, iron, magnesium, aluminum, calcium, and potassium. Mars is confirmed to be seismically active; in 2019, it was reported that InSight had detected and recorded over 450 marsquakes and related events. Beneath the crust is a silicate mantle responsible for many of the tectonic and volcanic features on the planet's surface. The upper Martian mantle is a low-velocity zone, where the velocity of seismic waves is lower than surrounding depth intervals. The mantle appears to be rigid down to the depth of about 250 km, giving Mars a very thick lithosphere compared to Earth. Below this the mantle gradually becomes more ductile, and the seismic wave velocity starts to grow again. 
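The mass, volume, and gravity ratios quoted above are mutually consistent: with g ∝ M/r² and r ∝ V^(1/3), a quick sketch using only the figures from the text recovers the ~38% surface gravity:

```python
# Consistency check: surface gravity ratio from mass and volume ratios,
# using g ∝ M / r**2 and r ∝ V**(1/3).
mass_ratio = 0.11     # Mars/Earth mass (from the text)
volume_ratio = 0.15   # Mars/Earth volume (from the text)

radius_ratio = volume_ratio ** (1 / 3)
gravity_ratio = mass_ratio / radius_ratio**2
print(f"{gravity_ratio:.0%}")  # ≈ 39%, matching the quoted ~38%
```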
The Martian mantle does not appear to have a thermally insulating layer analogous to Earth's lower mantle; instead, below 1050 km in depth, it becomes mineralogically similar to Earth's transition zone. At the bottom of the mantle lies a basal liquid silicate layer approximately 150–180 km thick. The Martian mantle appears to be highly heterogeneous, with dense fragments up to 4 km across, likely injected deep into the planet by colossal impacts ~4.5 billion years ago; high-frequency waves from eight marsquakes slowed as they passed these localized regions, and modeling indicates the heterogeneities are compositionally distinct debris preserved because Mars lacks plate tectonics and has a sluggishly convecting interior that prevents complete homogenization. Mars's iron and nickel core is at least partially molten, and may have a solid inner core. It is around half of Mars's radius, approximately 1650–1675 km, and is enriched in light elements such as sulfur, oxygen, carbon, and hydrogen. The temperature of the core is estimated to be 2000–2400 K, compared to 5400–6230 K for Earth's solid inner core. In 2025, based on data from the InSight lander, a group of researchers reported the detection of a solid inner core 613 kilometres (381 mi) ± 67 kilometres (42 mi) in radius. Mars is a terrestrial planet with a surface that consists of minerals containing silicon and oxygen, metals, and other elements that typically make up rock. The Martian surface is primarily composed of tholeiitic basalt, although parts are more silica-rich than typical basalt and may be similar to andesitic rocks on Earth, or silica glass. Regions of low albedo suggest concentrations of plagioclase feldspar, with northern low albedo regions displaying higher than normal concentrations of sheet silicates and high-silicon glass. Parts of the southern highlands include detectable amounts of high-calcium pyroxenes. Localized concentrations of hematite and olivine have been found.
Much of the surface is deeply covered by finely grained iron(III) oxide dust. The Phoenix lander returned data showing Martian soil to be slightly alkaline and containing elements such as magnesium, sodium, potassium and chlorine. These nutrients are found in soils on Earth, and are necessary for plant growth. Experiments performed by the lander showed that the Martian soil has a basic pH of 7.7, and contains 0.6% perchlorate by weight, concentrations that are toxic to humans. Streaks are common across Mars and new ones appear frequently on steep slopes of craters, troughs, and valleys. The streaks are dark at first and get lighter with age. The streaks can start in a tiny area, then spread out for hundreds of metres. They have been seen to follow the edges of boulders and other obstacles in their path. The commonly accepted hypotheses include that they are dark underlying layers of soil revealed after avalanches of bright dust or dust devils. Several other explanations have been put forward, including those that involve water or even the growth of organisms. Environmental radiation levels on the surface average 0.64 millisieverts per day, significantly less than the 1.84 millisieverts per day or 22 millirads per day experienced during the flight to and from Mars. For comparison, the radiation levels in low Earth orbit, where Earth's space stations orbit, are around 0.5 millisieverts per day. Hellas Planitia has the lowest surface radiation at about 0.342 millisieverts per day, featuring lava tubes southwest of Hadriacus Mons with levels potentially as low as 0.064 millisieverts per day, comparable to radiation levels during flights on Earth. Although Mars has no evidence of a structured global magnetic field, observations show that parts of the planet's crust have been magnetized, suggesting that alternating polarity reversals of its dipole field have occurred in the past.
This paleomagnetism of magnetically susceptible minerals is similar to the alternating bands found on Earth's ocean floors. One hypothesis, published in 1999 and re-examined in October 2005 (with the help of the Mars Global Surveyor), is that these bands suggest plate tectonic activity on Mars four billion years ago, before the planetary dynamo ceased to function and the planet's magnetic field faded. Geography and features Although better remembered for mapping the Moon, Johann Heinrich von Mädler and Wilhelm Beer were the first areographers. They began by establishing that most of Mars's surface features were permanent and by more precisely determining the planet's rotation period. In 1840, Mädler combined ten years of observations and drew the first map of Mars. Features on Mars are named from a variety of sources. Albedo features are named for classical mythology. Craters larger than roughly 50 km are named for deceased scientists and writers and others who have contributed to the study of Mars. Smaller craters are named for towns and villages of the world with populations of less than 100,000. Large valleys are named for the word "Mars" or "star" in various languages; smaller valleys are named for rivers. Large albedo features retain many of the older names but are often updated to reflect new knowledge of the nature of the features. For example, Nix Olympica (the snows of Olympus) has become Olympus Mons (Mount Olympus). The surface of Mars as seen from Earth is divided into two kinds of areas, with differing albedo. The paler plains covered with dust and sand rich in reddish iron oxides were once thought of as Martian "continents" and given names like Arabia Terra (land of Arabia) or Amazonis Planitia (Amazonian plain). The dark features were thought to be seas, hence their names Mare Erythraeum, Mare Sirenum and Aurorae Sinus. The largest dark feature seen from Earth is Syrtis Major Planum. The permanent northern polar ice cap is named Planum Boreum. 
The southern cap is called Planum Australe. Mars's equator is defined by its rotation, but the location of its Prime Meridian was specified, as was Earth's (at Greenwich), by choice of an arbitrary point; Mädler and Beer selected a line for their first maps of Mars in 1830. After the spacecraft Mariner 9 provided extensive imagery of Mars in 1972, a small crater (later called Airy-0), located in the Sinus Meridiani ("Middle Bay" or "Meridian Bay"), was chosen by Merton E. Davies, Harold Masursky, and Gérard de Vaucouleurs for the definition of 0.0° longitude to coincide with the original selection. Because Mars has no oceans, and hence no "sea level", a zero-elevation surface had to be selected as a reference level; this is called the areoid of Mars, analogous to the terrestrial geoid. Zero altitude was defined by the height at which there is 610.5 Pa (6.105 mbar) of atmospheric pressure. This pressure corresponds to the triple point of water, and it is about 0.6% of the sea level surface pressure on Earth (0.006 atm). For mapping purposes, the United States Geological Survey divides the surface of Mars into thirty cartographic quadrangles, each named for a classical albedo feature it contains. In April 2023, The New York Times reported an updated global map of Mars based on images from the Hope spacecraft. A related, but much more detailed, global Mars map was released by NASA on 16 April 2023. The vast upland region Tharsis contains several massive volcanoes, which include the shield volcano Olympus Mons. The edifice is over 600 km (370 mi) wide. Because the mountain is so large, with complex structure at its edges, giving a definite height to it is difficult. Its local relief, from the foot of the cliffs which form its northwest margin to its peak, is over 21 km (13 mi), a little over twice the height of Mauna Kea as measured from its base on the ocean floor. 
The total elevation change from the plains of Amazonis Planitia, over 1,000 km (620 mi) to the northwest, to the summit approaches 26 km (16 mi), roughly three times the height of Mount Everest, which in comparison stands at just over 8.8 kilometres (5.5 mi). Consequently, Olympus Mons is either the tallest or second-tallest mountain in the Solar System; the only known mountain which might be taller is the Rheasilvia peak on the asteroid Vesta, at 20–25 km (12–16 mi). The dichotomy of Martian topography is striking: northern plains flattened by lava flows contrast with the southern highlands, pitted and cratered by ancient impacts. It is possible that, four billion years ago, the Northern Hemisphere of Mars was struck by an object one-tenth to two-thirds the size of Earth's Moon. If this is the case, the Northern Hemisphere of Mars would be the site of an impact crater 10,600 by 8,500 kilometres (6,600 by 5,300 mi) in size, or roughly the area of Europe, Asia, and Australia combined, surpassing Utopia Planitia and the Moon's South Pole–Aitken basin as the largest impact crater in the Solar System. Mars is scarred by 43,000 impact craters with a diameter of 5 kilometres (3.1 mi) or greater. The largest exposed crater is Hellas, which is 2,300 kilometres (1,400 mi) wide and 7,000 metres (23,000 ft) deep, and is a light albedo feature clearly visible from Earth. There are other notable impact features, such as Argyre, which is around 1,800 kilometres (1,100 mi) in diameter, and Isidis, which is around 1,500 kilometres (930 mi) in diameter. Due to the smaller mass and size of Mars, the probability of an object colliding with the planet is about half that of Earth. Mars is located closer to the asteroid belt, so it has an increased chance of being struck by materials from that source. Mars is more likely to be struck by short-period comets, i.e., those that lie within the orbit of Jupiter. 
Martian craters can have a morphology that suggests the ground became wet after the meteor impact. The large canyon, Valles Marineris (Latin for 'Mariner Valleys', also known as Agathodaemon in the old canal maps), has a length of 4,000 kilometres (2,500 mi) and a depth of up to 7 kilometres (4.3 mi). The length of Valles Marineris is equivalent to the length of Europe and extends across one-fifth the circumference of Mars. By comparison, the Grand Canyon on Earth is only 446 kilometres (277 mi) long and nearly 2 kilometres (1.2 mi) deep. Valles Marineris was formed due to the swelling of the Tharsis area, which caused the crust in the area of Valles Marineris to collapse. In 2012, it was proposed that Valles Marineris is not just a graben, but a plate boundary where 150 kilometres (93 mi) of transverse motion has occurred, making Mars a planet with possibly a two-tectonic plate arrangement. Images from the Thermal Emission Imaging System (THEMIS) aboard NASA's Mars Odyssey orbiter have revealed seven possible cave entrances on the flanks of the volcano Arsia Mons. The caves, named after loved ones of their discoverers, are collectively known as the "seven sisters". Cave entrances measure from 100 to 252 metres (328 to 827 ft) wide and they are estimated to be at least 73 to 96 metres (240 to 315 ft) deep. Because light does not reach the floor of most of the caves, they may extend much deeper than these lower estimates and widen below the surface. "Dena" is the only exception; its floor is visible and was measured to be 130 metres (430 ft) deep. The interiors of these caverns may be protected from micrometeoroids, UV radiation, solar flares and high energy particles that bombard the planet's surface. Martian geysers (or CO2 jets) are putative sites of small gas and dust eruptions that occur in the south polar region of Mars during the spring thaw.
"Dark dune spots" and "spiders" – or araneiforms – are the two most visible types of features ascribed to these eruptions. Similarly sized dust will settle from the thinner Martian atmosphere sooner than it would on Earth. For example, the dust suspended by the 2001 global dust storms on Mars only remained in the Martian atmosphere for 0.6 years, while the dust from Mount Pinatubo took about two years to settle. However, under current Martian conditions, the mass movements involved are generally much smaller than on Earth. Even the 2001 global dust storms on Mars moved only the equivalent of a very thin dust layer – about 3 μm thick if deposited with uniform thickness between 58° north and south of the equator. Dust deposition at the two rover sites has proceeded at a rate of about the thickness of a grain every 100 sols. Atmosphere Mars lost its magnetosphere 4 billion years ago, possibly because of numerous asteroid strikes, so the solar wind interacts directly with the Martian ionosphere, lowering the atmospheric density by stripping away atoms from the outer layer. Both Mars Global Surveyor and Mars Express have detected ionized atmospheric particles trailing off into space behind Mars, and this atmospheric loss is being studied by the MAVEN orbiter. Compared to Earth, the atmosphere of Mars is quite rarefied. Atmospheric pressure on the surface today ranges from a low of 30 Pa (0.0044 psi) on Olympus Mons to over 1,155 Pa (0.1675 psi) in Hellas Planitia, with a mean pressure at the surface level of 600 Pa (0.087 psi). The highest atmospheric density on Mars is equal to that found 35 kilometres (22 mi) above Earth's surface. The resulting mean surface pressure is only 0.6% of Earth's 101.3 kPa (14.69 psi). The scale height of the atmosphere is about 10.8 kilometres (6.7 mi), which is higher than Earth's 6 kilometres (3.7 mi), because the surface gravity of Mars is only about 38% of Earth's. 
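The quoted scale height can be reproduced from the isothermal approximation H = RT/(Mg). A sketch assuming a mean atmospheric temperature of about 210 K and a pure-CO2 atmosphere (both assumptions, not values from the text):

```python
# Sketch: isothermal atmospheric scale height H = R*T / (M*g).
R_GAS = 8.314     # J/(mol*K), universal gas constant
M_CO2 = 0.04401   # kg/mol, molar mass of CO2 (the dominant constituent)
g_mars = 3.71     # m/s^2, Martian surface gravity (~38% of Earth's)
T_mean = 210.0    # K, assumed mean atmospheric temperature

H = R_GAS * T_mean / (M_CO2 * g_mars)
print(round(H / 1000, 1))  # ≈ 10.7 km, close to the quoted 10.8 km
```

The heavier CO2 molecule would by itself shrink the scale height relative to Earth's nitrogen-oxygen air; the lower gravity more than compensates, which is why Mars ends up with the larger value.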
The atmosphere of Mars consists of about 96% carbon dioxide, 1.93% argon and 1.89% nitrogen along with traces of oxygen and water. The atmosphere is quite dusty, containing particulates about 1.5 μm in diameter which give the Martian sky a tawny color when seen from the surface. It may take on a pink hue due to iron oxide particles suspended in it. Despite repeated detections of methane on Mars, there is no scientific consensus as to its origin. One suggestion is that methane exists on Mars and that its concentration fluctuates seasonally. The methane could be produced by non-biological processes such as serpentinization involving water, carbon dioxide, and the mineral olivine, which is known to be common on Mars, or by Martian life. Mars's higher concentration of atmospheric CO2 and lower surface pressure, compared to Earth's, may explain why sound is attenuated more strongly on Mars, where natural sound sources are rare apart from the wind. Using acoustic recordings collected by the Perseverance rover, researchers concluded that the speed of sound there is approximately 240 m/s for frequencies below 240 Hz, and 250 m/s for those above. Auroras have been detected on Mars. Because Mars lacks a global magnetic field, the types and distribution of auroras there differ from those on Earth; rather than being mostly restricted to polar regions as is the case on Earth, a Martian aurora can encompass the planet. In September 2017, NASA reported radiation levels on the surface of the planet Mars were temporarily doubled, and were associated with an aurora 25 times brighter than any observed earlier, due to a massive, and unexpected, solar storm in the middle of the month. Mars has seasons, alternating between its northern and southern hemispheres, much as on Earth.
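The Perseverance sound-speed figures above are close to what the ideal-gas expression c = √(γRT/M) predicts for a CO2 atmosphere. A sketch with an assumed adiabatic index and near-surface temperature (neither stated in the text):

```python
import math

# Sketch: ideal-gas speed of sound c = sqrt(gamma * R * T / M) in CO2.
GAMMA_CO2 = 1.3   # adiabatic index of CO2 (approximate, assumed)
R_GAS = 8.314     # J/(mol*K)
M_CO2 = 0.04401   # kg/mol
T = 240.0         # K, assumed near-surface daytime temperature

c = math.sqrt(GAMMA_CO2 * R_GAS * T / M_CO2)
print(round(c))  # ≈ 243 m/s, within the 240-250 m/s range Perseverance measured
```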
Additionally, the orbit of Mars has, compared to Earth's, a large eccentricity and approaches perihelion when it is summer in its southern hemisphere and winter in its northern, and aphelion when it is winter in its southern hemisphere and summer in its northern. As a result, the seasons in its southern hemisphere are more extreme and the seasons in its northern are milder than would otherwise be the case. The summer temperatures in the south can be warmer than the equivalent summer temperatures in the north by up to 30 °C (54 °F). Martian surface temperatures vary from lows of about −110 °C (−166 °F) to highs of up to 35 °C (95 °F) in equatorial summer. The wide range in temperatures is due to the thin atmosphere which cannot store much solar heat, the low atmospheric pressure (about 1% that of the atmosphere of Earth), and the low thermal inertia of Martian soil. The planet is 1.52 times as far from the Sun as Earth, resulting in just 43% of the amount of sunlight. Mars has the largest dust storms in the Solar System, reaching speeds of over 160 km/h (100 mph). These can vary from a storm over a small area to gigantic storms that cover the entire planet. They tend to occur when Mars is closest to the Sun, and have been shown to increase global temperature. The seasons also deposit dry ice on the polar ice caps. Hydrology Although Mars contains substantial amounts of water, most of it is dust-covered water ice at the Martian polar ice caps. The volume of water ice in the south polar ice cap, if melted, would be enough to cover most of the surface of the planet with a depth of 11 metres (36 ft). Water in its liquid form cannot persist on the surface due to Mars's low atmospheric pressure, which is less than 1% that of Earth. Only at the lowest of elevations are the pressure and temperature high enough for liquid water to exist for short periods.
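The 43% sunlight figure follows directly from the inverse-square law applied to the 1.52 au orbital distance:

```python
# Inverse-square law: solar flux scales as 1 / distance**2.
distance_au = 1.52  # Mars's mean distance from the Sun in Earth-distances (from the text)

fraction = 1.0 / distance_au**2
print(f"{fraction:.0%}")  # ≈ 43% of the sunlight Earth receives
```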
Although little water is present in the atmosphere, there is enough to produce clouds of water ice and occasional snow and frost, often mixed with snow of carbon dioxide (dry ice). Landforms visible on Mars strongly suggest that liquid water has existed on the planet's surface. Huge linear swathes of scoured ground, known as outflow channels, cut across the surface in about 25 places. These are thought to be a record of erosion caused by the catastrophic release of water from subsurface aquifers, though some of these structures have been hypothesized to result from the action of glaciers or lava. One of the larger examples, Ma'adim Vallis, is 700 kilometres (430 mi) long, much greater than the Grand Canyon, with a width of 20 kilometres (12 mi) and a depth of 2 kilometres (1.2 mi) in places. It is thought to have been carved by flowing water early in Mars's history. The youngest of these channels is thought to have formed only a few million years ago. Elsewhere, particularly on the oldest areas of the Martian surface, finer-scale, dendritic networks of valleys are spread across significant proportions of the landscape. Features of these valleys and their distribution strongly imply that they were carved by runoff resulting from precipitation in early Mars history. Subsurface water flow and groundwater sapping may play important subsidiary roles in some networks, but precipitation was probably the root cause of the incision in almost all cases. Along craters and canyon walls, there are thousands of features that appear similar to terrestrial gullies. The gullies tend to be in the highlands of the Southern Hemisphere and face the Equator; all are poleward of 30° latitude. A number of authors have suggested that their formation process involves liquid water, probably from melting ice, although others have argued for formation mechanisms involving carbon dioxide frost or the movement of dry dust.
No partially degraded gullies have formed by weathering and no superimposed impact craters have been observed, indicating that these are young features, possibly still active. Other geological features, such as deltas and alluvial fans preserved in craters, are further evidence for warmer, wetter conditions at an interval or intervals in earlier Mars history. Such conditions necessarily require the widespread presence of crater lakes across a large proportion of the surface, for which there is independent mineralogical, sedimentological and geomorphological evidence. Further evidence that liquid water once existed on the surface of Mars comes from the detection of specific minerals such as hematite and goethite, both of which sometimes form in the presence of water. The chemical signature of water vapor on Mars was first unequivocally demonstrated in 1963 by spectroscopy using an Earth-based telescope. In 2004, Opportunity detected the mineral jarosite. This forms only in the presence of acidic water, showing that water once existed on Mars. The Spirit rover found concentrated deposits of silica in 2007 that indicated wet conditions in the past, and in December 2011, the mineral gypsum, which also forms in the presence of water, was found on the surface by NASA's Mars rover Opportunity. It is estimated that the amount of water in the upper mantle of Mars, represented by hydroxyl ions contained within Martian minerals, is equal to or greater than that of Earth at 50–300 parts per million of water, which is enough to cover the entire planet to a depth of 200–1,000 metres (660–3,280 ft). On 18 March 2013, NASA reported evidence from instruments on the Curiosity rover of mineral hydration, likely hydrated calcium sulfate, in several rock samples including the broken fragments of "Tintina" rock and "Sutton Inlier" rock as well as in veins and nodules in other rocks like "Knorr" rock and "Wernicke" rock. 
Analysis using the rover's DAN instrument provided evidence of subsurface water, amounting to as much as 4% water content, down to a depth of 60 centimetres (24 in), during the rover's traverse from the Bradbury Landing site to the Yellowknife Bay area in the Glenelg terrain. In September 2015, NASA announced that they had found strong evidence of hydrated brine flows in recurring slope lineae, based on spectrometer readings of the darkened areas of slopes. These streaks flow downhill in Martian summer, when the temperature is above −23 °C, and freeze at lower temperatures. These observations supported earlier hypotheses, based on timing of formation and their rate of growth, that these dark streaks resulted from water flowing just below the surface. However, later work suggested that the lineae may be dry, granular flows instead, with at most a limited role for water in initiating the process. A definitive conclusion about the presence, extent, and role of liquid water on the Martian surface remains elusive. Researchers suspect much of the low northern plains of the planet were covered with an ocean hundreds of meters deep, though this theory remains controversial. In March 2015, scientists stated that such an ocean might have been the size of Earth's Arctic Ocean. This finding was derived from the ratio of protium to deuterium in the modern Martian atmosphere compared to that ratio on Earth. The amount of Martian deuterium (D/H = (9.3 ± 1.7) × 10⁻⁴) is five to seven times the amount on Earth (D/H = 1.56 × 10⁻⁴), suggesting that ancient Mars had significantly higher levels of water. Results from the Curiosity rover had previously found a high ratio of deuterium in Gale Crater, though not significantly high enough to suggest the former presence of an ocean. Other scientists caution that these results have not been confirmed, and point out that Martian climate models have not yet shown that the planet was warm enough in the past to support bodies of liquid water.
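The "five to seven times" enrichment factor follows directly from the two D/H ratios quoted above; a quick check:

```python
# D/H ratios as quoted in the text.
d_h_mars = 9.3e-4        # Martian atmosphere
d_h_mars_err = 1.7e-4    # quoted uncertainty
d_h_earth = 1.56e-4      # terrestrial value

central = d_h_mars / d_h_earth
low = (d_h_mars - d_h_mars_err) / d_h_earth
high = (d_h_mars + d_h_mars_err) / d_h_earth
print(f"{low:.1f}x to {high:.1f}x (central {central:.1f}x)")
# ~4.9x to 7.1x, consistent with "five to seven times" Earth's value
```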
Near the northern polar cap is the 81.4 kilometres (50.6 mi) wide Korolev Crater, which the Mars Express orbiter found to be filled with approximately 2,200 cubic kilometres (530 cu mi) of water ice. In November 2016, NASA reported finding a large amount of underground ice in the Utopia Planitia region. The volume of water detected has been estimated to be equivalent to the volume of water in Lake Superior (which is 12,100 cubic kilometers). During observations from 2018 through 2021, the ExoMars Trace Gas Orbiter spotted indications of water, probably subsurface ice, in the Valles Marineris canyon system. Orbital motion Mars's average distance from the Sun is roughly 230 million km (143 million mi), and its orbital period is 687 (Earth) days. The solar day (or sol) on Mars is only slightly longer than an Earth day: 24 hours, 39 minutes, and 35.244 seconds. A Martian year is equal to 1.8809 Earth years, or 1 year, 320 days, and 18.2 hours. Because of the small gravitational potential difference, the delta-v needed to transfer between Earth and Mars is the second lowest, after Venus, of any planet as reached from Earth. The axial tilt of Mars is 25.19° relative to its orbital plane, which is similar to the axial tilt of Earth. As a result, Mars has seasons like Earth, though on Mars they are nearly twice as long because its orbital period is that much longer. In the present day, the orientation of the north pole of Mars is close to the star Deneb. Mars has a relatively pronounced orbital eccentricity of about 0.09; of the seven other planets in the Solar System, only Mercury has a larger orbital eccentricity. Mars is known to have had a much more circular orbit in the past. At one point, 1.35 million Earth years ago, Mars had an eccentricity of roughly 0.002, much less than that of Earth today. Mars's cycle of eccentricity is 96,000 Earth years compared to Earth's cycle of 100,000 years. Mars makes its closest approach to Earth, around opposition, once every synodic period of 779.94 days.
It should not be confused with Mars conjunction, where the Earth and Mars are at opposite sides of the Solar System and form a straight line crossing the Sun. The average time between the successive oppositions of Mars, its synodic period, is 780 days; but the number of days between successive oppositions can range from 764 to 812. The distance at close approach varies between about 54 and 103 million km (34 and 64 million mi) due to the planets' elliptical orbits, which causes comparable variation in angular size. At their furthest Mars and Earth can be as far as 401 million km (249 million mi) apart. Mars comes into opposition from Earth every 2.1 years. The planets come into opposition near Mars's perihelion in 2003, 2018 and 2035, with the 2020 and 2033 events being particularly close to perihelic opposition. The mean apparent magnitude of Mars is +0.71 with a standard deviation of 1.05. Because the orbit of Mars is eccentric, the magnitude at opposition from the Sun can range from about −3.0 to −1.4. The minimum brightness is magnitude +1.86 when the planet is near aphelion and in conjunction with the Sun. At its brightest, Mars (along with Jupiter) is second only to Venus in apparent brightness. Mars usually appears distinctly yellow, orange, or red. When farthest away from Earth, it is more than seven times farther away than when it is closest. Mars is usually close enough for particularly good viewing once or twice at 15-year or 17-year intervals. Optical ground-based telescopes are typically limited to resolving features about 300 kilometres (190 mi) across when Earth and Mars are closest because of Earth's atmosphere. As Mars approaches opposition, it begins a period of retrograde motion, which means it will appear to move backwards in a looping curve with respect to the background stars. This retrograde motion lasts for about 72 days, and Mars reaches its peak apparent brightness in the middle of this interval. 
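The roughly 780-day opposition cycle quoted above is simply the beat period of the two planets' orbital motions; a minimal check (the sidereal year values are standard figures, not given in the text):

```python
# Synodic period: how often Earth "laps" Mars.
T_EARTH = 365.256   # days, Earth's sidereal year
T_MARS = 686.980    # days, Mars's sidereal year
synodic = 1 / (1 / T_EARTH - 1 / T_MARS)
print(f"{synodic:.2f} days")   # ~779.94, matching the quoted figure
```

The number of days between individual oppositions (764 to 812) varies around this mean because both orbits, especially Mars's, are eccentric.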
Moons Mars has two relatively small (compared to Earth's) natural moons, Phobos (about 22 km (14 mi) in diameter) and Deimos (about 12 km (7.5 mi) in diameter), which orbit at 9,376 km (5,826 mi) and 23,460 km (14,580 mi) around the planet. The origin of both moons is unclear, although a popular theory states that they were asteroids captured into Martian orbit. Both satellites were discovered in 1877 by Asaph Hall and were named after the characters Phobos (the deity of panic and fear) and Deimos (the deity of terror and dread), twins from Greek mythology who accompanied their father Ares, god of war, into battle. Mars was the Roman equivalent to Ares. In modern Greek, the planet retains its ancient name Ares (Aris: Άρης). From the surface of Mars, the motions of Phobos and Deimos appear different from that of the Earth's satellite, the Moon. Phobos rises in the west, sets in the east, and rises again in just 11 hours. Deimos, being only just outside synchronous orbit – where the orbital period would match the planet's period of rotation – rises as expected in the east, but slowly. Because the orbit of Phobos is below a synchronous altitude, tidal forces from Mars are gradually lowering its orbit. In about 50 million years, it could either crash into Mars's surface or break up into a ring structure around the planet. The origin of the two satellites is not well understood. Their low albedo and carbonaceous chondrite composition have been regarded as similar to asteroids, supporting a capture theory. The unstable orbit of Phobos would seem to point toward a relatively recent capture. But both have circular orbits near the equator, which is unusual for captured objects, and the required capture dynamics are complex. Accretion early in the history of Mars is plausible, but would not account for a composition resembling asteroids rather than Mars itself, if that is confirmed. 
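Phobos's odd behavior in the Martian sky, described above, follows from Kepler's third law: its orbital period is far shorter than Mars's 24.6-hour rotation. A minimal sketch (Mars's gravitational parameter GM is a standard value, not given in the text):

```python
import math

MU_MARS = 4.2828e13          # Mars's GM, m^3/s^2 (standard value)
SOL_H = 24.62                # Mars's rotation period, hours

a_phobos = 9.376e6           # Phobos orbital radius, m (from the text)
period_h = 2 * math.pi * math.sqrt(a_phobos**3 / MU_MARS) / 3600
print(f"orbital period: {period_h:.2f} h")        # ~7.66 h

# Seen from the rotating surface, risings repeat at the beat period:
rise_interval = 1 / (1 / period_h - 1 / SOL_H)
print(f"rises every:    {rise_interval:.1f} h")   # ~11 h, as stated above
```

Because Phobos orbits faster than Mars rotates, it overtakes the surface and appears to rise in the west; Deimos, orbiting just slower than synchronous, rises in the east but very slowly.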
Mars may have yet-undiscovered moons, smaller than 50 to 100 metres (160 to 330 ft) in diameter, and a dust ring is predicted to exist between Phobos and Deimos. A third possibility for their origin is the involvement of a third body or a type of impact disruption. More recent lines of evidence for Phobos having a highly porous interior, and suggesting a composition containing mainly phyllosilicates and other minerals known from Mars, point toward an origin of Phobos from material ejected by an impact on Mars that reaccreted in Martian orbit, similar to the prevailing theory for the origin of Earth's satellite. Although the visible and near-infrared (VNIR) spectra of the moons of Mars resemble those of outer-belt asteroids, the thermal infrared spectra of Phobos are reported to be inconsistent with chondrites of any class. It is also possible that Phobos and Deimos were fragments of an older moon, formed by debris from a large impact on Mars, and then destroyed by a more recent impact upon the satellite. More recently, a study conducted by a team of researchers from multiple countries suggests that a lost moon, at least fifteen times the size of Phobos, may have existed in the past. Analysis of rocks that record tidal processes on the planet suggests that those tides may have been driven by this past moon. Human observations and exploration The history of observations of Mars is marked by oppositions of Mars, when the planet is closest to Earth and hence is most easily visible, which occur every couple of years. Even more notable are the perihelic oppositions of Mars, which are distinguished because Mars is close to perihelion, making it even closer to Earth. The ancient Sumerians named Mars Nergal, the god of war and plague. During Sumerian times, Nergal was a minor deity of little significance, but, during later times, his main cult center was the city of Nineveh.
In Mesopotamian texts, Mars is referred to as the "star of judgement of the fate of the dead". The existence of Mars as a wandering object in the night sky was also recorded by the ancient Egyptian astronomers and, by 1534 BCE, they were familiar with the retrograde motion of the planet. By the period of the Neo-Babylonian Empire, the Babylonian astronomers were making regular records of the positions of the planets and systematic observations of their behavior. For Mars, they knew that the planet made 37 synodic periods, or 42 circuits of the zodiac, every 79 years. They invented arithmetic methods for making minor corrections to the predicted positions of the planets. In Ancient Greece, the planet was known as Πυρόεις (Pyroeis, "the fiery one"). More commonly, though, the Greek name for the planet now referred to as Mars was Ares. It was the Romans who named the planet Mars, for their god of war, often represented by the sword and shield of the planet's namesake. In the fourth century BCE, Aristotle noted that Mars disappeared behind the Moon during an occultation, indicating that the planet was farther away than the Moon. Ptolemy, a Greek living in Alexandria, attempted to address the problem of the orbital motion of Mars. Ptolemy's model and his collective work on astronomy were presented in the multi-volume collection later called the Almagest (from the Arabic for "greatest"), which became the authoritative treatise on Western astronomy for the next fourteen centuries. Literature from ancient China confirms that Mars was known by Chinese astronomers by no later than the fourth century BCE. In East Asian cultures, Mars is traditionally referred to as the "fire star" (火星) based on the Wuxing system. In 1609, Johannes Kepler published a 10-year study of the Martian orbit, using the diurnal parallax of Mars, measured by Tycho Brahe, to make a preliminary calculation of the relative distance to the planet.
From Brahe's observations of Mars, Kepler deduced that the planet orbited the Sun not in a circle, but in an ellipse. Moreover, Kepler showed that Mars sped up as it approached the Sun and slowed down as it moved farther away, in a manner that later physicists would explain as a consequence of the conservation of angular momentum. In 1610, Italian astronomer Galileo Galilei made the first use of a telescope for astronomical observation, including of Mars. With the telescope, the diurnal parallax of Mars was again measured in an effort to determine the Sun-Earth distance; this was first performed by Giovanni Domenico Cassini in 1672. The early parallax measurements were hampered by the quality of the instruments. The only observed occultation of Mars by Venus was that of 13 October 1590, seen by Michael Maestlin at Heidelberg. By the 19th century, the resolution of telescopes reached a level sufficient for surface features to be identified. On 5 September 1877, a perihelic opposition of Mars occurred. The Italian astronomer Giovanni Schiaparelli used a 22-centimetre (8.7 in) telescope in Milan to help produce the first detailed map of Mars. These maps notably contained features he called canali, which, with the possible exception of the natural canyon Valles Marineris, were later shown to be an optical illusion. These canali were supposedly long, straight lines on the surface of Mars, to which he gave the names of famous rivers on Earth. His term, which means "channels" or "grooves", was popularly mistranslated in English as "canals". Influenced by these observations, the orientalist Percival Lowell founded an observatory which had 30- and 45-centimetre (12- and 18-in) telescopes. The observatory was used for the exploration of Mars during the last good opportunity in 1894, and the following less favorable oppositions. He published several books on Mars and life on the planet, which had a great influence on the public.
The canali were independently observed by other astronomers, like Henri Joseph Perrotin and Louis Thollon in Nice, using one of the largest telescopes of that time. The seasonal changes (consisting of the diminishing of the polar caps and the dark areas formed during Martian summers) in combination with the canals led to speculation about life on Mars, and it was a long-held belief that Mars contained vast seas and vegetation. As bigger telescopes were used, fewer long, straight canali were observed. During observations in 1909 by Antoniadi with an 84-centimetre (33 in) telescope, irregular patterns were observed, but no canali were seen. The first spacecraft sent from Earth to Mars was Mars 1 of the Soviet Union, intended to fly by in 1963, but contact was lost en route. NASA's Mariner 4 followed and became the first spacecraft to successfully return data from Mars; launched on 28 November 1964, it made its closest approach to the planet on 15 July 1965. Mariner 4 detected the weak Martian radiation belt, measured at about 0.1% that of Earth, and captured the first images of another planet from deep space. Once spacecraft visited the planet during the 1960s and 1970s, many long-held conceptions of Mars were overturned. After the results of the Viking life-detection experiments, the hypothesis of a dead planet was generally accepted. The data from Mariner 9 and Viking allowed better maps of Mars to be made. Until 1997, and after Viking 1 shut down in 1982, Mars was visited only by three unsuccessful probes: two flying past without contact (Phobos 1, 1988; Mars Observer, 1993), and one (Phobos 2, 1989) malfunctioning in orbit before reaching its destination, the moon Phobos. In 1997, Mars Pathfinder became the first successful rover mission beyond the Moon and, together with Mars Global Surveyor (operated until late 2006), began an uninterrupted active robotic presence at Mars that has lasted to this day.
Mars Global Surveyor produced complete, extremely detailed maps of the Martian topography, magnetic field and surface minerals. Starting with these missions, a range of new, improved uncrewed spacecraft, including orbiters, landers, and rovers, have been sent to Mars, with successful missions by NASA (United States), JAXA (Japan), ESA (Europe), the United Kingdom, ISRO (India), Roscosmos (Russia), the United Arab Emirates, and CNSA (China) to study the planet's surface, climate, and geology, uncovering the history and dynamics of the Martian hydrosphere and possible traces of ancient life. As of 2023, Mars is host to ten functioning spacecraft. Eight are in orbit: 2001 Mars Odyssey, Mars Express, Mars Reconnaissance Orbiter, MAVEN, ExoMars Trace Gas Orbiter, the Hope orbiter, and the Tianwen-1 orbiter. Another two are on the surface: the Mars Science Laboratory Curiosity rover and the Perseverance rover. Collected maps are available online at websites including Google Mars. NASA provides two online tools: Mars Trek, which provides visualizations of the planet using data from 50 years of exploration, and Experience Curiosity, which simulates traveling on Mars in 3-D with Curiosity. As of February 2024, debris from these missions has reached over seven tons. Most of it consists of crashed and inactive spacecraft as well as discarded components. In April 2024, NASA selected several companies to begin studies on providing commercial services to further enable robotic science on Mars. Key areas include establishing telecommunications, payload delivery and surface imaging. Habitability and habitation During the late 19th century, it was widely accepted in the astronomical community that Mars had life-supporting qualities, including the presence of oxygen and water. However, in 1894 W. W.
Campbell at Lick Observatory observed the planet and found that "if water vapor or oxygen occur in the atmosphere of Mars it is in quantities too small to be detected by spectroscopes then available". That observation contradicted many of the measurements of the time and was not widely accepted. Campbell and V. M. Slipher repeated the study in 1909 using better instruments, but with the same results. It was not until the findings were confirmed by W. S. Adams in 1925 that the myth of the Earth-like habitability of Mars was finally broken. However, even in the 1960s, articles were published on Martian biology, putting aside explanations other than life for the seasonal changes on Mars. The current understanding of planetary habitability – the ability of a world to develop environmental conditions favorable to the emergence of life – favors planets that have liquid water on their surface. Most often this requires the orbit of a planet to lie within the habitable zone, which for the Sun is estimated to extend from within the orbit of Earth to about that of Mars. During perihelion, Mars dips inside this region, but Mars's thin (low-pressure) atmosphere prevents liquid water from existing over large regions for extended periods. The past flow of liquid water demonstrates the planet's potential for habitability. Recent evidence has suggested that any water on the Martian surface may have been too salty and acidic to support regular terrestrial life. The environmental conditions on Mars are a challenge to sustaining organic life: the planet has little heat transfer across its surface, it has poor insulation against bombardment by the solar wind due to the absence of a magnetosphere and has insufficient atmospheric pressure to retain water in a liquid form (water instead sublimes to a gaseous state). 
Mars is nearly, or perhaps totally, geologically dead; the end of volcanic activity has apparently stopped the recycling of chemicals and minerals between the surface and interior of the planet. Evidence suggests that the planet was once significantly more habitable than it is today, but whether living organisms ever existed there remains unknown. The Viking probes of the mid-1970s carried experiments designed to detect microorganisms in Martian soil at their respective landing sites and had positive results, including a temporary increase in CO2 production on exposure to water and nutrients. This sign of life was later disputed by scientists, resulting in a continuing debate, with NASA scientist Gilbert Levin asserting that Viking may have found life. A 2014 analysis of Martian meteorite EETA79001 found chlorate, perchlorate, and nitrate ions in sufficiently high concentrations to suggest that they are widespread on Mars. UV and X-ray radiation would turn chlorate and perchlorate ions into other, highly reactive oxychlorines, indicating that any organic molecules would have to be buried under the surface to survive. Small quantities of methane and formaldehyde detected by Mars orbiters are both claimed to be possible evidence for life, as these chemical compounds would quickly break down in the Martian atmosphere. Alternatively, these compounds may instead be replenished by volcanic or other geological means, such as serpentinization. Impact glass, which is formed by meteor impacts and on Earth can preserve signs of life, has also been found on the surfaces of impact craters on Mars; this glass could likewise have preserved signs of life, if life existed at those sites. The Cheyava Falls rock discovered on Mars in June 2024 has been designated by NASA as a "potential biosignature" and was core sampled by the Perseverance rover for possible return to Earth and further examination.
Although highly intriguing, the data currently available do not allow a definitive determination of a biological or abiotic origin for this rock. Several plans for a human mission to Mars have been proposed, but none have come to fruition. The NASA Authorization Act of 2017 directed NASA to study the feasibility of a crewed Mars mission in the early 2030s; the resulting report concluded that this would be unfeasible. In addition, in 2021, China announced plans to send a crewed Mars mission in 2033. Privately held companies such as SpaceX have also proposed plans to send humans to Mars, with the eventual goal of settling the planet. As of 2024, SpaceX has proceeded with the development of the Starship launch vehicle with the goal of Mars colonization. In plans shared in April 2024, Elon Musk envisions the beginning of a Mars colony within the next twenty years. This would be enabled by the planned mass manufacturing of Starship, with the colony initially sustained by resupply from Earth and by in situ resource utilization on Mars until it reaches full self-sustainability. Any future human mission to Mars will likely take place within the optimal Mars launch window, which occurs every 26 months. The moon Phobos has been proposed as an anchor point for a space elevator. Besides national space agencies and space companies, groups such as the Mars Society and The Planetary Society advocate for human missions to Mars. In culture Mars is named after the Roman god of war (Greek Ares), but was also associated with the demi-god Heracles (Roman Hercules) by ancient Greek astronomers, as detailed by Aristotle. This association between Mars and war dates back at least to Babylonian astronomy, in which the planet was named for the god Nergal, deity of war and destruction. It persisted into modern times, as exemplified by Gustav Holst's orchestral suite The Planets, whose famous first movement labels Mars "The Bringer of War".
The planet's symbol, a circle with a spear pointing out to the upper right, is also used as a symbol for the male gender. The symbol dates from at least the 11th century, though a possible predecessor has been found in the Greek Oxyrhynchus Papyri. The idea that Mars was populated by intelligent Martians became widespread in the late 19th century. Schiaparelli's "canali" observations combined with Percival Lowell's books on the subject put forward the standard notion of a planet that was a drying, cooling, dying world with ancient civilizations constructing irrigation works. Many other observations and proclamations by notable personalities added to what has been termed "Mars Fever". In the present day, high-resolution mapping of the surface of Mars has revealed no artifacts of habitation, but pseudoscientific speculation about intelligent life on Mars continues. Reminiscent of the canali observations, these speculations are based on small-scale features perceived in the spacecraft images, such as "pyramids" and the "Face on Mars". In his book Cosmos, planetary astronomer Carl Sagan wrote: "Mars has become a kind of mythic arena onto which we have projected our Earthly hopes and fears." The depiction of Mars in fiction has been stimulated by its dramatic red color and by nineteenth-century scientific speculations that its surface conditions might support not just life but intelligent life. This gave rise to many science fiction stories involving these concepts, such as H. G. Wells's The War of the Worlds, in which Martians seek to escape their dying planet by invading Earth; Ray Bradbury's The Martian Chronicles, in which human explorers accidentally destroy a Martian civilization; as well as Edgar Rice Burroughs's Barsoom series, C. S. Lewis's novel Out of the Silent Planet (1938), and a number of Robert A. Heinlein stories before the mid-sixties. Since then, depictions of Martians have also extended to animation.
A comic figure of an intelligent Martian, Marvin the Martian, appeared in Haredevil Hare (1948) as a character in the Looney Tunes animated cartoons of Warner Brothers, and has continued as part of popular culture to the present. After the Mariner and Viking spacecraft had returned pictures of Mars as a lifeless and canal-less world, these ideas about Mars were abandoned; for many science-fiction authors, the new discoveries initially seemed like a constraint, but eventually the post-Viking knowledge of Mars became itself a source of inspiration for works like Kim Stanley Robinson's Mars trilogy.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Mars#cite_note-SeasonalFlows-190] | [TOKENS: 11899] |
Mars Mars is the fourth planet from the Sun. It is also known as the "Red Planet" for its orange-red appearance. Mars is a desert-like rocky planet with a tenuous atmosphere that is primarily carbon dioxide (CO2). At the average surface level the atmospheric pressure is a few thousandths of Earth's, atmospheric temperature ranges from −153 to 20 °C (−243 to 68 °F), and cosmic radiation is high. Mars retains some water, in the ground as well as thinly in the atmosphere, forming cirrus clouds, fog, frost, larger polar regions of permafrost and ice caps (with seasonal CO2 snow), but no bodies of liquid surface water. Its surface gravity is roughly a third of Earth's, or double that of the Moon. Its diameter, 6,779 km (4,212 mi), is about half the Earth's, or twice the Moon's, and its surface area is about the size of all the dry land of Earth. Fine dust is prevalent across the surface and the atmosphere, picked up and spread even by the weak wind of the tenuous atmosphere thanks to the low Martian gravity. The terrain of Mars roughly follows a north-south divide, the Martian dichotomy, with the northern hemisphere mainly consisting of relatively flat, low-lying plains, and the southern hemisphere of cratered highlands. Geologically, the planet is fairly active, with marsquakes trembling underneath the ground, but it also hosts many enormous extinct volcanoes (the tallest is Olympus Mons, 21.9 km or 13.6 mi tall), as well as one of the largest canyons in the Solar System (Valles Marineris, 4,000 km or 2,500 mi long). Mars has two natural satellites that are small and irregular in shape: Phobos and Deimos. With a significant axial tilt of 25 degrees, Mars experiences seasons like Earth (which has an axial tilt of 23.5 degrees). A Martian solar year is equal to 1.88 Earth years (687 Earth days); a Martian solar day (sol) is equal to 24.6 hours. Mars formed along with the other planets approximately 4.5 billion years ago.
During the Martian Noachian period (4.5 to 3.5 billion years ago), its surface was marked by meteor impacts, valley formation, erosion, the possible presence of water oceans, and the loss of its magnetosphere. The Hesperian period (beginning 3.5 billion years ago and ending 3.3–2.9 billion years ago) was dominated by widespread volcanic activity and flooding that carved immense outflow channels. The Amazonian period, which began 3.3–2.9 billion years ago and continues to the present, dominates the planet's ongoing geological evolution. Because of Mars's geological history, the possibility of past or present life on Mars remains an area of active scientific investigation, with some possible traces needing further examination. Being visible with the naked eye in Earth's sky as a red wandering star, Mars has been observed throughout history, acquiring diverse associations in different cultures. In 1963 the first flight to Mars took place with Mars 1, but communication was lost en route. The first successful flyby exploration of Mars was conducted in 1965 with Mariner 4. In 1971 Mariner 9 entered orbit around Mars, becoming the first spacecraft to orbit any body other than the Moon, the Sun, or Earth; following in the same year were the first uncontrolled impact (Mars 2) and the first successful landing (Mars 3) on Mars. Probes have been active on Mars continuously since 1997. At times, more than ten probes have simultaneously operated in orbit or on the surface, more than at any other planet beyond Earth. Mars is an often proposed target for future crewed exploration missions, though no such mission is currently planned. Natural history Scientists have theorized that during the Solar System's formation, Mars was created as the result of a random process of runaway accretion of material from the protoplanetary disk that orbited the Sun. Mars has many distinctive chemical features caused by its position in the Solar System.
Elements with comparatively low boiling points, such as chlorine, phosphorus, and sulfur, are much more common on Mars than on Earth; these elements were probably pushed outward by the young Sun's energetic solar wind. After the formation of the planets, the inner Solar System may have been subjected to the so-called Late Heavy Bombardment. About 60% of the surface of Mars shows a record of impacts from that era, whereas much of the remaining surface is probably underlain by immense impact basins caused by those events. However, more recent modeling has disputed the existence of the Late Heavy Bombardment. There is evidence of an enormous impact basin in the Northern Hemisphere of Mars, spanning 10,600 by 8,500 kilometres (6,600 by 5,300 mi), or roughly four times the size of the Moon's South Pole–Aitken basin, which would be the largest impact basin yet discovered if confirmed. It has been hypothesized that the basin was formed when Mars was struck by a Pluto-sized body about four billion years ago. The event, thought to be the cause of the Martian hemispheric dichotomy, created the smooth Borealis basin that covers 40% of the planet. A 2023 study shows evidence, based on the orbital inclination of Deimos (a small moon of Mars), that Mars may once have had a ring system 3.5 to 4 billion years ago. This ring system may have been formed from a moon, 20 times more massive than Phobos, orbiting Mars billions of years ago; Phobos would then be a remnant of that ring. Epochs: The geological history of Mars can be split into many periods, but three primary periods are recognized: the Noachian, the Hesperian, and the Amazonian. Geological activity is still taking place on Mars. The Athabasca Valles is home to sheet-like lava flows created about 200 million years ago. Water flows in the grabens called the Cerberus Fossae occurred less than 20 million years ago, indicating equally recent volcanic intrusions. The Mars Reconnaissance Orbiter has captured images of avalanches. 
Physical characteristics Mars is approximately half the diameter of Earth or twice that of the Moon, with a surface area only slightly less than the total area of Earth's dry land. Mars is less dense than Earth, having about 15% of Earth's volume and 11% of Earth's mass, resulting in about 38% of Earth's surface gravity. Mars is the only presently known example of a desert planet, a rocky planet with a surface akin to that of Earth's deserts. The red-orange appearance of the Martian surface is caused by iron(III) oxide (nanophase Fe2O3) and the iron(III) oxide-hydroxide mineral goethite. It can look like butterscotch; other common surface colors include golden, brown, tan, and greenish, depending on the minerals present. Like Earth, Mars is differentiated into a dense metallic core overlaid by less dense rocky layers. The outermost layer is the crust, which is on average about 42–56 kilometres (26–35 mi) thick, with a minimum thickness of 6 kilometres (3.7 mi) in Isidis Planitia, and a maximum thickness of 117 kilometres (73 mi) in the southern Tharsis plateau. For comparison, Earth's crust averages 27.3 ± 4.8 km in thickness. The most abundant elements in the Martian crust are silicon, oxygen, iron, magnesium, aluminum, calcium, and potassium. Mars is confirmed to be seismically active; in 2019, it was reported that InSight had detected and recorded over 450 marsquakes and related events. Beneath the crust is a silicate mantle responsible for many of the tectonic and volcanic features on the planet's surface. The upper Martian mantle is a low-velocity zone, where the velocity of seismic waves is lower than in surrounding depth intervals. The mantle appears to be rigid down to a depth of about 250 km, giving Mars a very thick lithosphere compared to Earth. Below this the mantle gradually becomes more ductile, and the seismic wave velocity starts to increase again. 
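The 38% surface gravity figure follows directly from the mass and volume ratios quoted above, since surface gravity scales as g = GM/R². A minimal arithmetic sketch (ratios rounded from the figures in the text):

```python
# Back-of-envelope check: surface gravity ratio = (mass ratio) / (radius ratio)^2.
# The radius ratio is the cube root of the volume ratio.
mass_ratio = 0.107    # Mars/Earth mass, ~11% as stated in the text
volume_ratio = 0.151  # Mars/Earth volume, ~15% as stated in the text

radius_ratio = volume_ratio ** (1.0 / 3.0)      # ~0.53, i.e. about half Earth's radius
gravity_ratio = mass_ratio / radius_ratio ** 2  # ~0.38, i.e. about 38% of Earth's g

print(f"radius ratio  ~{radius_ratio:.2f}")
print(f"gravity ratio ~{gravity_ratio:.2f}")
```

The same cube-root step also reproduces the "about half the diameter of Earth" comparison made earlier in the section.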
The Martian mantle does not appear to have a thermally insulating layer analogous to Earth's lower mantle; instead, below 1050 km in depth, it becomes mineralogically similar to Earth's transition zone. At the bottom of the mantle lies a basal liquid silicate layer approximately 150–180 km thick. The Martian mantle appears to be highly heterogeneous, with dense fragments up to 4 km across, likely injected deep into the planet by colossal impacts ~4.5 billion years ago. High-frequency waves from eight marsquakes slowed as they passed these localized regions, and modeling indicates the heterogeneities are compositionally distinct debris, preserved because Mars lacks plate tectonics and has a sluggishly convecting interior that prevents complete homogenization. Mars's iron and nickel core is at least partially molten, and may have a solid inner core. It is around half of Mars's radius, approximately 1650–1675 km, and is enriched in light elements such as sulfur, oxygen, carbon, and hydrogen. The temperature of the core is estimated to be 2000–2400 K, compared to 5400–6230 K for Earth's solid inner core. In 2025, based on data from the InSight lander, a group of researchers reported the detection of a solid inner core 613 ± 67 kilometres (381 ± 42 mi) in radius. Mars is a terrestrial planet with a surface that consists of minerals containing silicon and oxygen, metals, and other elements that typically make up rock. The Martian surface is primarily composed of tholeiitic basalt, although parts are more silica-rich than typical basalt and may be similar to andesitic rocks on Earth, or silica glass. Regions of low albedo suggest concentrations of plagioclase feldspar, with northern low albedo regions displaying higher than normal concentrations of sheet silicates and high-silicon glass. Parts of the southern highlands include detectable amounts of high-calcium pyroxenes. Localized concentrations of hematite and olivine have been found. 
Much of the surface is deeply covered by finely grained iron(III) oxide dust. The Phoenix lander returned data showing Martian soil to be slightly alkaline and containing elements such as magnesium, sodium, potassium and chlorine. These nutrients are found in soils on Earth, and are necessary for plant growth. Experiments performed by the lander showed that the Martian soil has a basic pH of 7.7, and contains 0.6% perchlorate by weight, a concentration that is toxic to humans. Streaks are common across Mars and new ones appear frequently on steep slopes of craters, troughs, and valleys. The streaks are dark at first and get lighter with age. The streaks can start in a tiny area, then spread out for hundreds of metres. They have been seen to follow the edges of boulders and other obstacles in their path. The commonly accepted hypotheses include that they are dark underlying layers of soil revealed after avalanches of bright dust or dust devils. Several other explanations have been put forward, including those that involve water or even the growth of organisms. Environmental radiation levels on the surface average 0.64 millisieverts per day, significantly less than the 1.84 millisieverts per day, or 22 millirads per day, experienced during the flight to and from Mars. For comparison, the radiation levels in low Earth orbit, where Earth's space stations orbit, are around 0.5 millisieverts per day. Hellas Planitia has the lowest surface radiation at about 0.342 millisieverts per day; lava tubes southwest of Hadriacus Mons may have levels as low as 0.064 millisieverts per day, comparable to radiation levels during flights on Earth. Although there is no evidence of a structured global magnetic field on Mars, observations show that parts of the planet's crust have been magnetized, suggesting that alternating polarity reversals of its dipole field have occurred in the past. 
This paleomagnetism of magnetically susceptible minerals is similar to the alternating bands found on Earth's ocean floors. One hypothesis, published in 1999 and re-examined in October 2005 (with the help of the Mars Global Surveyor), is that these bands suggest plate tectonic activity on Mars four billion years ago, before the planetary dynamo ceased to function and the planet's magnetic field faded. Geography and features Although better remembered for mapping the Moon, Johann Heinrich von Mädler and Wilhelm Beer were the first areographers. They began by establishing that most of Mars's surface features were permanent and by more precisely determining the planet's rotation period. In 1840, Mädler combined ten years of observations and drew the first map of Mars. Features on Mars are named from a variety of sources. Albedo features are named for classical mythology. Craters larger than roughly 50 km are named for deceased scientists and writers and others who have contributed to the study of Mars. Smaller craters are named for towns and villages of the world with populations of less than 100,000. Large valleys are named for the word "Mars" or "star" in various languages; smaller valleys are named for rivers. Large albedo features retain many of the older names but are often updated to reflect new knowledge of the nature of the features. For example, Nix Olympica (the snows of Olympus) has become Olympus Mons (Mount Olympus). The surface of Mars as seen from Earth is divided into two kinds of areas, with differing albedo. The paler plains covered with dust and sand rich in reddish iron oxides were once thought of as Martian "continents" and given names like Arabia Terra (land of Arabia) or Amazonis Planitia (Amazonian plain). The dark features were thought to be seas, hence their names Mare Erythraeum, Mare Sirenum and Aurorae Sinus. The largest dark feature seen from Earth is Syrtis Major Planum. The permanent northern polar ice cap is named Planum Boreum. 
The southern cap is called Planum Australe. Mars's equator is defined by its rotation, but the location of its Prime Meridian was specified, as was Earth's (at Greenwich), by choice of an arbitrary point; Mädler and Beer selected a line for their first maps of Mars in 1830. After the spacecraft Mariner 9 provided extensive imagery of Mars in 1972, a small crater (later called Airy-0), located in the Sinus Meridiani ("Middle Bay" or "Meridian Bay"), was chosen by Merton E. Davies, Harold Masursky, and Gérard de Vaucouleurs for the definition of 0.0° longitude to coincide with the original selection. Because Mars has no oceans, and hence no "sea level", a zero-elevation surface had to be selected as a reference level; this is called the areoid of Mars, analogous to the terrestrial geoid. Zero altitude was defined by the height at which there is 610.5 Pa (6.105 mbar) of atmospheric pressure. This pressure corresponds to the triple point of water, and it is about 0.6% of the sea level surface pressure on Earth (0.006 atm). For mapping purposes, the United States Geological Survey divides the surface of Mars into thirty cartographic quadrangles, each named for a classical albedo feature it contains. In April 2023, The New York Times reported an updated global map of Mars based on images from the Hope spacecraft. A related, but much more detailed, global Mars map was released by NASA on 16 April 2023. The vast upland region Tharsis contains several massive volcanoes, which include the shield volcano Olympus Mons. The edifice is over 600 km (370 mi) wide. Because the mountain is so large, with complex structure at its edges, assigning it a definite height is difficult. Its local relief, from the foot of the cliffs which form its northwest margin to its peak, is over 21 km (13 mi), a little over twice the height of Mauna Kea as measured from its base on the ocean floor. 
The total elevation change from the plains of Amazonis Planitia, over 1,000 km (620 mi) to the northwest, to the summit approaches 26 km (16 mi), roughly three times the height of Mount Everest, which in comparison stands at just over 8.8 kilometres (5.5 mi). Consequently, Olympus Mons is either the tallest or second-tallest mountain in the Solar System; the only known mountain which might be taller is the Rheasilvia peak on the asteroid Vesta, at 20–25 km (12–16 mi). The dichotomy of Martian topography is striking: northern plains flattened by lava flows contrast with the southern highlands, pitted and cratered by ancient impacts. It is possible that, four billion years ago, the Northern Hemisphere of Mars was struck by an object one-tenth to two-thirds the size of Earth's Moon. If this is the case, the Northern Hemisphere of Mars would be the site of an impact crater 10,600 by 8,500 kilometres (6,600 by 5,300 mi) in size, or roughly the area of Europe, Asia, and Australia combined, surpassing Utopia Planitia and the Moon's South Pole–Aitken basin as the largest impact crater in the Solar System. Mars is scarred by 43,000 impact craters with a diameter of 5 kilometres (3.1 mi) or greater. The largest exposed crater is Hellas, which is 2,300 kilometres (1,400 mi) wide and 7,000 metres (23,000 ft) deep, and is a light albedo feature clearly visible from Earth. There are other notable impact features, such as Argyre, which is around 1,800 kilometres (1,100 mi) in diameter, and Isidis, which is around 1,500 kilometres (930 mi) in diameter. Due to the smaller mass and size of Mars, the probability of an object colliding with the planet is about half that of Earth. Mars is located closer to the asteroid belt, so it has an increased chance of being struck by materials from that source. Mars is more likely to be struck by short-period comets, i.e., those that lie within the orbit of Jupiter. 
Martian craters can have a morphology that suggests the ground became wet after the meteor impact. The large canyon Valles Marineris (Latin for 'Mariner Valleys', also known as Agathodaemon in the old canal maps) has a length of 4,000 kilometres (2,500 mi) and a depth of up to 7 kilometres (4.3 mi). The length of Valles Marineris is equivalent to the length of Europe and extends across one-fifth the circumference of Mars. By comparison, the Grand Canyon on Earth is only 446 kilometres (277 mi) long and nearly 2 kilometres (1.2 mi) deep. Valles Marineris was formed due to the swelling of the Tharsis area, which caused the crust in the area of Valles Marineris to collapse. In 2012, it was proposed that Valles Marineris is not just a graben, but a plate boundary where 150 kilometres (93 mi) of transverse motion has occurred, possibly giving Mars a two-plate tectonic arrangement. Images from the Thermal Emission Imaging System (THEMIS) aboard NASA's Mars Odyssey orbiter have revealed seven possible cave entrances on the flanks of the volcano Arsia Mons. The caves, named after loved ones of their discoverers, are collectively known as the "seven sisters". Cave entrances measure from 100 to 252 metres (328 to 827 ft) wide and they are estimated to be at least 73 to 96 metres (240 to 315 ft) deep. Because light does not reach the floor of most of the caves, they may extend much deeper than these lower estimates and widen below the surface. "Dena" is the only exception; its floor is visible and was measured to be 130 metres (430 ft) deep. The interiors of these caverns may be protected from micrometeoroids, UV radiation, solar flares and high energy particles that bombard the planet's surface. Martian geysers (or CO2 jets) are putative sites of small gas and dust eruptions that occur in the south polar region of Mars during the spring thaw. 
"Dark dune spots" and "spiders" – or araneiforms – are the two most visible types of features ascribed to these eruptions. Similarly sized dust will settle from the thinner Martian atmosphere sooner than it would on Earth. For example, the dust suspended by the 2001 global dust storms on Mars only remained in the Martian atmosphere for 0.6 years, while the dust from Mount Pinatubo took about two years to settle. However, under current Martian conditions, the mass movements involved are generally much smaller than on Earth. Even the 2001 global dust storms on Mars moved only the equivalent of a very thin dust layer – about 3 μm thick if deposited with uniform thickness between 58° north and south of the equator. Dust deposition at the two rover sites has proceeded at a rate of about the thickness of a grain every 100 sols. Atmosphere Mars lost its magnetosphere 4 billion years ago, possibly because of numerous asteroid strikes, so the solar wind interacts directly with the Martian ionosphere, lowering the atmospheric density by stripping away atoms from the outer layer. Both Mars Global Surveyor and Mars Express have detected ionized atmospheric particles trailing off into space behind Mars, and this atmospheric loss is being studied by the MAVEN orbiter. Compared to Earth, the atmosphere of Mars is quite rarefied. Atmospheric pressure on the surface today ranges from a low of 30 Pa (0.0044 psi) on Olympus Mons to over 1,155 Pa (0.1675 psi) in Hellas Planitia, with a mean pressure at the surface level of 600 Pa (0.087 psi). The highest atmospheric density on Mars is equal to that found 35 kilometres (22 mi) above Earth's surface. The resulting mean surface pressure is only 0.6% of Earth's 101.3 kPa (14.69 psi). The scale height of the atmosphere is about 10.8 kilometres (6.7 mi), which is higher than Earth's 6 kilometres (3.7 mi), because the surface gravity of Mars is only about 38% of Earth's. 
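The ~10.8 km scale height quoted above can be roughly reproduced with the isothermal estimate H = RT/(Mg). The mean temperature and CO2-dominated composition below are assumed representative values, not figures from the text:

```python
# Isothermal atmospheric scale height H = R*T / (M*g); a rough consistency
# check, not a precise atmospheric model.
R = 8.314    # J/(mol*K), universal gas constant
T = 210.0    # K, assumed mean Martian atmospheric temperature
M = 0.04401  # kg/mol, molar mass of CO2 (the dominant constituent)
g = 3.71     # m/s^2, Martian surface gravity

H = R * T / (M * g)  # metres; comes out near the quoted ~10.8 km
print(f"scale height ~{H / 1000:.1f} km")
```

Substituting Earth's g ≈ 9.81 m/s² (and mean molar mass ~0.029 kg/mol) into the same formula yields Earth's smaller scale height, illustrating the gravity dependence noted in the text.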
The atmosphere of Mars consists of about 96% carbon dioxide, 1.93% argon and 1.89% nitrogen along with traces of oxygen and water. The atmosphere is quite dusty, containing particulates about 1.5 μm in diameter which give the Martian sky a tawny color when seen from the surface. It may take on a pink hue due to iron oxide particles suspended in it. Despite repeated detections of methane on Mars, there is no scientific consensus as to its origin. One suggestion is that methane exists on Mars and that its concentration fluctuates seasonally. The methane could be produced by non-biological processes such as serpentinization, involving water, carbon dioxide, and the mineral olivine, which is known to be common on Mars, or by Martian life. The higher concentration of atmospheric CO2 and the lower surface pressure, compared to Earth, may be why sound is attenuated more on Mars, where natural sources are rare apart from the wind. Using acoustic recordings collected by the Perseverance rover, researchers concluded that the speed of sound there is approximately 240 m/s for frequencies below 240 Hz, and 250 m/s for those above. Auroras have been detected on Mars. Because Mars lacks a global magnetic field, the types and distribution of auroras there differ from those on Earth; rather than being mostly restricted to polar regions as is the case on Earth, a Martian aurora can encompass the planet. In September 2017, NASA reported that radiation levels on the surface of Mars were temporarily doubled, and were associated with an aurora 25 times brighter than any observed earlier, due to a massive, and unexpected, solar storm in the middle of the month. Mars has seasons, alternating between its northern and southern hemispheres, much as Earth does. 
Additionally, the orbit of Mars has, compared to Earth's, a large eccentricity: it approaches perihelion when it is summer in its southern hemisphere and winter in its northern, and aphelion when it is winter in its southern hemisphere and summer in its northern. As a result, the seasons in its southern hemisphere are more extreme and the seasons in its northern are milder than would otherwise be the case. The summer temperatures in the south can be warmer than the equivalent summer temperatures in the north by up to 30 °C (54 °F). Martian surface temperatures vary from lows of about −110 °C (−166 °F) to highs of up to 35 °C (95 °F) in equatorial summer. The wide range in temperatures is due to the thin atmosphere which cannot store much solar heat, the low atmospheric pressure (about 1% that of the atmosphere of Earth), and the low thermal inertia of Martian soil. The planet is 1.52 times as far from the Sun as Earth, resulting in just 43% of the amount of sunlight. Mars has the largest dust storms in the Solar System, reaching speeds of over 160 km/h (100 mph). These can vary from a storm over a small area to gigantic storms that cover the entire planet. They tend to occur when Mars is closest to the Sun, and have been shown to increase global temperature. Seasonally, the polar ice caps also acquire a covering of CO2 dry ice. Hydrology While Mars contains water in significant amounts, most of it is dust-covered water ice at the Martian polar ice caps. The volume of water ice in the south polar ice cap, if melted, would be enough to cover most of the surface of the planet with a depth of 11 metres (36 ft). Water in its liquid form cannot persist on the surface due to Mars's low atmospheric pressure, which is less than 1% that of Earth. Only at the lowest of elevations are the pressure and temperature high enough for liquid water to exist for short periods. 
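The 43% sunlight figure mentioned above is just the inverse-square law applied to the 1.52× orbital distance:

```python
# Solar irradiance falls off with the square of distance from the Sun, so at
# 1.52 times Earth's distance Mars receives 1/1.52^2 of Earth's sunlight.
distance_ratio = 1.52                           # Mars/Earth mean distance from the Sun
sunlight_fraction = 1.0 / distance_ratio ** 2   # ~0.43
print(f"sunlight fraction ~{sunlight_fraction:.2f}")
```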
Although little water is present in the atmosphere, there is enough to produce clouds of water ice and occasional snow and frost, often mixed with carbon dioxide (dry ice) snow. Landforms visible on Mars strongly suggest that liquid water has existed on the planet's surface. Huge linear swathes of scoured ground, known as outflow channels, cut across the surface in about 25 places. These are thought to be a record of erosion caused by the catastrophic release of water from subsurface aquifers, though some of these structures have been hypothesized to result from the action of glaciers or lava. One of the larger examples, Ma'adim Vallis, is 700 kilometres (430 mi) long, much longer than the Grand Canyon, with a width of 20 kilometres (12 mi) and a depth of 2 kilometres (1.2 mi) in places. It is thought to have been carved by flowing water early in Mars's history. The youngest of these channels is thought to have formed only a few million years ago. Elsewhere, particularly on the oldest areas of the Martian surface, finer-scale, dendritic networks of valleys are spread across significant proportions of the landscape. Features of these valleys and their distribution strongly imply that they were carved by runoff resulting from precipitation in early Mars history. Subsurface water flow and groundwater sapping may play important subsidiary roles in some networks, but precipitation was probably the root cause of the incision in almost all cases. Along craters and canyon walls, there are thousands of features that appear similar to terrestrial gullies. The gullies tend to be in the highlands of the Southern Hemisphere and face the Equator; all are poleward of 30° latitude. A number of authors have suggested that their formation process involves liquid water, probably from melting ice, although others have argued for formation mechanisms involving carbon dioxide frost or the movement of dry dust. 
No partially degraded gullies have formed by weathering and no superimposed impact craters have been observed, indicating that these are young features, possibly still active. Other geological features, such as deltas and alluvial fans preserved in craters, are further evidence for warmer, wetter conditions at an interval or intervals in earlier Mars history. Such conditions necessarily require the widespread presence of crater lakes across a large proportion of the surface, for which there is independent mineralogical, sedimentological and geomorphological evidence. Further evidence that liquid water once existed on the surface of Mars comes from the detection of specific minerals such as hematite and goethite, both of which sometimes form in the presence of water. The chemical signature of water vapor on Mars was first unequivocally demonstrated in 1963 by spectroscopy using an Earth-based telescope. In 2004, Opportunity detected the mineral jarosite. This forms only in the presence of acidic water, showing that water once existed on Mars. The Spirit rover found concentrated deposits of silica in 2007 that indicated wet conditions in the past, and in December 2011, the mineral gypsum, which also forms in the presence of water, was found on the surface by NASA's Mars rover Opportunity. It is estimated that the amount of water in the upper mantle of Mars, represented by hydroxyl ions contained within Martian minerals, is equal to or greater than that of Earth at 50–300 parts per million of water, which is enough to cover the entire planet to a depth of 200–1,000 metres (660–3,280 ft). On 18 March 2013, NASA reported evidence from instruments on the Curiosity rover of mineral hydration, likely hydrated calcium sulfate, in several rock samples including the broken fragments of "Tintina" rock and "Sutton Inlier" rock as well as in veins and nodules in other rocks like "Knorr" rock and "Wernicke" rock. 
Analysis using the rover's DAN instrument provided evidence of subsurface water, amounting to as much as 4% water content, down to a depth of 60 centimetres (24 in), during the rover's traverse from the Bradbury Landing site to the Yellowknife Bay area in the Glenelg terrain. In September 2015, NASA announced that they had found strong evidence of hydrated brine flows in recurring slope lineae, based on spectrometer readings of the darkened areas of slopes. These streaks flow downhill in Martian summer, when the temperature is above −23 °C, and freeze at lower temperatures. These observations supported earlier hypotheses, based on timing of formation and their rate of growth, that these dark streaks resulted from water flowing just below the surface. However, later work suggested that the lineae may be dry, granular flows instead, with at most a limited role for water in initiating the process. A definitive conclusion about the presence, extent, and role of liquid water on the Martian surface remains elusive. Researchers suspect much of the low northern plains of the planet were covered with an ocean hundreds of meters deep, though this theory remains controversial. In March 2015, scientists stated that such an ocean might have been the size of Earth's Arctic Ocean. This finding was derived from the ratio of protium to deuterium in the modern Martian atmosphere compared to that ratio on Earth. The amount of Martian deuterium (D/H = (9.3 ± 1.7) × 10⁻⁴) is five to seven times the amount on Earth (D/H = 1.56 × 10⁻⁴), suggesting that ancient Mars had significantly higher levels of water. Results from the Curiosity rover had previously found a high ratio of deuterium in Gale Crater, though not significantly high enough to suggest the former presence of an ocean. Other scientists caution that these results have not been confirmed, and point out that Martian climate models have not yet shown that the planet was warm enough in the past to support bodies of liquid water. 
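The "five to seven times" deuterium enrichment quoted above follows from dividing the Martian D/H value (with its stated uncertainty) by the terrestrial one:

```python
# Arithmetic behind the deuterium enrichment range, using the D/H values
# quoted in the text.
dh_mars = 9.3e-4      # Martian D/H ratio
dh_mars_err = 1.7e-4  # quoted uncertainty
dh_earth = 1.56e-4    # terrestrial D/H ratio

low = (dh_mars - dh_mars_err) / dh_earth   # lower bound of the enrichment
mid = dh_mars / dh_earth                   # central value, ~6x
high = (dh_mars + dh_mars_err) / dh_earth  # upper bound, ~7x
print(f"enrichment ~{low:.1f} to {high:.1f} times Earth's (central ~{mid:.1f})")
```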
Near the northern polar cap is the 81.4 kilometres (50.6 mi) wide Korolev Crater, which the Mars Express orbiter found to be filled with approximately 2,200 cubic kilometres (530 cu mi) of water ice. In November 2016, NASA reported finding a large amount of underground ice in the Utopia Planitia region. The volume of water detected has been estimated to be equivalent to the volume of water in Lake Superior (which is 12,100 cubic kilometers). During observations from 2018 through 2021, the ExoMars Trace Gas Orbiter spotted indications of water, probably subsurface ice, in the Valles Marineris canyon system. Orbital motion Mars's average distance from the Sun is roughly 230 million km (143 million mi), and its orbital period is 687 (Earth) days. The solar day (or sol) on Mars is only slightly longer than an Earth day: 24 hours, 39 minutes, and 35.244 seconds. A Martian year is equal to 1.8809 Earth years, or 1 year, 320 days, and 18.2 hours. The gravitational potential difference, and thus the delta-v needed to transfer between Mars and Earth, is the second lowest of any planet from Earth. The axial tilt of Mars is 25.19° relative to its orbital plane, which is similar to the axial tilt of Earth. As a result, Mars has seasons like Earth, though on Mars they are nearly twice as long because its orbital period is that much longer. In the present day, the orientation of the north pole of Mars is close to the star Deneb. Mars has a relatively pronounced orbital eccentricity of about 0.09; of the seven other planets in the Solar System, only Mercury has a larger orbital eccentricity. It is known that in the past, Mars has had a much more circular orbit. At one point, 1.35 million Earth years ago, Mars had an eccentricity of roughly 0.002, much less than that of Earth today. Mars's cycle of eccentricity is 96,000 Earth years compared to Earth's cycle of 100,000 years. Mars makes its closest approach to Earth (opposition) once every synodic period of 779.94 days. 
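The 779.94-day synodic period can be checked from the two orbital periods via the standard relation 1/S = 1/T_Earth − 1/T_Mars:

```python
# Synodic period of Mars as seen from Earth, from the sidereal orbital periods.
# T_mars is taken from the 1.8809-Earth-year figure quoted in the text.
T_earth = 365.256  # days, Earth's sidereal year
T_mars = 686.98    # days, Mars's sidereal year (1.8809 x 365.25)

S = 1.0 / (1.0 / T_earth - 1.0 / T_mars)  # ~780 days between oppositions
print(f"synodic period ~{S:.1f} days")
```

The same relation explains the "every 2.1 years" opposition cadence: 779.94 days is about 2.14 Earth years.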
It should not be confused with Mars conjunction, where the Earth and Mars are at opposite sides of the Solar System and form a straight line crossing the Sun. The average time between the successive oppositions of Mars, its synodic period, is 780 days; but the number of days between successive oppositions can range from 764 to 812. The distance at close approach varies between about 54 and 103 million km (34 and 64 million mi) due to the planets' elliptical orbits, which causes comparable variation in angular size. At their furthest Mars and Earth can be as far as 401 million km (249 million mi) apart. Mars comes into opposition from Earth every 2.1 years. The planets come into opposition near Mars's perihelion in 2003, 2018 and 2035, with the 2020 and 2033 events being particularly close to perihelic opposition. The mean apparent magnitude of Mars is +0.71 with a standard deviation of 1.05. Because the orbit of Mars is eccentric, the magnitude at opposition from the Sun can range from about −3.0 to −1.4. The minimum brightness is magnitude +1.86 when the planet is near aphelion and in conjunction with the Sun. At its brightest, Mars (along with Jupiter) is second only to Venus in apparent brightness. Mars usually appears distinctly yellow, orange, or red. When farthest away from Earth, it is more than seven times farther away than when it is closest. Mars is usually close enough for particularly good viewing once or twice at 15-year or 17-year intervals. Optical ground-based telescopes are typically limited to resolving features about 300 kilometres (190 mi) across when Earth and Mars are closest because of Earth's atmosphere. As Mars approaches opposition, it begins a period of retrograde motion, which means it will appear to move backwards in a looping curve with respect to the background stars. This retrograde motion lasts for about 72 days, and Mars reaches its peak apparent brightness in the middle of this interval. 
Moons Mars has two relatively small (compared to Earth's) natural moons, Phobos (about 22 km (14 mi) in diameter) and Deimos (about 12 km (7.5 mi) in diameter), which orbit at 9,376 km (5,826 mi) and 23,460 km (14,580 mi) around the planet. The origin of both moons is unclear, although a popular theory states that they were asteroids captured into Martian orbit. Both satellites were discovered in 1877 by Asaph Hall and were named after the characters Phobos (the deity of panic and fear) and Deimos (the deity of terror and dread), twins from Greek mythology who accompanied their father Ares, god of war, into battle. Mars was the Roman equivalent to Ares. In modern Greek, the planet retains its ancient name Ares (Aris: Άρης). From the surface of Mars, the motions of Phobos and Deimos appear different from that of the Earth's satellite, the Moon. Phobos rises in the west, sets in the east, and rises again in just 11 hours. Deimos, being only just outside synchronous orbit – where the orbital period would match the planet's period of rotation – rises as expected in the east, but slowly. Because the orbit of Phobos is below a synchronous altitude, tidal forces from Mars are gradually lowering its orbit. In about 50 million years, it could either crash into Mars's surface or break up into a ring structure around the planet. The origin of the two satellites is not well understood. Their low albedo and carbonaceous chondrite composition have been regarded as similar to asteroids, supporting a capture theory. The unstable orbit of Phobos would seem to point toward a relatively recent capture. But both have circular orbits near the equator, which is unusual for captured objects, and the required capture dynamics are complex. Accretion early in the history of Mars is plausible, but would not account for a composition resembling asteroids rather than Mars itself, if that is confirmed. 
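The "rises again in just 11 hours" figure for Phobos follows from the same synodic reasoning: Phobos orbits faster than Mars rotates, so its apparent period relative to an observer on the surface is given by 1/P_apparent = 1/P_orbit − 1/P_rotation. The 7.66-hour orbital period of Phobos used below is an assumed standard value, not stated in the text:

```python
# Apparent (rise-to-rise) period of Phobos for an observer on the Martian
# surface. Because Phobos orbits faster than Mars rotates, it rises in the
# west and sets in the east, as described in the text.
P_orbit = 7.66      # hours, Phobos's orbital period (assumed value)
P_rotation = 24.66  # hours, length of a sol (24 h 39.6 min, from the text)

P_apparent = 1.0 / (1.0 / P_orbit - 1.0 / P_rotation)  # ~11 hours
print(f"Phobos rise-to-rise interval ~{P_apparent:.1f} hours")
```

For Deimos, whose ~30.3-hour orbit lies just outside the synchronous altitude, the same formula gives a very long apparent period, which is why it rises in the east "but slowly".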
Mars may have yet-undiscovered moons, smaller than 50 to 100 metres (160 to 330 ft) in diameter, and a dust ring is predicted to exist between Phobos and Deimos. A third possibility for their origin as satellites of Mars is the involvement of a third body or a type of impact disruption. More-recent lines of evidence for Phobos having a highly porous interior, and suggesting a composition containing mainly phyllosilicates and other minerals known from Mars, point toward an origin of Phobos from material ejected by an impact on Mars that reaccreted in Martian orbit, similar to the prevailing theory for the origin of Earth's satellite. Although the visible and near-infrared (VNIR) spectra of the moons of Mars resemble those of outer-belt asteroids, the thermal infrared spectra of Phobos are reported to be inconsistent with chondrites of any class. It is also possible that Phobos and Deimos were fragments of an older moon, formed by debris from a large impact on Mars, and then destroyed by a more recent impact upon the satellite. More recently, a study conducted by a team of researchers from multiple countries suggests that a lost moon, at least fifteen times the size of Phobos, may have existed in the past. By analyzing rocks which point to tidal processes on the planet, it is possible that these tides may have been regulated by a past moon. Human observations and exploration The history of observations of Mars is marked by oppositions of Mars when the planet is closest to Earth and hence is most easily visible, which occur every couple of years. Even more notable are the perihelic oppositions of Mars, which are distinguished because Mars is close to perihelion, making it even closer to Earth. The ancient Sumerians named Mars Nergal, the god of war and plague. During Sumerian times, Nergal was a minor deity of little significance, but, during later times, his main cult center was the city of Nineveh. 
In Mesopotamian texts, Mars is referred to as the "star of judgement of the fate of the dead". The existence of Mars as a wandering object in the night sky was also recorded by the ancient Egyptian astronomers and, by 1534 BCE, they were familiar with the retrograde motion of the planet. By the period of the Neo-Babylonian Empire, the Babylonian astronomers were making regular records of the positions of the planets and systematic observations of their behavior. For Mars, they knew that the planet made 37 synodic periods, or 42 circuits of the zodiac, every 79 years. They invented arithmetic methods for making minor corrections to the predicted positions of the planets. In Ancient Greece, the planet was known as Πυρόεις. Commonly, the Greek name for the planet now referred to as Mars was Ares. It was the Romans who named the planet Mars, for their god of war, often represented by the sword and shield of the planet's namesake. In the fourth century BCE, Aristotle noted that Mars disappeared behind the Moon during an occultation, indicating that the planet was farther away. Ptolemy, a Greek living in Alexandria, attempted to address the problem of the orbital motion of Mars. Ptolemy's model and his collective work on astronomy were presented in the multi-volume collection later called the Almagest (from the Arabic for "greatest"), which became the authoritative treatise on Western astronomy for the next fourteen centuries. Literature from ancient China confirms that Mars was known by Chinese astronomers by no later than the fourth century BCE. In East Asian cultures, Mars is traditionally referred to as the "fire star" (火星) based on the Wuxing system. In 1609, Johannes Kepler published a 10-year study of the Martian orbit, using the diurnal parallax of Mars, measured by Tycho Brahe, to make a preliminary calculation of the relative distance to the planet.
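The Babylonian period relation — 37 synodic periods and 42 zodiac circuits in 79 years — can be checked against modern values. A quick arithmetic sketch:

```python
# Check the Babylonian period relation for Mars: 37 synodic periods and
# 42 circuits of the zodiac (sidereal revolutions) in 79 years.
DAYS_PER_YEAR = 365.25

synodic = 79 * DAYS_PER_YEAR / 37    # implied synodic period, days
sidereal = 79 / 42                   # implied sidereal period, years
print(f"Implied synodic period: {synodic:.1f} days")      # modern value: ~780
print(f"Implied sidereal period: {sidereal:.3f} years")   # modern value: ~1.88
```

Both implied periods agree with modern measurements to better than one part in a thousand, which shows how accurate these arithmetic schemes were.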
From Brahe's observations of Mars, Kepler deduced that the planet orbited the Sun not in a circle, but in an ellipse. Moreover, Kepler showed that Mars sped up as it approached the Sun and slowed down as it moved farther away, in a manner that later physicists would explain as a consequence of the conservation of angular momentum.: 433–437 In 1610 the first use of a telescope for astronomical observation, including Mars, was performed by Italian astronomer Galileo Galilei. With the telescope the diurnal parallax of Mars was again measured in an effort to determine the Sun-Earth distance. This was first performed by Giovanni Domenico Cassini in 1672. The early parallax measurements were hampered by the quality of the instruments. The only occultation of Mars by Venus observed was that of 13 October 1590, seen by Michael Maestlin at Heidelberg. By the 19th century, the resolution of telescopes reached a level sufficient for surface features to be identified. On 5 September 1877, a perihelic opposition to Mars occurred. The Italian astronomer Giovanni Schiaparelli used a 22-centimetre (8.7 in) telescope in Milan to help produce the first detailed map of Mars. These maps notably contained features he called canali, which, with the possible exception of the natural canyon Valles Marineris, were later shown to be an optical illusion. These canali were supposedly long, straight lines on the surface of Mars, to which he gave names of famous rivers on Earth. His term, which means "channels" or "grooves", was popularly mistranslated in English as "canals". Influenced by the observations, the orientalist Percival Lowell founded an observatory which had 30- and 45-centimetre (12- and 18-in) telescopes. The observatory was used for the exploration of Mars during the last good opportunity in 1894, and the following less favorable oppositions. He published several books on Mars and life on the planet, which had a great influence on the public. 
The canali were independently observed by other astronomers, like Henri Joseph Perrotin and Louis Thollon in Nice, using one of the largest telescopes of that time. The seasonal changes (consisting of the diminishing of the polar caps and the dark areas formed during Martian summers) in combination with the canals led to speculation about life on Mars, and it was a long-held belief that Mars contained vast seas and vegetation. As bigger telescopes were used, fewer long, straight canali were observed. During observations in 1909 by Antoniadi with an 84-centimetre (33 in) telescope, irregular patterns were observed, but no canali were seen. The first spacecraft from Earth to visit Mars was Mars 1 of the Soviet Union, which flew by in 1963, but contact was lost en route. NASA's Mariner 4 followed and became the first spacecraft to successfully transmit from Mars; launched on 28 November 1964, it made its closest approach to the planet on 15 July 1965. Mariner 4 detected the weak Martian radiation belt, measured at about 0.1% that of Earth, and captured the first images of another planet from deep space. Once spacecraft visited the planet during the 1960s and 1970s, many previous concepts of Mars were radically broken. After the results of the Viking life-detection experiments, the hypothesis of a dead planet was generally accepted. The data from Mariner 9 and Viking allowed better maps of Mars to be made. Until 1997 and after Viking 1 shut down in 1982, Mars was only visited by three unsuccessful probes, two flying past without contact (Phobos 1, 1988; Mars Observer, 1993), and one (Phobos 2 1989) malfunctioning in orbit before reaching its destination Phobos. In 1997 Mars Pathfinder became the first successful rover mission beyond the Moon and started together with Mars Global Surveyor (operated until late 2006) an uninterrupted active robotic presence at Mars that has lasted until today. 
It produced complete, extremely detailed maps of the Martian topography, magnetic field and surface minerals. Starting with these missions, a range of new, improved crewless spacecraft, including orbiters, landers, and rovers, has been sent to Mars, with successful missions by NASA (United States), JAXA (Japan), ESA (Europe), the United Kingdom, ISRO (India), Roscosmos (Russia), the United Arab Emirates, and CNSA (China) to study the planet's surface, climate, and geology, uncovering the history and dynamics of the hydrosphere of Mars and possible traces of ancient life. As of 2023, Mars is host to ten functioning spacecraft. Eight are in orbit: 2001 Mars Odyssey, Mars Express, Mars Reconnaissance Orbiter, MAVEN, ExoMars Trace Gas Orbiter, the Hope orbiter, and the Tianwen-1 orbiter. Another two are on the surface: the Mars Science Laboratory Curiosity rover and the Perseverance rover. Collected maps are available online at websites including Google Mars. NASA provides two online tools: Mars Trek, which provides visualizations of the planet using data from 50 years of exploration, and Experience Curiosity, which simulates traveling on Mars in 3-D with Curiosity. As of February 2024, debris from these missions has reached over seven tons. Most of it consists of crashed and inactive spacecraft as well as discarded components. In April 2024, NASA selected several companies to begin studies on providing commercial services to further enable robotic science on Mars. Key areas include establishing telecommunications, payload delivery and surface imaging. Habitability and habitation During the late 19th century, it was widely accepted in the astronomical community that Mars had life-supporting qualities, including the presence of oxygen and water. However, in 1894 W. W.
Campbell at Lick Observatory observed the planet and found that "if water vapor or oxygen occur in the atmosphere of Mars it is in quantities too small to be detected by spectroscopes then available". That observation contradicted many of the measurements of the time and was not widely accepted. Campbell and V. M. Slipher repeated the study in 1909 using better instruments, but with the same results. It was not until the findings were confirmed by W. S. Adams in 1925 that the myth of the Earth-like habitability of Mars was finally broken. However, even in the 1960s, articles were published on Martian biology, putting aside explanations other than life for the seasonal changes on Mars. The current understanding of planetary habitability – the ability of a world to develop environmental conditions favorable to the emergence of life – favors planets that have liquid water on their surface. Most often this requires the orbit of a planet to lie within the habitable zone, which for the Sun is estimated to extend from within the orbit of Earth to about that of Mars. During perihelion, Mars dips inside this region, but Mars's thin (low-pressure) atmosphere prevents liquid water from existing over large regions for extended periods. The past flow of liquid water demonstrates the planet's potential for habitability. Recent evidence has suggested that any water on the Martian surface may have been too salty and acidic to support regular terrestrial life. The environmental conditions on Mars are a challenge to sustaining organic life: the planet has little heat transfer across its surface, it has poor insulation against bombardment by the solar wind due to the absence of a magnetosphere and has insufficient atmospheric pressure to retain water in a liquid form (water instead sublimes to a gaseous state). 
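The marginal position of Mars relative to the habitable zone can be illustrated with the standard radiative-equilibrium estimate. A sketch, where the 279 K reference temperature and Mars's Bond albedo of about 0.25 are assumed standard values, not figures from the text:

```python
import math

# Radiative equilibrium temperature: T = T_ref * (1 - A)**0.25 / sqrt(d),
# with d the orbital distance in AU and A the Bond albedo.
T_REF = 279.0       # K, blackbody equilibrium temperature at 1 AU
albedo_mars = 0.25  # approximate Bond albedo of Mars
d_mars = 1.524      # AU, semi-major axis of Mars's orbit

T_eq = T_REF * (1 - albedo_mars) ** 0.25 / math.sqrt(d_mars)
print(f"Mars equilibrium temperature: {T_eq:.0f} K")  # ~210 K
```

The result, roughly 210 K, is well below the freezing point of water, consistent with the text's point that the thin atmosphere cannot sustain liquid water over large regions.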
Mars is nearly, or perhaps totally, geologically dead; the end of volcanic activity has apparently stopped the recycling of chemicals and minerals between the surface and interior of the planet. Evidence suggests that the planet was once significantly more habitable than it is today, but whether living organisms ever existed there remains unknown. The Viking probes of the mid-1970s carried experiments designed to detect microorganisms in Martian soil at their respective landing sites and had positive results, including a temporary increase in CO2 production on exposure to water and nutrients. This sign of life was later disputed by scientists, resulting in a continuing debate, with NASA scientist Gilbert Levin asserting that Viking may have found life. A 2014 analysis of Martian meteorite EETA79001 found chlorate, perchlorate, and nitrate ions in sufficiently high concentrations to suggest that they are widespread on Mars. UV and X-ray radiation would turn chlorate and perchlorate ions into other, highly reactive oxychlorines, indicating that any organic molecules would have to be buried under the surface to survive. Small quantities of methane and formaldehyde detected by Mars orbiters are both claimed to be possible evidence for life, as these chemical compounds would quickly break down in the Martian atmosphere. Alternatively, these compounds may instead be replenished by volcanic or other geological means, such as serpentinite. Impact glass, formed by the impact of meteors, which on Earth can preserve signs of life, has also been found on the surface of the impact craters on Mars. Likewise, the glass in impact craters on Mars could have preserved signs of life, if life existed at the site. The Cheyava Falls rock discovered on Mars in June 2024 has been designated by NASA as a "potential biosignature" and was core sampled by the Perseverance rover for possible return to Earth and further examination. 
Although highly intriguing, no definitive final determination on a biological or abiotic origin of this rock can be made with the data currently available. Several plans for a human mission to Mars have been proposed, but none have come to fruition. The NASA Authorization Act of 2017 directed NASA to study the feasibility of a crewed Mars mission in the early 2030s; the resulting report concluded that this would be unfeasible. In addition, in 2021, China was planning to send a crewed Mars mission in 2033. Privately held companies such as SpaceX have also proposed plans to send humans to Mars, with the eventual goal to settle on the planet. As of 2024, SpaceX has proceeded with the development of the Starship launch vehicle with the goal of Mars colonization. In plans shared with the company in April 2024, Elon Musk envisions the beginning of a Mars colony within the next twenty years. This would be enabled by the planned mass manufacturing of Starship and initially sustained by resupply from Earth, and in situ resource utilization on Mars, until the Mars colony reaches full self sustainability. Any future human mission to Mars will likely take place within the optimal Mars launch window, which occurs every 26 months. The moon Phobos has been proposed as an anchor point for a space elevator. Besides national space agencies and space companies, groups such as the Mars Society and The Planetary Society advocate for human missions to Mars. In culture Mars is named after the Roman god of war (Greek Ares), but was also associated with the demi-god Heracles (Roman Hercules) by ancient Greek astronomers, as detailed by Aristotle. This association between Mars and war dates back at least to Babylonian astronomy, in which the planet was named for the god Nergal, deity of war and destruction. It persisted into modern times, as exemplified by Gustav Holst's orchestral suite The Planets, whose famous first movement labels Mars "The Bringer of War". 
The planet's symbol, a circle with a spear pointing out to the upper right, is also used as a symbol for the male gender. The symbol dates from at least the 11th century, though a possible predecessor has been found in the Greek Oxyrhynchus Papyri. The idea that Mars was populated by intelligent Martians became widespread in the late 19th century. Schiaparelli's "canali" observations combined with Percival Lowell's books on the subject put forward the standard notion of a planet that was a drying, cooling, dying world with ancient civilizations constructing irrigation works. Many other observations and proclamations by notable personalities added to what has been termed "Mars Fever". In the present day, high-resolution mapping of the surface of Mars has revealed no artifacts of habitation, but pseudoscientific speculation about intelligent life on Mars still continues. Reminiscent of the canali observations, these speculations are based on small scale features perceived in the spacecraft images, such as "pyramids" and the "Face on Mars". In his book Cosmos, planetary astronomer Carl Sagan wrote: "Mars has become a kind of mythic arena onto which we have projected our Earthly hopes and fears." The depiction of Mars in fiction has been stimulated by its dramatic red color and by nineteenth-century scientific speculations that its surface conditions might support not just life but intelligent life. This gave way to many science fiction stories involving these concepts, such as H. G. Wells's The War of the Worlds, in which Martians seek to escape their dying planet by invading Earth; Ray Bradbury's The Martian Chronicles, in which human explorers accidentally destroy a Martian civilization; as well as Edgar Rice Burroughs's series Barsoom, C. S. Lewis's novel Out of the Silent Planet (1938), and a number of Robert A. Heinlein stories before the mid-sixties. Since then, depictions of Martians have also extended to animation. 
A comic figure of an intelligent Martian, Marvin the Martian, appeared in Haredevil Hare (1948) as a character in the Looney Tunes animated cartoons of Warner Brothers, and has continued as part of popular culture to the present. After the Mariner and Viking spacecraft had returned pictures of Mars as a lifeless and canal-less world, these ideas about Mars were abandoned; for many science-fiction authors, the new discoveries initially seemed like a constraint, but eventually the post-Viking knowledge of Mars became itself a source of inspiration for works like Kim Stanley Robinson's Mars trilogy.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Black_hole#cite_note-39] | [TOKENS: 13839] |
Contents Black hole A black hole is an astronomical body so compact that its gravity prevents anything, including light, from escaping. Albert Einstein's theory of general relativity predicts that a sufficiently compact mass will form a black hole. The boundary of no escape is called the event horizon. In general relativity, a black hole's event horizon seals an object's fate but produces no locally detectable change when crossed. General relativity also predicts that every black hole should have a central singularity, where the curvature of spacetime is infinite. In many ways, a black hole acts like an ideal black body, as it reflects no light. Quantum field theory in curved spacetime predicts that event horizons emit Hawking radiation, with the same spectrum as a black body of a temperature inversely proportional to its mass. This temperature is of the order of billionths of a kelvin for stellar black holes, making it essentially impossible to observe directly. Objects whose gravitational fields are too strong for light to escape were first considered in the 18th century by John Michell and Pierre-Simon Laplace. In 1916, Karl Schwarzschild found the first modern solution of general relativity that would characterise a black hole. Due to his influential research, the Schwarzschild metric is named after him. David Finkelstein, in 1958, first interpreted Schwarzschild's model as a region of space from which nothing can escape. Black holes were long considered a mathematical curiosity; it was not until the 1960s that theoretical work showed they were a generic prediction of general relativity. The first black hole known was Cygnus X-1, identified by several researchers independently in 1971. Black holes typically form when massive stars collapse at the end of their life cycle. After a black hole has formed, it can grow by absorbing mass from its surroundings. 
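The claim that the Hawking temperature is "of the order of billionths of a kelvin for stellar black holes" follows directly from the standard formula T = ħc³/(8πGMk_B). A numeric sketch with standard SI constants:

```python
import math

# Hawking temperature: T = hbar * c^3 / (8 * pi * G * M * k_B),
# inversely proportional to the black hole's mass, as stated in the text.
HBAR = 1.0546e-34   # J s, reduced Planck constant
C = 2.998e8         # m/s, speed of light
G = 6.674e-11       # m^3 kg^-1 s^-2, gravitational constant
K_B = 1.381e-23     # J/K, Boltzmann constant
M_SUN = 1.989e30    # kg, solar mass

def hawking_temperature(mass_kg):
    return HBAR * C**3 / (8 * math.pi * G * mass_kg * K_B)

print(f"1 solar mass: {hawking_temperature(M_SUN):.2e} K")  # ~6e-8 K
```

A solar-mass black hole comes out at roughly 60 nanokelvin, far colder than the cosmic microwave background, which is why direct observation is essentially impossible.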
Supermassive black holes of millions of solar masses may form by absorbing other stars and merging with other black holes, or via direct collapse of gas clouds. There is consensus that supermassive black holes exist in the centres of most galaxies. The presence of a black hole can be inferred through its interaction with other matter and with electromagnetic radiation such as visible light. Matter falling toward a black hole can form an accretion disk of infalling plasma, heated by friction and emitting light. In extreme cases, this creates a quasar, some of the brightest objects in the universe. Merging black holes can also be detected by observation of the gravitational waves they emit. If other stars are orbiting a black hole, their orbits can be used to determine the black hole's mass and location. Such observations can be used to exclude possible alternatives such as neutron stars. In this way, astronomers have identified numerous stellar black hole candidates in binary systems and established that the radio source known as Sagittarius A*, at the core of the Milky Way galaxy, contains a supermassive black hole of about 4.3 million solar masses. History The idea of a body so massive that even light could not escape was first proposed in the late 18th century by English astronomer and clergyman John Michell and independently by French scientist Pierre-Simon Laplace. Both scholars proposed very large stars in contrast to the modern concept of an extremely dense object. Michell's idea, in a short part of a letter published in 1784, calculated that a star with the same density but 500 times the radius of the sun would not let any emitted light escape; the surface escape velocity would exceed the speed of light.: 122 Michell correctly hypothesized that such supermassive but non-radiating bodies might be detectable through their gravitational effects on nearby visible bodies. 
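Michell's calculation can be reproduced with the Newtonian escape-velocity formula. A sketch, using standard solar values as inputs:

```python
import math

# Escape velocity: v = sqrt(2GM/r). At fixed density, M scales as r^3,
# so v grows linearly with radius; Michell's star has the Sun's density
# at 500 times the Sun's radius.
G = 6.674e-11       # m^3 kg^-1 s^-2
M_SUN = 1.989e30    # kg
R_SUN = 6.957e8     # m
C = 2.998e8         # m/s

v_sun = math.sqrt(2 * G * M_SUN / R_SUN)   # ~618 km/s at the Sun's surface
v_michell = 500 * v_sun                    # linear scaling at fixed density
print(f"Escape velocity: {v_michell:.2e} m/s = {v_michell / C:.2f}c")
```

At 500 solar radii and solar density the escape velocity just exceeds the speed of light, which is exactly the threshold Michell identified.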
In 1796, Laplace mentioned that a star could be invisible if it were sufficiently large while speculating on the origin of the Solar System in his book Exposition du Système du Monde. Franz Xaver von Zach asked Laplace for a mathematical analysis, which Laplace provided and published in a journal edited by von Zach. In 1905, Albert Einstein showed that the laws of electromagnetism would be invariant under a Lorentz transformation: they would be identical for observers travelling at different velocities relative to each other. This discovery became known as the principle of special relativity. Although the laws of mechanics had already been shown to be invariant, gravity remained yet to be included.: 19 In 1907, Einstein published a paper proposing his equivalence principle, the hypothesis that inertial mass and gravitational mass have a common cause. Using the principle, Einstein predicted the redshift and half of the lensing effect of gravity on light; the full prediction of gravitational lensing required development of general relativity.: 19 By 1915, Einstein refined these ideas into his general theory of relativity, which explained how matter affects spacetime, which in turn affects the motion of other matter. This formed the basis for black hole physics. Only a few months after Einstein published the field equations describing general relativity, astrophysicist Karl Schwarzschild set out to apply the idea to stars. He assumed spherical symmetry with no spin and found a solution to Einstein's equations.: 124 A few months after Schwarzschild, Johannes Droste, a student of Hendrik Lorentz, independently gave the same solution. At a certain radius from the center of the mass, the Schwarzschild solution became singular, meaning that some of the terms in the Einstein equations became infinite. The nature of this radius, which later became known as the Schwarzschild radius, was not understood at the time. 
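The radius at which Schwarzschild's solution becomes singular in these coordinates is given by the standard formula r_s = 2GM/c². A numeric sketch (the Sagittarius A* mass of about 4.3 million solar masses is quoted earlier in this article):

```python
# Schwarzschild radius: r_s = 2GM/c^2. Constants are standard SI values.
G = 6.674e-11      # m^3 kg^-1 s^-2
C = 2.998e8        # m/s
M_SUN = 1.989e30   # kg

def schwarzschild_radius(mass_kg):
    return 2 * G * mass_kg / C**2

print(f"Sun: {schwarzschild_radius(M_SUN) / 1e3:.2f} km")  # ~2.95 km
print(f"Sgr A* (4.3e6 M_sun): "
      f"{schwarzschild_radius(4.3e6 * M_SUN) / 1e9:.1f} million km")
```

For the Sun this is about 3 km, which makes concrete how extreme the compression contemplated by Eddington's critics really was.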
Many physicists of the early 20th century were skeptical of the existence of black holes. In a 1926 popular science book, Arthur Eddington critiqued the idea of a star with mass compressed to its Schwarzschild radius as a flaw in the then-poorly-understood theory of general relativity.: 134 In 1939, Einstein himself used his theory of general relativity in an attempt to prove that black holes were impossible. His work relied on increasing pressure or increasing centrifugal force balancing the force of gravity so that the object would not collapse beyond its Schwarzschild radius. He missed the possibility that implosion would drive the system below this critical value.: 135 By the 1920s, astronomers had classified a number of white dwarf stars as too cool and dense to be explained by the gradual cooling of ordinary stars. In 1926, Ralph Fowler showed that quantum-mechanical degeneracy pressure was larger than thermal pressure at these densities.: 145 In 1931, Subrahmanyan Chandrasekhar calculated that a non-rotating body of electron-degenerate matter below a certain limiting mass is stable, and by 1934 he showed that this explained the catalog of white dwarf stars.: 151 When Chandrasekhar announced his results, Eddington pointed out that stars above this limit would radiate until they were sufficiently dense to prevent light from exiting, a conclusion he considered absurd. Eddington and, later, Lev Landau argued that some yet unknown mechanism would stop the collapse. In the 1930s, Fritz Zwicky and Walter Baade studied stellar novae, focusing on exceptionally bright ones they called supernovae. Zwicky promoted the idea that supernovae produced stars with the density of atomic nuclei—neutron stars—but this idea was largely ignored.: 171 In 1939, based on Chandrasekhar's reasoning, J. 
Robert Oppenheimer and George Volkoff predicted that neutron stars below a certain mass limit, later called the Tolman–Oppenheimer–Volkoff limit, would be stable due to neutron degeneracy pressure. Above that limit, they reasoned that either their model would not apply or that gravitational contraction would not stop.: 380 John Archibald Wheeler and two of his students resolved questions about the model behind the Tolman–Oppenheimer–Volkoff (TOV) limit. Harrison and Wheeler developed the equations of state relating density to pressure for cold matter all the way through electron degeneracy and neutron degeneracy. Masami Wakano and Wheeler then used the equations to compute the equilibrium curve for stars, relating mass to circumference. They found no additional features that would invalidate the TOV limit. This meant that the only thing that could prevent black holes from forming was a dynamic process ejecting sufficient mass from a star as it cooled.: 205 The modern concept of black holes was formulated by Robert Oppenheimer and his student Hartland Snyder in 1939.: 80 In the paper, Oppenheimer and Snyder solved Einstein's equations of general relativity for an idealized imploding star, in a model later called the Oppenheimer–Snyder model, then described the results from far outside the star. The implosion starts as one might expect: the star material rapidly collapses inward. However, as the density of the star increases, gravitational time dilation increases and the collapse, viewed from afar, seems to slow down further and further until the star reaches its Schwarzschild radius, where it appears frozen in time.: 217 In 1958, David Finkelstein identified the Schwarzschild surface as an event horizon, calling it "a perfect unidirectional membrane: causal influences can cross it in only one direction". In this sense, events that occur inside of the black hole cannot affect events that occur outside of the black hole. 
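The "frozen in time" appearance described above comes from the Schwarzschild time-dilation factor for a static observer, √(1 − r_s/r), which goes to zero as the surface approaches the Schwarzschild radius. A sketch in units of r_s:

```python
import math

# Gravitational time dilation outside a Schwarzschild body:
# dtau/dt = sqrt(1 - r_s/r). Here r is expressed in Schwarzschild radii,
# so the factor vanishes as r -> 1, freezing the collapse for a distant observer.
def time_dilation(r_over_rs):
    return math.sqrt(1.0 - 1.0 / r_over_rs)

for r in (10.0, 2.0, 1.1, 1.001):
    print(f"r = {r:>6} r_s: dtau/dt = {time_dilation(r):.4f}")
```

Each step closer to r_s slows the distant-observer clock rate further without bound, matching Oppenheimer and Snyder's far-away description of the implosion.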
Finkelstein created a new reference frame to include the point of view of infalling observers.: 103 Finkelstein's new frame of reference allowed events at the surface of an imploding star to be related to events far away. By 1962 the two points of view were reconciled, convincing many skeptics that implosion into a black hole made physical sense.: 226 The era from the mid-1960s to the mid-1970s was the "golden age of black hole research", when general relativity and black holes became mainstream subjects of research.: 258 In this period, more general black hole solutions were found. In 1963, Roy Kerr found the exact solution for a rotating black hole. Two years later, Ezra Newman found the axisymmetric solution for a black hole that is both rotating and electrically charged. In 1967, Werner Israel found that the Schwarzschild solution was the only possible solution for a nonspinning, uncharged black hole, meaning that a Schwarzschild black hole would be defined by its mass alone. Similar identities were later found for Reissner–Nordström and Kerr black holes, defined only by their mass and their charge or spin respectively. Together, these findings became known as the no-hair theorem, which states that a stationary black hole is completely described by the three parameters of the Kerr–Newman metric: mass, angular momentum, and electric charge. At first, it was suspected that the strange mathematical singularities found in each of the black hole solutions only appeared due to the assumption that a black hole would be perfectly spherically symmetric, and therefore the singularities would not appear in generic situations where black holes would not necessarily be symmetric. This view was held in particular by Vladimir Belinski, Isaak Khalatnikov, and Evgeny Lifshitz, who tried to prove that no singularities appear in generic solutions, although they would later reverse their positions.
However, in 1965, Roger Penrose proved that general relativity without quantum mechanics requires that singularities appear in all black holes. Astronomical observations also made great strides during this era. In 1967, Antony Hewish and Jocelyn Bell Burnell discovered pulsars and by 1969, these were shown to be rapidly rotating neutron stars. Until that time, neutron stars, like black holes, were regarded as just theoretical curiosities, but the discovery of pulsars showed their physical relevance and spurred a further interest in all types of compact objects that might be formed by gravitational collapse. Based on observations in Greenwich and Toronto in the early 1970s, Cygnus X-1, a galactic X-ray source discovered in 1964, became the first astronomical object commonly accepted to be a black hole. Work by James Bardeen, Jacob Bekenstein, Carter, and Hawking in the early 1970s led to the formulation of black hole thermodynamics. These laws describe the behaviour of a black hole in close analogy to the laws of thermodynamics by relating mass to energy, area to entropy, and surface gravity to temperature. The analogy was completed: 442 when Hawking, in 1974, showed that quantum field theory implies that black holes should radiate like a black body with a temperature proportional to the surface gravity of the black hole, predicting the effect now known as Hawking radiation. While Cygnus X-1, a stellar-mass black hole, was generally accepted by the scientific community as a black hole by the end of 1973, it would be decades before a supermassive black hole would gain the same broad recognition. Although, as early as the 1960s, physicists such as Donald Lynden-Bell and Martin Rees had suggested that powerful quasars in the center of galaxies were powered by accreting supermassive black holes, little observational proof existed at the time. 
However, the Hubble Space Telescope, launched decades later, found that supermassive black holes were not only present in these active galactic nuclei, but that supermassive black holes in the center of galaxies were ubiquitous: Almost every galaxy had a supermassive black hole at its center, many of which were quiescent. In 1999, David Merritt proposed the M–sigma relation, which related the dispersion of the velocity of matter in the center bulge of a galaxy to the mass of the supermassive black hole at its core. Subsequent studies confirmed this correlation. Around the same time, based on telescope observations of the velocities of stars at the center of the Milky Way galaxy, independent work groups led by Andrea Ghez and Reinhard Genzel concluded that the compact radio source in the center of the galaxy, Sagittarius A*, was likely a supermassive black hole. On 11 February 2016, the LIGO Scientific Collaboration and Virgo Collaboration announced the first direct detection of gravitational waves, named GW150914, representing the first observation of a black hole merger. At the time of the merger, the black holes were approximately 1.4 billion light-years away from Earth and had masses of 30 and 35 solar masses.: 6 In 2017, Rainer Weiss, Kip Thorne, and Barry Barish, who had spearheaded the project, were awarded the Nobel Prize in Physics for their work. Since the initial discovery in 2015, hundreds more gravitational waves have been observed by LIGO and another interferometer, Virgo. On 10 April 2019, the first direct image of a black hole and its vicinity was published, following observations made by the Event Horizon Telescope (EHT) in 2017 of the supermassive black hole in Messier 87's galactic centre. In 2022, the Event Horizon Telescope collaboration released an image of the black hole in the center of the Milky Way galaxy, Sagittarius A*; The data had been collected in 2017. In 2020, the Nobel Prize in Physics was awarded for work on black holes. 
Andrea Ghez and Reinhard Genzel shared one-half for their discovery that Sagittarius A* is a supermassive black hole. Penrose received the other half for his work showing that the mathematics of general relativity requires the formation of black holes. Cosmologists lamented that Hawking's extensive theoretical work on black holes would not be honored, since he died in 2018. In December 1967, a student reportedly suggested the phrase black hole at a lecture by John Wheeler; Wheeler adopted the term for its brevity and "advertising value", and Wheeler's stature in the field ensured it quickly caught on, leading some to credit Wheeler with coining the phrase. However, the term was used by others around that time. Science writer Marcia Bartusiak traces the term black hole to physicist Robert H. Dicke, who in the early 1960s reportedly compared the phenomenon to the Black Hole of Calcutta, notorious as a prison where people entered but never left alive. The term was used in print by Life and Science News magazines in 1963, and by science journalist Ann Ewing in her article "'Black Holes' in Space", dated 18 January 1964, which was a report on a meeting of the American Association for the Advancement of Science held in Cleveland, Ohio. Definition A black hole is generally defined as a region of spacetime from which no information-carrying signals or objects can escape. However, verifying by this definition that an object is a black hole would require waiting for an infinite time, at an infinite distance from the black hole, to confirm that nothing has escaped, so the definition cannot be used to identify a physical black hole. Broadly, physicists do not have a precisely agreed-upon definition of a black hole. Among astrophysicists, a black hole is commonly taken to be a compact object with a mass larger than about four solar masses. A black hole may also be defined as a reservoir of information, or as a region where space is falling inwards faster than the speed of light.
Properties The no-hair theorem postulates that, once it achieves a stable condition after formation, a black hole has only three independent physical properties: mass, electric charge, and angular momentum; the black hole is otherwise featureless. If the conjecture is true, any two black holes that share the same values for these properties, or parameters, are indistinguishable from one another. The degree to which the conjecture holds for real black holes is currently an unsolved problem. The simplest static black holes have mass but neither electric charge nor angular momentum. According to Birkhoff's theorem, these Schwarzschild black holes are the only vacuum solution that is spherically symmetric. Solutions describing more general black holes also exist. Non-rotating charged black holes are described by the Reissner–Nordström metric, while the Kerr metric describes an uncharged rotating black hole. The most general stationary black hole solution known is the Kerr–Newman metric, which describes a black hole with both charge and angular momentum. Contrary to the popular notion of a black hole "sucking in everything" in its surroundings, from far away the external gravitational field of a black hole is identical to that of any other body of the same mass. While a black hole can theoretically have any positive mass, the charge and angular momentum are constrained by the mass. The total electric charge Q and the total angular momentum J are expected to satisfy the inequality {\displaystyle {\frac {Q^{2}}{4\pi \epsilon _{0}}}+{\frac {c^{2}J^{2}}{GM^{2}}}\leq GM^{2}} for a black hole of mass M. Black holes with the maximum possible charge or spin satisfying this inequality are called extremal black holes. Solutions of Einstein's equations that violate this inequality exist, but they do not possess an event horizon.
These are so-called naked singularities that can be observed from the outside. Because these singularities would make the universe inherently unpredictable, many physicists believe they cannot exist. The weak cosmic censorship hypothesis, proposed by Sir Roger Penrose, rules out the formation of such singularities when they are created through the gravitational collapse of realistic matter. However, this hypothesis has not yet been proven, and some physicists believe that naked singularities could exist. It is also unknown whether black holes could even become extremal, forming naked singularities, since natural processes counteract increasing spin and charge as a black hole nears extremality. The total mass of a black hole can be estimated by analyzing the motion of objects near the black hole, such as stars or gas. All black holes spin, often rapidly: the black hole GRS 1915+105 has been estimated to spin at over 1,000 revolutions per second. The Milky Way's central black hole, Sagittarius A*, rotates at about 90% of the maximum rate. The spin rate can be inferred from measurements of atomic spectral lines in the X-ray range. As gas near the black hole plunges inward, high-energy X-ray emission from electron-positron pairs illuminates the gas further out, which appears red-shifted due to relativistic effects. Depending on the spin of the black hole, this plunge happens at different radii from the hole, with different degrees of redshift. Astronomers can use the gap between the X-ray emission of the outer disk and the redshifted emission from plunging material to determine the spin of the black hole. A newer way to estimate spin is based on the temperature of gases accreting onto the black hole. The method requires an independent measurement of the black hole mass and of the inclination angle of the accretion disk, followed by computer modeling.
Gravitational waves from coalescing binary black holes can also provide the spins of both progenitor black holes and of the merged hole, but such events are rare. The supermassive black hole at the center of the Messier 87 (M87) galaxy appears to have an angular momentum very close to the maximum theoretical value. For an uncharged black hole that limit is {\displaystyle J\leq {\frac {GM^{2}}{c}},} allowing definition of a dimensionless spin magnitude such that {\displaystyle 0\leq {\frac {cJ}{GM^{2}}}\leq 1.} Most black holes are believed to have an approximately neutral charge. For example, Michal Zajaček, Arman Tursunov, Andreas Eckart, and Silke Britzen found the electric charge of Sagittarius A* to be at least ten orders of magnitude below the theoretical maximum. A charged black hole repels other like charges just like any other charged object. If a black hole were to become charged, particles with an opposite sign of charge would be pulled in by the extra electromagnetic force, while particles with the same sign of charge would be repelled, neutralizing the black hole. This effect may not be as strong if the black hole is also spinning. The presence of charge can reduce the diameter of the black hole by up to 38%. The charge Q for a nonspinning black hole is bounded by {\displaystyle Q\leq {\sqrt {G}}M,} where G is the gravitational constant and M is the black hole's mass. Classification Black holes can have a wide range of masses. The minimum mass of a black hole formed by stellar gravitational collapse is governed by the maximum mass of a neutron star and is believed to be approximately two to four solar masses. However, theoretical primordial black holes, believed to have formed soon after the Big Bang, could be far smaller, with masses as little as 10⁻⁵ grams at formation. These very small black holes are sometimes called micro black holes.
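As an illustrative sketch of the spin bound above (not from the source; the SI constants and the 10 M☉ example are assumptions for illustration), the dimensionless spin a* = cJ/(GM²) can be computed and checked against the Kerr limit:

```python
# Dimensionless black-hole spin a* = c J / (G M^2), which must satisfy 0 <= a* <= 1.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg

def spin_magnitude(J, M):
    """Return a* = cJ/(GM^2) for angular momentum J (kg m^2/s) and mass M (kg)."""
    a = c * J / (G * M**2)
    if not 0 <= a <= 1:
        raise ValueError("J exceeds the Kerr bound J <= GM^2/c")
    return a

M = 10 * M_sun         # hypothetical 10-solar-mass black hole
J_max = G * M**2 / c   # maximal (extremal) angular momentum for this mass
print(spin_magnitude(0.9 * J_max, M))  # 0.9 by construction
```

Any J larger than `J_max` would correspond to a solution without an event horizon, which is why the function rejects it.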
Black holes formed by stellar collapse are called stellar black holes. Estimates of their maximum mass at formation vary, but generally range from 10 to 100 solar masses, with higher estimates for black holes formed from low-metallicity progenitor stars. The mass of a black hole formed via a supernova has a lower bound: if the progenitor star is too small, the collapse may be stopped by the degeneracy pressure of the star's constituents, allowing the condensation of matter into an exotic denser state. Degeneracy pressure arises from the Pauli exclusion principle: particles resist being forced into the same quantum state. Smaller progenitor stars, with masses less than about 8 M☉, will be held together by the degeneracy pressure of electrons and will become white dwarfs. For more massive progenitor stars, electron degeneracy pressure is no longer strong enough to resist the force of gravity, and the star will be held together by neutron degeneracy pressure, which can occur at much higher densities, forming a neutron star. If the star is still too massive, even neutron degeneracy pressure will not be able to resist the force of gravity and the star will collapse into a black hole. Stellar black holes can also gain mass via accretion of nearby matter, often from a companion object such as a star. Black holes that are larger than stellar black holes but smaller than supermassive black holes are called intermediate-mass black holes, with masses of approximately 10² to 10⁵ solar masses. These black holes seem to be rarer than their stellar and supermassive counterparts, with relatively few candidates having been observed. Physicists have speculated that such black holes may form from collisions in globular and star clusters or at the centers of low-mass galaxies. They may also form as the result of mergers of smaller black holes, with several LIGO observations finding merged black holes in the 110–350 solar mass range.
The black holes with the largest masses are called supermassive black holes, with masses more than 10⁶ times that of the Sun. These black holes are believed to exist at the centers of almost every large galaxy, including the Milky Way. Some scientists have proposed a subcategory of even larger black holes, called ultramassive black holes, with masses greater than 10⁹–10¹⁰ solar masses. Theoretical models predict that the accretion disc that feeds black holes will become unstable once a black hole reaches 50–100 billion times the mass of the Sun, setting a rough upper limit to black hole mass. Structure While black holes are conceptually invisible sinks of all matter and light, in astronomical settings their enormous gravity alters the motion of surrounding objects and pulls nearby gas inwards at near-light speed, making the regions around black holes among the brightest objects in the universe. Some black holes have relativistic jets: thin streams of plasma travelling away from the black hole at more than one-tenth of the speed of light. A small fraction of the matter falling towards the black hole gets accelerated away along the hole's rotation axis. These jets can extend as far as millions of parsecs from the black hole itself. Black holes of any mass can have jets. However, they are typically observed around spinning black holes with strongly magnetized accretion disks. Relativistic jets were more common in the early universe, when galaxies and their corresponding supermassive black holes were rapidly gaining mass. All black holes with jets also have an accretion disk, but the jets are usually brighter than the disk. Quasars, typically found in other galaxies, are believed to be supermassive black holes with jets; microquasars are believed to be stellar-mass objects with jets, typically observed in the Milky Way. The mechanism of formation of jets is not yet known, but several options have been proposed.
One mechanism proposed to fuel these jets is the Blandford–Znajek process, which suggests that the dragging of magnetic field lines by a black hole's rotation could launch jets of matter into space. The Penrose process, which involves extraction of a black hole's rotational energy, has also been proposed as a potential mechanism of jet propulsion. Due to conservation of angular momentum, gas falling into the gravitational well created by a massive object will typically form a disk-like structure around the object. As the disk's angular momentum is transferred outward by internal processes, its matter falls farther inward, converting gravitational energy into heat and releasing a large flux of X-rays. The temperature of these disks can range from thousands to millions of kelvin, and temperatures can differ throughout a single accretion disk. Accretion disks can also emit in other parts of the electromagnetic spectrum, depending on the disk's turbulence and magnetization and the black hole's mass and angular momentum. Accretion disks can be classified as geometrically thin or geometrically thick. Geometrically thin disks are mostly confined to the black hole's equatorial plane and have a well-defined edge at the innermost stable circular orbit (ISCO), while geometrically thick disks are supported by internal pressure and temperature and can extend inside the ISCO. Disks with high rates of electron scattering and absorption, appearing bright and opaque, are called optically thick; optically thin disks are more translucent and produce fainter images when viewed from afar. Accretion disks of black holes accreting beyond the Eddington limit are often referred to as Polish doughnuts due to their thick, toroidal shape that resembles a doughnut. Quasar accretion disks are expected to usually appear blue in color. The disk of a stellar black hole, on the other hand, would likely look orange, yellow, or red, with its inner regions being the brightest.
Theoretical research suggests that the hotter a disk is, the bluer it should be, although this is not always supported by observations of real astronomical objects. Accretion disk colors may also be altered by the Doppler effect, with the part of the disk travelling towards an observer appearing bluer and brighter and the part travelling away appearing redder and dimmer. In Newtonian gravity, test particles can stably orbit at arbitrary distances from a central object. In general relativity, however, there exists a smallest possible radius at which a massive particle can orbit stably. Any infinitesimal inward perturbation to this orbit will lead to the particle spiraling into the black hole, and any outward perturbation will, depending on the energy, cause the particle to spiral in, move to a stable orbit further from the black hole, or escape to infinity. This orbit is called the innermost stable circular orbit, or ISCO. The location of the ISCO depends on the spin of the black hole and the spin of the particle itself. In the case of a Schwarzschild black hole (spin zero) and a particle without spin, the location of the ISCO is {\displaystyle r_{\rm {ISCO}}=3\,r_{\text{s}}={\frac {6\,GM}{c^{2}}},} where r_ISCO is the radius of the ISCO, r_s is the Schwarzschild radius of the black hole, G is the gravitational constant, and c is the speed of light. The radius of this orbit changes slightly based on particle spin. For charged black holes, the ISCO moves inwards. For spinning black holes, the ISCO moves inwards for particles orbiting in the same direction that the black hole is spinning (prograde) and outwards for particles orbiting in the opposite direction (retrograde).
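The Schwarzschild ISCO formula lends itself to a quick numerical check; a minimal sketch, assuming SI constants and an illustrative 10 M☉ black hole (not an example from the source):

```python
# ISCO radius of a Schwarzschild black hole: r_ISCO = 6GM/c^2 = 3 r_s.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg

def schwarzschild_radius(M):
    """Event-horizon radius of a non-spinning, uncharged black hole of mass M (kg)."""
    return 2 * G * M / c**2

def isco_radius(M):
    """Innermost stable circular orbit for a non-spinning black hole of mass M (kg)."""
    return 6 * G * M / c**2

M = 10 * M_sun                         # hypothetical 10-solar-mass black hole
print(schwarzschild_radius(M) / 1e3)   # ~29.5 km
print(isco_radius(M) / 1e3)            # ~88.6 km, exactly 3x the Schwarzschild radius
```

The factor-of-three relation holds for any mass, since both radii scale linearly with M.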
For example, the ISCO for a particle orbiting retrograde can be as far out as about 9 r_s, while the ISCO for a particle orbiting prograde can be as close as the event horizon itself. The photon sphere is a spherical boundary at which photons moving on tangents to the sphere are bent completely around the black hole, possibly orbiting multiple times. Light rays with impact parameters less than the radius of the photon sphere enter the black hole. For Schwarzschild black holes, the photon sphere has a radius 1.5 times the Schwarzschild radius; for non-Schwarzschild black holes the radius is at least 1.5 times the radius of the event horizon. When viewed from a great distance, the photon sphere creates an observable black hole shadow. Since no light emerges from within the black hole, this shadow is the limit for possible observations. The shadows of colliding black holes should have characteristic warped shapes, allowing scientists to detect black holes that are about to merge. While light can still escape from the photon sphere, any light that crosses the photon sphere on an inbound trajectory will be captured by the black hole. Therefore, any light that reaches an outside observer from the photon sphere must have been emitted by objects between the photon sphere and the event horizon. Light emitted towards the photon sphere may also curve around the black hole and return to the emitter. For a rotating, uncharged black hole, the radius of the photon sphere depends on the spin parameter and on whether the photon is orbiting prograde or retrograde. For a photon orbiting prograde, the photon sphere lies 1–3 Schwarzschild radii from the center of the black hole, while for a photon orbiting retrograde it lies between 3–5 Schwarzschild radii from the center. The exact location of the photon sphere depends on the magnitude of the black hole's rotation.
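For the non-spinning case, the photon sphere at 1.5 r_s implies a critical impact parameter b = 3√3 GM/c² ≈ 2.6 r_s, which sets the apparent size of the shadow. A minimal sketch (the M87* mass value used here is an assumption for illustration, not a figure from this text):

```python
import math

# Photon sphere and shadow of a non-spinning (Schwarzschild) black hole.
# Photon sphere: r_ph = 3GM/c^2 = 1.5 r_s.
# Shadow (critical impact parameter): b_c = 3*sqrt(3)*GM/c^2 ~ 2.6 r_s.
G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s
M_sun = 1.989e30   # kg
AU = 1.496e11      # astronomical unit, m

def photon_sphere_radius(M):
    return 3 * G * M / c**2            # 1.5 Schwarzschild radii

def shadow_radius(M):
    return 3 * math.sqrt(3) * G * M / c**2   # ~2.6 Schwarzschild radii

M_m87 = 6.5e9 * M_sun   # assumed mass of M87* for illustration
print(shadow_radius(M_m87) / AU)   # shadow radius in astronomical units
```

The ratio shadow/photon-sphere is exactly √3 regardless of mass, since both scale linearly with M.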
For a charged, nonrotating black hole, there is only one photon sphere, and its radius decreases with increasing black hole charge. For non-extremal, charged, rotating black holes, there are always two photon spheres, with the exact radii depending on the parameters of the black hole. Near a rotating black hole, spacetime rotates similarly to a vortex. The rotating spacetime drags any matter and light into rotation around the spinning black hole. This effect of general relativity, called frame dragging, gets stronger closer to the spinning mass. The region of spacetime in which it is impossible to stay still is called the ergosphere. The ergosphere of a black hole is a volume bounded by the black hole's event horizon and the ergosurface, which coincides with the event horizon at the poles but bulges out from it around the equator. Matter and radiation can escape from the ergosphere. Through the Penrose process, objects can emerge from the ergosphere with more energy than they entered with. The extra energy is taken from the rotational energy of the black hole, slowing down its rotation. A variation of the Penrose process in the presence of strong magnetic fields, the Blandford–Znajek process, is considered a likely mechanism for the enormous luminosity and relativistic jets of quasars and other active galactic nuclei. The observable region of spacetime around a black hole closest to its event horizon is called the plunging region. In this area it is no longer possible for free-falling matter to follow circular orbits or to stop a final descent into the black hole. Instead, it rapidly plunges toward the black hole at close to the speed of light, growing increasingly hot and producing a characteristic, detectable thermal emission. However, light and radiation emitted from this region can still escape from the black hole's gravitational pull.
For a nonspinning, uncharged black hole, the radius of the event horizon, or Schwarzschild radius, is proportional to the mass M through {\displaystyle r_{\mathrm {s} }={\frac {2GM}{c^{2}}}\approx 2.95\,{\frac {M}{M_{\odot }}}~\mathrm {km,} } where r_s is the Schwarzschild radius and M☉ is the mass of the Sun. For a black hole with nonzero spin or electric charge, the radius is smaller,[Note 1] until an extremal black hole could have an event horizon close to {\displaystyle r_{\mathrm {+} }={\frac {GM}{c^{2}}},} half the radius of a nonspinning, uncharged black hole of the same mass. Since the volume within the Schwarzschild radius increases with the cube of the radius, the average density of a black hole inside its Schwarzschild radius is inversely proportional to the square of its mass: supermassive black holes are much less dense than stellar black holes. The average density of a 10⁸ M☉ black hole is comparable to that of water. The defining feature of a black hole is the existence of an event horizon, a boundary in spacetime through which matter and light can pass only inward towards the center of the black hole. Nothing, not even light, can escape from inside the event horizon. The event horizon is referred to as such because if an event occurs within the boundary, information from that event cannot reach or affect an outside observer, making it impossible to determine whether such an event occurred. For non-rotating black holes, the geometry of the event horizon is precisely spherical, while for rotating black holes, the event horizon is oblate.
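The inverse-square density scaling can be checked numerically; a minimal sketch, assuming SI constants (the 10 M☉ comparison case is an assumption for illustration):

```python
import math

# Mean density inside the Schwarzschild radius:
#   rho = M / ((4/3) * pi * r_s^3) = 3 c^6 / (32 pi G^3 M^2),
# i.e. inversely proportional to the square of the mass.
G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s
M_sun = 1.989e30   # kg

def mean_density(M):
    """Average density (kg/m^3) of mass M spread through its Schwarzschild volume."""
    r_s = 2 * G * M / c**2
    return M / ((4 / 3) * math.pi * r_s**3)

print(mean_density(1e8 * M_sun))   # ~1.8e3 kg/m^3, within a factor of a few of water
print(mean_density(10 * M_sun))    # ~1.8e17 kg/m^3, comparable to nuclear density
```

Doubling the mass quarters the mean density, which is why supermassive black holes come out far less dense than stellar ones.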
To a distant observer, a clock near a black hole would appear to tick more slowly than one further from the black hole. This effect, known as gravitational time dilation, would also cause an object falling into a black hole to appear to slow as it approached the event horizon, never quite reaching the horizon from the perspective of an outside observer. All processes on this object would appear to slow down, and any light emitted by the object would appear redder and dimmer, an effect known as gravitational redshift. An object falling from half a Schwarzschild radius above the event horizon would fade away until it could no longer be seen, disappearing from view within one hundredth of a second. It would also appear to flatten onto the black hole, joining all other material that had ever fallen into the hole. On the other hand, an observer falling into a black hole would not notice any of these effects as they cross the event horizon. Their own clock appears to them to tick normally, and they cross the event horizon after a finite time without noting any singular behaviour. In general relativity, it is impossible to determine the location of the event horizon from local observations, due to Einstein's equivalence principle. Black holes that are rotating and/or charged have an inner horizon, often called the Cauchy horizon, inside the black hole. The inner horizon is divided into two segments: an ingoing section and an outgoing section. At the ingoing section of the Cauchy horizon, radiation and matter that fall into the black hole would build up at the horizon, causing the curvature of spacetime to go to infinity. This would cause an infalling observer to experience tidal forces. This phenomenon is often called mass inflation, since it is associated with a parameter dictating the black hole's internal mass growing exponentially, and the buildup of tidal forces is called the mass-inflation singularity or Cauchy horizon singularity.
Some physicists have argued that in realistic black holes, accretion and Hawking radiation would stop mass inflation from occurring. At the outgoing section of the inner horizon, infalling radiation would backscatter off the black hole's spacetime curvature and travel outward, building up at the outgoing Cauchy horizon. This would cause an infalling observer to experience a gravitational shock wave and tidal forces as the spacetime curvature at the horizon grew to infinity. This buildup of tidal forces is called the shock singularity. Both of these singularities are weak, meaning that an object crossing them would only be deformed a finite amount by tidal forces, even though the spacetime curvature would still be infinite at the singularity. This contrasts with a strong singularity, where an object hitting the singularity would be stretched and squeezed by an infinite amount. They are also null singularities, meaning that a photon could travel parallel to them without ever being intercepted. Ignoring quantum effects, every black hole has a singularity inside: points where the curvature of spacetime becomes infinite and geodesics terminate within a finite proper time. For a non-rotating black hole, this region takes the shape of a single point; for a rotating black hole it is smeared out to form a ring singularity that lies in the plane of rotation. In both cases, the singular region has zero volume. All of the mass of the black hole ends up in the singularity. Since the singularity has nonzero mass in an infinitely small space, it can be thought of as having infinite density. Observers falling into a Schwarzschild black hole (i.e., non-rotating and not charged) cannot avoid being carried into the singularity once they cross the event horizon. As they fall further into the black hole, they will be torn apart by the growing tidal forces in a process sometimes referred to as spaghettification or the noodle effect.
Eventually, they will reach the singularity and be crushed into an infinitely small point. However, any perturbations, such as those caused by matter or radiation falling in, would cause space to oscillate chaotically near the singularity. Any matter falling in would experience intense tidal forces rapidly changing in direction, all while being compressed into an increasingly small volume. Alternative forms of general relativity, including the addition of some quantum effects, can lead to regular, or nonsingular, black holes without singularities. For example, the fuzzball model, based on string theory, states that black holes are actually made up of quantum microstates and need not have a singularity or an event horizon. The theory of loop quantum gravity proposes that the curvature and density at the center of a black hole are large but not infinite. Formation Black holes are formed by gravitational collapse of massive stars, either by direct collapse or during a supernova explosion in a process called fallback. Black holes can also result from the merger of two neutron stars or of a neutron star and a black hole. Other more speculative mechanisms include primordial black holes created from density fluctuations in the early universe, the collapse of dark stars (hypothetical objects powered by annihilation of dark matter), or the collapse of hypothetical self-interacting dark matter. Gravitational collapse occurs when an object's internal pressure is insufficient to resist the object's own gravity. At the end of a star's life, it will run out of hydrogen to fuse and will start fusing more and more massive elements, until it gets to iron. Since the fusion of elements heavier than iron would require more energy than it would release, nuclear fusion ceases. If the iron core of the star is too massive, the star will no longer be able to support itself and will undergo gravitational collapse.
While most of the energy released during gravitational collapse is emitted very quickly, an outside observer does not actually see the end of this process. Even though the collapse takes a finite amount of time from the reference frame of infalling matter, a distant observer would see the infalling material slow and halt just above the event horizon, due to gravitational time dilation. Light from the collapsing material takes longer and longer to reach the observer, with the delay growing to infinity as the emitting material reaches the event horizon. Thus the external observer never sees the formation of the event horizon; instead, the collapsing material seems to become dimmer and increasingly red-shifted, eventually fading away. Observations of quasars at redshift {\displaystyle z\sim 7}, less than a billion years after the Big Bang, have led to investigations of other ways to form black holes. The accretion process that builds supermassive black holes has a limiting rate of mass accumulation, and a billion years is not enough time to reach quasar status. One suggestion is direct collapse of clouds of nearly pure hydrogen gas (low metallicity), characteristic of the young universe, forming a supermassive star which collapses into a black hole. It has been suggested that seed black holes with typical masses of ~10⁵ M☉ could have formed in this way and then grown to ~10⁹ M☉. However, the very large amount of gas required for direct collapse is not typically stable against fragmentation into multiple stars. Thus another approach suggests massive star formation followed by collisions that seed massive black holes which ultimately merge to create a quasar. A neutron star in a common envelope with a regular star can accrete sufficient material to collapse to a black hole, or two neutron stars can merge. These avenues for the formation of black holes are considered relatively rare.
In the current epoch of the universe, conditions needed to form black holes are rare and are mostly found only in stars. However, in the early universe, conditions may have allowed for black hole formation via other means. Fluctuations of spacetime soon after the Big Bang may have formed regions that were denser than their surroundings. Initially, these regions would not have been compact enough to form a black hole, but eventually the curvature of spacetime in the regions became large enough to cause them to collapse into black holes. Different models for the early universe vary widely in their predictions of the scale of these fluctuations. Various models predict the creation of primordial black holes ranging from a Planck mass (~2.2×10⁻⁸ kg) to hundreds of thousands of solar masses. Primordial black holes with masses less than 10¹⁵ g would have evaporated by now due to Hawking radiation. Despite the early universe being extremely dense, it did not re-collapse into a black hole during the Big Bang, since the universe was expanding rapidly and did not have the gravitational differential necessary for black hole formation. Models for the gravitational collapse of objects of relatively constant size, such as stars, do not necessarily apply in the same way to rapidly expanding space such as the Big Bang. In principle, black holes could be formed in high-energy particle collisions that achieve sufficient density, although no such events have been detected. These hypothetical micro black holes, which could form from the collision of cosmic rays with Earth's atmosphere or in particle accelerators like the Large Hadron Collider, would not be able to aggregate additional mass. Instead, they would evaporate in about 10⁻²⁵ seconds, posing no threat to the Earth. Evolution Black holes can also merge with other objects such as stars or even other black holes.
This is thought to have been important, especially in the early growth of supermassive black holes, which could have formed from the aggregation of many smaller objects. The process has also been proposed as the origin of some intermediate-mass black holes. Mergers of supermassive black holes may take a long time: as the two supermassive black holes in a binary approach each other, most nearby stars are ejected, leaving little for the remaining black holes to gravitationally interact with that would allow them to get closer to each other. This phenomenon has been called the final parsec problem, as the distance at which this happens is usually around one parsec. When a black hole accretes matter, the gas in the inner accretion disk orbits at very high speeds because of its proximity to the black hole. The resulting friction heats the inner disk to temperatures at which it emits vast amounts of electromagnetic radiation (mainly X-rays) detectable by telescopes. By the time the matter of the disk reaches the ISCO, between 5.7% and 42% of its mass will have been converted to energy, depending on the black hole's spin. About 90% of this energy is released within about 20 black hole radii. In many cases, accretion disks are accompanied by relativistic jets that are emitted along the black hole's poles, which carry away much of the energy. The mechanism for the creation of these jets is currently not well understood, in part due to insufficient data. Many of the universe's most energetic phenomena have been attributed to the accretion of matter onto black holes. Active galactic nuclei and quasars are believed to be the accretion disks of supermassive black holes. X-ray binaries are generally accepted to be binary systems in which one of the two objects is a compact object accreting matter from its companion. Ultraluminous X-ray sources may be the accretion disks of intermediate-mass black holes.
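The quoted 5.7%–42% range follows from the specific orbital energy of matter at the ISCO: E = √(8/9) for a non-spinning black hole and E = 1/√3 for a maximally spinning black hole with prograde accretion (standard thin-disk results, not derived in this text). A short check:

```python
import math

# Radiative efficiency of thin-disk accretion: eta = 1 - E_ISCO, where E_ISCO is
# the specific orbital energy (per unit rest mass) at the innermost stable orbit.
# Non-spinning (Schwarzschild): E = sqrt(8/9).
# Maximally spinning, prograde (extremal Kerr): E = 1/sqrt(3).

eta_schwarzschild = 1 - math.sqrt(8 / 9)   # fraction of rest mass radiated away
eta_extremal_kerr = 1 - 1 / math.sqrt(3)

print(f"{eta_schwarzschild:.3f}")  # prints 0.057 -> the 5.7% figure
print(f"{eta_extremal_kerr:.3f}")  # prints 0.423 -> the 42% figure
```

For comparison, hydrogen fusion converts only about 0.7% of rest mass to energy, which is why accretion onto black holes powers the most luminous objects known.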
At a certain rate of accretion, the outward radiation pressure becomes as strong as the inward gravitational force, and the black hole should be unable to accrete any faster. This limit is called the Eddington limit. However, many black holes accrete beyond this rate because of their non-spherical geometry or instabilities in the accretion disk. Accretion beyond the limit is called super-Eddington accretion and may have been commonplace in the early universe. Stars have been observed to be torn apart by tidal forces in the immediate vicinity of supermassive black holes in galaxy nuclei, in what is known as a tidal disruption event (TDE). Some of the material from the disrupted star forms an accretion disk around the black hole, which emits observable electromagnetic radiation. The correlation between the masses of supermassive black holes in the centres of galaxies and the velocity dispersion and mass of stars in their host bulges suggests that the formation of galaxies and the formation of their central black holes are related. Black hole winds from rapid accretion, particularly when the galaxy itself is still accreting matter, can compress nearby gas, accelerating star formation. However, if the winds become too strong, the black hole may blow nearly all of the gas out of the galaxy, quenching star formation. Black hole jets may also energize nearby cavities of plasma and eject low-entropy gas out of the galactic core, causing gas in galactic centers to be hotter than expected. If Hawking's theory of black hole radiation is correct, then black holes are expected to shrink and evaporate over time as they lose mass through the emission of photons and other particles. The temperature of this thermal spectrum (the Hawking temperature) is proportional to the surface gravity of the black hole, which is inversely proportional to its mass. Hence, large black holes emit less radiation than small black holes.
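The inverse temperature–mass relation just described is, explicitly, T = ħc³/(8πGMk_B). A short sketch with rounded physical constants (the lunar mass is included only for the comparison that follows):

```python
import math

HBAR = 1.055e-34   # reduced Planck constant, J s
C = 2.998e8        # speed of light, m/s
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
K_B = 1.381e-23    # Boltzmann constant, J/K
M_SUN = 1.989e30   # solar mass, kg
M_MOON = 7.35e22   # lunar mass, kg
T_CMB = 2.7        # cosmic microwave background temperature, K

def hawking_temperature(mass_kg):
    """Hawking temperature, inversely proportional to the black hole's mass."""
    return HBAR * C**3 / (8 * math.pi * G * mass_kg * K_B)

t_stellar = hawking_temperature(M_SUN)                  # ~6.2e-8 K, i.e. ~62 nK
m_evap = HBAR * C**3 / (8 * math.pi * G * K_B * T_CMB)  # mass whose temperature is 2.7 K
print(f"T(1 solar mass) = {t_stellar:.2e} K")
print(f"net evaporation requires M < {m_evap:.2e} kg (lunar mass: {M_MOON:.2e} kg)")
```

Inverting the same formula at 2.7 K gives the maximum mass for which a black hole out-radiates the cosmic microwave background, which indeed falls below the Moon's mass.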
A stellar black hole of 1 M☉ has a Hawking temperature of 62 nanokelvins. This is far less than the 2.7 K temperature of the cosmic microwave background radiation. Stellar-mass or larger black holes receive more mass from the cosmic microwave background than they emit through Hawking radiation, and thus will grow instead of shrinking. To have a Hawking temperature larger than 2.7 K (and be able to evaporate), a black hole would need a mass less than that of the Moon. Such a black hole would have a diameter of less than a tenth of a millimetre. The Hawking radiation of an astrophysical black hole is predicted to be very weak and would thus be exceedingly difficult to detect from Earth. A possible exception is the burst of gamma rays emitted in the last stage of the evaporation of primordial black holes. Searches for such flashes have proven unsuccessful and provide stringent limits on the possible existence of low-mass primordial black holes, with modern research predicting that primordial black holes must make up less than a fraction of 10⁻⁷ of the universe's total mass. NASA's Fermi Gamma-ray Space Telescope, launched in 2008, has searched for these flashes but has not yet found any. The properties of a black hole are constrained and interrelated by the theories that predict them. When based on general relativity, these relationships are called the laws of black hole mechanics. For a black hole that is not still forming or accreting matter, the zeroth law of black hole mechanics states that the black hole's surface gravity is constant across the event horizon. The first law relates changes in the black hole's surface area, angular momentum, and charge to changes in its energy. The second law says the surface area of a black hole never decreases on its own. Finally, the third law says that the surface gravity of a black hole is never zero. These laws are mathematical analogs of the laws of thermodynamics.
They are not equivalent, however, because, according to general relativity without quantum mechanics, a black hole can never emit radiation, and thus its temperature must always be zero. Quantum mechanics predicts that a black hole will continuously emit thermal Hawking radiation, and therefore must always have a nonzero temperature. It also predicts that all black holes have entropy that scales with their surface area. When quantum mechanics is accounted for, the laws of black hole mechanics become equivalent to the classical laws of thermodynamics. However, these conclusions are derived without a complete theory of quantum gravity, although many candidate theories do predict that black holes have entropy and temperature. Thus, the true quantum nature of black hole thermodynamics continues to be debated.

Observational evidence

Millions of black holes of around 30 solar masses, formed by stellar collapse, are expected to exist in the Milky Way. Even a dwarf galaxy like Draco should have hundreds. Only a few of these have been detected. By nature, black holes do not themselves emit any electromagnetic radiation other than the hypothetical Hawking radiation, so astrophysicists searching for black holes must generally rely on indirect observations. The defining characteristic of a black hole is its event horizon. The horizon itself cannot be imaged, so all other possible explanations for these indirect observations must be considered and eliminated before concluding that a black hole has been observed. The Event Horizon Telescope (EHT) is a global system of radio telescopes capable of directly observing a black hole's shadow. The angular resolution of a telescope is set by its aperture and the wavelengths it observes. Because the angular diameters of Sagittarius A* and Messier 87* in the sky are very small, a single telescope would need to be about the size of the Earth to clearly distinguish their horizons at radio wavelengths.
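The Earth-sized aperture requirement follows from the diffraction limit, θ ≈ 1.22λ/D. A sketch using a 1.3 mm observing wavelength (the wavelength and the rough shadow size in the comment are assumptions based on published EHT figures, not values from this article):

```python
import math

WAVELENGTH = 1.3e-3       # m, assumed EHT observing wavelength
EARTH_DIAMETER = 1.27e7   # m
RAD_TO_UAS = 180 / math.pi * 3600 * 1e6  # radians -> microarcseconds

# Diffraction-limited angular resolution of an Earth-sized aperture
theta_uas = 1.22 * WAVELENGTH / EARTH_DIAMETER * RAD_TO_UAS
print(f"resolution ~ {theta_uas:.0f} microarcseconds")
# This is comparable to the ~40-microarcsecond shadow of M87*, which is why a
# planet-scale interferometric baseline is needed to resolve it at all.
```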
By combining data from several radio telescopes around the world, the Event Horizon Telescope creates an effective aperture the diameter of the Earth. The EHT team used imaging algorithms to compute the most probable image from the data in its observations of Sagittarius A* and M87*. Gravitational-wave interferometry can be used to detect merging black holes and other compact objects. In this method, a laser beam is split and sent down two long perpendicular arms. The laser beams reflect off mirrors at the ends of the arms and converge at the intersection, where they cancel each other out. However, when a gravitational wave passes, it warps spacetime, changing the lengths of the arms themselves. Since each laser beam then travels a slightly different distance, the beams no longer cancel out and produce a recognizable signal. Analysis of the signal can give scientists information about what caused the gravitational waves. Since gravitational waves are very weak, gravitational-wave observatories such as LIGO must have arms several kilometers long and must carefully control for terrestrial noise to detect them. Since the first detection, announced in 2016, multiple gravitational-wave signals from black holes have been detected and analyzed. The proper motions of stars near the centre of the Milky Way provide strong observational evidence that these stars are orbiting a supermassive black hole. Since 1995, astronomers have tracked the motions of 90 stars orbiting an invisible object coincident with the radio source Sagittarius A*. In 1998, by fitting the motions of the stars to Keplerian orbits, astronomers inferred that a 2.6×10⁶ M☉ object must be contained within a radius of 0.02 light-years. Since then, one of the stars, called S2, has completed a full orbit. From the orbital data, astronomers refined the mass of Sagittarius A* to 4.3×10⁶ M☉, contained within a radius of less than 0.002 light-years.
This upper-limit radius is larger than the Schwarzschild radius for the estimated mass, so the observations alone do not prove Sagittarius A* is a black hole. Nevertheless, they strongly suggest that the central object is a supermassive black hole, as there are no other plausible scenarios for confining so much invisible mass in such a small volume. Additionally, there is some observational evidence that this object might possess an event horizon, a feature unique to black holes. The Event Horizon Telescope image of Sagittarius A*, released in 2022, provided further confirmation that it is indeed a black hole. X-ray binaries are binary systems that emit a majority of their radiation in the X-ray part of the electromagnetic spectrum. These X-ray emissions result when a compact object accretes matter from an ordinary star. The presence of an ordinary star in such a system provides an opportunity to study the central object and to determine whether it might be a black hole. By measuring the orbital period of the binary, the distance to the binary from Earth, and the mass of the companion star, scientists can estimate the mass of the compact object. The Tolman–Oppenheimer–Volkoff (TOV) limit sets the largest possible mass of a nonrotating neutron star, estimated at about two solar masses. While a rotating neutron star can be slightly more massive, a compact object much more massive than the TOV limit cannot be a neutron star and is generally expected to be a black hole. The first strong candidate for a black hole, Cygnus X-1, was discovered in this way by Charles Thomas Bolton, Louise Webster, and Paul Murdin in 1972. Observations of rotational broadening of the optical star, reported in 1986, led to a compact-object mass estimate of 16 solar masses, with 7 solar masses as the lower bound. In 2011, this estimate was updated to 14.1±1.0 M☉ for the black hole and 19.2±1.9 M☉ for the optical stellar companion.
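The Keplerian fit and the Schwarzschild-radius comparison for Sagittarius A* described above amount to applying Kepler's third law, M = 4π²a³/(GT²), and r_s = 2GM/c². The S2 orbital elements below (~1000 AU semi-major axis, ~16-year period) are rounded assumptions for illustration, not the published fitted values:

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8            # speed of light, m/s
AU = 1.496e11          # astronomical unit, m
YEAR = 3.156e7         # s
M_SUN = 1.989e30       # solar mass, kg
LIGHT_YEAR = 9.461e15  # m

def kepler_mass(a_m, period_s):
    """Enclosed mass from Kepler's third law (orbiting star's mass neglected)."""
    return 4 * math.pi**2 * a_m**3 / (G * period_s**2)

def schwarzschild_radius(mass_kg):
    """Event-horizon radius of a nonrotating black hole."""
    return 2 * G * mass_kg / C**2

m_sgr = kepler_mass(1000 * AU, 16.0 * YEAR)   # roughly 4e6 solar masses
r_s = schwarzschild_radius(m_sgr)
print(f"mass ~ {m_sgr / M_SUN:.1e} solar masses")
print(f"Schwarzschild radius ~ {r_s / LIGHT_YEAR:.1e} light-years")
# The horizon scale is far smaller than the ~0.002 light-year observational
# upper limit, which is why orbits alone cannot prove an event horizon exists.
```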
X-ray binaries can be categorized as either low-mass or high-mass; this classification is based on the mass of the companion star, not the compact object itself. In a class of X-ray binaries called soft X-ray transients, the companion star is of relatively low mass, allowing for more accurate estimates of the black hole mass. These systems actively emit X-rays for only several months once every 10–50 years. During the period of low X-ray emission, called quiescence, the accretion disk is extremely faint, allowing detailed observation of the companion star. Numerous black hole candidates have been measured by this method. Black holes are also sometimes found in binaries with other compact objects, such as white dwarfs, neutron stars, and other black holes. The centre of nearly every galaxy contains a supermassive black hole. The close observational correlation between the mass of this hole and the velocity dispersion of the host galaxy's bulge, known as the M–sigma relation, strongly suggests a connection between the formation of the black hole and that of the galaxy itself. Astronomers use the term active galaxy to describe galaxies with unusual characteristics, such as unusual spectral line emission and very strong radio emission. Theoretical and observational studies have shown that the high levels of activity in the centers of these galaxies, in regions called active galactic nuclei (AGN), may be explained by accretion onto supermassive black holes. These AGN consist of a central black hole that may be millions or billions of times more massive than the Sun, a disk of interstellar gas and dust called an accretion disk, and two jets perpendicular to the disk. Although supermassive black holes are expected in most AGN, only some galaxies' nuclei have been studied carefully enough to both identify and measure the actual masses of their central supermassive black hole candidates.
Some of the most notable galaxies with supermassive black hole candidates include the Andromeda Galaxy, Messier 32, Messier 87, the Sombrero Galaxy, and the Milky Way itself. Another way black holes can be detected is through observation of the effects of their strong gravitational field. One such effect is gravitational lensing: the deformation of spacetime around a massive object causes light rays to be deflected, making objects behind it appear distorted. When the lensing object is a black hole, this effect can be strong enough to create multiple images of a star or other luminous source. However, the separation between the lensed images may be too small for contemporary telescopes to resolve; this regime is called microlensing. Instead of seeing two images of a lensed star, astronomers see the star brighten slightly as the black hole moves toward the line of sight between the star and Earth and then return to its normal luminosity as the black hole moves away. The turn of the millennium saw the first three candidate detections of black holes in this way, and in January 2022, astronomers reported the first confirmed detection of a microlensing event from an isolated black hole. This was also the first determination of an isolated black hole's mass: 7.1±1.3 M☉.

Alternatives

While there is a strong case for supermassive black holes, the model for stellar-mass black holes assumes an upper limit for the mass of a neutron star: objects observed to have more mass are assumed to be black holes. However, the properties of extremely dense matter are poorly understood, and new exotic phases of matter could allow for other kinds of massive compact objects. Quark stars would be made up of quark matter and supported by quark degeneracy pressure, a form of degeneracy pressure even stronger than neutron degeneracy pressure. This would halt gravitational collapse at a higher mass than for a neutron star.
Even stronger stars called electroweak stars would convert quarks in their cores into leptons, providing additional pressure to stop the star from collapsing. If, as some extensions of the Standard Model posit, quarks and leptons are made up of even smaller fundamental particles called preons, a very compact star could be supported by preon degeneracy pressure. While none of these hypothetical models can explain all of the observations of stellar black hole candidates, a Q star is the only alternative that could significantly exceed the mass limit for neutron stars and thus provide an alternative for supermassive black holes. A few theoretical objects have been conjectured to match observations of astronomical black hole candidates identically or near-identically while functioning via a different mechanism. A dark energy star would convert infalling matter into vacuum energy; this vacuum energy would be much larger than the vacuum energy of outside space, exerting outward pressure and preventing a singularity from forming. A black star would be gravitationally collapsing slowly enough that quantum effects keep it just on the cusp of fully collapsing into a black hole. A gravastar would consist of a very thin shell and a dark-energy interior providing outward pressure to stop the collapse into a black hole or the formation of a singularity; it could even have another gravastar inside, called a 'nestar'.

Open questions

According to the no-hair theorem, a black hole is defined by only three parameters: its mass, charge, and angular momentum. This seems to mean that all other information about the matter that went into forming the black hole is lost, as there is no way to determine anything about the black hole from outside other than those three parameters. When black holes were thought to persist forever, this information loss was not problematic, as the information could be thought of as existing inside the black hole.
However, black holes slowly evaporate by emitting Hawking radiation. This radiation does not appear to carry any additional information about the matter that formed the black hole, meaning that this information is seemingly gone forever. This is called the black hole information paradox. Theoretical studies of the paradox have led to both further paradoxes and new ideas about the intersection of quantum mechanics and general relativity. While there is no consensus on the resolution of the paradox, work on the problem is expected to be important for a theory of quantum gravity. Observations of faraway galaxies have found that ultraluminous quasars, powered by supermassive black holes, existed in the early universe at redshifts of z ≥ 7. These black holes have been assumed to be the products of the gravitational collapse of large Population III stars. However, these stellar remnants were not massive enough to produce the quasars observed at early times without accreting beyond the Eddington limit, the theoretical maximum rate of black hole accretion. Physicists have suggested a variety of mechanisms by which these supermassive black holes may have formed. Smaller black holes may have undergone mergers to produce the observed supermassive black holes. It is also possible that they were seeded by direct-collapse black holes, in which a large cloud of hot gas avoids the fragmentation that would lead to multiple stars, owing to low angular momentum or heating from a nearby galaxy. Given the right circumstances, a single supermassive star forms and collapses directly into a black hole without undergoing typical stellar evolution. Additionally, these early supermassive black holes may be high-mass primordial black holes, which could have accreted further matter in the centers of galaxies.
Finally, certain mechanisms allow black holes to grow faster than the theoretical Eddington limit, such as dense gas in the accretion disk limiting the outward radiation pressure that would otherwise halt accretion. However, the formation of bipolar jets may prevent super-Eddington rates.

In fiction

Black holes have been portrayed in science fiction in a variety of ways. Even before the advent of the term itself, objects with characteristics of black holes appeared in stories such as the 1928 novel The Skylark of Space, with its "black Sun", and the "hole in space" in the 1935 short story Starship Invincible. As black holes grew in public recognition in the 1960s and 1970s, they began to be featured in films as well as novels, such as Disney's The Black Hole. Black holes have also been used in works of the 21st century, such as Christopher Nolan's science fiction epic Interstellar. Authors and screenwriters have exploited the relativistic effects of black holes, particularly gravitational time dilation. For example, Interstellar features a black hole planet with a time dilation factor of over 60,000:1, while the 1977 novel Gateway depicts a spaceship approaching but never crossing the event horizon of a black hole from the perspective of an outside observer, owing to time dilation. Black holes have also been appropriated as wormholes or other means of faster-than-light travel, as in the 1974 novel The Forever War, where a network of black holes is used for interstellar travel. Black holes can also feature as hazards to spacefarers and planets: a black hole threatens a deep-space outpost in the 1978 short story The Black Hole Passes, and a binary black hole dangerously alters the orbit of a planet in the 2018 Netflix reboot of Lost in Space.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/United_States#cite_note-169] | [TOKENS: 17273] |
Contents United States The United States of America (USA), also known as the United States (U.S.) or America, is a country primarily located in North America. It is a federal republic of 50 states and a federal capital district, Washington, D.C. The 48 contiguous states border Canada to the north and Mexico to the south, with the semi-exclave of Alaska in the northwest and the archipelago of Hawaii in the Pacific Ocean. The United States also asserts sovereignty over five major island territories and various uninhabited islands in Oceania and the Caribbean.[j] It is a megadiverse country, with the world's third-largest land area[c] and third-largest population, exceeding 341 million.[k] Paleo-Indians first migrated from North Asia to North America at least 15,000 years ago, and formed various civilizations. Spanish colonization established Spanish Florida in 1513, the first European colony in what is now the continental United States. British colonization followed with the 1607 settlement of Virginia, the first of the Thirteen Colonies. Enslavement of Africans was practiced in all colonies by 1770 and supplied most of the labor for the Southern Colonies' plantation economy. Clashes with the British Crown began as a civil protest over the illegality of taxation without representation in Parliament and the denial of other English rights. They evolved into the American Revolution, which led to the Declaration of Independence and a society based on universal rights. Victory in the 1775–1783 Revolutionary War brought international recognition of U.S. sovereignty and fueled westward expansion, further dispossessing native inhabitants. As more states were admitted, a North–South division over slavery led the Confederate States of America to declare secession and fight the Union in the 1861–1865 American Civil War. With the United States' victory and reunification, slavery was abolished nationally. By the late 19th century, the U.S. 
economy outpaced the French, German, and British economies combined. By 1900, the country had established itself as a great power, a status solidified after its involvement in World War I. Following Japan's attack on Pearl Harbor in 1941, the U.S. entered World War II. Its aftermath left the U.S. and the Soviet Union as rival superpowers, competing for ideological dominance and international influence during the Cold War. The Soviet Union's collapse in 1991 ended the Cold War, leaving the U.S. as the world's sole superpower. The U.S. federal government is a representative democracy with a president and a constitution that grants separation of powers under three branches: legislative, executive, and judicial. The United States Congress is a bicameral national legislature composed of the House of Representatives (a lower house based on population) and the Senate (an upper house based on equal representation for each state). Federalism grants substantial autonomy to the 50 states. In addition, 574 Native American tribes have sovereignty rights, and there are 326 Native American reservations. Since the 1850s, the Democratic and Republican parties have dominated American politics. American ideals and values are based on a democratic tradition inspired by the American Enlightenment movement. A developed country, the U.S. ranks high in economic competitiveness, innovation, and higher education. Accounting for over a quarter of nominal global GDP, its economy has been the world's largest since about 1890. It is the wealthiest country, with the highest disposable household income per capita among OECD members, though its wealth inequality is highly pronounced. Shaped by centuries of immigration, the culture of the U.S. is diverse and globally influential. Making up more than a third of global military spending, the country has one of the strongest armed forces and is a designated nuclear-weapon state. A member of numerous international organizations, the U.S.
plays a major role in global political, cultural, economic, and military affairs.

Etymology

Documented use of the phrase "United States of America" dates back to January 2, 1776. On that day, Stephen Moylan, a Continental Army aide to General George Washington, wrote a letter to Joseph Reed, Washington's aide-de-camp, seeking to go "with full and ample powers from the United States of America to Spain" to seek assistance in the Revolutionary War effort. The first known public usage is an anonymous essay published in the Williamsburg newspaper The Virginia Gazette on April 6, 1776. Sometime on or after June 11, 1776, Thomas Jefferson wrote "United States of America" in a rough draft of the Declaration of Independence, which was adopted by the Second Continental Congress on July 4, 1776. The term "United States" and its initialism "U.S.", used as nouns or as adjectives in English, are common short names for the country. The initialism "USA", a noun, is also common. "United States" and "U.S." are the established terms throughout the U.S. federal government, with prescribed rules.[l] "The States" is an established colloquial shortening of the name, used particularly from abroad; "stateside" is the corresponding adjective or adverb. "America" is the feminine form of the first word of Americus Vesputius, the Latinized name of Italian explorer Amerigo Vespucci (1454–1512);[m] it was first used as a place name by the German cartographers Martin Waldseemüller and Matthias Ringmann in 1507.[n] Vespucci first proposed that the West Indies discovered by Christopher Columbus in 1492 were part of a previously unknown landmass and not among the Indies at the eastern limit of Asia. In English, the term "America" usually does not refer to topics unrelated to the United States, despite the usage of "the Americas" to describe the totality of the continents of North and South America.
History

The first inhabitants of North America migrated from Siberia approximately 15,000 years ago, either across the Bering land bridge or along the now-submerged Ice Age coastline. Small isolated groups of hunter-gatherers are said to have migrated alongside herds of large herbivores far into Alaska, with ice-free corridors developing along the Pacific coast and valleys of North America in c. 16,500 – c. 13,500 BCE (c. 18,500 – c. 15,500 BP). The Clovis culture, which appeared around 11,000 BCE, is believed to be the first widespread culture in the Americas. Over time, Indigenous North American cultures grew increasingly sophisticated, and some, such as the Mississippian culture, developed agriculture, architecture, and complex societies. In the post-archaic period, the Mississippian cultures were located in the midwestern, eastern, and southern regions, and the Algonquian in the Great Lakes region and along the Eastern Seaboard, while the Hohokam culture and Ancestral Puebloans inhabited the Southwest. Native population estimates of what is now the United States before the arrival of European colonizers range from around 500,000 to nearly 10 million. Christopher Columbus began exploring the Caribbean for Spain in 1492, leading to Spanish-speaking settlements and missions from what are now Puerto Rico and Florida to New Mexico and California. The first Spanish colony in the present-day continental United States was Spanish Florida, chartered in 1513. After several settlements failed there due to starvation and disease, Spain's first permanent town, Saint Augustine, was founded in 1565. France established its own settlements in French Florida in 1562, but they were either abandoned (Charlesfort, 1578) or destroyed by Spanish raids (Fort Caroline, 1565). Permanent French settlements were founded much later along the Great Lakes (Fort Detroit, 1701), the Mississippi River (Saint Louis, 1764) and especially the Gulf of Mexico (New Orleans, 1718).
Early European colonies also included the thriving Dutch colony of New Netherland (settled 1626, present-day New York) and the small Swedish colony of New Sweden (settled 1638 in what became Delaware). British colonization of the East Coast began with the Virginia Colony (1607) and the Plymouth Colony (Massachusetts, 1620). The Mayflower Compact in Massachusetts and the Fundamental Orders of Connecticut established precedents for local representative self-governance and constitutionalism that would develop throughout the American colonies. While European settlers in what is now the United States experienced conflicts with Native Americans, they also engaged in trade, exchanging European tools for food and animal pelts.[o] Relations ranged from close cooperation to warfare and massacres. The colonial authorities often pursued policies that forced Native Americans to adopt European lifestyles, including conversion to Christianity. Along the eastern seaboard, settlers trafficked Africans through the Atlantic slave trade, largely to provide manual labor on plantations. The original Thirteen Colonies[p] that would later found the United States were administered as possessions of the British Empire by Crown-appointed governors, though local governments held elections open to most white male property owners. The colonial population grew rapidly from Maine to Georgia, eclipsing Native American populations; by the 1770s, the natural increase of the population was such that only a small minority of Americans had been born overseas. The colonies' distance from Britain facilitated the entrenchment of self-governance, and the First Great Awakening, a series of Christian revivals, fueled colonial interest in guaranteed religious liberty. Following its victory in the French and Indian War, Britain began to assert greater control over local affairs in the Thirteen Colonies, resulting in growing political resistance.
One of the primary grievances of the colonists was the denial of their rights as Englishmen, particularly the right to representation in the British government that taxed them. To demonstrate their dissatisfaction and resolve, the First Continental Congress met in 1774 and passed the Continental Association, a colonial boycott of British goods enforced by local "committees of safety" that proved effective. The British attempt to then disarm the colonists resulted in the 1775 Battles of Lexington and Concord, igniting the American Revolutionary War. At the Second Continental Congress, the colonies appointed George Washington commander-in-chief of the Continental Army, and created a committee that named Thomas Jefferson to draft the Declaration of Independence. Two days after the Second Continental Congress passed the Lee Resolution to create an independent, sovereign nation, the Declaration was adopted on July 4, 1776. The political values of the American Revolution evolved from an armed rebellion demanding reform within an empire to a revolution that created a new social and governing system founded on the defense of liberty and the protection of inalienable natural rights; sovereignty of the people; republicanism over monarchy, aristocracy, and other hereditary political power; civic virtue; and an intolerance of political corruption. The Founding Fathers of the United States, who included Washington, Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, James Madison, Thomas Paine, and many others, were inspired by Classical, Renaissance, and Enlightenment philosophies and ideas. Though in practical effect since its drafting in 1777, the Articles of Confederation was ratified in 1781 and formally established a decentralized government that operated until 1789. After the British surrender at the siege of Yorktown in 1781, American sovereignty was internationally recognized by the Treaty of Paris (1783), through which the U.S. 
gained territory stretching west to the Mississippi River, north to present-day Canada, and south to Spanish Florida. The Northwest Ordinance (1787) established the precedent by which the country's territory would expand with the admission of new states, rather than the expansion of existing states. The U.S. Constitution was drafted at the 1787 Constitutional Convention to overcome the limitations of the Articles. It went into effect in 1789, creating a federal republic governed by three separate branches that together formed a system of checks and balances. George Washington was elected the country's first president under the Constitution, and the Bill of Rights was adopted in 1791 to allay skeptics' concerns about the power of the more centralized government. Washington's resignation as commander-in-chief after the Revolutionary War and his later refusal to run for a third term as the country's first president established a precedent for the supremacy of civil authority in the United States and the peaceful transfer of power. In the late 18th century, American settlers began to expand westward in larger numbers, many with a sense of manifest destiny. The Louisiana Purchase of 1803 from France nearly doubled the territory of the United States. Lingering issues with Britain remained, leading to the War of 1812, which was fought to a draw. Spain ceded Florida and its Gulf Coast territory in 1819. The Missouri Compromise of 1820, which admitted Missouri as a slave state and Maine as a free state, attempted to balance the desire of northern states to prevent the expansion of slavery into new territories with that of southern states to extend it there. Primarily, the compromise prohibited slavery in all other lands of the Louisiana Purchase north of the 36°30′ parallel. As Americans expanded further into territory inhabited by Native Americans, the federal government implemented policies of Indian removal or assimilation.
The most significant such legislation was the Indian Removal Act of 1830, a key policy of President Andrew Jackson. It resulted in the Trail of Tears (1830–1850), in which an estimated 60,000 Native Americans living east of the Mississippi River were forcibly removed and displaced to lands far to the west, causing 13,200 to 16,700 deaths along the forced march. Settler expansion as well as this influx of Indigenous peoples from the East resulted in the American Indian Wars west of the Mississippi. During the colonial period, slavery became legal in all the Thirteen Colonies, but by 1770 it provided the main labor force in the large-scale, agriculture-dependent economies of the Southern Colonies from Maryland to Georgia. The practice began to be significantly questioned during the American Revolution, and, spurred by an active abolitionist movement that had reemerged in the 1830s, states in the North enacted laws to prohibit slavery within their boundaries. At the same time, support for slavery had strengthened in Southern states, with widespread use of inventions such as the cotton gin (1793) having made slavery immensely profitable for Southern elites. The United States annexed the Republic of Texas in 1845, and the 1846 Oregon Treaty led to U.S. control of the present-day American Northwest. A dispute with Mexico over Texas led to the Mexican–American War (1846–1848). After the victory of the U.S., Mexico recognized U.S. sovereignty over Texas, New Mexico, and California in the 1848 Mexican Cession; the cession's lands also included the future states of Nevada, Colorado, and Utah. The California gold rush of 1848–1849 spurred a huge migration of white settlers to the Pacific coast, leading to even more confrontations with Native populations. One of the most violent, the California genocide of thousands of Native inhabitants, lasted into the mid-1870s. Additional western territories and states were created. 
Throughout the 1850s, the sectional conflict regarding slavery was further inflamed by national legislation in the U.S. Congress and decisions of the Supreme Court. In Congress, the Fugitive Slave Act of 1850 mandated the forcible return to their owners in the South of slaves taking refuge in non-slave states, while the Kansas–Nebraska Act of 1854 effectively gutted the anti-slavery requirements of the Missouri Compromise. In its Dred Scott decision of 1857, the Supreme Court ruled against a slave brought into non-slave territory, simultaneously declaring the entire Missouri Compromise to be unconstitutional. These and other events exacerbated tensions between North and South that would culminate in the American Civil War (1861–1865). Beginning with South Carolina, 11 slave-state governments voted to secede from the United States in 1861, joining to create the Confederate States of America. All other state governments remained loyal to the Union.[q] War broke out in April 1861 after the Confederacy bombarded Fort Sumter. Following the Emancipation Proclamation on January 1, 1863, many freed slaves joined the Union army. The war began to turn in the Union's favor following the 1863 Siege of Vicksburg and Battle of Gettysburg, and the Confederates surrendered in 1865 after the Union's victory in the Battle of Appomattox Court House. Efforts toward reconstruction in the secessionist South had begun as early as 1862, but it was only after President Lincoln's assassination that the three Reconstruction Amendments to the Constitution were ratified to protect civil rights. The amendments codified nationally the abolition of slavery and involuntary servitude except as punishment for crimes, promised equal protection under the law for all persons, and prohibited discrimination on the basis of race or previous enslavement. As a result, African Americans took an active political role in ex-Confederate states in the decade following the Civil War. 
The former Confederate states were readmitted to the Union, beginning with Tennessee in 1866 and ending with Georgia in 1870. National infrastructure, including transcontinental telegraph and railroads, spurred growth in the American frontier. This was accelerated by the Homestead Acts, through which nearly 10 percent of the total land area of the United States was given away free to some 1.6 million homesteaders. From 1865 through 1917, an unprecedented stream of immigrants arrived in the United States, including 24.4 million from Europe. Most came through the Port of New York, as New York City and other large cities on the East Coast became home to large Jewish, Irish, and Italian populations. Many Northern Europeans as well as significant numbers of Germans and other Central Europeans moved to the Midwest. At the same time, about one million French Canadians migrated from Quebec to New England. During the Great Migration, millions of African Americans left the rural South for urban areas in the North. Alaska was purchased from Russia in 1867. The Compromise of 1877 is generally considered the end of the Reconstruction era, as it resolved the electoral crisis following the 1876 presidential election and led President Rutherford B. Hayes to reduce the role of federal troops in the South. Immediately, the Redeemers began evicting the Carpetbaggers and quickly regained local control of Southern politics in the name of white supremacy. African Americans endured a period of heightened, overt racism following Reconstruction, a time often considered the nadir of American race relations. A series of Supreme Court decisions, including Plessy v. Ferguson, emptied the Fourteenth and Fifteenth Amendments of their force, allowing Jim Crow laws in the South to remain unchecked, sundown towns in the Midwest, and segregation in communities across the country, which would be reinforced in part by the policy of redlining later adopted by the federal Home Owners' Loan Corporation. 
An explosion of technological advancement, accompanied by the exploitation of cheap immigrant labor, led to rapid economic expansion during the Gilded Age of the late 19th century. It continued into the early 20th century, by which time the United States outpaced the economies of Britain, France, and Germany combined. This fostered the amassing of power by a few prominent industrialists, largely by their formation of trusts and monopolies to prevent competition. Tycoons led the nation's expansion in the railroad, petroleum, and steel industries. The United States emerged as a pioneer of the automotive industry. These changes resulted in significant increases in economic inequality, slum conditions, and social unrest, creating the environment for labor unions and socialist movements to begin to flourish. This period eventually ended with the advent of the Progressive Era, which was characterized by significant economic and social reforms. Pro-American elements in Hawaii overthrew the Hawaiian monarchy; the islands were annexed in 1898. That same year, Puerto Rico, the Philippines, and Guam were ceded to the U.S. by Spain after the latter's defeat in the Spanish–American War. (The Philippines was granted full independence from the U.S. on July 4, 1946, following World War II. Puerto Rico and Guam have remained U.S. territories.) American Samoa was acquired by the United States in 1900 after the Second Samoan Civil War. The U.S. Virgin Islands were purchased from Denmark in 1917. The United States entered World War I alongside the Allies in 1917, helping to turn the tide against the Central Powers. In 1920, a constitutional amendment granted nationwide women's suffrage. During the 1920s and 1930s, radio for mass communication and early television transformed communications nationwide. The Wall Street Crash of 1929 triggered the Great Depression, to which President Franklin D. 
Roosevelt responded with the New Deal plan of "reform, recovery and relief", a series of unprecedented and sweeping recovery programs and employment relief projects combined with financial reforms and regulations. Initially neutral during World War II, the U.S. began supplying war materiel to the Allies of World War II in March 1941 and entered the war in December after Japan's attack on Pearl Harbor. Agreeing to a "Europe first" policy, the U.S. concentrated its wartime efforts on Japan's allies Italy and Germany until their final defeat in May 1945. The U.S. developed the first nuclear weapons and used them against the Japanese cities of Hiroshima and Nagasaki in August 1945, ending the war. The United States was one of the "Four Policemen" who met to plan the post-war world, alongside the United Kingdom, the Soviet Union, and China. The U.S. emerged relatively unscathed from the war, with even greater economic power and international political influence. The end of World War II in 1945 left the U.S. and the Soviet Union as superpowers, each with its own political, military, and economic sphere of influence. Geopolitical tensions between the two superpowers soon led to the Cold War. The U.S. implemented a policy of containment intended to limit the Soviet Union's sphere of influence; engaged in regime change against governments perceived to be aligned with the Soviets; and prevailed in the Space Race, which culminated with the first crewed Moon landing in 1969. Domestically, the U.S. experienced economic growth, urbanization, and population growth following World War II. The civil rights movement emerged, with Martin Luther King Jr. becoming a prominent leader in the early 1960s. The Great Society plan of President Lyndon B. Johnson's administration resulted in groundbreaking and broad-reaching laws, policies and a constitutional amendment to counteract some of the worst effects of lingering institutional racism. The counterculture movement in the U.S. 
brought significant social changes, including the liberalization of attitudes toward recreational drug use and sexuality. It also encouraged open defiance of the military draft (leading to the end of conscription in 1973) and wide opposition to U.S. intervention in Vietnam, with the U.S. totally withdrawing in 1975. A societal shift in the roles of women was significantly responsible for the large increase in female paid labor participation starting in the 1970s, and by 1985 the majority of American women aged 16 and older were employed. The Fall of Communism and the dissolution of the Soviet Union from 1989 to 1991 marked the end of the Cold War and left the United States as the world's sole superpower. This cemented the United States' global influence, reinforcing the concept of the "American Century" as the U.S. dominated international political, cultural, economic, and military affairs. The 1990s saw the longest recorded economic expansion in American history, a dramatic decline in U.S. crime rates, and advances in technology. Throughout this decade, technological innovations such as the World Wide Web, the evolution of the Pentium microprocessor in accordance with Moore's law, rechargeable lithium-ion batteries, the first gene therapy trial, and cloning either emerged in the U.S. or were improved upon there. The Human Genome Project was formally launched in 1990, while Nasdaq became the first stock market in the United States to trade online in 1998. In the Gulf War of 1991, an American-led international coalition of states expelled an Iraqi invasion force that had occupied neighboring Kuwait. The September 11 attacks on the United States in 2001 by the pan-Islamist militant organization al-Qaeda led to the war on terror and subsequent military interventions in Afghanistan and in Iraq. The U.S. housing bubble culminated in 2007 with the Great Recession, the largest economic contraction since the Great Depression. 
In the 2010s and early 2020s, the United States experienced increased political polarization and democratic backsliding. The country's polarization was violently reflected in the January 2021 Capitol attack, when a mob of insurrectionists entered the U.S. Capitol and sought to prevent the peaceful transfer of power in an attempted self-coup d'état.

Geography

The United States is the world's third-largest country by total area behind Russia and Canada.[c] The 48 contiguous states and the District of Columbia have a combined area of 3,119,885 square miles (8,080,470 km2). In 2021, the United States had 8% of the Earth's permanent meadows and pastures and 10% of its cropland. Starting in the east, the coastal plain of the Atlantic seaboard gives way to inland forests and rolling hills in the Piedmont plateau region. The Appalachian Mountains and the Adirondack Massif separate the East Coast from the Great Lakes and the grasslands of the Midwest. The Mississippi River System, the world's fourth-longest river system, runs predominantly north–south through the center of the country. The flat and fertile prairie of the Great Plains stretches to the west, interrupted by a highland region in the southeast. The Rocky Mountains, west of the Great Plains, extend north to south across the country, peaking at over 14,000 feet (4,300 m) in Colorado. The supervolcano underlying Yellowstone National Park in the Rocky Mountains, the Yellowstone Caldera, is the continent's largest volcanic feature. Farther west are the rocky Great Basin and the Chihuahuan, Sonoran, and Mojave deserts. In the northwest corner of Arizona, carved by the Colorado River, is the Grand Canyon, a steep-sided canyon and popular tourist destination known for its overwhelming visual size and intricate, colorful landscape. The Cascade and Sierra Nevada mountain ranges run close to the Pacific coast. 
The lowest and highest points in the contiguous United States are in the State of California, about 84 miles (135 km) apart. At an elevation of 20,310 feet (6,190.5 m), Alaska's Denali (also called Mount McKinley) is the highest peak in the country and on the continent. Active volcanoes in the U.S. are common throughout Alaska's Alexander and Aleutian Islands. Located entirely outside North America, the archipelago of Hawaii consists of volcanic islands, physiographically and ethnologically part of the Polynesian subregion of Oceania. In addition to its total land area, the United States has one of the world's largest marine exclusive economic zones, spanning approximately 4.5 million square miles (11.7 million km2) of ocean. With its large size and geographic variety, the United States includes most climate types. East of the 100th meridian, the climate ranges from humid continental in the north to humid subtropical in the south. The western Great Plains are semi-arid. Many mountainous areas of the American West have an alpine climate. The climate is arid in the Southwest, Mediterranean in coastal California, and oceanic in coastal Oregon, Washington, and southern Alaska. Most of Alaska is subarctic or polar. Hawaii, the southern tip of Florida, and U.S. territories in the Caribbean and Pacific are tropical. The United States receives more high-impact extreme weather incidents than any other country. States bordering the Gulf of Mexico are prone to hurricanes, and most of the world's tornadoes occur in the country, mainly in Tornado Alley. Due to climate change, extreme weather has become more frequent in the U.S. in the 21st century, with three times the number of reported heat waves compared to the 1960s. Since the 1990s, droughts in the American Southwest have become more persistent and more severe. The regions most attractive to the population are also among the most vulnerable to these hazards. The U.S. 
is one of 17 megadiverse countries containing large numbers of endemic species: about 17,000 species of vascular plants occur in the contiguous United States and Alaska, and over 1,800 species of flowering plants are found in Hawaii, few of which occur on the mainland. The United States is home to 428 mammal species, 784 birds, 311 reptiles, 295 amphibians, and around 91,000 insect species. There are 63 national parks, and hundreds of other federally managed monuments, forests, and wilderness areas, administered by the National Park Service and other agencies. About 28% of the country's land is publicly owned and federally managed, primarily in the Western States. Most of this land is protected, though some is leased for commercial use, and less than one percent is used for military purposes. Environmental issues in the United States include debates on non-renewable resources and nuclear energy, air and water pollution, biodiversity, logging and deforestation, and climate change. The U.S. Environmental Protection Agency (EPA) is the federal agency charged with addressing most environment-related issues. The idea of wilderness has shaped the management of public lands since 1964, with the Wilderness Act. The Endangered Species Act of 1973 provides a way to protect threatened and endangered species and their habitats. The United States Fish and Wildlife Service implements and enforces the Act. In 2024, the U.S. ranked 35th among 180 countries in the Environmental Performance Index.

Government and politics

The United States is a federal republic of 50 states and a federal capital district, Washington, D.C. The U.S. asserts sovereignty over five unincorporated territories and several uninhabited island possessions. It is the world's oldest surviving federation, and its presidential system of federal government has been adopted, in whole or in part, by many newly independent states worldwide following their decolonization. 
The Constitution of the United States serves as the country's supreme legal document. Most scholars describe the United States as a liberal democracy.[r] Composed of three branches, all headquartered in Washington, D.C., the federal government is the national government of the United States. The U.S. Constitution establishes a separation of powers intended to provide a system of checks and balances to prevent any of the three branches from becoming supreme. The three-branch system is known as the presidential system, in contrast to the parliamentary system where the executive is part of the legislative body. Many countries around the world adopted this aspect of the 1789 Constitution of the United States, especially in the postcolonial Americas. In the U.S. federal system, sovereign powers are shared between three levels of government specified in the Constitution: the federal government, the states, and Indian tribes. The U.S. also asserts sovereignty over five permanently inhabited territories: American Samoa, Guam, the Northern Mariana Islands, Puerto Rico, and the U.S. Virgin Islands. Residents of the 50 states are governed by their elected state government, under state constitutions compatible with the national constitution, and by elected local governments that are administrative divisions of a state. States are subdivided into counties or county equivalents, and (except for Hawaii) further divided into municipalities, each administered by elected representatives. The District of Columbia is a federal district containing the U.S. capital, Washington, D.C. The federal district is an administrative division of the federal government. Indian country is made up of 574 federally recognized tribes and 326 Indian reservations. They hold a government-to-government relationship with the U.S. federal government in Washington and are legally defined as domestic dependent nations with inherent tribal sovereignty rights. In addition to the five major territories, the U.S. 
also asserts sovereignty over the United States Minor Outlying Islands in the Pacific Ocean and the Caribbean. The seven undisputed islands without permanent populations are Baker Island, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Atoll, and Palmyra Atoll. U.S. sovereignty over the unpopulated Bajo Nuevo Bank, Navassa Island, Serranilla Bank, and Wake Island is disputed. The Constitution is silent on political parties. However, they developed independently in the 18th century with the Federalist and Anti-Federalist parties. Since then, the United States has operated as a de facto two-party system, though the parties have changed over time. Since the mid-19th century, the two main national parties have been the Democratic Party and the Republican Party. The former is perceived as relatively liberal in its political platform while the latter is perceived as relatively conservative in its platform. The United States has an established structure of foreign relations, with the world's second-largest diplomatic corps as of 2024. It is a permanent member of the United Nations Security Council and home to the United Nations headquarters. The United States is a member of the G7, G20, and OECD intergovernmental organizations. Almost all countries have embassies and many have consulates (official representatives) in the country. Likewise, nearly all countries host formal diplomatic missions with the United States, except Iran, North Korea, and Bhutan. Though Taiwan does not have formal diplomatic relations with the U.S., it maintains close unofficial relations. The United States regularly supplies Taiwan with military equipment to deter potential Chinese aggression. Its geopolitical attention also turned to the Indo-Pacific when the United States joined the Quadrilateral Security Dialogue with Australia, India, and Japan. 
The United States has a "Special Relationship" with the United Kingdom and strong ties with Canada, Australia, New Zealand, the Philippines, Japan, South Korea, Israel, and several European Union countries such as France, Italy, Germany, Spain, and Poland. The U.S. works closely with its NATO allies on military and national security issues, and with countries in the Americas through the Organization of American States and the United States–Mexico–Canada Agreement (USMCA). The U.S. exercises full international defense authority and responsibility for Micronesia, the Marshall Islands, and Palau through the Compact of Free Association. It has increasingly conducted strategic cooperation with India, while its ties with China have steadily deteriorated. Beginning in 2014, the U.S. became a key ally of Ukraine. After Donald Trump was elected U.S. president in 2024, he sought to negotiate an end to the Russo-Ukrainian War. He paused all military aid to Ukraine in March 2025, although the aid resumed later. Trump also ended U.S. intelligence sharing with the country, but this too was eventually restored. The president is the commander-in-chief of the United States Armed Forces and appoints its leaders, the secretary of defense and the Joint Chiefs of Staff. The Department of Defense, headquartered at the Pentagon near Washington, D.C., administers five of the six service branches: the U.S. Army, Marine Corps, Navy, Air Force, and Space Force. The Coast Guard is administered by the Department of Homeland Security in peacetime and can be transferred to the Department of the Navy in wartime. The total strength of the military is about 1.3 million active-duty personnel, with an additional 400,000 in reserve. The United States spent $997 billion on its military in 2024, by far the largest amount of any country, making up 37% of global military spending and accounting for 3.4% of the country's GDP. The U.S. 
possesses 42% of the world's nuclear weapons—the second-largest stockpile after that of Russia. The U.S. military is widely regarded as the most powerful and advanced in the world. The United States has the third-largest combined armed forces in the world, behind the Chinese People's Liberation Army and Indian Armed Forces. The U.S. military operates about 800 bases and facilities abroad, and maintains deployments of more than 100 active-duty personnel in 25 foreign countries. The United States has engaged in over 400 military interventions since its founding in 1776, with over half of these occurring between 1950 and 2019 and 25% occurring in the post-Cold War era. State defense forces (SDFs) are military units that operate under the sole authority of a state government. SDFs are authorized by state and federal law but are under the command of the state's governor. By contrast, the 54 U.S. National Guard organizations[t] fall under the dual control of state or territorial governments and the federal government; their units can also become federalized entities, but SDFs cannot be federalized. The National Guard personnel of a state or territory can be federalized by the president under the National Defense Act Amendments of 1933; this legislation created the Guard and provides for the integration of Army National Guard and Air National Guard units and personnel into the U.S. Army and (since 1947) the U.S. Air Force. The total number of National Guard members is about 430,000, while the estimated combined strength of SDFs is less than 10,000. There are about 18,000 police agencies, from the local to the national level, in the United States. Law in the United States is mainly enforced by local police departments and sheriff departments in their municipal or county jurisdictions. The state police departments have authority in their respective state, and federal agencies such as the Federal Bureau of Investigation (FBI) and the U.S. 
Marshals Service have national jurisdiction and specialized duties, such as protecting civil rights and national security, enforcing the rulings of U.S. federal courts and federal laws, and investigating interstate criminal activity. State courts conduct almost all civil and criminal trials, while federal courts adjudicate the much smaller number of civil and criminal cases that relate to federal law. There is no unified "criminal justice system" in the United States. The American prison system is largely heterogeneous, with thousands of relatively independent systems operating across federal, state, local, and tribal levels. In 2025, "these systems hold nearly 2 million people in 1,566 state prisons, 98 federal prisons, 3,116 local jails, 1,277 juvenile correctional facilities, 133 immigration detention facilities, and 80 Indian country jails, as well as in military prisons, civil commitment centers, state psychiatric hospitals, and prisons in the U.S. territories." Despite disparate systems of confinement, four main institutions dominate: federal prisons, state prisons, local jails, and juvenile correctional facilities. Federal prisons are run by the Federal Bureau of Prisons and hold pretrial detainees as well as people who have been convicted of federal crimes. State prisons, run by the department of corrections of each state, hold people sentenced and serving prison time (usually longer than one year) for felony offenses. Local jails are county or municipal facilities that incarcerate defendants prior to trial; they also hold those serving short sentences (typically under a year). Juvenile correctional facilities are operated by local or state governments and serve as longer-term placements for any minor adjudicated as delinquent and ordered by a judge to be confined. 
In January 2023, the United States had the sixth-highest per capita incarceration rate in the world—531 people per 100,000 inhabitants—and the largest prison and jail population in the world, with more than 1.9 million people incarcerated. An analysis of the World Health Organization Mortality Database from 2010 showed U.S. homicide rates "were 7 times higher than in other high-income countries, driven by a gun homicide rate that was 25 times higher".

Economy

The U.S. has a highly developed mixed economy that has been the world's largest nominally since about 1890. Its 2024 gross domestic product (GDP)[e] of more than $29 trillion constituted over 25% of nominal global economic output, or 15% at purchasing power parity (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks first in the world by nominal GDP, second when adjusted for PPP, and ninth by PPP-adjusted GDP per capita. In February 2024, the total U.S. federal government debt was $34.4 trillion. Of the world's 500 largest companies by revenue, 138 were headquartered in the U.S. in 2025, the highest number of any country. The U.S. dollar is the currency most used in international transactions and the world's foremost reserve currency, backed by the country's dominant economy, its military, the petrodollar system, its large U.S. treasuries market, and its linked eurodollar. Several countries use it as their official currency, and in others it is the de facto currency. The U.S. has free trade agreements with several countries, including the USMCA. Although the United States has reached a post-industrial level of economic development and is often described as having a service economy, it remains a major industrial power; in 2024, the U.S. manufacturing sector was the world's second-largest by value output after China's. 
New York City is the world's principal financial center, and its metropolitan area is the world's largest metropolitan economy. The New York Stock Exchange and Nasdaq, both located in New York City, are the world's two largest stock exchanges by market capitalization and trade volume. The United States is at the forefront of technological advancement and innovation in many economic fields, especially in artificial intelligence; electronics and computers; pharmaceuticals; and medical, aerospace and military equipment. The country's economy is fueled by abundant natural resources, a well-developed infrastructure, and high productivity. The largest trading partners of the United States are the European Union, Mexico, Canada, China, Japan, South Korea, the United Kingdom, Vietnam, India, and Taiwan. The United States is the world's largest importer and second-largest exporter.[u] It is by far the world's largest exporter of services. Americans have the highest average household and employee income among OECD member states, and the fourth-highest median household income in 2023, up from sixth-highest in 2013. With personal consumption expenditures of over $18.5 trillion in 2023, the U.S. has a heavily consumer-driven economy and is the world's largest consumer market. The U.S. ranked first in the number of dollar billionaires and millionaires in 2023, with 735 billionaires and nearly 22 million millionaires. Wealth in the United States is highly concentrated; in 2011, the richest 10% of the adult population owned 72% of the country's household wealth, while the bottom 50% owned just 2%. U.S. wealth inequality increased substantially since the late 1980s, and income inequality in the U.S. reached a record high in 2019. In 2024, the country had some of the highest wealth and income inequality levels among OECD countries. Since the 1970s, there has been a decoupling of U.S. wage gains from worker productivity. 
In 2016, the top fifth of earners took home more than half of all income, giving the U.S. one of the widest income distributions among OECD countries. There were about 771,480 homeless persons in the U.S. in 2024. In 2022, 6.4 million children experienced food insecurity. Feeding America estimates that around one in five, or approximately 13 million, children experience hunger in the U.S. and do not know where or when they will get their next meal. Also in 2022, about 37.9 million people, or 11.5% of the U.S. population, were living in poverty. The United States has a smaller welfare state and redistributes less income through government action than most other high-income countries. It is the only advanced economy that does not guarantee its workers paid vacation nationally and one of a few countries in the world without federal paid family leave as a legal right. The United States has a higher percentage of low-income workers than almost any other developed country, largely because of a weak collective bargaining system and lack of government support for at-risk workers. The United States has been a leader in technological innovation since the late 19th century and scientific research since the mid-20th century. Methods for producing interchangeable parts and the establishment of a machine tool industry enabled the large-scale manufacturing of U.S. consumer products in the late 19th century. By the early 20th century, factory electrification, the introduction of the assembly line, and other labor-saving techniques created the system of mass production. In the 21st century, the United States continues to be one of the world's foremost scientific powers, though China has emerged as a major competitor in many fields. The U.S. has the highest research and development expenditures of any country and ranks ninth as a percentage of GDP. In 2022, the United States was (after China) the country with the second-highest number of published scientific papers. 
In 2021, the U.S. ranked second (also after China) by the number of patent applications, and third by trademark and industrial design applications (after China and Germany), according to World Intellectual Property Indicators. In 2025 the United States ranked third (after Switzerland and Sweden) in the Global Innovation Index. The United States is considered to be a world leader in the development of artificial intelligence technology. In 2023, the United States was ranked the second most technologically advanced country in the world (after South Korea) by Global Finance magazine. The United States has maintained a space program since the late 1950s, beginning with the establishment of the National Aeronautics and Space Administration (NASA) in 1958. NASA's Apollo program (1961–1972) achieved the first crewed Moon landing with the 1969 Apollo 11 mission; it remains one of the agency's most significant milestones. Other major endeavors by NASA include the Space Shuttle program (1981–2011), the Voyager program (1972–present), the Hubble and James Webb space telescopes (launched in 1990 and 2021, respectively), and the multi-mission Mars Exploration Program (Spirit and Opportunity, Curiosity, and Perseverance). NASA is one of five agencies collaborating on the International Space Station (ISS); U.S. contributions to the ISS include several modules, including Destiny (2001), Harmony (2007), and Tranquility (2010), as well as ongoing logistical and operational support. The United States private sector dominates the global commercial spaceflight industry. Prominent American spaceflight contractors include Blue Origin, Boeing, Lockheed Martin, Northrop Grumman, and SpaceX. NASA programs such as the Commercial Crew Program, Commercial Resupply Services, Commercial Lunar Payload Services, and NextSTEP have facilitated growing private-sector involvement in American spaceflight. 
In 2023, the United States received approximately 84% of its energy from fossil fuels, and its largest source of energy was petroleum (38%), followed by natural gas (36%), renewable sources (9%), coal (9%), and nuclear power (9%). In 2022, the United States constituted about 4% of the world's population, but consumed around 16% of the world's energy. The U.S. ranks as the second-highest emitter of greenhouse gases behind China. The U.S. is the world's largest producer of nuclear power, generating around 30% of the world's nuclear electricity. It also has the highest number of nuclear power reactors of any country. As of 2024, the U.S. plans to triple its nuclear power capacity by 2050. The United States' 4 million miles (6.4 million kilometers) of road network, owned almost entirely by state and local governments, is the longest in the world. The extensive Interstate Highway System that connects all major U.S. cities is funded mostly by the federal government but maintained by state departments of transportation. The system is further extended by state highways and some private toll roads. The U.S. was among the top ten countries in vehicle ownership per capita (850 vehicles per 1,000 people) in 2022. A 2022 study found that 76% of U.S. commuters drive alone and 14% ride a bicycle, including bike owners and users of bike-sharing networks. About 11% use some form of public transportation. Public transportation in the United States is well developed in the largest urban areas, notably New York City, Washington, D.C., Boston, Philadelphia, Chicago, and San Francisco; otherwise, coverage is generally less extensive than in most other developed countries. The U.S. also has many relatively car-dependent localities. Long-distance intercity travel is provided primarily by airlines, but travel by rail is more common along the Northeast Corridor, the only high-speed rail line in the U.S. that meets international standards. 
Amtrak, the country's government-sponsored national passenger rail company, has a relatively sparse network compared to those of Western European countries. Service is concentrated in the Northeast, California, the Midwest, the Pacific Northwest, and Virginia/Southeast. The United States has an extensive air transportation network. U.S. civilian airlines are all privately owned. The three largest airlines in the world, by total number of passengers carried, are U.S.-based; American Airlines became the global leader after its 2013 merger with US Airways. Of the 50 busiest airports in the world, 16 are in the United States, including five of the top 10. The world's busiest airport by passenger volume is Hartsfield–Jackson Atlanta International in Atlanta, Georgia. In 2022, most of the 19,969 U.S. airports were owned and operated by local government authorities; some airports are privately owned. Some 5,193 are designated as "public use", including for general aviation. The Transportation Security Administration (TSA) has provided security at most major airports since 2001. The country's rail transport network, the longest in the world at 182,412.3 mi (293,564.2 km), handles mostly freight (in contrast to the more passenger-centered rail networks of Europe). Because they are often privately owned, U.S. railroads lag behind those of the rest of the world in electrification. The country's inland waterways are the world's fifth-longest, totaling 25,482 mi (41,009 km). They are used extensively for freight, recreation, and a small amount of passenger traffic. Of the world's 50 busiest container ports, four are located in the United States, with the busiest in the country being the Port of Los Angeles.
Demographics
The U.S. Census Bureau reported 331,449,281 residents on April 1, 2020,[v] making the United States the third-most-populous country in the world, after India and China. 
The Census Bureau's official 2025 population estimate was 341,784,857, an increase of 3.1% since the 2020 census. According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S. population had a net gain of one person every 16 seconds, or about 5,400 people per day. In 2023, 51% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 34% had never been married. In 2023, the total fertility rate for the U.S. stood at 1.6 children per woman; at 23%, the country had the world's highest rate of children living in single-parent households in 2019. Most Americans live in the suburbs of major metropolitan areas. The United States has a diverse population; 37 ancestry groups have more than one million members. White Americans with ancestry from Europe, the Middle East, or North Africa form the largest racial and ethnic group, at 57.8% of the United States population. Hispanic and Latino Americans form the second-largest group, at 18.7% of the United States population. African Americans constitute the country's third-largest ancestry group, at 12.1% of the total U.S. population. Asian Americans are the country's fourth-largest group, composing 5.9% of the United States population. The country's 3.7 million Native Americans account for about 1%, and some 574 Native American tribes are recognized by the federal government. In 2024, the median age of the United States population was 39.1 years. While many languages and dialects are spoken in the United States, English is by far the most commonly spoken and written. English is the de facto official language of the United States, and in 2025 Executive Order 14224 declared it the official language. However, the U.S. has never had a de jure official language, as Congress has never passed a law to designate English as official for all three federal branches. Some laws, such as U.S. naturalization requirements, nonetheless standardize English. 
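As a quick arithmetic check, the Population Clock rate cited above (a net gain of one person every 16 seconds) can be converted to the stated daily figure. This is a minimal sketch using only the constants given in the text:

```python
# U.S. Census Bureau Population Clock, July 1, 2024:
# a net gain of one person every 16 seconds.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 seconds in a day
GAIN_INTERVAL = 16              # seconds per net new person

net_gain_per_day = SECONDS_PER_DAY // GAIN_INTERVAL
print(net_gain_per_day)  # 5400, i.e. "about 5,400 people per day"
```

The interval divides the day evenly, so the figure in the text is exact rather than rounded.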
Twenty-eight states and the United States Virgin Islands have laws that designate English as the sole official language; 19 states and the District of Columbia have no official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English: Hawaii (Hawaiian), Alaska (twenty Native languages),[w] South Dakota (Sioux), American Samoa (Samoan), Puerto Rico (Spanish), Guam (Chamorro), and the Northern Mariana Islands (Carolinian and Chamorro). In total, 169 Native American languages are spoken in the United States. In Puerto Rico, Spanish is more widely spoken than English. According to the American Community Survey (2020), some 245.4 million people in the U.S. age five and older spoke only English at home. About 41.2 million spoke Spanish at home, making it the second most commonly used language. Other languages spoken at home by one million people or more include Chinese (3.40 million), Tagalog (1.71 million), Vietnamese (1.52 million), Arabic (1.39 million), French (1.18 million), Korean (1.07 million), and Russian (1.04 million). German, spoken by 1 million people at home in 2010, fell to 857,000 total speakers in 2020. America's immigrant population is by far the world's largest in absolute terms. In 2022, there were 87.7 million immigrants and U.S.-born children of immigrants in the United States, accounting for nearly 27% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 6% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants. In 2019, the top countries of origin for immigrants were Mexico (24% of immigrants), India (6%), China (5%), the Philippines (4.5%), and El Salvador (3%). In fiscal year 2022, over one million immigrants (most of whom entered through family reunification) were granted legal residence. 
The undocumented immigrant population in the U.S. reached a record high of 14 million in 2023. The First Amendment guarantees the free exercise of religion in the country and forbids Congress from passing laws respecting its establishment. Religious practice is widespread and among the most diverse in the world. The country has the world's largest Christian population, which includes the fourth-largest population of Catholics. Other notable faiths include Judaism, Buddhism, Hinduism, Islam, New Age, and Native American religions. Religious practice varies significantly by region. "Ceremonial deism" is common in American culture. The overwhelming majority of Americans believe in a higher power or spiritual force, engage in spiritual practices such as prayer, and consider themselves religious or spiritual. In the Southern United States' "Bible Belt", evangelical Protestantism plays a significant role culturally; New England and the Western United States tend to be more secular. Mormonism, a Restorationist movement founded in the U.S. in 1830, is the predominant religion in Utah and a major religion in Idaho. About 82% of Americans live in metropolitan areas, particularly in suburbs; about half of those reside in cities with populations over 50,000. In 2022, 333 incorporated municipalities had populations over 100,000, nine cities had more than one million residents, and four cities—New York City, Los Angeles, Chicago, and Houston—had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West. According to the Centers for Disease Control and Prevention (CDC), average U.S. life expectancy at birth reached 79.0 years in 2024, its highest recorded level. This was an increase of 0.6 years over 2023. 
The CDC attributed the improvement to a significant fall in the number of fatal drug overdoses in the country, noting that "heart disease continues to be the leading cause of death in the United States, followed by cancer and unintentional injuries." In 2024, life expectancy at birth for American men rose to 76.5 years (+0.7 years compared to 2023), while life expectancy for women was 81.4 years (+0.3 years). Starting in 1998, life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been widening ever since. The Commonwealth Fund reported in 2020 that the U.S. had the highest suicide rate among high-income countries. Approximately one-third of the U.S. adult population is obese and another third is overweight. The U.S. healthcare system far outspends that of any other country, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes when compared to peer countries for reasons that are debated. The United States is the only developed country without a system of universal healthcare, and a significant proportion of its population does not carry health insurance. Government-funded healthcare coverage for the poor (Medicaid) and for those age 65 and older (Medicare) is available to Americans who meet the programs' income or age qualifications. In 2010, then-President Obama signed the Patient Protection and Affordable Care Act into law.[x] Abortion in the United States is not federally protected, and is illegal or restricted in 17 states. American primary and secondary education, known in the U.S. as K–12 ("kindergarten through 12th grade"), is decentralized. School systems are operated by state, territorial, and sometimes municipal governments and regulated by the U.S. Department of Education. 
In general, children are required to attend school or an approved homeschool from the age of five or six (kindergarten or first grade) until they are 18 years old. This often brings students through the 12th grade, the final year of a U.S. high school, but some states and territories allow them to leave school earlier, at age 16 or 17. The U.S. spends more on education per student than any other country, an average of $18,614 per year per public elementary and secondary school student in 2020–2021. Among Americans age 25 and older, 92.2% graduated from high school, 62.7% attended some college, 37.7% earned a bachelor's degree, and 14.2% earned a graduate degree. The U.S. literacy rate is near-universal. The U.S. has produced the most Nobel Prize winners of any country, with 411 laureates (winning 413 awards). U.S. tertiary or higher education has earned a global reputation. Many of the world's top universities, as listed by various ranking organizations, are in the United States, including 19 of the top 25. American higher education is dominated by state university systems, although the country's many private universities and colleges enroll about 20% of all American students. Local community colleges generally offer open admissions, lower tuition, and coursework leading to a two-year associate degree or a non-degree certificate. In public expenditure on higher education, the U.S. spends more per student than the OECD average and leads all nations in combined public and private spending. Colleges and universities directly funded by the federal government do not charge tuition and are limited to military personnel and government employees; they include the U.S. service academies, the Naval Postgraduate School, and military staff colleges. Despite some student loan forgiveness programs in place, student loan debt increased by 102% between 2010 and 2020, and exceeded $1.7 trillion in 2022. 
Culture and society
The United States is home to a wide variety of ethnic groups, traditions, and customs. The country has been described as having the values of individualism and personal autonomy, as well as a strong work ethic and competitiveness. Voluntary altruism also plays a major role; according to a 2016 study by the Charities Aid Foundation, Americans donated 1.44% of total GDP to charity—the highest rate in the world by a large margin. Americans have traditionally been characterized by a unifying political belief in an "American Creed" emphasizing consent of the governed, liberty, equality under the law, democracy, social equality, property rights, and a preference for limited government. The U.S. has acquired significant hard and soft power through its diplomatic influence, economic power, military alliances, and cultural exports such as American movies, music, video games, sports, and food. The influence that the United States exerts on other countries through soft power is referred to as Americanization. Nearly all present Americans or their ancestors came from Europe, Africa, or Asia (the "Old World") within the past five centuries. Mainstream American culture is a Western culture largely derived from the traditions of European immigrants with influences from many other sources, such as traditions brought by slaves from Africa. More recent immigration from Asia and especially Latin America has added to a cultural mix that has been described as both a homogenizing melting pot and a heterogeneous salad bowl, with immigrants contributing to, and often assimilating into, mainstream American culture. Under the First Amendment to the Constitution, the United States is considered to have the strongest protections of free speech of any country. Flag desecration, hate speech, blasphemy, and lese majesty are all forms of protected expression. 
A 2016 Pew Research Center poll found that Americans were the most supportive of free expression of any polity measured. Additionally, they are the "most supportive of freedom of the press and the right to use the Internet without government censorship". The U.S. is a socially progressive country with permissive attitudes surrounding human sexuality. LGBTQ rights in the United States are among the most advanced by global standards. The American Dream, or the perception that Americans enjoy high levels of social mobility, plays a key role in attracting immigrants. Whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a classless society, scholars identify significant differences between the country's social classes, affecting socialization, language, and values. Americans tend to greatly value socioeconomic achievement, but being ordinary or average is promoted by some as a noble condition as well. The National Foundation on the Arts and the Humanities is an agency of the United States federal government that was established in 1965 to "develop and promote a broadly conceived national policy of support for the humanities and the arts in the United States, and for institutions which preserve the cultural heritage of the United States." It is composed of four sub-agencies. Colonial American authors were influenced by John Locke and other Enlightenment philosophers. The American Revolutionary Period (1765–1783) is notable for the political writings of Benjamin Franklin, Alexander Hamilton, Thomas Paine, and Thomas Jefferson. Shortly before and after the Revolutionary War, the newspaper rose to prominence, filling a demand for anti-British national literature. An early novel is William Hill Brown's The Power of Sympathy, published in 1789. 
Writer and critic John Neal in the early- to mid-19th century helped advance America toward a unique literature and culture by criticizing predecessors such as Washington Irving for imitating their British counterparts, and by influencing writers such as Edgar Allan Poe, who took American poetry and short fiction in new directions. Ralph Waldo Emerson and Margaret Fuller pioneered the influential Transcendentalism movement; Henry David Thoreau, author of Walden, was influenced by this movement. The conflict surrounding abolitionism inspired writers, like Harriet Beecher Stowe, and authors of slave narratives, such as Frederick Douglass. Nathaniel Hawthorne's The Scarlet Letter (1850) explored the dark side of American history, as did Herman Melville's Moby-Dick (1851). Major American poets of the 19th century American Renaissance include Walt Whitman, Melville, and Emily Dickinson. Mark Twain was the first major American writer to be born in the West. Henry James achieved international recognition with novels like The Portrait of a Lady (1881). As literacy rates rose, periodicals published more stories centered around industrial workers, women, and the rural poor. Naturalism, regionalism, and realism were the major literary movements of the period. While modernism generally took on an international character, modernist authors working within the United States more often rooted their work in specific regions, peoples, and cultures. Following the Great Migration to northern cities, African-American and black West Indian authors of the Harlem Renaissance developed an independent tradition of literature that rebuked a history of inequality and celebrated black culture. An important cultural export during the Jazz Age, these writings were a key influence on Négritude, a philosophy emerging in the 1930s among francophone writers of the African diaspora. 
In the 1950s, an ideal of homogeneity led many authors to attempt to write the Great American Novel, while the Beat Generation rejected this conformity, using styles that elevated the impact of the spoken word over mechanics to describe drug use, sexuality, and the failings of society. Contemporary literature is more pluralistic than in previous eras, with the closest thing to a unifying feature being a trend toward self-conscious experiments with language. Twelve American laureates have won the Nobel Prize in Literature. Media in the United States is broadly uncensored, with the First Amendment providing significant protections, as reiterated in New York Times Co. v. United States. The four major broadcasters in the U.S. are the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), American Broadcasting Company (ABC), and Fox Broadcasting Company (Fox). The four major broadcast television networks are all commercial entities. The U.S. cable television system offers hundreds of channels catering to a variety of niches. In 2021, about 83% of Americans over age 12 listened to broadcast radio, while about 40% listened to podcasts. In 2020, there were 15,460 licensed full-power radio stations in the U.S., according to the Federal Communications Commission (FCC). Much of the public radio broadcasting is supplied by National Public Radio (NPR), incorporated in February 1970 under the Public Broadcasting Act of 1967. U.S. newspapers with a global reach and reputation include The Wall Street Journal, The New York Times, The Washington Post, and USA Today. About 800 publications are produced in Spanish. With few exceptions, newspapers are privately owned, either by large chains such as Gannett or McClatchy, which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in an increasingly rare situation, by individuals or families. 
Major cities often have alternative newspapers to complement the mainstream daily papers, such as The Village Voice in New York City and LA Weekly in Los Angeles. The five most-visited websites in the world are Google, YouTube, Facebook, Instagram, and ChatGPT—all of them American-owned. Other popular platforms include X (formerly Twitter) and Amazon. In 2025, the U.S. was the world's second-largest video game market by revenue (after China). In 2015, the U.S. video game industry consisted of 2,457 companies that supported around 220,000 jobs and generated $30.4 billion in revenue. There are 444 game publishers, developers, and hardware companies in California alone. According to the Game Developers Conference (GDC), the U.S. is the top location for video game development, with 58% of the world's game developers based there in 2025. The United States is well known for its theater. Mainstream theater in the United States derives from the old European theatrical tradition and has been heavily influenced by British theater. By the middle of the 19th century, America had created new distinct dramatic forms in the Tom Shows, the showboat theater, and the minstrel show. The central hub of the American theater scene is the Theater District in Manhattan, with its divisions of Broadway, off-Broadway, and off-off-Broadway. Many movie and television celebrities have gotten their big break working in New York productions. Outside New York City, many cities have professional regional or resident theater companies that produce their own seasons. The biggest-budget theatrical productions are musicals. U.S. theater has an active community theater culture. The Tony Awards recognize excellence in live Broadway theater and are presented at an annual ceremony in Manhattan. The awards are given for Broadway productions and performances. One is also given for regional theater. 
Several discretionary non-competitive awards are given as well, including a Special Tony Award, the Tony Honors for Excellence in Theatre, and the Isabelle Stevenson Award. Folk art in colonial America grew out of artisanal craftsmanship in communities that allowed commonly trained people to individually express themselves. It was distinct from Europe's tradition of high art, which was less accessible and generally less relevant to early American settlers. Cultural movements in art and craftsmanship in colonial America generally lagged behind those of Western Europe. For example, the prevailing medieval style of woodworking and primitive sculpture became integral to early American folk art, despite the emergence of Renaissance styles in England in the late 16th and early 17th centuries. The new English styles would have been early enough to make a considerable impact on American folk art, but American styles and forms had already been firmly adopted. Not only did styles change slowly in early America, but there was a tendency for rural artisans there to continue their traditional forms longer than their urban counterparts did—and far longer than those in Western Europe. The Hudson River School was a mid-19th-century movement in the visual arts tradition of European naturalism. The 1913 Armory Show in New York City, an exhibition of European modernist art, shocked the public and transformed the U.S. art scene. American Realism and American Regionalism sought to reflect and give America new ways of looking at itself. Georgia O'Keeffe, Marsden Hartley, and others experimented with new and individualistic styles, which would become known as American modernism. Major artistic movements such as the abstract expressionism of Jackson Pollock and Willem de Kooning and the pop art of Andy Warhol and Roy Lichtenstein developed largely in the United States. 
Major photographers include Alfred Stieglitz, Edward Steichen, Dorothea Lange, Edward Weston, James Van Der Zee, Ansel Adams, and Gordon Parks. The tide of modernism and then postmodernism has brought global fame to American architects, including Frank Lloyd Wright, Philip Johnson, and Frank Gehry. The Metropolitan Museum of Art in Manhattan is the largest art museum in the United States and the fourth-largest in the world. American folk music encompasses numerous music genres, variously known as traditional music, traditional folk music, contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and sometimes trace back to such origins as the British Isles, mainland Europe, or Africa. The rhythmic and lyrical styles of African-American music in particular have influenced American music. Banjos were brought to America through the slave trade. Minstrel shows incorporating the instrument into their acts led to its increased popularity and widespread production in the 19th century. The electric guitar, first invented in the 1930s, and mass-produced by the 1940s, had an enormous influence on popular music, in particular due to the development of rock and roll. The synthesizer, turntablism, and electronic music were also largely developed in the U.S. Elements from folk idioms such as the blues and old-time music were adopted and transformed into popular genres with global audiences. Jazz grew from blues and ragtime in the early 20th century, developing from the innovations and recordings of composers such as W.C. Handy and Jelly Roll Morton. Louis Armstrong and Duke Ellington increased its popularity early in the 20th century. Country music developed in the 1920s, bluegrass and rhythm and blues in the 1940s, and rock and roll in the 1950s. In the 1960s, Bob Dylan emerged from the folk revival to become one of the country's most celebrated songwriters. 
The musical forms of punk and hip hop both originated in the United States in the 1970s. The United States has the world's largest music market, with a total retail value of $15.9 billion in 2022. Most of the world's major record companies are based in the U.S.; they are represented by the Recording Industry Association of America (RIAA). Mid-20th-century American pop stars, such as Frank Sinatra and Elvis Presley, became global celebrities and best-selling music artists, as did artists of the late 20th century, such as Michael Jackson, Madonna, Whitney Houston, and Mariah Carey, and of the early 21st century, such as Eminem, Britney Spears, Lady Gaga, Katy Perry, Taylor Swift, and Beyoncé. The United States has the world's largest apparel market by revenue. Apart from professional business attire, American fashion is eclectic and predominantly informal. Americans' diverse cultural roots are reflected in their clothing; however, sneakers, jeans, T-shirts, and baseball caps are emblematic of American styles. New York, with its Fashion Week, is considered to be one of the "Big Four" global fashion capitals, along with Paris, Milan, and London. A study found that proximity to Manhattan's Garment District has been synonymous with American fashion since the district's emergence in the early 20th century. A number of well-known designer labels, among them Tommy Hilfiger, Ralph Lauren, Tom Ford, and Calvin Klein, are headquartered in Manhattan. Labels cater to niche markets, such as preteens. New York Fashion Week is one of the most influential fashion shows in the world, and is held twice each year in Manhattan; the annual Met Gala, also in Manhattan, has been called the fashion world's "biggest night". The U.S. film industry has a worldwide influence and following. Hollywood, a district in central Los Angeles, the nation's second-most populous city, is also metonymous for the American filmmaking industry. 
The major film studios of the United States are the primary source of the most commercially successful and most ticket-selling movies in the world. Largely centered in the New York City region from its beginnings in the late 19th century through the first decades of the 20th century, the U.S. film industry has since been primarily based in and around Hollywood. Nonetheless, American film companies have been subject to the forces of globalization in the 21st century, and an increasing number of films are made elsewhere. The Academy Awards, popularly known as "the Oscars", have been held annually by the Academy of Motion Picture Arts and Sciences since 1929, and the Golden Globe Awards have been held annually since January 1944. The industry peaked in what is commonly referred to as the "Golden Age of Hollywood", from the early sound period until the early 1960s, with screen actors such as John Wayne and Marilyn Monroe becoming iconic figures. In the 1970s, "New Hollywood", or the "Hollywood Renaissance", was defined by grittier films influenced by French and Italian realist pictures of the post-war period. The 21st century has been marked by the rise of American streaming platforms, which came to rival traditional cinema. Early settlers were introduced by Native Americans to foods such as turkey, sweet potatoes, corn, squash, and maple syrup. Among the most enduring and pervasive examples are variations of the native dish called succotash. Early settlers and later immigrants combined these with foods they were familiar with, such as wheat flour, beef, and milk, to create a distinctive American cuisine. New World crops, especially pumpkin, corn, potatoes, and turkey as the main course, are part of a shared national menu on Thanksgiving, when many Americans prepare or purchase traditional dishes to celebrate the occasion. 
Characteristic American dishes such as apple pie, fried chicken, doughnuts, french fries, macaroni and cheese, ice cream, hamburgers, hot dogs, and American pizza derive from the recipes of various immigrant groups. Mexican dishes such as burritos and tacos preexisted the United States in areas later annexed from Mexico, and adaptations of Chinese cuisine as well as pasta dishes freely adapted from Italian sources are all widely consumed. American chefs have had a significant impact on society both domestically and internationally. In 1946, the Culinary Institute of America was founded by Katharine Angell and Frances Roth. This would become the United States' most prestigious culinary school, where many of the most talented American chefs would study prior to successful careers. The United States restaurant industry was projected at $899 billion in sales for 2020, and employed more than 15 million people, representing 10% of the nation's workforce directly. It is the country's second-largest private employer and the third-largest employer overall. The United States is home to over 220 Michelin star-rated restaurants, 70 of which are in New York City. Wine has been produced in what is now the United States since the 1500s, with the first widespread production beginning in what is now New Mexico in 1628. In the modern U.S., wine production is undertaken in all fifty states, with California producing 84 percent of all U.S. wine. With more than 1,100,000 acres (4,500 km2) under vine, the United States is the fourth-largest wine-producing country in the world, after Italy, Spain, and France. The classic American diner, a casual restaurant type originally intended for the working class, emerged during the 19th century from converted railroad dining cars made stationary. The diner soon evolved into purpose-built structures whose number expanded greatly in the 20th century. The American fast-food industry developed alongside the nation's car culture. 
American restaurants developed the drive-in format in the 1920s, which they began to replace with the drive-through format by the 1940s. American fast-food restaurant chains, such as McDonald's, Burger King, Chick-fil-A, Kentucky Fried Chicken, Dunkin' Donuts and many others, have numerous outlets around the world. The most popular spectator sports in the U.S. are American football, basketball, baseball, soccer, and ice hockey. Their premier leagues are, respectively, the National Football League, the National Basketball Association, Major League Baseball, Major League Soccer, and the National Hockey League. All of these leagues enjoy wide-ranging domestic media coverage and, except for the MLS, all are considered the preeminent leagues in their respective sports in the world. While most major U.S. sports such as baseball and American football have evolved out of European practices, basketball, volleyball, skateboarding, and snowboarding are American inventions, many of which have become popular worldwide. Lacrosse and surfing arose from Native American and Native Hawaiian activities that predate European contact. The market for professional sports in the United States was approximately $69 billion in July 2013, roughly 50% larger than that of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States. Although American football does not have a substantial following in other nations, the NFL does have the highest average attendance (67,254) of any professional sports league in the world. In 2024, the NFL generated over $23 billion in revenue, making it the most valuable professional sports league in the United States and the world. Baseball has been regarded as the U.S. "national sport" since the late 19th century. The most-watched individual sports in the U.S. are golf and auto racing, particularly NASCAR and IndyCar. 
On the collegiate level, earnings for the member institutions exceed $1 billion annually, and college football and basketball attract large audiences, as the NCAA March Madness tournament and the College Football Playoff are among the most-watched national sporting events. In the U.S., the intercollegiate sports level serves as the main feeder system for professional and Olympic sports, with significant exceptions such as Minor League Baseball. This differs greatly from practices in nearly all other countries, where publicly and privately funded sports organizations serve this function. Eight Olympic Games have taken place in the United States. The 1904 Summer Olympics in St. Louis, Missouri, were the first-ever Olympic Games held outside of Europe. The Olympic Games will be held in the U.S. for a ninth time when Los Angeles hosts the 2028 Summer Olympics. U.S. athletes have won a total of 2,968 medals (1,179 gold) at the Olympic Games, the most of any country. In other international competition, the United States is the home of a number of prestigious events, including the America's Cup, World Baseball Classic, the U.S. Open, and the Masters Tournament. The U.S. men's national soccer team has qualified for eleven World Cups, while the women's national team has won the FIFA Women's World Cup and Olympic soccer tournament four and five times, respectively. The 1999 FIFA Women's World Cup was hosted by the United States. Its final match was attended by 90,185, setting the world record for largest women's sporting event crowd at the time. The United States hosted the 1994 FIFA World Cup and will co-host, along with Canada and Mexico, the 2026 FIFA World Cup.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_note-:82-289]
Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026, Forbes estimates his net worth to be around US$852 billion. Born into a wealthy family in Pretoria, South Africa, Musk emigrated to Canada in 1989; he holds Canadian citizenship through his Canadian-born mother. He received bachelor's degrees from the University of Pennsylvania in 1997 before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002. Musk became an American citizen that same year. In 2002, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research, but later left; growing discontent with the organization's direction and its leadership during the AI boom of the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, Tesla shareholders approved a pay package for Musk worth $1 trillion, to be paid out over 10 years if he meets specific goals. Musk was the largest donor in the 2024 U.S. presidential election, in which he supported Donald Trump. 
After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published in 2025 and 2026 and became a topic of worldwide debate.

Early life

Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa. Musk therefore holds both South African and Canadian citizenship from birth. His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer, who partly owned a rental lodge at Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator, and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa. 
Despite both Elon and Errol previously stating that Errol was a part owner of a Zambian emerald mine, in 2023, Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party and has said that his children shared their father's dislike of apartheid. After his parents divorced in 1979, Elon, aged around 9, chose to live with his father because Errol Musk had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies" where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten severely, leading to him being hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?" Elon was an enthusiastic reader and has attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself how to program from the VIC-20 user manual. At age twelve, Elon sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, where he graduated. Musk was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification. 
Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, as well as to ease his path to immigration to the United States. While waiting for his application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill. In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997 – a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School. He reportedly hosted large, ticketed house parties to help pay for tuition, and wrote a business plan for an electronic book-scanning service similar to Google Books. In 1994, Musk held two internships in Silicon Valley: one at energy storage startup Pinnacle Research Institute, which investigated electrolytic supercapacitors for energy storage, and another at Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided to join the Internet boom of the 1990s, applying for a job at Netscape, to which he reportedly never received a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa transitioned to an H1-B. According to numerous former business associates and shareholders, Musk said he was on a student visa at the time. 
Business career

In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors. They housed the venture at a small rented office in Palo Alto. In an interview with Rolling Stone, Musk denied the notion that they started the company with funds borrowed from Errol Musk, though in a tweet he acknowledged that his father contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune, and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590,000,000 in 2025), and Musk received $22 million (equivalent to $43,000,000 in 2025) for his 7-percent share. In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and, in its initial months of operation, over 200,000 customers joined the service. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with online bank Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign. 
Due to resulting technological issues and lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000. Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2,700,000,000 in 2025) in stock, of which Musk—the largest shareholder with 11.72% of shares—received $175.8 million (equivalent to $320,000,000 in 2025). In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value. In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth-chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune (equivalent to $180,000,000 in 2025), Musk founded SpaceX in May 2002 and became the company's CEO and Chief Engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, it was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2,400,000,000 in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, in 2015 SpaceX successfully landed the first stage of a Falcon 9 on a land platform. 
Later landings were achieved on autonomous spaceport drone ships, ocean-based recovery platforms. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, the Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million (equivalent to $865,000,000 in 2025) contract to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan. In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large constellation was deployed in May 2019. As of May 2025, over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 to be $10 billion (equivalent to $12,000,000,000 in 2025). During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440,000,000 in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response. Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement. 
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11,000,000 in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm. Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass-produced all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several well-selling electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In November 2018, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over him tweeting that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in summer 2020, and it entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1,200,000,000,000 in 2025), the sixth company in U.S. history to do so. Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second largest provider of solar power systems in the United States. 
In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2,700,000,000 in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor. In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence (AI) by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions like spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials. Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trials—which have caused the deaths of some monkeys—have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. 
Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink. In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018. It used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021. Local officials have approved further expansions of the tunnel system. In early 2017, Musk expressed interest in buying Twitter and questioned the platform's commitment to freedom of speech. By 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder. Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter. By the end of April, Musk had successfully concluded his bid for approximately $44 billion, including approximately $12.5 billion in loans and $21 billion in equity financing. After attempting to back out of the deal, Musk completed the purchase on October 27, 2022. 
Immediately after the acquisition, Musk fired several top Twitter executives, including CEO Parag Agrawal, and became CEO himself. Under Musk, Twitter instituted monthly subscriptions for a "blue check" and laid off a significant portion of the company's staff. Musk reduced content moderation, and hate speech increased on the platform after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of the Hunter Biden laptop controversy in the lead-up to the 2020 presidential election. Musk promised to step down as CEO after a Twitter poll; five months later, he did so, transitioning to the roles of executive chairman and chief technology officer (CTO). Despite Musk stepping down as CEO, X continues to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence critics, such as Twitch streamer Asmongold, who criticized him during one of his streams, by removing their accounts' blue checkmarks, which hinders visibility and is considered a form of shadow banning, or by suspending their accounts without justification.

Other activities

In August 2013, Musk announced plans for a version of a vactrain and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances. In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company, and initially gave $50 million. In 2018, Musk left the OpenAI board. 
Since 2018, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings like OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI. Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets, and the consequent fossil fuel usage, has received criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter and temporarily banned the accounts of journalists who posted stories about the incident, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept. In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content while framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian, and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies.

Politics

Musk is an outlier among business leaders, who typically avoid partisan political advocacy. Musk was a registered independent voter when he lived in California. 
Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 Texas's 34th congressional district special election. In 2024, he started supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income, and endorsed Kanye West's 2020 presidential campaign. In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign, and hosted DeSantis's campaign announcement on a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips. In October 2025, former vice-president Kamala Harris commented that it had been a mistake on the Democratic side not to invite Musk to a White House electric vehicle event held in August 2021 featuring executives from General Motors, Ford, and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space." 
Fortune remarked that this was a nod to United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation, and guessed that the non-invitation impacted Musk's perspective. Fortune noted that, at the time, Musk said, "Yeah, seems odd that Tesla wasn't invited." A month later, he criticized Biden as "not the friendliest administration." Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen and other capitalists, actually flourished under Biden, but the tech leaders chose Trump for their common ground on cultural issues. By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying: "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a campaign rally, and during the campaign promoted conspiracy theories and falsehoods about Democrats, election fraud and immigration, in support of Trump. Musk was the largest individual donor of the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers. In 2023, Musk said he shunned the World Economic Forum because it was boring. The organization commented that they had not invited him since 2015. He has, however, participated in Dialog, a gathering dubbed "Tech Bilderberg" and organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain and the United Kingdom, particularly due to his position in the U.S. government as well as ownership of X. 
An NBC News analysis found he had boosted far-right political movements to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023. During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or a fascist Roman salute.[e] He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided on partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries. The concept of the Department of Government Efficiency (DOGE) emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, with Musk accepting the offer. In November and December 2024, Musk suggested that the organization could help to cut the U.S. federal budget, consolidate the number of federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role was not clear.
In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE. A federal judge has ruled that Musk acted as the de facto leader of DOGE. Musk's role in the second Trump administration, particularly his work with DOGE, has attracted public backlash. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He has prioritized secrecy within the organization and has accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by this time, most of them of children. By November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when his 130-day term as a special government employee expired, with a White House official confirming that Musk's offboarding from the Trump administration was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025. After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit. In a June 5, 2025, post on X, Musk wrote: "@realDonaldTrump is in the Epstein files. That is the real reason they have not been made public."
A feud began between Musk and Trump, with its most notable event being Musk alleging on X (formerly Twitter) on June 5, 2025, that Trump had ties to sex offender Jeffrey Epstein. Trump responded on Truth Social, stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away, and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk, and added that Musk would face "very serious consequences" if he funded Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have received criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates up until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth taxes, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration, and regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially the colonization of Mars, repeatedly pushing for humanity to colonize Mars in order to become an interplanetary species and lower the risk of human extinction.
Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. Although he describes himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdown restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and was described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024. Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was subsequently attacked as being responsible for spreading misinformation and amplifying the far-right. He has also voiced his support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S.
Securities and Exchange Commission (SEC) for a tweet stating that funding had been secured for potentially taking Tesla private.[f] The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were fined $20 million each, and Musk was forced to step down for three years as Tesla chairman but was able to remain as CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the previous agreement details, including a list of topics about which Musk needed preclearance. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla stock price ("too high imo") violated the agreement. Freedom of Information Act (FOIA)-released records showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting regarding "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded. 
McCormick called the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger's syndrome in a TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ... but then that turned out to be wrong — [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated he uses doctor-prescribed ketamine for occasional depression and that he doses "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine and other drugs. Musk at first refused to comment on his alleged drug use, before responding that he had not tested positive for drugs, and that if drugs somehow improved his productivity, "I would definitely take them!". 
The New York Times' investigations revealed Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concerns from close associates who have become troubled by his public behavior as he became more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he has stated have a "restoring effect" that helps his "mental calibration". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings. Musk has justified the boosting by claiming that all top accounts do it, so he has to as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its creator Ubisoft for "woke" content. Musk posted to X that "DEI kills art" and singled out the inclusion of the historical figure Yasuke in the game as offensive; he also called the game "terrible". Ubisoft responded by saying that Musk's comments were "just feeding hatred" and that they were focused on producing a game, not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest that the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000.
In 2002, their first child Nevada Musk died of sudden infant death syndrome at the age of 10 weeks. After Nevada's death, the couple used in vitro fertilization (IVF) to continue their family; they had twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and have shared custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year. After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters that are not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the pregnancy, Musk confirmed reports that the couple were "semi-separated" in September 2021; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Musk has taken X Æ A-Xii to multiple official events in Washington, D.C. during Trump's second term in office.
In July 2022, The Wall Street Journal reported that Musk allegedly had an affair with Nicole Shanahan, the wife of Google co-founder Sergey Brin, in 2021, leading to their divorce the following year. Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported that Musk had bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born in 2024 via surrogacy, and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, which media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St. Clair had filed for sole custody of her five-month-old son and for Musk to be recognised as the child's father. On March 31, 2025, Musk wrote that, while he was unsure if he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] Later reporting from The Wall Street Journal indicated that $1 million of these payments to St. Clair was structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island.
In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place after Epstein later cancelled the plans.[k] On Christmas Day 2012, Musk emailed Epstein asking, "Do you have any parties planned? I've been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I'm looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable, to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein sent an email asking Musk if he had any plans to come to New York for the opening of the United Nations General Assembly, where many "interesting people" would be coming to his house, to which Musk responded that "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein responded by stating, "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 if he introduced Epstein to Mark Zuckerberg, Musk responded: "I don't recall introducing Epstein to anyone, as I don't know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred.
Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l] Wealth Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. First listed on the Forbes Billionaires List in 2012, Musk derived around 75% of his wealth from Tesla stock in November 2020, although he describes himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021; $400 billion in December 2024; $500 billion in October 2025; $600 billion in mid-December 2025; $700 billion later that month; and $800 billion in February 2026. In November 2025, a Tesla pay package worth potentially $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Public image Although his ventures have been highly influential within their separate industries starting in the 2000s, Musk only became a public figure in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions, while also often making controversial statements, in contrast to other billionaires who prefer reclusiveness to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance described people's opinions of Musk as polarized due to his "part philosopher, part troll" persona on Twitter. He has drawn condemnation for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to matters like British survivors of grooming gangs.
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president" or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021. Musk was selected as Time's "Person of the Year" for 2021. Time's then editor-in-chief Edward Felsenthal wrote that "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too."
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Animal#cite_note-Gould2011-154] | [TOKENS: 6011] |
Animal Animals are multicellular, eukaryotic organisms belonging to the biological kingdom Animalia (/ˌænɪˈmeɪliə/). With few exceptions, animals consume organic material, breathe oxygen, have myocytes and are able to move, can reproduce sexually, and grow from a hollow sphere of cells, the blastula, during embryonic development. Animals form a clade, meaning that they arose from a single common ancestor. Over 1.5 million living animal species have been described, of which around 1.05 million are insects, over 85,000 are molluscs, and around 65,000 are vertebrates. It has been estimated there are as many as 7.77 million animal species on Earth. Animal body lengths range from 8.5 μm (0.00033 in) to 33.6 m (110 ft). They have complex ecologies and interactions with each other and their environments, forming intricate food webs. The scientific study of animals is known as zoology, and the study of animal behaviour is known as ethology. The animal kingdom is divided into five major clades, namely Porifera, Ctenophora, Placozoa, Cnidaria and Bilateria. Most living animal species belong to the clade Bilateria, a highly proliferative clade whose members have a bilaterally symmetric and significantly cephalised body plan, and the vast majority of bilaterians belong to two large clades: the protostomes, which include organisms such as arthropods, molluscs, flatworms, annelids and nematodes; and the deuterostomes, which include echinoderms, hemichordates and chordates, the latter of which contains the vertebrates. The much smaller basal phylum Xenacoelomorpha has an uncertain position within Bilateria. Animals first appeared in the fossil record in the late Cryogenian period and diversified in the subsequent Ediacaran period in what is known as the Avalon explosion.
Nearly all modern animal phyla first appeared in the fossil record as marine species during the Cambrian explosion, which began around 539 million years ago (Mya), and most classes during the Ordovician radiation 485.4 Mya. Common to all living animals, 6,331 groups of genes have been identified that may have arisen from a single common ancestor that lived about 650 Mya during the Cryogenian period. Historically, Aristotle divided animals into those with blood and those without. Carl Linnaeus created the first hierarchical biological classification for animals in 1758 with his Systema Naturae, which Jean-Baptiste Lamarck expanded into 14 phyla by 1809. In 1874, Ernst Haeckel divided the animal kingdom into the multicellular Metazoa (now synonymous with Animalia) and the Protozoa, single-celled organisms no longer considered animals. In modern times, the biological classification of animals relies on advanced techniques, such as molecular phylogenetics, which are effective at demonstrating the evolutionary relationships between taxa. Humans make use of many other animal species for food (including meat, eggs, and dairy products), for materials (such as leather, fur, and wool), as pets, and as working animals for transportation and services. Dogs, the first domesticated animal, have been used in hunting, in security and in warfare, as have horses, pigeons and birds of prey; while other terrestrial and aquatic animals are hunted for sport, trophies or profit. Non-human animals are also an important cultural element of human evolution, having appeared in cave art and totems since the earliest times, and are frequently featured in mythology, religion, arts, literature, heraldry, politics, and sports. Etymology The word animal comes from the Latin noun animal of the same meaning, which is itself derived from Latin animalis 'having breath or soul'. The biological definition includes all members of the kingdom Animalia.
In colloquial usage, the term animal is often used to refer only to nonhuman animals. The term metazoa is derived from Ancient Greek μετα meta 'after' (in biology, the prefix meta- stands for 'later') and ζῷᾰ zōia 'animals', plural of ζῷον zōion 'animal'. A metazoan is any member of the group Metazoa. Characteristics Animals have several characteristics that they share with other living things. Animals are eukaryotic, multicellular, and aerobic, as are plants and fungi. Unlike plants and algae, which produce their own food, animals cannot produce their own food, a feature they share with fungi. Animals ingest organic material and digest it internally. Animals have structural characteristics that set them apart from all other living things: Typically, there is an internal digestive chamber with either one opening (in Ctenophora, Cnidaria, and flatworms) or two openings (in most bilaterians). Animal development is controlled by Hox genes, which signal the times and places to develop structures such as body segments and limbs. During development, the animal extracellular matrix forms a relatively flexible framework upon which cells can move about and be reorganised into specialised tissues and organs, making the formation of complex structures possible, and allowing cells to be differentiated. The extracellular matrix may be calcified, forming structures such as shells, bones, and spicules. In contrast, the cells of other multicellular organisms (primarily algae, plants, and fungi) are held in place by cell walls, and so develop by progressive growth. Nearly all animals make use of some form of sexual reproduction. They produce haploid gametes by meiosis; the smaller, motile gametes are spermatozoa and the larger, non-motile gametes are ova. These fuse to form zygotes, which develop via mitosis into a hollow sphere, called a blastula. In sponges, blastula larvae swim to a new location, attach to the seabed, and develop into a new sponge. 
In most other groups, the blastula undergoes more complicated rearrangement. It first invaginates to form a gastrula with a digestive chamber and two separate germ layers, an external ectoderm and an internal endoderm. In most cases, a third germ layer, the mesoderm, also develops between them. These germ layers then differentiate to form tissues and organs. Repeated instances of mating with a close relative during sexual reproduction generally lead to inbreeding depression within a population due to the increased prevalence of harmful recessive traits. Animals have evolved numerous mechanisms for avoiding close inbreeding. Some animals are capable of asexual reproduction, which often results in a genetic clone of the parent. This may take place through fragmentation; budding, such as in Hydra and other cnidarians; or parthenogenesis, where fertile eggs are produced without mating, such as in aphids. Ecology Animals are categorised into ecological groups depending on their trophic levels and how they consume organic material. Such groupings include carnivores (further divided into subcategories such as piscivores, insectivores, ovivores, etc.), herbivores (subcategorised into folivores, graminivores, frugivores, granivores, nectarivores, algivores, etc.), omnivores, fungivores, scavengers/detritivores, and parasites. Interactions between animals of each biome form complex food webs within that ecosystem. In carnivorous or omnivorous species, predation is a consumer–resource interaction where the predator feeds on another organism, its prey, which often evolves anti-predator adaptations to avoid being fed upon. Selective pressures imposed on one another lead to an evolutionary arms race between predator and prey, resulting in various antagonistic/competitive coevolutions. Almost all multicellular predators are animals.
Some consumers use multiple methods; for example, in parasitoid wasps, the larvae feed on the hosts' living tissues, killing them in the process, but the adults primarily consume nectar from flowers. Other animals may have very specific feeding behaviours, such as hawksbill sea turtles, which mainly eat sponges. Most animals rely on the biomass and bioenergy produced by plants and phytoplankton (collectively called producers) through photosynthesis. Herbivores, as primary consumers, eat the plant material directly to digest and absorb the nutrients, while carnivores and other animals on higher trophic levels indirectly acquire the nutrients by eating the herbivores or other animals that have eaten the herbivores. Animals oxidise carbohydrates, lipids, proteins and other biomolecules in cellular respiration, which allows the animal to grow and to sustain basal metabolism and fuel other biological processes such as locomotion. Some benthic animals living close to hydrothermal vents and cold seeps on the dark sea floor consume organic matter produced through chemosynthesis (via oxidising inorganic compounds such as hydrogen sulfide) by archaea and bacteria. Animals originated in the ocean; all extant animal phyla, except for Micrognathozoa and Onychophora, feature at least some marine species. However, several lineages of arthropods began to colonise land around the same time as land plants, probably between 510 and 471 million years ago, during the Late Cambrian or Early Ordovician. Vertebrates such as the lobe-finned fish Tiktaalik started to move onto land in the late Devonian, about 375 million years ago. Other notable animal groups that colonised land environments are Mollusca, Platyhelminthes, Annelida, Tardigrada, Onychophora, Rotifera, and Nematoda.
Animals occupy virtually all of Earth's habitats and microhabitats, with faunas adapted to salt water, hydrothermal vents, fresh water, hot springs, swamps, forests, pastures, deserts, air, and the interiors of other organisms. Animals are, however, not particularly heat-tolerant; very few of them can survive at constant temperatures above 50 °C (122 °F) or in the most extreme cold deserts of continental Antarctica. The collective global geomorphic influence of animals on the processes shaping the Earth's surface remains largely understudied, with most studies limited to individual species and well-known exemplars. Diversity The blue whale (Balaenoptera musculus) is the largest animal that has ever lived, weighing up to 190 tonnes and measuring up to 33.6 metres (110 ft) long. The largest extant terrestrial animal is the African bush elephant (Loxodonta africana), weighing up to 12.25 tonnes and measuring up to 10.67 metres (35.0 ft) long. The largest terrestrial animals that ever lived were titanosaur sauropod dinosaurs such as Argentinosaurus, which may have weighed as much as 73 tonnes, and Supersaurus, which may have reached 39 metres. Several animals are microscopic; some Myxozoa (obligate parasites within the Cnidaria) never grow larger than 20 μm, and one of the smallest species (Myxobolus shekel) is no more than 8.5 μm when fully grown. The following table lists estimated numbers of described extant species for the major animal phyla, along with their principal habitats (terrestrial, fresh water, and marine), and free-living or parasitic ways of life. Species estimates shown here are based on numbers described scientifically; much larger estimates have been calculated based on various means of prediction, and these can vary wildly. For instance, around 25,000–27,000 species of nematodes have been described, while published estimates of the total number of nematode species include 10,000–20,000; 500,000; 10 million; and 100 million.
Using patterns within the taxonomic hierarchy, the total number of animal species—including those not yet described—was calculated to be about 7.77 million in 2011.[a] Evolutionary origin Evidence of animals is found as long ago as the Cryogenian period. 24-Isopropylcholestane (24-ipc) has been found in rocks from roughly 650 million years ago; it is only produced by sponges and pelagophyte algae. Its likely origin is from sponges, based on molecular clock estimates for the origin of 24-ipc production in both groups: analyses of pelagophyte algae consistently recover a Phanerozoic origin, while analyses of sponges recover a Neoproterozoic origin, consistent with the appearance of 24-ipc in the fossil record. The first body fossils of animals appear in the Ediacaran, represented by forms such as Charnia and Spriggina. It had long been doubted whether these fossils truly represented animals, but the discovery of the animal lipid cholesterol in fossils of Dickinsonia establishes their nature as animals. Animals are thought to have originated under low-oxygen conditions, suggesting that they were capable of living entirely by anaerobic respiration, but as they became specialised for aerobic metabolism they became fully dependent on oxygen in their environments. Many animal phyla first appear in the fossil record during the Cambrian explosion, starting about 539 million years ago, in beds such as the Burgess Shale. Extant phyla in these rocks include molluscs, brachiopods, onychophorans, tardigrades, arthropods, echinoderms and hemichordates, along with numerous now-extinct forms such as the predatory Anomalocaris. The apparent suddenness of the event may however be an artefact of the fossil record, rather than showing that all these animals appeared simultaneously.
That view is supported by the discovery of Auroralumina attenboroughii, the earliest known Ediacaran crown-group cnidarian (557–562 mya, some 20 million years before the Cambrian explosion) from Charnwood Forest, England. It is thought to be one of the earliest predators, catching small prey with its nematocysts as modern cnidarians do. Some palaeontologists have suggested that animals appeared much earlier than the Cambrian explosion, possibly as early as 1 billion years ago. Early fossils that might represent animals appear for example in the 665-million-year-old rocks of the Trezona Formation of South Australia. These fossils are interpreted as most probably being early sponges. Trace fossils such as tracks and burrows found in the Tonian period (from 1 gya) may indicate the presence of triploblastic worm-like animals, roughly as large (about 5 mm wide) and complex as earthworms. However, similar tracks are produced by the giant single-celled protist Gromia sphaerica, so the Tonian trace fossils may not indicate early animal evolution. Around the same time, the layered mats of microorganisms called stromatolites decreased in diversity, perhaps due to grazing by newly evolved animals. Objects such as sediment-filled tubes that resemble trace fossils of the burrows of wormlike animals have been found in 1.2 gya rocks in North America, in 1.5 gya rocks in Australia and North America, and in 1.7 gya rocks in Australia. Their interpretation as having an animal origin is disputed, as they might be water-escape or other structures. Phylogeny Animals are monophyletic, meaning they are derived from a common ancestor. Animals are the sister group to the choanoflagellates, with which they form the Choanozoa. Ros-Rocher and colleagues (2021) trace the origins of animals to unicellular ancestors, providing the external phylogeny shown in the cladogram. Uncertainty of relationships is indicated with dashed lines. 
The animal clade had certainly originated by 650 mya, and may have come into being as much as 800 mya, based on molecular clock evidence for different phyla. [Cladogram outgroups: Holomycota (inc. fungi); Ichthyosporea; Pluriformea; Filasterea.] The relationships at the base of the animal tree have been debated. Other than Ctenophora, the Bilateria and Cnidaria are the only groups with symmetry, and other evidence shows they are closely related. In addition to the sponges, the Placozoa have no symmetry and were often considered a "missing link" between protists and multicellular animals. The presence of Hox genes in Placozoa shows that they were once more complex. The Porifera (sponges) have long been assumed to be sister to the rest of the animals, but there is evidence that the Ctenophora may be in that position. Molecular phylogenetics has supported both the sponge-sister and ctenophore-sister hypotheses. In 2017, Roberto Feuda and colleagues, using amino acid differences, presented both, with the following branching order for the sponge-sister view that they supported (their ctenophore-sister tree simply interchanges the places of ctenophores and sponges): Porifera; then Ctenophora; then Placozoa; then Cnidaria and Bilateria. Conversely, a 2023 study by Darrin Schultz and colleagues uses ancient gene linkages to construct a ctenophore-sister phylogeny: Ctenophora; then Porifera; then Placozoa; then Cnidaria and Bilateria. Sponges are physically very distinct from other animals, and were long thought to have diverged first, representing the oldest animal phylum and forming a sister clade to all other animals. Despite their morphological dissimilarity with all other animals, genetic evidence suggests sponges may be more closely related to other animals than the comb jellies are. Sponges lack the complex organisation found in most other animal phyla; their cells are differentiated, but in most cases not organised into distinct tissues, unlike all other animals.
They typically feed by drawing in water through pores, filtering out small particles of food. The Ctenophora and Cnidaria are radially symmetric and have digestive chambers with a single opening, which serves as both mouth and anus. Animals in both phyla have distinct tissues, but these are not organised into discrete organs. They are diploblastic, having only two main germ layers, ectoderm and endoderm. The tiny placozoans have no permanent digestive chamber and no symmetry; they superficially resemble amoebae. Their phylogeny is poorly defined, and under active research. The remaining animals, the great majority—comprising some 29 phyla and over a million species—form the Bilateria clade, whose members have a bilaterally symmetric body plan. The Bilateria are triploblastic, with three well-developed germ layers, and their tissues form distinct organs. The digestive chamber has two openings, a mouth and an anus, and in the Nephrozoa there is an internal body cavity, a coelom or pseudocoelom. These animals have a head end (anterior) and a tail end (posterior), a back (dorsal) surface and a belly (ventral) surface, and a left and a right side. A modern consensus phylogenetic tree for the Bilateria is shown below. [Cladogram: Xenacoelomorpha; Ambulacraria; Chordata; Ecdysozoa; Spiralia.] Having a front end means that this part of the body encounters stimuli, such as food, favouring cephalisation, the development of a head with sense organs and a mouth. Many bilaterians have a combination of circular muscles that constrict the body, making it longer, and an opposing set of longitudinal muscles that shorten the body; these enable soft-bodied animals with a hydrostatic skeleton to move by peristalsis. They also have a gut that extends through the basically cylindrical body from mouth to anus. Many bilaterian phyla have primary larvae which swim with cilia and have an apical organ containing sensory cells.
However, over evolutionary time, descendant lineages have evolved which have lost one or more of these characteristics. For example, adult echinoderms are radially symmetric (unlike their larvae), while some parasitic worms have extremely simplified body structures. Genetic studies have considerably changed zoologists' understanding of the relationships within the Bilateria. Most appear to belong to two major lineages, the protostomes and the deuterostomes. It is often suggested that the basalmost bilaterians are the Xenacoelomorpha, with all other bilaterians belonging to the subclade Nephrozoa. However, this suggestion has been contested, with other studies finding that xenacoelomorphs are more closely related to Ambulacraria than to other bilaterians. Protostomes and deuterostomes differ in several ways. Early in development, deuterostome embryos undergo radial cleavage during cell division, while many protostomes (the Spiralia) undergo spiral cleavage. Animals from both groups possess a complete digestive tract, but in protostomes the first opening of the embryonic gut develops into the mouth, and the anus forms secondarily. In deuterostomes, the anus forms first while the mouth develops secondarily. Most protostomes have schizocoelous development, where cells simply fill in the interior of the gastrula to form the mesoderm. In deuterostomes, the mesoderm forms by enterocoelic pouching, through invagination of the endoderm. The main deuterostome taxa are the Ambulacraria and the Chordata. Ambulacraria are exclusively marine and include acorn worms, starfish, sea urchins, and sea cucumbers. The chordates are dominated by the vertebrates (animals with backbones), which consist of fishes, amphibians, reptiles, birds, and mammals. The protostomes include the Ecdysozoa, named after their shared trait of ecdysis, growth by moulting. Among the largest ecdysozoan phyla are the arthropods and the nematodes.
The rest of the protostomes are in the Spiralia, named for their pattern of developing by spiral cleavage in the early embryo. Major spiralian phyla include the annelids and molluscs. History of classification In the classical era, Aristotle divided animals,[d] based on his own observations, into those with blood (roughly, the vertebrates) and those without. The animals were then arranged on a scale from man (with blood, two legs, rational soul) down through the live-bearing tetrapods (with blood, four legs, sensitive soul) and other groups such as crustaceans (no blood, many legs, sensitive soul) down to spontaneously generating creatures like sponges (no blood, no legs, vegetable soul). Aristotle was uncertain whether sponges were animals, which in his system ought to have sensation, appetite, and locomotion, or plants, which did not: he knew that sponges could sense touch and would contract if about to be pulled off their rocks, but that they were rooted like plants and never moved about. In 1758, Carl Linnaeus created the first hierarchical classification in his Systema Naturae. In his original scheme, the animals were one of three kingdoms, divided into the classes of Vermes, Insecta, Pisces, Amphibia, Aves, and Mammalia. Since then, the last four have all been subsumed into a single phylum, the Chordata, while his Insecta (which included the crustaceans and arachnids) and Vermes have been renamed or broken up. The process was begun in 1793 by Jean-Baptiste de Lamarck, who called the Vermes une espèce de chaos ('a chaotic mess')[e] and split the group into three new phyla: worms, echinoderms, and polyps (which contained corals and jellyfish). By 1809, in his Philosophie Zoologique, Lamarck had created nine phyla apart from vertebrates (where he still had four phyla: mammals, birds, reptiles, and fish) and molluscs, namely cirripedes, annelids, crustaceans, arachnids, insects, worms, radiates, polyps, and infusorians. 
In his 1817 Le Règne Animal, Georges Cuvier used comparative anatomy to group the animals into four embranchements ('branches' with different body plans, roughly corresponding to phyla), namely vertebrates, molluscs, articulated animals (arthropods and annelids), and zoophytes (radiata) (echinoderms, cnidaria and other forms). This division into four was followed by the embryologist Karl Ernst von Baer in 1828, the zoologist Louis Agassiz in 1857, and the comparative anatomist Richard Owen in 1860. In 1874, Ernst Haeckel divided the animal kingdom into two subkingdoms: Metazoa (multicellular animals, with five phyla: coelenterates, echinoderms, articulates, molluscs, and vertebrates) and Protozoa (single-celled animals), including a sixth animal phylum, sponges. The protozoa were later moved to the former kingdom Protista, leaving only the Metazoa as a synonym of Animalia. In human culture The human population exploits a large number of other animal species for food, both of domesticated livestock species in animal husbandry and, mainly at sea, by hunting wild species. Marine fish of many species are caught commercially for food. A smaller number of species are farmed commercially. Humans and their livestock make up more than 90% of the biomass of all terrestrial vertebrates, and almost as much as all insects combined. Invertebrates including cephalopods, crustaceans, insects—principally bees and silkworms—and bivalve or gastropod molluscs are hunted or farmed for food and fibres. Chickens, cattle, sheep, pigs, and other animals are raised as livestock for meat across the world. Animal fibres such as wool and silk are used to make textiles, while animal sinews have been used as lashings and bindings, and leather is widely used to make shoes and other items. Animals have been hunted and farmed for their fur to make items such as coats and hats. Dyestuffs including carmine (cochineal), shellac, and kermes have been made from the bodies of insects.
Working animals including cattle and horses have been used for work and transport from the first days of agriculture. Animals such as the fruit fly Drosophila melanogaster play a major role in science as experimental models. Animals have been used to create vaccines since their discovery in the 18th century. Some medicines such as the cancer drug trabectedin are based on toxins or other molecules of animal origin. People have used hunting dogs to help chase down and retrieve animals, and birds of prey to catch birds and mammals, while tethered cormorants have been used to catch fish. Poison dart frogs have been used to poison the tips of blowpipe darts. A wide variety of animals are kept as pets, from invertebrates such as tarantulas, octopuses, and praying mantises, to reptiles such as snakes and chameleons, and birds including canaries, parakeets, and parrots. However, the most commonly kept pet species are mammals, namely dogs, cats, and rabbits. There is a tension between the role of animals as companions to humans, and their existence as individuals with rights of their own. A wide variety of terrestrial and aquatic animals are hunted for sport. The signs of the Western and Chinese zodiacs are based on animals. In China and Japan, the butterfly has been seen as the personification of a person's soul, and in classical representation the butterfly is also the symbol of the soul. Animals have been the subjects of art from the earliest times, both historical, as in ancient Egypt, and prehistoric, as in the cave paintings at Lascaux. Major animal paintings include Albrecht Dürer's 1515 The Rhinoceros, and George Stubbs's c. 1762 horse portrait Whistlejacket. Insects, birds and mammals play roles in literature and film, such as in giant bug movies. Animals including insects and mammals feature in mythology and religion. The scarab beetle was sacred in ancient Egypt, and the cow is sacred in Hinduism.
Among other mammals, deer, horses, lions, bats, bears, and wolves are the subjects of myths and worship.
========================================
[SOURCE: https://en.wikipedia.org/wiki/HD_43587]
Contents HD 43587 HD 43587 is a stellar system approximately 63 light-years away in the constellation of Orion, visible to the naked eye. The system comprises four individual stars, with two widely separated binaries forming a quadruple system. Components HD 43587, being a bright, nearby, high-proper-motion solar-type star, has been fairly extensively studied. The star was found to be slightly hotter than the Sun, but has a similar metallicity and is therefore not much more massive. Searches for companions to the star, among many other stars, were ongoing throughout the last century. HD 43587 did not seem to have a variable radial velocity or much variability in its astrometry, either of which would have indicated a close companion. The Washington Double Star Catalog lists four visual companions; companion B, discovered in 1891, has a proper motion differing from the primary's, so it is unrelated. Companions C and D, discovered in 1911, have only been observed once, making their relationship uncertain at best. However, companion E, first observed in 1990, has a very similar proper motion to the primary, meaning that it is indeed a companion. Designated HD 43587 B, the star was found to be a faint M-dwarf. Because of the star's brightness and position in the vicinity of the constellation of Monoceros, HD 43587 A was selected as one of the primary COROT asteroseismology targets, which would collect information on the star's internal properties. Since the primary star is similar to the Sun and did not seem to have a close companion, it was targeted by the radial velocity-based planet searches that began at the end of the twentieth century. In particular, HD 43587 A was observed with the Keck/HIRES spectrograph. However, during 1998 the star's radial velocity was found to decrease by about eight km/s, indicative of a long-period stellar companion.
An orbital fit found that this new companion has an orbital period of about 30 years, but on a very eccentric path which brings it through periastron in about a year. This third star, designated HD 43587 Ab, was found to have a minimum mass of about 0.3 M☉. The long period of HD 43587 Ab, coupled with the system being close to the Solar System, means that the two components of the primary system would be well separated as viewed from Earth, which made it an attractive target for resolving. This was achieved in 2006 with adaptive optics, and has been achieved since with speckle interferometry. Meanwhile, HD 43587 B became interesting because it was a little-studied, fairly bright M-dwarf. As such, it was targeted in the STEPS astrometric survey, which found that the star's motion was deviating from linear motion; adaptive optics observations confirmed that HD 43587 B was itself a binary with a fourth component, HD 43587 C. While the orbital period of the binary was too long to constrain the two components' dynamical masses, photometric analysis found that they were both late M-dwarfs.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Kumiho]
Contents Kumiho A kumiho or gumiho (Korean: 구미호; Hanja: 九尾狐; lit. nine-tailed fox) is a creature that appears in the folktales of East Asia and the legends of Korea. It is similar to the Chinese jiuweihu, the Japanese kitsune and the Vietnamese hồ ly tinh. Kumiho is a term that originally referred to a magical fox appearing in Korean novels of the Joseon dynasty, or was used pejoratively in historical records of the same period to denounce treacherous officials. In modern Korea, the designation kumiho has been broadened to encompass all fox spirits described in traditional Korean sources. Etymology and terminology In Korean colloquial usage, gumiho is often applied to describe a person (especially a woman) regarded as sly or cunning. The more widely used term kumiho follows the McCune–Reischauer romanization system, whereas the official romanization of Korean is gumiho, which is also closer to the actual Korean pronunciation.[citation needed] In South Korea, the term gumiho is commonly used to refer to the mythical nine-tailed fox, with ho (狐) being the Sino-Korean reading of the Chinese character for "fox." In contrast, North Korea refers to the creature as gumiyeowoo, using yeowoo, the native Korean word for "fox." While ho is typically reserved for mythological or literary contexts in South Korea, South Koreans also use yeowoo—just like North Koreans do—in everyday language to refer to real, biological foxes. However, even the word yeowoo, which usually denotes a wild fox, carries a subtle nuance of cunning or eeriness, and this term often appears in names referring to Korean fox spirits. Due to the widespread use of the term kumiho, modern Koreans often refer to all fox spirits by this name; however, this represents a typical case of conflation.
In fact, Korean folklore features a variety of fox spirits, including bul-yeowoo (불여우, “fire fox”), baeg-yeowoo (백여우, “white fox”), maegu (매구, “fox demon”), and hogwi (호귀, “fox ghost”), all of which, like the kumiho, have also been employed in a derogatory sense when referring to women. As these various names indicate, the types of foxes appearing in Korean mythology are diverse and are not limited to the nine-tailed fox (kumiho). Characteristics and symbolism In Korean myth, foxes are primarily depicted as deceptive, malevolent, and tragic beings, though there are rare cases where they are portrayed as benevolent toward humans or as deities governing mountains. They are often portrayed in the form of heretical Buddhist monks or beautiful women and are sometimes depicted as monsters that torment and threaten dragons, which are considered sacred in Korean mythology. In Korean mythology, which is deeply influenced by Buddhism, the dragon is depicted as a divine being that protects the kingdom, whereas its adversary, the fox, is portrayed as a force that disturbs the kingdom or as a harbinger of the kingdom's downfall. In certain tales, the fox spirit is even described as defeating multiple dragons, emphasizing its anti-divine characteristics. At the same time, such accounts suggest that although the fox was not venerated as a sacred creature like the dragon, it was nevertheless regarded in Korea as possessing transcendent powers comparable to those of dragons. In Korean toponymy, yeougogae (Korean: 여우고개; lit. fox pass) is among the most common place names for mountain passes. Passes bearing the name “fox” are often associated with local folktales or legends about fox spirits. Historical development In Korean tradition, the fox spirit appears in early state myths in forms that reflect syncretism with Buddhism and indigenous shamanistic beliefs. During the Goryeo period, however, the status of the fox spirit had already begun to decline.
Although certain myths recorded at that time portray the fox in a sacred light, these accounts are generally considered quotations or preservations of earlier Silla beliefs rather than reflections of Goryeo religious culture itself. Most narratives about fox spirits written during the Goryeo period depict them as ominous or inauspicious beings. Unlike in Japan, where fox spirits were integrated positively into Buddhism, the fox in Goryeo Korea does not appear to have undergone such a synthesis. By the Joseon period, under the influence of Confucian thought, the religious status of the fox spirit declined even further. Confucian ideology discouraged attributing significance to fictional or supernatural beings, and its hierarchical worldview often placed animals below humans in moral and ontological value. As a result, in popular perception and oral tradition, the fox spirit came to be portrayed increasingly as a deceptive, malevolent, or tragic being. However, in literary contexts, depictions of fox spirits became more varied with the introduction of Chinese fantasy fiction, and Joseon-era writers portrayed fox spirits through a range of symbolic meanings rather than a single moral type. Unlike in China and Japan, where fox spirits were at times associated with religious functions or divine status, evidence for a comparable role in Korea is limited, with notable examples appearing only in the mythology of the ancient kingdom of Silla. In Japan, fox spirits were integrated into the state-sanctioned religious framework, and in China, extensive mythological traditions allowed for the widespread dissemination of fox spirit stories. By contrast, due to the aforementioned religious and social currents in Korea that were unfavorable toward fox spirits, the development of Korean fox spirit narratives inevitably faced considerable constraints.
Consequently, premodern Korean fox spirit traditions remained limited in both quantity and quality compared to those found in Japan or China. However, this assessment is relative; when considered in absolute terms, Korea possesses a substantial corpus of fox spirit legends, and foxes hold a notable place as supernatural beings within the country's folklore tradition. Since most surviving Korean fox spirit traditions depict the fox as a demonic being, the general perception of the Korean fox spirit is that it is an evil or devilish entity. Many early foreign studies consequently misunderstood the Korean fox spirit as possessing exclusively demonic traits, and even perceptions within Korea have not substantially diverged from this view. Yet, upon closer examination of traditional myths, a minority of Korean folktales present the fox in a neutral or sacred light, in which it aids or rewards humans. Modern interpretations and media South Korea, a country with an active media industry, frequently produces works based on indigenous legends, even though Christianity is the predominant religion in contemporary Korean society. The fox spirit is among the motifs employed in such media. The character Kumiho, featured in the horror anthology series Jeonseol-ui Gohyang (“Hometown of Legends”) aired by the public broadcaster KBS, became widely recognized by Korean audiences and has served as a source of inspiration for cultural creators. Furthermore, the traditions of the kumiho and fox spirits also serve as material for contemporary illustrated storybooks. In addition, the story of the kumiho continues to evolve as new interpretations are created, because Korean mythology is not centralized, which allows individuals greater creative freedom when creating stories based on myth.
The motif of the kumiho desiring to become human, while not entirely absent in premodern periods, was relatively rare, though it appeared during the Joseon dynasty in tales such as The Fox Sister and Myeonghaengjeong Uirok. In modern Korean mythology, however, this theme is more prominent, with repeated reinterpretations of fox-spirit narratives reinforcing the image of the kumiho striving for humanity. Scholars suggest that the emergence of this motif may be attributed to multiple factors, including cultural reactions against the Joseon-era disdain for fox tales, feminist aspirations to liberate Korean women, who had long been marginalized and disparaged through the derogatory connotations associated with the term kumiho, and broader influences from Chinese, Japanese, and Korean folklore—such as The Man and the White Snake, Yuki-Onna, and Ungnyeo—which collectively contributed to the development of the modern interpretation. As the kumiho has frequently appeared in South Korean media, it has come to be represented in diverse ways, including more sympathetic portrayals. These positive reinterpretations extend not only to the kumiho itself but also to the fox as an animal, sparking renewed public interest in indigenous fox-spirit folklore. This growing interest has led to the rediscovery of older fox-spirit tales, which, when incorporated into newly created kumiho media, sustain a cyclical process of modern reinterpretation and the revival of ancient folklore. In September 2022, a South Korean distilled soju (distilled liquor) brand named Saero was released, featuring the kumiho as its brand symbol. The use of the kumiho as the emblem of a commonly consumed distilled soju suggests that the kumiho has become significantly more familiar and approachable to the general South Korean public. In contrast, the portrayal of the fox in North Korean media remains largely uniform and negative.
During the Cold War, both North and South Korea frequently employed animal metaphors in their mutual propaganda, with each side comparing the other to foxes or jackals. North Korea, however, has continued to reproduce such imagery well into the 21st century. In North Korean dramas, films, and other media, the fox typically symbolizes capitalism, imperialism, selfishness, or divisive behavior. Beyond these limited functions, foxes rarely appear, as state propaganda has already attached a negative connotation to the animal. Assigning alternative meanings would potentially undermine propaganda objectives; therefore, the image of the fox in North Korea remains consistently negative. Furthermore, when North Korea undertook state-sponsored projects to compile Korean folktales, a rigid class-based and socialist framework was applied, leading to the exclusion of fantastical or frightening stories. This selective approach limited the transmission of traditional fox spirit narratives. Unlike in South Korea, where the kumiho continues to evolve through modern reinterpretations, in North Korea the figure neither develops multifaceted qualities nor benefits from the rediscovery of older traditions, which also restricts the possibility of modern adaptations. As a result, the kumiho is depicted almost exclusively as a demonic entity. In the 2021 animated film The Devil's Conqueror (Eoksoe Defeats Devil, 악마를 이긴 억쇠), for example, the devil character is identified with the kumiho. Korean Fox Spirit Tales by Period The sources of the fox spirit stories that appear in ancient Korean history are the Samgugyusa (三國遺事, Memorabilia of the Three Kingdoms) and the Samguksagi (三國史記, History of the Three Kingdoms). Fox spirits depicted as active in the ancient Korean kingdoms are usually described as having white fur, the ability to transform into humans, and cunning magical powers.
However, historical records from ancient Korean history do not specifically describe these fox spirits as having nine tails. The nine-tailed fox spirit does not appear in Korean historical records until the Goryeo period and later. In ancient Korean mythology, foxes are typically depicted as malevolent spirits with white fur, but there is also a single account describing a sacred fox spirit with black fur. 狐能化美女 狸亦作書生 誰知異類物 幻惑同人形 變化尙非艱 操心良獨難 欲辨眞與僞 願磨心鏡看 (In truth, the fox is capable of transforming into a beautiful woman, and the wild cat becomes a scholar. Who could have known that animals could take human forms, deceive, and confuse others? While transformation may be easy, it is far more difficult to control one's heart. To distinguish truth from falsehood, one must first polish the mirror of the heart.) This passage, attributed to Choi Chiwon, a government official of the ancient kingdom of Silla in the 9th century, reflects the belief that animals could transform into humans. The fox, in particular, was believed to have the ability to become a beautiful woman. This demonstrates the perception of foxes and animals in ancient Korean society. The Samguk Sagi, a historical text detailing the ancient Korean kingdoms, includes references to a mysterious and ominous creature known as the baeg-yeowoo (white fox), which is often associated with bad omens. Despite its ominous reputation, the fur of an albino white fox was considered a rare and valuable item. The King of Goguryeo, while hunting, came across a baeg-yeowoo and ordered a shaman to interpret what omen this white fox might represent. The shaman explained that foxes were traditionally considered ominous creatures, and since this one was white, it was even more sinister. He suggested that the heavenly gods were showing a bad omen through this white fox spirit and that the king needed to reflect on his actions. Enraged by these words, the king had the shaman executed on the spot.
When the Baekje Kingdom and the Goguryeo Kingdom were nearing their fall to the Silla Kingdom, foxes and other animals were reported to have entered the royal palaces of each kingdom. In the spring of the year before the fall of Baekje, a group of foxes entered the Baekje royal palace, and among them a baeg-yeowoo (white fox) was seen sitting on the desk of the highest official. In one rare case, the heuk-yeowoo (black fox) is depicted as a sacred being that assists humans. This fox spirit appears in the legend of the eminent Silla monk Won-Gwang (圓光, 555–638), where it serves as a guide and helper. Won-Gwang, a Buddhist monk of Silla, entered Samgi Mountain at the age of thirty to meditate. Four years later, another monk arrived and built a temple not far from where Won-Gwang was staying, living there for two years. This monk was brave and had an interest in practicing magic. In the sixth year, while Won-Gwang was reciting Buddhist scriptures, the Mountain god (산신) suddenly appeared to him. The Mountain god informed Won-Gwang that the monk living nearby had been practicing magic along the path, causing loud noises, and asked him to stop. The next day, Won-Gwang spoke to the monk about what had happened and suggested that he practice elsewhere, but the monk mocked Won-Gwang and ignored his request. That night, the Mountain god appeared again and asked Won-Gwang about the situation. Though Won-Gwang tried to speak evasively, the Mountain god already knew the outcome. Angered, the Mountain god caused a lightning strike that triggered a landslide, destroying the monk's temple. The Mountain god then appeared to the surprised Won-Gwang, revealed its true identity, and advised him to travel to China to learn the Buddhist teachings for the future. When Won-Gwang expressed that the distance was too far, the Mountain god showed him the way.
With the Mountain god's guidance, Won-Gwang reached China and spent eleven years studying the Three Baskets of Buddhism (Tripitaka – Sutras (經), Vinaya (律), and Abhidharma (論)) and Confucianism. In the 22nd year of King Jinpyeong's reign (600 CE), Won-Gwang returned to Silla with a diplomatic envoy, Jo Bing-sa (朝聘使), from China. Upon his return, Won-Gwang visited the temple at Samgi Mountain to express his gratitude to the Mountain god. The Mountain god then imparted the precepts to Won-Gwang and made a vow of saengsaengsangje (生生相濟), a promise that the two would rescue each other across all worlds whenever both were reborn. When Won-Gwang expressed a desire to see the Mountain god's true form, the Mountain god instructed him to look toward the east the following morning. The next day, when Won-Gwang gazed at the eastern sky, he saw a massive arm piercing through the clouds, reaching toward the heavens. As a result, Won-Gwang renamed Samgi Mountain Bichang Mountain (臂長山), meaning "Long Arm Mountain." One day, the Mountain god revealed to Won-Gwang that its own death was near. On the designated day, Won-Gwang visited the place the Mountain god had mentioned and found an old fox, its fur as black as lacquer, struggling to breathe before it died. This black fox Mountain god was three thousand years old at the time of its death. This Heuk-yeowoo (black fox) is a typical example of a mountain god in Korean mythology, which holds that when an animal attains enlightenment and reaches the realm of the divine, it becomes a mountain spirit. In Korean folklore, there are narratives in which an aged dragon is threatened by a fox. A hero appears, defeats the fox, and rescues the old dragon, who in return grants his daughter in marriage to the hero. The two tales discussed below share this same plot. In the first, Geotaji, a master archer, encountered a storm while traveling to Tang China and sought refuge on an island.
In a dream, he met an old man who revealed himself as the Dragon King of the Western Sea (西海若) and pleaded for help, explaining that his life was threatened by a monk. The next day, Geotaji saw a monk chanting the Darani (陀羅尼) incantation, causing the old dragon and his family to levitate helplessly. When the monk attempted to devour the dragon’s liver, Geotaji shot him with an arrow. Struck by the arrow, the monk reverted to his true form—an old fox—and died. The Dragon King then appeared before Geotaji to express his gratitude and offered his only daughter as a wife. Geotaji accepted, and the Dragon King transformed his daughter into a flower for him to carry. When Geotaji’s ship, escorted by two dragons, reached Tang China, the astonishing sight was reported to the emperor by the locals. Impressed by Geotaji’s extraordinary nature, the emperor treated him generously. After returning to Silla, Geotaji took the flower from his chest, and the Dragon King’s daughter regained her form, living with him as his wife. The Geotaji tale is set in the reign of Queen Jinseong of Silla and is believed to have influenced the later Jakjegeon narrative. The Jakjegeon story is considered an expanded version of the original Geotaji tale, incorporating additional foundation myth elements. In the Jakjegeon tale, the hero Jakjegeon sets out to find his father, carrying the divine bow (singung, 神弓) as a token. Boarding a Tang merchant ship, he encounters a storm at sea. Divination reveals that a Korean must be set ashore on an island. There, Jakjegeon meets an old man who identifies himself as the Dragon King of the Western Sea. The Dragon King complains of pain, explaining that a demon disguised as Buddha is reciting a spell to incapacitate him, and asks Jakjegeon to shoot it. When Jakjegeon shoots the evil spirit disguised as Buddha with an arrow, as promised, it reveals its true form as an old fox and dies. 
The Dragon King then invites Jakjegeon to the palace, where Jakjegeon marries the dragon princess and receives treasures such as the Seven Treasures, a cane, and a pig, before returning home. From the union of Jakjegeon and the dragon princess, Wang Geon is born; he later founded Goryeo, which succeeded the Three Kingdoms period. These tales correspond to the motif found in European mythology in which a hero slays a dragon (or serpent) to save a kingdom and, as a reward, marries a princess. In this comparison, Geotaji and Jakjegeon parallel the European dragon-slayer (such as Perseus), the Dragon King corresponds to a royal figure, the dragon princess to a European princess, and the monstrous fox to the dragon (or serpent). Furthermore, the theme in which a hero connected with a dragon overcomes a monstrous fox and subsequently becomes a founding ancestor shows similarities to the foundation myth of Vietnam. A later short story, Wang Sujae Married the Daughter of the Dragon King (王秀才用女取德說), which largely inherits the plot of these narratives, depicts the fox as a 3,000-year-old nine-tailed fox capable of transforming into a beautiful woman. Queen Seondeok of Silla was once struck by an illness, and a monk named Beopcheok was called to diagnose her condition. However, despite his efforts, her illness did not improve. The royal court then summoned another monk, Milbon, who began reciting scriptures. As he did so, a yukcheonjang (a type of monk's weapon) struck a hidden old fox and Beopcheok, causing them to fall to the ground below the courtyard. After this event, Queen Seondeok's illness was said to have been miraculously cured. Although it is not stated directly, the circumstances of the story suggest that Beopcheok had colluded with the old fox and worsened the Queen's condition. Bihyeongnyang, a legendary figure in Silla, was renowned for his ability to command and control divine beings.
One of his subordinates, Gildal, became disillusioned with the hard labor and, feeling overwhelmed, transformed into a fox to escape. However, Bihyeongnyang discovered Gildal's transformation and struck him down, killing him. During the Tang dynasty in China, a local magistrate fell under the spell of a fox spirit named Liu-Cheng (劉成). Under the fox spirit's enchantment, he became obsessively devoted to Buddhist practices, neglecting food and eventually even putting his own daughter at risk of being taken by the spirit. In response, a renowned Taoist exorcist named Na Gongwon was summoned to eliminate the fox spirit. Na Gongwon managed to subdue the spirit, but Liu-Cheng had already ascended to the level of a Heavenly Fox (天狐), a celestial rank of fox spirits who serve the gods in heaven. Because of this divine status, he could not be killed. As a last resort, Na Gongwon chose to exile Liu-Cheng to the ancient Korean kingdom of Silla, where Liu-Cheng was eventually worshipped as a deity. The rank of Heavenly Fox (天狐), which Liu-Cheng had attained, is regarded in Chinese folklore as the highest level a fox spirit can reach, often described as having golden fur and nine tails. Liu-Cheng also claimed to be a miraculous Buddhist monk, a characteristic that closely resembles those of other fox spirits found in native Silla folklore. Perceptions of the fox spirit in the Goryeo period, which inherited traditions from the ancient Three Kingdoms, can be observed in the poetry of scholars and poets. The prominent historical figure Shin Don was also, at times, compared to a fox spirit. 虎出空山舞孼狐. When the tiger leaves the mountain, a wicked fox will dance. 正是風流今頓盡, Truly, your elegance has now suddenly vanished, 幾令多士涕氷鬚. and how many scholars' tears now freeze upon their beards. This poem by Im Chun (1149–1182), a Goryeo official, was written in mourning for his acquaintance Kim Yeolbo.
The phrase "When the tiger leaves the mountain, the ominous fox begins to dance" is understood as an expression of sorrow over Kim's death, drawing from the Korean proverb "When the tiger is gone, the fox becomes king." This saying, which carries a meaning similar to the idiom hoga howi (a fox wielding power in the tiger's name), compares Kim Yeolbo to a tiger, suggesting that in his absence, petty and insidious individuals—likened to foxes—would rise to prominence. In Im Chun's poem, the fox is used as a metaphor for cunning and treacherous courtiers. Later Korean records also show a tendency to compare sycophantic officials at court to sly foxes or even nine-tailed foxes, reflecting ongoing cultural associations between foxes and deceitful behavior in the political sphere. 女則爲覡男爲巫. Women become shamans and men become sorcerers. 自言至神降我軀, They claim that a supreme spirit has descended upon their bodies, 而我聞此笑且吁. but when I hear this, I can only laugh and sigh. 如非穴中千歲鼠, If they are not thousand-year-old rats in a cave, 當是林下九尾狐. then surely they must be nine-tailed foxes in the forest. In historical sources from the Goryeo period, fox spirits with nine tails make a full appearance. The poem Nomoopyeon (老巫篇, "On Old Shamans") by the poet Yi Kyubo (1168–1241) criticizes corrupt shamans who seduce the people with lewd songs and bizarre words, suggesting that they must be either thousand-year-old rats or nine-tailed foxes. This reflects the negative perception of illicit religions and fox spirits during the Goryeo era. Yi Kyubo frequently composed poems that portrayed foxes in a negative light. Ironically, however, this critical stance toward foxes made him the first known figure to record the term kumiho (nine-tailed fox) in Korean literature.[citation needed] Shin Don (辛旽, 1322–1371), a prominent figure in Goryeo history, was recorded as the spirit of a cunning fox.
As a historical figure, Shin Don was a monk and a close confidant of King Gongmin, who was effectively the last monarch to hold real power in the Goryeo royal court. In Goryeo society, monks were religious figures but generally came from lowly social backgrounds, and thus Shin Don's rise to become the king's closest advisor was an extraordinary and unconventional case. He is regarded as a reformer who challenged the corrupt Goryeo political system; in the end, however, he himself was corrupted by power and became known as a "demonic monk" (妖僧), reflecting his complex and dual nature. Some argue that since Shin Don's deeds were recorded by the revolutionary forces who eventually overthrew Goryeo to establish Joseon, many of the negative reputations attached to him are unfair. Ultimately, Shin Don's rise from a lowly background to become the king's close aide, the fact that his history was written by the forces of the dynastic revolution, and most notably his defamation as a cunning fox spirit—a form of slander usually reserved for female figures—ironically made him the Korean historical figure most resembling Daji, the notorious woman of Chinese history, despite being a man. As a fox spirit, Shin Don is referred to as Nohojung (老狐精), meaning "old fox spirit." According to the records that describe him this way, he was regarded as an old fox spirit because he ate black chickens and white horses and was afraid of yellow dogs. Gang Gam-chan (강감찬, 948–1031) was a Goryeo military commander who famously defended the kingdom from the Khitan invasions. Over time, his historical role evolved into legend, with many folk tales portraying him as a heroic figure with magical powers. These stories, which emphasize his wisdom, strength, and supernatural abilities, were passed down through generations.
A significant number of these oral traditions were collected after Korea's liberation, further cementing Gang Gam-chan’s status as a revered figure in Korean folklore. A well-known folk tale about Gang Gam-chan tells that he was born to a human father and a fox mother. This story, linked to his birthplace in Yangyang, Gangwon Province, was recorded on October 2, 1981, by Kim Seon-pung, Kim Gi-seol, and Kim Gi-hyeon from 72-year-old Kim Hyo-shin in Osaek 1-ri, Seomyeon, Yangyang. After marrying, Gang Gam-chan's father spent his first night with his wife. She then expressed her desire to have a child who would make a name for himself. She told him to go out into the world, sleep with ninety-nine women, and return when he had made the hundredth woman his partner. After fulfilling this task, Gang Gam-chan’s father returned home, and along the way, he noticed a house he had never seen before. He entered the house, and a young woman greeted him with a tray of drinks, offering him hospitality. Drunk, Gang Gam-chan’s father slept with her. Before she left, the woman told him to come back on a specific day to retrieve their child. It was later revealed that this woman was not human, but a fox. When he returned home, Gang Gam-chan's father told his wife about the strange events. On the designated day, he went to the fox woman’s house and brought the child back. The child, Gang Gam-chan, had an unusual appearance due to his fox mother, but he had remarkable talents, excelling in astronomy, geography, and even understanding the sounds of animals. One day, Gang Gam-chan's father attended a friend's son's wedding. After drinking, he and his friend argued about whose son was more accomplished. To settle the dispute, they called for Gang Gam-chan. When the groom, the friend’s son, saw Gang Gam-chan, he ran into a room and hid. Gang Gam-chan yelled at him, and the groom transformed into a snake. 
Gang Gam-chan then used a talisman, placing it on the bride's abdomen, causing her to give birth to a large serpent. This incident led to Gang Gam-chan's fame spreading far and wide. The folk tale that Gang Gam-chan is the son of a fox is a well-known oral tradition that was collected in other regions of Korea around the same period. According to one storyteller, Gang Gam-chan, being born of a fox, was said to have had the ability to capture tigers. The legendary figure of General Gang Gam-chan in Korean folklore is widely known as a monster-slayer who defeats various types of supernatural creatures. In these stories, Gang Gam-chan is portrayed as the son of a fox spirit, and he often confronts and defeats other fox spirits, such as the Baeg-yeowoo or the kumiho. This dual role reflects the complex view of the fox spirit in Korean culture: while the fox spirit is seen as the divine origin of the hero's extraordinary nature, it is simultaneously regarded as a malevolent entity that must be eradicated. The religious status of the fox spirit declined significantly during the Joseon Dynasty, as Confucianism, which denied the existence of supernatural beings, became the dominant state ideology. However, fox spirits continued to appear frequently in the popular literature of the period. With the introduction of Chinese shenmo (gods and demons) novels and the development of publishing during the Joseon era, Chinese traditions related to fox spirits were imported and assimilated, leading to more diverse depictions of fox spirits in Korean literature. Common features of fox spirits—such as the possession of a magical marble or having multiple tails—are also found in Joseon-era novels. In many Korean fantasy novels written during the Joseon period, the spatial or temporal setting is placed in earlier Chinese states, and in these works as well, kumiho and other fox spirits frequently appear.
Moreover, depictions of the moral orientation of the kumiho (fox spirit) became more varied during the Joseon period. In Joseon literature, fox spirits are not portrayed solely as a uniformly malevolent "femme fatale" type; narratives also feature trickster or morally neutral kumihos, an enemy-commander kumiho appearing as a young boy, fox spirits seeking to become human, and even fox spirits who begin as malevolent beings but later repent, convert to religion, and become benevolent. This variety suggests that Joseon literati understood the fox spirit as a nonhuman being with a range of symbolic meanings, rather than as a single fixed archetype. In the 19th century, the Joseon-era Silhak scholar Yi Gyugyeong wrote in Hoseon Byeonjeungseol (Discourse on the Disputation of Fox Spirits) that "in popular belief, the nine-tailed fox is regarded as a cunning and deceitful being, but according to historical records from China, it was originally considered an auspicious creature." This remark reflects the widespread popularity of fox spirit narratives among Joseon commoners at the time. Byeogsadan (辟邪丹): This medicine cures illnesses caused by nine-tailed fox spirits. Take 1 nyang each of ginseng, red tuckahoe, polygala root, cynanchum root, acorus, white atractylodes, black atractylodes, and angelica; 5 don of dono; 3 don each of realgar and cinnabar; and 1 don each of bezoar and musk. Grind all the ingredients into powder, mix them with a glue made by boiling liquor, and form them into pills about the size of a longan, coating each with gold leaf. Take one pill at bedtime with water decocted from costus root. This will prevent all evil energies from approaching the body. In the Donguibogam (1610), a seminal Korean medical compendium, a remedy for illnesses attributed to nine-tailed fox spirits is recorded.
The text introduces Byeogsadan (辟邪丹) as a medicinal formula used to treat ailments caused by possession or affliction by such fox spirits, and provides a detailed description of its ingredients and method of preparation. The Jeon Woo-chi Story is a tale based on a strange real-life figure of the Joseon dynasty. The most well-known version of the story, which originated in the 17th century, is the 47-volume Gyeongpan (Capital Edition) published in 1847. In this version, the fox spirit figures more prominently than in others. In the opening of the novel Jeon Woo-chi-jeon, two fox spirits make their appearance. The first fox spirit, having transformed into a woman, approaches Woo-chi but is ultimately deceived; Woo-chi manages to steal her fox marble. The second fox spirit, a golden kumiho (or Baeg-yeowoo), also takes the form of a woman to approach Woo-chi. This kumiho engages in a trickster's battle of deception and cunning with Woo-chi, but ultimately loses one of her heavenly books (cheonseo) to him before departing. Jeon Woo-chi, a scholar, was studying under a renowned teacher named Yun-gong. One day, on his way to visit his teacher, Jeon Woo-chi passed through a bamboo forest and saw a beautiful woman weeping. At first, he ignored her, but on his way back, he found her still crying and approached her. The woman claimed that her stepmother had falsely accused her, and her father now wanted to kill her. She said she was considering suicide but couldn't bring herself to do it. Moved by her story, Jeon Woo-chi comforted her and encouraged her not to give up on life. As they talked, an emotional connection formed between them. The next day, Jeon Woo-chi visited his teacher and told him what had happened. Yun-gong warned him that the woman was not human but a fox spirit trying to deceive him, and instructed Jeon Woo-chi to go back and retrieve the fox marble from her mouth.
When Jeon Woo-chi returned, the woman initially refused but eventually allowed him to take the marble. He swallowed it, and the woman ran away in tears. Later, he confessed everything to his teacher, who told him that he had already been tainted by the fox's influence; because of that, it would now be harder for him to fully grasp the truths of the world. From that moment on, however, Jeon Woo-chi began to develop supernatural powers thanks to the fox marble he had swallowed. The introduction to the first fox spirit's story adheres closely to a typical plot found in Korean oral folktales, in which a hero (or notable figure) steals a fox marble from a fox spirit. In such narratives, a female fox spirit attempts to kiss a male human to absorb his vital energy, but the man feigns reciprocating the kiss, seizes the fox marble from the fox's mouth, and swallows it to acquire divine knowledge. The fox marble motif recorded in the literary tale Jeon Woo-chi-jeon is thus also embodied in Korean oral folktales. In the second tale, the young scholar Woo-chi arrives at a temple after meeting a mysterious old man who gifts him a talisman and a rope. At night, while studying by lamplight, a beautiful maiden suddenly appears. She claims to be the daughter of a noble family who lost her relatives to bandits and is now wandering. Her beauty and misfortune arouse Woo-chi's sympathy, but when she pleads to spend the night with him, he begins to suspect her. Woo-chi offers her a drink and proposes marriage. At first modest and reluctant, she gradually yields under his persistence, only to collapse after drinking heavily. Sensing something amiss, Woo-chi strips away her disguise and writes a spell upon her chest, revealing that she is no human at all but a fox spirit—specifically, a golden kumiho (a golden nine-tailed fox). Binding her with the rope and striking her with sharp tools, he exposes her deception.
Yet even caught, the kumiho begs for her life, saying that she can offer three heavenly books, which are more powerful than the fox's essence, and that the books are in her lair. Woo-chi, a scholar by nature, is tempted. Though suspicious, he partially frees her so he can follow, and she leads him into a cavern where she dwells in splendor among fox-servants. The attendants mistake Woo-chi for prey and rush him, but he fights them off and forces the kumiho to produce the books. Here the tables keep turning. The kumiho insists she must be freed to explain the texts; Woo-chi counters with threats, refusing to loosen her bonds but extracting the teachings all the same. Overnight, he masters the first volume's arcane arts, forcing the knowledge from the kumiho against her will. Having gained what he sought, he pretends magnanimity, removing the talisman and releasing her with a warning never to harm humans again. The kumiho appears grateful and departs, but this is only a temporary tactical retreat; cunning still lingers in her heart. Almost immediately, uncanny forces strike back at Woo-chi. A voice from the sky reclaims the rope he had used. Then a mysterious scholar, riding a donkey, appears, scolding Woo-chi for daring to meddle with divine texts, and vanishes with one of the volumes. Moments later, a woman claiming to be his nursemaid arrives with news of his mother's sudden death, only to disappear as another book vanishes. Woo-chi realizes too late that he has been toyed with: the kumiho's illusions and celestial trickery have stripped him of his spoils, leaving him with only the single volume he had sealed with the talisman. In this second fox spirit's story, both Woo-chi and the kumiho are tricksters, and the story unfolds as they deceive each other. Woo-chi attempts to expose the kumiho and steal her secrets, while the kumiho counters with illusions and tricks to deceive and mislead him.
Both characters embody the trickster archetype, with their conflict blurring the lines between predator and prey. The outcome is ambiguous; Woo-chi gains only partial knowledge, and the kumiho survives, with both bearing the consequences of their deceptive struggle. This kumiho also exhibits several traits associated with the Sky Fox. Like the Sky Fox, it possesses nine tails and golden fur, carries the Sky Fox’s signature item—the Cheonseo (“Heavenly book”)—and attempts to ascend to the heavens. In traditional Chinese Sky Fox lore, humans who seek to seize the Cheonseo typically suffer fatal or otherwise disastrous consequences. In Korean folklore, Jeon Woo-chi, however, subverts this pattern: the protagonist succeeds in stealing the Cheonseo and manages to grasp at least part of the magical techniques recorded within it. A tale involving Seo Gyeong-deok, who is known as the teacher of Jeon Woo-chi, also features a nine-tailed fox. The story of Seo Gyeong-deok defeating the nine-tailed fox is included in Dongpaenaksong (東稗洛誦), a collection of Korean folktales from the Joseon dynasty, believed to have been compiled in the 1770s and transcribed by hand from the late 19th century onward. When Seo Gyeong-deok was twelve years old, he studied at a mountain temple under a Buddhist monk. One day, the monk said to him, "Go home for now. Tomorrow, a strange monk will come looking for you. Treat him well, and after he leaves, return here immediately." Following the monk’s instruction, Seo Gyeong-deok went home and waited. Sure enough, a visitor soon arrived—he rode a small donkey and was accompanied by two young attendants dressed in blue. The man said, "I come from Mount Taebaek. I’ve heard that you possess exceptional talent, so I came to see you myself." Seo Gyeong-deok greeted him respectfully and asked questions about Confucian classics, astronomy, geography, divination, and esoteric Taoist arts. The man answered everything without hesitation. 
Seo Gyeong-deok was so impressed that he thought, "Even my own teacher might not surpass this man." After the guest departed, Seo Gyeong-deok returned to the temple and reported the visit to his teacher. The monk said, "I will now undertake a special meditation. Do not ask me any questions." He then turned to face the wall, sat cross-legged with his eyes closed and hands in prayer, and for three days neither spoke nor ate. Finally, he opened his eyes and said, "Follow me." The monk led Seo Gyeong-deok to the top of a mountain behind the temple. There he said, "Hold tightly under my arms and close your eyes." Then they rose into the air and flew westward. After several days, they landed, and the monk told Seo Gyeong-deok to open his eyes. But he had no idea where they were. The monk took out a powdered medicine, mixed it with water, drank some himself, and gave the rest to Seo Gyeong-deok. Upon drinking it, his mind became clear, and he no longer felt hunger or cold. On the mountaintop stood a massive ancient tree, so large its shade stretched for hundreds of ri (Korean miles). The monk chopped five pieces of wood from the tree and then returned with Seo Gyeong-deok to the temple. The entire journey had taken six days. Back at the temple, the monk thoroughly cleaned a room, laid out mats, and set up a folding screen. He asked Seo Gyeong-deok to lie on his back across the monk's shoulders. He then placed a table before them and took out the five carved wooden figures, painting each one a different color. The blue one was placed to the east, and the others were arranged according to their colors—white, red, black, and yellow. With a staff in hand, the monk began to chant and wait. Around dusk, a loud commotion erupted at the village entrance. The blue figure was the first to run out and engage in battle, but it was defeated and returned. Then the white, red, and black figures each went out in turn, fought, and were also defeated. Finally, the yellow figure went out. 
As dawn broke, it returned victorious. The monk took Seo Gyeong-deok by the hand, and they went outside. There lay a dead gumiho—a nine-tailed fox. The monk said, "The guest who visited you the other day was none other than this fox. It was born in the time of Yuso (a legendary sage of ancient China) and secretly learned the ways of Heaven and Earth. It roamed the universe freely, with no deity able to defeat it. This fox feeds on the fresh blood and organs of the noblest and most virtuous men. Upon hearing of your remarkable potential, it came to inspect you, intending to devour you. But your body was protected by divine spirits, so it waited for a moment of misfortune to strike. To save you, I needed an object from before the time of Yuso. Only such a thing could subdue the fox. During my meditation, I searched the world with my mind and discovered that the ancient tree I later chopped was from the dawn of creation. Using its wood, I carved the five guardian spirits of the five directions, and they fought the fox. Only with great difficulty were we able to defeat it. Now, you will face no more such calamities." Seo Gyeong-deok fell into a deep sleep. When he awoke, his master was gone. This tale exemplifies a typical Joseon-era narrative that actively incorporates traditional Chinese folklore, as demonstrated by its reference to the legendary sage Yuso of ancient China, a widely recognized figure from the Chinese historical tradition. Furthermore, the nine-tailed fox that appears in the story is portrayed as a religious yet heretical figure who has mastered Buddhist and Taoist scriptures—a characterization that aligns with common traits found in traditional fox spirit legends. Myeonghaengjeonguirok (明行正義錄) is a Joseon-period novel set against the backdrop of the Ming dynasty in China. The work features a kumiho (nine-tailed fox spirit) who aspires to become human.
While the motif of a human-seeking fox spirit is one of the most commonly depicted themes in modern media, the portrayal in this novel is distinct and notably more complex. In Myeonghaengjeonguirok, the fox spirit is a thousand-year-old kumiho living in a rock cave beneath Nakhyeonam in Hyangju. One day, it witnesses the protagonist Wi Yeon-cheong briefly die and come back to life after being struck by his father. The fox secretly licks up Yeon-cheong’s spilled pure blood and absorbs his vital energy. Until then, it had only fed on the qi of corpses in old graves, but because Yeon-cheong is alive and an exceptional man, a miraculous elixir (dan-yak) forms in the fox’s belly—something normally made by immortals, not monsters. With this elixir, the fox gains new magical powers and begins to dream of becoming human. It assumes Wi Yeon-cheong’s appearance and lives in a small house in Cheongbyeongsan, where it is approached by Sae Hong-seon, a woman who secretly loved Yeon-cheong. She believes the fox is the real Wi Yeon-cheong and seduces “him.” The fox decides to take advantage of this, planning to absorb her yin energy as well, and during their night together rolls the elixir from its own mouth into hers to use it like a fox bead while draining her vitality. However, Sae Hong-seon accidentally swallows the elixir. The illusion breaks: the fox’s disguise and house vanish, revealing its true form. Realizing she has slept with a fox, Sae Hong-seon panics and draws a dagger, while the fox rages at losing its precious elixir. At that moment, a celestial being called Cheon-an Cheonjon appears, makes Sae Hong-seon vomit up the elixir, and carefully bottles it. He praises the fox, saying that although it did not intend to, it has created a medicine that will later save Wi Yeon-cheong’s life. 
As a reward, he tells the fox a great secret: if it receives a poem from Wi Yeon-cheong, it will be able to cast off its animal body and become human, and in time even cultivate the Way like an immortal. The fox, filled with hope, spends ten long years in Hyangju waiting for Yeon-cheong's son Wi Cheon-bo to arrive with his father's poem. But ten years is an eternity for a still-wicked creature; unable to repress its nature, the fox ends up secretly entering the body of the daughter of a man named Kang Wan and feeding on her life force, making her gravely ill. Just then, Wi Cheon-bo passes through Hyangju and stays at Kang Wan's house, where he hears of the daughter's mysterious illness. When the fox sees Cheon-bo's sacred sword, it is forced out of the girl's body and reveals itself as a nine-tailed fox. Cheon-bo binds it with iron chains and is about to kill it for harming humans. Desperate, the fox explains everything that happened with Cheonjon and the elixir. Wi Cheon-bo is a devotedly filial son; since his father has suffered ever since the near-death incident at Tojimun, he has long wished for a cure. Learning that the elixir created from his father's blood and qi is now in Cheonjon's hands, and that the immortal has already granted the fox permission to become human, Cheon-bo is overjoyed. He decides not to kill the fox and instead gives it his father's poem, which speaks of smoke and dark mist around a desolate grave, of peering through a hole to steal the morning light, and of casting off a mottled animal pelt for a Daoist's robe after hearing the Buddha's teaching. By receiving this poem, the fox gains the right to shed its beastly form, become human, and walk the path of an immortal. In this novel, the kumiho is portrayed in an unusual way: it is male and primarily harms humans by draining the life-essence of women.
At the same time, however, the fox spirit ardently desires to become human, a goal rooted in the Daoist worldview that one must first attain human status before advancing toward immortality. The method by which a fox may be reborn as a human is to receive a piece of human poetry, and the narrative further extends this motif to other creatures: a fish, upon receiving human verse, is depicted as transforming into a dragon. This framework reflects a Daoist cosmology in which the higher stage of a fox is a human (or Daoist adept), the higher stage of a fish is a dragon, and in which animal spirits must obtain human recognition in order to ascend to a superior spiritual state. The story of Daji (妲己), the most famous nine-tailed fox in East Asian classical literature, featured in the Chinese novel Investiture of the Gods, was introduced to Joseon Korea in the 17th century. The Korean reading of her name, Dal-gi (달기), became widely used, and Koreans referred to her by this pronunciation. The tale of the gumiho (nine-tailed fox) Dal-gi spread broadly, and Koreans even created an adapted version focusing on her deeds, titled So-Dal-gi-jeon (The Tale of So Dal-gi). The notion of Dal-gi as a malevolent nine-tailed fox became widespread during the Joseon period. Her story exerted a considerable influence on Joseon literature, and Dal-gi herself appeared directly in several classical Korean novels. In Gurae Gong Jeong Chung Jik Jeol Gi (寇萊公貞忠直節記), a Korean classical novel set in the Song dynasty, a cruel character named Yu Ho-yeong is described as a reincarnation of Dal-gi. In another Joseon novel, Ssangseong Seongbong Hyo-rok (雙星奉孝錄), the character Myo-wol trains in a cave, where she remarks that “this is the place (the cave) where a nine-tailed fox once cultivated the Way, donned a human skull, and became Dal-gi.” This passage reflects the widespread belief that Dal-gi deliberately used her beauty to bring about the downfall of the Shang dynasty. 
A version of the Dal-gi legend also appears in the late Joseon–period animal allegory Okpodong-Giwanrok (玉浦洞奇玩錄), which is set in Okpodong, Korea. In this story, a wise toad recounts the deeds of various animals’ ancestors while boasting of his own age and rank among the beasts. When a fox mocks him, the toad retaliates by revealing the wicked history of the fox’s supposed ancestress, Dal-gi, who is portrayed as a nine-tailed fox spirit. According to the toad’s account, Dal-gi was executed for bringing ruin upon the Shang dynasty but continued to haunt the world as an evil fox spirit, spreading chaos through her descendants, including a depraved offspring named Ho-ri (狐羸). While exposing the crimes of Dal-gi and her corrupt progeny, the toad also acknowledges the existence of a virtuous line of foxes who opposed Dal-gi. The progenitor of this righteous fox clan was said to have revealed Dal-gi’s true nature and aided in her capture, and their ancestral homeland was called Yeonghojin (令狐津, “Ford of the Virtuous Fox”). In the end, the fox—cornered by the toad’s account and seeking a way to save face—claims that his own lineage originates from Yeonghojin, thus attempting to associate himself with the virtuous fox clan that once opposed Dal-gi. In Korea, several place names are thought to have derived from the name Dal-gi. Near Juwangsan (literally “King Juwang Mountain”), several toponyms include the name Dal-gi, among which Dal-gi Falls and the Dal-gi Mineral Spring are the most notable. According to Korean legends, “King Zhou” (Juwang, 주왕) of Juwangsan was a king who came from the Tang dynasty. He is therefore not the same figure as King Zhou of the Shang dynasty, Dal-gi’s husband in Chinese mythology. The toponym “Dal-gi” is also believed to have originated during the Joseon dynasty, when the Dal-gi Mineral Spring was first discovered. 
According to local tradition, the sound of the bubbling water was said to resemble the crowing of a rooster — dak in Korean — which sounded similar to Dal-gi. For this reason, the place was named “Dal-gi.” Historical records indicate that the name is unrelated to the Korean pronunciation of Daji (妲己) from Chinese mythology. However, the fact that several place names containing “Dal-gi” (interpreted as Juwang’s wife) are located within Juwangsan (“King Zhou’s Mountain,” referring to Dal-gi’s husband) is considered by many to be more than a coincidence. As a result, many Koreans associate the local toponyms with the characters from Investiture of the Gods. Because the Chinese characters for “barbarian” hu (胡) and “fox” hu (狐) are homophones, several scholars have argued that in Chinese cultural history the image of the ethnic Other was at times projected onto the fox. A similar phonetic overlap exists in Korean, where the Sino-Korean reading ho refers both to foreign tribes (胡) and to the fox (狐). As a result, this conceptual association also appears to have shaped certain Joseon-period narratives. This idea is reflected in late Joseon–period literary works set during the Yuan and Ming dynasties, in which the king of foreign tribes—referred to as Ho-wang (Korean: 호왕; Hanja: 胡王)—is frequently linked to fox spirits. In these tales, Ho-wang is not merely symbolically associated with foxes: he commands formidable fox-spirit generals who fight on his behalf. Within the broader corpus of Joseon war tales (gundam soseol), which commonly adopt the Central Plains under the Yuan and Ming as their temporal and spatial backdrop, the protagonists’ armies repeatedly confront Ho-wang and his retinue of supernatural commanders. Among these supernatural retainers, fox-spirit generals feature prominently, reinforcing the literary convention that equates foreign adversaries with powerful fox entities. 
Then Se-yung-wang began to recite a scripture (or incantation), whereupon terrifying divine generals (神將, sinjang) appeared at his side. They charged in with dreadful momentum, collapsing the battlefield where men and horses were already tangled together. Assaulted on all sides by these fearsome spirit generals, the various commanders and common soldiers alike were struck with terror, able only to stumble about and flail their limbs in panic. In the war tale Cheonjeong-gayeon (Korean: 천정가연; Hanja: 天定佳緣), Se-yung-wang (西戎王) appears as one of the enemy commanders who fight against the protagonists Yu Sin (유신) and Cheongpung (청풍), displaying formidable power on the battlefield. Within this narrative, Ho-wang rules over a group of ten-thousand-year-old animal spirits—a tiger, a snake and a fish—who serve as his generals, and Se-yung-wang is likewise described as a ten-thousand-year-old fox spirit. Suddenly, a young boy descended from the sky and appeared before Ho-wang (胡王). Ho-wang asked, “Who might you be?” The boy replied, “This humble one’s surname is Ho, and my given name is Cheonnyeon. I have dwelled in hiding deep within the valleys of Mt. Hyeongsan for nine hundred years, never once entering the human world… (omitted) Ho-wang deploys a Gyoryong (蛟龍) skilled in illusions together with a kumiho as his vanguard and challenges Sagak once more. However, Sagak repels them with the nobaekgeom and byeongnyeokdo, weapons bestowed upon him by a Daoist adept. In the Sagakjeon (Korean: 사각전), an unusually portrayed kumiho appears in the guise of a young boy. In this narrative, the nine-tailed fox disguises itself as a youthful commander serving under Ho-wang (胡王), bearing the name Ho Cheonnyeon (Korean: 호천년). This boy-general fox spirit leads troops in battle but is ultimately defeated by Sagak, who wields the Daoist swords nobaekgeom and byeongnyeokdo, said in later retellings to be related to the legendary Songbaekdo, a blade forged from a thousand-year-old white pine. 
In Okrumong, a fox spirit wanders the human world disguised as an unnamed woman. After secretly overhearing Baegun Dosa and Gang Namhong discussing transformation magic and immortal arts, the fox begins acting deliberately. She transforms into a stunning woman called Sobosal and becomes the wife of Talhae, king of the southern “barbarian” state Hongdoguk. She then incites Talhae to attack Ming China, persuading him to overthrow Ming rather than remain a tributary. Ming responds by sending generals Yang Changgok and Gang Namhong (a woman fighting in male disguise). Gang Namhong captures Sobosal, and Sobosal begs for mercy, claiming her wrongdoing was partly “fated” and swearing she will leave the human world and take refuge in Buddhism. Gang Namhong—unusually compassionate—releases her, arguing that fox spirits cause chaos mainly when human society lacks virtue, and that killing them all is neither possible nor the true solution. Later, war erupts again in the north, and rumors spread that the wife of Yaryul Seonu, a new barbarian king, is Sobosal. Gang Namhong feels betrayed when she sees a woman identical to Sobosal and is ready to kill her, but then the real Sobosal descends from heaven. She explains that she has truly been cultivating Buddhism in the Western Paradise, while the “Sobosal” on earth is an impostor fox—a former companion who stole her name and appearance. To prove it, the real Sobosal briefly reveals a fox form to command the other fox demons, then seals the impostor and her followers into a gourd. Now having “shed the beast’s body,” Sobosal is no longer merely a fox demon but has become human in both spirit and conduct, living as a genuine Buddhist devotee. Okrumong, composed in the mid-nineteenth century amid the collapse of established social orders and the formation of new value systems, has been read as articulating human ideals and aspirations through a pragmatic, utilitarian outlook. 
Within this framework, the depiction of the fox spirit Sobosal is notably multidimensional: she first appears as a conventional “seductive fox demon,” assuming the form of a beautiful woman to entice a ruler and incite his wrongdoing, but after being captured and bound by the female protagonist, she repents and later suppresses other malevolent spirits. Sobosal ultimately succeeds in becoming human and is portrayed as attaining a higher spiritual state, a rare narrative outcome for a fox spirit. The episode also implies a religious hierarchy in which humanity is positioned above fox spirits in spiritual attainment. The idea that fox spirits gain higher levels of cultivation as they grow more tails was not actively adopted in Korea. Nevertheless, several classical Korean novels implicitly suggest an internal hierarchy among fox spirits that corresponds to the number of tails they possess. Im Seong, overcome with guilt, tries to throw himself into the sea to die, but is rescued by his companions. While they continue their voyage, they encounter a ship from the Country of Women (女人國, yeoin-guk) and, at the invitation of its beautiful women, enter the land. The queen treats them with great hospitality and proposes marriage to Im Seong, intending also to pair the nine princesses with Im Seong’s men. She urges Im Seong to share the nuptial bed with her that very night. Sensing that something is amiss, Im Seong puts off the wedding night for a day on the pretext of preparing for the ceremony. He spends the night devising a plan and puts it into action the following day. He takes a piece of hide from his sleeve and throws it before the queen. At that moment, the hide turns into a yellow dog, which rushes at the queen and bites her to death. After Im Seong and his companions defeat the inhabitants of the Country of Women, they discover that the queen and the nine princesses are foxes with seven, five, or three tails, and that the ladies-in-waiting are also long-lived foxes. 
In the late Joseon dynasty novel Taewonji (태원지), foxes transform themselves into beautiful women and establish a realm called the Country of Women. When the protagonists overthrow the inhabitants of this land, the women are revealed to be fox spirits, and the description of their tails suggests a graded hierarchy among them: the queen is said to have seven tails, the nine princesses are described as having five or three tails, and the ladies-in-waiting are characterized simply as foxes of great age. Ho-mi-a (Korean: 호미아) is described as a seven-tailed fox with golden fur and a disciple of Byeokjin the nine-tailed fox. In Yui-Yangmunrok (劉李兩門錄), the portrayal of Ho-mi-a, a fox spirit bearing seven tails serving under a nine-tailed master, suggests a hierarchical system among fox spirits based on the number of their tails. The Annals of the Joseon Dynasty contain several instances suggesting that the term kumiho (九尾狐, nine-tailed fox) was used as a derogatory expression to refer to officials of rival political factions. During the reign of King Jungjong, Park Sang (pen name: Neoljae), a leading figure of the Sarim faction, likened the opposing Hungu faction to ominous creatures representing the four directions. In a scathing verse, he cursed them, saying: "A male pit viper eyes me from the left, An old owl spies on me from the right. A two-headed serpent coils before me, And a nine-tailed fox crouches behind my back." In the early modern Korean period (early 20th century), the term and concept of kumiho were already in use. Evidence appears in the late-Joseon newspaper Daehan Maeil Sinbo, which includes a tabloid-style report describing the kumiho’s method of luring victims by transforming into a woman, as well as an opinion piece that employs the kumiho as a metaphor for a corrupt ruling class. After the Japanese annexation, Daehan Maeil Sinbo was renamed Maeil Sinbo, which continued to incorporate kumiho folklore into reports of unusual events. 
These include articles describing victims as having “fallen prey to a kumiho,” as well as a piece criticizing a shaman who fabricated stories while invoking the kumiho. These examples indicate that early modern Korean newspapers frequently employed the kumiho as a narrative device, using it both in sensational ghost-story reporting and in social commentary. In 1930, Korean historian Son Jin-tae published Korean Folktales (朝鮮の民話) in Japanese, which includes the story "The Mountain Goddess and the Dragon King (女山神と龍王)." This work reflects his effort to collect and document traditional Korean oral folktales that were circulating among the populace at the time. In the story, the Mountain Goddess is portrayed as a thousand-year-old fox spirit. Long ago, there lived a warrior. One day, as he was walking along the beach, he saw seven boys trying to capture a great turtle with three tails and cut it into seven pieces. The warrior noticed the turtle’s pleading eyes, so he bought it from the boys and released it back into the sea. The turtle then spoke, thanking him. It revealed that it was actually the Dragon King who had come to the human world out of curiosity, only to nearly lose its life to the boys. Grateful for the warrior’s help, the Dragon King returned to the depths of the ocean. The warrior continued his wandering, and one evening he arrived at a lonely house deep in the mountains. An old woman who lived there offered him supper and asked where he planned to go. When the warrior replied that he wished to climb further up the mountain, the old woman tried to dissuade him. She warned that although many had entered the mountain, none had ever returned, for an evil demon woman dwelled there and would harm him. But the warrior answered, “What kind of warrior would I be if I feared such things?” And the next morning, he set out to find the wicked mountain spirit. Deep in the forest, he came upon a house where a beautiful woman welcomed him. 
She led him into her chamber, served him delicious food, and then asked him to marry her. “Living alone in the mountains is so lonely,” she said. “Please stay here with me. I am the Goddess of this mountain.” The warrior refused her offer sternly, saying it was improper. At this, the woman flung a sheet of paper into the air. Suddenly, the sky darkened, and countless fiery blades appeared all around. The warrior trembled, realizing he could not defeat her. Just then, he remembered his encounter with the Dragon King, and he begged the woman for a grace period of seven days. She laughed mockingly, warned him never to forget that no matter where he fled she would be able to find him, and granted his request. When the warrior arrived at the beach and called upon the Dragon King, a boy appeared and guided him to the Dragon Palace. The Dragon King, along with his three younger brothers, went to the place where the Mountain Goddess of Yeosan resided. The Yeosan Goddess laughed loudly and said, “You have come seeking help from the Dragon King. But even with the Dragon King's power, you cannot kill me. Let me show you my true strength.” The goddess wrote something on a piece of paper and sent it floating into the air. Three fiery blades flashed in the sky, each slicing one of the three dragons in two. Once again threatened by the Mountain Goddess, the warrior requested another grace period of one month. Granted permission by the goddess, the warrior went to see the Dragon King again. The Dragon King said that the power of the Dragon Kingdom alone was not enough to defeat the woman and that they must petition the Jade Emperor to kill her. The heavenly gods sent three warriors to strike the mountain goddess. They brought fierce winds and heavy rain, pounding the mountain so violently that heaven and earth trembled. The goddess laughed loudly and said, “This time, you have called upon the heavenly gods. 
This opponent is strong, but let me show you my true power.” She took out a piece of paper and sent something flying into the air, but she could not defeat the three warriors from heaven. Lightning struck the goddess’s house, and when the goddess’s dead body was revealed, she appeared as a large fox. The warrior respectfully thanked the three heavenly warriors and, on his way home, stopped by the house of the old woman who had once been the mountain goddess. The warrior restored her to her former position as the mountain goddess and went to the Dragon Palace to pay respects to the Dragon King as well. This story is a folktale from the Japanese colonial period in Korea and shares notable similarities with fox spirit legends recorded in the Samguk Yusa. The fox spirit holds the status of the Mountain Goddess, reminiscent of Heuk-yeoyoo (the black fox) who aided the monk Won-Gwang. This monstrous fox is depicted as a formidable being capable of simultaneously defeating three dragons and resisting the power of the Dragon King and his kingdom. This portrayal evokes the image of the monstrous fox from the Geotaji and Jakjegeon legends. This suggests that fox spirits were traditionally viewed not only as supernatural mountain gods but also as powerful adversaries of dragons within the local mythological framework. The comic Brown Fox (Korean: 《갈색여우》), published on 22 May 1972, uses foxes as an allegory for the political situation on the Korean Peninsula. Its protagonist, the brown fox, is the central character but is portrayed as petty, sly, and highly skilled at flattering those in power, making him a picaresque-type figure. All of the other foxes in the story, apart from the brown fox, are white. The work is a fable that metaphorically represents the turbulent circumstances of the modern Korean Peninsula through incidents taking place in a forest inhabited by foxes. 
The white foxes stand for the residents of the peninsula, the wolves symbolize foreign powers and dictators, the humans represent a powerful third party that neutralizes those foreign and dictatorial forces, and the brown fox corresponds to a petty collaborator who serves the foreign or dictatorial powers but is ruined once those powers are overthrown by the stronger third party. Although the story depicts wild foxes rather than magical fox spirits, the fox inhabitants of the forest are white-furred like traditional fox spirits. The brown fox, because of his treacherous behavior, is derisively called bul-yeowoo (Korean: 불여우), a term that also appears as one of the appellations for fox spirits in Korean folklore. The presence of such fox-spirit imagery, even in a modern fable about wild foxes, suggests how strongly Koreans have tended to view the fox as an inherently uncanny and suspicious animal. List of contemporary media featuring yeowoo/kumiho In South Korea, the anthology horror television series Hometown of Legends (Jeonseol-ui Gohyang) is widely regarded as having popularized the modern image of the kumiho. The show’s kumiho episodes featured actresses in striking fox-spirit makeup that emphasized visual horror and shock, firmly establishing the kumiho as a central figure in televised horror for the general public. The role of the kumiho was first played by Han Hye-suk in 1979, and was later taken on by many of the era’s leading actresses, including Jang Mi-hee, Jung Yoon-hee, Yoo Ji-in, Kim Mi-sook, Sunwoo Eun-sook, Han Hye-gyeong, and Kim Ja-ok, all of whom either rose to stardom or further solidified their popularity through these appearances. From the 1990s onward, prominent or up-and-coming stars such as Park Sang-ah, Lim Kyung-ok, Song Yoon-ah (in the 1997 production Kumiho), Noh Hyun-hee, and Kim Ji-young were also cast in kumiho roles. 
In Hometown of Legends, the kumiho is portrayed as harboring a strong desire to become human, a characterization that has come to be regarded as an influential prototype for later depictions of the kumiho in contemporary Korean popular culture. From the late 2000s, televised depictions began to move away from traditional horror-focused portrayals, increasingly featuring kumiho characters as familiar or sympathetic female protagonists, and eventually introducing male kumiho characters as well. Kumiho characters in Korean animation have been directly influenced by their portrayals in Korean television dramas, resulting in early animated depictions that emphasize sympathetic or emotionally evocative qualities. For example, the fox spirit in Once Upon a Time with Cabbage Wizard & Radish Wizard, one of the earliest known kumiho-themed animated works, is portrayed as a thousand-year-old fox with a sincere devotion to her family. Likewise, the fox protagonists of Yoranga-Yoranga and Yobi, the Five-Tailed Fox are depicted as friendly, girl-like figures. Although some animated works do portray the kumiho as a malevolent or frightening creature—reflecting the influence of earlier horror-oriented media—such depictions have become increasingly uncommon in recent years. Kingdom of the Winds, which began closed beta testing in 1995, has been recognized by Guinness World Records as the longest-running graphical MMORPG and is often cited as one of the earliest Korean-style fantasy MMORPGs. In the game, the kumiho appears as the final boss of one of its earliest dungeons, the Fox Den, and represents one of the earliest known implementations of a nine-tailed fox in an online game. Nexon, the developer of Kingdom of the Winds, introduced a playable kumiho avatar to the game in 2012 and has actively incorporated kumiho-themed content into several of its other titles. 
Because South Korea has a highly developed online gaming industry and the kumiho has become a representative motif in Korean fantasy, many other online game companies also make frequent use of kumiho-related characters and settings, with some long-running online games adding kumiho content only at a later stage in their service. League of Legends, an online game developed by the American company Riot Games and originally released in 2009, introduced the champion Ahri, a kumiho-inspired character, in 2011 to coincide with the launch of the game’s Korean servers. Ahri’s design drew on Korean nine-tailed fox lore, and her Korean-style name was selected through a poll of Korean players on the official League of Legends website. During the 1990s and 2000s, kumiho characters appeared frequently in print manhwa. In the Kingdom of the Winds manhwa, which later served as the basis for the identically titled online game, the kumiho appears as a minor antagonist. Several romance-oriented manhwa of the period also featured kumiho characters. In Reborn Gumiho, the kumiho protagonist is depicted not as a purely benevolent or malevolent figure but as a morally complex character combining both good and evil traits, while Raon The Gumiho includes a hermaphroditic kumiho character. Following the industry-wide transition from print manhwa to webtoon formats in the 2010s, kumiho characters have continued to appear in a wide variety of works.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Eleventh_government_of_Israel] | [TOKENS: 368] |
Eleventh government of Israel The eleventh government of Israel was formed on 26 June 1963, midway through the fifth Knesset. It was the first government formed by Levi Eshkol following the second resignation of David Ben-Gurion. Eshkol kept the same coalition partners as previously, i.e. Mapai, the National Religious Party, Ahdut HaAvoda, Poalei Agudat Yisrael, Cooperation and Brotherhood, and Progress and Development. There were few changes, with Eshkol replacing Ben-Gurion in the dual role of Prime Minister and Minister of Defense, Pinhas Sapir replacing Eshkol as Minister of Finance, and Zalman Aran replacing Abba Eban as Education Minister, with Eban becoming the country's second Deputy Prime Minister. Eshkol presented it as a "government of continuity". Deputy Ministers were appointed on 1 July. The government resigned following the resignation of Eshkol on 14 December 1964. Eshkol had quit over a dispute with Ben-Gurion concerning the Lavon Affair, which Ben-Gurion had demanded that the Supreme Court investigate. The twelfth government was formed a week later. Prime Minister: Levi Eshkol (Mapai), succeeding David Ben-Gurion (Mapai). Cabinet member notes: (1) Although Gvati was not an MK at the time, he later entered the Knesset as a member of the Alignment, a merger of Mapai and Ahdut HaAvoda. (2) Although Yosef was not an MK at the time, he was a member of Mapai. (3) Although Sasson was not an MK at the time, he was elected to the next Knesset as a member of the Alignment, an alliance of Mapai and Ahdut HaAvoda.