History of the Jews in Hungary
The history of the Jews in Hungary dates back to at least the Kingdom of Hungary, with some records even predating the Hungarian conquest of the Carpathian Basin in 895 CE by over 600 years. Written sources prove that Jewish communities lived in the medieval Kingdom of Hungary, and it is even assumed that several sections of the heterogeneous Hungarian tribes practiced Judaism. Jewish officials served the king during the early 13th-century reign of Andrew II. From the second part of the 13th century, the general religious tolerance decreased and Hungary's policies toward its Jewish population came to resemble those of Western Europe. The Ashkenazi of Hungary were fairly well integrated into Hungarian society by the time of the First World War. By the early 20th century, the community had grown to constitute 5% of Hungary's total population and 23% of the population of the capital, Budapest. Jews became prominent in science, the arts and business. By 1941, over 17% of Budapest's Jews had converted to the Catholic Church. Anti-Jewish policies grew more repressive in the interwar period as Hungary's leaders, who remained committed to regaining the territories lost at the peace agreement (Treaty of Trianon) of 1920, chose to align themselves with the governments of Nazi Germany and Fascist Italy – the international actors most likely to stand behind Hungary's claims. Starting in 1938, Hungary under Miklós Horthy passed a series of anti-Jewish measures in emulation of Germany's Nuremberg Laws. Following the German occupation of Hungary on March 19, 1944, Jews from the provinces were deported to the Auschwitz concentration camp; between May and July that year, 437,000 Jews were sent there from Hungary, most of them gassed on arrival. The 2011 Hungarian census recorded 10,965 people (0.11%) who self-identified as religious Jews, of whom 10,553 (96.2%) declared themselves as ethnic Hungarian. Estimates of Hungary's Jewish population in 2010 range from 54,000 to more than 130,000, mostly concentrated in Budapest. There are many active synagogues in Hungary, including the Dohány Street Synagogue, the largest synagogue in Europe and the second largest synagogue in the world after Temple Emanu-El of New York.
Early history
It is not definitely known when Jews first settled in Hungary. According to tradition, King Decebalus (ruled Dacia 87–106 CE) permitted the Jews who aided him in his war against Rome to settle in his territory. Dacia included part of modern-day Hungary as well as Romania and Moldova and smaller areas of Bulgaria, Ukraine, and Serbia. Prisoners of the Jewish-Roman Wars may have been brought back by the victorious Roman legions normally stationed in Provincia Pannonia (Western Hungary). Marcus Aurelius ordered the transfer of some of his rebellious troops from Syria to Pannonia in 175 CE. These troops had been recruited partly in Antioch and Hemesa (now Homs), which still had a sizable Jewish population at that time. The Antiochian troops were transferred to Ulcisia Castra (today Szentendre), while the Hemesian troops settled in Intercisa (Dunaújváros). According to Raphael Patai, stone inscriptions referring to Jews were found in Brigetio (now Szőny), Solva (Esztergom), Aquincum (Budapest), Intercisa (Dunaújváros), Triccinae (Sárvár), Dombóvár, Siklós, Sopianae (Pécs) and Savaria (Szombathely). 
A Latin inscription, the epitaph of Septima Maria, discovered in Siklós (southern Hungary near Croatian border), clearly refers to her Jewishness ("Judaea"). The Intercisa tablet was inscribed on behalf of "Cosmius, chief of the Spondilla customhouse, archisynagogus Iudeorum [head of the synagogue of the Jews]" during the reign of Alexander Severus. In 2008, a team of archeologists discovered a 3rd-century AD amulet in the form of a gold scroll with the words of the Jewish prayer Shema' Yisrael inscribed on it in Féltorony (now Halbturn, Burgenland, in Austria). Hungarian tribes settled the territory 650 years later. In the Hungarian language, the word for Jew is zsidó, which was adopted from one of the Slavic languages. The first historical document relating to the Jews of Hungary is the letter written about 960 CE to King Joseph of the Khazars by Hasdai ibn Shaprut, the Jewish statesman of Córdoba, in which he says that the Slavic ambassadors promised to deliver the message to the King of Slavonia, who would hand the same to Jews living in "the country of Hungarin", who, in turn, would transmit it farther. About the same time Ibrahim ibn Jacob says that Jews went from Hungary to Prague for business purposes. Nothing is known concerning the Jews during the period of the grand princes, except that they lived in the country and engaged in commerce there. In 1061, King Béla I ordered that markets should take place on Saturdays instead of the traditional Sundays (Hungarian language has preserved the previous custom, "Sunday" = vasárnap, lit. "market day"). In the reign of St. Ladislaus (1077–1095), the Synod of Szabolcs decreed (20 May 1092) that Jews should not be permitted to have Christian wives or to keep Christian slaves. This decree had been promulgated in the Christian countries of Europe since the 5th century, and St. Ladislaus merely introduced it into Hungary. The Jews of Hungary at first formed small settlements, and had no learned rabbis; but they were strictly observant of all the Jewish laws and customs. One tradition relates the story of Jews from Ratisbon (Regensburg) coming into Hungary with merchandise from Russia, on a Friday; the wheel of their wagon broke near Buda (Ofen) or Esztergom (Gran) and by the time they had repaired it and had entered the town, the Jews were just leaving the synagogue. The unintentional Sabbath-breakers were heavily fined. The ritual of the Hungarian Jews faithfully reflected contemporary German customs. Coloman (1095–1116), the successor of St. Ladislaus, renewed the Szabolcs decree of 1092, adding further prohibitions against the employment of Christian slaves and domestics. He also restricted the Jews to cities with episcopal sees – probably to have them under the continuous supervision of the Church. Soon after the promulgation of this decree, Crusaders came to Hungary; but the Hungarians did not sympathize with them, and Coloman even opposed them. The infuriated Crusaders attacked some cities, and if Gedaliah ibn Yaḥya is to be believed, the Jews suffered a fate similar to that of their coreligionists in France, Germany, and Bohemia. The cruelties inflicted upon the Jews of Bohemia induced many of them to seek refuge in Hungary. It was probably the immigration of the rich Bohemian Jews that induced Coloman soon afterward to regulate commercial and banking transactions between Jews and Christians. 
He decreed, among other regulations, that if a Christian borrowed from a Jew, or a Jew from a Christian, both Christian and Jewish witnesses must be present at the transaction. During the reign of King Andrew II (1205–1235) there were Jewish chamberlains and mint-, salt-, and tax-officials. The nobles of the country, however, induced the king, in his Golden Bull (1222), to deprive the Jews of these high offices. When Andrew needed money in 1226, he farmed the royal revenues to Jews, which gave ground for much complaint. Pope Honorius III thereupon excommunicated him, until, in 1233, he promised the papal ambassadors on oath that he would enforce the decrees of the Golden Bull directed against the Jews and the Saracens (by this time the papacy had changed hands, and the pope was now Gregory IX); that he would cause both peoples to be distinguished from Christians by means of badges; and that he would forbid both Jews and Saracens to buy or to keep Christian slaves. The year 1240 was the closing one of the fifth millennium of the Jewish era. At that time the Jews were expecting the advent of their Messiah. The Mongol invasion in 1241 seemed to conform to expectation, as Jewish imagination expected the happy Messianic period to be ushered in by the war of Gog and Magog. Béla IV (1235–1270) appointed a Jewish man named Henul to the office of court chamberlain (Teka had filled this office under Andrew II); and Wölfel and his sons Altmann and Nickel held the castle at Komárom with its domains in pawn. Béla also entrusted the Jews with the mint; and Hebrew coins of this period are still found in Hungary. In 1251 a privilegium was granted by Béla to his Jewish subjects which was essentially the same as that granted by Duke Frederick II the Quarrelsome to the Austrian Jews in 1244, but which Béla modified to suit the conditions of Hungary. This privilegium remained in force down to the Battle of Mohács (1526). At the Synod of Buda (1279), held in the reign of King Ladislaus IV of Hungary (1272–1290), it was decreed, in the presence of the papal ambassador Philip of Fermo, that every Jew appearing in public should wear on the left side of his upper garment a piece of red cloth; that any Christian transacting business with a Jew not so marked, or living in a house or on land together with any Jew, should be refused admittance to the Church services; and that a Christian entrusting any office to a Jew should be excommunicated. Andrew III (1291–1301), the last king of the Árpád dynasty, declared, in the privilegium granted by him to the community of Posonium (Bratislava), that the Jews in that city should enjoy all the liberties of citizens.
Expulsion, readmission and persecution (1349–1526)
Under the rule of the foreign kings who occupied the throne of Hungary on the extinction of the house of Árpád, the Hungarian Jews were subjected to many persecutions. During the time of the Black Death (1349), they were expelled from the country. Even though the Jews were immediately readmitted, they were persecuted again, and in 1360 they were expelled by King Louis the Great of Anjou (1342–1382). Although King Louis had initially shown tolerance to the Jews during the early years of his reign, following his conquest of Bosnia, during which he tried to force the local population to convert from the "heretic" Bogomil Christianity to Catholicism, he attempted to impose conversion on Hungarian Jews as well. He failed in his attempt to convert them to Catholicism, and expelled them. 
They were received by Alexander the Good of Moldavia and Dano I of Wallachia, the latter of whom afforded them special commercial privileges. Some years later, when Hungary was in financial distress, the Jews were readmitted. They learned that during their absence the king had introduced the custom of Tödtbriefe, i.e., cancelling by a stroke of his pen, on the request of a subject or a city, the notes and mortgage-deeds of the Jews. An important office created by Louis was that of the "judge of all the Jews living in Hungary," who was chosen from among the dignitaries of the country, the palatines, and treasurers, and was aided by a deputy. It was his duty to collect taxes from the Jews, protect their privileges, and listen to their complaints, which had become more frequent since the reign of Sigismund of Luxembourg (1387–1437). The successors of Sigismund, namely Albert (1437–1439), Ladislaus Posthumus (1453–1457), and Matthias Corvinus (1458–1490), all likewise confirmed the privilegium of Béla IV. Matthias created the office of Jewish prefect in Hungary. The period following the death of Matthias was a sad one for the Hungarian Jews. He was hardly buried when the people fell upon them, confiscated their property, refused to pay debts owing to them, and persecuted them generally. The pretender John Corvinus, Matthias' illegitimate son, expelled them from Tata, and King Ladislaus II (1490–1516), always in need of money, laid heavy taxes upon them. During his reign, Jews were for the first time burned at the stake, many being executed at Nagyszombat (Trnava) in 1494, on suspicion of ritual murder. For protection, the Hungarian Jews applied to the Holy Roman Emperor Maximilian. On the occasion of the marriage of Louis II and the archduchess Maria (1512), the emperor, with the consent of Ladislaus, took the prefect, Jacob Mendel of Buda, together with his family and all the other Hungarian Jews, under his protection, according them all the rights enjoyed by his other subjects. Under Ladislaus' successor, Louis II (1516–1526), persecution of the Jews was a common occurrence. The bitter feeling against them was in part augmented by the fact that the baptized Emerich Szerencsés, the deputy treasurer, embezzled the public funds.
War against the Ottomans (1526–1686)
The Ottomans vanquished the Hungarians at the Battle of Mohács (29 August 1526), on which occasion Louis II lost his life on the battlefield. When the news of his death reached the capital, Buda, the court and the nobles fled together with some wealthy Jews, among them the prefect. When the grand vizier, Ibrahim Pasha, preceding Sultan Suleiman I, arrived with his army at Buda, the representatives of the Jews who had remained in the city appeared garbed in mourning before him and, begging for grace, handed him the keys of the deserted and unprotected castle in token of submission. The sultan himself entered Buda on September 11; and on September 22 he decreed that all the Jews seized at Buda, Esztergom, and elsewhere, more than 2,000 in number, should be distributed among the cities of the Ottoman Empire. They were sent to Constantinople, Plevna (Pleven) and Sofia, where they maintained their separate community for several decades. In Sofia there existed four Jewish communities in the second half of the 16th century: Romaniote, Ashkenazi, Sephardi and "Ungarus". The overflow of Hungarian Jews from Sofia later also settled in Kavala. 
Although the Ottoman Army turned back after the battle, in 1541 it again invaded Hungary to help repel an Austrian attempt to take Buda. By the time the Ottoman Army arrived, the Austrians had already been defeated, but the Ottomans seized Buda by ruse. While some of the Jews of Hungary were deported to Anatolia, others, who had fled at the approach of the sultan, sought refuge beyond the frontier or in the free royal towns of western Hungary. The widow of Louis II, the queen regent Maria, favored the enemies of the Jews. The citizens of Sopron (Ödenburg) began hostilities by expelling the Jews of that city, confiscating their property, and pillaging the vacated houses and the synagogue. The city of Pressburg (Bratislava) also received permission from the queen (9 October 1526) to expel the Jews living within its territory, because they had expressed their intention of fleeing before the Turks. The Jews left Pressburg on November 9. On that same day the diet at Székesfehérvár was opened, at which János Szapolyai (1526–1540) was elected and crowned king in opposition to Ferdinand. During this session it was decreed that the Jews should immediately be expelled from every part of the country. Szapolyai, however, did not ratify these laws; and the Diet held at Pressburg in December 1526, at which Ferdinand of Habsburg was chosen king (1526–1564), annulled all the decrees of that of Székesfehérvár, including Szapolyai's election as king. As the lord of Bösing (Pezinok) was in debt to the Jews, a blood accusation was brought against these inconvenient creditors in 1529. Although Mendel, the prefect, and the Jews throughout Hungary protested, the accused were burned at the stake. For centuries afterward Jews were forbidden to live at Bösing. The Jews of Nagyszombat (Trnava) soon shared a similar fate, being first punished for alleged ritual murder and then expelled from the city (19 February 1539). The Jews living in the parts of Hungary occupied by the Ottoman Empire were treated far better than those living under the Habsburgs. During the periods of 1546–1590 and 1620–1680, the community of Ofen (Buda), whose Jewish heads of household paid the jizya tax under Ottoman rule, flourished. At the end of the Ottoman era, the approximately one thousand Jews living in Buda worshipped in three synagogues: an Ashkenazi, a Sephardi and a Syrian one. While the Ottomans held sway in Hungary, the Jews of Transylvania (at that time an independent principality) also fared well. At the instance of Abraham Sassa, a Jewish physician of Constantinople, Prince Gabriel Bethlen of Transylvania granted a letter of privileges (18 June 1623) to the Spanish Jews from Anatolia. But the community of Judaizing Szekler Sabbatarians, which had existed in Transylvania since 1588, was persecuted and driven underground in 1638. On 26 November 1572, King Maximilian II (1563–1576) declared his intention to expel the Jews of Pressburg (Bratislava), stating that his edict would be recalled only if they accepted Christianity. The Jews, however, remained in the city without abandoning their religion. They were in constant conflict with the citizens. On 1 June 1582 the municipal council decreed that no one should harbor Jews, or even transact business with them. The feeling against the Jews in that part of the country not under Turkish rule is shown by the decree of the Diet of 1578, to the effect that Jews were to be taxed double the amount imposed upon other citizens. 
By article XV of the law promulgated by the Diet of 1630, Jews were forbidden to take charge of the customs; and this decree was confirmed by the Diet of 1646 on the ground that the Jews were excluded from the privileges of the country, that they were unbelievers, and that they had no conscience (veluti jurium regni incapaces, infideles, et nulla conscientia praediti). The Jews had to pay a special war-tax when the imperial troops set out toward the end of the 16th century to recapture Buda from the Ottomans. The Buda community suffered much during this siege, as did also that of Székesfehérvár when the imperial troops took that city in September 1601; many of its members were either slain or taken prisoner and sold into slavery, their redemption being subsequently effected by the German, Italian, and Ottoman Jews. After the conclusion of peace, which the Jews helped to bring about, the communities were in part reconstructed; but further development in the territory of the Habsburgs was arrested when Leopold I (1657–1705) expelled the Jews (24 April 1671). He, however, revoked his decree a few months later (August 20). During the siege of Vienna, in 1683, the Jews that had returned to that city were again maltreated. The Ottomans plundered some communities in western Hungary, and deported the members as slaves.
Habsburg rule
When the imperial troops recaptured Buda on 2 September 1686, most Jewish residents were massacred; some were captured and later released for ransom. In the following years the whole of Hungary came under the rule of the House of Habsburg. As the devastated country had to be repopulated, Bishop Count Leopold Karl von Kollonitsch, subsequently Archbishop of Esztergom and Primate of Hungary, advised the king to give preference to German Catholics so that the country might in time become German and Catholic. He held that the Jews could not be exterminated at once, but must be weeded out by degrees, as bad coin is gradually withdrawn from circulation. The Diet of Pressburg (1687–1688) passed a decree imposing double taxation upon the Jews, who were not permitted to engage in agriculture, to own any real estate, or to keep Christian servants. This advice soon bore fruit and was in part acted upon. In August 1690, the government at Vienna ordered Sopron to expel its Jews, who had immigrated from the Austrian provinces. The government, desiring to enforce the edict of the last Diet, decreed soon afterward that Jews should be removed from the office of collector. The order proved ineffective, however; and the employment of Jewish customs officials was continued. Even the treasurer of the realm set the example in transgressing the law by appointing (1692) Simon Hirsch as customs farmer at Leopoldstadt (Leopoldov); and at Hirsch's death he transferred the office to Hirsch's son-in-law. The revolt of the Kuruc, under Francis II Rákóczi, caused much suffering to Hungary's Jews. The Kuruc imprisoned and slew the Jews, who had incurred their anger by siding with the king's party. The Jews of Eisenstadt, accompanied by those of the community of Mattersdorf, sought refuge at Vienna, Wiener-Neustadt, and Forchtenstein; those of Holics (Holíč) and Sasvár (Šaštín) dispersed to Göding (Hodonín); while others, who could not leave their business in this time of distress, sent their families to safe places, and themselves braved the danger. 
While not many Jews lost their lives during this revolt, it made great havoc in their wealth, especially in Sopron County, where a number of rich Jews were living. The king granted letters of protection to those that had been ruined by the revolt, and demanded satisfaction for those that had been injured; but in return for these favors he commanded the Jews to furnish the sums necessary for suppressing the revolt. After the restoration of peace the Jews were expelled from many cities that feared their competition; thus Esztergom expelled them in 1712, on the ground that the city which had given birth to St. Stephen must not be desecrated by them. But the Jews living in the country, on the estates of their landlords, were generally left alone. The lot of the Jews was not improved under the reign of Leopold's son, Charles III (1711–1740). He informed the government (June 28, 1725) that he intended to decrease the number of Jews in his domains, and the government thereupon directed the counties to furnish statistics of the Hebrew inhabitants. In 1726 the king decreed that in the Austrian provinces, from the day of publication of the decree, only one male member in each Jewish family be allowed to marry. This decree, restricting the natural increase of the Jews, materially affected the Jewish communities of Hungary. All the Jews in the Austrian provinces who could not marry there went to Hungary to found families; thus the overflow of Austrian Jews peopled Hungary. These immigrants settled chiefly in the northwestern counties, in Nyitra (Nitra), Pressburg (Bratislava), and Trencsén (Trenčín). The Moravian Jews continued to live in Hungary as Moravian subjects; even those that went there for the purpose of marrying and settling promised on oath before leaving that they would pay the same taxes as those living in Moravia. In 1734 the Jews of Trencsén bound themselves by a secret oath that in all their communal affairs they would submit to the Jewish court at Ungarisch-Brod (Uherský Brod) only. In the course of time the immigrants refused to pay taxes to the Austrian provinces. The Moravian Jews, who had suffered by the heavy emigration, then brought complaint; and Maria Theresa ordered that all Jewish and Christian subjects that had emigrated after 1740 should be extradited, while those who had emigrated before that date were to be released from their Moravian allegiance. The government could not, however, check the large immigration; for although strict laws were drafted in 1727, they could not be enforced owing to the good-will of the magnates toward the Jews. The counties either did not answer at all, or sent reports bespeaking mercy rather than persecution. Meanwhile, the king endeavored to free the mining-towns from the Jews – a work which Leopold I had already begun in 1693. The Jews, however, continued to settle near these towns; they displayed their wares at the fairs; and, with the permission of the court, they even erected a foundry at Ság (Sasinkovo). When King Charles ordered them to leave (March 1727), the royal mandate was in some places ignored; in others the Jews obeyed so slowly that he had to repeat his edict three months later. In 1735, another census of the Jews of the country was taken with the view of reducing their numbers. There were at that time 11,621 Jews living in Hungary, of which 2,474 were male heads of families, and fifty-seven were female heads. Of these heads of families 35.31 per cent declared themselves to be Hungarians; the rest had immigrated. 
Of the immigrants 38.35 per cent came from Moravia, 11.05 per cent from Poland, and 3.07 per cent from Bohemia. The largest Jewish community, numbering 770 persons, was that of Pressburg (Bratislava). Most of the Jews were engaged in commerce or industries, most being merchants, traders, or shopkeepers; only a few pursued agriculture. During the reign of Queen Maria Theresa (1740–1780), daughter of Charles III, the Jews were expelled from Buda (1746), and the "toleration-tax" was imposed upon the Hungarian Jews. On September 1, 1749, the delegates of the Hungarian Jews, except those from Szatmár County, assembled at Pressburg and met a royal commission, which informed them that they would be expelled from the country if they did not pay this tax. The frightened Jews at once agreed to do so; and the commission then demanded a yearly tax of ƒ50,000. This sum being excessive, the delegates protested; and although the queen had fixed ƒ30,000 as the minimum tax, they were finally able to compromise on the payment of ƒ20,000 a year for a period of eight years. The delegates were to apportion this amount among the districts; the districts, their respective sums among the communities; and the communities, theirs among the individual members. The queen confirmed this agreement of the commission, except the eight-year clause, changing the period to three years, which she subsequently made five. The agreement, thus ratified by the queen, was brought on November 26 before the courts, which were powerless to relieve the Jews from the payment of this Malkegeld ("queen's money" in Yiddish), as they called it. The Jews, thus burdened by new taxes, thought the time ripe for taking steps to remove their oppressive disabilities. While still at Presburg the delegates had brought their grievances before the mixed commission that was called delegata in puncto tolerantialis taxae et gravaminum Judeorum commissio mixta. These complaints pictured the distress of the Jews of that time. They were not allowed to live in Croatia and Slavonia, in Baranya and Heves Counties, or in several free royal towns and localities; nor might they visit the markets there. At Stuhlweissenburg (Székesfehérvár) they had to pay a poll-tax of 1 florin, 30 kreutzer if they entered the city during the day, if only for an hour. In many places they might not even stay overnight. They therefore begged permission to settle, or at least to visit the fairs, in Croatia and Slavonia and in those places from which they had been driven in consequence of the jealousy of the Greeks and the merchants. The Jews also had to pay heavier bridge-and ferry-tolls than the Christians; at Nagyszombat (Trnava) they had to pay three times the ordinary sum, namely, for the driver, for the vehicle, and for the animal drawing the same; and in three villages belonging to the same district they had to pay toll, although there was no toll-gate. Jews living on the estates of the nobles had to give their wives and children as pledges for arrears of taxes. In Upper Hungary they asked for the revocation of the toleration-tax imposed by the chamber of Zips County (Szepes, Spiš), on the ground that otherwise the Jews living there would have to pay two such taxes; and they asked also to be relieved from a similar tax paid to the Diet. Finally, they requested that Jewish artisans might be allowed to follow their trades in their homes undisturbed. 
The commission laid these complaints before the Queen, indicating the manner in which they could be relieved; and their suggestions were subsequently willed by the queen and made into law. The queen relieved the Jews from the tax of toleration in Upper Hungary only. In regard to the other complaints she ordered that the Jews should specify them in detail, and that the government should remedy them insofar as they came under its jurisdiction. The toleration-tax had hardly been instituted when Michael Hirsch petitioned the government to be appointed primate of the Hungarian Jews to be able to settle difficulties that might arise among them, and to collect the tax. The government did not recommend Hirsch, but decided that in case the Jews should refuse to pay, it might be advisable to appoint a primate to adjust the matter. Before the end of the period of five years the delegates of the Jews again met the commission at Pressburg (Bratislava) and offered to increase the amount of their tax to 25,000 florins a year if the queen would promise that it should remain at that sum for the next ten years. The queen had other plans, however; not only did she dismiss the renewed gravamina of the Jews, but rather imposed stiffer regulations upon them. Their tax of ƒ20,000 was increased to ƒ30,000 in 1760; to ƒ50,000 in 1772; to ƒ80,000 in 1778; and to ƒ160,000 in 1813. Joseph II (1780–1790), son and successor of Maria Theresa, showed immediately on his accession that he intended to alleviate the condition of the Jews, communicating this intention to the Hungarian chancellor, Count Franz Esterházy as early as May 13, 1781. In consequence the Hungarian government issued (March 31, 1783) a decree known as the Systematica gentis Judaicae regulatio, which wiped out at one stroke the decrees that had oppressed the Jews for centuries. The royal free towns, except the mining-towns, were opened to the Jews, who were allowed to settle at leisure throughout the country. The regulatio decreed that the legal documents of the Jews should no longer be composed in Hebrew, or in Yiddish, but in Latin, German, and Hungarian, the languages used in the country at the time, and which the young Jews were required to learn within two years. Documents written in Hebrew or in Yiddish were not legal; Hebrew books were to be used at worship only; the Jews were to organize elementary schools; the commands of the emperor, issued in the interests of the Jews, were to be announced in the synagogues; and the rabbis were to explain to the people the salutary effects of these decrees. The subjects to be taught in the Jewish schools were to be the same as those taught in the national schools; the same text-books were to be used in all the elementary schools; and everything that might offend the religious sentiment of non-conformists was to be omitted. During the early years Christian teachers were to be employed in the Jewish schools, but they were to have nothing to do with the religious affairs of such institutions. After the lapse of ten years a Jew might establish a business, or engage in trade, only if he could prove that he had attended a school. The usual school-inspectors were to supervise the Jewish schools and to report to the government. The Jews were to create a fund for organizing and maintaining their schools. Jewish youth might enter the academies, and might study any subject at the universities except theology. Jews might rent farms only if they could cultivate the same without the aid of Christians. 
Jews were allowed to peddle and to engage in various industrial occupations, and to be admitted into the guilds. They were also permitted to engrave seals, and to sell gunpowder and saltpeter; but their exclusion from the mining-towns remained in force. Christian masters were allowed to have Jewish apprentices. All distinctive marks hitherto worn by the Jews were to be abolished, and they might even carry swords. On the other hand, they were required to discard the distinctive marks prescribed by their religion and to shave their beards. Emperor Joseph regarded this decree so seriously that he allowed no one to violate it. The Jews, in a petition dated April 22, 1783, expressed their gratitude to the emperor for his favors, and, reminding him of his principle that religion should not be interfered with, asked permission to wear beards. The emperor granted the prayer of the petitioners, but reaffirmed the other parts of the decree (April 24, 1783). The Jews organized schools in various places, at Pressburg (Bratislava), Óbuda, Vágújhely (Nové Mesto nad Váhom), and Nagyvárad (Oradea). A decree was issued by the emperor (July 23, 1787) to the effect that every Jew should choose a German surname; and a further edict (1789) ordered, to the consternation of the Jews, that they should henceforth perform military service. After the death of Joseph II the royal free cities showed a very hostile attitude toward the Jews. The citizens of Pest petitioned the municipal council that after May 1, 1790, the Jews should no longer be allowed to live in the city. The government interfered; and the Jews were merely forbidden to engage in peddling in the city. Seven days previously a decree of expulsion had been issued at Nagyszombat (Trnava), May 1 being fixed as the date of the Jews' departure. The Jews appealed to the government; and in the following December the city authorities of Nagyszombat were informed that the Diet had confirmed the former rights of the Jews, and that the latter could not be expelled. The Jews of Hungary handed a petition, in which they boldly presented their claims to equality with other citizens, to King Leopold II (1790–1792) at Vienna on November 29, 1790. He sent it the following day to the chancelleries of Hungary and Moravia for their opinions. The question was brought before the estates of the country on December 2, and the Diet drafted a bill showing that it intended to protect the Jews. This decision created consternation among the enemies of the latter. Nagyszombat (Trnava) addressed a further memorandum to the estates (December 4) in which it demanded that the Diet should protect the city's privileges. The Diet decided in favor of the Jews, and its decision was laid before the king. 
The Jews, confidently anticipating the king's decision in their favor, organized a splendid celebration on November 15, 1790, the day of his coronation; on January 10, 1791, the king approved the bill of the Diet; and the following law, drafted in conformity with the royal decision, was read by Judge Stephen Atzel in the session of February 5: "In order that the condition of the Jews may be regulated pending such time as may elapse until their affairs and the privileges of various royal free towns relating to them shall have been determined by a commission to report to the next ensuing Diet, when his Majesty and the estates will decide on the condition of the Jews, the estates have determined, with the approval of his Majesty, that the Jews within the boundaries of Hungary and the countries belonging to it shall, in all the royal free cities and in other localities (except the royal mining-towns), remain under the same conditions in which they were on Jan. 1, 1790; and in case they have been expelled anywhere, they shall be recalled." Thus came into force the famous law entitled De Judaeis, which forms the thirty-eighth article of the laws of the Diet of 1790–1791. The De Judaeis law was gratefully received by the Jews; for it not only afforded them protection, but also gave them the assurance that their affairs would soon be regulated. Still, although the Diet appointed on February 7, 1791, a commission to study the question, the amelioration of the condition of the Hungarian Jews was not effected till half a century later, under Ferdinand V (r. 1835–1848), during the session of the Diet of 1839–1840. It is estimated that the Jewish population in Hungary grew by about 80% between 1815 and 1840, bolstered by immigration due to the perception of royal tolerance. In consequence of the petition of the Jews of Pest, the mover of which was Philip Jacobovics, superintendent of the Jewish hospital, the general assembly of the county of Pest drafted instructions for the delegates on June 10, 1839, to the effect that if the Jews would be willing to adopt the Magyar language they should be given equal rights with other Hungarian citizens. From now on much attention was paid to the teaching of Hungarian in the schools; Moritz Bloch (Ballagi) translated the Pentateuch into Hungarian, and Moritz Rosenthal the Psalms and the Pirkei Avoth. Various communities founded Hungarian reading-circles; and the Hungarian dress and language were more and more adopted. Many communities began to use Hungarian on their seals and in their documents, and some liberal rabbis even began to preach in that language. At the sessions of the Diet subsequent to that of 1839–1840, as well as in various cities, a decided antipathy—at times active and at times merely passive—toward the Jews became manifest. In sharp contrast to this attitude was that of Baron József Eötvös, who published in 1840 in the Budapesti Szemle, the most prominent Hungarian review, a strong appeal for the emancipation of the Jews. This cause also found a friend in Count Charles Zay, the chief ecclesiastical inspector of the Hungarian Lutherans, who warmly advocated Jewish interests in 1846. Although the session of the Diet convened on November 7, 1847, was unfavorable to the Jews, the latter not only continued to cultivate the Hungarian language, but were also willing to sacrifice their lives and property in the hour of danger. 
During the Revolution of 1848 they displayed their patriotism, even though attacked by the populace in several places at the beginning of the uprising. On March 19 the populace of Pressburg (Bratislava), encouraged by the antipathies of the citizens—who were aroused by the fact that the Jews, leaving their ghetto around Pressburg Castle (Bratislava Castle), were settling in the city itself—began hostilities that were continued after some days, and were renewed more fiercely in April. At this time the expulsion of the Jews from Sopron, Pécs, Székesfehérvár, and Szombathely was demanded; in the last two cities there were pogroms. At Szombathely, the mob advanced upon the synagogue, cut up the Torah scrolls, and threw them into a well. Nor did the Jews of Pest escape, while those at Vágújhely (Nové Mesto nad Váhom) especially suffered from the brutality of the mob. Bitter words against the Jews were also heard in the Diet. Some Jews advised emigration to America as a means of escape; and a society was founded at Pest, with a branch at Pressburg, for that purpose. A few left Hungary, seeking a new home across the sea, but the majority remained.
Revolution and emancipation (1848–1849)
Jews entered the national guard as early as March 1848; although they were excluded from certain cities, they reentered as soon as the danger to the country seemed greater than the hatred of the citizens. At Pest the Jewish national guard formed a separate division. When the national guards of Pápa were mobilized against the Croatians, Leopold Löw, rabbi of Pápa, joined the Hungarian ranks, inspiring his companions by his words of encouragement. Jews were also to be found in the volunteer corps, and among the honvéd and landsturm; and they constituted one-third of the volunteer division of Pest that marched along the Drava against the Croatians, being blessed by Rabbi Schwab on June 22, 1848. Many Jews throughout the country joined the army to fight for their fatherland; among them were Adolf Hübsch, subsequently rabbi at New York City; Solomon Marcus Schiller-Szinessy, afterward lecturer at the University of Cambridge; and Ignatz Einhorn, who, under the name of "Eduard Horn," subsequently became state secretary of the Hungarian Ministry of Commerce. The rebellious Serbians slew the Jews at Zenta who sympathized with Hungary, among them Rabbi Israel Ullmann and Jacob Münz, son of Moses Münz of Óbuda. The conduct of the Jewish soldiers in the Hungarian army was highly commended by Generals Klapka and Görgey. Einhorn estimated the number of Jewish soldiers who took part in the Hungarian Revolution to be 20,000; but this is most likely exaggerated, as Béla Bernstein enumerates only 755 combatants by name in his work, Az 1848-49-iki Magyar Szabadságharcz és a Zsidók (Budapest, 1898). The Hungarian Jews served their country not only with the sword, but also with funds. Communities and individuals, Chevra Kadisha, and other Jewish societies freely contributed silver and gold, armor and provisions, clothed and fed the soldiers, and furnished lint and other medical supplies to the Hungarian camps. Meanwhile, they did not forget to take steps to obtain their rights as citizens. 
When the Diet of 1847–1848 (in which, according to ancient law, only the nobles and those having the rights of nobles might take part) was dissolved (April 11), and the new Parliament – at which under the new laws the delegates elected by the commons also appeared – was convened at Pest (July 2, 1848), the Jews hopefully looked forward to the deliberations of the new body. Many Jews thought to pave the way for emancipation by a radical reform of their religious life. They thought this might ease their way, as legislators in the Diets and articles printed in the press suggested that the Jews should not receive equal civic rights until they reformed their religious practices. This reform had been first demanded in the session of 1839–1840. From this session onward, the press and general assemblies pushed for religious reform. Several counties instructed their representatives not to vote for the emancipation of the Jews until they desisted from practising the externals of their religion. For the purpose of urging Jewish emancipation, all the Jews of Hungary sent delegates to a conference at Pest on July 5, 1848. It chose a commission of ten members to lobby with the Diet for emancipation. The commission delegates were instructed not to make any concessions related to practicing the Jewish faith. The commission soon after addressed a petition to the Parliament for emancipation, but it proved ineffective. The national assembly at Szeged granted emancipation of Jews on Saturday, the eve of the Ninth of Av (July 28, 1849). The bill, which was quickly debated and immediately became a law, fulfilled the hopes of the Reform party. The Jews obtained full citizenship. The Ministry of the Interior was ordered to call a convention of Jewish ministers and laymen for the purpose of drafting a confession of faith, and of inducing the Jews to organize their religious life in conformity with the demands of the time, for instance, business hours on Saturday, the Jewish Sabbath. The bill included the clause referring to marriages between Jews and Christians, which clause both Lajos Kossuth and the Reform party advocated. The Jews' civic liberty lasted for just two weeks. After the Hungarian army's surrender at Világos to Russian troops, which had come to aid the Austrians in suppressing the Hungarian struggle for liberty, the Jews were severely punished by new authorities for having taken part in the uprising. Field Marshal Julius Jacob von Haynau, the new governor of Hungary, imposed heavy war-taxes upon them, especially upon the communities of Pest and Óbuda, which had already been heavily taxed by Alfred I, Prince of Windisch-Grätz, commander-in-chief of the Austrian army, on his triumphant entry into the Hungarian capital at the beginning of 1849. Haynau punished the communities of Kecskemét, Nagykőrös, Cegléd, Albertirsa, Szeged, and Szabadka (now Subotica, Serbia) with equal severity. Numerous Jews were imprisoned and executed; others sought refuge in emigration. Several communities petitioned to be relieved of the war taxes. The ministry of war, however, increased the burden, requiring that the communities of Pest, Óbuda, Kecskemét, Czegléd, Nagykőrös, and Irsa should pay this tax not in kind, but in currency to the amount of ƒ2,300,000 . As the communities were unable to collect such monies, they petitioned the government to remit it. The Jewish communities of the entire country were ordered to share in raising the sum, on the grounds that most of the Jews of Hungary had supported the Revolution. 
Only the communities of Temesvár (now Timișoara, Romania) and Pressburg (now Bratislava, Slovakia) were exempted from this order, as they had remained loyal to the existing Austrian government. The military commission added a clause to the tax requirements, to the effect that individuals or communities might be exempted from the punishment if they could prove by documents or witnesses, before a commission to be appointed, that they had not taken part in the Revolution, either by word or deed, morally or materially. The Jews refused this means of clearing themselves. They declared themselves willing to redeem the tax by collecting a certain sum for a national school fund. Emperor Franz Joseph remitted the war-tax (September 20, 1850), but ordered that the Jews of Hungary without distinction should contribute toward a Jewish school fund of ƒ1,000,000; they raised this sum within a few years.
Struggles for a second emancipation (1859–1867)
While the House of Habsburg controlled Hungary, emancipation of the Jews was postponed. When the Austrian troops were defeated in Italy in 1859, activists pressed for liberty. In that year the cabinet, with Emperor Franz Joseph in the chair, decreed that the status of the Jews should be regulated in agreement with the times, but with due regard for the conditions obtaining in the several localities and provinces. When the emperor convened the Diet on April 2, 1861, Jews pushed for emancipation, but the early dissolution of that body prevented it from taking action in the matter. The decade of absolutism in Hungary (1849–1859) resulted in Jews establishing schools, most of which were in the charge of trained teachers. Based on the Jewish school fund, the government organized model schools at Sátoraljaújhely, Temesvár (Timișoara), Pécs, and Pest. In Pest the Israelite State Teachers' Seminary was founded in 1859, the principals of which have included Abraham Lederer, Heinrich Deutsch, and József Bánóczi. When the Parliament dissolved in 1861, the emancipation of the Jews was deferred to the coronation of Franz Joseph. On December 22, 1867, the question came before the lower house, and on the favorable report of Kálmán Tisza and Zsigmond Bernáth a bill in favor of emancipation was adopted; it was passed by the upper house on the following day. Although the Antisemitic Party was represented in the Parliament, it was not taken seriously by the political elite of the country, and its agitation against Jews was not successful (see Tiszaeszlár affair). On October 4, 1877, the Budapest University of Jewish Studies opened; the university is still operating, and celebrated its 130th anniversary on October 4, 2007. Since its opening, it has been the only such Jewish institute in all of Central and Eastern Europe. In the 1890 Hungarian census, 64.0% of the Jewish population were counted as ethnic Hungarian by mother tongue, 33.1% as German, 1.9% as Slovak, 0.8% as Romanian, and 0.2% as Ruthenian.
Austria-Hungary (1867–1918)
Most Jews did not have family names before 1783, although some family names had been recorded for Jewish families. Emperor Joseph II believed that Germanization could facilitate the centralization of his empire. Beginning in 1783, he ordered Jews to either choose or be given German family names by local committees. The actions were dependent on local conditions. With the rise of Hungarian nationalism, the first wave of Magyarization of family names occurred between 1840 and 1849. After the Hungarian revolution, this process was stopped until 1867. 
After the Ausgleich, many Jews changed their family names from German to Hungarian. In 1942, during World War II, when Hungary was allied with Germany, the Hungarian Defense Ministry was tasked with "race validation." Its officials complained that no Hungarian or German names were "safe," as Jews might have any name. They deemed Slavic names to be "safer", but the decree listed 58 Slavic-sounding names regularly held by Jews. Almost a quarter (22.35%) of the Jews of Hungary lived in Budapest in 1910, and several of the large synagogues built there in this era survive. According to the 1910 census, the number of Jews was 911,227, or 4.99% of the 18,264,533 people living in Hungary (in addition, there were 21,231 Jews in autonomous Croatia-Slavonia). This was a 28.7% increase in absolute terms since the 1890 census, and an increase of 0.3 percentage points (from 4.7%) in the Jewish share of Hungary's overall population. At the time, the Jewish natural growth rate was higher than the Christian one (although the difference had been narrowing), but so was the emigration rate, mainly to the United States. (The total emigration from Austria-Hungary to the U.S. in 1881–1912 was 3,688,000 people, including 324,000 Jews (8.78%). In the 1880–1913 period, a total of 2,019,000 people emigrated from Hungary to the US; applying the same 8.78% share, an estimated 177,000 Jews emigrated from Hungary to the US during this period.)[citation needed] The schism in Hungarian Jewry saw the development of three religious denominations. Budapest, the South and the West had a Neolog majority (related to modern US Conservative and Reform Judaism – the kippah and the organ were both used in religious worship in the synagogues). Traditionalists ("Status quo ante") were the smallest of the three, mainly in the North. The East and North of the country were overwhelmingly Orthodox (more orthodox than "status quo ante"). In broad terms, Jews whose ancestors had come from Moravia in the 18th century tended to become Neolog at the split in 1869; those whose ancestors were from Galicia identified as Orthodox.[citation needed] The net loss for Judaism due to conversions was relatively low before the end of the Great War: 240 people/year between 1896 and 1900, 404 between 1901 and 1910, and 435 people/year between 1911 and 1917. According to records, 10,530 people left Judaism and 2,244 converted to Judaism between 1896 and 1917. The majority (75.7%) of the Jewish population reported Hungarian as their primary language, so they were counted as ethnically Hungarian in the census. The Yiddish speakers were counted as ethnically German. According to this classification, 6.94% of the ethnic Hungarians and 11.63% of the Germans of Hungary were Jewish. In total, Hungarian speakers made up a 54.45% majority in Hungary; German speakers (including those who spoke Yiddish) made up 10.42% of the population.[citation needed] The population of the capital, Budapest, was 23% Jewish (about the same ratio as in New York City)[citation needed]. This community had established numerous religious and educational institutions. Pest was more Jewish than Buda. The prosperity and the cultural and financial prominence of Budapest's large Jewish community attested to its successful integration. 
Indeed, commentators opined in 1911 that Hungary had "absorbed" its Jews and that "it has come to pass that there is no anti-Semitism in Budapest, although the Hebrew element is proportionately much larger (21% as compared to 9%) than it is in Vienna, the Mecca of the Jew-baiter". At that time Karl Lueger, mayor of Vienna, referred to the Hungarian capital as "Judapest", alluding to its high proportion of Jews. Budapest had the second largest Jewish population among the world's cities, after New York.[citation needed] Before the Hungarian Revolution of 1848, Jews in Hungary were prevented from owning land, which resulted in many going into business. In 1910, 60.96% of the merchants, 58.11% of the book printers, 41.75% of the innkeepers, 24.42% of the bakers, 24.07% of the butchers, 21.04% of the tailors, and 8.90% of the shoemakers of Hungary were Jewish, as were 48.5% of the physicians in the country (2,701 out of 5,565). In the 1893–1913 period, Jews made up roughly 20% of gimnázium high school students (where classical subjects were emphasized) and 37% of reál high school students (where practical subjects were emphasized).[citation needed] The strong class divisions of Hungary were represented in the Jewish population. About 3.1% of the Jews belonged to the "large employer" and "agricultural landowner of more than 100 hold, i.e. 57 hectares" class, 3.2% to the "small (<100 hold) landholder" class, and 34.4% to the "working", i.e. wage-earning employee class, while 59.3% belonged to the self-employed or salary-earning middle class. Stephen Roth writes, "Hungarian Jews were opposed to Zionism because they hoped that somehow they could achieve equality with other Hungarian citizens, not just in law but in fact, and that they could be integrated into the country as Hungarian Israelites. The word 'Israelite' (Hungarian: Izraelita) denoted only religious affiliation and was free from the ethnic or national connotations usually attached to the term 'Jew'." Hungarian Jews attained remarkable achievements in business, culture and, less frequently, even in politics. By 1910 about 900,000 religious Jews made up approximately 5% of the population of Hungary and about 23% of Budapest's citizenry. Jews accounted for 54% of commercial business owners, 85% of financial institution directors and owners in banking, 62% of all employees in commerce, 20% of all general grammar school students, 37% of all commercial scientific grammar school students, 31.9% of all engineering students, and 34.1% of all students in the human faculties of the universities. Religious Jews accounted for 48.5% of all physicians[citation needed] and 49.4% of all lawyers/jurists in Hungary.[citation needed] During the cabinet of Prime Minister István Tisza, three Jewish men were appointed as ministers: Samu Hazai (Minister of War), János Harkányi (Minister of Trade), and János Teleszky (Minister of Finance). By 1910, 22% of the members of Parliament were Jews (45% in the governing National Party of Work). While the Jewish population of the lands of the Dual Monarchy was about five percent, Jews made up nearly eighteen percent of the reserve officer corps. Thanks to the modernity of the constitution and to the benevolence of Emperor Franz Joseph, the Austrian Jews came to regard the era of Austria-Hungary as a golden era of their history. 
In absolute numbers, Budapest had by far the largest number of Jews (203,000), followed by Nagyvárad (Oradea) with 15,000, Újpest and Miskolc with about 10,000 each, Máramarossziget (Sighetu Marmaţiei), Munkács (Mukachevo), Pozsony (Bratislava) and Debrecen with about 8,000 each, and Kolozsvár (Cluj-Napoca), Szatmárnémeti (Satu Mare), Temesvár (Timișoara) and Kassa (Košice) with about 7,000 each.[citation needed]
Interwar period (1918–1939)
Using data from the 1910 census, 51.7% of the Hungarian Jews lived in territories that stayed inside the "small" Hungary after 1921, 25.5% (232,000) in territories that later became part of Czechoslovakia, 19.5% (178,000) in territories that became part of Romania, 2.6% (23,000) in territories that became part of Yugoslavia, 0.5% (5,000) in territories that became part of Austria, and 0.2% (2,000) in Fiume, which became part of Italy after 1924. According to the censuses of 1930–1931, 238,460 Jews lived in the parts of Czechoslovakia, 192,833 in the parts of Romania, and about 22,000 in the parts of Yugoslavia formerly belonging to Hungary, which means that the overall number of people declaring themselves Jewish remained roughly unchanged in the Carpathian Basin between 1910 and 1930 (a decrease of 26,000 in post-WWI Hungary, an increase of 6,000 in Czechoslovakia and an increase of 15,000 in Romania).[citation needed] According to the census of December 1920 in the "small" Hungary, the percentage of Jews had increased in the preceding decade in Sátoraljaújhely (to 30.4%), Budapest (23.2%), Újpest (20.0%), Nyíregyháza (11.7%), Debrecen (9.9%), Pécs (9.0%), Sopron (7.5%), Makó (6.4%), Rákospalota (6.1%), Kispest (5.6%) and Békéscsaba (5.6%), while it had decreased in the other 27 towns with more than 20 thousand inhabitants. Overall, 31.1% of the Jewish population lived in villages and towns with fewer than 20 thousand inhabitants.[citation needed] In 1920, 46.3% of the medical doctors, 41.2% of the veterinarians and 21.4% of the pharmacists of Hungary were Jewish, as well as 34.3% of the journalists, 24.5% of the performers of music, 22.7% of the theater actors, and 16.8% of the painters and sculptors. Among the owners of land of more than 1000 hold, i.e. 570 hectares, 19.6% were Jewish. Among the 2,739 factories in Hungary, 40.5% had a Jewish owner. Census figures record the number of people who declared themselves Israelite (Jewish) at each census inside the post-WWI territory of Hungary; between 1920 and 1945, it was illegal for Hungarians to fail to declare their religion. A person's religion was written on their birth certificate and marriage license (except in 1919, during the short-lived Commune, see Hungarian Soviet Republic), and even on a child's school grade reports.[citation needed] The net loss for Judaism due to official conversions was 26,652 people between 1919 and 1938: 4,288 people converted to the faith, while 30,940 left it. The endpoints of this period, 1919–1920 (white terror) and 1938 (anti-Jewish law), contributed more than half of this loss; between 1921 and 1930, the net loss rested around pre-war levels (260 people/year). In 1926, districts I, II and III of Buda were 8%, 11% and 10% Jewish, respectively. The 19,000 Jews of Buda constituted about 9.3% of both the total population of Buda and the entire Jewish population of Budapest. On the left (Pest) side of the Danube, downtown Pest (Belváros, then district IV) was 18% Jewish. Districts V (31%), VI (28%), VII (36%), VIII (22%) and IX (13%) had large Jewish populations, while district X had 6%. 
The four Neolog communities of Budapest (I-II, III, IV-IX, X) had a total of 66,300 members paying their dues, while the Orthodox community had about 7,000 members paying religious taxes.[citation needed] In the countryside of post-WWI Hungary, the Orthodox had a slight edge (about 49%) over the Neolog (46%). Budapest and the countryside combined, 65.72% of the 444,567 Jews belonged to Neolog communities, 5.03% to status quo ante communities, and 29.25% to Orthodox communities in 1930. On the territory of the "small" Hungary, the Jewish communities suffered a 5.6% decline in the 1910–1930 period due to emigration and conversion.[citation needed] The Jews of Hungary were fairly well integrated into Hungarian society by the time of the First World War. Class distinction was very significant in Hungary in general, and among the Jewish population in particular. Rich bankers, factory owners, lower-middle-class artisans and poor factory workers did not mingle easily. In 1926, there were 50,761 Jewish families living in Budapest. Of that number, 65% lived in apartments that contained one or two rooms, 30% had three or four rooms, while 5% lived in apartments with more than four rooms.[citation needed]

Education. The following chart illustrates the effect of the 1920 "numerus clausus" law on the percentage of Jewish university students at two Budapest universities. Those who could afford it went to study in other European countries such as Austria, Germany, Italy and Czechoslovakia. Seven of the thirteen Nobel Prize winners born in Hungary were Jewish. In sports, 55.6% of Hungary's individual gold medal winners at the Summer Olympic Games between 1896 and 1912 were Jewish. This number dropped to 17.6% in the interwar period of 1924–1936.[citation needed]

Revolution

More than 10,000 Jews died and thousands were wounded and disabled fighting for Hungary in World War I. But these sacrifices by patriotic Hungarian Jews may have been outweighed by the chaotic events following the war's end. With the defeat and dissolution of the Austro-Hungarian Empire, Hungary would be forced by the Allies to adhere to the Treaty of Trianon, which ceded to neighboring nations fully two-thirds of Hungary's imperial territory and two-thirds of its population, including a third of its ethnically Magyar citizens and many Jews. These losses provoked deep anger and hostility in the remaining Hungarian population. The first post-war government was led by Mihály Károlyi, and was the first modern effort at liberal democratic government in Hungary. But it was cut short in a spasm of communist revolution, which would have serious implications for the manner in which Hungarian Jews were viewed by their fellow countrymen.[citation needed] In March 1919, Communist and Social Democratic members of a coalition government ousted Károlyi; soon afterward, on March 21, with their Social Democratic colleagues willing neither to accept nor to refuse the Vix Note demanding the cession of a significant part of the Great Plain to Romania, the Communists took control of Hungary's governing institutions. While popular at first among Budapest's progressive elite and proletariat, the so-called Hungarian Soviet Republic fared poorly in almost all of its aims, particularly its efforts to regain the territories occupied by Slovakia (although achieving some temporary success there) and Romania.
All the less palatable excesses of Communist uprisings were in evidence during these months, particularly the formation of squads of brutal young men practicing what they called "revolutionary terror" to intimidate and suppress dissident views. All of the revolution's leaders but one, Sándor Garbai – including Béla Kun, Tibor Szamuely, and Jenő Landler – were of Jewish ancestry. As in other countries where Communism was viewed as an immediate threat, the presence of ethnic Jews in positions of revolutionary leadership helped foster the notion of a Jewish-Bolshevik conspiracy. Kun's regime was crushed after four and a half months when the Romanian army entered Budapest; it was quickly followed by the reactionary forces under the command of the former Austro-Hungarian admiral, Miklós Horthy.[citation needed] The sufferings endured during the brief revolution, and their exploitation by ultra-nationalist movements, helped generate stronger suspicions among non-Jewish Hungarians and undergirded pre-existing antisemitic views.[citation needed] Beginning in July 1919, officers of Horthy's National Army engaged in a brutal string of counter-reprisals against Hungarian communists and their allies, real or imagined. This series of pogroms directed at Jews, progressives, peasants and others is known as the White Terror. Horthy's personal role in these reprisals is still a subject of debate (in his memoirs he refused to disavow the violence, saying that "only an iron broom" could have swept the country clean). Tallying the numbers of victims of the different terror campaigns in this period is still a matter of some political dispute, but the White Terror is generally considered to have claimed an order of magnitude more lives than the repressions of the Kun regime (thousands versus hundreds).

Interwar years

Jews represented one-fourth of all university students and 43 percent of the students at the Budapest Technological University. In 1920, 60 percent of Hungarian doctors, 51 percent of lawyers, 39 percent of all privately employed engineers and chemists, 34 percent of editors and journalists, and 29 percent of musicians identified themselves as Jews by religion. Resentment of this Jewish success was widespread: Admiral Horthy himself declared that he was "an anti-Semite", and remarked in a letter to one of his prime ministers, "I have considered it intolerable that here in Hungary everything, every factory, bank, large fortune, business, theater, press, commerce, etc. should be in Jewish hands, and that the Jew should be the image reflected of Hungary, especially abroad." Unfortunately for the Jews, they had also become, by a quirk of history, the most visible minority remaining in Hungary (besides ethnic Germans and Gypsies); the other large "non-Hungarian" populations (including Slovaks, Slovenes, Croats, and Romanians, among others) had been abruptly excised from the Hungarian population by the territorial losses at Trianon. That, together with the highly visible role of Jews in the economy, the media and the professions, as well as in the leadership of the 1919 Communist dictatorship, left Hungary's Jews as an ethnically separate group which could serve as a scapegoat for the nation's ills. The scapegoating began quickly.
In 1920, Horthy's government passed a "numerus clausus" law that limited the number of minority students to a proportion matching each group's share of the population, thus restricting Jewish enrollment at universities to five percent or less.[citation needed] Anti-Jewish policies grew more repressive in the interwar period as Hungary's leaders, who remained committed to regaining territories lost in WWI, chose to align themselves (albeit warily) with the fascist governments of Germany and Italy – the international actors most likely to stand behind Hungary's claims. The interwar years also saw the emergence of flourishing fascist groups, such as the Hungarian National Socialist Party and the Arrow Cross Party.

Jewish Conversions to Christianity in Hungary

About 20,000 conversions took place in Hungary between 1867 and 1918.

Anti-Jewish measures

Starting in 1938, Hungary under Miklós Horthy passed a series of anti-Jewish measures in emulation of Germany's Nuremberg Laws. Jews' employment in government at any level was forbidden, they could not serve as newspaper editors, and their numbers were restricted to six percent among theater and movie actors, physicians, lawyers and engineers. Private companies were forbidden to employ more than 12% Jews. Some 250,000 Hungarian Jews lost their income. Most of them lost their right to vote as well: before the second Jewish law, about 31% of the Jewish population of Borsod county (excluding Miskolc), some 2,496 people, had this right. At the next elections, held less than a month after this new anti-Jewish legislation, only 38 privileged Jews could vote. In the elections of May 28–29, 1939, the Nazi and Arrow Cross (Nyilas) parties received a quarter of the votes and 52 out of 262 seats. Where they were on the ballot at all, their support was even larger, usually between one-third and one-half of the votes, since they were not listed in large parts of the country. For instance, support for the Nazi parties was above 43% in the election districts of Zala, Győr-Moson, the Budapest surroundings, and Central and Northern Pest-Pilis, and above 36% in Veszprém, Vas, Szabolcs-Ung, Sopron, Nógrád-Hont, Jász-Nagykun, Southern Pest town and Buda town. The Nazi parties were not on the ballot mainly in the eastern third of the country and in Somogy, Baranya, Tolna and Fejér. Their smallest support was in Békés county (15%), Pécs town (19%), Szeged town (22%) and Northern Pest town (27%).

According to Magyarország történelmi kronológiája, the census of January 31, 1941, found that 6.2% of the population of 13,643,621, i.e. 846,000 people, were considered Jewish according to the racial laws of that time. In addition, in April 1941, Hungary annexed the Bácska (Bačka), Muraköz (Međimurje County) and Muravidék (Prekmurje) regions from occupied Yugoslavia, with 1,025,508 people including 15,000 Jews (data from October 1941). This means that inside the May 1941 borders of Hungary, there were 861,000 people (or 5.87%) who were at least half Jewish and were therefore considered Jewish.
Of this number, 725,000 (or 4.94%) were Jewish in accordance with Jewish religious law (4.30% in pre-1938 Hungary, 7.15% in the territories annexed from Czechoslovakia and Romania in 1938–1940 and 1.38% in the territories annexed from Yugoslavia in 1941).[citation needed] The following is from another source, a statistical summary written at the beginning of 1944 and referring to the 1941 census data: the question about Jewish grandparents was added to the questionnaires late in the preparation of the 1941 census, when some of the sheets had already been printed. In addition, many Christians of Jewish ancestry did not answer this question truthfully. So while about 62,000 Christians admitted some Jewish ancestry (including 38,000 in Budapest), their actual number was estimated at no fewer than 100,000. It is not clear whether the 10,000–20,000 Jewish refugees (from Poland and elsewhere) were counted in the January 1941 census. They, together with anyone who could not prove legal residency since 1850 – about 20,000 people in all – were deported to southern Poland and either abandoned there or handed over to the Germans between July 15 and August 12, 1941. In practice, the Hungarians deported many people whose families had lived in the area for generations. In some cases, applications for residency permits were allowed to pile up without action by Hungarian officials until after the deportations had been carried out. The vast majority of those deported were massacred in Kameniec-Podolsk (Kamianets-Podilskyi massacre) at the end of August.[b] In the massacres of Újvidék (Novi Sad) and nearby villages, 2,550–2,850 Serbs, 700–1,250 Jews and 60–130 others were murdered by the Hungarian Army and "Csendőrség" (Gendarmerie) in January 1942. Those responsible – Ferenc Feketehalmy-Czeydner, Márton Zöldy, József Grassy, László Deák and others – were later tried in Budapest in December 1943 and sentenced, but some of them escaped to Germany.[citation needed]

During the war, Jews were called up to serve in unarmed "labour service" (munkaszolgálat) units, which were used to repair bombed railroads, build airports or clear minefields at the front with their bare hands. Approximately 42,000 Jewish labour service troops were killed at the Soviet front in 1942–43, of whom about 40% perished in Soviet POW camps. Many died as a result of harsh conditions on the Eastern Front and cruel treatment by their Hungarian sergeants and officers. Another 4,000 forced laborers died in the copper mine of Bor, Serbia. Nevertheless, Miklós Kállay, Prime Minister from March 9, 1942, and Regent Horthy resisted German pressure and refused to allow the deportation of Hungarian Jews to the German extermination camps in occupied Poland. This "anomalous" situation lasted until March 19, 1944, when German troops occupied Hungary and forced Horthy to oust Kállay.[citation needed]

The Holocaust

On March 18, 1944, Adolf Hitler summoned Horthy to a conference in Austria, where he demanded greater acquiescence from the Hungarian state. Horthy resisted, but his efforts were fruitless – German tanks rolled into Budapest while he attended the conference.[citation needed] On March 23, the government of Döme Sztójay was installed. Among his first moves, Sztójay legalised the Arrow Cross Party, which quickly began organising. During the four-day interregnum following the German occupation, the Ministry of the Interior was put in the hands of László Endre and László Baky, right-wing politicians well known for their hostility to Jews.
Their boss, Andor Jaross, was another committed antisemite.[citation needed] Immediately after the occupation, the German and Hungarian authorities established Jewish councils (Judenräte) throughout the country. A few days later, Ruthenia, Northern Transylvania, and the border region with Croatia and Serbia were placed under military command. On April 9, Prime Minister Döme Sztójay and the Germans obliged Hungary to place 300,000 Jewish laborers at the disposal of the Reich. Five days later, on April 14, Endre, Baky, and Adolf Eichmann, the SS officer in charge of organising the deportation of Hungarian Jews to the German Reich, decided to deport all the Jews of Hungary.[citation needed] Although the BBC Polish Service broadcast news of the exterminations in 1943, the BBC Hungarian Service did not discuss the Jews. A 1942 memo for the BBC Hungarian Service, written by Carlile Macartney, a British Foreign Office adviser on Hungary, said: "We shouldn't mention the Jews at all." Macartney believed that most Hungarians were antisemitic and that mentioning the Jews would alienate much of the population.[c] Most of the Jews did not believe that the Holocaust could happen in Hungary: "This might be happening in Galicia to Polish Jews, but this can't happen in our very cultivated Hungarian state."[d]

According to Yehuda Bauer, when the deportations to Auschwitz began in May 1944, the Zionist youth movements organized the smuggling of Hungarian Jews into Romania. Around 4,000 Hungarian Jews were smuggled into Romania, including the smugglers and those who paid them on the border. The Romanians agreed to let those Jews in, despite heavy German pressure.[e] However, Romania allied itself with Nazi Germany from 1940 to 1944. Despite Romania not being under German occupation, during the dictatorship of Ion Antonescu, 380,000–400,000 Jews were murdered in the Holocaust in Romanian-controlled areas such as Bessarabia, Bukovina, and Transnistria.

SS-Obersturmbannführer Adolf Eichmann, whose duties included supervising the extermination of Jews, set up his staff in the Majestic Hotel and proceeded rapidly to round up Jews from the Hungarian provinces outside Budapest and its suburbs. The yellow-star and ghettoization measures, and the deportations, were accomplished in less than eight weeks, with the enthusiastic help of the Hungarian authorities, particularly the gendarmerie (csendőrség). The plan was to use 45 cattle cars per train, 4 trains a day, to deport 12,000 Jews to Auschwitz every day from the countryside, starting in mid-May; this was to be followed by the deportation of the Jews of Budapest from about July 15. Just before the deportations began, the Vrba-Wetzler Report reached Allied officials. Details from the report were broadcast by the BBC on June 15, and printed in The New York Times on June 20. World leaders, including Pope Pius XII (June 25), President Franklin D. Roosevelt (June 26), and King Gustaf V of Sweden (June 30), subsequently pleaded with Horthy to use his influence to stop the deportations. Roosevelt specifically threatened military retaliation if the transports did not cease. On July 7, Horthy at last ordered the transports halted. According to historian Péter Sipos, the Hungarian government had already known about the Jewish genocide since 1943.
Horthy's son and daughter-in-law both received copies of the Vrba-Wetzler report in early May, before the mass deportations began.[f] The Vrba-Wetzler Report is believed to have been passed to Hungarian Zionist leader Rudolf Kastner no later than April 28, 1944; however, Kastner did not make it public. The first transports to Auschwitz began in early May 1944 and continued even as Soviet troops approached. The Hungarian government was solely in charge of the Jews' transportation up to the northern border. The Hungarian commander of the Kassa (Košice) railroad station meticulously recorded the trains heading to Auschwitz, with their places of departure and the number of people inside them. The first train went through Kassa on May 14. On a typical day, there were three or four trains, with between 3,000 and 4,000 people on each train, for a total of approximately 12,000 Jews delivered to the extermination facilities each day. There were 109 trains during these 33 days through June 16 (on some days there were as many as six trains). Between June 25 and 29, there were 10 trains, followed by an additional 18 trains on July 5–9. The 138th recorded train heading to Auschwitz via Kassa, carrying the 400,426th victim, passed through on July 20. Another 10 trains, with more than 24,000 people, were sent to Auschwitz via other routes (the first two left Budapest and Topolya on April 29 and arrived at Auschwitz on May 2), while 7 trains with 20,787 people went to Strasshof between June 25 and 28 (two each from Debrecen, Szeged, and Baja; one from Szolnok). The unique Kastner train left for Bergen-Belsen with 1,685 people on June 30. "There is no doubt that this is probably the greatest and most horrible crime ever committed in the whole history of the world ...", wrote British Prime Minister Winston Churchill in July 1944.

By July 9, 1944, 437,402 Jews had been deported, according to the official German reports of Edmund Veesenmayer, the Reich plenipotentiary in Hungary.[g] One hundred and forty-seven trains were sent to Auschwitz, where most of the deportees were murdered on arrival. Because the crematoria could not cope with the corpses, special pits were dug near them, where bodies were burned. It has been estimated that one-third of the victims murdered at Auschwitz were Hungarian. For most of this period, about 12,000 Jews were delivered to Auschwitz on a typical day; among the deportees was the 15-year-old Elie Wiesel, the future writer and Nobel Prize winner. Photographs taken at Auschwitz were found after the war showing the arrival of Jews from Hungary at the camp. The Hungarian gendarmes' devotion to the cause of the "final solution" surprised even Eichmann himself, who supervised the operation with only twenty officers and a staff of 100, including drivers, cooks and other support personnel. Very few Catholic or Protestant clergy members raised their voices against sending the Jews to their death[citation needed] (a notable exception was Bishop Áron Márton's sermon in Kolozsvár on May 18). The Catholic Primate of Hungary, Jusztinián Serédi, decided not to issue a pastoral letter condemning the deportation of the Jews.[citation needed] On June 14, the Mayor of Budapest, Ákos Farkas, designated about 1,950 yellow-star houses (about 5% of the city's houses), into which all of the city's Jews (more than 20% of its population) were required to move. The authorities thought that the Allies would not bomb Budapest because the "starred" houses were scattered around the town. On June 15, American bombers dropped leaflets over Budapest threatening punishment for those involved in the deportation of Hungarian Jews. At the end of June, Pope Pius XII, King Gustaf V of Sweden, and, in strong terms, U.S. President Franklin D.
Roosevelt urged an immediate halt to the deportations. Horthy ordered the suspension of all deportations on July 6. Nonetheless, another 45,000 Jews were deported from the Trans-Danubian region and the outskirts of Budapest to Auschwitz after this date. "After the failed attempt on Hitler's life, the Germans backed off from pressing Horthy's regime to continue further, large-scale deportations, although some smaller groups continued to be deported by train. In late August, Horthy refused Eichmann's request to restart the deportations. Himmler ordered Eichmann to leave Budapest."[h] The Sztójay government rescheduled the deportation of the Jews of Budapest to Auschwitz for August 27. But the Romanians switched sides on August 23, 1944, causing huge problems for the German military. Himmler ordered the cancellation of further deportations from Hungary on August 25, in return for nothing more than Saly Mayer's promise to see whether the Germans' demands would be met.[i] Horthy finally dismissed Prime Minister Sztójay on August 29, the same day the Slovak National Uprising against the Nazis started. Despite the change of government, Hungarian troops occupied parts of Southern Transylvania, Romania, and massacred hundreds of Jews in Kissármás (Sărmașu; Sărmașu massacre), Marosludas (Luduș; Luduș massacre) and other places starting September 4.

After the Nyilaskeresztes (Arrow Cross) coup d'état on October 15, tens of thousands of the Jews of Budapest were sent on foot to the Austrian border in death marches, most of the forced laborers who had until then served under Hungarian Army command were deported (for instance, to Bergen-Belsen), and two ghettos were set up in Budapest. The small "international ghetto" consisted of several "starred" houses under the protection of neutral powers in the Újlipótváros district. Switzerland was allowed to issue 7,800 Schutzpasses, Sweden 4,500, and the Vatican, Portugal, and Spain 3,300 combined. The big Budapest ghetto was set up and walled off in the Erzsébetváros part of Budapest on November 29. Nyilas raids and mass executions occurred regularly in both ghettos. In addition, in the months between November 1944 and February 1945, the Nyilas shot 10,000–15,000 Jews on the banks of the Danube. Soviet troops liberated the big Budapest ghetto on January 18, 1945. On the Buda side of the town, the encircled Nyilas continued their murders until the Soviets took Buda on February 13. The names of several diplomats – Raoul Wallenberg, Carl Lutz, Ángel Sanz Briz, Giorgio Perlasca, Carlos Sampaio Garrido, and Carlos de Liz-Texeira Branquinho – deserve mention, as do some members of the army and police who saved people (Pál Szalai, Károly Szabó, and other officers who took Jews out of camps with fake papers), an Interior Ministry official (Béla Horváth) and some church institutions and personalities. Rudolf Kastner deserves special attention because of his protracted negotiations with Adolf Eichmann and Kurt Becher to prevent deportations to Auschwitz; he succeeded only minimally, diverting Jews to still-horrific labor battalions in Austria and ultimately saving 1,680 Jews on the Kastner train. An estimated 119,000 Jewish people were liberated in Budapest (25,000 in the small "international" ghetto, 69,000 in the big ghetto, and 25,000 hiding with false papers), along with 20,000 forced laborers in the countryside. Almost all the surviving deportees – 116,000 people – returned between May and December 1945, if only to learn the fate of their families.
It is estimated that from an original population of 861,000 people considered Jewish inside the borders of 1941–1944, about 255,000 survived, a survival rate of 29.6 percent. According to another calculation, Hungary's Jewish population at the time of the German invasion was 800,000, of whom 365,000 survived.

Communist rule

At the end of World War II, only 140,000 Jews remained in Hungary, down from 750,000 in 1941. The difficult economic situation, coupled with the lingering antisemitic attitudes of the population, prompted a wave of emigration. Between 1945 and 1949, 40,000–50,000 Jews left Hungary for Israel (30,000–35,000) and Western countries (15,000–20,000). Between 1948 and 1951, 14,301 Hungarian Jews immigrated to Israel, and after 1951 exit visas became increasingly expensive and restricted. People of Jewish origin dominated the post-war Communist regime until 1952–53, when many were removed in a series of purges. During its first years, the regime's top membership and secret police were almost entirely of Jewish origin, albeit staunchly anti-religious. Leaders like Mátyás Rákosi, Ernő Gerő and Gábor Péter repudiated Judaism and were strict atheists, in keeping with Communist doctrine. Indeed, under Communist rule from 1948 to 1988, Zionism was outlawed and Jewish observance was curtailed. Moreover, members of the upper class, Jews and Christians alike, were expelled from the cities to the provinces for 6–12 months in the early 1950s. Jews were on both sides of the 1956 uprising. Some armed rebel leaders, like István Angyal, an Auschwitz survivor executed on December 1, 1958, were Jewish. Jewish writers and intellectuals such as Tibor Déry, imprisoned from 1957 to 1961, occupied the forefront of the reform movement. After the Hungarian Revolution of 1956, about 20,000 Jews fled the country. About 9,000 went to Israel, while others settled in the United States, Canada, Australia, Western Europe, and Latin America. An estimated 20% of the Hungarian refugees entering Canada in 1957 were Jewish. The Hungarian Jewish population declined because of emigration as well as high levels of assimilation and intermarriage and low birth rates. The Jews with the strongest Jewish identities were typically the ones who emigrated. By 1967, only about 80,000–90,000 Jews (including non-religious Jews) remained in the country, with the number dropping further before the country's Communist regime collapsed in 1989. Under the milder communist regime of János Kádár (ruled 1957–1988), the leftist Jewish intelligentsia remained an important and vocal part of Hungarian art and science. Diplomatic relations with Israel were severed in 1967 following the Six-Day War, but this was not followed by antisemitic campaigns like those in Poland or the Soviet Union.

From the 1990s

Hungary's Jewish population (within its current borders) numbered nearly half a million after World War I and declined continuously between 1920 and 2010, most sharply between 1939 and 1945 (World War II and the Holocaust), and again between 1951 and 1960 (the period of the Hungarian Revolution of 1956). Despite the decline, in 2010 Hungary had the largest Jewish population in Eastern Europe outside the former Soviet Union.[citation needed] In the aftermath of the Cold War and during its democratic transition, Hungary played a pivotal role in supporting the emigration of Soviet Jews. From 1989 to 1991, more than 160,000 Soviet Jewish refugees passed through Hungary en route to Israel.
This large-scale transit was made possible by a collaboration between the Jewish Agency and Malév Hungarian Airlines, which operated a makeshift air corridor from Moscow to Tel Aviv via Budapest. Hungary's political elites – both outgoing Communists and emerging democrats – supported this operation as part of the country's realignment with Western democratic norms and its renewed relationship with Israel. Despite facing terrorist threats and domestic political tensions, Hungary maintained its role as a key transit hub during this period. After the fall of communism, Hungary had two prime ministers of partial Jewish origin, József Antall (1990–1993) and Gyula Horn (1994–1998). In April 1997, the Hungarian parliament passed a Jewish compensation act that returned property stolen from Jewish victims during the Nazi and Communist eras. Under this law, property and monetary payments were given back to the Jewish public heritage foundation and to Jewish victims of the Holocaust. Critics have asserted that the sums represent nothing more than a symbolic gesture. According to Randolph L. Braham: "The overshadowing of the Holocaust by a politically guided preoccupation with the horrors of the Communist era has led, among other things, to giving priority to the compensation of the victims of Communism over those of Nazism. To add insult to injury, an indeterminate number of the Christian victims who were compensated for properties nationalized by the Communist regime had, in fact, 'legally' or fraudulently acquired them from Jews during the Nazi era. Compounding this virtual obscenity, the government of Viktor Orbán sought in late 1998 to ease the collective conscience of the nation by offering to compensate survivors by paying approximately $150 for each member of their particular immediate families, assuming that they can prove that their loved ones were in fact victims of the Holocaust."

References

This article incorporates text from this source, which is in the public domain: Büchler, Alexander (1904). "Hungary". In Singer, Isidore (ed.). The Jewish Encyclopedia. Vol. 6. New York and London: Funk and Wagnalls Co. pp. 494–503.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Minecraft#cite_ref-spike_–_pc_330-0] | [TOKENS: 12858]
Minecraft

Minecraft is a sandbox game developed and published by Mojang Studios. Following its initial public alpha release in 2009, it was formally released in 2011 for personal computers. The game has since been ported to numerous platforms, including mobile devices and various video game consoles. In Minecraft, players explore a procedurally generated world with virtually infinite terrain made up of voxels (cubes). They can discover and extract raw materials, craft tools and items, build structures, fight hostile mobs, and cooperate with or compete against other players in multiplayer. The game's large community offers a wide variety of user-generated content, such as modifications, servers, player skins, texture packs, and custom maps, which add new game mechanics and possibilities. Minecraft was originally created by Markus "Notch" Persson using the Java programming language; Jens "Jeb" Bergensten was handed control over the game's development following its full release. In 2014, Mojang and the Minecraft intellectual property were purchased by Microsoft for US$2.5 billion; Xbox Game Studios hold the publishing rights for the Bedrock Edition, the unified cross-platform version which evolved from the Pocket Edition codebase[i] and replaced the legacy console versions. Bedrock is updated concurrently with Mojang's original Java Edition, although with numerous, generally small, differences. Minecraft is the best-selling video game in history, with over 350 million copies sold. It has received critical acclaim, winning several awards and being cited as one of the greatest video games of all time. Social media, parodies, adaptations, merchandise, and the annual Minecon conventions have played prominent roles in popularizing it. The wider Minecraft franchise includes several spin-off games, such as Minecraft: Story Mode, Minecraft Dungeons, and Minecraft Legends. A film adaptation, titled A Minecraft Movie, was released in 2025 and became the second highest-grossing video game film of all time.

Gameplay

Minecraft is a 3D sandbox video game that has no required goals to accomplish, giving players a large amount of freedom in choosing how to play the game. The game features an optional achievement system. Gameplay is in the first-person perspective by default, but players have the option of third-person perspectives. The game world is composed of rough 3D objects—mainly cubes, referred to as blocks—representing various materials, such as dirt, stone, ores, tree trunks, water, and lava. The core gameplay revolves around picking up and placing these objects. These blocks are arranged in a voxel grid, while players can move freely around the world. Players can break, or mine, blocks and then place them elsewhere, enabling them to build things. Very few blocks are affected by gravity, instead maintaining their voxel position in the air. Players can also craft a wide variety of items, such as armor, which mitigates damage from attacks; weapons (such as swords or bows and arrows), which allow monsters and animals to be killed more easily; and tools (such as pickaxes or shovels), which break certain types of blocks more quickly. Some items have multiple tiers depending on the material used to craft them, with higher-tier items being more effective and durable. They may also freely craft helpful blocks—such as furnaces, which can cook food and smelt ores, and torches, which produce light—or exchange items with villagers (non-player characters) by trading emeralds for different goods and vice versa.
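The voxel grid described above is essentially a data structure: each position in a fixed-size region maps to a block type. The following Java fragment is a minimal illustrative sketch of such a structure, assuming a hypothetical 16×16×16 chunk; the names (Chunk, blockId) are invented for this example and do not reflect Mojang's actual internal code.

```java
// Minimal sketch of a voxel chunk, for illustration only.
// Class, field, and method names are hypothetical, not Minecraft's real internals.
public final class Chunk {
    public static final int SIZE = 16;          // blocks per side in this toy chunk
    private final short[] blocks = new short[SIZE * SIZE * SIZE]; // 0 = air

    private static int index(int x, int y, int z) {
        return (y * SIZE + z) * SIZE + x;       // flatten 3D coordinates into one array index
    }

    public short getBlock(int x, int y, int z) {
        return blocks[index(x, y, z)];
    }

    // "Mining" writes air (0); "placing" writes a non-zero block ID.
    public void setBlock(int x, int y, int z, short blockId) {
        blocks[index(x, y, z)] = blockId;
    }
}
```

Storing blocks in a flat array like this keeps neighbouring blocks close together in memory, which is one common reason voxel engines in general can make block lookups and edits cheap; whether Minecraft itself does exactly this is an assumption of the sketch, not a claim about the game's code.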
The game has an inventory system, allowing players to carry a limited number of items. The in-game time system follows a day and night cycle, with one full cycle lasting for 20 real-time minutes. The game also contains a material called redstone, which can be used to make primitive mechanical devices, electrical circuits, and logic gates, allowing for the construction of many complex systems. New players are given a randomly selected default character skin out of nine possibilities, including Steve and Alex, but are able to create and upload their own skins.

Players encounter various mobs (short for mobile entities), including animals, villagers, and hostile creatures. Passive mobs, such as cows, pigs, and chickens, spawn during the daytime and can be hunted for food and crafting materials, while hostile mobs—including large spiders, witches, skeletons, and zombies—spawn during nighttime or in dark places such as caves. Some hostile mobs, such as zombies and skeletons, burn under the sun if they have no headgear and are not standing in water. Other creatures unique to Minecraft include the creeper (an exploding creature that sneaks up on the player) and the enderman (a creature with the ability to teleport as well as pick up and place blocks). There are also variants of mobs that spawn in different conditions; for example, zombies have husk and drowned variants that spawn in deserts and oceans, respectively.

The Minecraft environment is procedurally generated as players explore it, using a map seed that is randomly chosen at the time of world creation (or manually specified by the player). Divided into biomes representing different environments with unique resources and structures, worlds are designed to be effectively infinite in traditional gameplay, though technical limits on the player have existed throughout development, both intentional and not. Implementation of horizontally infinite generation initially resulted in a glitch termed the "Far Lands" at over 12 million blocks away from the world center, where terrain generated as wall-like, fissured patterns. The Far Lands and associated glitches were considered the effective edge of the world until they were resolved, with the current horizontal limit instead being a special impassable barrier called the world border, located 30 million blocks away. Vertical space is comparatively limited, with an unbreakable bedrock layer at the bottom and a building limit several hundred blocks into the sky.

Minecraft features three independent dimensions accessible through portals and providing alternate game environments. The Overworld is the starting dimension and represents the real world, with a terrestrial surface setting including plains, mountains, forests, oceans, caves, and small sources of lava. The Nether is a hell-like underworld dimension accessed via an obsidian portal and composed mainly of lava. Mobs that populate the Nether include shrieking, fireball-shooting ghasts, alongside anthropomorphic pigs called piglins and their zombified counterparts. Piglins in particular have a bartering system, where players can give them gold ingots and receive items in return. Structures known as Nether Fortresses generate in the Nether, containing mobs such as wither skeletons and blazes, which can drop blaze rods needed to access the End dimension. The player can also choose to build an optional boss mob known as the Wither, using skulls obtained from wither skeletons and soul sand.
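Because the world is generated from a single map seed, the same seed reproduces the same terrain no matter in what order chunks are explored. The toy Java generator below illustrates only that deterministic-seeding idea; the per-column hashing, constants, and class names are assumptions made for this sketch and are not the game's actual world-generation algorithm, which layers many noise functions and biome rules.

```java
import java.util.Random;

// Toy illustration of seed-based, deterministic terrain generation.
// Not Minecraft's real algorithm; names and constants are invented for this example.
public final class ToyWorldGen {
    private final long worldSeed;

    public ToyWorldGen(long worldSeed) {
        this.worldSeed = worldSeed;
    }

    // Derive a per-column seed so a column's height depends only on the
    // world seed and its coordinates, never on the order of exploration.
    public int surfaceHeight(int x, int z) {
        long columnSeed = worldSeed ^ (x * 341873128712L + z * 132897987541L);
        Random random = new Random(columnSeed);
        return 60 + random.nextInt(8); // base height of 60 blocks plus small variation
    }

    public static void main(String[] args) {
        ToyWorldGen gen = new ToyWorldGen(42L);
        // The same seed and coordinates always yield the same height.
        System.out.println(gen.surfaceHeight(10, -3));
    }
}
```

Running the example twice prints the same height both times, which is the property that lets players share a seed and generate identical worlds.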
The End can be reached through an end portal, which consists of twelve end portal frames. End portals are found in underground structures in the Overworld known as strongholds. To find strongholds, players must craft eyes of ender using an ender pearl and blaze powder. Eyes of ender can then be thrown, traveling in the direction of the stronghold. Once the player reaches the stronghold, they can place eyes of ender into each portal frame to activate the end portal. The dimension consists of islands floating in a dark, bottomless void. A boss enemy called the Ender Dragon guards the largest, central island. Killing the dragon opens access to an exit portal which, when entered, cues the game's ending credits and the End Poem, a roughly 1,500-word work written by Irish novelist Julian Gough. The poem takes about nine minutes to scroll past and is the game's only narrative text and the only text of significant length directed at the player. At the conclusion of the credits, the player is teleported back to their respawn point and may continue the game indefinitely.

In Survival mode, players have to gather natural resources such as wood and stone found in the environment in order to craft certain blocks and items. Depending on the difficulty, monsters spawn in darker areas outside a certain radius of the character, requiring players to build a shelter in order to survive at night. The mode also has a health bar which is depleted by attacks from mobs, falls, drowning, falling into lava, suffocation, starvation, and other events. Players also have a hunger bar, which must be periodically refilled by eating food in-game unless the player is playing on peaceful difficulty. If the hunger bar is empty, the player starves. Health replenishes when players have a full hunger bar, or continuously on peaceful difficulty. Upon losing all health, players die. The items in the players' inventories are dropped unless the game is reconfigured not to do so. Players then re-spawn at their spawn point, which by default is where players first spawn in the game and can be changed by sleeping in a bed or using a respawn anchor. Dropped items can be recovered if players can reach them before they despawn five minutes later. Players may acquire experience points (commonly referred to as "xp" or "exp") by killing mobs and other players, mining, smelting ores, animal breeding, and cooking food. Experience can then be spent on enchanting tools, armor and weapons. Enchanted items are generally more powerful, last longer, or have other special effects.

The game features two more game modes based on Survival, known as Hardcore mode and Adventure mode. Hardcore mode plays identically to Survival mode, but with the game's difficulty setting locked to "Hard" and with permadeath, forcing players to delete the world or explore it as a spectator after dying. Adventure mode was added to the game in a post-launch update and prevents the player from directly modifying the game's world. It was designed primarily for use in custom maps, allowing map designers to let players experience them as intended. In Creative mode, players have access to an infinite number of all resources and items in the game through the inventory menu and can place or mine them instantly. Players can toggle the ability to fly freely around the game world at will, and their characters usually do not take any damage and are not affected by hunger. The game mode helps players focus on building and creating projects of any size without disturbance.
Multiplayer in Minecraft enables multiple players to interact and communicate with each other on a single world. It is available through direct game-to-game multiplayer, local area network (LAN) play, local split screen (console-only), and servers (player-hosted and business-hosted). Players can run their own server by making a realm, using a host provider, or hosting one themselves, or they can connect directly to another player's game via Xbox Live, PlayStation Network or Nintendo Switch Online. Single-player worlds have LAN support, allowing players to join a world on locally interconnected computers without a server setup. Minecraft multiplayer servers are guided by server operators, who have access to server commands such as setting the time of day and teleporting players. Operators can also set up restrictions concerning which usernames or IP addresses are allowed or disallowed to enter the server. Multiplayer servers have a wide range of activities, with some servers having their own unique rules and customs. The largest and most popular server is Hypixel, which has been visited by over 14 million unique players. Player versus player combat (PvP) can be enabled to allow fighting between players.

In 2013, Mojang announced Minecraft Realms, a server hosting service intended to enable players to run server multiplayer games easily and safely without having to set up their own. Unlike a standard server, only invited players can join Realms servers, and these servers do not use server addresses. Minecraft: Java Edition Realms server owners can invite up to twenty people to play on their server, with up to ten players online at a time. Bedrock Edition Realms server owners can invite up to 3,000 people to play on their server, with up to ten players online at one time. The Minecraft: Java Edition Realms servers do not support user-made plugins, but players can play custom Minecraft maps. Minecraft Bedrock Realms servers support user-made add-ons, resource packs, behavior packs, and custom Minecraft maps. At Electronic Entertainment Expo 2016, support for cross-platform play between Windows 10, iOS, and Android platforms was added through Realms starting in June 2016, with Xbox One and Nintendo Switch support to come later in 2017, along with support for virtual reality devices. On 31 July 2017, Mojang released the beta version of the update allowing cross-platform play. Nintendo Switch support for Realms was released in July 2018.

The modding community consists of fans, users and third-party programmers. Using a variety of application programming interfaces that have arisen over time, they have produced a wide variety of downloadable content for Minecraft, such as modifications, texture packs and custom maps. Modifications of the Minecraft code, called mods, add a variety of gameplay changes, ranging from new blocks, items, and mobs to entire arrays of mechanisms. The modding community is responsible for a substantial supply of mods, ranging from ones that enhance gameplay, such as mini-maps, waypoints, and durability counters, to ones that add elements from other video games and media to the game. While a variety of mod frameworks were independently developed by reverse engineering the code, Mojang has also enhanced vanilla Minecraft with official frameworks for modification, allowing the production of community-created resource packs, which alter certain game elements including textures and sounds.
Players can also create their own "maps" (custom world save files) that often contain specific rules, challenges, puzzles and quests, and share them for others to play. Mojang added an adventure mode in August 2012 and "command blocks" in October 2012, which were created specially for custom maps in Java Edition. Data packs, introduced in version 1.13 of the Java Edition, allow further customization, including the ability to add new achievements, dimensions, functions, loot tables, predicates, recipes, structures, tags, and world generation. The Xbox 360 Edition supported downloadable content, which was available to purchase via the Xbox Games Store; these content packs usually contained additional character skins. It later received support for texture packs in its twelfth title update while introducing "mash-up packs", which combined texture packs with skin packs and changes to the game's sounds, music and user interface. The first mash-up pack (and by extension, the first texture pack) for the Xbox 360 Edition was released on 4 September 2013 and was themed after the Mass Effect franchise. Unlike Java Edition, however, the Xbox 360 Edition did not support player-made mods or custom maps. A cross-promotional resource pack based on the Super Mario franchise by Nintendo was released exclusively for the Wii U Edition worldwide on 17 May 2016, and later bundled free with the Nintendo Switch Edition at launch. Another, based on Fallout, was released on consoles that December, and for Windows and Mobile in April 2017. In April 2018, malware was discovered in several downloadable user-made Minecraft skins for use with the Java Edition of the game. Avast stated that nearly 50,000 accounts were infected, and that when activated, the malware would attempt to reformat the user's hard drive. Mojang promptly patched the issue and released a statement saying that "the code would not be run or read by the game itself" and would run only when the image containing the skin was opened.

In June 2017, Mojang released the "1.1 Discovery Update" to the Pocket Edition of the game, which later became the Bedrock Edition. The update introduced the "Marketplace", a catalogue of purchasable user-generated content intended to give Minecraft creators "another way to make a living from the game". Various skins, maps, texture packs and add-ons from different creators can be bought with "Minecoins", a digital currency that is purchased with real money. Additionally, users can access specific content with a subscription service titled "Marketplace Pass". Alongside content from independent creators, the Marketplace also houses items published by Mojang and Microsoft themselves, as well as official collaborations between Minecraft and other intellectual properties. By 2022, the Marketplace had over 1.7 billion content downloads, generating over $500 million in revenue.

Development

Before creating Minecraft, Markus "Notch" Persson was a game developer at King, where he worked until March 2009. At King, he primarily developed browser games and learned several programming languages. During his free time, he prototyped his own games, often drawing inspiration from other titles, and was an active participant on the TIGSource forums for independent developers. One such project was "RubyDung", a base-building game inspired by Dwarf Fortress, but with an isometric, three-dimensional perspective similar to RollerCoaster Tycoon.
Among the features in RubyDung that he explored was a first-person view similar to Dungeon Keeper, though he ultimately discarded this idea, feeling the graphics were too pixelated at the time. Around March 2009, Persson left King and joined jAlbum, while continuing to work on his prototypes. Infiniminer, a block-based open-ended mining game first released in April 2009, inspired Persson's vision for RubyDung's future direction. Infiniminer heavily influenced the game's visual style, including bringing back the first-person mode, the "blocky" look and the block-building fundamentals. However, unlike Infiniminer, Persson wanted Minecraft to have RPG elements. The first public alpha build of Minecraft was released on 17 May 2009 on TIGSource. Over the years, Persson regularly released test builds that added new features, including tools, mobs, and entire new dimensions. Partly due to the game's rising popularity, Persson released the full 1.0 version—the second part of the "Adventure Update"—on 18 November 2011. Shortly after, Persson stepped down from development, handing the project's lead to Jens "Jeb" Bergensten.

On 15 September 2014, Microsoft, the developer behind the Microsoft Windows operating system and Xbox video game console, announced a $2.5 billion acquisition of Mojang, which included the Minecraft intellectual property. Persson had suggested the deal on Twitter, asking a corporation to buy his stake in the game after receiving criticism for enforcing terms in the game's end-user license agreement (EULA), which had been in place for the previous three years. According to Persson, Mojang CEO Carl Manneh received a call from a Microsoft executive shortly after the tweet, asking if Persson was serious about a deal. Mojang was also approached by other companies, including Activision Blizzard and Electronic Arts. The deal with Microsoft was finalized on 6 November 2014 and led to Persson becoming one of Forbes' "World's Billionaires". After 2014, Minecraft's primary versions received major updates, usually annually—free to players who had purchased the game—each primarily centered on a specific theme. For instance, version 1.13, the Update Aquatic, focused on ocean-related features, while version 1.16, the Nether Update, introduced significant changes to the Nether dimension. However, in late 2024, Mojang announced a shift in their update strategy; rather than releasing large updates annually, they opted for a more frequent release schedule with smaller, incremental updates, stating, "We know that you want new Minecraft content more often." The Bedrock Edition has also received regular updates, now matching the themes of the Java Edition updates. Other versions of the game, such as various console editions and the Pocket Edition, were either merged into Bedrock or discontinued and have not received further updates.

On 7 May 2019, coinciding with Minecraft's 10th anniversary, a JavaScript recreation of an old 2009 Java Edition build named Minecraft Classic was made available to play online for free. On 16 April 2020, a Bedrock Edition-exclusive beta version of Minecraft, called Minecraft RTX, was released by Nvidia. It introduced physically based rendering, real-time path tracing, and DLSS for RTX-enabled GPUs. The public release was made available on 8 December 2020.
Path tracing can only be enabled in supported worlds, which can be downloaded for free via the in-game Minecraft Marketplace, with a texture pack from Nvidia's website, or with compatible third-party texture packs. It cannot be enabled by default with any texture pack on any world. Initially, Minecraft RTX was affected by many bugs, display errors, and instability issues. On 22 March 2025, a new visual mode called Vibrant Visuals, an optional graphical overhaul similar to Minecraft RTX, was announced. It promises modern rendering features—such as dynamic shadows, screen space reflections, volumetric fog, and bloom—without the need for RTX-capable hardware. Vibrant Visuals was released as a part of the Chase the Skies update on 17 June 2025 for Bedrock Edition and is planned to release on Java Edition at a later date.

Development of the original edition of Minecraft—then known as Cave Game, and now known as the Java Edition—began in May 2009;[k] on 13 May, Persson released a test video on YouTube of an early version of the game, dubbed the "Cave game tech test" or the "Cave game tech demo". The game was named Minecraft: Order of the Stone the next day, after a suggestion made by a player. "Order of the Stone" came from the webcomic The Order of the Stick, and "Minecraft" was chosen "because it's a good name". The title was later shortened to just Minecraft, omitting the subtitle. Persson completed the game's base programming over a weekend in May 2009, and private testing began on TigIRC on 16 May. The first public release followed on 17 May 2009 as a developmental version shared on the TIGSource forums. Based on feedback from forum users, Persson continued updating the game. This initial public build later became known as Classic. Further developmental phases—dubbed Survival Test, Indev, and Infdev—were released throughout 2009 and 2010. The first major update, known as Alpha, was released on 30 June 2010. At the time, Persson was still working a day job at jAlbum but later resigned to focus on Minecraft full-time as sales of the alpha version surged. Updates were distributed automatically, introducing new blocks, items, mobs, and changes to game mechanics such as water flow. With revenue generated from the game, Persson founded Mojang, a video game studio, alongside former colleagues Jakob Porser and Carl Manneh.

On 11 December 2010, Persson announced that Minecraft would enter its beta phase on 20 December. He assured players that bug fixes and all pre-release updates would remain free. As development progressed, Mojang expanded, hiring additional employees to work on the project. The game officially exited beta and launched in full on 18 November 2011. On 1 December 2011, Jens "Jeb" Bergensten took full creative control over Minecraft, replacing Persson as lead designer. On 28 February 2012, Mojang announced the hiring of the developers behind Bukkit, a popular developer API for Minecraft servers, to improve Minecraft's support of server modifications. This move included Mojang taking apparent ownership of the CraftBukkit server mod, though the acquisition later became controversial and its legitimacy was questioned due to CraftBukkit's open-source nature and licensing under the GNU General Public License and Lesser General Public License.

In August 2011, Minecraft: Pocket Edition was released as an early alpha for the Xperia Play via the Android Market, later expanding to other Android devices on 8 October 2011. The iOS version followed on 17 November 2011.
A port was made available for Windows Phones shortly after Microsoft acquired Mojang. Unlike Java Edition, Pocket Edition initially focused on Minecraft's creative building and basic survival elements but lacked many features of the PC version. Bergensten confirmed on Twitter that the Pocket Edition was written in C++ rather than Java, as iOS does not support Java. On 10 December 2014, a port of Pocket Edition was released for Windows Phone 8.1. In July 2015, a port of the Pocket Edition to Windows 10 was released as the Windows 10 Edition, with full cross-play with other Pocket versions. In January 2017, Microsoft announced that it would no longer maintain the Windows Phone versions of Pocket Edition. On 20 September 2017, with the "Better Together Update", the Pocket Edition was ported to the Xbox One and renamed the Bedrock Edition.

The console versions of Minecraft debuted with the Xbox 360 edition, developed by 4J Studios and released on 9 May 2012. Announced as part of the Xbox Live Arcade NEXT promotion, this version introduced a redesigned crafting system, a new control interface, in-game tutorials, split-screen multiplayer, and online play via Xbox Live. Unlike the PC version, its worlds were finite, bordered by invisible walls. Initially, the Xbox 360 version resembled outdated PC versions but received updates to bring it closer to Java Edition before eventually being discontinued. The Xbox One version launched on 5 September 2014, featuring larger worlds and support for more players. Minecraft expanded to PlayStation platforms with PlayStation 3 and PlayStation 4 editions released on 17 December 2013 and 4 September 2014, respectively. Originally planned as a PS4 launch title, it was delayed before its eventual release. A PlayStation Vita version followed in October 2014. Like the Xbox versions, the PlayStation editions were developed by 4J Studios. Nintendo platforms received Minecraft: Wii U Edition on 17 December 2015, with a physical release in North America on 17 June 2016 and in Europe on 30 June. The Nintendo Switch version launched via the eShop on 11 May 2017. During a Nintendo Direct presentation on 13 September 2017, Nintendo announced that Minecraft: New Nintendo 3DS Edition, based on the Pocket Edition, would be available for download immediately after the livestream, with a physical copy available at a later date. The game is compatible only with the New Nintendo 3DS or New Nintendo 2DS XL systems and does not work with the original 3DS or 2DS systems.

On 20 September 2017, the Better Together Update introduced Bedrock Edition across Xbox One, Windows 10, VR, and mobile platforms, enabling cross-play between these versions. Bedrock Edition later expanded to Nintendo Switch and PlayStation 4, with the latter receiving the update in December 2019, allowing cross-platform play for users with a free Xbox Live account. A native Bedrock Edition version for PlayStation 5 was released on 22 October 2024, while the Xbox Series X/S version launched on 17 June 2025. On 18 December 2018, the PlayStation 3, PlayStation Vita, Xbox 360, and Wii U versions of Minecraft received their final update and would later become known as "Legacy Console Editions". On 15 January 2019, the New Nintendo 3DS version of Minecraft received its final update, effectively becoming discontinued as well.

An educational version of Minecraft, designed for use in schools, launched on 1 November 2016. It is available on Android, ChromeOS, iPadOS, iOS, MacOS, and Windows.
On 20 August 2018, Mojang announced that it would bring Education Edition to iPad in autumn 2018. It was released to the App Store on 6 September 2018. On 27 March 2019, it was announced that it would be operated by JD.com in China. On 26 June 2020, a public beta for the Education Edition was made available to Chromebooks compatible with the Google Play Store. The full game was released to the Google Play Store for Chromebooks on 7 August 2020.

On 20 May 2016, China Edition (also known as My World) was announced as a localized edition for China, where it was released under a licensing agreement between NetEase and Mojang. The PC edition was released for public testing on 8 August 2017. The iOS version was released on 15 September 2017, and the Android version was released on 12 October 2017. The PC edition is based on the original Java Edition, while the iOS and Android mobile versions are based on the Bedrock Edition. The edition is free-to-play and had over 700 million registered accounts by September 2023.

A dedicated Windows version of Bedrock Edition is exclusive to Microsoft's Windows 10 and Windows 11 operating systems. The beta release for Windows 10 launched on the Windows Store on 29 July 2015. After nearly a year and a half in beta, Microsoft fully released the version on 19 December 2016. Called the "Ender Update", this release added new features to this version of Minecraft, such as world templates and add-on packs. On 7 June 2022, the Java and Bedrock Editions of Minecraft were merged into a single bundle for purchase on Windows; those who owned one version would automatically gain access to the other version. Both game versions would otherwise remain separate.

Around 2011, prior to Minecraft's full release, Mojang collaborated with The Lego Group to create a Lego brick-based Minecraft game called Brickcraft. This would have modified the base Minecraft game to use Lego bricks, which meant adapting the basic 1×1 block to account for larger pieces typically used in Lego sets. Persson worked on an early version called "Project Rex Kwon Do", named after a character from the film Napoleon Dynamite. Although Lego approved the project and Mojang assigned two developers for six months, it was canceled due to the Lego Group's demands, according to Mojang's Daniel Kaplan. Lego considered buying Mojang to complete the game, but when Microsoft offered over $2 billion for the company, Lego stepped back, unsure of Minecraft's potential. On 26 June 2025, a build of Brickcraft dated 28 June 2012 was published on the community archive website Omniarchive.

Initially, Markus Persson planned to support the Oculus Rift with a Minecraft port. However, after Facebook acquired Oculus in 2014, he abruptly canceled the plans, stating, "Facebook creeps me out." In 2016, a community-made mod, Minecraft VR, added VR support for Java Edition, followed by Vivecraft for HTC Vive. Later that year, Microsoft introduced official Oculus Rift support for Windows 10 Edition, leading to the discontinuation of the Minecraft VR mod due to trademark complaints. Vivecraft was endorsed by Minecraft VR's contributors for its Rift support. Also available is a Gear VR version, titled Minecraft: Gear VR Edition. Windows Mixed Reality support was added in 2017. On 7 September 2020, Mojang Studios announced that the PlayStation 4 Bedrock version would receive PlayStation VR support later that month.
In September 2024, the Minecraft team announced they would no longer support PlayStation VR, which received its final update in March 2025. Music and sound design Minecraft's music and sound effects were produced by German musician Daniel Rosenfeld, better known as C418. To create the sound effects for the game, Rosenfeld made extensive use of Foley techniques. On learning the processes for the game, he remarked, "Foley's an interesting thing, and I had to learn its subtleties. Early on, I wasn't that knowledgeable about it. It's a whole trial-and-error process. You just make a sound and eventually you go, 'Oh my God, that's it! Get the microphone!' There's no set way of doing anything at all." He reminisced on creating the in-game sound for grass blocks, stating "It turns out that to make grass sounds you don't actually walk on grass and record it, because grass sounds like nothing. What you want to do is get a VHS, break it apart, and just lightly touch the tape." According to Rosenfeld, his favorite sound to design for the game was the hisses of spiders. He elaborates, "I like the spiders. Recording that was a whole day of me researching what a spider sounds like. Turns out, there are spiders that make little screeching sounds, so I think I got this recording of a fire hose, put it in a sampler, and just pitched it around until it sounded like a weird spider was talking to you." Many of the sound design decisions by Rosenfeld were done accidentally or spontaneously. The creeper notably lacks any specific noises apart from a loud fuse-like sound when about to explode; Rosenfeld later recalled "That was just a complete accident by Markus and me [sic]. We just put in a placeholder sound of burning a matchstick. It seemed to work hilariously well, so we kept it." On other sounds, such as those of the zombie, Rosenfeld remarked, "I actually never wanted the zombies so scary. I intentionally made them sound comical. It's nice to hear that they work so well [...]." Rosenfeld remarked that the sound engine was "terrible" to work with, remembering "If you had two song files at once, it [the game engine] would actually crash. There were so many more weird glitches like that the guys never really fixed because they were too busy with the actual game and not the sound engine." The background music in Minecraft consists of instrumental ambient music. To compose the music of Minecraft, Rosenfeld used the package from Ableton Live, along with several additional plug-ins. Speaking on them, Rosenfeld said "They can be pretty much everything from an effect to an entire orchestra. Additionally, I've got some synthesizers that are attached to the computer. Like a Moog Voyager, Dave Smith Prophet 08 and a Virus TI." On 4 March 2011, Rosenfeld released a soundtrack titled Minecraft – Volume Alpha; it includes most of the tracks featured in Minecraft, as well as other music not featured in the game. Kirk Hamilton of Kotaku chose the music in Minecraft as one of the best video game soundtracks of 2011. On 9 November 2013, Rosenfeld released the second official soundtrack, titled Minecraft – Volume Beta, which included the music that was added in a 2013 "Music Update" for the game. A physical release of Volume Alpha, consisting of CDs, black vinyl, and limited-edition transparent green vinyl LPs, was issued by indie electronic label Ghostly International on 21 August 2015. 
On 14 August 2020, Ghostly released Volume Beta on CD and vinyl, with alternate color LPs and lenticular cover pressings released in limited quantities. The final update Rosenfeld worked on was 2018's 1.13 Update Aquatic. His music remained the only music in the game until 2020's "Nether Update", which introduced pieces from Lena Raine. Since then, other composers have made contributions, including Kumi Tanioka, Samuel Åberg, Aaron Cherof, and Amos Roddy, with Raine serving as the new primary composer. Ownership of all music besides Rosenfeld's independently released albums has been retained by Microsoft, with their label publishing all of the other artists' releases. Gareth Coker also composed some of the music for the game's minigames in the Legacy Console editions. Rosenfeld had stated his intent to create a third album of music for the game in a 2015 interview with Fact, and confirmed its existence in a 2017 tweet, stating that his work on the record had by then grown longer than the previous two albums combined, which together clock in at over 3 hours and 18 minutes. However, due to licensing issues with Microsoft, the third volume has not been released. On 8 January 2021, Rosenfeld was asked in an interview with Anthony Fantano whether there was still a third volume of his music intended for release. Rosenfeld responded, saying, "I have something—I consider it finished—but things have become complicated, especially as Minecraft is now a big property, so I don't know."

Reception

Minecraft has received critical acclaim, with praise for the creative freedom it grants players in-game, as well as the ease with which it enables emergent gameplay. Critics have expressed enjoyment of Minecraft's complex crafting system, commenting that it is an important aspect of the game's open-ended gameplay. Most publications were impressed by the game's "blocky" graphics, with IGN describing them as "instantly memorable". Reviewers also liked the game's adventure elements, noting that the game creates a good balance between exploring and building. The game's multiplayer feature has generally been received favorably, with IGN commenting that "adventuring is always better with friends". Jaz McDougall of PC Gamer said Minecraft is "intuitively interesting and contagiously fun, with an unparalleled scope for creativity and memorable experiences". It has been regarded as having introduced millions of children to the digital world, insofar as its basic game mechanics are logically analogous to computer commands. IGN was disappointed by the troublesome steps needed to set up multiplayer servers, calling the process a "hassle". Critics also said that visual glitches occur periodically. Despite its release out of beta in 2011, GameSpot said the game had an "unfinished feel", adding that some game elements seem "incomplete or thrown together in haste". A review of the alpha version, by Scott Munro of the Daily Record, called it "already something special" and urged readers to buy it. Jim Rossignol of Rock Paper Shotgun also recommended the alpha of the game, calling it "a kind of generative 8-bit Lego Stalker". On 17 September 2010, the gaming webcomic Penny Arcade began a series of comics and news posts about the addictiveness of the game.

The Xbox 360 version was generally received positively by critics, but did not receive as much praise as the PC version.
Although reviewers were disappointed by the lack of features such as mod support and content from the PC version, they acclaimed the port's addition of a tutorial, in-game tips, and crafting recipes, saying that these make the game more user-friendly. The Xbox One Edition was one of the best-received ports, being praised for its relatively large worlds. The PlayStation 3 Edition also received generally favorable reviews, being compared to the Xbox 360 Edition and praised for its well-adapted controls. The PlayStation 4 edition was the best-received port to date, being praised for worlds 36 times larger than those of the PlayStation 3 edition and described as nearly identical to the Xbox One edition. The PlayStation Vita Edition received generally positive reviews from critics but was noted for its technical limitations. The Wii U version received generally positive reviews from critics but was noted for a lack of GamePad integration. The 3DS version received mixed reviews, being criticized for its high price, technical issues, and lack of cross-platform play. The Nintendo Switch Edition received fairly positive reviews from critics, being praised, like other modern ports, for its relatively larger worlds.

Minecraft: Pocket Edition initially received mixed reviews from critics. Although reviewers appreciated the game's intuitive controls, they were disappointed by the lack of content. The inability to collect resources and craft items, as well as the limited types of blocks and lack of hostile mobs, were especially criticized. After updates added more content, Pocket Edition started receiving more positive reviews. Reviewers complimented the controls and the graphics, but still noted a lack of content.

Minecraft surpassed a million purchases less than a month after entering its beta phase in early 2011. At the time, the game had no publisher backing and had never been commercially advertised except through word of mouth and various unpaid references in popular media such as the Penny Arcade webcomic. By April 2011, Persson estimated that Minecraft had made €23 million (US$33 million) in revenue, with 800,000 sales of the alpha version of the game and over 1 million sales of the beta version. In November 2011, prior to the game's full release, Minecraft beta surpassed 16 million registered users and 4 million purchases. By March 2012, Minecraft had become the 6th best-selling PC game of all time. As of 10 October 2014[update], the game had sold 17 million copies on PC, becoming the best-selling PC game of all time. On 25 February 2014, the game reached 100 million registered users. By May 2019, 180 million copies had been sold across all platforms, making it the single best-selling video game of all time. The free-to-play Minecraft China version had over 700 million registered accounts by September 2023. By 2023, the game had sold over 300 million copies. As of April 2025, Minecraft has sold over 350 million copies.

The Xbox 360 version of Minecraft became profitable within the first day of the game's release in 2012, when the game broke Xbox Live sales records with 400,000 players online. Within a week of being on the Xbox Live Marketplace, Minecraft sold a million copies. GameSpot announced in December 2012 that Minecraft had sold over 4.48 million copies since the game debuted on Xbox Live Arcade in May 2012. In 2012, Minecraft was the most purchased title on Xbox Live Arcade; it was also the fourth most played title on Xbox Live based on average unique users per day.
As of 4 April 2014[update], the Xbox 360 version has sold 12 million copies. In addition, Minecraft: Pocket Edition has reached a figure of 21 million in sales. The PlayStation 3 Edition sold one million copies in five weeks. The release of the game's PlayStation Vita version boosted Minecraft sales by 79%, outselling both PS3 and PS4 debut releases and becoming the largest Minecraft launch on a PlayStation console. The PS Vita version sold 100,000 digital copies in Japan within the first two months of release, according to an announcement by SCE Japan Asia. By January 2015, 500,000 digital copies of Minecraft were sold in Japan across all PlayStation platforms, with a surge in primary school children purchasing the PS Vita version. As of 2022, the Vita version has sold over 1.65 million physical copies in Japan, making it the best-selling Vita game in the country. Minecraft helped improve Microsoft's total first-party revenue by $63 million for the 2015 second quarter. The game, including all of its versions, had over 112 million monthly active players by September 2019. On its 11th anniversary in May 2020, the company announced that Minecraft had reached over 200 million copies sold across platforms with over 126 million monthly active players. By April 2021, the number of active monthly users had climbed to 140 million. In July 2010, PC Gamer listed Minecraft as the fourth-best game to play at work. In December of that year, Good Game selected Minecraft as their choice for Best Downloadable Game of 2010, Gamasutra named it the eighth best game of the year as well as the eighth best indie game of the year, and Rock, Paper, Shotgun named it the "game of the year". Indie DB awarded the game the 2010 Indie of the Year award as chosen by voters, in addition to two out of five Editor's Choice awards for Most Innovative and Best Singleplayer Indie. It was also awarded Game of the Year by PC Gamer UK. The game was nominated for the Seumas McNally Grand Prize, Technical Excellence, and Excellence in Design awards at the March 2011 Independent Games Festival and won the Grand Prize and the community-voted Audience Award. At Game Developers Choice Awards 2011, Minecraft won awards in the categories for Best Debut Game, Best Downloadable Game and Innovation Award, winning every award for which it was nominated. It also won GameCity's video game arts award. On 5 May 2011, Minecraft was selected as one of the 80 games that would be displayed at the Smithsonian American Art Museum as part of The Art of Video Games exhibit that opened on 16 March 2012. At the 2011 Spike Video Game Awards, Minecraft won the award for Best Independent Game and was nominated in the Best PC Game category. In 2012, at the British Academy Video Games Awards, Minecraft was nominated in the GAME Award of 2011 category and Persson received The Special Award. In 2012, Minecraft XBLA was awarded a Golden Joystick Award in the Best Downloadable Game category, and a TIGA Games Industry Award in the Best Arcade Game category. In 2013, it was nominated as the family game of the year at the British Academy Video Games Awards. During the 16th Annual D.I.C.E. Awards, the Academy of Interactive Arts & Sciences nominated the Xbox 360 version of Minecraft for "Strategy/Simulation Game of the Year". Minecraft Console Edition won the award for TIGA Game Of The Year in 2014. In 2015, the game placed 6th on USgamer's The 15 Best Games Since 2000 list. In 2016, Minecraft placed 6th on Time's The 50 Best Video Games of All Time list. 
Minecraft was nominated for the 2013 Kids' Choice Awards for Favorite App, but lost to Temple Run. It was nominated for the 2014 Kids' Choice Awards for Favorite Video Game, but lost to Just Dance 2014. The game later won the award for the Most Addicting Game at the 2015 Kids' Choice Awards. In addition, the Java Edition was nominated for "Favorite Video Game" at the 2018 Kids' Choice Awards, while the game itself won the "Still Playing" award at the 2019 Golden Joystick Awards, as well as the "Favorite Video Game" award at the 2020 Kids' Choice Awards. Minecraft also won "Stream Game of the Year" at the inaugural Streamer Awards in 2021. The game later garnered a Nickelodeon Kids' Choice Award nomination for Favorite Video Game in 2021, and won the same category in 2022 and 2023. At the Golden Joystick Awards 2025, it won the Still Playing Award for PC and Console.

Minecraft has been subject to several notable controversies. In June 2014, Mojang announced that it would begin enforcing the portion of Minecraft's end-user license agreement (EULA) which prohibits servers from giving in-game advantages to players in exchange for donations or payments. Spokesperson Owen Hill stated that servers could still require players to pay a fee to access the server and could sell in-game cosmetic items. The change was supported by Persson, citing emails he received from parents of children who had spent hundreds of dollars on servers. The Minecraft community and server owners protested, arguing that the EULA's terms were broader than Mojang was claiming, that the crackdown would force smaller servers to shut down for financial reasons, and that Mojang was suppressing competition for its own Minecraft Realms subscription service. The controversy contributed to Notch's decision to sell Mojang.

In 2020, Mojang announced an eventual change to the Java Edition to require a login from a Microsoft account rather than a Mojang account, the latter of which would be sunsetted. This also required Java Edition players to create Xbox network Gamertags. Mojang defended the move to Microsoft accounts by saying that improved security could be offered, including two-factor authentication, blocking cyberbullies in chat, and improved parental controls. The community responded with intense backlash, citing various technical difficulties encountered in the process and the fact that account migration would be mandatory, even for those who do not play on servers. As of 10 March 2022, Microsoft required that all players migrate in order to maintain access to the Java Edition of Minecraft. Mojang announced a deadline of 19 September 2023 for account migration, after which all legacy Mojang accounts became inaccessible and unable to be migrated.

In June 2022, Mojang added a player-reporting feature in Java Edition. Players could report other players on multiplayer servers for sending messages prohibited by the Xbox Live Code of Conduct; report categories included profane language,[l] substance abuse, hate speech, threats of violence, and nudity. If a player was found to be in violation of Xbox Community Standards, they would be banned from all servers for a specific period of time or permanently. The update containing the report feature (1.19.1) was released on 27 July 2022. Mojang received substantial backlash and protest from community members, one of the most common complaints being that banned players would be forbidden from joining any server, even private ones.
Others took issue with what they saw as Microsoft increasing control over its player base and exercising censorship, leading some to start the hashtag #saveminecraft and dub the version "1.19.84", a reference to the dystopian novel Nineteen Eighty-Four.

The "Mob Vote" was an online event organized by Mojang in which the Minecraft community voted between three original mob concepts; initially, the winning mob was to be implemented in a future update, while the losing mobs were scrapped, though after the first mob vote this was changed, and losing mobs would now have a chance to come to the game in the future. The first Mob Vote was held during Minecon Earth 2017 and became an annual event starting with Minecraft Live 2020. The Mob Vote was often criticized for forcing players to choose one mob instead of implementing all three, causing divisions and flaming within the community, and potentially allowing internet bots and Minecraft content creators with large fanbases to conduct vote brigading. The Mob Vote was also blamed for a perceived lack of new content added to Minecraft since Microsoft's acquisition of Mojang in 2014. The 2023 Mob Vote featured three passive mobs—the crab, the penguin, and the armadillo—with voting scheduled to start on 13 October. In response, a Change.org petition was created on 6 October, demanding that Mojang eliminate the Mob Vote and instead implement all three mobs going forward. The petition received approximately 445,000 signatures by 13 October and was joined by calls to boycott the Mob Vote, as well as a partially tongue-in-cheek "revolutionary" propaganda campaign in which sympathizers created anti-Mojang and pro-boycott posters in the vein of real 20th-century propaganda posters. Mojang did not release an official response to the boycott, and the Mob Vote otherwise proceeded normally, with the armadillo winning the vote. In September 2024, as part of a blog post detailing their future plans for Minecraft's development, Mojang announced the Mob Vote would be retired.

Cultural impact

In September 2019, The Guardian classified Minecraft as the best video game of the 21st century to date, and in November 2019, Polygon called it the "most important game of the decade" in its 2010s "decade in review". In June 2020, Minecraft was inducted into the World Video Game Hall of Fame. Minecraft is recognized as one of the first successful games to use an early access model to draw in sales prior to its full release, helping to fund development. As Minecraft helped to bolster indie game development in the early 2010s, it also helped to popularize the use of the early access model in indie game development.

Social media sites such as YouTube, Facebook, and Reddit have played a significant role in popularizing Minecraft. Research conducted by the Annenberg School for Communication at the University of Pennsylvania showed that one-third of Minecraft players learned about the game via Internet videos. In 2010, Minecraft-related videos began to gain influence on YouTube, often made by commentators. The videos usually contain screen-capture footage of the game and voice-overs. Common coverage in the videos includes creations made by players, walkthroughs of various tasks, and parodies of works in popular culture. By May 2012, over four million Minecraft-related YouTube videos had been uploaded. The game would go on to be a prominent fixture within YouTube's gaming scene during the entire 2010s; in 2014, it was the second-most searched term on the entire platform.
By 2018, it was still YouTube's biggest game globally. Some popular commentators have received employment at Machinima, a now-defunct gaming video company that owned a highly watched entertainment channel on YouTube. The Yogscast is a British company that regularly produces Minecraft videos; their YouTube channel has attained billions of views, and their panel at Minecon 2011 had the highest attendance. Another well-known YouTube personality is Jordan Maron, known online as CaptainSparklez, who has also created many Minecraft music parodies, including "Revenge", a parody of Usher's "DJ Got Us Fallin' in Love". Minecraft's popularity on YouTube was described by Polygon as quietly dominant, although in 2019, thanks in part to PewDiePie's playthroughs of the game, Minecraft experienced a visible uptick in popularity on the platform. Longer-running series include Far Lands or Bust, dedicated to reaching the obsolete "Far Lands" glitch by foot on an older version of the game. On 14 December 2021, YouTube announced that the total number of Minecraft-related views on the website had exceeded one trillion.

Minecraft has been referenced by other video games, such as Torchlight II, Team Fortress 2, Borderlands 2, Choplifter HD, Super Meat Boy, The Elder Scrolls V: Skyrim, The Binding of Isaac, The Stanley Parable, and FTL: Faster Than Light. Minecraft is officially represented in downloadable content for the crossover fighter Super Smash Bros. Ultimate, with Steve as a playable character with a moveset including references to building, crafting, and redstone, alongside an Overworld-themed stage. It was also referenced by electronic music artist Deadmau5 in his performances. The game is also referenced heavily in "Informative Murder Porn", the second episode of the seventeenth season of the animated television series South Park. In 2025, A Minecraft Movie was released. It made $313 million at the box office in its first week, a record-breaking opening for a video game adaptation. Minecraft has been noted as a cultural touchstone for Generation Z, as many of the generation's members played the game at a young age.

The possible applications of Minecraft have been discussed extensively, especially in the fields of computer-aided design (CAD) and education. In a panel at Minecon 2011, a Swedish developer discussed the possibility of using the game to redesign public buildings and parks, stating that rendering using Minecraft was much more user-friendly for the community, making it easier to envision the functionality of new buildings and parks. In 2012, a member of the Human Dynamics group at the MIT Media Lab, Cody Sumter, said: "Notch hasn't just built a game. He's tricked 40 million people into learning to use a CAD program." Various software has been developed to allow virtual designs to be printed using professional 3D printers or personal printers such as MakerBot and RepRap.

In September 2012, Mojang began the Block by Block project in cooperation with UN Habitat to create real-world environments in Minecraft. The project allows young people who live in those environments to participate in designing the changes they would like to see. Using Minecraft, the community has helped reconstruct the areas of concern, and citizens are invited to enter the Minecraft servers and modify their own neighborhood.
Carl Manneh, Mojang's managing director, called the game "the perfect tool to facilitate this process", adding "The three-year partnership will support UN-Habitat's Sustainable Urban Development Network to upgrade 300 public spaces by 2016." Mojang signed the Minecraft building community FyreUK to help render the environments into Minecraft. The first pilot project, in Kibera, one of Nairobi's informal settlements, began with a planning phase. The Block by Block project is based on an earlier initiative started in October 2011, Mina Kvarter (My Block), which gave young people in Swedish communities a tool to visualize how they wanted to change their part of town. According to Manneh, the project was a helpful way to visualize urban planning ideas without necessarily having training in architecture. The ideas presented by the citizens were a template for political decisions.

In April 2014, the Danish Geodata Agency generated all of Denmark at full scale in Minecraft based on its own geodata. This was possible because Denmark is one of the flattest countries, with its highest point at 171 meters (the 30th-smallest elevation span of any country), while the height limit in default Minecraft was around 192 meters above in-game sea level when the project was completed.

Taking advantage of the game's accessibility where other websites are censored, the non-governmental organization Reporters Without Borders has used an open Minecraft server to create the Uncensored Library, a repository within the game of journalism by authors from countries (including Egypt, Mexico, Russia, Saudi Arabia and Vietnam) who have been censored and arrested, such as Jamal Khashoggi. The neoclassical virtual building was created over about 250 hours by an international team of 24 people.

Despite its unpredictable nature, Minecraft speedrunning, where players time themselves from spawning into a new world to reaching The End and defeating the Ender Dragon boss, is popular. Some speedrunners use a combination of mods, external programs, and debug menus, while other runners play the game in a more vanilla or more consistency-oriented way.

Minecraft has been used in educational settings through initiatives such as MinecraftEdu, founded in 2011 to make the game affordable and accessible for schools in collaboration with Mojang. MinecraftEdu provided features allowing teachers to monitor student progress, including screenshot submissions as evidence of lesson completion, and by 2012 reported that approximately 250,000 students worldwide had access to the platform. Mojang also developed Minecraft: Education Edition with pre-built lesson plans for up to 30 students in a closed environment. Educators have used Minecraft to teach subjects such as history, language arts, and science through custom-built environments, including reconstructions of historical landmarks and large-scale models of biological structures such as animal cells. The introduction of redstone blocks enabled the construction of functional virtual machines such as a hard drive and an 8-bit computer. Mods have been created to use these mechanics for teaching programming, as illustrated in the sketch below. In 2014, the British Museum announced a project to reproduce its building and exhibits in Minecraft in collaboration with the public. Microsoft and Code.org have offered Minecraft-based tutorials and activities designed to teach programming, reporting by 2018 that more than 85 million children had used their resources.
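The programming-education tools mentioned above generally work by exposing the game world to a simple scripting interface, so that students can place blocks and read player positions from code. As a purely illustrative sketch (the sources above do not name a specific tool), the following Python example uses the community mcpi library, the API originally bundled with Minecraft: Pi Edition and commonly paired with classroom server plugins; the host, port, and the exercise itself are assumptions made for this example rather than part of any official curriculum.

```python
# Illustrative classroom-style sketch using the community mcpi library
# (pip install mcpi). Assumes a Minecraft: Pi Edition world, or a Java
# server with a compatible plugin, is listening on the default port 4711.
from mcpi.minecraft import Minecraft
from mcpi import block

mc = Minecraft.create("localhost", 4711)  # connect to the running game

mc.postToChat("Hello from Python!")       # greet the player in chat

pos = mc.player.getTilePos()              # player's current block position

# Build a 5x5 stone platform one block below the player: the kind of
# exercise used to introduce coordinates and nested loops.
for dx in range(-2, 3):
    for dz in range(-2, 3):
        mc.setBlock(pos.x + dx, pos.y - 1, pos.z + dz, block.STONE.id)
```

Exercises of this kind, scaling up from placing a single block to loops and simple logic, are one reason educators describe the game as a gentle on-ramp to programming concepts.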
In 2025, the Musée de Minéralogie in Paris held a temporary exhibition titled "Minerals in Minecraft."

Following the initial surge in popularity of Minecraft in 2010, other video games were criticized for having various similarities to Minecraft, and some were described as being "clones", often due to direct inspiration from Minecraft or a superficial similarity. Examples include Ace of Spades, CastleMiner, CraftWorld, FortressCraft, Terraria, BlockWorld 3D, Total Miner, and Luanti (formerly Minetest). David Frampton, designer of The Blockheads, reported that one failure of his 2D game was the "low resolution pixel art" that too closely resembled the art in Minecraft, which resulted in "some resistance" from fans. A homebrew adaptation of the alpha version of Minecraft for the Nintendo DS, titled DScraft, has been released; it has been noted for its similarity to the original game considering the technical limitations of the system. In response to Microsoft's acquisition of Mojang and their Minecraft IP, various developers announced further clone titles developed specifically for Nintendo's consoles, as they were the only major platforms not to officially receive Minecraft at the time. These clone titles include UCraft (Nexis Games), Cube Life: Island Survival (Cypronia), Discovery (Noowanda), Battleminer (Wobbly Tooth Games), Cube Creator 3D (Big John Games), and Stone Shire (Finger Gun Games). Fans' fears ultimately proved unfounded, as official Minecraft releases on Nintendo consoles eventually resumed. Markus Persson made another similar game, Minicraft, for a Ludum Dare competition in 2011. In 2025, Persson announced through a poll on his X account that he was considering developing a spiritual successor to Minecraft. He later clarified that he was "100% serious", and that he had "basically announced Minecraft 2". Within days, however, Persson canceled the plans after speaking to his team.

In November 2024, artificial intelligence companies Decart and Etched released Oasis, an artificially generated version of Minecraft, as a proof of concept. Every in-game element is completely AI-generated in real time and the model does not store world data, leading to "hallucinations" such as items and blocks appearing that were not there before. In January 2026, indie game developer Unomelon announced that their voxel sandbox game Allumeria would be playable in Steam Next Fest that year. On 10 February, Mojang issued a DMCA takedown of Allumeria on Steam through Valve, alleging the game was infringing on Minecraft's copyright. Some reports suggested that the takedown may have used an automatic AI copyright claiming service. The DMCA takedown was later withdrawn.

Minecon was an annual official fan convention dedicated to Minecraft. The first full Minecon was held in November 2011 at the Mandalay Bay Hotel and Casino in Las Vegas. The event included the official launch of Minecraft; keynote speeches, including one by Persson; building and costume contests; Minecraft-themed breakout classes; exhibits by leading gaming and Minecraft-related companies; commemorative merchandise; and autograph and picture times with Mojang employees and well-known contributors from the Minecraft community. In 2016, Minecon was held in person for the last time, with the following years featuring annual "Minecon Earth" livestreams on minecraft.net and YouTube instead. These livestreams, later rebranded as "Minecraft Live", included the mob and biome votes and announcements of new game updates.
In 2025, "Minecraft Live" became a biannual event as part of Minecraft's changing update schedule.[citation needed] Notes References External links
========================================
[SOURCE: https://en.wikipedia.org/wiki/Apple_Inc.] | [TOKENS: 17511]
Apple Inc. is an American multinational technology company headquartered in Cupertino, California, in Silicon Valley, best known for its consumer electronics, software, and online services. Founded in 1976 as Apple Computer Company by Steve Jobs, Steve Wozniak, and Ronald Wayne, the company was incorporated by Jobs and Wozniak as Apple Computer, Inc. the following year. It was renamed Apple Inc. in 2007 as the company had expanded its focus from computers to consumer electronics. Apple is one of the Big Tech companies.

The company was founded in 1976 to market Wozniak's Apple I personal computer. Its successor, the Apple II, became one of the first successful mass-produced microcomputers. Apple introduced the Lisa in 1983 and the Macintosh in 1984 as some of the first computers to use a graphical user interface and a mouse. By 1985, internal conflicts led to Jobs leaving the company to form NeXT and Wozniak withdrawing to other ventures; John Sculley served as CEO for over a decade. In the 1990s, Apple lost considerable market share in the personal computer industry to the lower-priced Wintel duopoly of Intel-powered PC clones running Microsoft Windows, and neared bankruptcy by 1997. To overhaul its market strategy, Apple acquired NeXT, bringing Jobs back to the company. Under his leadership, Apple returned to profitability by introducing the iMac, iPod, iPhone, and iPad devices; creating the iTunes Store; launching the "Think different" advertising campaign; and opening the Apple Store retail chain. Jobs resigned in 2011 for health reasons, and died two months later; he was succeeded as CEO by Tim Cook.

Apple's product lineup includes portable and home hardware like the iPhone, iPad, Apple Watch, Mac, Apple Vision Pro, AirPods, and Apple TV; several in-house operating systems such as iOS, iPadOS, and macOS; and various software and services including Apple Pay and iCloud, as well as multimedia streaming services like Apple Music and Apple TV. Since 2011, Apple has for the most part been the world's largest company by market capitalization, and, as of 2024[update], is the largest manufacturing company by revenue, the fourth-largest personal computer vendor, the largest vendor of tablet computers, and the largest vendor of mobile phones. Apple became the first publicly traded US company to be valued at over $1 trillion in 2018, and, as of October 2025[update], is valued at just over $4 trillion. Apple has received criticism regarding its contractors' labor conditions, its relationship with trade unions, its environmental practices, and its corporate ethics, including anti-competitive tactics, materials sourcing, and its acquisitions of smaller businesses. Nevertheless, the company has a large following and enjoys a high level of customer loyalty. Apple has consistently been ranked as one of the world's most valuable brands since the late 2000s.

History

Apple Computer Company was founded on April 1, 1976, by Steve Jobs, Steve Wozniak, and Ronald Wayne as a partnership. The company's first product was the Apple I, a computer designed and hand-built entirely by Wozniak. To finance its creation, Jobs sold his Volkswagen Bus, and Wozniak sold his HP-65 calculator. Neither received the full selling price, but together they earned $1,300 (equivalent to $7,400 in 2025). Wozniak debuted the first prototype Apple I at the Homebrew Computer Club in July 1976.
The Apple I was sold as a motherboard with CPU, RAM, and basic textual-video chips—a base kit concept which was not yet marketed as a complete personal computer. It was priced soon after debut for $666.66 (equivalent to $3,800 in 2025). Wozniak later said he was unaware of the coincidental mark of the beast in the number 666, and that he came up with the price because he liked "repeating digits". Apple Computer, Inc. was incorporated in Cupertino, California, on January 3, 1977, without Wayne, who had left and sold his share of the company back to Jobs and Wozniak for $800 only twelve days after having co-founded it. Multimillionaire Mike Markkula provided essential business expertise and funding of $250,000 (equivalent to $1,328,000 in 2025) to Jobs and Wozniak during the incorporation of Apple. During the first five years of operations, revenue grew exponentially, doubling about every four months. Between September 1977 and September 1980, yearly sales grew from $775,000 to US$118 million, an average annual growth rate of 533%. The Apple II, also designed by Wozniak, was introduced on April 16, 1977, at the first West Coast Computer Faire. It differed from its major rivals, the TRS-80 and Commodore PET, because of its character cell-based color graphics and open architecture. The Apple I and early Apple II models used ordinary audio cassette tapes as storage devices, which were superseded by the 5+1⁄4-inch floppy disk drive and interface called the Disk II in 1978. The Apple II was chosen to be the desktop platform for the first killer application of the business world: VisiCalc, a spreadsheet program released in 1979. VisiCalc created a business market for the Apple II and gave home users an additional reason to buy an Apple II: compatibility with the office, but Apple II market share remained behind home computers made by competitors such as Atari, Commodore, and Tandy. On December 12, 1980, Apple went public with an initial public offering (IPO) on the fully electronic Nasdaq stock market, selling 4.6 million shares at $22 per share ($.10 per share when adjusting for stock splits as of September 3, 2022[update]), generating over $100 million, which was more capital than any IPO since Ford Motor Company in 1956. By the end of the day, around 300 millionaires were created, including Jobs and Wozniak, from a stock price of $29 per share and a market cap of $1.778 billion. In November and December 1979, Steve Jobs and Apple employees, including Jef Raskin, visited Xerox PARC, where they observed the Xerox Alto, featuring a graphical user interface (GUI) and a mouse. Jobs had negotiated with Xerox in advance to gain access to PARC's technology, in exchange for the right to purchase $1 million worth of Apple's pre-IPO shares. This visit influenced Jobs to implement a GUI in Apple's products, starting with the Apple Lisa. Despite being pioneering as a mass-marketed GUI computer, the Lisa suffered from high costs and limited software options, leading to commercial failure. Jobs, infuriated by his removal from the Lisa team, joined the company's Macintosh division in January 1981. Raskin had envisioned the Macintosh as a low-cost, portable computer and Wozniak had helped its development until a plane crash in early 1981 forced him to step back from the project. Wozniak's absence allowed Jobs to quickly take over the project and he redefined the Macintosh as a graphical system that would be cheaper than the Lisa, undercutting his former division. 
Jobs was also hostile to the Apple II division, which at the time generated most of the company's revenue. In 1984, Apple launched the Macintosh, the first personal computer without a bundled programming language. Its debut was marked by "1984", a US$1.5-million television advertisement directed by Ridley Scott that aired during the third quarter of Super Bowl XVIII on January 22, 1984. This was hailed as a watershed event for Apple's success and was called a "masterpiece" by CNN and one of the greatest TV advertisements of all time by TV Guide. The advertisement created great interest in the Macintosh, and sales were initially good, but began to taper off dramatically after the first three months as reviews started coming in. Jobs had required 128 kilobytes of RAM, which limited the machine's speed and software, in pursuit of a projected price point of $1,000 (equivalent to $3,100 in 2025). The Macintosh shipped for $2,495 (equivalent to $7,700 in 2025), a price panned by critics due to its slow performance.

In early 1985, this sales slump triggered a power struggle between Steve Jobs and CEO John Sculley, who had been hired away from Pepsi two years earlier by Jobs saying, "Do you want to sell sugar water for the rest of your life or come with me and change the world?" Sculley removed Jobs as the head of the Macintosh division, with unanimous support from the Apple board of directors. The board of directors instructed Sculley to contain Jobs and his ability to launch expensive forays into untested products. Rather than submit to Sculley's direction, Jobs attempted to oust him from leadership. Jean-Louis Gassée informed Sculley that Jobs had been attempting to organize a boardroom coup, and called an emergency meeting at which Apple's executive staff sided with Sculley, and stripped Jobs of all operational duties. Jobs resigned from Apple in September 1985 and took several Apple employees with him to found NeXT. Wozniak had also quit his active employment at Apple earlier in 1985 to pursue other ventures, expressing his frustration with Apple's treatment of the Apple II division and stating that the company had "been going in the wrong direction for the last five years". Wozniak remained employed by Apple as a representative, receiving a stipend estimated to be $120,000 per year. Jobs and Wozniak remained Apple shareholders following their departures.

Around the time of Jobs's and Wozniak's departures in 1985, Apple under Sculley expanded the Macintosh line with the Macintosh 512K, which quadrupled the RAM, and introduced the LaserWriter, the first reasonably priced PostScript-based laser printer. PageMaker, an early desktop publishing application taking advantage of the PostScript language, was also released by Aldus Corporation in July 1985. It has been suggested that the combination of Macintosh, LaserWriter, and PageMaker was responsible for the creation of the desktop publishing market. This dominant position in the desktop publishing market allowed the company to focus on higher price points, the so-called "high-right policy" named for its position on a price–profits chart. Newer models selling at higher price points offered higher profit margin, and appeared to have no effect on total sales as power users snapped up every increase in speed. Although some worried about pricing themselves out of the market, the high-right policy was in full force by the mid-1980s, due to Jean-Louis Gassée's slogan of "fifty-five or die", referring to the 55% profit margins of the Macintosh II.
This policy began to backfire late in the decade as desktop publishing programs appeared on IBM PC compatibles with some of the same functionality of the Macintosh at far lower price points. The company lost its dominant position in the desktop publishing market and estranged many of its original consumer customer base who could no longer afford Apple products. The Christmas season of 1989 was the first in the company's history to have declining sales, which led to a 20% drop in Apple's stock price. During this period, the relationship between Sculley and Gassée deteriorated, leading Sculley to effectively demote Gassée in January 1990 by appointing Michael Spindler as the chief operating officer. Gassée left the company later that year to set up a rival, Be Inc. The company pivoted strategy and, in October 1990, introduced three lower-cost models: the Macintosh Classic, the Macintosh LC, and the Macintosh IIsi, all of which generated significant sales due to pent-up demand. In 1991, Apple introduced the PowerBook, a commercially successful laptop whose clamshell design influenced later notebook computers. The same year, Apple introduced System 7, a major upgrade to the Macintosh operating system, adding color to the interface and introducing new networking capabilities. The success of the lower-cost Macs and the PowerBook brought increasing revenue. For some time, Apple was doing very well, introducing fresh new products at increasing profits. The magazine MacAddict named the period between 1989 and 1991 as the "first golden age" of the Macintosh. The success of lower-cost consumer Macs, especially the LC, cannibalized higher-priced machines. To address this, management introduced several new brands, selling largely identical machines at different price points, for different markets: the high-end Quadra series, the mid-range Centris series, and the consumer-marketed Performa series. This led to significant consumer confusion between so many models. In 1993, the Apple II series was discontinued. It was expensive to produce, and the company decided it was still absorbing sales from lower-cost Macintosh models. After the launch of the LC, Apple encouraged developers to create applications for Macintosh rather than Apple II, and authorized salespersons to redirect consumers from Apple II and toward Macintosh. The Apple IIe was discontinued in 1993. Apple experimented with several other unsuccessful consumer targeted products during the 1990s, including QuickTake digital cameras, PowerCD portable CD audio players, speakers, the Pippin video game console, the eWorld online service, and Apple Interactive Television Box. Apple made significant investments in the Newton tablet division; the Newton was later criticized for high costs and limited commercial success, and commentators have attributed the decision to start that division in part to market forecasts by CEO John Sculley. Throughout this period, Microsoft continued to gain market share with Windows by focusing on delivering software to inexpensive personal computers, while Apple was delivering a richly engineered but expensive experience. Apple relied on high profit margins and never developed a clear response; it sued Microsoft for making a GUI similar to the Lisa in Apple Computer, Inc. v. Microsoft Corp. The lawsuit dragged on for years and was finally dismissed. The major product flops and the rapid loss of market share to Windows sullied Apple's reputation, and in 1993 Sculley was replaced as CEO by Michael Spindler. 
Under Spindler, Apple, IBM, and Motorola formed the AIM alliance in 1994 to create a new computing platform (the PowerPC Reference Platform or PReP), with IBM and Motorola hardware coupled with Apple software. The AIM alliance hoped that PReP's performance and Apple's software would leave the PC far behind and thus counter the dominance of Windows. That year, Apple introduced the Power Macintosh, the first of many computers with Motorola's PowerPC processor. In the wake of the alliance, Apple opened up to the idea of allowing Motorola and other companies to build Macintosh clones. Over the next two years, 75 distinct Macintosh clone models were introduced. However, by 1996, Apple executives were worried that the clones were cannibalizing sales of its own high-end computers, where profit margins were highest.

In 1996, Spindler was replaced as CEO by Gil Amelio, who was hired for his reputation as a corporate rehabilitator. Amelio made deep changes, including extensive layoffs and cost-cutting. This period was also marked by numerous failed attempts to modernize the Macintosh operating system (the classic Mac OS). The original Macintosh operating system (System 1) was not built for multitasking (running several applications at once). The company attempted to correct this by introducing cooperative multitasking in System 5, but still decided it needed a more modern approach. This led to the Pink project in 1988 and A/UX that same year, then Copland in 1994, and an evaluation of purchasing BeOS in 1996. Talks with Be stalled when the CEO, former Apple executive Jean-Louis Gassée, demanded $300 million in contrast to Apple's $125-million offer. With Apple only weeks away from bankruptcy, the board preferred NeXTSTEP and purchased NeXT in late 1996 for $400 million, retaining Steve Jobs.

The NeXT acquisition was finalized on February 9, 1997, and the board brought Jobs back to Apple as an advisor. On July 9, 1997, Jobs staged a boardroom coup, which resulted in Amelio's resignation after overseeing a three-year record-low stock price and crippling financial losses. The board named Jobs as interim CEO and he immediately reviewed the product lineup. Jobs canceled 70% of the company's models, cutting 3,000 jobs and paring the lineup down to the core of its computer offerings. The next month, in August 1997, Steve Jobs convinced Microsoft to make a $150-million investment in Apple and a commitment to continue developing Mac software. This was seen as an "antitrust insurance policy" for Microsoft, which had recently settled with the Department of Justice over anti-competitive practices in the United States v. Microsoft Corp. case. Around then, Jobs donated Apple's internal library and archives to Stanford University, to focus more on the present and the future rather than the past. He ended the Mac clone deals and in September 1997, purchased the largest clone maker, Power Computing. On November 10, 1997, the Apple Store website launched, which was tied to a new build-to-order manufacturing model similar to the one behind PC manufacturer Dell's success. The moves paid off for Jobs; at the end of his first year as CEO, the company had a $309-million profit.

On May 6, 1998, Apple introduced a new all-in-one computer reminiscent of the original Macintosh: the iMac. The iMac sold 800,000 units in its first five months. It abandoned legacy technologies such as the 3+1⁄2-inch diskette, adopted the USB connector early, and came pre-installed with Internet connectivity (the 'i' in iMac) via Ethernet and a dial-up modem.
Its striking teardrop shape and translucent materials were designed by Jonathan Ive, who had been hired by Amelio, and who collaborated with Jobs for more than a decade to reshape Apple's product design. A little more than a year later on July 21, 1999, Apple introduced the iBook consumer laptop. It culminated Jobs's strategy to produce only four products: refined versions of the Power Macintosh G3 desktop and PowerBook G3 laptop for professionals, and the iMac desktop and iBook laptop for consumers. Jobs said the small product line allowed for a greater focus on quality and innovation. Around then, Apple also completed numerous acquisitions to create a portfolio of digital media production software for both professionals and consumers. Apple acquired Macromedia's Key Grip digital video editing software project, which was launched as Final Cut Pro in April 1999. Key Grip's development also led to Apple's release of the consumer video-editing product iMovie in October 1999. Apple acquired the German company Astarte in April 2000, which had developed the DVD authoring software DVDirector, which Apple repackaged as the professional-oriented DVD Studio Pro, and reused its technology to create iDVD for the consumer market. In 2000, Apple purchased the SoundJam MP audio player software from Casady & Greene. Apple renamed the program iTunes, simplified the user interface and added CD burning. In 2001, Apple changed course with three announcements. First, on March 24, 2001, Apple announced the release of a new modern operating system, Mac OS X. This was after numerous failed attempts in the early 1990s, and several years of development. Mac OS X is based on NeXTSTEP, OpenStep, and BSD Unix, to combine the stability, reliability, and security of Unix with the ease of use of an overhauled user interface. Second, in May 2001, the first two Apple Store retail locations opened in Virginia and California, offering an improved presentation of the company's products. At the time, many speculated that the stores would fail, but they later expanded to more than 500 locations worldwide. Third, on October 23, 2001, the iPod portable digital audio player debuted. The product was first sold on November 10, 2001, and sold over 100 million units within six years. In 2003, the iTunes Store was introduced with music downloads for 99¢ a song and iPod integration. It quickly became the market leader in online music services, with over 5 billion downloads by June 19, 2008. Two years later, the iTunes Store was the world's largest music retailer. In 2002, Apple purchased Nothing Real for its advanced digital compositing application Shake, and Emagic for the music productivity application Logic. The purchase of Emagic made Apple the first computer manufacturer to own a music software company. The acquisition was followed by the development of Apple's consumer-level GarageBand application. The release of iPhoto that year completed the iLife suite. At the Worldwide Developers Conference keynote address on June 6, 2005, Jobs announced that Apple would move away from PowerPC processors, and the Mac would transition to Intel processors in 2006. On January 10, 2006, the new MacBook Pro and iMac became the first Apple computers to use Intel's Core Duo CPU. By August 7, 2006, Apple made the transition to Intel chips for the entire Mac product line—over one year sooner than announced. 
The Power Mac, iBook, and PowerBook brands were retired during the transition; the Mac Pro, MacBook, and MacBook Pro became their respective successors. Apple also introduced Boot Camp in 2006 to help users install Windows XP or Windows Vista on their Intel Macs alongside Mac OS X. Between early 2003 and 2006, the price of Apple's stock increased more than tenfold, from around $6 per share (split-adjusted) to over $80. When Apple surpassed Dell's market cap in January 2006, Jobs sent an email to Apple employees saying Dell's CEO Michael Dell should eat his words. Nine years prior, Dell had said that if he ran Apple he would "shut it down and give the money back to the shareholders". During his keynote speech at the Macworld Expo on January 9, 2007, Jobs announced the renaming of Apple Computer, Inc. to Apple Inc., because the company had broadened its focus from computers to consumer electronics. This event also saw the announcement of the iPhone and the Apple TV. The company sold 270,000 first-generation iPhones during the first 30 hours of sales, and some industry commentators described the device as "a game changer for the industry". In an article posted on Apple's website on February 6, 2007, Jobs wrote that Apple would be willing to sell music on the iTunes Store without digital rights management, thereby allowing tracks to be played on third-party players if record labels would agree to drop the technology. On April 2, 2007, Apple and EMI jointly announced the removal of DRM technology from EMI's catalog in the iTunes Store, effective in May 2007. Other record labels eventually followed suit and Apple published a press release in January 2009 to announce that all songs on the iTunes Store are available without their FairPlay DRM. In July 2008, Apple launched the App Store to sell third-party applications for the iPhone and iPod Touch. Within a month, the store sold 60 million applications and registered an average daily revenue of $1 million, with Jobs speculating in August 2008 that the App Store could become a billion-dollar business for Apple. By October 2008, Apple was the third-largest mobile handset supplier in the world due to the popularity of the iPhone. On January 14, 2009, Jobs announced in an internal memo that he would be taking a six-month medical leave of absence from Apple until the end of June 2009 and would spend the time focusing on his health. In the email, Jobs stated that "the curiosity over my personal health continues to be a distraction not only for me and my family, but everyone else at Apple as well", and explained that the break would allow the company "to focus on delivering extraordinary products". Though Jobs was absent, Apple recorded its best non-holiday quarter (Q1 FY 2009) during the recession, with revenue of $8.16 billion and profit of $1.21 billion. After years of speculation and multiple rumored "leaks", Apple unveiled a large screen, tablet-like media device known as the iPad on January 27, 2010. The iPad ran the same touch-based operating system as the iPhone, and all iPhone apps were compatible with the iPad. This gave the iPad a large app catalog on launch, though having very little development time before the release. Later that year on April 3, 2010, the iPad was launched in the US. It sold more than 300,000 units on its first day, and 500,000 by the end of the first week. In May 2010, Apple's market cap exceeded that of competitor Microsoft for the first time since 1989. 
In June 2010, Apple released the iPhone 4, which introduced video calling using FaceTime, multitasking, and a new design with an exposed stainless steel frame that served as the phone's antenna system. Later that year, Apple again refreshed the iPod line by introducing a multi-touch iPod Nano, an iPod Touch with FaceTime, and an iPod Shuffle that brought back the clickwheel buttons of earlier generations. It also introduced the smaller, cheaper second-generation Apple TV, which allowed the rental of movies and shows. On January 17, 2011, Jobs announced in an internal Apple memo that he would take another medical leave of absence for an indefinite period to allow him to focus on his health. Chief operating officer Tim Cook assumed Jobs's day-to-day operations at Apple, although Jobs would still remain "involved in major strategic decisions". Apple became the most valuable consumer-facing brand in the world, and has consistently been among the most valuable brands since then. In June 2011, Jobs took the stage and unveiled iCloud, an online storage and syncing service for music, photos, files, and software which replaced MobileMe, Apple's previous attempt at content syncing. This was the last product launch Jobs attended before his death. On August 24, 2011, Jobs resigned his position as CEO of Apple. He was replaced by Cook, and Jobs became Apple's chairman. Apple had not previously had a chairman and instead had two co-lead directors, Andrea Jung and Arthur D. Levinson, who continued with those titles until Levinson replaced Jobs as chairman of the board in November, after Jobs's death. On October 5, 2011, Steve Jobs died, marking the end of an era for Apple. The next major product announcement by Apple was on January 19, 2012, when Apple's Phil Schiller introduced iBooks Textbooks for iOS and iBooks Author for Mac OS X in New York City. Jobs had stated in the biography Steve Jobs that he wanted to reinvent the textbook industry and education. From 2011 to 2012, Apple released the iPhone 4S and iPhone 5, which featured improved cameras, an intelligent software assistant named Siri, and cloud-synced data with iCloud; the third- and fourth-generation iPads, which featured Retina displays; and the iPad Mini, which featured a 7.9-inch screen in contrast to the iPad's 9.7-inch screen. These launches were successful: the iPhone 5 (released September 21, 2012) became Apple's biggest iPhone launch, with over two million pre-orders, and three million iPads were sold in the three days following the launch of the iPad Mini and fourth-generation iPad (released November 3, 2012). Apple also released a third-generation 13-inch MacBook Pro with a Retina display and new iMac and Mac Mini computers. On August 20, 2012, Apple's rising stock price increased the company's market capitalization to a then-record $624 billion, beating the non-inflation-adjusted record for market capitalization previously set by Microsoft in 1999. On August 24, 2012, a US jury ruled that Samsung should pay Apple $1.05 billion (£665m) in damages in an intellectual property lawsuit. Samsung appealed the damages award, and the court reduced it by $450 million and granted Samsung's request for a new trial. On November 10, 2012, Apple confirmed a global settlement that dismissed all existing lawsuits between Apple and HTC up to that date, in favor of a ten-year license agreement for current and future patents between the two companies. Apple was predicted to make US$280 million per year from the deal with HTC.
In May 2014, Apple confirmed its intent to acquire Dr. Dre and Jimmy Iovine's audio company Beats Electronics, producer of the "Beats by Dr. Dre" line of headphones and speaker products and operator of the music streaming service Beats Music, for US$3 billion, and to sell their products through Apple's retail outlets and resellers. Iovine believed that Beats had always "belonged" with Apple, as the company modeled itself after Apple's "unmatched ability to marry culture and technology". The acquisition was the largest purchase in Apple's history. During a press event on September 9, 2014, Apple introduced a smartwatch called the Apple Watch. Initially, Apple marketed the device as a fashion accessory and a complement to the iPhone that would allow people to look at their smartphones less. Over time, the company has focused on developing health and fitness-oriented features on the watch, in an effort to compete with dedicated activity trackers. In January 2016, Apple announced that over one billion Apple devices were in active use worldwide. On June 6, 2016, Fortune released the Fortune 500, its list of companies ranked by revenue. Based on fiscal year 2015 revenue, Apple was listed as the top technology company; it ranked third overall, with US$233 billion in revenue, moving up two spots from the previous year's list. In June 2017, Apple announced the HomePod, its smart speaker aimed to compete against Sonos, Google Home, and Amazon Echo. Toward the end of the year, TechCrunch reported that Apple was acquiring Shazam, a company specializing in music, TV, film, and advertising recognition whose products had been introduced at WWDC. The acquisition was confirmed a few days later, reportedly costing Apple US$400 million, with media reports suggesting that the purchase was a move to acquire data and tools to bolster the Apple Music streaming service. The purchase was approved by the European Union in September 2018. Also in June 2017, Apple appointed Jamie Erlicht and Zack Van Amburg to head the newly formed worldwide video unit. In November 2017, Apple announced it was branching out into original scripted programming: a drama series starring Jennifer Aniston and Reese Witherspoon, and a reboot of the anthology series Amazing Stories with Steven Spielberg. In June 2018, Apple signed the Writers Guild of America's minimum basic agreement and signed Oprah Winfrey to a multi-year content partnership. Additional partnerships for original series included Sesame Workshop and DHX Media and its subsidiary Peanuts Worldwide, as well as a partnership with A24 to create original films. During the Apple Special Event in September 2017, the AirPower wireless charger was announced alongside the iPhone X, iPhone 8, and Watch Series 3. The AirPower was intended to charge multiple devices wirelessly and simultaneously. Though initially set to release in early 2018, the AirPower was canceled in March 2019, marking the first cancellation of a device under Cook's leadership. On August 19, 2020, Apple's share price briefly topped $467.77, making it the first US company with a market capitalization of US$2 trillion. During its annual WWDC keynote speech on June 22, 2020, Apple announced it would move away from Intel processors and that the Mac would transition to processors developed in-house. The announcement had been expected by industry analysts, who noted that Macs featuring Apple's processors would allow for big increases in performance over the Intel-based models available at the time.
On November 10, 2020, the MacBook Air, MacBook Pro, and the Mac Mini became the first Macs powered by an Apple-designed processor, the Apple M1. In April 2022, it was reported that Samsung Electro-Mechanics, rather than LG Innotek, would be collaborating with Apple on its M2 chip. Developer logs showed that at least nine Mac models with four different M2 chips were being tested. The Wall Street Journal reported that Apple's effort to develop its own chips left it better prepared to deal with the semiconductor shortage that emerged during the COVID-19 pandemic and led to increased profitability, with sales of M1-based Mac computers rising sharply in 2020 and 2021. It also inspired other companies like Tesla, Amazon, and Meta Platforms to pursue a similar path. In April 2022, Apple opened an online store that allowed anyone in the US to view repair manuals and order replacement parts for specific recent iPhones, although the difference in cost between this method and official repair was expected to be minimal. In May 2022, a trademark was filed for RealityOS, an operating system, first mentioned in 2017, reportedly intended for virtual and augmented reality headsets. According to Bloomberg, the headset was expected to come out in 2023, and further insider reports stated that the device would use iris scanning for payment confirmation and signing into accounts. In June 2023, Apple formally announced its first mixed reality headset, the Apple Vision Pro, which ran its new visionOS operating system. The headset was released in February of the following year. On June 18, 2022, the Apple Store in Towson, Maryland, became the first to unionize in the US, with the employees voting to join the International Association of Machinists and Aerospace Workers. On July 7, 2022, Apple added Lockdown Mode to macOS 13 and iOS 16 as a response to the earlier Pegasus revelations; the mode increases security protections for high-risk users against targeted zero-day malware. Apple launched a buy now, pay later service called "Apple Pay Later" for its Apple Wallet users in March 2023. The program allows users to apply for loans of between $50 and $1,000 to make online or in-app purchases and then repay them in four installments spread over six weeks, without any interest or fees. In November 2023, Apple agreed to a $25-million settlement in a US Department of Justice case that alleged Apple was discriminating against US citizens in hiring: Apple had created jobs that were not listed online and that required a paper application, while advertising these jobs to foreign workers as part of recruitment for PERM. In January 2024, Apple announced compliance with the European Union's competition law, with major changes to the App Store and other services, effective on March 7. The changes enable iOS users in the 27-nation bloc to use alternative app stores and alternative payment methods within apps, and add a menu in Safari for downloading alternative browsers, such as Chrome or Firefox. In June 2024, Apple introduced Apple Intelligence to incorporate on-device artificial intelligence (AI) capabilities. On November 1, 2024, Apple announced its acquisition of Pixelmator, a company known for its image editing applications for iPhone and Mac. Apple had previously showcased Pixelmator's apps during its product launches, including naming Pixelmator Pro its Mac App of the Year in 2018 for its innovative use of machine learning and AI.
In the announcement, Pixelmator stated that there would be no significant changes to its existing apps following the acquisition. On December 31, 2024, a preliminary settlement was filed in federal court in Oakland, California, in a lawsuit accusing Apple of unlawfully recording private conversations through unintentional Siri activations and of sharing them with third parties, including advertisers. While denying any wrongdoing, Apple agreed to a $95-million cash settlement to resolve the suit, which alleged that its Siri assistant had violated user privacy; affected users can potentially claim up to $20 per device. Attorneys sought $28.5 million in fees from the settlement fund. In 2025, Apple undertook its largest investment initiative to date, announcing a commitment to spend over $500 billion in the United States over the following four years. This strategy includes the opening of a new manufacturing facility in Houston to produce servers supporting Apple Intelligence, expansion of research and development in fields like silicon engineering and AI, and the establishment of a new advanced manufacturing academy in Detroit. The company also pledged to double its US Advanced Manufacturing Fund and increase collaboration with American suppliers, aiming to create tens of thousands of jobs related to R&D, AI, and manufacturing technologies. Apple's software platforms also changed significantly in 2025. At its Worldwide Developers Conference (WWDC), Apple introduced the new "Liquid Glass" design language, rolled out unified system design updates across iOS 26, iPadOS 26, macOS Tahoe, and other platforms, and significantly expanded the capabilities of "Apple Intelligence", the company's personal AI system. According to Apple, these updates were intended to address previous criticisms of fragmented interfaces and to use on-device and cloud-based AI to improve privacy and user experience. Despite continued growth in its services sector, including a new all-time high for services revenue in the March quarter and the launch of updated models such as the iPhone 16e and M4 MacBook Air, Apple faced significant challenges. The company contended with a 19% decline in stock value year-to-date, ongoing antitrust investigations by the US Department of Justice, and legal disputes involving the App Store. Competition in the AI space escalated, with rivals gaining ground. High-profile departures and political tensions, including calls for Apple to manufacture iPhones domestically or face tariffs, added to the pressure; analysts cited these factors as contributing to a difficult year for CEO Tim Cook. In December 2025, Cook met with US House members to push back against the App Store Accountability Act, which could require Apple to authenticate users' ages and possibly collect sensitive data on children. On January 12, 2026, Apple announced a partnership with Google Gemini for AI-powered Siri. In January 2026, Apple acquired Q.ai, an Israeli artificial intelligence startup specializing in imaging and machine learning technologies for audio processing. The financial terms were not disclosed, though media reports estimated the acquisition at nearly US$2 billion, which would make it Apple's second-largest purchase to date. Following the deal, Q.ai's founders and approximately 100 employees joined Apple. Products Since the company's founding and into the early 2000s, Apple primarily sold computers, which have been marketed under the Macintosh name since the mid-1980s.
Since then, the company has expanded its product categories to include various portable devices, starting with the now discontinued iPod (2001), and later the iPhone (2007) and iPad (2010). Apple also sells several other products that it categorizes as "Wearables, Home and Accessories", such as the Apple Watch, Apple TV, AirPods, HomePod, and Apple Vision Pro. Commentators have described Apple devices as forming a cohesive ecosystem when used together, and have also criticized them for reduced functionality or fewer features when used with competing devices and for reliance on Apple's proprietary features, software, and services, an approach often described as a "walled garden". As of 2023, there were over 2 billion Apple devices in active use worldwide. Mac, which is short for Macintosh, its official name until 1999, is Apple's line of personal computers that use the company's proprietary macOS operating system. Personal computers were Apple's original business line, but as of the end of 2024 they account for only about eight percent of the company's revenue. There are six Mac computer families in production: the MacBook Air, MacBook Pro, iMac, Mac Mini, Mac Studio, and Mac Pro. Macs use Apple silicon chips, run the macOS operating system, and include Apple software like the Safari web browser, iMovie for home movie editing, GarageBand for music creation, and the iWork productivity suite. Apple also sells pro apps: Final Cut Pro for video production, Logic Pro for musicians and producers, and Xcode for software developers. Apple also sells a variety of accessories for Macs, including the Pro Display XDR, Apple Studio Display, Magic Mouse, Magic Trackpad, and Magic Keyboard. The iPhone is Apple's line of smartphones, which run the iOS operating system. The first iPhone was unveiled by Steve Jobs on January 9, 2007. Since then, new iPhone models have been released every year. When it was introduced, its multi-touch screen was described as "revolutionary" and a "game-changer" for the mobile phone industry. The device has been credited with creating the app economy. iOS is one of the two major smartphone platforms in the world, alongside Android. The iPhone has generated large profits for the company, and is credited with helping to make Apple one of the world's most valuable publicly traded companies. As of the end of 2024, the iPhone accounts for nearly half of the company's revenue. The iPad is Apple's line of tablets, which run iPadOS. The first-generation iPad was announced on January 27, 2010. The iPad is mainly marketed for consuming multimedia, creating art, working on documents, videoconferencing, and playing games. The iPad lineup consists of several base iPad models, the smaller iPad Mini, the upgraded iPad Air, and the high-end iPad Pro. Apple has consistently improved the iPad's performance, with the iPad Pro adopting the same M-series chips as the Mac, but the iPad still receives criticism for its limited operating system. As of September 2020, Apple had sold more than 500 million iPads, though sales peaked in 2013. The iPad remained the most popular tablet computer by sales as of the second quarter of 2020, and accounted for seven percent of the company's revenue as of the end of 2024. Apple sells several iPad accessories, including the Apple Pencil, Smart Keyboard, Smart Keyboard Folio, Magic Keyboard, and several adapters. Apple makes several other products that it categorizes as "Wearables, Home and Accessories".
These products include the AirPods line of wireless headphones, Apple TV digital media players, Apple Watch smartwatches, Beats headphones, HomePod smart speakers, and the Vision Pro mixed reality headset. As of the end of 2024, this broad line of products comprises about ten percent of the company's revenues. Apple offers a broad line of services, including advertising in the App Store and Apple News app, the AppleCare+ extended warranty plan, the iCloud+ cloud-based data storage service, payment services through the Apple Card credit card and the Apple Pay processing platform, and digital content services such as Apple Books, Apple Fitness+, Apple Music, Apple News+, Apple TV (formerly TV+), and the iTunes Store. Apple also provides Apple One, a bundle of these services. As of the end of 2024, services comprise about 26% of the company's revenue. In 2019, Apple announced it would be making a concerted effort to expand its service revenues. Marketing According to Steve Jobs, the company's name was inspired by his visit to an apple farm while on a fruitarian diet. Apple's first logo, designed by Ron Wayne, depicts Sir Isaac Newton sitting under an apple tree. It was almost immediately replaced by Rob Janoff's "rainbow Apple", the now-familiar rainbow-colored silhouette of an apple with a bite taken out of it. This logo has been erroneously referred to as a tribute to Alan Turing, with the bite mark a reference to his method of suicide. On August 27, 1999, Apple officially dropped the rainbow scheme and began to use monochromatic logos nearly identical in shape to the previous rainbow incarnation. An Aqua-themed version of the monochrome logo was used from 1997 until 2003, and a glass-themed version was used from 2007 until 2013. Apple evangelists were actively engaged by the company at one time, but this was after the phenomenon had already been firmly established. Apple evangelist Guy Kawasaki has called the brand fanaticism "something that was stumbled upon", while Jonathan Ive claimed in 2014 that "people have an incredibly personal relationship" with Apple's products. Fortune magazine named Apple the most admired company in the United States in 2008, and in the world from 2008 to 2012. On September 30, 2013, Apple surpassed Coca-Cola to become the world's most valuable brand in the Omnicom Group's "Best Global Brands" report. Boston Consulting Group has ranked Apple as the world's most innovative brand every year since 2005. As of January 2021, 1.65 billion Apple products were in active use. In February 2023, that number exceeded 2 billion devices. In 2023, the World Intellectual Property Organization's Madrid Yearly Review ranked Apple 10th in the world by number of trademark applications filed under the Madrid System, with 74 applications submitted during the year. Apple was ranked the No. 3 company in the world on the 2024 Fortune 500 list. Apple's first slogan, "Byte into an Apple", was coined in the late 1970s. From 1997 to 2002, the slogan "Think different" was used in advertising campaigns, and is still closely associated with Apple. Apple also has slogans for specific product lines; for example, "iThink, therefore iMac" was used in 1998 to promote the iMac, and "Say hello to iPhone" has been used in iPhone advertisements. "Hello" was also used to introduce the original Macintosh, Newton, iMac ("hello (again)"), and iPod.
From the introduction of the Macintosh in 1984 with the "1984" Super Bowl advertisement to the more modern Get a Mac adverts, Apple has been recognized for its efforts toward effective advertising and marketing for its products. However, claims made by later campaigns were criticized, particularly the 2005 Power Mac ads. Apple's product advertisements gained significant attention as a result of their graphics and song choices. Musicians who benefited from an improved profile as a result of their songs being included in Apple advertisements include Canadian singer Feist with the song "1234" and Yael Naïm with the song "New Soul". The first two Apple Stores opened in May 2001 under then-CEO Steve Jobs, after years of unsuccessful store-within-a-store concepts. Seeing a need for improved retail presentation of the company's products, Jobs began an effort in 1997 to revamp the retail program and build a better relationship with consumers, relaunching Apple's online store that year and hiring Ron Johnson in 2000. The media initially speculated that the stores would fail, but they exceeded the sales numbers of competing nearby stores and within three years reached US$1 billion in annual sales, the fastest any retailer had ever done so. Over the years, Apple has expanded the number of retail locations and its geographical coverage, with 499 stores across 22 countries worldwide as of December 2017. Strong product sales have placed Apple among the top-tier retail stores, with sales over $16 billion globally in 2011. Apple Stores underwent a period of significant redesign beginning in May 2016, which included physical changes such as open spaces and re-branded rooms, as well as changes in function to facilitate interaction between consumers and professionals. Many Apple Stores are located inside shopping malls, but Apple has built several stand-alone "flagship" stores in high-profile locations. It has been granted design patents and received architectural awards for its stores' designs and construction, specifically for its use of glass staircases and cubes. The success of Apple Stores has had significant influence over other consumer electronics retailers, who have lost traffic, control, and profits due to a perceived higher quality of service and products at Apple Stores. Due to the popularity of the brand, Apple receives a large number of job applications, many of which come from young workers. Although Apple Store employees receive above-average pay, are offered money toward education and health care, and receive product discounts, there are limited or no paths of career advancement. On March 16, 2020, France fined Apple €1.1 billion for colluding with two wholesalers to stifle competition and keep prices high by impeding independent resellers. The arrangement created aligned prices for Apple products such as iPads and personal computers for about half the French retail market. According to the French regulators, the abuses occurred between 2005 and 2017 but were first discovered after a complaint by an independent reseller, eBizcuss, in 2012. On August 13, 2020, Epic Games, the maker of the popular game Fortnite, sued both Apple and Google after Fortnite was removed from Apple's and Google's app stores. The lawsuits came after the two companies blocked the game when it introduced a direct payment system that bypassed the fees that Apple and Google had imposed.
In September 2020, Epic Games founded the Coalition for App Fairness together with thirteen other companies, which aims for better conditions for the inclusion of apps in app stores. In December 2020, Facebook agreed to assist Epic in its legal battle against Apple, planning to support the company by providing materials and documents. Facebook stated, however, that it would not participate directly in the lawsuit, although it did commit to helping with the discovery of evidence relating to the 2021 trial. In the months before the agreement, Facebook had been feuding with Apple over the prices of paid apps and privacy rule changes. Commenting on the full-page ads placed by Facebook in various newspapers in December 2020, Facebook's head of ad products Dan Levy said, "this is not really about privacy for them, this is about an attack on personalized ads and the consequences it's going to have on small-business owners." Apple has publicly taken a pro-privacy stance, actively making privacy-conscious features and settings part of its conferences, promotional campaigns, and public image. With its iOS 8 mobile operating system in 2014, the company started encrypting all contents of iOS devices through users' passcodes, making it impossible at the time for the company to provide customer data in response to law enforcement requests seeking such information. With the rising popularity of cloud storage solutions, Apple in 2016 began performing deep-learning scans for facial data in photos on the user's local device and encrypting the content before uploading it to Apple's iCloud storage system. It also introduced "differential privacy", a way to collect crowdsourced data from many users while keeping individual users anonymous, in a system that Wired described as "trying to learn as much as possible about a group while learning as little as possible about any individual in it" (a simplified sketch of the underlying idea appears below). Users are explicitly asked if they want to participate, and can actively opt in or opt out. However, Apple has aided law enforcement in criminal investigations by providing iCloud backups of users' devices, and the company's commitment to privacy has been questioned because of its efforts to promote biometric authentication technology in its newer iPhone models, which does not have the same level of constitutional protection as a passcode in the United States. With an update to iOS 14, Apple required all developers of iPhone, iPad, and iPod Touch applications to explicitly ask users for permission to track them. The feature, called "App Tracking Transparency", received heavy criticism from Facebook, whose primary business model revolves around tracking users' data and sharing such data with advertisers so users can see more relevant ads, a technique commonly known as targeted advertising. After Facebook's measures, including purchasing full-page newspaper advertisements protesting App Tracking Transparency, Apple released the update in early 2021. A study by Verizon subsidiary Flurry Analytics reported that only 4% of iOS users in the United States and 12% worldwide had opted into tracking. Prior to the release of iOS 15, Apple announced new efforts at combating child sexual abuse material on iOS and Mac platforms. Parents of minor iMessage users can be alerted if their child sends or receives nude photographs.
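The randomized-response technique that underlies many differential-privacy systems can be illustrated with a short sketch. The code below is a generic, simplified illustration of the idea described above, not Apple's implementation; the probability values and function names are hypothetical.

```python
import random

def randomized_response(true_value: bool, p_truth: float = 0.75) -> bool:
    """Report the true value with probability p_truth; otherwise report a fair coin flip.

    Any single report is plausibly deniable, yet the population-level rate
    can still be estimated from many noisy reports.
    """
    if random.random() < p_truth:
        return true_value
    return random.random() < 0.5

def estimate_true_rate(reports: list, p_truth: float = 0.75) -> float:
    """Invert the known noise process to recover an estimate of the true rate."""
    observed = sum(reports) / len(reports)
    # observed = p_truth * true_rate + (1 - p_truth) * 0.5
    return (observed - (1 - p_truth) * 0.5) / p_truth

if __name__ == "__main__":
    # Simulate 100,000 users, 30% of whom have the attribute being measured.
    population = [random.random() < 0.3 for _ in range(100_000)]
    reports = [randomized_response(v) for v in population]
    print(f"Estimated rate: {estimate_true_rate(reports):.3f}")  # close to 0.30
```

In a real deployment the noise is added on the device before anything is transmitted, so the collector only ever sees the already-randomized reports.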
Additionally, on-device hashing would take place on media destined for upload to iCloud, and hashes would be compared to a list of known abusive images provided by law enforcement; if enough matches were found, Apple would be alerted and authorities informed. The new features received praise from law enforcement and victims' rights advocates. However, privacy advocates, including the Electronic Frontier Foundation, condemned the new features as invasive and highly prone to abuse by authoritarian governments. Ireland's Data Protection Commission launched a privacy investigation to examine whether Apple's processing of personal data for targeted ads on its platform complied with the EU's GDPR. In December 2019, security researcher Brian Krebs discovered that the iPhone 11 Pro would still show the arrow indicator (signifying that location services are in use) at the top of the screen while the main location services toggle was enabled, even when all individual location services were disabled. Krebs was unable to replicate this behavior on older models; when he asked Apple for comment, he was told that "It is expected behavior that the Location Services icon appears in the status bar when Location Services is enabled. The icon appears for system services that do not have a switch in Settings." Apple later clarified that this behavior ensured compliance with ultra-wideband regulations in specific countries, a technology Apple started implementing in iPhones with the iPhone 11 Pro, and emphasized that "the management of ultra wideband compliance and its use of location data is done entirely on the device and Apple is not collecting user location data." Will Strafach, an executive at security firm Guardian Firewall, confirmed the lack of evidence that location data was sent off to a remote server. Apple promised to add a new toggle for this feature, and in later iOS revisions provided users with the option to tap the location services indicator in Control Center to see which specific service is using the device's location. According to published reports by Bloomberg News on March 30, 2022, Apple turned over data such as phone numbers, physical addresses, and IP addresses to hackers posing as law enforcement officials using forged documents; the requests sometimes included forged signatures of real or fictional officials. When asked about the allegations, an Apple representative referred the reporter to a section of the company policy for law enforcement guidelines, which stated, "We review every data request for legal sufficiency and use advanced systems and processes to validate law enforcement requests and detect abuse." Corporate affairs Apple is one of several highly successful companies founded in the 1970s that bucked the traditional notions of corporate culture. Jobs often walked around the office barefoot even after Apple became a Fortune 500 company. By the time of the "1984" television advertisement, Apple's informal culture had become a key trait that differentiated it from its competitors.
According to a 2011 report in Fortune, this has resulted in a corporate culture more akin to that of a startup than of a multinational corporation. In a 2017 interview, Wozniak credited watching Star Trek and attending Star Trek conventions in his youth as inspiration for co-founding Apple. As the company has grown and been led by a series of chief executives with differing views, some media outlets have suggested that it has lost some of its original character. Nonetheless, it has maintained a reputation for fostering individuality and excellence that reliably attracts talented workers, particularly after Jobs returned. Numerous Apple employees have stated that projects without Jobs's involvement often took longer than others. The Apple Fellows program honors employees for extraordinary technical or leadership contributions to personal computing. Recipients include Bill Atkinson, Steve Capps, Rod Holt, Alan Kay, Guy Kawasaki, Al Alcorn, Don Norman, Rich Page, Steve Wozniak, and Phil Schiller. Jobs intended employees to be specialists who were not exposed to functions outside their area of expertise. For instance, Ron Johnson, Senior Vice President of Retail Operations until November 1, 2011, was responsible for site selection, in-store service, and store layout, yet had no control over the inventory in his stores; that was handled by Tim Cook, who had a background in supply-chain management. Apple is known for strictly enforcing accountability: each project has a "directly responsible individual", or "DRI" in Apple jargon. Unlike other major US companies, Apple provides a relatively simple compensation policy for executives that does not include perks enjoyed by other CEOs, such as country club fees or private use of company aircraft. The company typically grants stock options to executives every other year. In 2015, Apple had 110,000 full-time employees. This increased to 116,000 full-time employees the next year, a notably slower pace of hiring, largely due to the company's first revenue decline. Apple does not specify how many of its employees work in retail, though its 2014 SEC filing put the number at approximately half of its employee base. In September 2017, Apple announced that it had over 123,000 full-time employees. Apple has a strong culture of corporate secrecy, and has an anti-leak Global Security team that recruits from the National Security Agency, the Federal Bureau of Investigation, and the United States Secret Service. In December 2017, Glassdoor said Apple was the 48th best place to work, having originally entered the rankings at 19 in 2009, peaking at 10 in 2012, and falling down the ranks in subsequent years. In 2023, Bloomberg's Mark Gurman revealed the existence of Apple's Exploratory Design Group (XDG), which was working to add glucose monitoring to the Apple Watch. Gurman compared XDG to Alphabet's X "moonshot factory". Apple's world corporate headquarters are located in Cupertino, in the middle of California's Silicon Valley, at Apple Park, a massive circular groundscraper building with a circumference of one mile (1.6 km). The building opened in April 2017 and houses more than 12,000 employees. Apple co-founder Steve Jobs wanted Apple Park to look less like a business park and more like a nature refuge, and personally appeared before the Cupertino City Council in June 2011 to make the proposal, in his final public appearance before his death.
Apple also operates from the Apple Campus (also known by its address, 1 Infinite Loop), a grouping of six buildings in Cupertino totaling 850,000 square feet (79,000 m2), located about 1 mile (1.6 km) to the west of Apple Park. The Apple Campus was the company's headquarters from its opening in 1993 until the opening of Apple Park in 2017. The buildings, located at 1–6 Infinite Loop, are arranged in a circular pattern around a central green space, in a design that has been compared to that of a university. In addition to Apple Park and the Apple Campus, Apple occupies about thirty other office buildings scattered throughout the city of Cupertino, including three buildings that served as prior headquarters: Stephens Creek Three from 1977 to 1978, Bandley One from 1978 to 1982, and Mariani One from 1982 to 1993. In total, Apple occupies almost 40% of the available office space in the city. Apple's headquarters for Europe, the Middle East and Africa (EMEA) are located at the Hollyhill campus in Cork, in the south of Ireland. The facility, which opened in 1980, houses 5,500 people and was Apple's first location outside of the United States. Apple's international sales and distribution arms operate out of the campus in Cork. Apple has two campuses near Austin, Texas: a 216,000-square-foot (20,100 m2) campus, opened in 2014, houses 500 engineers who work on Apple silicon, and a 1.1-million-square-foot (100,000 m2) campus, opened in 2021, houses 6,000 people working in technical support, supply chain management, online store curation, and Apple Maps data management. The company also has several other locations in Boulder, Colorado; Culver City, California; Herzliya, Israel; London; New York; Pittsburgh; San Diego; and Seattle, each employing hundreds of people. Apple has been a participant in various legal proceedings and claims since it began operation. In particular, Apple is known for and promotes itself as actively and aggressively enforcing its intellectual property interests. Some litigation examples include Apple v. Samsung, Apple v. Microsoft, Motorola Mobility v. Apple Inc., and Apple Corps v. Apple Computer. Apple has also had to defend itself on numerous occasions against charges of violating intellectual property rights. Most of these claims have been brought by shell companies known as patent trolls, with no evidence of actual use of the patents in question, and have been dismissed in the courts. On December 21, 2016, Nokia announced that it had filed suit against Apple in the US and Germany, claiming that Apple's products infringed on Nokia's patents. In November 2017, the United States International Trade Commission announced an investigation into allegations of patent infringement regarding Apple's remote desktop technology; Aqua Connect, a company that builds remote desktop software, claimed that Apple infringed on two of its patents. Epic Games filed a lawsuit against Apple in August 2020 in the United States District Court for the Northern District of California, related to Apple's practices in the iOS App Store. In January 2022, Ericsson sued Apple over royalty payments for 5G technology. On June 24, 2024, the European Commission accused Apple of violating the Digital Markets Act by preventing "app developers from freely steering consumers to alternative channels for offers and content". In April 2025, Apple was found to be in violation of the Digital Markets Act and fined €500 million ($570 million).
In 2025, Apple was one of the donors that funded the demolition of the White House's East Wing and the planned construction of a ballroom there. Finances As of 2024, Apple was the world's fourth-largest personal computer vendor, the largest vendor of tablet computers, and the largest vendor of mobile phones. It is a Big Tech company. In its fiscal year ending in September 2011, Apple reported a total of $108 billion in annual revenues (a significant increase from its 2010 revenues of $65 billion) and nearly $82 billion in cash reserves. On March 19, 2012, Apple announced plans for a $2.65-per-share dividend beginning in the fourth quarter of 2012, subject to approval by its board of directors. The company's worldwide annual revenue in 2013 totaled $170 billion. In May 2013, Apple entered the top ten of the Fortune 500 list of companies for the first time, rising 11 places above its 2012 ranking to take the sixth position. As of 2016, Apple had around US$234 billion of cash and marketable securities, of which 90% was located outside the United States for tax purposes. Apple amassed 65% of all profits made by the eight largest worldwide smartphone manufacturers in the first quarter of 2014, according to a report by Canaccord Genuity; in the first quarter of 2015, the company garnered 92% of all such earnings. On April 30, 2017, The Wall Street Journal reported that Apple had cash reserves of $250 billion, a figure Apple officially confirmed as $256.8 billion a few days later. As of August 3, 2018, Apple was the largest publicly traded corporation in the world by market capitalization. On August 2, 2018, Apple became the first publicly traded US company to reach a $1 trillion market value, and, as of October 2025, it is valued at just over $4 trillion. Apple was ranked No. 4 on the 2018 Fortune 500 rankings of the largest United States corporations by revenue. In July 2022, Apple reported an 11% decline in Q3 profits compared to 2021. Its revenue in the same period rose 2% year-on-year to $83 billion, though this growth rate was far below the 36% increase recorded a year earlier. The general downturn was reportedly caused by the slowing global economy and supply chain disruptions in China. That year, Apple was one of the largest corporate spenders on research and development worldwide, with R&D expenditure amounting to over $27 billion. In May 2023, Apple reported a decline in sales for the first quarter of 2023: revenue fell by 3% compared with the same quarter of 2022, the company's second consecutive quarter of declining sales. The fall was attributed to the slowing economy and to consumers putting off purchases of iPads and computers due to increased pricing. However, iPhone sales held up with a year-on-year increase of 1.5%; according to Apple, demand for the devices was strong, particularly in Latin America and South Asia. Apple has created subsidiaries in low-tax places such as Ireland, the Netherlands, Luxembourg, and the British Virgin Islands to cut the taxes it pays around the world. According to The New York Times, in the 1980s Apple was among the first tech companies to designate overseas salespeople in high-tax countries in a manner that allowed the company to sell on behalf of low-tax subsidiaries on other continents, sidestepping income taxes. In the late 1980s, Apple was a pioneer of an accounting technique known as the "Double Irish with a Dutch sandwich", which reduces taxes by routing profits through Irish subsidiaries and the Netherlands and then to the Caribbean.
British Conservative Party Member of Parliament Charlie Elphicke published research on October 30, 2012, which showed that some multinational companies, including Apple, were making billions of pounds of profit in the UK but were paying an effective tax rate to the UK Treasury of only 3 percent, well below standard corporate tax rates. He followed this research by calling on the Chancellor of the Exchequer, George Osborne, to force these multinationals, which also included Google and The Coca-Cola Company, to state the effective rate of tax they pay on their UK revenues. Elphicke also said that government contracts should be withheld from multinationals that do not pay their fair share of UK tax. According to a US Senate report on the company's offshore tax structure concluded in May 2013, Apple had held billions of dollars in profits in Irish subsidiaries, paying little or no tax to any government through an unusual global tax structure. The main subsidiary, a holding company that includes Apple's retail stores throughout Europe, had not paid any corporate income tax in the previous five years. "Apple has exploited a difference between Irish and U.S. tax residency rules", the report said. On May 21, 2013, Apple CEO Tim Cook defended his company's tax tactics at a Senate hearing. Apple says that it is the single largest taxpayer in the US, with an effective tax rate of approximately 26% as of Q2 FY2016. In an interview with the German newspaper FAZ in October 2017, Tim Cook stated that Apple was the biggest taxpayer worldwide. In 2016, after a two-year investigation, the European Commission claimed that Apple's use of a hybrid Double Irish tax arrangement constituted "illegal state aid" from Ireland, and ordered Apple to pay €13 billion ($14.5 billion) in unpaid taxes, the largest corporate tax fine in history. The order was later annulled after the European General Court ruled that the Commission had provided insufficient evidence. In 2018, Apple repatriated $285 billion to the United States, resulting in a $38-billion tax payment spread over the following eight years. Apple is a partner of Product Red, a fundraising campaign for AIDS charities. In November 2014, Apple arranged for all App Store revenue in a two-week period to go to the fundraiser, generating more than US$20 million, and in March 2017 it released an iPhone 7 with a red color finish. As of 2021, Apple has donated over $250 million to Product Red. Apple contributes financially to fundraisers in times of natural disasters. In November 2012, it donated $2.5 million to the American Red Cross to aid relief efforts after Hurricane Sandy, and in 2017 it donated $5 million to relief efforts for Hurricanes Irma and Harvey as well as for the 2017 Central Mexico earthquake. The company has used its iTunes platform to encourage donations in the wake of environmental disasters and humanitarian crises, such as the 2010 Haiti earthquake, the 2011 Japan earthquake, Typhoon Haiyan in the Philippines in November 2013, and the 2015 European migrant crisis. Apple emphasizes that it does not incur any processing or other fees for iTunes donations, sending 100% of the payments directly to relief efforts, though it also acknowledges that the Red Cross does not receive any personal information on the donating users and that the payments may not be tax deductible. On April 14, 2016, Apple and the World Wide Fund for Nature (WWF) announced a partnership to "help protect life on our planet".
Apple released a special page in the iTunes App Store, Apps for Earth. Under the arrangement, Apple committed that through April 24, WWF would receive 100% of the proceeds from the participating applications, from both purchases of paid apps and in-app purchases. Apple and WWF's Apps for Earth campaign raised more than $8 million in total proceeds to support WWF's conservation work; WWF announced the results at WWDC 2016 in San Francisco. During the COVID-19 pandemic, Apple's CEO Cook announced that the company would be donating "millions" of masks to health workers in the United States and Europe. On January 13, 2021, Apple announced a $100-million Racial Equity and Justice Initiative to help combat institutional racism worldwide after the 2020 murder of George Floyd. In June 2023, Apple announced a doubling of this commitment and went on to distribute more than $200 million to support organizations focused on education, economic growth, and criminal justice; half of the funding consists of philanthropic grants and half is centered on equity. Environment Apple Energy, LLC is a wholly owned subsidiary of Apple that sells solar energy. As of June 6, 2016, Apple's solar farms in California and Nevada provided 217.9 megawatts of solar generation capacity. Apple has received regulatory approval to construct a landfill gas energy plant in North Carolina to use methane emissions to generate electricity. Apple's North Carolina data center is already powered entirely by renewable sources. In 2010, Climate Counts, a nonprofit organization dedicated to directing consumers toward the greenest companies, gave Apple a score of 52 points out of a possible 100, which placed Apple in its top category, "Striding". This was an increase from May 2008, when Climate Counts gave Apple only 11 points out of 100, placing the company last among electronics companies; at that time Climate Counts also labeled Apple with a "stuck icon", adding that Apple was "a choice to avoid for the climate-conscious consumer". Following a Greenpeace protest, Apple released a statement on April 17, 2012, committing to ending its use of coal and shifting to 100% renewable clean energy. By 2013, Apple was using 100% renewable energy to power its data centers, and overall 75% of the company's power came from clean renewable sources. In May 2015, Greenpeace evaluated the state of the Green Internet and commended Apple on its environmental practices, saying, "Apple's commitment to renewable energy has helped set a new bar for the industry, illustrating in very concrete terms that a 100% renewable Internet is within its reach, and providing several models of intervention for other companies that want to build a sustainable Internet." As of 2016, Apple states that 100% of its US operations, 100% of its data centers, and 93% of its global operations run on renewable energy. However, the facilities are connected to the local grid, which usually contains a mix of fossil and renewable sources, so Apple carbon-offsets its electricity use. The Electronic Product Environmental Assessment Tool (EPEAT) allows consumers to see the effect a product has on the environment. Each product receives a Gold, Silver, or Bronze rank depending on its efficiency and sustainability. Every Apple tablet, notebook, desktop computer, and display that EPEAT ranks achieves a Gold rating, the highest possible.
Although Apple's data centers recycle water 35 times, increased activity across its retail, corporate, and data center operations raised water use to 573 million US gal (2.2 million m3) in 2015. During an event on March 21, 2016, Apple provided a status update on its environmental initiative to be 100% renewable in all of its worldwide operations. Lisa P. Jackson, Apple's vice president of Environment, Policy and Social Initiatives, who reports directly to CEO Tim Cook, announced that as of March 2016, 93% of Apple's worldwide operations were powered by renewable energy. Also featured were the company's efforts to use sustainable paper in its product packaging: 99% of all paper used by Apple in product packaging comes from post-consumer recycled paper or sustainably managed forests, as the company continues its move to all-paper packaging for all of its products. Apple announced on August 16, 2016, that Lens Technology, one of its major suppliers in China, had committed to powering all of its glass production for Apple with 100 percent renewable energy by 2018. The commitment was a large step in Apple's efforts to help manufacturers lower their carbon footprint in China. Apple also announced that all 14 of its final assembly sites in China were compliant with UL's Zero Waste to Landfill validation. The standard, which started in January 2015, certifies that all manufacturing waste is reused, recycled, composted, or converted into energy (when necessary). Since the program began, nearly 140,000 metric tons of waste have been diverted from landfills. On July 21, 2020, Apple announced its plan to become carbon neutral across its entire business, manufacturing supply chain, and product life cycle by 2030. Over the following ten years, Apple planned to lower emissions through a series of actions, including low-carbon product design, expanded energy efficiency, renewable energy, process and material innovations, and carbon removal. In June 2024, the United States Environmental Protection Agency (EPA) published a report about an electronic computer manufacturing facility in Santa Clara, California, leased by Apple in 2015 and code-named Aria, stating that Apple was potentially in violation of federal regulations under the Resource Conservation and Recovery Act (RCRA). According to a 2018 report from Bloomberg, the facility is used to develop microLED screens under the code name T159. The EPA had inspected the facility in August 2023 following a tip from a former Apple employee who posted the report on X; the inspection found that Apple was potentially mistreating waste as subject only to California regulations and that it had potentially miscalculated the effectiveness of its activated carbon filters, which remove volatile organic compounds (VOCs) from the air. Following further campaigns by Greenpeace, in 2008 Apple became the first electronics manufacturer to eliminate all polyvinyl chloride (PVC) and brominated flame retardants (BFRs) in its complete product line. In June 2007, Apple began replacing the cold cathode fluorescent lamp (CCFL) backlit LCD displays in its computers with mercury-free LED-backlit LCD displays and arsenic-free glass, starting with the upgraded MacBook Pro.
Apple publishes comprehensive and transparent information about the carbon dioxide equivalent (CO2e) emissions, materials, and electricity usage of every product it currently produces or has sold in the past (where it has enough data to produce such a report) in the environmental report portfolio on its website, allowing consumers to make informed purchasing decisions about the products it offers for sale. In June 2009, Apple's iPhone 3GS was free of PVC, arsenic, and BFRs. Since 2009, all Apple products have had mercury-free LED-backlit LCD displays, arsenic-free glass, and non-PVC cables. All Apple products have EPEAT Gold status and beat the latest Energy Star guidelines in each product's respective regulatory category. In November 2011, Apple was featured in Greenpeace's Guide to Greener Electronics, which ranks electronics manufacturers on sustainability, climate and energy policy, and how "green" their products are. The company ranked fourth of fifteen electronics companies (moving up five places from the previous year) with a score of 4.6/10. Greenpeace praised Apple's sustainability, noting that the company exceeded its 70% global recycling goal in 2010. Apple continues to score well on product ratings, with all of its products now free of PVC plastic and BFRs. However, the guide criticized Apple on the Energy criteria for not seeking external verification of its greenhouse gas emissions data and for not setting any targets to reduce emissions. In January 2012, Apple requested that its cable maker, Volex, begin producing halogen-free USB and power cables. In February 2016, Apple issued a US$1.5-billion green bond (climate bond), the first of its kind by a US tech company. The green bond proceeds are dedicated to the financing of environmental projects. Supply chain Apple products were made in the United States in Apple-owned factories until the late 1990s; however, as a result of outsourcing initiatives in the 2000s, almost all of its manufacturing is now handled abroad. According to a report by The New York Times, Apple insiders "believe the vast scale of overseas factories, as well as the flexibility, diligence and industrial skills of foreign workers, have so outpaced their American counterparts that 'Made in the U.S.A.' is no longer a viable option for most Apple products". The company's manufacturing, procurement, and logistics enable it to execute massive product launches without having to maintain large, profit-sapping inventories. In 2011, Apple's profit margins were 40 percent, compared with between 10 and 20 percent for most other hardware companies. Cook's catchphrase to describe his focus on the company's operational arm is: "Nobody wants to buy sour milk." In May 2017, the company announced a $1-billion funding project for "advanced manufacturing" in the United States, and subsequently invested $200 million in Corning Inc., the manufacturer of the toughened Gorilla Glass used in Apple's iPhones. The following December, Apple's chief operating officer, Jeff Williams, told CNBC that the "$1 billion" amount was "absolutely not" the final limit on its spending, elaborating that "We're not thinking in terms of a fund limit... We're thinking about, where are the opportunities across the U.S. to help nurture companies that are making the advanced technology — and the advanced manufacturing that goes with that — that quite frankly is essential to our innovation."
During the Mac's early history, Apple generally refused to adopt prevailing industry standards for hardware, instead creating its own. This trend was largely reversed in the late 1990s, beginning with Apple's adoption of the PCI bus in the 7500/8500/9500 Power Macs. Apple has since joined industry standards groups to influence the future direction of technology standards such as USB, AGP, HyperTransport, Wi-Fi, NVMe, PCIe, and others in its products. FireWire is an Apple-originated standard that was widely adopted across the industry after it was standardized as IEEE 1394, and it is a legally mandated port in all cable TV boxes in the United States. Apple has gradually expanded its efforts to get its products into the Indian market. In July 2012, during a conference call with investors, CEO Tim Cook said that he "[loves] India", but that Apple saw larger opportunities outside the region. India's requirement that 30% of products sold be manufactured in the country was described as something that "really adds cost to getting product to market". In May 2016, Apple opened an iOS app development center in Bangalore and a maps development office for 4,000 staff in Hyderabad. In March 2017, The Wall Street Journal reported that Apple would begin manufacturing iPhone models in India "over the next two months", and in May the Journal wrote that an Apple manufacturer had begun production of the iPhone SE in the country, while Apple told CNBC that the manufacturing was for a "small number" of units. In April 2019, Apple initiated manufacturing of the iPhone 7 at its Bengaluru facility, with demand from local customers in mind, even as it sought more incentives from the government of India. At the beginning of 2020, Tim Cook announced that Apple planned to open its first physical outlet in India in 2021, with an online store to be launched by the end of the year. The opening of the Apple Store was postponed and finally took place in April 2023, while the online store launched in September 2020. Apple directly employs 147,000 workers, including 25,000 corporate employees at Apple Park and across Silicon Valley. The vast majority of its employees work at the over 500 retail Apple Stores globally. Apple relies on a larger, outsourced workforce for manufacturing, particularly in China, where Apple directly employs 10,000 workers across its retail and corporate divisions. In addition, a further one million workers are contracted by Apple's suppliers, including Foxconn and Pegatron, to assemble Apple products. The Zhengzhou Technology Park alone employs 350,000 Chinese workers to work exclusively on the iPhone. As of 2021, Apple uses hardware components from 43 different countries. The majority of assembly is done by the Taiwanese original design manufacturers Foxconn, Pegatron, Wistron, and Compal Electronics, in factories primarily located inside China and, to a lesser extent, at Foxconn plants in Brazil and India. Apple workers around the globe have been involved in organizing since the 1990s. Apple unions are made up of retail, corporate, and outsourced workers. Apple employees have joined trade unions or formed works councils in Australia, France, Germany, Italy, Japan, the United Kingdom, and the United States. In 2021, Apple Together, a solidarity union, sought to bring together the company's global worker organizations.
The majority of industrial labor disputes (including union recognition) involving Apple occur indirectly through its suppliers and contractors, notably Foxconn plants in China and, to a lesser extent, in Brazil and India. In 2019, Apple was named as a defendant in a forced labour and child slavery lawsuit by Congolese families of children injured and killed in cobalt mines owned by Glencore and Zhejiang Huayou Cobalt, which supply battery materials to Apple and other companies. In April 2024, lawyers representing the Democratic Republic of the Congo notified Apple of evidence that Apple may be sourcing minerals from conflict areas of eastern Congo. Apple's policies and documentation describe mitigation efforts against conflict minerals; however, the lawyers identified discrepancies in supplier reporting as well as a Global Witness report describing a lack of "meaningful mitigation" on Apple's part. In December 2024, the DRC filed a lawsuit against Apple's European subsidiaries, accusing them of using conflict minerals. In response, Apple said it "strongly disputes" these allegations and insisted it is "deeply committed to responsible sourcing" of the minerals it uses. See also Notes References Further reading External links
========================================
[SOURCE: https://en.wikipedia.org/wiki/Autodesk_3ds_Max] | [TOKENS: 1550]
Contents Autodesk 3ds Max Autodesk 3ds Max, formerly 3D Studio and 3D Studio Max, is a professional 3D computer graphics program for making 3D animations, models, games and images. It is developed and produced by Autodesk Media and Entertainment. It has modeling capabilities and a flexible plugin architecture and must be used on the Microsoft Windows platform. It is frequently used by video game developers, many TV commercial studios, and architectural visualization studios. It is also used for movie effects and movie pre-visualization. 3ds Max features shaders (such as ambient occlusion and subsurface scattering), dynamic simulation, particle systems, radiosity, normal map creation and rendering, global illumination, a customizable user interface, and its own scripting language. History The original 3D Studio product was created for the DOS platform by the Yost Group, and published by Autodesk. The release of 3D Studio made Autodesk's previous 3D rendering package AutoShade obsolete. After 3D Studio DOS Release 4, the product was rewritten for the Windows NT platform, and renamed "3D Studio MAX." This version was also originally created by the Yost Group. It was released by Kinetix, which was at that time Autodesk's division of media and entertainment. Autodesk purchased the product at the second release update of the 3D Studio MAX version and internalized development entirely over the next two releases. Later, the product name was changed to "3ds max" (all lower case) to better comply with the naming conventions of Discreet, a Montreal-based software company which Autodesk had purchased. When it was re-released (release 7), the product was again branded with the Autodesk logo, and the short name was again changed to "3ds Max" (upper and lower case), while the formal product name became the current "Autodesk 3ds Max." Features Adoption Many films have made use of 3ds Max, or previous versions of the program under previous names, in CGI animation, such as Avatar and 2012, which contain computer generated graphics from 3ds Max alongside live-action acting. Mudbox was also used in the final texturing of the set and characters in Avatar, with 3ds Max and Mudbox being closely related. 3ds Max has been used in the development of 3D computer graphics for a number of video games. Architectural and engineering design firms use 3ds Max for developing concept art and previsualization. Educational programs at secondary and tertiary level use 3ds Max in their courses on 3D computer graphics and computer animation. Students in the FIRST competition for 3d animation are known to use 3ds Max. Modeling techniques Polygon modeling is more common with game design than any other modeling technique as the very specific control over individual polygons allows for extreme optimization. Usually, the modeler begins with one of the 3ds max primitives, and using such tools as bevel and extrude, adds detail to and refines the model. Versions 4 and up feature the Editable Polygon object, which simplifies most mesh editing operations, and provides subdivision smoothing at customizable levels (see NURMS). Version 7 introduced the edit poly modifier, which allows the use of the tools available in the editable polygon object to be used higher in the modifier stack (i.e., on top of other modifications). NURBS in 3ds Max is a legacy feature. None of the features have been updated since version 4 and have been ignored by the development teams over the past decade. 
For example, the updated path deform and the updated normalize spline modifiers in version 2018 do not work on NURBS curves anymore as they did in previous versions. An alternative to polygons, it gives a smoothed out surface that eliminates the straight edges of a polygon model. NURBS is a mathematically exact representation of freeform surfaces like those used for car bodies and ship hulls, which can be exactly reproduced at any resolution whenever needed. With NURBS, a smooth sphere can be created with only one face. The non-uniform property of NURBS brings up an important point. Because they are generated mathematically, NURBS objects have a parameter space in addition to the 3D geometric space in which they are displayed. Specifically, an array of values called knots specifies the extent of influence of each control vertex (CV) on the curve or surface. Knots are invisible in 3D space and can't be manipulated directly, but occasionally their behavior affects the visible appearance of the NURBS object. Parameter space is one-dimensional for curves, which have only a single U dimension topologically, even though they exist geometrically in 3D space. Surfaces have two dimensions in parameter space, called U and V. NURBS curves and surfaces have the important properties of not changing under the standard geometric affine transformations (Transforms), or under perspective projections. The CVs have local control of the object: moving a CV or changing its weight does not affect any part of the object beyond the neighboring CVs. (This property can be overridden by using the Soft Selection controls). Also, the control lattice that connects CVs surrounds the surface. This is known as the convex hull property. Surface tool was originally a 3rd party plugin, but Kinetix acquired and included this feature since version 3.0.[citation needed] The surface tool is for creating common 3ds Max splines, and then applying a modifier called "surface." This modifier makes a surface from every three or four vertices in a grid. It is often seen as an alternative to "mesh" or "nurbs" modeling, as it enables a user to interpolate curved sections with straight geometry (for example a hole through a box shape). Although the surface tool is a useful way to generate parametrically accurate geometry, it lacks the "surface properties" found in the similar Edit Patch modifier, which enables a user to maintain the original parametric geometry whilst being able to adjust "smoothing groups" between faces.[citation needed] Predefined primitives This is a basic method, in which one models something using only boxes, spheres, cones, cylinders and other predefined objects from the list of Predefined Standard Primitives or a list of Predefined Extended Primitives. One may also apply Boolean operations, including subtract, cut and connect. For example, one can make two spheres which will work as blobs that will connect with each other. These are called metaballs. Rendering Licensing Earlier versions (up to and including 3D Studio Max R3.1) required a special copy protection device (called a dongle) to be plugged into the parallel port while the program was run, but later versions incorporated software-based copy prevention methods instead. Current versions require online registration. Due to the high price of the commercial version of the program, Autodesk also offers a free student version, which explicitly states that it is to be used for "educational purposes only". 
The student version has identical features to the full version, but is only for single use and cannot be installed on a network. The student license expires after three years, at which time the user, if still a student, may download the latest version, thus renewing the license for another three years. In 2020, Autodesk reduced the free student license term to one year, as opposed to the previous three years. In addition, all customers seeking free access to Autodesk educational products and services are required to provide proof of enrollment, employment, or contractor status at a qualified educational institution. See also References External links
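The NURBS discussion above mentions knot vectors, control vertices (CVs), weights, and local control. As a purely illustrative sketch of how those pieces fit together, the following Python snippet evaluates a point on a NURBS curve with the standard Cox-de Boor recursion; it is generic B-spline mathematics, not 3ds Max's internal implementation or its MAXScript API, and the function names and sample data are invented for the example.

```python
# Illustrative only: evaluating one point on a NURBS curve via the Cox-de Boor
# recursion. Generic B-spline math, not code from 3ds Max.

def basis(i, p, u, knots):
    """Influence of control vertex i (degree p) at parameter u, per Cox-de Boor."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    denom = knots[i + p] - knots[i]
    if denom > 0:
        left = (u - knots[i]) / denom * basis(i, p - 1, u, knots)
    denom = knots[i + p + 1] - knots[i + 1]
    if denom > 0:
        right = (knots[i + p + 1] - u) / denom * basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_point(u, ctrl_pts, weights, knots, degree):
    """Rational (weighted) blend of CVs; only nearby CVs contribute, i.e. local control."""
    num = [0.0, 0.0, 0.0]
    den = 0.0
    for i, (pt, w) in enumerate(zip(ctrl_pts, weights)):
        b = basis(i, degree, u, knots) * w
        den += b
        num = [n + b * c for n, c in zip(num, pt)]
    return [n / den for n in num]

# A cubic curve with four CVs and a clamped knot vector (a common default).
ctrl = [[0, 0, 0], [1, 2, 0], [3, 2, 0], [4, 0, 0]]
wts = [1.0, 1.0, 1.0, 1.0]
kts = [0, 0, 0, 0, 1, 1, 1, 1]
print(nurbs_point(0.5, ctrl, wts, kts, degree=3))   # -> [2.0, 1.5, 0.0]
```

Raising the weight of a single CV pulls the curve toward that CV without disturbing regions outside the knot span that governs it, which is the local-control property described above.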
========================================
[SOURCE: https://en.wikipedia.org/wiki/Mass_production] | [TOKENS: 5256]
Contents Mass production Mass production, also known as series production, series manufacture, or continuous production, is the production of substantial amounts of standardized products in a constant flow, including and especially on assembly lines. Together with job production and batch production, it is one of the three main production methods. The term mass production was popularized by a 1926 article in the Encyclopædia Britannica supplement that was written based on correspondence with Ford Motor Company. The New York Times used the term in the title of an article that appeared before the publication of the Britannica article. The idea of mass production is applied to many kinds of products: from fluids and particulates handled in bulk (food, fuel, chemicals and mined minerals), to clothing, textiles, parts and assemblies of parts (household appliances and automobiles). Some mass production techniques, such as standardized sizes and production lines, predate the Industrial Revolution by many centuries; however, it was not until machine tools and techniques for producing interchangeable parts were developed in the mid-19th century that modern mass production became possible. Overview Mass production involves making many copies of a product, mainly by machines and very quickly, using assembly line techniques to send partially complete products to workers who each work on an individual step, rather than having a worker work on a whole product from start to finish. The emergence of mass production allowed supply to outstrip demand in many markets, forcing companies to seek new ways to become more competitive. Mass production also ties into the idea of overconsumption, the notion that humans consume too much. Mass production of fluid matter typically involves piping with centrifugal pumps or screw conveyors (augers) to transfer raw materials or partially complete products between vessels. Fluid flow processes such as oil refining and bulk materials such as wood chips and pulp are automated using a system of process control which uses various instruments to measure variables such as temperature, pressure, volumetric flow, and level, providing feedback. Bulk materials such as coal, ores, grains and wood chips are handled by belt, chain, slat, pneumatic or screw conveyors, bucket elevators and mobile equipment such as front-end loaders. Materials on pallets are handled with forklifts. Also used for handling heavy items like reels of paper, steel or machinery are electric overhead cranes, sometimes called bridge cranes because they span large factory bays. Mass production is capital-intensive and energy-intensive, for it uses a high proportion of machinery and energy in relation to workers. It is also usually automated, while total expenditure per unit of product is decreased. However, the machinery needed to set up a mass production line (such as robots and machine presses) is so expensive that in order to attain profits there must be some assurance that the product will be successful. One of the descriptions of mass production is that "the skill is built into the tool" [citation needed], which means that the worker using the tool may not need the skill. For example, in the 19th or early 20th century, this could be expressed as "the craftsmanship is in the workbench itself" (not the training of the worker). 
Rather than having a skilled worker measure every dimension of each part of the product against the plans or the other parts as it is being formed, there were jigs ready at hand to ensure that the part was made to fit this set-up. It had already been checked that the finished part would be to specifications to fit all the other finished parts—and it would be made more quickly, with no time spent on finishing the parts to fit one another. Later, once computerized control came about (for example, CNC), jigs were obviated, but it remained true that the skill (or knowledge) was built into the tool (or process, or documentation) rather than residing in the worker's head. This is the specialized capital required for mass production; each workbench and set of tools (or each CNC cell, or each fractionating column) is different (fine-tuned to its task). History Standardized parts and sizes and factory production techniques were developed in pre-industrial times; before the invention of machine tools the manufacture of precision parts, especially metal ones, was highly labour-intensive. Crossbows made with bronze parts were produced in China during the Warring States period. The Qin Emperor unified China at least in part by equipping large armies with these weapons, which were fitted with a sophisticated trigger mechanism made of interchangeable parts. The Terracotta Army guarding the Emperor's tomb is also believed to have been created through the use of standardized molds on an assembly line. In ancient Carthage, ships of war were mass-produced on a large scale at a moderate cost, allowing them to efficiently maintain their control of the Mediterranean. Many centuries later, the Republic of Venice would follow Carthage in producing ships with prefabricated parts on an assembly line: the Venetian Arsenal produced nearly one ship every day in what was effectively the world's first factory, which at its height employed 16,000 people. The invention of movable type has allowed for documents such as books to be mass produced. The first movable type system was invented in China by Bi Sheng, during the reign of the Song dynasty, where it was used to, among other things, issue paper money. The oldest extant book produced using metal type is Jikji, printed in Korea in the year 1377. Johannes Gutenberg, through his invention of the printing press and production of the Gutenberg Bible, introduced movable type to Europe. Through this introduction, mass production in the European publishing industry was made commonplace, leading to a democratization of knowledge, increased literacy and education, and the beginnings of modern science. French artillery engineer Jean-Baptiste de Gribeauval introduced the standardization of cannon design in the late 18th century. He streamlined production and management of cannonballs and cannons by limiting them to only three calibers, and he improved their effectiveness by requiring more spherical ammunition. Redesigning these weapons to use interchangeable wheels, screws, and axles simplified mass production and repair. In the Industrial Revolution, simple mass production techniques were used at the Portsmouth Block Mills in England to make ships' pulley blocks for the Royal Navy in the Napoleonic Wars. It was achieved in 1803 by Marc Isambard Brunel in cooperation with Henry Maudslay under the management of Sir Samuel Bentham. 
The first unmistakable examples of manufacturing operations carefully designed to reduce production costs by specialized labour and the use of machines appeared in the 18th century in England. The Navy was in a state of expansion that required 100,000 pulley blocks to be manufactured a year. Bentham had already achieved remarkable efficiency at the docks by introducing power-driven machinery and reorganising the dockyard system. Brunel, a pioneering engineer, and Maudslay, a pioneer of machine tool technology who in 1800 had developed the first industrially practical screw-cutting lathe, which standardized screw thread sizes for the first time and in turn allowed the application of interchangeable parts, collaborated on plans to manufacture block-making machinery. By 1805, the dockyard had been fully updated with the revolutionary, purpose-built machinery at a time when products were still built individually with different components. A total of 45 machines were required to perform 22 processes on the blocks, which could be made in one of three possible sizes. The machines were almost entirely made of metal, thus improving their accuracy and durability. The machines would make markings and indentations on the blocks to ensure alignment throughout the process. One of the many advantages of this new method was the increase in labour productivity due to the less labour-intensive requirements of managing the machinery. Richard Beamish, assistant to Brunel's son and engineer, Isambard Kingdom Brunel, wrote: So that ten men, by the aid of this machinery, can accomplish with uniformity, celerity and ease, what formerly required the uncertain labour of one hundred and ten. By 1808, annual production from the 45 machines had reached 130,000 blocks and some of the equipment was still in operation as late as the mid-twentieth century. Mass production techniques were also used, to a rather limited extent, to make clocks and watches and to make small arms, though parts were usually non-interchangeable. Though produced on a very small scale, Crimean War gunboat engines designed and assembled by John Penn of Greenwich are recorded as the first instance of the application of mass production techniques (though not necessarily the assembly-line method) to marine engineering. In filling an Admiralty order for 90 sets of his high-pressure and high-revolution horizontal trunk engine design, Penn produced them all in 90 days. He also used Whitworth Standard threads throughout. Prerequisites for the wide use of mass production were interchangeable parts, machine tools and power, especially in the form of electricity. Some of the organizational management concepts needed to create 20th-century mass production, such as scientific management, had been pioneered by other engineers (most of whom are not famous, but Frederick Winslow Taylor is one of the well-known ones), whose work would later be synthesized into fields such as industrial engineering, manufacturing engineering, operations research, and management consultancy. Although the Henry Ford Company, which was rebranded as Cadillac after Ford's departure, was later awarded the Dewar Trophy in 1908 for creating interchangeable mass-produced precision engine parts, Henry Ford downplayed the role of Taylorism in the development of mass production at his company. However, Ford management performed time studies and experiments to mechanize their factory processes, focusing on minimizing worker movements. 
The difference is that while Taylor focused mostly on the efficiency of the worker, Ford also substituted for labor by using machines, thoughtfully arranged, wherever possible. In 1807, Eli Terry was hired to produce 4,000 wooden movement clocks in the Porter Contract. At this time, the annual yield for wooden clocks did not exceed a few dozen on average. Terry had developed a milling machine in 1795, with which he perfected interchangeable parts. In 1807, Terry developed a spindle cutting machine, which could produce multiple parts at the same time. Terry hired Silas Hoadley and Seth Thomas to work the assembly line at the facilities. The Porter Contract was the first contract in history to call for the mass production of clock movements. In 1815, Terry began mass-producing the first shelf clock. Chauncey Jerome, an apprentice of Eli Terry, mass-produced up to 20,000 brass clocks annually in 1840 when he invented the cheap 30-hour OG clock. The United States Department of War sponsored the development of interchangeable parts for guns produced at the arsenals at Springfield, Massachusetts and Harpers Ferry, Virginia (now West Virginia) in the early decades of the 19th century, finally achieving reliable interchangeability by about 1850. This period coincided with the development of machine tools, with the armories designing and building many of their own. Some of the methods employed were a system of gauges for checking dimensions of the various parts and jigs and fixtures for guiding the machine tools and properly holding and aligning the work pieces. This system came to be known as armory practice or the American system of manufacturing, which spread throughout New England aided by skilled mechanics from the armories who were instrumental in transferring the technology to the sewing machine manufacturers and other industries such as machine tools, harvesting machines and bicycles. Singer Manufacturing Co., at one time the largest sewing machine manufacturer, did not achieve interchangeable parts until the late 1880s, around the same time Cyrus McCormick adopted modern manufacturing practices in making harvesting machines. During World War II, the United States mass-produced many vehicles and weapons, such as ships (e.g. Liberty ships, Higgins boats), aircraft (e.g. the North American P-51 Mustang, Consolidated B-24 Liberator, and Boeing B-29 Superfortress), jeeps (e.g. the Willys MB), trucks, tanks (e.g. the M4 Sherman), and the M2 Browning and M1919 Browning machine guns. Many vehicles were shipped in parts and later assembled on-site after transport by sea. For the ongoing energy transition, many wind turbine components and solar panels are being mass-produced. Wind turbines and solar panels are used in wind farms and solar farms, respectively. In addition, as part of ongoing climate change mitigation, large-scale carbon sequestration (through reforestation, blue carbon restoration, etc.) has been proposed. Some projects (such as the Trillion Tree Campaign) involve planting a very large number of trees. In order to speed up such efforts, fast propagation of trees may be useful, and some automated machines have been produced to allow for fast (vegetative) plant propagation. Also, for some plants that help to sequester carbon (such as seagrass), techniques have been developed to help speed up the process. Mass production benefited from the development of materials such as inexpensive steel, high-strength steel and plastics. 
Machining of metals was greatly enhanced with high-speed steel and later very hard materials such as tungsten carbide for cutting edges. Fabrication using steel components was aided by the development of electric welding and stamped steel parts, both which appeared in industry in about 1890. Plastics such as polyethylene, polystyrene and polyvinyl chloride (PVC) can be easily formed into shapes by extrusion, blow molding or injection molding, resulting in very low cost manufacture of consumer products, plastic piping, containers and parts. An influential article that helped to frame and popularize the 20th century's definition of mass production appeared in a 1926 Encyclopædia Britannica supplement. The article was written based on correspondence with Ford Motor Company and is sometimes credited as the first use of the term. Electrification of factories began very gradually in the 1890s after the introduction of a practical DC motor by Frank J. Sprague and accelerated after the AC motor was developed by Galileo Ferraris, Nikola Tesla and Westinghouse, Mikhail Dolivo-Dobrovolsky and others. Electrification of factories was fastest between 1900 and 1930, aided by the establishment of electric utilities with central stations and the lowering of electricity prices from 1914 to 1917. Electric motors were several times more efficient than small steam engines because central station generation were more efficient than small steam engines and because line shafts and belts had high friction losses. Electric motors also allowed more flexibility in manufacturing and required less maintenance than line shafts and belts. Many factories saw a 30% increase in output simply from changing over to electric motors. Electrification enabled modern mass production, as with Thomas Edison's iron ore processing plant (about 1893) that could process 20,000 tons of ore per day with two shifts, each of five men. At that time it was still common to handle bulk materials with shovels, wheelbarrows and small narrow-gauge rail cars, and for comparison, a canal digger in previous decades typically handled five tons per 12-hour day. The biggest impact of early mass production was in manufacturing everyday items, such as at the Ball Brothers Glass Manufacturing Company, which electrified its mason jar plant in Muncie, Indiana, U.S., around 1900. The new automated process used glass-blowing machines to replace 210 craftsman glass blowers and helpers. A small electric truck was used to handle 150 dozen bottles at a time where previously a hand truck would carry six dozen. Electric mixers replaced men with shovels handling sand and other ingredients that were fed into the glass furnace. An electric overhead crane replaced 36 day laborers for moving heavy loads across the factory. According to Henry Ford: The provision of a whole new system of electric generation emancipated industry from the leather belt and line shaft, for it eventually became possible to provide each tool with its own electric motor. This may seem only a detail of minor importance. In fact, modern industry could not be carried out with the belt and line shaft for a number of reasons. The motor enabled machinery to be arranged in the order of the work, and that alone has probably doubled the efficiency of industry, for it has cut out a tremendous amount of useless handling and hauling. 
The belt and line shaft were also tremendously wasteful – so wasteful indeed that no factory could be really large, for even the longest line shaft was small according to modern requirements. Also high speed tools were impossible under the old conditions – neither the pulleys nor the belts could stand modern speeds. Without high speed tools and the finer steels which they brought about, there could be nothing of what we call modern industry. Mass production was popularized in the late 1910s and 1920s by Henry Ford's Ford Motor Company, which introduced electric motors to the then-well-known technique of chain or sequential production. Ford also bought or designed and built special purpose machine tools and fixtures such as multiple spindle drill presses that could drill every hole on one side of an engine block in one operation and a multiple head milling machine that could simultaneously machine 15 engine blocks held on a single fixture. All of these machine tools were arranged systematically in the production flow and some had special carriages for rolling heavy items into machining position. Production of the Ford Model T used 32,000 machine tools. The process of prefabrication, wherein parts are created separately from the finished product, is at the core of all mass-produced construction. Early examples include movable structures reportedly utilized by Akbar the Great, and the chattel houses built by emancipated slaves on Barbados. The Nissen hut, first used by the British during World War I, married prefabrication and mass production in a way that suited the needs of the military. The simple structures, which cost little and could be erected in just a couple of hours, were highly successful: over 100,000 Nissen huts were produced during World War I alone, and they would go on to serve in other conflicts and inspire a number of similar designs. Following World War II, in the United States, William Levitt pioneered the building of standardized tract houses in 56 different locations around the country. These communities were dubbed Levittowns, and they were able to be constructed quickly and cheaply through the leveraging of economies of scale, as well as the specialization of construction tasks in a process akin to an assembly line. This era also saw the invention of the mobile home, a small prefabricated house that can be transported cheaply on a truck bed. In the modern industrialization of construction, mass production is often used for prefabrication of house components. Mass production has significantly impacted the fashion industry, particularly in the realm of fibers and materials. The advent of synthetic fibers, such as polyester and nylon, revolutionized textile manufacturing by providing cost-effective alternatives to natural fibers. This shift enabled the rapid production of inexpensive clothing, contributing to the rise of fast fashion. This reliance on mass production has raised concerns about environmental sustainability and labor conditions, spurring the need for greater ethical and sustainable practices within the fashion industry. The use of assembly lines Mass production systems for items made of numerous parts are usually organized into assembly lines. The assemblies pass by on a conveyor, or if they are heavy, hung from an overhead crane or monorail. In a factory for a complex product, rather than one assembly line, there may be many auxiliary assembly lines feeding sub-assemblies (i.e. car engines or seats) to a backbone "main" assembly line. 
A diagram of a typical mass-production factory looks more like the skeleton of a fish than a single line. Vertical integration Vertical integration is a business practice that involves gaining complete control over a product's production, from raw materials to final assembly. In the age of mass production, this caused shipping and trade problems in that shipping systems were unable to transport huge volumes of finished automobiles (in Henry Ford's case) without causing damage, and also government policies imposed trade barriers on finished units. Ford built the Ford River Rouge Complex with the idea of making the company's own iron and steel in the same large factory site where parts and car assembly took place. River Rouge also generated its own electricity. Upstream vertical integration, such as to raw materials, is away from leading technology toward mature, low-return industries. Most companies chose to focus on their core business rather than vertical integration. This included buying parts from outside suppliers, who could often produce them as cheaply or cheaper. Standard Oil, the major oil company in the 19th century, was vertically integrated partly because there was no demand for unrefined crude oil, but kerosene and some other products were in great demand. The other reason was that Standard Oil monopolized the oil industry. The major oil companies were, and many still are, vertically integrated, from production to refining and with their own retail stations, although some sold off their retail operations. Some oil companies also have chemical divisions. Lumber and paper companies at one time owned most of their timber lands and sold some finished products such as corrugated boxes. The tendency has been to divest of timber lands to raise cash and to avoid property taxes. Advantages and disadvantages The economies of mass production come from several sources. The primary cause is a reduction of non-productive effort of all types. In craft production, the craftsman must bustle about a shop, getting parts and assembling them. He must locate and use many tools many times for varying tasks. In mass production, each worker repeats one or a few related tasks that use the same tool to perform identical or near-identical operations on a stream of products. The exact tool and parts are always at hand, having been moved down the assembly line consecutively. The worker spends little or no time retrieving and/or preparing materials and tools, and so the time taken to manufacture a product using mass production is shorter than when using traditional methods. The probability of human error and variation is also reduced, as tasks are predominantly carried out by machinery; error in operating such machinery has more far-reaching consequences. A reduction in labour costs, as well as an increased rate of production, enables a company to produce a larger quantity of one product at a lower cost than using traditional, non-linear methods. However, mass production is inflexible because it is difficult to alter a design or production process after a production line is implemented. Also, all products produced on one production line will be identical or very similar, and introducing variety to satisfy individual tastes is not easy. However, some variety can be achieved by applying different finishes and decorations at the end of the production line if necessary. The starter cost for the machinery can be expensive so the producer must be sure it sells or the producers will lose a lot of money. 
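To make the point about expensive machinery concrete, here is a small, purely illustrative calculation (the figures are invented, not drawn from any real factory): a large fixed tooling cost dominates the unit cost at low volume but becomes almost negligible per unit once volume is high.

```python
# Illustrative arithmetic only; the figures below are invented.
# Average unit cost = amortized fixed tooling cost + per-unit variable cost.

def unit_cost(fixed_tooling, variable_cost, units):
    return fixed_tooling / units + variable_cost

FIXED = 5_000_000   # hypothetical cost of the line: machines, jigs, setup
VARIABLE = 20.0     # hypothetical labor and material cost per unit

for volume in (10_000, 100_000, 1_000_000):
    print(f"{volume:>9,} units -> {unit_cost(FIXED, VARIABLE, volume):7.2f} per unit")
# 10,000 units -> 520.00; 100,000 units -> 70.00; 1,000,000 units -> 25.00
```

If fewer units sell than planned, the fixed cost is spread over a smaller denominator and the per-unit cost rises sharply, which is exactly the risk the paragraph above describes.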
The Ford Model T produced tremendous affordable output but was not very good at responding to demand for variety, customization, or design changes. As a consequence Ford eventually lost market share to General Motors, who introduced annual model changes, more accessories and a choice of colors. With each passing decade, engineers have found ways to increase the flexibility of mass production systems, driving down the lead times on new product development and allowing greater customization and variety of products. Compared with other production methods, mass production can create new occupational hazards for workers. This is partly due to the need for workers to operate heavy machinery while also working close together with many other workers. Preventative safety measures, such as fire drills, as well as special training is therefore necessary to minimise the occurrence of industrial accidents. Socioeconomic impacts In the 1830s, French political thinker and historian Alexis de Tocqueville identified one of the key characteristics of America that would later make it so amenable to the development of mass production: the homogeneous consumer base. De Tocqueville wrote in his Democracy in America (1835) that "The absence in the United States of those vast accumulations of wealth which favor the expenditures of large sums on articles of mere luxury ... impact to the productions of American industry a character distinct from that of other countries' industries. [Production is geared toward] articles suited to the wants of the whole people". Mass production improved productivity, which was a contributing factor to economic growth and the decline in work week hours, alongside other factors such as transportation infrastructures (canals, railroads and highways) and agricultural mechanization. These factors caused the typical work week to decline from 70 hours in the early 19th century to 60 hours late in the century, then to 50 hours in the early 20th century and finally to 40 hours in the mid-1930s. Mass production permitted great increases in total production. Using a European crafts system into the late 19th century it was difficult to meet demand for products such as sewing machines and animal powered mechanical harvesters. By the late 1920s many previously scarce goods were in good supply. One economist has argued that this constituted "overproduction" and contributed to high unemployment during the Great Depression. Say's law denies the possibility of general overproduction and for this reason classical economists deny that it had any role in the Great Depression. Mass production allowed the evolution of consumerism by lowering the unit cost of many goods used. Mass production has been linked to the Fast Fashion Industry, often leaving the consumer with lower quality garments for a lower cost. Most fast-fashion clothing is mass-produced, which means it is typically made of cheap fabrics, such as polyester, and constructed poorly in order to keep short turnaround times to meet the demands of consumers and shifting trends. See also References From old price tables it can be deduced that the capacity of a printing press around 1600, assuming a fifteen-hour workday, was between 3,200 and 3,600 impressions per day. Further reading External links
========================================
[SOURCE: https://en.wikipedia.org/wiki/Rescue_archaeology] | [TOKENS: 427]
Contents Rescue archaeology Rescue archaeology, sometimes called commercial archaeology, preventive archaeology, salvage archaeology, contract archaeology, developer-funded archaeology, or compliance archaeology, is state-sanctioned, archaeological survey and excavation carried out as part of the planning process in advance of construction or other land development. In Western Europe, excavation is the final stage in a sequence of activities that start with desk-based assessments of the archaeological potential through exploratory fieldwork: monument surveys, test pitting, shovel pitting, evaluations, and so forth. Other, less common causes for salvage digs can be looting and illegal construction. Conditions leading to rescue archaeology could include, but are not limited to, road and other major construction, the floodplain of a proposed dam, or even before the onset of war. Unlike traditional survey and excavation, rescue archaeology must be undertaken at speed. Rescue archaeology is included in the broader categories of cultural resource management (CRM) and cultural heritage management (CHM). Background Rescue archaeology occurs on sites about to be destroyed but, on occasion, may include in situ preservation of any finds or protective measures taken to preserve an unexcavated site beneath a building. Urban areas with many overlaid years of habitation are often candidates for rescue archaeology. The focus of early work was to set up organisations to undertake rescue excavations shortly before an area was disturbed by construction equipment. Archaeologists relied on the goodwill of the developer to provide the opportunity to record remains. In the present day, an archaeological survey may be required by planning process or building law, as with the National Planning Policy Framework (NPPF) in the UK. Common conditions required by planning authorities are archaeological field surveys, watching briefs, shovel test pits, trial trenching, and excavation. Guidance and standards of practice in the UK are largely monitored through the Chartered Institute for Archaeologists (CIfA). Contract or commercial archaeology services have sprung up to meet the needs of developers and to comply with local laws and planning regulations. See also References Further reading External links Media related to Rescue archaeology at Wikimedia Commons
========================================
[SOURCE: https://en.wikipedia.org/wiki/Magnetic_field_of_Mars] | [TOKENS: 1760]
Contents Magnetic field of Mars The magnetic field of Mars is the magnetic field generated from Mars's interior. Today, Mars does not have a global magnetic field. However, Mars did power an early dynamo that produced a strong magnetic field 4 billion years ago, comparable to Earth's present surface field. After the early dynamo ceased, a weak late dynamo was reactivated (or persisted up to) ~3.8 billion years ago. The distribution of Martian crustal magnetism is similar to the Martian dichotomy. Whereas the Martian northern lowlands are largely unmagnetized, the southern hemisphere possesses strong remanent magnetization, showing alternating stripes. Scientific understanding of the evolution of the magnetic field of Mars is based on the combination of satellite measurements and Martian ground-based magnetic data. Crustal magnetism The reconstruction of the Martian global crustal magnetism is mainly based on magnetic field measurements from the Mars Global Surveyor (MGS) magnetic field experiment/electron reflectometer (MAG/ER) and Mars Atmosphere and Volatile Evolution (MAVEN) magnetic-field data. However, these satellites are located at altitudes of 90–6000 km and have spatial resolutions of ≥160 km, so the measured magnetization cannot observe crustal magnetic fields at shorter length scales. Mars currently does not sustain an active dynamo based on the Mars Global Surveyor (MGS) and Mars Atmosphere and Volatile Evolution (MAVEN) magnetic field measurements. The satellite data show that the older (~4.2–4.3 billion years, Ga) southern-hemisphere crust records strong remanent magnetization (~22 nT), but the younger northern lowlands have a much weaker or zero remanent magnetization. The large basins formed during the Late Heavy Bombardment (LHB) (~ 4.1–3.9 Ga) (e.g., Argyre, Hellas, and Isidis) and volcanic provinces (e.g., Elysium, Olympus Mons, Tharsis Montes, and Alba Patera) lack magnetic signatures, but the younger Noachian and Hesperian volcanoes (e.g., Tyrrhenus Mons and Syrtis Major) have crustal remanence. The Interior Exploration using Seismic Investigations, Geodesy and Heat Transport (InSight) mission measured the crustal field at the Insight landing site located in Elysium Planitia to be ~2 μT. This detailed ground-level data is an order of magnitude higher than satellite-based estimates of ~200 nT at the InSight landing site. The source of this high magnetization is suggested to be Noachian basement (~3.9 Ga) beneath the Early Amazonian and Hesperian flows (~3.6 and 1.5 Ga). Paleomagnetism Martian meteorites enable estimates of Mars's paleofield based on the thermal remanent magnetization (or TRM) (i.e., the remanent magnetization acquired when the meteorite cooled below the Curie temperature in the presence of the ambient magnetic field). The thermal remanent magnetization of carbonates in meteorite ALH84001 revealed that the early (4.1–3.9 Ga) Martian magnetic field was ~50 μT, much higher than the modern field, suggesting that a Martian dynamo was present until at least this time. Younger (~1.4 Ga) Martian Nakhlite meteorite Miller Range (MIL) 03346 recorded a paleofield of only ~5 μT. However, given the possible source locations of the Nakhlite meteorite, this paleointensity still suggests that the surface magnetization is stronger than the magnetic fields estimated from satellite measurements. The ~5 μT paleofield of this meteorite can be explained either by a late active dynamo or the field generated from lava flows emplaced in the absence of a late Martian dynamo. 
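Part of the gap between the ~2 μT InSight measured at ground level and the ~200 nT inferred from orbit is simple geometry: the field of a compact magnetized body falls off roughly with the cube of distance, so small-scale crustal signals fade quickly with altitude. The sketch below illustrates only that scaling argument; the source depth, surface field, and pure-dipole assumption are hypothetical, and real crustal sources are spatially extended, so their fields attenuate more slowly than this toy model suggests.

```python
# Crude illustration of why orbital data underestimate small-scale crustal fields:
# a compact magnetized body looks roughly dipolar from far away, and a dipole field
# falls off as 1/r^3. All numbers are illustrative, not mission data.

def field_at_altitude(b_surface_nT, source_depth_km, altitude_km):
    """Scale the surface field of a buried dipole-like source up to a given altitude."""
    r_surface = source_depth_km
    r_orbit = source_depth_km + altitude_km
    return b_surface_nT * (r_surface / r_orbit) ** 3

B_GROUND = 2000.0   # ~2 uT at the surface, the order of InSight's measurement
DEPTH = 25.0        # assumed depth of the magnetized body, km (hypothetical)

for alt in (0.0, 90.0, 400.0):
    print(f"altitude {alt:5.0f} km -> {field_at_altitude(B_GROUND, DEPTH, alt):8.2f} nT")
```

The numbers are not meant to reproduce the MGS or MAVEN values, only to show how strongly a short-wavelength source is attenuated at spacecraft altitudes of 90–6000 km.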
Martian meteorites contain a wide range of magnetic minerals that can record ancient remanent magnetism, including magnetite, titano-magnetite, pyrrhotite, and hematite. The magnetic mineralogy includes single-domain (SD), pseudo-single-domain (PSD)-like, and multi-domain (MD) states. However, only a limited number of Martian meteorites can be used to reconstruct the Martian paleofield, because aqueous, thermal, and shock overprints make many of them unsuitable for these studies. Martian dynamo The exact timing and duration of the Martian dynamo remain unknown, but there are several constraints from satellite observations and paleomagnetic studies. The strong crustal magnetization in the southern hemisphere and the paleomagnetic evidence of ALH84001 indicate that Mars sustained a strong magnetic field between ~4.2–4.3 Ga. The absence of crustal magnetic signatures in the northern lowlands and large impact basins implies dynamo termination prior to the formation of these basins (~4.0–3.9 Ga). Magnetic anomalies from two young volcanoes (e.g., Tyrrhenus Mons, Syrtis Major) may reflect the presence of a Martian magnetic field, with possible magnetic reversals, during the late Noachian and Hesperian periods. One unresolved question is why the Martian crustal hemispheric dichotomy correlates with the magnetic dichotomy (and whether the origin of this dichotomy is an exogenic or endogenic process). One exogenic explanation is that the Borealis impact event resulted in thermal demagnetization of an initially magnetized northern hemisphere, but the proposed age of this event (~4.5 Ga) is long before the Martian dynamo termination (~4.0–4.1 Ga). An alternate model suggests that degree-1 mantle convection (i.e., a convective structure in which mantle upwelling dominates in one hemisphere while downwelling dominates in the other) can produce a single-hemisphere dynamo. One striking feature of Martian crustal magnetism is the long E–W trending alternating stripes in the southern hemisphere (Terra Cimmeria and Terra Sirenum). It has been proposed that these bands were formed by plate tectonic activity, analogous to the alternating magnetic polarity caused by seafloor spreading on Earth, or that they are the result of repeated dike intrusions. However, careful selection of the data analysis method is required to interpret these alternating stripes. Using sparse solutions (e.g., L1 regularization) of crustal-field measurements instead of smoothing solutions (e.g., L2 regularization) shows highly magnetized local patches (with the rest of the crust unmagnetized) instead of stripes. These patches might have been formed by localized events such as volcanism or heating by impacts, which would not require a continuous field (e.g., an intermittent dynamo would suffice). The dynamo mechanism of Mars is poorly understood but is expected to be similar to Earth's. Thermal convection driven by the high thermal gradients in the hot, initial core was likely the primary mechanism for driving a dynamo early in Mars's history. As the mantle and core cooled over time, inner-core crystallization (which would provide latent heat) and chemical convection may have played a major role in driving the dynamo. Following inner-core formation, light elements migrated from the inner-core boundary into the liquid outer core and drove convection by buoyancy. 
However, even InSight lander data could not confirm the presence of a solid inner core on Mars, and the possibility that there was no core crystallization at all (only thermal convection, without chemical convection) cannot be excluded. Also, the possibility that magnetic fields were generated by a magma ocean cannot be ruled out. It is also unclear when and by what mechanism the Martian dynamo shut down. A change in the cooling rate of the mantle may have caused the cessation of the Martian dynamo. One theory is that giant impacts during the early and mid-Noachian periods stopped the dynamo by decreasing global heat flow at the core-mantle boundary. The seismic measurements from the InSight lander revealed that the Martian outer core is in a liquid state and larger than expected. In one model, a partially crystallized Martian core explains the current state of Mars (i.e., lack of magnetic field despite a liquid outer core), and this model predicts that the magnetic field has the potential to be reactivated in the future. [Figure summarizing core crystallization scenarios: no solid inner core (top-down crystallization); possible future dynamo reactivation (bottom-up crystallization or iron snow); whether a dynamo is powered depends on the light element partitioning coefficient.] See also References
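The stripes-versus-patches point above turns on the choice of regularization when inverting field measurements for crustal magnetization: an L1 (sparse) penalty favors a few strongly magnetized patches, while an L2 (smoothing) penalty spreads magnetization broadly. The toy one-dimensional inversion below, written with scikit-learn's Lasso and Ridge, is meant only to show that contrast on synthetic data; it is not the inversion machinery actually applied to MGS or MAVEN measurements, and every variable in it is invented.

```python
# Toy contrast between L1 (sparse) and L2 (smoothing) inversion, on synthetic data.
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
n = 200
x = np.arange(n)

# "True" magnetization: a few isolated patches, zero elsewhere.
truth = np.zeros(n)
truth[40:45] = 1.0
truth[120:123] = -0.8

# Forward operator: each observation is a Gaussian-weighted average of the source,
# mimicking the loss of short-wavelength detail at spacecraft altitude.
A = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * 15.0 ** 2))
A /= A.sum(axis=1, keepdims=True)
obs = A @ truth + 0.01 * rng.standard_normal(n)

l1 = Lasso(alpha=1e-4, max_iter=100_000).fit(A, obs).coef_   # sparse solution
l2 = Ridge(alpha=1e-2).fit(A, obs).coef_                     # smooth solution

print("nonzero coefficients  L1:", int(np.sum(np.abs(l1) > 1e-3)),
      " L2:", int(np.sum(np.abs(l2) > 1e-3)))
```

With settings like these the L1 solution tends to concentrate its nonzero coefficients near the two buried patches, while the L2 solution assigns small nonzero values almost everywhere, which is the qualitative difference the crustal-magnetism studies describe.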
========================================
[SOURCE: https://en.wikipedia.org/wiki/Owarai] | [TOKENS: 868]
Contents Owarai Owarai (お笑い) is a broad word used to describe Japanese comedy as seen on television. The word owarai is the honorific form of the word warai (by adding o- prefix), meaning "a laugh" or "a smile". Owarai is most common on Japanese variety shows and the comedians are referred to as owarai geinin or owarai tarento. Presently[when?] Japan is considered to be in an "owarai boom", and many minor talents have been finding sudden fame after a gag or skit became popular. Characteristics Manzai (漫才), a traditional form of Japanese comedy that became the basis of many modern acts today, is characterized by a pair of usually older male comedians acting certain roles in a constant comedic battle against themselves. This tradition is continued in the acts of many modern talents. Whereas manzai performers traditionally wore kimono (traditional Japanese dress), these days a western suit is the outfit of choice for many owarai kombi (コンビ, combination; referring to a pair of comedians in a unit) and many talents who begin their careers performing in a style very similar to stand-up comedy, usually including aspects of manzai and conte. Some minor characteristics include frequently used sound effects (cheap, old-fashioned sound effects are used intentionally for comic effect), dajare (ダジャレ, a Japanese-style pun), and dokkiri (ドッキリ, a hidden-camera prank like those seen in the popular American show Candid Camera). Owarai geinin On television, most owarai geinin are introduced using their kombi name (e.g. Yoiko Hamaguchi) and some geinin even retain the name of their former groups after they have parted ways. A few popular kombi include: Many owarai units have names based on English words or phrases. Kombi are usually included as guests for shows, though some (namely Downtown, Cream Stew, and Ninety-nine) often act as hosts as well. Some popular talents that usually do not perform in units are: Of these, Sanma, Tamori, and Beat Takeshi are sometimes referred to collectively as the "big three" because of their massive popularity. Talents such as these often act as hosts for shows, or perform together in small or large groups, something almost unimaginable for most western comedians. Variety shows Japanese variety shows are the main outlet for most owarai geinin and along with drama and anime they are some of the most popular shows on Japanese television. As a general term in Japan, "variety show" can refer to "straight" variety shows with an appropriate myriad of topics, segments, and games. It is also used for comedy oriented shows that focus more on stand-up and skits, and quiz/trivia type shows featuring comic elements. It is not to be expected that a variety show will always follow the same format, and guests from Japanese music and talent pools are frequent. The variety style shows generally divided into segments of games, features, and "corners", some very short and some shows focusing (for a special episode) solely on one game or feature. Trivia, quiz, or game shows in Japan are often considered owarai as the contestants of such shows are often a mix of owarai geinin and other Japanese talents of various descriptions. Game shows without any famous characters playing the role of contestants are rare. Of these sections and games, many can be seen recurring on a variety of shows all across Japan. 
It may even be possible to classify Japanese variety shows (or at least the individual sections of the shows) according to the following formats: Some concepts of variety shows are consistent over most of Japanese television, though they may be considered quite different from those seen in the western world. Many shows are made up of what are called VTRs, or video segments, and are usually introduced with a hand gesture and the word dōzo (the implied meaning is "let's have a look"), though this procedure is usually made into a joke with strange gestures instead of the usual wave. A few popular variety/comedy shows of varying contents are: See also Further reading External links
========================================
[SOURCE: https://en.wikipedia.org/wiki/Perseverance_(rover)] | [TOKENS: 5211]
Contents Perseverance (rover) Perseverance is a NASA rover that has been exploring Mars since February 18, 2021, as part of the Mars 2020 mission. Built and managed by the Jet Propulsion Laboratory, it was launched on July 30, 2020, from Cape Canaveral aboard an Atlas V rocket and landed in Jezero Crater, a site chosen for its ancient river delta that may preserve evidence of past microbial life. The rover's main goals are to search for signs of ancient life, study the planet's geology and climate, and collect rock and regolith samples for possible return to Earth by a future mission. Perseverance also tests technologies intended to support later human exploration, including an experiment that successfully produced oxygen from the thin carbon-dioxide atmosphere. Perseverance carries seven primary scientific instruments, 19 cameras, and two microphones. It also deployed the experimental helicopter Ingenuity, which in April 2021 performed the first powered and controlled flight on another planet. Originally intended for up to five flights, Ingenuity completed dozens of sorties before being retired in 2024. Powered by a radioisotope thermoelectric generator, Perseverance has an expected mission duration of over a decade. It has provided high-resolution panoramas, drilled and cached samples for later retrieval, and identified rocks which may have been habitable for ancient microbial life in Jezero Crater. In July 2024, it discovered the Cheyava Falls rock containing "possible biosignature". As of January 5, 2026[update], the rover has been active on Mars for 1,734 sols (1,782 Earth days). Mission Despite the high-profile success of the Curiosity rover landing in August 2012, NASA's Mars Exploration Program was in a state of uncertainty in the early 2010s. Budget cuts forced NASA to pull out of a planned collaboration with the European Space Agency which included a rover mission. By the summer of 2012, a program that had been launching a mission to Mars every two years suddenly found itself with no missions approved after 2013. In 2011, the Planetary Science Decadal Survey, a report from the National Academies of Sciences, Engineering, and Medicine containing an influential set of recommendations made by the planetary science community, stated that the top priority of NASA's planetary exploration program in the decade between 2013 and 2022 should be to begin a NASA-ESA Mars Sample Return campaign, a four-mission project to cache, retrieve, launch, and safely return samples of the Martian surface to Earth. The report stated that NASA should invest in a sample-caching rover as the first step in this effort, with the goal of keeping costs under US$2.5 billion. After the success of the Curiosity rover and in response to the recommendations of the decadal survey, NASA announced its intent to launch a new Mars rover mission by 2020 at the American Geophysical Union conference in December 2012. Though initially hesitant to commit to an ambitious sample-caching capability (and subsequent follow-on missions), a NASA-convened science definition team for the Mars 2020 project released a report in July 2013 that the mission should "select and store a compelling suite of samples in a returnable cache." 
The Perseverance rover has four main science objectives that support the Mars Exploration Program's science goals: In the first science campaign, dubbed "Crater Floor", Perseverance performed an arching drive southward from its landing site to the Séítah unit to perform a "toe dip" into the unit to collect remote-sensing measurements of geologic targets. After that it returned to the Crater Floor Fractured Rough to collect the first core sample there. Passing by the Octavia E. Butler landing site concluded the first science campaign. The second campaign, "Fan Front", included several months of travel towards the "Three Forks" where Perseverance accessed geologic locations at the base of the ancient delta of Neretva river, as well as ascending the delta by driving up a valley wall to the northwest. The third and fourth campaigns were called "Upper Fan", and "Margin Unit", and the fifth campaign, "Northern Rim", in progress as of December 2024, is exploring "the northern part of the southwestern section of Jezero's rim" to study "rocks from deep down inside Mars that were thrown upward to form the crater rim" after the impact 3.9 billion years ago that formed Jezero Crater. The scientific results, as of 2025, are as follows. According to NASA, the mission has made "discoveries about the volcanic history, habitability, and role of water in Jezero Crater." Specifically, they reported that instead of all the rocks in Jezero crater being sedimentary, being "transported into the crater by wind or water," "several types of igneous rock" were discovered, which "showed evidence of interaction with water." Additionally, At a rock named "Wildcat Ridge" located within Jezero's well-preserved sedimentary fan deposit, Perseverance found evidence for an ancient lake environment. Not only were these sediments likely deposited in a standing body of water, but they also continued to interact with water long after they were formed. The environments recorded within the rocks at Wildcat Ridge would have been habitable for ancient microbial life, and this type of rock is ideal for preserving possible signs of ancient life. They also found that "sediments entering Jezero's lake were deposited in a delta" and "evidence for late-stage, high-energy flooding that carried large boulders into the crater." The MOXIE experiment produced 122 grams of oxygen from carbon dioxide. The microphone studies showed that the speed of sound is slower and the volumes of sounds transmitted through the atmosphere is lower, than on Earth. PIXL found that the Seitah formation and a rock at "Otis Peak" contained olivine, phosphates, sulfates, clays, carbonate minerals, silicate minerals, "augite pyroxene, feldspathic mesostasis, various Fe,Cr,Ti-spinels, and merrillite", perchlorate, feldspar, magnesite, siderite, oxides, as well as minerals with composition including magnesium, iron, chlorine, and sodium. RIMFAX revealed findings "consistent with a subsurface dominated by solid rock and mafic material" and that "the crater floor experienced a period of erosion before the deposition of the overlying delta strata. The regularity and horizontality of the basal delta sediments observed in the radar cross sections indicate that they were deposited in a low-energy lake environment." Design The Perseverance design evolved from its predecessor, the Curiosity rover. The two rovers share a similar body plan, landing system, cruise stage, and power system, but the design was improved in several ways for Perseverance. 
Design The Perseverance design evolved from its predecessor, the Curiosity rover. The two rovers share a similar body plan, landing system, cruise stage, and power system, but the design was improved in several ways for Perseverance. Engineers designed the rover wheels to be more robust than Curiosity's wheels, which had sustained some damage. Perseverance has thicker, more durable aluminum wheels that are narrower than Curiosity's but larger in diameter, 52.5 cm (20.7 in) versus 50 cm (20 in). The aluminum wheels are covered with cleats for traction and curved titanium spokes for springy support. The heat shield for the rover was made of phenolic-impregnated carbon ablator (PICA), allowing it to withstand temperatures of up to 2,400 °F (1,320 °C). Like Curiosity, the rover includes a robotic arm, although Perseverance's arm is longer and stronger, measuring 2.1 m (6 ft 11 in). The arm hosts an elaborate rock-coring and sampling mechanism to store geologic samples from the Martian surface in sterile caching tubes. There is also a secondary arm hidden below the rover that helps store the chalk-sized samples. This arm is known as the Sample Handling Assembly (SHA) and is responsible for moving the samples to various stations within the Adaptive Caching Assembly (ACA) on the underside of the rover. These stations include volume assessment (measuring the length of the sample), imaging, seal dispensing, and hermetic sealing, among others. Owing to the small space in which the SHA must operate, as well as the load requirements during sealing activities, the Sample Caching System "is the most complicated, most sophisticated mechanism that we have ever built, tested and readied for spaceflight." The combination of larger instruments, the new sampling and caching system, and modified wheels makes Perseverance heavier, weighing 1,025 kg (2,260 lb) compared to Curiosity at 899 kg (1,982 lb), a 14% increase. The rover's multi-mission radioisotope thermoelectric generator (MMRTG) has a mass of 45 kg (99 lb) and uses 4.8 kg (11 lb) of plutonium-238 oxide as its power source. The radioactive decay of plutonium-238, which has a half-life of 87.7 years, gives off heat that is converted to electricity, approximately 110 watts at launch; this output decreases over time as the power source decays. The MMRTG charges two lithium-ion rechargeable batteries that power the rover's activities and must be recharged periodically. Unlike solar panels, the MMRTG gives engineers significant flexibility in operating the rover's instruments even at night, during dust storms, and through winter. The rover's computer uses the BAE Systems RAD750 radiation-hardened single-board computer based on a ruggedized PowerPC G3 microprocessor (PowerPC 750). The computer contains 128 megabytes of volatile DRAM and runs at 133 MHz. The flight software runs on the VxWorks operating system, is written in C, and can access 4 gigabytes of NAND non-volatile memory on a separate card. Perseverance relies on three antennas for telemetry. The primary UHF antenna can send data from the rover, relayed through spacecraft currently in orbit around Mars, at a maximum rate of two megabits per second; two slower X-band antennas provide communications redundancy.
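Because the MMRTG's electrical output tracks the decay of its plutonium-238 fuel, the roughly 110 watts available at launch falls off gradually over the mission. A minimal sketch using only the half-life and launch power quoted above; real output declines somewhat faster because the thermocouples themselves also degrade, which is not modeled here:

```python
P0 = 110.0        # W, approximate electrical output at launch (figure quoted above)
HALF_LIFE = 87.7  # years, half-life of plutonium-238 (figure quoted above)

def rtg_power(years_after_launch):
    """Electrical power assuming it simply follows Pu-238 radioactive decay."""
    return P0 * 0.5 ** (years_after_launch / HALF_LIFE)

for t in (0, 5, 10, 14):
    print(f"{t:>2} years after launch: ~{rtg_power(t):.0f} W")
# prints roughly 110, 106, 102, and 98 W (decay alone, thermocouple wear ignored)
```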
Instruments NASA considered nearly 60 proposals for rover instrumentation. On July 31, 2014, NASA announced the seven instruments that would make up the payload for the rover: the Mastcam-Z stereo imaging system, the SuperCam remote-sensing suite, the PIXL X-ray spectrometer, the SHERLOC ultraviolet spectrometer, the MEDA environmental station, the RIMFAX ground-penetrating radar, and the MOXIE oxygen-production experiment. There are additional cameras and two audio microphones (the first working microphones on Mars), which are used for engineering support during landing, driving, and sample collection. The Ingenuity helicopter, powered by solar-charged batteries, was sent to Mars attached to Perseverance. With a mass of 1.8 kg (4.0 lb), the helicopter demonstrated the feasibility of flight in the rarefied Martian atmosphere and the potential usefulness of aerial scouting for rover missions. It carried two cameras but no scientific instruments and communicated with Earth via a base station onboard Perseverance. Its pre-launch experimental test plan called for up to five flights, but it far exceeded expectations and made 72 flights in nearly three years. After its first few flights, it made incrementally more ambitious ones, several of which were recorded by Perseverance's cameras. The first flight took place on April 19, 2021, at 07:15 UTC, with confirmation from data reception at 10:15 UTC. It was the first powered flight by any aircraft on another planet. On January 18, 2024 (UTC), it made its 72nd and final flight, suffering the loss of a rotor blade (later imaged by Perseverance lying on the sand roughly 15 m (49 ft) from Ingenuity's upright body), which led NASA to retire it. Associate Administrator of NASA's Science Mission Directorate Thomas Zurbuchen selected the name Perseverance following a nationwide K-12 student "name the rover" contest that attracted more than 28,000 proposals. A seventh-grade student, Alexander Mather from Lake Braddock Secondary School in Burke, Virginia, submitted the winning entry. In addition to the honor of naming the rover, Mather and his family were invited to NASA's Kennedy Space Center to watch the rover's July 2020 launch from Cape Canaveral Air Force Station (CCAFS) in Florida. He was also joined at the launch by 11th-grade student Vaneeza Rupani from Tuscaloosa County High School in Northport, Alabama, who named the Ingenuity helicopter that would fly with Perseverance. Mather wrote in his winning essay: Curiosity. InSight. Spirit. Opportunity. If you think about it, all of these names of past Mars rovers are qualities we possess as humans. We are always curious, and seek opportunity. We have the spirit and insight to explore the Moon, Mars, and beyond. But, if rovers are to be the qualities of us as a race, we missed the most important thing: Perseverance. We as humans evolved as creatures who could learn to adapt to any situation, no matter how harsh. We are a species of explorers, and we will meet many setbacks on the way to Mars. However, we can persevere. We, not as a nation but as humans, will not give up. The human race will always persevere into the future. JPL built a copy of Perseverance for testing and problem solving: a twin rover named OPTIMISM (Operational Perseverance Twin for Integration of Mechanisms and Instruments Sent to Mars), which serves as a vehicle system test bed (VSTB). It is housed at the JPL Mars Yard and is used to test operational procedures and to aid in problem solving should any issues arise with Perseverance. Operational history The Perseverance rover lifted off successfully on July 30, 2020, at 11:50:00 UTC aboard a United Launch Alliance Atlas V launch vehicle from Space Launch Complex 41 at Cape Canaveral Air Force Station (CCAFS) in Florida. The rover took 29 weeks to travel to Mars and landed in Jezero Crater on February 18, 2021, to begin its science phase. After May 17, 2022, the rover's plan was to move uphill and examine rocks on the surface for evidence of past life on Mars.
On its return downhill, it was to collect sample rocks to be retrieved and examined by future expeditions. The successful landing of Perseverance in Jezero Crater was announced at 20:55 UTC on February 18, 2021, the signal from Mars taking 11 minutes to arrive at Earth. The rover touched down at 18.4446°N 77.4509°E, roughly 1 km (0.62 mi) southeast of the center of its 7.7 km × 6.6 km (4.8 mi × 4.1 mi) landing ellipse. It came down pointed almost directly to the southeast, with the RTG on the back of the vehicle pointing northwest. The descent stage ("sky crane"), parachute, and heat shield all came to rest within 1.5 km of the rover. The rover came within five meters (16 ft) of its target, making this the most accurate Mars landing to date, a feat enabled by the experience gained from Curiosity's landing and the use of new steering technology. One such new technology is Terrain Relative Navigation (TRN), a technique in which the rover compares images of the surface taken during its descent with onboard reference maps, allowing it to make last-minute adjustments to its course. The rover also uses the images to select a safe landing site at the last minute, allowing it to land in relatively unhazardous terrain. This enables it to land much closer to its science objectives than previous missions, which all had to use a landing ellipse devoid of hazards. The landing occurred in the late afternoon, with the first images taken at 15:53:58 on the mission clock (local mean solar time). The landing took place shortly after Mars passed through its northern vernal equinox (Ls = 5.2°), at the start of astronomical spring, the equivalent of the end of March on Earth. The parachute descent of the Perseverance rover was photographed by the HiRISE high-resolution camera on the Mars Reconnaissance Orbiter (MRO).
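Terrain Relative Navigation, as described above, comes down to matching a descent-camera frame against an onboard orbital map to estimate where the vehicle actually is. The sketch below illustrates that matching step with plain normalized cross-correlation on synthetic arrays; it is only a conceptual illustration under those assumptions, not NASA's flight algorithm, and all names and sizes in it are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: an orbital reference map and a smaller descent-camera frame
reference_map = rng.random((120, 120))             # pretend orbital basemap
true_row, true_col = 40, 75                        # where the lander "really" is
descent_frame = reference_map[true_row:true_row + 32,
                              true_col:true_col + 32] + rng.normal(0, 0.05, (32, 32))

def best_match(frame, basemap):
    """Exhaustive normalized cross-correlation; returns the best (row, col) offset."""
    fh, fw = frame.shape
    f = (frame - frame.mean()) / frame.std()
    best, best_score = (0, 0), -np.inf
    for r in range(basemap.shape[0] - fh + 1):
        for c in range(basemap.shape[1] - fw + 1):
            patch = basemap[r:r + fh, c:c + fw]
            p = (patch - patch.mean()) / (patch.std() + 1e-12)
            score = float((f * p).mean())  # Pearson correlation between frame and patch
            if score > best_score:
                best, best_score = (r, c), score
    return best

print(best_match(descent_frame, reference_map))    # expected: (40, 75)
```

In the real system the estimated position is compared against a map of pre-identified hazards so the guidance logic can divert the descent stage toward safer ground, as described above.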
Jezero Crater is a paleolake basin. It was selected as the landing site for this mission in part because paleolake basins tend to contain perchlorates. Astrobiologist Dr. Kennda Lynch's work in analog environments on Earth suggests that the crater, given its composition, including the bottomset deposits accumulated from three different sources in the area, is a likely place to discover evidence of perchlorate-reducing microbes, if such microbes are living or ever lived on Mars. A few days after landing, Perseverance released the first audio recorded on the surface of Mars, capturing the sound of Martian wind. During the rover's travels, NASA scientists observed around Sol 341 (February 4, 2022) that a small rock had dropped into one of its wheels while the rover was studying the Máaz rock formation. The rock was visible in one of the hazard avoidance cameras and was determined not to be harmful to the rover's mission. The rock remained on Perseverance's wheel for around 427 sols (439 days) as the rover traveled over 6 miles (9.7 km) on the Martian surface; NASA said that Perseverance had adopted a pet rock for its journey. By May 2024, the rover had picked up another pet rock, named "Dwayne". Perseverance is planned to visit the bottom and upper parts of the 3.4-to-3.8-billion-year-old Neretva Vallis delta; the smooth and etched parts of the Jezero Crater floor deposits, interpreted as volcanic ash or aeolian airfall deposits emplaced before the formation of the delta; and the ancient shoreline, covered with transverse aeolian ridges (dunes) and mass-wasting deposits, before finally climbing onto the Jezero Crater rim. During its progressive commissioning and testing, Perseverance made its first test drive on Mars on March 4, 2021. NASA released photographs of the rover's first wheel tracks on Martian soil. In support of the NASA-ESA Mars Sample Return, rock, regolith (Martian soil), and atmosphere samples are being cached by Perseverance. As of July 2025, 33 out of 43 sample tubes have been filled, including 8 igneous rock sample tubes, 13 sedimentary rock sample tubes, 3 igneous/impactite rock sample tubes, a serpentinite rock sample tube, a silica-cemented carbonate rock sample tube, two regolith sample tubes, an atmosphere sample tube, and three witness tubes. Before launch, 5 of the 43 tubes were designated "witness tubes" and filled with materials that would capture particulates in the ambient environment of Mars. Of the 43 tubes, 3 witness sample tubes will not be returned to Earth and will remain on the rover, as the sample-return canister will have only 30 tube slots. Further, 10 of the 43 tubes have been left as backups at the Three Forks Sample Depot. In July 2024, Perseverance discovered "leopard spots" on a reddish rock nicknamed "Cheyava Falls" in Jezero Crater; the rock shows some indications that it may have hosted microbial life billions of years ago, but further research is needed. Cost NASA plans to invest roughly US$2.75 billion in the project over 11 years, including US$2.2 billion for the development and building of the hardware, US$243 million for launch services, and US$291 million for 2.5 years of mission operations. Adjusted for inflation, Perseverance is NASA's sixth-most expensive robotic planetary mission, though it is cheaper than its predecessor, Curiosity. Perseverance benefited from spare hardware and "build-to-print" designs from the Curiosity mission, which helped reduce development costs and saved "probably tens of millions, if not 100 million dollars" according to Mars 2020 Deputy Chief Engineer Keith Comeaux. Commemorative artifacts NASA's "Send Your Name to Mars" campaign invited people from around the world to submit their names to travel aboard the agency's next rover to Mars. A total of 10,932,295 names were submitted. The names were etched by an electron beam onto three fingernail-sized silicon chips, along with the essays of the 155 finalists in NASA's "Name the Rover" contest. The three chips share space on an anodized plate with a laser-engraved graphic representing Earth, Mars, and the Sun. The rays emanating from the Sun contain the phrase "Explore As One" written in Morse code. The plate was mounted on the rover on March 26, 2020. Part of Perseverance's cargo is a geocaching trackable item viewable with SHERLOC's WATSON camera. In 2016, NASA SHERLOC co-investigator Dr. Marc Fries, with help from his son Wyatt, was inspired by Geocaching's 2008 placement of a cache on the International Space Station to try something similar with the rover mission.
After Fries floated the idea with mission management, it eventually reached NASA scientist Francis McCubbin, who joined the SHERLOC instrument team as a collaborator to move the project forward. The Geocaching inclusion was scaled down to a trackable item that players could spot in NASA camera views and then log on the Geocaching site. In a manner similar to the "Send Your Name to Mars" campaign, the geocaching trackable code was printed on a one-inch polycarbonate glass disk serving as part of the rover's calibration target. It serves as an optical target for the WATSON imager and a spectroscopic standard for the SHERLOC instrument. The disk is made of a prototype astronaut helmet visor material that will be tested for its potential use in crewed missions to Mars. Designs were approved by the mission leads at NASA's Jet Propulsion Laboratory (JPL), NASA Public Affairs, and NASA HQ, in addition to Groundspeak Geocaching HQ. Perseverance launched during the COVID-19 pandemic, which began to affect mission planning in March 2020. To show appreciation for healthcare workers who helped during the pandemic, an 8 cm × 13 cm (3.1 in × 5.1 in) plate with a staff-and-serpent symbol (an ancient Greek symbol of medicine) was placed on the rover. The project manager, Matt Wallace, said he hoped that future generations going to Mars would be able to appreciate the healthcare workers of 2020. One of the external plates of Perseverance includes a simplified representation of all previous NASA Mars rovers (Sojourner, Spirit, Opportunity, and Curiosity) together with Perseverance and Ingenuity, in the style of the automobile window decals used to show a family's makeup. The orange-and-white parachute used to land the rover on Mars contained a coded message that was deciphered by Twitter users. NASA systems engineer Ian Clark used binary code to hide the message "dare mighty things" in the parachute color pattern. The 21-meter-wide (70 ft) parachute consisted of 80 strips of fabric that formed a hemisphere-shaped canopy, and each strip consisted of four pieces. Dr. Clark thus had 320 pieces with which to encode the message (a simplified sketch of the encoding idea appears below). He also included the GPS coordinates of the Jet Propulsion Laboratory's headquarters in Pasadena, California (34°11'58" N 118°10'31" W). Clark said that only six people knew about the message before landing. The code was deciphered a few hours after the image was presented by Perseverance's team. "Dare mighty things" is a quote attributed to U.S. president Theodore Roosevelt and is the unofficial motto of the Jet Propulsion Laboratory; it adorns many of the JPL center's walls. In December 2021, the NASA team announced a program for students who have persevered through academic challenges; those nominated are rewarded with a personal message beamed back from Mars by the Perseverance rover. Gallery On March 5, 2024, NASA released images of transits of the moon Deimos, the moon Phobos, and the planet Mercury as viewed by the Perseverance rover from the surface of Mars.
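The parachute message described above works by writing numbers in binary around the canopy, with orange gores read as 1 and white gores as 0. The sketch below reproduces the general idea, encoding each letter as its 1-based position in the alphabet at a fixed bit width; the width, the handling of word breaks, and the layout on the real canopy are illustrative assumptions, not the flown design:

```python
def encode(message, width=7):
    """Encode letters as their 1-based alphabet position in fixed-width binary.

    A '1' stands for an orange gore and a '0' for a white gore. The bit width
    and the skipping of spaces are illustrative choices, not the flown layout,
    which used ring boundaries to separate words.
    """
    bits = []
    for ch in message.upper():
        if ch == " ":
            continue
        bits.append(format(ord(ch) - ord("A") + 1, f"0{width}b"))
    return " ".join(bits)

print(encode("DARE MIGHTY THINGS"))  # e.g. 'D' -> 4 -> 0000100
```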
========================================
[SOURCE: https://en.wikipedia.org/wiki/United_States#cite_note-Johnson26–302-68] | [TOKENS: 17273]
Contents United States The United States of America (USA), also known as the United States (U.S.) or America, is a country primarily located in North America. It is a federal republic of 50 states and a federal capital district, Washington, D.C. The 48 contiguous states border Canada to the north and Mexico to the south, with the semi-exclave of Alaska in the northwest and the archipelago of Hawaii in the Pacific Ocean. The United States also asserts sovereignty over five major island territories and various uninhabited islands in Oceania and the Caribbean.[j] It is a megadiverse country, with the world's third-largest land area[c] and third-largest population, exceeding 341 million.[k] Paleo-Indians first migrated from North Asia to North America at least 15,000 years ago, and formed various civilizations. Spanish colonization established Spanish Florida in 1513, the first European colony in what is now the continental United States. British colonization followed with the 1607 settlement of Virginia, the first of the Thirteen Colonies. Enslavement of Africans was practiced in all colonies by 1770 and supplied most of the labor for the Southern Colonies' plantation economy. Clashes with the British Crown began as a civil protest over the illegality of taxation without representation in Parliament and the denial of other English rights. They evolved into the American Revolution, which led to the Declaration of Independence and a society based on universal rights. Victory in the 1775–1783 Revolutionary War brought international recognition of U.S. sovereignty and fueled westward expansion, further dispossessing native inhabitants. As more states were admitted, a North–South division over slavery led the Confederate States of America to declare secession and fight the Union in the 1861–1865 American Civil War. With the United States' victory and reunification, slavery was abolished nationally. By the late 19th century, the U.S. economy outpaced the French, German and British economies combined. As of 1900, the country had established itself as a great power, a status solidified after its involvement in World War I. Following Japan's attack on Pearl Harbor in 1941, the U.S. entered World War II. Its aftermath left the U.S. and the Soviet Union as rival superpowers, competing for ideological dominance and international influence during the Cold War. The Soviet Union's collapse in 1991 ended the Cold War, leaving the U.S. as the world's sole superpower. The U.S. federal government is a representative democracy with a president and a constitution that grants separation of powers under three branches: legislative, executive, and judicial. The United States Congress is a bicameral national legislature composed of the House of Representatives (a lower house based on population) and the Senate (an upper house based on equal representation for each state). Federalism grants substantial autonomy to the 50 states. In addition, 574 Native American tribes have sovereignty rights, and there are 326 Native American reservations. Since the 1850s, the Democratic and Republican parties have dominated American politics. American ideals and values are based on a democratic tradition inspired by the American Enlightenment movement. A developed country, the U.S. ranks high in economic competitiveness, innovation, and higher education. Accounting for over a quarter of nominal global GDP, its economy has been the world's largest since about 1890. 
It is the wealthiest country, with the highest disposable household income per capita among OECD members, though its wealth inequality is highly pronounced. Shaped by centuries of immigration, the culture of the U.S. is diverse and globally influential. Making up more than a third of global military spending, the country has one of the strongest armed forces and is a designated nuclear state. A member of numerous international organizations, the U.S. plays a major role in global political, cultural, economic, and military affairs. Etymology Documented use of the phrase "United States of America" dates back to January 2, 1776. On that day, Stephen Moylan, a Continental Army aide to General George Washington, wrote a letter to Joseph Reed, Washington's aide-de-camp, seeking to go "with full and ample powers from the United States of America to Spain" to seek assistance in the Revolutionary War effort. The first known public usage is an anonymous essay published in the Williamsburg newspaper The Virginia Gazette on April 6, 1776. Sometime on or after June 11, 1776, Thomas Jefferson wrote "United States of America" in a rough draft of the Declaration of Independence, which was adopted by the Second Continental Congress on July 4, 1776. The term "United States" and its initialism "U.S.", used as nouns or as adjectives in English, are common short names for the country. The initialism "USA", a noun, is also common. "United States" and "U.S." are the established terms throughout the U.S. federal government, with prescribed rules.[l] "The States" is an established colloquial shortening of the name, used particularly from abroad; "stateside" is the corresponding adjective or adverb. "America" is the feminine form of the first word of Americus Vesputius, the Latinized name of Italian explorer Amerigo Vespucci (1454–1512);[m] it was first used as a place name by the German cartographers Martin Waldseemüller and Matthias Ringmann in 1507.[n] Vespucci first proposed that the West Indies discovered by Christopher Columbus in 1492 were part of a previously unknown landmass and not among the Indies at the eastern limit of Asia. In English, the term "America" usually does not refer to topics unrelated to the United States, despite the usage of "the Americas" to describe the totality of the continents of North and South America. History The first inhabitants of North America migrated from Siberia approximately 15,000 years ago, either across the Bering land bridge or along the now-submerged Ice Age coastline. Small isolated groups of hunter-gatherers are said to have migrated alongside herds of large herbivores far into Alaska, with ice-free corridors developing along the Pacific coast and valleys of North America in c. 16,500 – c. 13,500 BCE (c. 18,500 – c. 15,500 BP). The Clovis culture, which appeared around 11,000 BCE, is believed to be the first widespread culture in the Americas. Over time, Indigenous North American cultures grew increasingly sophisticated, and some, such as the Mississippian culture, developed agriculture, architecture, and complex societies. In the post-archaic period, the Mississippian cultures were located in the midwestern, eastern, and southern regions, and the Algonquian in the Great Lakes region and along the Eastern Seaboard, while the Hohokam culture and Ancestral Puebloans inhabited the Southwest. Native population estimates of what is now the United States before the arrival of European colonizers range from around 500,000 to nearly 10 million. 
Christopher Columbus began exploring the Caribbean for Spain in 1492, leading to Spanish-speaking settlements and missions from what are now Puerto Rico and Florida to New Mexico and California. The first Spanish colony in the present-day continental United States was Spanish Florida, chartered in 1513. After several settlements failed there due to starvation and disease, Spain's first permanent town, Saint Augustine, was founded in 1565. France established its own settlements in French Florida in 1562, but they were either abandoned (Charlesfort, 1578) or destroyed by Spanish raids (Fort Caroline, 1565). Permanent French settlements were founded much later along the Great Lakes (Fort Detroit, 1701), the Mississippi River (Saint Louis, 1764) and especially the Gulf of Mexico (New Orleans, 1718). Early European colonies also included the thriving Dutch colony of New Netherland (settled 1626, present-day New York) and the small Swedish colony of New Sweden (settled 1638 in what became Delaware). British colonization of the East Coast began with the Virginia Colony (1607) and the Plymouth Colony (Massachusetts, 1620). The Mayflower Compact in Massachusetts and the Fundamental Orders of Connecticut established precedents for local representative self-governance and constitutionalism that would develop throughout the American colonies. While European settlers in what is now the United States experienced conflicts with Native Americans, they also engaged in trade, exchanging European tools for food and animal pelts.[o] Relations ranged from close cooperation to warfare and massacres. The colonial authorities often pursued policies that forced Native Americans to adopt European lifestyles, including conversion to Christianity. Along the eastern seaboard, settlers trafficked Africans through the Atlantic slave trade, largely to provide manual labor on plantations. The original Thirteen Colonies[p] that would later found the United States were administered as possessions of the British Empire by Crown-appointed governors, though local governments held elections open to most white male property owners. The colonial population grew rapidly from Maine to Georgia, eclipsing Native American populations; by the 1770s, the natural increase of the population was such that only a small minority of Americans had been born overseas. The colonies' distance from Britain facilitated the entrenchment of self-governance, and the First Great Awakening, a series of Christian revivals, fueled colonial interest in guaranteed religious liberty. Following its victory in the French and Indian War, Britain began to assert greater control over local affairs in the Thirteen Colonies, resulting in growing political resistance. One of the primary grievances of the colonists was the denial of their rights as Englishmen, particularly the right to representation in the British government that taxed them. To demonstrate their dissatisfaction and resolve, the First Continental Congress met in 1774 and passed the Continental Association, a colonial boycott of British goods enforced by local "committees of safety" that proved effective. The British attempt to then disarm the colonists resulted in the 1775 Battles of Lexington and Concord, igniting the American Revolutionary War. At the Second Continental Congress, the colonies appointed George Washington commander-in-chief of the Continental Army, and created a committee that named Thomas Jefferson to draft the Declaration of Independence.
Two days after the Second Continental Congress passed the Lee Resolution to create an independent, sovereign nation, the Declaration was adopted on July 4, 1776. The political values of the American Revolution evolved from an armed rebellion demanding reform within an empire to a revolution that created a new social and governing system founded on the defense of liberty and the protection of inalienable natural rights; sovereignty of the people; republicanism over monarchy, aristocracy, and other hereditary political power; civic virtue; and an intolerance of political corruption. The Founding Fathers of the United States, who included Washington, Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, James Madison, Thomas Paine, and many others, were inspired by Classical, Renaissance, and Enlightenment philosophies and ideas. Though in practical effect since its drafting in 1777, the Articles of Confederation was ratified in 1781 and formally established a decentralized government that operated until 1789. After the British surrender at the siege of Yorktown in 1781, American sovereignty was internationally recognized by the Treaty of Paris (1783), through which the U.S. gained territory stretching west to the Mississippi River, north to present-day Canada, and south to Spanish Florida. The Northwest Ordinance (1787) established the precedent by which the country's territory would expand with the admission of new states, rather than the expansion of existing states. The U.S. Constitution was drafted at the 1787 Constitutional Convention to overcome the limitations of the Articles. It went into effect in 1789, creating a federal republic governed by three separate branches that together formed a system of checks and balances. George Washington was elected the country's first president under the Constitution, and the Bill of Rights was adopted in 1791 to allay skeptics' concerns about the power of the more centralized government. His resignation as commander-in-chief after the Revolutionary War and his later refusal to run for a third term as the country's first president established a precedent for the supremacy of civil authority in the United States and the peaceful transfer of power. In the late 18th century, American settlers began to expand westward in larger numbers, many with a sense of manifest destiny. The Louisiana Purchase of 1803 from France nearly doubled the territory of the United States. Lingering issues with Britain remained, leading to the War of 1812, which was fought to a draw. Spain ceded Florida and its Gulf Coast territory in 1819. The Missouri Compromise of 1820, which admitted Missouri as a slave state and Maine as a free state, attempted to balance the desire of northern states to prevent the expansion of slavery into new territories with that of southern states to extend it there. Primarily, the compromise prohibited slavery in all other lands of the Louisiana Purchase north of the 36°30′ parallel. As Americans expanded further into territory inhabited by Native Americans, the federal government implemented policies of Indian removal or assimilation. The most significant such legislation was the Indian Removal Act of 1830, a key policy of President Andrew Jackson. It resulted in the Trail of Tears (1830–1850), in which an estimated 60,000 Native Americans living east of the Mississippi River were forcibly removed and displaced to lands far to the west, causing 13,200 to 16,700 deaths along the forced march. 
Settler expansion as well as this influx of Indigenous peoples from the East resulted in the American Indian Wars west of the Mississippi. During the colonial period, slavery became legal in all the Thirteen colonies, but by 1770 it provided the main labor force in the large-scale, agriculture-dependent economies of the Southern Colonies from Maryland to Georgia. The practice began to be significantly questioned during the American Revolution, and spurred by an active abolitionist movement that had reemerged in the 1830s, states in the North enacted laws to prohibit slavery within their boundaries. At the same time, support for slavery had strengthened in Southern states, with widespread use of inventions such as the cotton gin (1793) having made slavery immensely profitable for Southern elites. The United States annexed the Republic of Texas in 1845, and the 1846 Oregon Treaty led to U.S. control of the present-day American Northwest. Dispute with Mexico over Texas led to the Mexican–American War (1846–1848). After the victory of the U.S., Mexico recognized U.S. sovereignty over Texas, New Mexico, and California in the 1848 Mexican Cession; the cession's lands also included the future states of Nevada, Colorado and Utah. The California gold rush of 1848–1849 spurred a huge migration of white settlers to the Pacific coast, leading to even more confrontations with Native populations. One of the most violent, the California genocide of thousands of Native inhabitants, lasted into the mid-1870s. Additional western territories and states were created. Throughout the 1850s, the sectional conflict regarding slavery was further inflamed by national legislation in the U.S. Congress and decisions of the Supreme Court. In Congress, the Fugitive Slave Act of 1850 mandated the forcible return to their owners in the South of slaves taking refuge in non-slave states, while the Kansas–Nebraska Act of 1854 effectively gutted the anti-slavery requirements of the Missouri Compromise. In its Dred Scott decision of 1857, the Supreme Court ruled against a slave brought into non-slave territory, simultaneously declaring the entire Missouri Compromise to be unconstitutional. These and other events exacerbated tensions between North and South that would culminate in the American Civil War (1861–1865). Beginning with South Carolina, 11 slave-state governments voted to secede from the United States in 1861, joining to create the Confederate States of America. All other state governments remained loyal to the Union.[q] War broke out in April 1861 after the Confederacy bombarded Fort Sumter. Following the Emancipation Proclamation on January 1, 1863, many freed slaves joined the Union army. The war began to turn in the Union's favor following the 1863 Siege of Vicksburg and Battle of Gettysburg, and the Confederates surrendered in 1865 after the Union's victory in the Battle of Appomattox Court House. Efforts toward reconstruction in the secessionist South had begun as early as 1862, but it was only after President Lincoln's assassination that the three Reconstruction Amendments to the Constitution were ratified to protect civil rights. The amendments codified nationally the abolition of slavery and involuntary servitude except as punishment for crimes, promised equal protection under the law for all persons, and prohibited discrimination on the basis of race or previous enslavement. As a result, African Americans took an active political role in ex-Confederate states in the decade following the Civil War. 
The former Confederate states were readmitted to the Union, beginning with Tennessee in 1866 and ending with Georgia in 1870. National infrastructure, including transcontinental telegraph and railroads, spurred growth in the American frontier. This was accelerated by the Homestead Acts, through which nearly 10 percent of the total land area of the United States was given away free to some 1.6 million homesteaders. From 1865 through 1917, an unprecedented stream of immigrants arrived in the United States, including 24.4 million from Europe. Most came through the Port of New York, as New York City and other large cities on the East Coast became home to large Jewish, Irish, and Italian populations. Many Northern Europeans as well as significant numbers of Germans and other Central Europeans moved to the Midwest. At the same time, about one million French Canadians migrated from Quebec to New England. During the Great Migration, millions of African Americans left the rural South for urban areas in the North. Alaska was purchased from Russia in 1867. The Compromise of 1877 is generally considered the end of the Reconstruction era, as it resolved the electoral crisis following the 1876 presidential election and led President Rutherford B. Hayes to reduce the role of federal troops in the South. Immediately, the Redeemers began evicting the Carpetbaggers and quickly regained local control of Southern politics in the name of white supremacy. African Americans endured a period of heightened, overt racism following Reconstruction, a time often considered the nadir of American race relations. A series of Supreme Court decisions, including Plessy v. Ferguson, emptied the Fourteenth and Fifteenth Amendments of their force, allowing Jim Crow laws in the South to remain unchecked, sundown towns in the Midwest, and segregation in communities across the country, which would be reinforced in part by the policy of redlining later adopted by the federal Home Owners' Loan Corporation. An explosion of technological advancement, accompanied by the exploitation of cheap immigrant labor, led to rapid economic expansion during the Gilded Age of the late 19th century. It continued into the early 20th, when the United States already outpaced the economies of Britain, France, and Germany combined. This fostered the amassing of power by a few prominent industrialists, largely by their formation of trusts and monopolies to prevent competition. Tycoons led the nation's expansion in the railroad, petroleum, and steel industries. The United States emerged as a pioneer of the automotive industry. These changes resulted in significant increases in economic inequality, slum conditions, and social unrest, creating the environment for labor unions and socialist movements to begin to flourish. This period eventually ended with the advent of the Progressive Era, which was characterized by significant economic and social reforms. Pro-American elements in Hawaii overthrew the Hawaiian monarchy; the islands were annexed in 1898. That same year, Puerto Rico, the Philippines, and Guam were ceded to the U.S. by Spain after the latter's defeat in the Spanish–American War. (The Philippines was granted full independence from the U.S. on July 4, 1946, following World War II. Puerto Rico and Guam have remained U.S. territories.) American Samoa was acquired by the United States in 1900 after the Second Samoan Civil War. The U.S. Virgin Islands were purchased from Denmark in 1917. 
The United States entered World War I alongside the Allies in 1917 helping to turn the tide against the Central Powers. In 1920, a constitutional amendment granted nationwide women's suffrage. During the 1920s and 1930s, radio for mass communication and early television transformed communications nationwide. The Wall Street Crash of 1929 triggered the Great Depression, to which President Franklin D. Roosevelt responded with the New Deal plan of "reform, recovery and relief", a series of unprecedented and sweeping recovery programs and employment relief projects combined with financial reforms and regulations. Initially neutral during World War II, the U.S. began supplying war materiel to the Allies of World War II in March 1941 and entered the war in December after Japan's attack on Pearl Harbor. Agreeing to a "Europe first" policy, the U.S. concentrated its wartime efforts on Japan's allies Italy and Germany until their final defeat in May 1945. The U.S. developed the first nuclear weapons and used them against the Japanese cities of Hiroshima and Nagasaki in August 1945, ending the war. The United States was one of the "Four Policemen" who met to plan the post-war world, alongside the United Kingdom, the Soviet Union, and China. The U.S. emerged relatively unscathed from the war, with even greater economic power and international political influence. The end of World War II in 1945 left the U.S. and the Soviet Union as superpowers, each with its own political, military, and economic sphere of influence. Geopolitical tensions between the two superpowers soon led to the Cold War. The U.S. implemented a policy of containment intended to limit the Soviet Union's sphere of influence; engaged in regime change against governments perceived to be aligned with the Soviets; and prevailed in the Space Race, which culminated with the first crewed Moon landing in 1969. Domestically, the U.S. experienced economic growth, urbanization, and population growth following World War II. The civil rights movement emerged, with Martin Luther King Jr. becoming a prominent leader in the early 1960s. The Great Society plan of President Lyndon B. Johnson's administration resulted in groundbreaking and broad-reaching laws, policies and a constitutional amendment to counteract some of the worst effects of lingering institutional racism. The counterculture movement in the U.S. brought significant social changes, including the liberalization of attitudes toward recreational drug use and sexuality. It also encouraged open defiance of the military draft (leading to the end of conscription in 1973) and wide opposition to U.S. intervention in Vietnam, with the U.S. totally withdrawing in 1975. A societal shift in the roles of women was significantly responsible for the large increase in female paid labor participation starting in the 1970s, and by 1985 the majority of American women aged 16 and older were employed. The Fall of Communism and the dissolution of the Soviet Union from 1989 to 1991 marked the end of the Cold War and left the United States as the world's sole superpower. This cemented the United States' global influence, reinforcing the concept of the "American Century" as the U.S. dominated international political, cultural, economic, and military affairs. The 1990s saw the longest recorded economic expansion in American history, a dramatic decline in U.S. crime rates, and advances in technology. 
Throughout this decade, technological innovations such as the World Wide Web, the evolution of the Pentium microprocessor in accordance with Moore's law, rechargeable lithium-ion batteries, the first gene therapy trial, and cloning either emerged in the U.S. or were improved upon there. The Human Genome Project was formally launched in 1990, while Nasdaq became the first stock market in the United States to trade online in 1998. In the Gulf War of 1991, an American-led international coalition of states expelled an Iraqi invasion force that had occupied neighboring Kuwait. The September 11 attacks on the United States in 2001 by the pan-Islamist militant organization al-Qaeda led to the war on terror and subsequent military interventions in Afghanistan and in Iraq. The U.S. housing bubble culminated in 2007 with the Great Recession, the largest economic contraction since the Great Depression. In the 2010s and early 2020s, the United States has experienced increased political polarization and democratic backsliding. The country's polarization was violently reflected in the January 2021 Capitol attack, when a mob of insurrectionists entered the U.S. Capitol and sought to prevent the peaceful transfer of power in an attempted self-coup d'état. Geography The United States is the world's third-largest country by total area behind Russia and Canada.[c] The 48 contiguous states and the District of Columbia have a combined area of 3,119,885 square miles (8,080,470 km2). In 2021, the United States had 8% of the Earth's permanent meadows and pastures and 10% of its cropland. Starting in the east, the coastal plain of the Atlantic seaboard gives way to inland forests and rolling hills in the Piedmont plateau region. The Appalachian Mountains and the Adirondack Massif separate the East Coast from the Great Lakes and the grasslands of the Midwest. The Mississippi River System, the world's fourth-longest river system, runs predominantly north–south through the center of the country. The flat and fertile prairie of the Great Plains stretches to the west, interrupted by a highland region in the southeast. The Rocky Mountains, west of the Great Plains, extend north to south across the country, peaking at over 14,000 feet (4,300 m) in Colorado. The supervolcano underlying Yellowstone National Park in the Rocky Mountains, the Yellowstone Caldera, is the continent's largest volcanic feature. Farther west are the rocky Great Basin and the Chihuahuan, Sonoran, and Mojave deserts. In the northwest corner of Arizona, carved by the Colorado River, is the Grand Canyon, a steep-sided canyon and popular tourist destination known for its overwhelming visual size and intricate, colorful landscape. The Cascade and Sierra Nevada mountain ranges run close to the Pacific coast. The lowest and highest points in the contiguous United States are in the State of California, about 84 miles (135 km) apart. At an elevation of 20,310 feet (6,190.5 m), Alaska's Denali (also called Mount McKinley) is the highest peak in the country and on the continent. Active volcanoes in the U.S. are common throughout Alaska's Alexander and Aleutian Islands. Located entirely outside North America, the archipelago of Hawaii consists of volcanic islands, physiographically and ethnologically part of the Polynesian subregion of Oceania. In addition to its total land area, the United States has one of the world's largest marine exclusive economic zones spanning approximately 4.5 million square miles (11.7 million km2) of ocean. 
With its large size and geographic variety, the United States includes most climate types. East of the 100th meridian, the climate ranges from humid continental in the north to humid subtropical in the south. The western Great Plains are semi-arid. Many mountainous areas of the American West have an alpine climate. The climate is arid in the Southwest, Mediterranean in coastal California, and oceanic in coastal Oregon, Washington, and southern Alaska. Most of Alaska is subarctic or polar. Hawaii, the southern tip of Florida and U.S. territories in the Caribbean and Pacific are tropical. The United States receives more high-impact extreme weather incidents than any other country. States bordering the Gulf of Mexico are prone to hurricanes, and most of the world's tornadoes occur in the country, mainly in Tornado Alley. Due to climate change in the country, extreme weather has become more frequent in the U.S. in the 21st century, with three times the number of reported heat waves compared to the 1960s. Since the 1990s, droughts in the American Southwest have become more persistent and more severe. The regions considered as the most attractive to the population are the most vulnerable. The U.S. is one of 17 megadiverse countries containing large numbers of endemic species: about 17,000 species of vascular plants occur in the contiguous United States and Alaska, and over 1,800 species of flowering plants are found in Hawaii, few of which occur on the mainland. The United States is home to 428 mammal species, 784 birds, 311 reptiles, 295 amphibians, and around 91,000 insect species. There are 63 national parks, and hundreds of other federally managed monuments, forests, and wilderness areas, administered by the National Park Service and other agencies. About 28% of the country's land is publicly owned and federally managed, primarily in the Western States. Most of this land is protected, though some is leased for commercial use, and less than one percent is used for military purposes. Environmental issues in the United States include debates on non-renewable resources and nuclear energy, air and water pollution, biodiversity, logging and deforestation, and climate change. The U.S. Environmental Protection Agency (EPA) is the federal agency charged with addressing most environmental-related issues. The idea of wilderness has shaped the management of public lands since 1964, with the Wilderness Act. The Endangered Species Act of 1973 provides a way to protect threatened and endangered species and their habitats. The United States Fish and Wildlife Service implements and enforces the Act. In 2024, the U.S. ranked 35th among 180 countries in the Environmental Performance Index. Government and politics The United States is a federal republic of 50 states and a federal capital district, Washington, D.C. The U.S. asserts sovereignty over five unincorporated territories and several uninhabited island possessions. It is the world's oldest surviving federation, and its presidential system of federal government has been adopted, in whole or in part, by many newly independent states worldwide following their decolonization. The Constitution of the United States serves as the country's supreme legal document. Most scholars describe the United States as a liberal democracy.[r] Composed of three branches, all headquartered in Washington, D.C., the federal government is the national government of the United States. The U.S. 
Constitution establishes a separation of powers intended to provide a system of checks and balances to prevent any of the three branches from becoming supreme. The three-branch system is known as the presidential system, in contrast to the parliamentary system where the executive is part of the legislative body. Many countries around the world adopted this aspect of the 1789 Constitution of the United States, especially in the postcolonial Americas. In the U.S. federal system, sovereign powers are shared between three levels of government specified in the Constitution: the federal government, the states, and Indian tribes. The U.S. also asserts sovereignty over five permanently inhabited territories: American Samoa, Guam, the Northern Mariana Islands, Puerto Rico, and the U.S. Virgin Islands. Residents of the 50 states are governed by their elected state government, under state constitutions compatible with the national constitution, and by elected local governments that are administrative divisions of a state. States are subdivided into counties or county equivalents, and (except for Hawaii) further divided into municipalities, each administered by elected representatives. The District of Columbia is a federal district containing the U.S. capital, Washington, D.C. The federal district is an administrative division of the federal government. Indian country is made up of 574 federally recognized tribes and 326 Indian reservations. They hold a government-to-government relationship with the U.S. federal government in Washington and are legally defined as domestic dependent nations with inherent tribal sovereignty rights. In addition to the five major territories, the U.S. also asserts sovereignty over the United States Minor Outlying Islands in the Pacific Ocean and the Caribbean. The seven undisputed islands without permanent populations are Baker Island, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Atoll, and Palmyra Atoll. U.S. sovereignty over the unpopulated Bajo Nuevo Bank, Navassa Island, Serranilla Bank, and Wake Island is disputed. The Constitution is silent on political parties. However, they developed independently in the 18th century with the Federalist and Anti-Federalist parties. Since then, the United States has operated as a de facto two-party system, though the parties have changed over time. Since the mid-19th century, the two main national parties have been the Democratic Party and the Republican Party. The former is perceived as relatively liberal in its political platform while the latter is perceived as relatively conservative in its platform. The United States has an established structure of foreign relations, with the world's second-largest diplomatic corps as of 2024[update]. It is a permanent member of the United Nations Security Council and home to the United Nations headquarters. The United States is a member of the G7, G20, and OECD intergovernmental organizations. Almost all countries have embassies and many have consulates (official representatives) in the country. Likewise, nearly all countries host formal diplomatic missions with the United States, except Iran, North Korea, and Bhutan. Though Taiwan does not have formal diplomatic relations with the U.S., it maintains close unofficial relations. The United States regularly supplies Taiwan with military equipment to deter potential Chinese aggression. 
Its geopolitical attention also turned to the Indo-Pacific when the United States joined the Quadrilateral Security Dialogue with Australia, India, and Japan. The United States has a "Special Relationship" with the United Kingdom and strong ties with Canada, Australia, New Zealand, the Philippines, Japan, South Korea, Israel, and several European Union countries such as France, Italy, Germany, Spain, and Poland. The U.S. works closely with its NATO allies on military and national security issues, and with countries in the Americas through the Organization of American States and the United States–Mexico–Canada Agreement (USMCA), a free trade pact. The U.S. exercises full international defense authority and responsibility for Micronesia, the Marshall Islands, and Palau through the Compact of Free Association. It has increasingly conducted strategic cooperation with India, while its ties with China have steadily deteriorated. Beginning in 2014, the U.S. became a key ally of Ukraine. After Donald Trump was elected U.S. president in 2024, he sought to negotiate an end to the Russo-Ukrainian War. He paused all military aid to Ukraine in March 2025, although the aid resumed later. Trump also ended U.S. intelligence sharing with the country, but this too was eventually restored. The president is the commander-in-chief of the United States Armed Forces and appoints its leaders, the secretary of defense and the Joint Chiefs of Staff. The Department of Defense, headquartered at the Pentagon near Washington, D.C., administers five of the six service branches, which are made up of the U.S. Army, Marine Corps, Navy, Air Force, and Space Force. The Coast Guard is administered by the Department of Homeland Security in peacetime and can be transferred to the Department of the Navy in wartime. The total strength of the military is about 1.3 million active-duty personnel, with an additional 400,000 in reserve. The United States spent $997 billion on its military in 2024, which is by far the largest amount of any country, making up 37% of global military spending and accounting for 3.4% of the country's GDP. The U.S. possesses 42% of the world's nuclear weapons, the second-largest stockpile after that of Russia. The U.S. military is widely regarded as the most powerful and advanced in the world. The United States has the third-largest combined armed forces in the world, behind the Chinese People's Liberation Army and Indian Armed Forces. The U.S. military operates about 800 bases and facilities abroad, and maintains deployments of more than 100 active-duty personnel in 25 foreign countries. The United States has engaged in over 400 military interventions since its founding in 1776, with over half of these occurring between 1950 and 2019 and 25% occurring in the post-Cold War era. State defense forces (SDFs) are military units that operate under the sole authority of a state government. SDFs are authorized by state and federal law but are under the command of the state's governor. By contrast, the 54 U.S. National Guard organizations[t] fall under the dual control of state or territorial governments and the federal government; their units can also become federalized entities, but SDFs cannot be federalized. The National Guard personnel of a state or territory can be federalized by the president under the National Defense Act Amendments of 1933; this legislation created the Guard and provides for the integration of Army National Guard and Air National Guard units and personnel into the U.S. Army and (since 1947) the U.S. Air Force.
The total number of National Guard members is about 430,000, while the estimated combined strength of SDFs is less than 10,000. There are about 18,000 police agencies in the United States, ranging from the local to the national level. Law in the United States is mainly enforced by local police departments and sheriff's departments in their municipal or county jurisdictions. State police departments have authority in their respective states, and federal agencies such as the Federal Bureau of Investigation (FBI) and the U.S. Marshals Service have national jurisdiction and specialized duties, such as protecting civil rights and national security, enforcing U.S. federal courts' rulings and federal laws, and investigating interstate criminal activity. State courts conduct almost all civil and criminal trials, while federal courts adjudicate the much smaller number of civil and criminal cases that relate to federal law. There is no unified "criminal justice system" in the United States. The American prison system is largely heterogeneous, with thousands of relatively independent systems operating across federal, state, local, and tribal levels. In 2025, "these systems hold nearly 2 million people in 1,566 state prisons, 98 federal prisons, 3,116 local jails, 1,277 juvenile correctional facilities, 133 immigration detention facilities, and 80 Indian country jails, as well as in military prisons, civil commitment centers, state psychiatric hospitals, and prisons in the U.S. territories." Despite disparate systems of confinement, four main institutions dominate: federal prisons, state prisons, local jails, and juvenile correctional facilities. Federal prisons are run by the Federal Bureau of Prisons and hold pretrial detainees as well as people who have been convicted of federal crimes. State prisons, run by the department of corrections of each state, hold people sentenced and serving prison time (usually longer than one year) for felony offenses. Local jails are county or municipal facilities that incarcerate defendants prior to trial; they also hold those serving short sentences (typically under a year). Juvenile correctional facilities are operated by local or state governments and serve as longer-term placements for any minor adjudicated as delinquent and ordered by a judge to be confined. In January 2023, the United States had the sixth-highest per capita incarceration rate in the world, at 531 people per 100,000 inhabitants, and the largest prison and jail population in the world, with more than 1.9 million people incarcerated. An analysis of the World Health Organization Mortality Database from 2010 showed U.S. homicide rates "were 7 times higher than in other high-income countries, driven by a gun homicide rate that was 25 times higher". Economy The U.S. has a highly developed mixed economy that has been the world's largest nominally since about 1890. Its 2024 gross domestic product (GDP)[e] of more than $29 trillion constituted over 25% of nominal global economic output, or 15% at purchasing power parity (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks first in the world by nominal GDP, second when adjusted for purchasing power parities (PPP), and ninth by PPP-adjusted GDP per capita. In February 2024, the total U.S. federal government debt was $34.4 trillion. Of the world's 500 largest companies by revenue, 138 were headquartered in the U.S. in 2025, the highest number of any country. The U.S.
dollar is the currency most used in international transactions and the world's foremost reserve currency, backed by the country's dominant economy, its military, the petrodollar system, its large U.S. treasuries market, and its linked eurodollar. Several countries use it as their official currency, and in others it is the de facto currency. The U.S. has free trade agreements with several countries, including the USMCA. Although the United States has reached a post-industrial level of economic development and is often described as having a service economy, it remains a major industrial power; in 2024, the U.S. manufacturing sector was the world's second-largest by value output after China's. New York City is the world's principal financial center, and its metropolitan area is the world's largest metropolitan economy. The New York Stock Exchange and Nasdaq, both located in New York City, are the world's two largest stock exchanges by market capitalization and trade volume. The United States is at the forefront of technological advancement and innovation in many economic fields, especially in artificial intelligence; electronics and computers; pharmaceuticals; and medical, aerospace and military equipment. The country's economy is fueled by abundant natural resources, a well-developed infrastructure, and high productivity. The largest trading partners of the United States are the European Union, Mexico, Canada, China, Japan, South Korea, the United Kingdom, Vietnam, India, and Taiwan. The United States is the world's largest importer and second-largest exporter.[u] It is by far the world's largest exporter of services. Americans have the highest average household and employee income among OECD member states, and the fourth-highest median household income in 2023, up from sixth-highest in 2013. With personal consumption expenditures of over $18.5 trillion in 2023, the U.S. has a heavily consumer-driven economy and is the world's largest consumer market. The U.S. ranked first in the number of dollar billionaires and millionaires in 2023, with 735 billionaires and nearly 22 million millionaires. Wealth in the United States is highly concentrated; in 2011, the richest 10% of the adult population owned 72% of the country's household wealth, while the bottom 50% owned just 2%. U.S. wealth inequality increased substantially since the late 1980s, and income inequality in the U.S. reached a record high in 2019. In 2024, the country had some of the highest wealth and income inequality levels among OECD countries. Since the 1970s, there has been a decoupling of U.S. wage gains from worker productivity. In 2016, the top fifth of earners took home more than half of all income, giving the U.S. one of the widest income distributions among OECD countries. There were about 771,480 homeless persons in the U.S. in 2024. In 2022, 6.4 million children experienced food insecurity. Feeding America estimates that around one in five, or approximately 13 million, children experience hunger in the U.S. and do not know where or when they will get their next meal. Also in 2022, about 37.9 million people, or 11.5% of the U.S. population, were living in poverty. The United States has a smaller welfare state and redistributes less income through government action than most other high-income countries. It is the only advanced economy that does not guarantee its workers paid vacation nationally and one of a few countries in the world without federal paid family leave as a legal right. 
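Two of the figures earlier in this section, the 1983–2008 compounded growth comparison and the February 2024 federal debt, are cumulative quantities that are easier to interpret when worked out. The sketch below is a back-of-the-envelope check using only numbers quoted above; treating 1983–2008 as 25 full years of compounding and dividing the debt figure by the 2024 GDP figure are simplifying assumptions, not claims from the source.

```python
# Back-of-the-envelope checks on figures quoted in the economy section.
# Assumptions (not from the source): 1983-2008 is treated as 25 full years
# of compounding, and the Feb 2024 debt is divided by the 2024 GDP figure.

us_growth, g7_growth, years = 0.033, 0.023, 25
us_factor = (1 + us_growth) ** years    # ~2.25x cumulative real growth
g7_factor = (1 + g7_growth) ** years    # ~1.77x for the rest of the G7

debt, gdp = 34.4e12, 29e12              # dollars
debt_to_gdp = debt / gdp                # ~1.19, i.e. roughly 119% of GDP

print(f"Cumulative real growth, 1983-2008: US {us_factor:.2f}x vs rest-of-G7 {g7_factor:.2f}x")
print(f"Federal debt as a share of GDP: {debt_to_gdp:.0%}")
```

At those rates, a one-percentage-point difference in annual growth compounds to roughly 27% more cumulative output over a quarter century.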
The United States has a higher percentage of low-income workers than almost any other developed country, largely because of a weak collective bargaining system and lack of government support for at-risk workers. The United States has been a leader in technological innovation since the late 19th century and scientific research since the mid-20th century. Methods for producing interchangeable parts and the establishment of a machine tool industry enabled the large-scale manufacturing of U.S. consumer products in the late 19th century. By the early 20th century, factory electrification, the introduction of the assembly line, and other labor-saving techniques created the system of mass production. In the 21st century, the United States continues to be one of the world's foremost scientific powers, though China has emerged as a major competitor in many fields. The U.S. has the highest research and development expenditures of any country and ranks ninth as a percentage of GDP. In 2022, the United States was (after China) the country with the second-highest number of published scientific papers. In 2021, the U.S. ranked second (also after China) by the number of patent applications, and third by trademark and industrial design applications (after China and Germany), according to World Intellectual Property Indicators. In 2025 the United States ranked third (after Switzerland and Sweden) in the Global Innovation Index. The United States is considered to be a world leader in the development of artificial intelligence technology. In 2023, the United States was ranked the second most technologically advanced country in the world (after South Korea) by Global Finance magazine. The United States has maintained a space program since the late 1950s, beginning with the establishment of the National Aeronautics and Space Administration (NASA) in 1958. NASA's Apollo program (1961–1972) achieved the first crewed Moon landing with the 1969 Apollo 11 mission; it remains one of the agency's most significant milestones. Other major endeavors by NASA include the Space Shuttle program (1981–2011), the Voyager program (1972–present), the Hubble and James Webb space telescopes (launched in 1990 and 2021, respectively), and the multi-mission Mars Exploration Program (Spirit and Opportunity, Curiosity, and Perseverance). NASA is one of five agencies collaborating on the International Space Station (ISS); U.S. contributions to the ISS include several modules, including Destiny (2001), Harmony (2007), and Tranquility (2010), as well as ongoing logistical and operational support. The United States private sector dominates the global commercial spaceflight industry. Prominent American spaceflight contractors include Blue Origin, Boeing, Lockheed Martin, Northrop Grumman, and SpaceX. NASA programs such as the Commercial Crew Program, Commercial Resupply Services, Commercial Lunar Payload Services, and NextSTEP have facilitated growing private-sector involvement in American spaceflight. In 2023, the United States received approximately 84% of its energy from fossil fuel, and its largest source of energy was petroleum (38%), followed by natural gas (36%), renewable sources (9%), coal (9%), and nuclear power (9%). In 2022, the United States constituted about 4% of the world's population, but consumed around 16% of the world's energy. The U.S. ranks as the second-highest emitter of greenhouse gases behind China. The U.S. is the world's largest producer of nuclear power, generating around 30% of the world's nuclear electricity. 
It also has the highest number of nuclear power reactors of any country. From 2024, the U.S. plans to triple its nuclear power capacity by 2050. The United States' 4 million miles (6.4 million kilometers) of road network, owned almost entirely by state and local governments, is the longest in the world. The extensive Interstate Highway System that connects all major U.S. cities is funded mostly by the federal government but maintained by state departments of transportation. The system is further extended by state highways and some private toll roads. The U.S. is among the top ten countries with the highest vehicle ownership per capita (850 vehicles per 1,000 people) in 2022. A 2022 study found that 76% of U.S. commuters drive alone and 14% ride a bicycle, including bike owners and users of bike-sharing networks. About 11% use some form of public transportation. Public transportation in the United States is well developed in the largest urban areas, notably New York City, Washington, D.C., Boston, Philadelphia, Chicago, and San Francisco; otherwise, coverage is generally less extensive than in most other developed countries. The U.S. also has many relatively car-dependent localities. Long-distance intercity travel is provided primarily by airlines, but travel by rail is more common along the Northeast Corridor, the only high-speed rail in the U.S. that meets international standards. Amtrak, the country's government-sponsored national passenger rail company, has a relatively sparse network compared to that of Western European countries. Service is concentrated in the Northeast, California, the Midwest, the Pacific Northwest, and Virginia/Southeast. The United States has an extensive air transportation network. U.S. civilian airlines are all privately owned. The three largest airlines in the world, by total number of passengers carried, are U.S.-based; American Airlines became the global leader after its 2013 merger with US Airways. Of the 50 busiest airports in the world, 16 are in the United States, as well as five of the top 10. The world's busiest airport by passenger volume is Hartsfield–Jackson Atlanta International in Atlanta, Georgia. In 2022, most of the 19,969 U.S. airports were owned and operated by local government authorities, and there are also some private airports. Some 5,193 are designated as "public use", including for general aviation. The Transportation Security Administration (TSA) has provided security at most major airports since 2001. The country's rail transport network, the longest in the world at 182,412.3 mi (293,564.2 km), handles mostly freight (in contrast to more passenger-centered rail in Europe). Because they are often privately owned operations, U.S. railroads lag behind those of the rest of the world in terms of electrification. The country's inland waterways are the world's fifth-longest, totaling 25,482 mi (41,009 km). They are used extensively for freight, recreation, and a small amount of passenger traffic. Of the world's 50 busiest container ports, four are located in the United States, with the busiest in the country being the Port of Los Angeles. Demographics The U.S. Census Bureau reported 331,449,281 residents on April 1, 2020,[v] making the United States the third-most-populous country in the world, after India and China. The Census Bureau's official 2025 population estimate was 341,784,857, an increase of 3.1% since the 2020 census. According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S. 
population had a net gain of one person every 16 seconds, or about 5400 people per day. In 2023, 51% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 34% had never been married. In 2023, the total fertility rate for the U.S. stood at 1.6 children per woman, and, at 23%, it had the world's highest rate of children living in single-parent households in 2019. Most Americans live in the suburbs of major metropolitan areas. The United States has a diverse population; 37 ancestry groups have more than one million members. White Americans with ancestry from Europe, the Middle East, or North Africa form the largest racial and ethnic group at 57.8% of the United States population. Hispanic and Latino Americans form the second-largest group and are 18.7% of the United States population. African Americans constitute the country's third-largest ancestry group and are 12.1% of the total U.S. population. Asian Americans are the country's fourth-largest group, composing 5.9% of the United States population. The country's 3.7 million Native Americans account for about 1%, and some 574 native tribes are recognized by the federal government. In 2024, the median age of the United States population was 39.1 years. While many languages and dialects are spoken in the United States, English is by far the most commonly spoken and written. De facto, English is the official language of the United States, and in 2025, Executive Order 14224 declared English official. However, the U.S. has never had a de jure official language, as Congress has never passed a law to designate English as official for all three federal branches. Some laws, such as U.S. naturalization requirements, nonetheless standardize English. Twenty-eight states and the United States Virgin Islands have laws that designate English as the sole official language; 19 states and the District of Columbia have no official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English: Hawaii (Hawaiian), Alaska (twenty Native languages),[w] South Dakota (Sioux), American Samoa (Samoan), Puerto Rico (Spanish), Guam (Chamorro), and the Northern Mariana Islands (Carolinian and Chamorro). In total, 169 Native American languages are spoken in the United States. In Puerto Rico, Spanish is more widely spoken than English. According to the American Community Survey (2020), some 245.4 million people in the U.S. age five and older spoke only English at home. About 41.2 million spoke Spanish at home, making it the second most commonly used language. Other languages spoken at home by one million people or more include Chinese (3.40 million), Tagalog (1.71 million), Vietnamese (1.52 million), Arabic (1.39 million), French (1.18 million), Korean (1.07 million), and Russian (1.04 million). German, spoken by 1 million people at home in 2010, fell to 857,000 total speakers in 2020. America's immigrant population is by far the world's largest in absolute terms. In 2022, there were 87.7 million immigrants and U.S.-born children of immigrants in the United States, accounting for nearly 27% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 6% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants. 
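The Population Clock and census figures quoted above are simple rate and growth conversions; the minimal sketch below reproduces them from the stated inputs (the only computation added is dividing the seconds in a day by the per-person interval and comparing the two census-era counts).

```python
# Reproducing the demographic arithmetic quoted above from its stated inputs.

seconds_per_person = 16
daily_gain = 24 * 60 * 60 / seconds_per_person   # 86,400 / 16 = 5,400 people/day

pop_2020 = 331_449_281    # April 1, 2020 census count
pop_2025 = 341_784_857    # official 2025 estimate
growth = (pop_2025 - pop_2020) / pop_2020        # ~0.031, the stated 3.1%

print(f"Net gain: about {daily_gain:,.0f} people per day")
print(f"Growth since the 2020 census: {growth:.1%}")
```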
In 2019, the top countries of origin for immigrants were Mexico (24% of immigrants), India (6%), China (5%), the Philippines (4.5%), and El Salvador (3%). In fiscal year 2022, over one million immigrants (most of whom entered through family reunification) were granted legal residence. The undocumented immigrant population in the U.S. reached a record high of 14 million in 2023. The First Amendment guarantees the free exercise of religion in the country and forbids Congress from passing laws respecting its establishment. Religious practice is widespread, among the most diverse in the world, and profoundly vibrant. The country has the world's largest Christian population, which includes the fourth-largest population of Catholics. Other notable faiths include Judaism, Buddhism, Hinduism, Islam, New Age, and Native American religions. Religious practice varies significantly by region. "Ceremonial deism" is common in American culture. The overwhelming majority of Americans believe in a higher power or spiritual force, engage in spiritual practices such as prayer, and consider themselves religious or spiritual. In the Southern United States' "Bible Belt", evangelical Protestantism plays a significant role culturally; New England and the Western United States tend to be more secular. Mormonism, a Restorationist movement founded in the U.S. in 1830, is the predominant religion in Utah and a major religion in Idaho. About 82% of Americans live in metropolitan areas, particularly in suburbs; about half of those reside in cities with populations over 50,000. In 2022, 333 incorporated municipalities had populations over 100,000, nine cities had more than one million residents, and four cities—New York City, Los Angeles, Chicago, and Houston—had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West. According to the Centers for Disease Control and Prevention (CDC), average U.S. life expectancy at birth reached 79.0 years in 2024, its highest recorded level. This was an increase of 0.6 years over 2023. The CDC attributed the improvement to a significant fall in the number of fatal drug overdoses in the country, noting that "heart disease continues to be the leading cause of death in the United States, followed by cancer and unintentional injuries." In 2024, life expectancy at birth for American men rose to 76.5 years (+0.7 years compared to 2023), while life expectancy for women was 81.4 years (+0.3 years). Starting in 1998, life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been increasing ever since. The Commonwealth Fund reported in 2020 that the U.S. had the highest suicide rate among high-income countries. Approximately one-third of the U.S. adult population is obese and another third is overweight. The U.S. healthcare system far outspends that of any other country, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes when compared to peer countries for reasons that are debated. The United States is the only developed country without a system of universal healthcare, and a significant proportion of the population does not carry health insurance. Government-funded healthcare coverage for the poor (Medicaid) and for those age 65 and older (Medicare) is available to Americans who meet the programs' income or age qualifications. 
In 2010, then-President Obama passed the Patient Protection and Affordable Care Act.[x] Abortion in the United States is not federally protected, and is illegal or restricted in 17 states. American primary and secondary education, known in the U.S. as K–12 ("kindergarten through 12th grade"), is decentralized. School systems are operated by state, territorial, and sometimes municipal governments and regulated by the U.S. Department of Education. In general, children are required to attend school or an approved homeschool from the age of five or six (kindergarten or first grade) until they are 18 years old. This often brings students through the 12th grade, the final year of a U.S. high school, but some states and territories allow them to leave school earlier, at age 16 or 17. The U.S. spends more on education per student than any other country, an average of $18,614 per year per public elementary and secondary school student in 2020–2021. Among Americans age 25 and older, 92.2% graduated from high school, 62.7% attended some college, 37.7% earned a bachelor's degree, and 14.2% earned a graduate degree. The U.S. literacy rate is near-universal. The U.S. has produced the most Nobel Prize winners of any country, with 411 (having won 413 awards). U.S. tertiary or higher education has earned a global reputation. Many of the world's top universities, as listed by various ranking organizations, are in the United States, including 19 of the top 25. American higher education is dominated by state university systems, although the country's many private universities and colleges enroll about 20% of all American students. Local community colleges generally offer open admissions, lower tuition, and coursework leading to a two-year associate degree or a non-degree certificate. As for public expenditures on higher education, the U.S. spends more per student than the OECD average, and Americans spend more than all nations in combined public and private spending. Colleges and universities directly funded by the federal government do not charge tuition and are limited to military personnel and government employees, including: the U.S. service academies, the Naval Postgraduate School, and military staff colleges. Despite some student loan forgiveness programs in place, student loan debt increased by 102% between 2010 and 2020, and exceeded $1.7 trillion in 2022. Culture and society The United States is home to a wide variety of ethnic groups, traditions, and customs. The country has been described as having the values of individualism and personal autonomy, as well as a strong work ethic and competitiveness. Voluntary altruism towards others also plays a major role; according to a 2016 study by the Charities Aid Foundation, Americans donated 1.44% of total GDP to charity—the highest rate in the world by a large margin. Americans have traditionally been characterized by a unifying political belief in an "American Creed" emphasizing consent of the governed, liberty, equality under the law, democracy, social equality, property rights, and a preference for limited government. The U.S. has acquired significant hard and soft power through its diplomatic influence, economic power, military alliances, and cultural exports such as American movies, music, video games, sports, and food. The influence that the United States exerts on other countries through soft power is referred to as Americanization. 
Nearly all present Americans or their ancestors came from Europe, Africa, or Asia (the "Old World") within the past five centuries. Mainstream American culture is a Western culture largely derived from the traditions of European immigrants with influences from many other sources, such as traditions brought by slaves from Africa. More recent immigration from Asia and especially Latin America has added to a cultural mix that has been described as a homogenizing melting pot, and a heterogeneous salad bowl, with immigrants contributing to, and often assimilating into, mainstream American culture. Under the First Amendment to the Constitution, the United States is considered to have the strongest protections of free speech of any country. Flag desecration, hate speech, blasphemy, and lese majesty are all forms of protected expression. A 2016 Pew Research Center poll found that Americans were the most supportive of free expression of any polity measured. Additionally, they are the "most supportive of freedom of the press and the right to use the Internet without government censorship". The U.S. is a socially progressive country with permissive attitudes surrounding human sexuality. LGBTQ rights in the United States are among the most advanced by global standards. The American Dream, or the perception that Americans enjoy high levels of social mobility, plays a key role in attracting immigrants. Whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a classless society, scholars identify significant differences between the country's social classes, affecting socialization, language, and values. Americans tend to greatly value socioeconomic achievement, but being ordinary or average is promoted by some as a noble condition as well. The National Foundation on the Arts and the Humanities is an agency of the United States federal government that was established in 1965 to "develop and promote a broadly conceived national policy of support for the humanities and the arts in the United States, and for institutions which preserve the cultural heritage of the United States." It is composed of four sub-agencies, including the National Endowment for the Arts and the National Endowment for the Humanities. Colonial American authors were influenced by John Locke and other Enlightenment philosophers. The American Revolutionary Period (1765–1783) is notable for the political writings of Benjamin Franklin, Alexander Hamilton, Thomas Paine, and Thomas Jefferson. Shortly before and after the Revolutionary War, the newspaper rose to prominence, filling a demand for anti-British national literature. An early novel is William Hill Brown's The Power of Sympathy, published in 1789. Writer and critic John Neal in the early- to mid-19th century helped advance America toward a unique literature and culture by criticizing predecessors such as Washington Irving for imitating their British counterparts, and by influencing writers such as Edgar Allan Poe, who took American poetry and short fiction in new directions. Ralph Waldo Emerson and Margaret Fuller pioneered the influential Transcendentalism movement; Henry David Thoreau, author of Walden, was influenced by this movement. The conflict surrounding abolitionism inspired writers like Harriet Beecher Stowe and authors of slave narratives such as Frederick Douglass. Nathaniel Hawthorne's The Scarlet Letter (1850) explored the dark side of American history, as did Herman Melville's Moby-Dick (1851). 
Major American poets of the 19th century American Renaissance include Walt Whitman, Melville, and Emily Dickinson. Mark Twain was the first major American writer to be born in the West. Henry James achieved international recognition with novels like The Portrait of a Lady (1881). As literacy rates rose, periodicals published more stories centered around industrial workers, women, and the rural poor. Naturalism, regionalism, and realism were the major literary movements of the period. While modernism generally took on an international character, modernist authors working within the United States more often rooted their work in specific regions, peoples, and cultures. Following the Great Migration to northern cities, African-American and black West Indian authors of the Harlem Renaissance developed an independent tradition of literature that rebuked a history of inequality and celebrated black culture. An important cultural export during the Jazz Age, these writings were a key influence on Négritude, a philosophy emerging in the 1930s among francophone writers of the African diaspora. In the 1950s, an ideal of homogeneity led many authors to attempt to write the Great American Novel, while the Beat Generation rejected this conformity, using styles that elevated the impact of the spoken word over mechanics to describe drug use, sexuality, and the failings of society. Contemporary literature is more pluralistic than in previous eras, with the closest thing to a unifying feature being a trend toward self-conscious experiments with language. Twelve American laureates have won the Nobel Prize in Literature. Media in the United States is broadly uncensored, with the First Amendment providing significant protections, as reiterated in New York Times Co. v. United States. The four major broadcasters in the U.S. are the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), American Broadcasting Company (ABC), and Fox Broadcasting Company (Fox). The four major broadcast television networks are all commercial entities. The U.S. cable television system offers hundreds of channels catering to a variety of niches. In 2021, about 83% of Americans over age 12 listened to broadcast radio, while about 40% listened to podcasts. In the prior year, there were 15,460 licensed full-power radio stations in the U.S. according to the Federal Communications Commission (FCC). Much of the public radio broadcasting is supplied by National Public Radio (NPR), incorporated in February 1970 under the Public Broadcasting Act of 1967. U.S. newspapers with a global reach and reputation include The Wall Street Journal, The New York Times, The Washington Post, and USA Today. About 800 publications are produced in Spanish. With few exceptions, newspapers are privately owned, either by large chains such as Gannett or McClatchy, which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in an increasingly rare situation, by individuals or families. Major cities often have alternative newspapers to complement the mainstream daily papers, such as The Village Voice in New York City and LA Weekly in Los Angeles. The five most-visited websites in the world are Google, YouTube, Facebook, Instagram, and ChatGPT—all of them American-owned. Other popular platforms used include X (formerly Twitter) and Amazon. In 2025, the U.S. was the world's second-largest video game market by revenue (after China). In 2015, the U.S. 
video game industry consisted of 2,457 companies that supported around 220,000 jobs and generated $30.4 billion in revenue. There are 444 game publishers, developers, and hardware companies in California alone. According to the Game Developers Conference (GDC), the U.S. is the top location for video game development, with 58% of the world's game developers based there in 2025. The United States is well known for its theater. Mainstream theater in the United States derives from the old European theatrical tradition and has been heavily influenced by the British theater. By the middle of the 19th century, America had created new distinct dramatic forms in the Tom Shows, the showboat theater and the minstrel show. The central hub of the American theater scene is the Theater District in Manhattan, with its divisions of Broadway, off-Broadway, and off-off-Broadway. Many movie and television celebrities have gotten their big break working in New York productions. Outside New York City, many cities have professional regional or resident theater companies that produce their own seasons. The biggest-budget theatrical productions are musicals. U.S. theater has an active community theater culture. The Tony Awards recognize excellence in live Broadway theater and are presented at an annual ceremony in Manhattan. The awards are given for Broadway productions and performances. One is also given for regional theater. Several discretionary non-competitive awards are given as well, including a Special Tony Award, the Tony Honors for Excellence in Theatre, and the Isabelle Stevenson Award. Folk art in colonial America grew out of artisanal craftsmanship in communities that allowed commonly trained people to individually express themselves. It was distinct from Europe's tradition of high art, which was less accessible and generally less relevant to early American settlers. Cultural movements in art and craftsmanship in colonial America generally lagged behind those of Western Europe. For example, the prevailing medieval style of woodworking and primitive sculpture became integral to early American folk art, despite the emergence of Renaissance styles in England in the late 16th and early 17th centuries. The new English styles would have been early enough to make a considerable impact on American folk art, but American styles and forms had already been firmly adopted. Not only did styles change slowly in early America, but there was a tendency for rural artisans there to continue their traditional forms longer than their urban counterparts did—and far longer than those in Western Europe. The Hudson River School was a mid-19th-century movement in the visual arts tradition of European naturalism. The 1913 Armory Show in New York City, an exhibition of European modernist art, shocked the public and transformed the U.S. art scene. American Realism and American Regionalism sought to reflect America and give it new ways of looking at itself. Georgia O'Keeffe, Marsden Hartley, and others experimented with new and individualistic styles, which would become known as American modernism. Major artistic movements such as the abstract expressionism of Jackson Pollock and Willem de Kooning and the pop art of Andy Warhol and Roy Lichtenstein developed largely in the United States. Major photographers include Alfred Stieglitz, Edward Steichen, Dorothea Lange, Edward Weston, James Van Der Zee, Ansel Adams, and Gordon Parks. 
The tide of modernism and then postmodernism has brought global fame to American architects, including Frank Lloyd Wright, Philip Johnson, and Frank Gehry. The Metropolitan Museum of Art in Manhattan is the largest art museum in the United States and the fourth-largest in the world. American folk music encompasses numerous music genres, variously known as traditional music, traditional folk music, contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and sometimes trace back to such origins as the British Isles, mainland Europe, or Africa. The rhythmic and lyrical styles of African-American music in particular have influenced American music. Banjos were brought to America through the slave trade. Minstrel shows incorporating the instrument into their acts led to its increased popularity and widespread production in the 19th century. The electric guitar, first invented in the 1930s, and mass-produced by the 1940s, had an enormous influence on popular music, in particular due to the development of rock and roll. The synthesizer, turntablism, and electronic music were also largely developed in the U.S. Elements from folk idioms such as the blues and old-time music were adopted and transformed into popular genres with global audiences. Jazz grew from blues and ragtime in the early 20th century, developing from the innovations and recordings of composers such as W.C. Handy and Jelly Roll Morton. Louis Armstrong and Duke Ellington increased its popularity early in the 20th century. Country music developed in the 1920s, bluegrass and rhythm and blues in the 1940s, and rock and roll in the 1950s. In the 1960s, Bob Dylan emerged from the folk revival to become one of the country's most celebrated songwriters. The musical forms of punk and hip hop both originated in the United States in the 1970s. The United States has the world's largest music market, with a total retail value of $15.9 billion in 2022. Most of the world's major record companies are based in the U.S.; they are represented by the Recording Industry Association of America (RIAA). Mid-20th-century American pop stars, such as Frank Sinatra and Elvis Presley, became global celebrities and best-selling music artists, as have artists of the late 20th century, such as Michael Jackson, Madonna, Whitney Houston, and Mariah Carey, and of the early 21st century, such as Eminem, Britney Spears, Lady Gaga, Katy Perry, Taylor Swift and Beyoncé. The United States has the world's largest apparel market by revenue. Apart from professional business attire, American fashion is eclectic and predominantly informal. Americans' diverse cultural roots are reflected in their clothing; however, sneakers, jeans, T-shirts, and baseball caps are emblematic of American styles. New York, with its Fashion Week, is considered to be one of the "Big Four" global fashion capitals, along with Paris, Milan, and London. A study demonstrated that general proximity to Manhattan's Garment District has been synonymous with American fashion since its inception in the early 20th century. A number of well-known designer labels, among them Tommy Hilfiger, Ralph Lauren, Tom Ford and Calvin Klein, are headquartered in Manhattan. Labels cater to niche markets, such as preteens. New York Fashion Week is one of the most influential fashion shows in the world, and is held twice each year in Manhattan; the annual Met Gala, also in Manhattan, has been called the fashion world's "biggest night". The U.S. 
film industry has a worldwide influence and following. Hollywood, a district in central Los Angeles, the nation's second-most populous city, is also metonymous for the American filmmaking industry. The major film studios of the United States are the primary source of the most commercially successful movies selling the most tickets in the world. Largely centered in the New York City region from its beginnings in the late 19th century through the first decades of the 20th century, the U.S. film industry has since been primarily based in and around Hollywood. Nonetheless, American film companies have been subject to the forces of globalization in the 21st century, and an increasing number of films are made elsewhere. The Academy Awards, popularly known as "the Oscars", have been held annually by the Academy of Motion Picture Arts and Sciences since 1929, and the Golden Globe Awards have been held annually since January 1944. The industry peaked in what is commonly referred to as the "Golden Age of Hollywood", from the early sound period until the early 1960s, with screen actors such as John Wayne and Marilyn Monroe becoming iconic figures. In the 1970s, "New Hollywood", or the "Hollywood Renaissance", was defined by grittier films influenced by French and Italian realist pictures of the post-war period. The 21st century has been marked by the rise of American streaming platforms, which came to rival traditional cinema. Early settlers were introduced by Native Americans to foods such as turkey, sweet potatoes, corn, squash, and maple syrup. Of the most enduring and pervasive examples are variations of the native dish called succotash. Early settlers and later immigrants combined these with foods they were familiar with, such as wheat flour, beef, and milk, to create a distinctive American cuisine. New World crops, especially pumpkin, corn, potatoes, and turkey as the main course are part of a shared national menu on Thanksgiving, when many Americans prepare or purchase traditional dishes to celebrate the occasion. Characteristic American dishes such as apple pie, fried chicken, doughnuts, french fries, macaroni and cheese, ice cream, hamburgers, hot dogs, and American pizza derive from the recipes of various immigrant groups. Mexican dishes such as burritos and tacos preexisted the United States in areas later annexed from Mexico, and adaptations of Chinese cuisine as well as pasta dishes freely adapted from Italian sources are all widely consumed. American chefs have had a significant impact on society both domestically and internationally. In 1946, the Culinary Institute of America was founded by Katharine Angell and Frances Roth. This would become the United States' most prestigious culinary school, where many of the most talented American chefs would study prior to successful careers. The United States restaurant industry was projected at $899 billion in sales for 2020, and employed more than 15 million people, representing 10% of the nation's workforce directly. It is the country's second-largest private employer and the third-largest employer overall. The United States is home to over 220 Michelin star-rated restaurants, 70 of which are in New York City. Wine has been produced in what is now the United States since the 1500s, with the first widespread production beginning in what is now New Mexico in 1628. In the modern U.S., wine production is undertaken in all fifty states, with California producing 84 percent of all U.S. wine. 
With more than 1,100,000 acres (4,500 km2) under vine, the United States is the fourth-largest wine-producing country in the world, after Italy, Spain, and France. The classic American diner, a casual restaurant type originally intended for the working class, emerged during the 19th century from converted railroad dining cars made stationary. The diner soon evolved into purpose-built structures whose number expanded greatly in the 20th century. The American fast-food industry developed alongside the nation's car culture. American restaurants developed the drive-in format in the 1920s, which they began to replace with the drive-through format by the 1940s. American fast-food restaurant chains, such as McDonald's, Burger King, Chick-fil-A, Kentucky Fried Chicken, Dunkin' Donuts and many others, have numerous outlets around the world. The most popular spectator sports in the U.S. are American football, basketball, baseball, soccer, and ice hockey. Their premier leagues are, respectively, the National Football League, the National Basketball Association, Major League Baseball, Major League Soccer, and the National Hockey League. All these leagues enjoy wide-ranging domestic media coverage and, except for the MLS, all are considered the preeminent leagues in their respective sports in the world. While most major U.S. sports such as baseball and American football have evolved out of European practices, basketball, volleyball, skateboarding, and snowboarding are American inventions, many of which have become popular worldwide. Lacrosse and surfing arose from Native American and Native Hawaiian activities that predate European contact. The market for professional sports in the United States was approximately $69 billion in July 2013, roughly 50% larger than that of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States. Although American football does not have a substantial following in other nations, the NFL does have the highest average attendance (67,254) of any professional sports league in the world. In 2024, the NFL generated over $23 billion, making it the most valuable professional sports league in the United States and the world. Baseball has been regarded as the U.S. "national sport" since the late 19th century. The most-watched individual sports in the U.S. are golf and auto racing, particularly NASCAR and IndyCar. On the collegiate level, earnings for the member institutions exceed $1 billion annually, and college football and basketball attract large audiences, as the NCAA March Madness tournament and the College Football Playoff are some of the most watched national sporting events. In the U.S., the intercollegiate sports level serves as the main feeder system for professional and Olympic sports, with significant exceptions such as Minor League Baseball. This differs greatly from practices in nearly all other countries, where publicly and privately funded sports organizations serve this function. Eight Olympic Games have taken place in the United States. The 1904 Summer Olympics in St. Louis, Missouri, were the first-ever Olympic Games held outside of Europe. The Olympic Games will be held in the U.S. for a ninth time when Los Angeles hosts the 2028 Summer Olympics. U.S. athletes have won a total of 2,968 medals (1,179 gold) at the Olympic Games, the most of any country. 
In other international competition, the United States is the home of a number of prestigious events, including the America's Cup, World Baseball Classic, the U.S. Open, and the Masters Tournament. The U.S. men's national soccer team has qualified for eleven World Cups, while the women's national team has won the FIFA Women's World Cup and Olympic soccer tournament four and five times, respectively. The 1999 FIFA Women's World Cup was hosted by the United States. Its final match was attended by 90,185, setting the world record for largest women's sporting event crowd at the time. The United States hosted the 1994 FIFA World Cup and will co-host, along with Canada and Mexico, the 2026 FIFA World Cup. 
========================================
[SOURCE: https://en.wikipedia.org/wiki/Black_hole#Formation] | [TOKENS: 13839]
Contents Black hole A black hole is an astronomical body so compact that its gravity prevents anything, including light, from escaping. Albert Einstein's theory of general relativity predicts that a sufficiently compact mass will form a black hole. The boundary of no escape is called the event horizon. In general relativity, a black hole's event horizon seals an object's fate but produces no locally detectable change when crossed. General relativity also predicts that every black hole should have a central singularity, where the curvature of spacetime is infinite. In many ways, a black hole acts like an ideal black body, as it reflects no light. Quantum field theory in curved spacetime predicts that event horizons emit Hawking radiation, with the same spectrum as a black body of a temperature inversely proportional to its mass. This temperature is of the order of billionths of a kelvin for stellar black holes, making it essentially impossible to observe directly. Objects whose gravitational fields are too strong for light to escape were first considered in the 18th century by John Michell and Pierre-Simon Laplace. In 1916, Karl Schwarzschild found the first modern solution of general relativity that would characterise a black hole. Due to his influential research, the Schwarzschild metric is named after him. David Finkelstein, in 1958, first interpreted Schwarzschild's model as a region of space from which nothing can escape. Black holes were long considered a mathematical curiosity; it was not until the 1960s that theoretical work showed they were a generic prediction of general relativity. The first black hole known was Cygnus X-1, identified by several researchers independently in 1971. Black holes typically form when massive stars collapse at the end of their life cycle. After a black hole has formed, it can grow by absorbing mass from its surroundings. Supermassive black holes of millions of solar masses may form by absorbing other stars and merging with other black holes, or via direct collapse of gas clouds. There is consensus that supermassive black holes exist in the centres of most galaxies. The presence of a black hole can be inferred through its interaction with other matter and with electromagnetic radiation such as visible light. Matter falling toward a black hole can form an accretion disk of infalling plasma, heated by friction and emitting light. In extreme cases, this creates a quasar, some of the brightest objects in the universe. Merging black holes can also be detected by observation of the gravitational waves they emit. If other stars are orbiting a black hole, their orbits can be used to determine the black hole's mass and location. Such observations can be used to exclude possible alternatives such as neutron stars. In this way, astronomers have identified numerous stellar black hole candidates in binary systems and established that the radio source known as Sagittarius A*, at the core of the Milky Way galaxy, contains a supermassive black hole of about 4.3 million solar masses. History The idea of a body so massive that even light could not escape was first proposed in the late 18th century by English astronomer and clergyman John Michell and independently by French scientist Pierre-Simon Laplace. Both scholars proposed very large stars in contrast to the modern concept of an extremely dense object. 
Michell's idea, in a short part of a letter published in 1784, calculated that a star with the same density but 500 times the radius of the sun would not let any emitted light escape; the surface escape velocity would exceed the speed of light.: 122 Michell correctly hypothesized that such supermassive but non-radiating bodies might be detectable through their gravitational effects on nearby visible bodies. In 1796, Laplace mentioned that a star could be invisible if it were sufficiently large while speculating on the origin of the Solar System in his book Exposition du Système du Monde. Franz Xaver von Zach asked Laplace for a mathematical analysis, which Laplace provided and published in a journal edited by von Zach. In 1905, Albert Einstein showed that the laws of electromagnetism would be invariant under a Lorentz transformation: they would be identical for observers travelling at different velocities relative to each other. This discovery became known as the principle of special relativity. Although the laws of mechanics had already been shown to be invariant, gravity remained yet to be included.: 19 In 1907, Einstein published a paper proposing his equivalence principle, the hypothesis that inertial mass and gravitational mass have a common cause. Using the principle, Einstein predicted the redshift and half of the lensing effect of gravity on light; the full prediction of gravitational lensing required development of general relativity.: 19 By 1915, Einstein refined these ideas into his general theory of relativity, which explained how matter affects spacetime, which in turn affects the motion of other matter. This formed the basis for black hole physics. Only a few months after Einstein published the field equations describing general relativity, astrophysicist Karl Schwarzschild set out to apply the idea to stars. He assumed spherical symmetry with no spin and found a solution to Einstein's equations.: 124 A few months after Schwarzschild, Johannes Droste, a student of Hendrik Lorentz, independently gave the same solution. At a certain radius from the center of the mass, the Schwarzschild solution became singular, meaning that some of the terms in the Einstein equations became infinite. The nature of this radius, which later became known as the Schwarzschild radius, was not understood at the time. Many physicists of the early 20th century were skeptical of the existence of black holes. In a 1926 popular science book, Arthur Eddington critiqued the idea of a star with mass compressed to its Schwarzschild radius as a flaw in the then-poorly-understood theory of general relativity.: 134 In 1939, Einstein himself used his theory of general relativity in an attempt to prove that black holes were impossible. His work relied on increasing pressure or increasing centrifugal force balancing the force of gravity so that the object would not collapse beyond its Schwarzschild radius. He missed the possibility that implosion would drive the system below this critical value.: 135 By the 1920s, astronomers had classified a number of white dwarf stars as too cool and dense to be explained by the gradual cooling of ordinary stars. 
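Michell's 1784 argument, summarized at the start of this passage, follows from Newtonian escape velocity; the identification with the Schwarzschild radius at the end is a standard textbook result added here for illustration rather than a calculation quoted in the source. For a sphere of mass M, radius R, and uniform density ρ,

\[ v_{\mathrm{esc}} = \sqrt{\frac{2GM}{R}}, \qquad M = \tfrac{4}{3}\pi\rho R^{3} \;\Longrightarrow\; v_{\mathrm{esc}} = \sqrt{\frac{8\pi G\rho}{3}}\, R \propto R . \]

At fixed (solar) density the escape velocity grows linearly with radius, so a star 500 times the Sun's radius has a surface escape velocity of roughly 500 × 618 km/s ≈ 3.1 × 10⁵ km/s, just above the speed of light, which is Michell's conclusion. Setting \(v_{\mathrm{esc}} = c\) instead gives \(R = 2GM/c^{2}\), numerically about 3 km for one solar mass; this is the same value as the Schwarzschild radius that reappears below, although its modern interpretation is very different.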
In 1926, Ralph Fowler showed that quantum-mechanical degeneracy pressure was larger than thermal pressure at these densities.: 145 In 1931, Subrahmanyan Chandrasekhar calculated that a non-rotating body of electron-degenerate matter below a certain limiting mass is stable, and by 1934 he showed that this explained the catalog of white dwarf stars.: 151 When Chandrasekhar announced his results, Eddington pointed out that stars above this limit would radiate until they were sufficiently dense to prevent light from exiting, a conclusion he considered absurd. Eddington and, later, Lev Landau argued that some yet unknown mechanism would stop the collapse. In the 1930s, Fritz Zwicky and Walter Baade studied stellar novae, focusing on exceptionally bright ones they called supernovae. Zwicky promoted the idea that supernovae produced stars with the density of atomic nuclei—neutron stars—but this idea was largely ignored.: 171 In 1939, based on Chandrasekhar's reasoning, J. Robert Oppenheimer and George Volkoff predicted that neutron stars below a certain mass limit, later called the Tolman–Oppenheimer–Volkoff limit, would be stable due to neutron degeneracy pressure. Above that limit, they reasoned that either their model would not apply or that gravitational contraction would not stop.: 380 John Archibald Wheeler and two of his students resolved questions about the model behind the Tolman–Oppenheimer–Volkoff (TOV) limit. Harrison and Wheeler developed the equations of state relating density to pressure for cold matter all the way through electron degeneracy and neutron degeneracy. Masami Wakano and Wheeler then used the equations to compute the equilibrium curve for stars, relating mass to circumference. They found no additional features that would invalidate the TOV limit. This meant that the only thing that could prevent black holes from forming was a dynamic process ejecting sufficient mass from a star as it cooled.: 205 The modern concept of black holes was formulated by Robert Oppenheimer and his student Hartland Snyder in 1939.: 80 In the paper, Oppenheimer and Snyder solved Einstein's equations of general relativity for an idealized imploding star, in a model later called the Oppenheimer–Snyder model, then described the results from far outside the star. The implosion starts as one might expect: the star material rapidly collapses inward. However, as the density of the star increases, gravitational time dilation increases and the collapse, viewed from afar, seems to slow down further and further until the star reaches its Schwarzschild radius, where it appears frozen in time.: 217 In 1958, David Finkelstein identified the Schwarzschild surface as an event horizon, calling it "a perfect unidirectional membrane: causal influences can cross it in only one direction". In this sense, events that occur inside of the black hole cannot affect events that occur outside of the black hole. Finkelstein created a new reference frame to include the point of view of infalling observers.: 103 Finkelstein's new frame of reference allowed events at the surface of an imploding star to be related to events far away. By 1962 the two points of view were reconciled, convincing many skeptics that implosion into a black hole made physical sense.: 226 The era from the mid-1960s to the mid-1970s was the "golden age of black hole research", when general relativity and black holes became mainstream subjects of research.: 258 In this period, more general black hole solutions were found. 
In 1963, Roy Kerr found the exact solution for a rotating black hole. Two years later, Ezra Newman found the axisymmetric solution for a black hole that is both rotating and electrically charged. In 1967, Werner Israel found that the Schwarzschild solution was the only possible solution for a nonspinning, uncharged black hole, meaning that a Schwarzschild black hole would be defined by its mass alone. Similar identities were later found for Reissner–Nordström and Kerr black holes, defined only by their mass and their charge or spin respectively. Together, these findings became known as the no-hair theorem, which states that a stationary black hole is completely described by the three parameters of the Kerr–Newman metric: mass, angular momentum, and electric charge. At first, it was suspected that the strange mathematical singularities found in each of the black hole solutions only appeared due to the assumption that a black hole would be perfectly spherically symmetric, and therefore the singularities would not appear in generic situations where black holes would not necessarily be symmetric. This view was held in particular by Vladimir Belinski, Isaak Khalatnikov, and Evgeny Lifshitz, who tried to prove that no singularities appear in generic solutions, although they would later reverse their positions. However, in 1965, Roger Penrose proved that general relativity without quantum mechanics requires that singularities appear in all black holes. Astronomical observations also made great strides during this era. In 1967, Antony Hewish and Jocelyn Bell Burnell discovered pulsars and by 1969, these were shown to be rapidly rotating neutron stars. Until that time, neutron stars, like black holes, were regarded as just theoretical curiosities, but the discovery of pulsars showed their physical relevance and spurred a further interest in all types of compact objects that might be formed by gravitational collapse. Based on observations in Greenwich and Toronto in the early 1970s, Cygnus X-1, a galactic X-ray source discovered in 1964, became the first astronomical object commonly accepted to be a black hole. Work by James Bardeen, Jacob Bekenstein, Carter, and Hawking in the early 1970s led to the formulation of black hole thermodynamics. These laws describe the behaviour of a black hole in close analogy to the laws of thermodynamics by relating mass to energy, area to entropy, and surface gravity to temperature. The analogy was completed when Hawking, in 1974, showed that quantum field theory implies that black holes should radiate like a black body with a temperature proportional to the surface gravity of the black hole, predicting the effect now known as Hawking radiation. While Cygnus X-1, a stellar-mass black hole, was generally accepted by the scientific community as a black hole by the end of 1973, it would be decades before a supermassive black hole would gain the same broad recognition. Although, as early as the 1960s, physicists such as Donald Lynden-Bell and Martin Rees had suggested that powerful quasars in the center of galaxies were powered by accreting supermassive black holes, little observational proof existed at the time. However, the Hubble Space Telescope, launched decades later, found that supermassive black holes were not only present in these active galactic nuclei, but that supermassive black holes in the center of galaxies were ubiquitous: Almost every galaxy had a supermassive black hole at its center, many of which were quiescent. 
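For concreteness, the thermodynamic analogy described earlier in this passage pairs each geometric quantity with a thermodynamic one. The two key identifications, standard results of Bekenstein and Hawking rather than formulas quoted in the text, are

\[ T_{\mathrm{H}} = \frac{\hbar\kappa}{2\pi c k_{\mathrm{B}}}, \qquad S_{\mathrm{BH}} = \frac{k_{\mathrm{B}} c^{3} A}{4\hbar G}, \]

where κ is the surface gravity of the horizon and A its area. For a Schwarzschild black hole \(\kappa = c^{4}/(4GM)\), so the temperature becomes \(T_{\mathrm{H}} = \hbar c^{3}/(8\pi G M k_{\mathrm{B}}) \approx 6\times10^{-8}\,\mathrm{K}\,(M_{\odot}/M)\): a few billionths of a kelvin for a stellar-mass hole, which is why, as noted earlier in the article, the radiation is essentially impossible to observe directly.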
In 1999, David Merritt proposed the M–sigma relation, which related the dispersion of the velocity of matter in the center bulge of a galaxy to the mass of the supermassive black hole at its core. Subsequent studies confirmed this correlation. Around the same time, based on telescope observations of the velocities of stars at the center of the Milky Way galaxy, independent work groups led by Andrea Ghez and Reinhard Genzel concluded that the compact radio source in the center of the galaxy, Sagittarius A*, was likely a supermassive black hole. On 11 February 2016, the LIGO Scientific Collaboration and Virgo Collaboration announced the first direct detection of gravitational waves, named GW150914, representing the first observation of a black hole merger. At the time of the merger, the black holes were approximately 1.4 billion light-years away from Earth and had masses of 30 and 35 solar masses.: 6 In 2017, Rainer Weiss, Kip Thorne, and Barry Barish, who had spearheaded the project, were awarded the Nobel Prize in Physics for their work. Since the initial discovery in 2015, hundreds more gravitational waves have been observed by LIGO and another interferometer, Virgo. On 10 April 2019, the first direct image of a black hole and its vicinity was published, following observations made by the Event Horizon Telescope (EHT) in 2017 of the supermassive black hole in Messier 87's galactic centre. In 2022, the Event Horizon Telescope collaboration released an image of the black hole in the center of the Milky Way galaxy, Sagittarius A*; The data had been collected in 2017. In 2020, the Nobel Prize in Physics was awarded for work on black holes. Andrea Ghez and Reinhard Genzel shared one-half for their discovery that Sagittarius A* is a supermassive black hole. Penrose received the other half for his work showing that the mathematics of general relativity requires the formation of black holes. Cosmologists lamented that Hawking's extensive theoretical work on black holes would not be honored since he died in 2018. In December 1967, a student reportedly suggested the phrase black hole at a lecture by John Wheeler; Wheeler adopted the term for its brevity and "advertising value", and Wheeler's stature in the field ensured it quickly caught on, leading some to credit Wheeler with coining the phrase. However, the term was used by others around that time. Science writer Marcia Bartusiak traces the term black hole to physicist Robert H. Dicke, who in the early 1960s reportedly compared the phenomenon to the Black Hole of Calcutta, notorious as a prison where people entered but never left alive. The term was used in print by Life and Science News magazines in 1963, and by science journalist Ann Ewing in her article "'Black Holes' in Space", dated 18 January 1964, which was a report on a meeting of the American Association for the Advancement of Science held in Cleveland, Ohio. Definition A black hole is generally defined as a region of spacetime from which no information-carrying signals or objects can escape. However, verifying an object as a black hole by this definition would require waiting for an infinite time and at an infinite distance from the black hole to verify that indeed, nothing has escaped, and thus cannot be used to identify a physical black hole. Broadly, physicists do not have a precisely-agreed-upon definition of a black hole. Among astrophysicists, a black hole is a compact object with a mass larger than four solar masses. 
A black hole may also be defined as a reservoir of information: 142 or a region where space is falling inwards faster than the speed of light. Properties The no-hair theorem postulates that, once it achieves a stable condition after formation, a black hole has only three independent physical properties: mass, electric charge, and angular momentum; the black hole is otherwise featureless. If the conjecture is true, any two black holes that share the same values for these properties, or parameters, are indistinguishable from one another. The degree to which the conjecture is true for real black holes is currently an unsolved problem. The simplest static black holes have mass but neither electric charge nor angular momentum. According to Birkhoff's theorem, these Schwarzschild black holes are the only vacuum solution that is spherically symmetric. Solutions describing more general black holes also exist. Non-rotating charged black holes are described by the Reissner–Nordström metric, while the Kerr metric describes a non-charged rotating black hole. The most general stationary black hole solution known is the Kerr–Newman metric, which describes a black hole with both charge and angular momentum. Contrary to the popular notion of a black hole "sucking in everything" in its surroundings, from far away, the external gravitational field of a black hole is identical to that of any other body of the same mass. While a black hole can theoretically have any positive mass, the charge and angular momentum are constrained by the mass. The total electric charge Q and the total angular momentum J are expected to satisfy the inequality \frac{Q^{2}}{4\pi\epsilon_{0}} + \frac{c^{2}J^{2}}{GM^{2}} \leq GM^{2} for a black hole of mass M. Black holes with the maximum possible charge or spin satisfying this inequality are called extremal black holes. Solutions of Einstein's equations that violate this inequality exist, but they do not possess an event horizon. These are so-called naked singularities that can be observed from the outside. Because these singularities make the universe inherently unpredictable, many physicists believe they could not exist. The weak cosmic censorship hypothesis, proposed by Sir Roger Penrose, rules out the formation of such singularities through the gravitational collapse of realistic matter. However, the hypothesis has not yet been proven, and some physicists believe that naked singularities could exist. It is also unknown whether black holes could even become extremal, forming naked singularities, since natural processes counteract increasing spin and charge when a black hole becomes near-extremal. The total mass of a black hole can be estimated by analyzing the motion of objects near the black hole, such as stars or gas. All black holes spin, often fast: one stellar-mass black hole, GRS 1915+105, has been estimated to spin at over 1,000 revolutions per second. The Milky Way's central black hole Sagittarius A* rotates at about 90% of the maximum rate. The spin rate can be inferred from measurements of atomic spectral lines in the X-ray range. As gas near the black hole plunges inward, high energy X-ray emission from electron-positron pairs illuminates the gas further out, appearing red-shifted due to relativistic effects.
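The extremality bound above, and the quoted figure of over 1,000 revolutions per second, can be illustrated with a short calculation. The following is a minimal sketch, assuming SI units, a mass of roughly 12 M☉ and a dimensionless spin of 0.98 for GRS 1915+105 (illustrative values only), and the standard Kerr horizon angular velocity Ω_H = χc³ / [2GM(1 + √(1 − χ²))], which is not stated in the text:

import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
eps0 = 8.854e-12       # vacuum permittivity, F/m
M_sun = 1.989e30       # solar mass, kg

def is_subextremal(M, J, Q=0.0):
    """True if Q^2/(4*pi*eps0) + c^2*J^2/(G*M^2) <= G*M^2 (no naked singularity)."""
    return Q**2 / (4 * math.pi * eps0) + (c**2 * J**2) / (G * M**2) <= G * M**2

def horizon_rev_per_sec(M, chi):
    """Rotation frequency of the event horizon for dimensionless spin chi = cJ/(GM^2)."""
    omega = chi * c**3 / (2 * G * M * (1 + math.sqrt(1 - chi**2)))  # rad/s
    return omega / (2 * math.pi)

M = 12 * M_sun                 # assumed mass of GRS 1915+105
J_max = G * M**2 / c           # maximum angular momentum for this mass
print(is_subextremal(M, 0.98 * J_max))   # True: just inside the bound
print(is_subextremal(M, 1.02 * J_max))   # False: would be a naked singularity
print(f"{horizon_rev_per_sec(M, 0.98):.0f} rev/s")   # ~1,100, i.e. over 1,000 rev/s

The near-maximal spin combined with a stellar mass is what makes the rotation rate so extreme; a supermassive hole at the same dimensionless spin turns over far more slowly because the frequency scales as 1/M.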
Depending on the spin of the black hole, this plunge happens at different radii from the hole, with different degrees of redshift. Astronomers can use the gap between the X-ray emission of the outer disk and the redshifted emission from plunging material to determine the spin of the black hole. A newer way to estimate spin is based on the temperature of gases accreting onto the black hole. The method requires an independent measurement of the black hole mass and inclination angle of the accretion disk followed by computer modeling. Gravitational waves from coalescing binary black holes can also provide the spin of both progenitor black holes and the merged hole, but such events are rare. A spinning black hole has angular momentum. The supermassive black hole in the center of the Messier 87 (M87) galaxy appears to have an angular momentum very close to the maximum theoretical value. That uncharged limit is J \leq \frac{GM^{2}}{c}, allowing definition of a dimensionless spin magnitude such that 0 \leq \frac{cJ}{GM^{2}} \leq 1. Most black holes are believed to have an approximately neutral charge. For example, Michal Zajaček, Arman Tursunov, Andreas Eckart, and Silke Britzen found the electric charge of Sagittarius A* to be at least ten orders of magnitude below the theoretical maximum. A charged black hole repels other like charges just like any other charged object. If a black hole were to become charged, particles with an opposite sign of charge would be pulled in by the extra electromagnetic force, while particles with the same sign of charge would be repelled, neutralizing the black hole. This effect may not be as strong if the black hole is also spinning. The presence of charge can reduce the diameter of the black hole by up to 38%. The charge Q for a nonspinning black hole is bounded by Q \leq \sqrt{G}\,M, where G is the gravitational constant and M is the black hole's mass. Classification Black holes can have a wide range of masses. The minimum mass of a black hole formed by stellar gravitational collapse is governed by the maximum mass of a neutron star and is believed to be approximately two to four solar masses. However, theoretical primordial black holes, believed to have formed soon after the Big Bang, could be far smaller, with masses as little as 10^-5 grams at formation. These very small black holes are sometimes called micro black holes. Black holes formed by stellar collapse are called stellar black holes. Estimates of their maximum mass at formation vary, but generally range from 10 to 100 solar masses, with higher estimates for black holes formed from low-metallicity stars. The mass of a black hole formed via a supernova has a lower bound: if the progenitor star is too small, the collapse may be stopped by the degeneracy pressure of the star's constituents, allowing the condensation of matter into an exotic denser state. Degeneracy pressure arises from the Pauli exclusion principle: identical particles resist being squeezed into the same quantum state. Smaller progenitor stars, with masses less than about 8 M☉, will be held together by the degeneracy pressure of electrons and will become a white dwarf. For more massive progenitor stars, electron degeneracy pressure is no longer strong enough to resist the force of gravity and the star will be held together by neutron degeneracy pressure, which can occur at much higher densities, forming a neutron star.
If the star is still too massive, even neutron degeneracy pressure will not be able to resist the force of gravity and the star will collapse into a black hole.: 5.8 Stellar black holes can also gain mass via accretion of nearby matter, often from a companion object such as a star. Black holes that are larger than stellar black holes but smaller than supermassive black holes are called intermediate-mass black holes, with masses of approximately 10^2 to 10^5 solar masses. These black holes seem to be rarer than their stellar and supermassive counterparts, with relatively few candidates having been observed. Physicists have speculated that such black holes may form from collisions in globular clusters and star clusters or at the centers of low-mass galaxies. They may also form as the result of mergers of smaller black holes, with several LIGO observations finding merged black holes within the 110–350 solar mass range. The black holes with the largest masses are called supermassive black holes, with masses more than 10^6 times that of the Sun. These black holes are believed to exist at the centers of almost every large galaxy, including the Milky Way. Some scientists have proposed a subcategory of even larger black holes, called ultramassive black holes, with masses greater than 10^9–10^10 solar masses. Theoretical models predict that the accretion disc that feeds black holes will be unstable once a black hole reaches 50–100 billion times the mass of the Sun, setting a rough upper limit to black hole mass. Structure While black holes are conceptually invisible sinks of all matter and light, in astronomical settings their enormous gravity alters the motion of surrounding objects and pulls nearby gas inwards at near-light speed, making the surroundings of accreting black holes some of the brightest objects in the universe. Some black holes have relativistic jets: thin streams of plasma travelling away from the black hole at more than one-tenth of the speed of light. A small fraction of the matter falling towards the black hole gets accelerated away along the hole's rotation axis. These jets can extend as far as millions of parsecs from the black hole itself. Black holes of any mass can have jets. However, they are typically observed around spinning black holes with strongly magnetized accretion disks. Relativistic jets were more common in the early universe, when galaxies and their corresponding supermassive black holes were rapidly gaining mass. All black holes with jets also have an accretion disk, but the jets are usually brighter than the disk. Quasars, typically found in other galaxies, are believed to be supermassive black holes with jets; microquasars are believed to be stellar-mass objects with jets, typically observed in the Milky Way. The mechanism of formation of jets is not yet known, but several options have been proposed. One method proposed to fuel these jets is the Blandford–Znajek process, which suggests that the dragging of magnetic field lines by a black hole's rotation could launch jets of matter into space. The Penrose process, which involves extraction of a black hole's rotational energy, has also been proposed as a potential mechanism of jet propulsion.
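The mass classes described at the start of this section can be summarized in a small helper. This is a sketch only: the boundaries are approximate, overlap in the literature, and are not part of the original text.

def classify_black_hole(mass_in_solar_masses):
    """Rough classification by mass, using the approximate ranges quoted above."""
    m = mass_in_solar_masses
    if m < 2:
        return "micro / primordial (below the stellar-collapse range)"
    elif m < 100:
        return "stellar black hole"
    elif m < 1e5:
        return "intermediate-mass black hole"
    elif m < 1e9:
        return "supermassive black hole"
    else:
        return "ultramassive black hole"

for m in (10, 3_000, 4.3e6, 6.5e9):
    print(f"{m:.3g} M_sun -> {classify_black_hole(m)}")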
Due to conservation of angular momentum, gas falling into the gravitational well created by a massive object will typically form a disk-like structure around the object.: 242 As the disk's angular momentum is transferred outward due to internal processes, its matter falls farther inward, converting its gravitational energy into heat and releasing a large flux of X-rays. The temperature of these disks can range from thousands to millions of kelvins, and temperatures can differ throughout a single accretion disk. Accretion disks can also emit in other parts of the electromagnetic spectrum, depending on the disk's turbulence and magnetization and the black hole's mass and angular momentum. Accretion disks can be defined as geometrically thin or geometrically thick. Geometrically thin disks are mostly confined to the black hole's equatorial plane and have a well-defined edge at the innermost stable circular orbit (ISCO), while geometrically thick disks are supported by internal pressure and temperature and can extend inside the ISCO. Disks with high rates of electron scattering and absorption, appearing bright and opaque, are called optically thick; optically thin disks are more translucent and produce fainter images when viewed from afar. Accretion disks of black holes accreting beyond the Eddington limit are often referred to as Polish doughnuts due to their thick, toroidal shape that resembles that of a donut. Quasar accretion disks are expected to usually appear blue in color. The disk for a stellar black hole, on the other hand, would likely look orange, yellow, or red, with its inner regions being the brightest. Theoretical research suggests that the hotter a disk is, the bluer it should be, although this is not always supported by observations of real astronomical objects. Accretion disk colors may also be altered by the Doppler effect, with the part of the disk travelling towards an observer appearing bluer and brighter and the part of the disk travelling away from the observer appearing redder and dimmer. In Newtonian gravity, test particles can stably orbit at arbitrary distances from a central object. In general relativity, however, there exists a smallest possible radius for which a massive particle can orbit stably. Any infinitesimal inward perturbation to this orbit will lead to the particle spiraling into the black hole, and any outward perturbation will, depending on the energy, cause the particle to spiral in, move to a stable orbit further from the black hole, or escape to infinity. This orbit is called the innermost stable circular orbit, or ISCO. The location of the ISCO depends on the spin of the black hole and the spin of the particle itself. In the case of a Schwarzschild black hole (spin zero) and a particle without spin, the location of the ISCO is r_{\mathrm{ISCO}} = 3\,r_{\text{s}} = \frac{6\,GM}{c^{2}}, where r_{\mathrm{ISCO}} is the radius of the ISCO, r_{\text{s}} is the Schwarzschild radius of the black hole, G is the gravitational constant, and c is the speed of light. The radius of this orbit changes slightly based on particle spin. For charged black holes, the ISCO moves inwards. For spinning black holes, the ISCO is moved inwards for particles orbiting in the same direction that the black hole is spinning (prograde) and outwards for particles orbiting in the opposite direction (retrograde).
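For scale, the ISCO formula above can be evaluated for the Milky Way's central black hole, together with the photon sphere (1.5 r_s, discussed below). A minimal sketch, assuming the Sgr A* mass of about 4.3×10^6 M☉ quoted later in this article:

G, c, M_sun = 6.674e-11, 2.998e8, 1.989e30   # SI units, rounded

def schwarzschild_radius(M):
    return 2 * G * M / c**2   # metres

M = 4.3e6 * M_sun
r_s = schwarzschild_radius(M)
print(f"event horizon : {r_s / 1e9:.1f} million km")
print(f"photon sphere : {1.5 * r_s / 1e9:.1f} million km")
print(f"ISCO          : {3.0 * r_s / 1e9:.1f} million km "
      f"(~{3.0 * r_s / 1.496e11:.2f} au)")
# event horizon ~12.7 million km, photon sphere ~19.1 million km,
# ISCO ~38.1 million km, i.e. roughly a quarter of an astronomical unit.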
For example, the ISCO for a particle orbiting retrograde can be as far out as about 9 r s {\displaystyle 9r_{\text{s}}} , while the ISCO for a particle orbiting prograde can be as close as at the event horizon itself. The photon sphere is a spherical boundary for which photons moving on tangents to that sphere are bent completely around the black hole, possibly orbiting multiple times. Light rays with impact parameters less than the radius of the photon sphere enter the black hole. For Schwarzschild black holes, the photon sphere has a radius 1.5 times the Schwarzschild radius; the radius for non-Schwarzschild black holes is at least 1.5 times the radius of the event horizon. When viewed from a great distance, the photon sphere creates an observable black hole shadow. Since no light emerges from within the black hole, this shadow is the limit for possible observations.: 152 The shadow of colliding black holes should have characteristic warped shapes, allowing scientists to detect black holes that are about to merge. While light can still escape from the photon sphere, any light that crosses the photon sphere on an inbound trajectory will be captured by the black hole. Therefore, any light that reaches an outside observer from the photon sphere must have been emitted by objects between the photon sphere and the event horizon. Light emitted towards the photon sphere may also curve around the black hole and return to the emitter. For a rotating, uncharged black hole, the radius of the photon sphere depends on the spin parameter and whether the photon is orbiting prograde or retrograde. For a photon orbiting prograde, the photon sphere will be 1-3 Schwarzschild radii from the center of the black hole, while for a photon orbiting retrograde, the photon sphere will be between 3-5 Schwarzschild radii from the center of the black hole. The exact location of the photon sphere depends on the magnitude of the black hole's rotation. For a charged, nonrotating black hole, there will only be one photon sphere, and the radius of the photon sphere will decrease for increasing black hole charge. For non-extremal, charged, rotating black holes, there will always be two photon spheres, with the exact radii depending on the parameters of the black hole. Near a rotating black hole, spacetime rotates similar to a vortex. The rotating spacetime will drag any matter and light into rotation around the spinning black hole. This effect of general relativity, called frame dragging, gets stronger closer to the spinning mass. The region of spacetime in which it is impossible to stay still is called the ergosphere. The ergosphere of a black hole is a volume bounded by the black hole's event horizon and the ergosurface, which coincides with the event horizon at the poles but bulges out from it around the equator. Matter and radiation can escape from the ergosphere. Through the Penrose process, objects can emerge from the ergosphere with more energy than they entered with. The extra energy is taken from the rotational energy of the black hole, slowing down the rotation of the black hole.: 268 A variation of the Penrose process in the presence of strong magnetic fields, the Blandford–Znajek process, is considered a likely mechanism for the enormous luminosity and relativistic jets of quasars and other active galactic nuclei. The observable region of spacetime around a black hole closest to its event horizon is called the plunging region. 
In this area it is no longer possible for free-falling matter to follow circular orbits or stop a final descent into the black hole. Instead, it will rapidly plunge toward the black hole at close to the speed of light, growing increasingly hot and producing a characteristic, detectable thermal emission. However, light and radiation emitted from this region can still escape from the black hole's gravitational pull. For a nonspinning, uncharged black hole, the radius of the event horizon, or Schwarzschild radius, is proportional to the mass, M, through r_{\text{s}} = \frac{2GM}{c^{2}} \approx 2.95\,\frac{M}{M_{\odot}}\ \text{km}, where r_s is the Schwarzschild radius and M☉ is the mass of the Sun.: 124 For a black hole with nonzero spin or electric charge, the radius is smaller,[Note 1] until an extremal black hole could have an event horizon close to r_{+} = \frac{GM}{c^{2}}, half the radius of a nonspinning, uncharged black hole of the same mass. Since the volume within the Schwarzschild radius increases with the cube of the radius, the average density of a black hole inside its Schwarzschild radius is inversely proportional to the square of its mass: supermassive black holes are much less dense than stellar black holes. The average density of a 10^8 M☉ black hole is comparable to that of water. The defining feature of a black hole is the existence of an event horizon, a boundary in spacetime through which matter and light can pass only inward towards the center of the black hole. Nothing, not even light, can escape from inside the event horizon. The event horizon is referred to as such because if an event occurs within the boundary, information from that event cannot reach or affect an outside observer, making it impossible to determine whether such an event occurred.: 179 For non-rotating black holes, the geometry of the event horizon is precisely spherical, while for rotating black holes, the event horizon is oblate. To a distant observer, a clock near a black hole would appear to tick more slowly than one further from the black hole.: 217 This effect, known as gravitational time dilation, would also cause an object falling into a black hole to appear to slow as it approached the event horizon, never quite reaching the horizon from the perspective of an outside observer.: 218 All processes on this object would appear to slow down, and any light emitted by the object would appear redder and dimmer, an effect known as gravitational redshift. An object falling from half of a Schwarzschild radius above the event horizon would fade away until it could no longer be seen, disappearing from view within one hundredth of a second. It would also appear to flatten onto the black hole, joining all other material that had ever fallen into the hole. On the other hand, an observer falling into a black hole would not notice any of these effects as they cross the event horizon. Their own clock appears to them to tick normally, and they cross the event horizon after a finite time without noting any singular behaviour. In general relativity, it is impossible to determine the location of the event horizon from local observations, due to Einstein's equivalence principle.: 222 Black holes that are rotating and/or charged have an inner horizon, often called the Cauchy horizon, inside the black hole. The inner horizon is divided into two segments: an ingoing section and an outgoing section.
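The Schwarzschild-radius and density relations quoted earlier in this section can be checked numerically. A minimal sketch, with rounded constants; the factor of 2.95 km per solar mass and the water-like density of a 10^8 M☉ hole both drop out directly:

import math

G, c, M_sun = 6.674e-11, 2.998e8, 1.989e30   # SI units, rounded

def schwarzschild_radius(M):
    return 2 * G * M / c**2                   # metres

def mean_density(M):
    """Mass divided by the Euclidean volume of a sphere of radius r_s."""
    r = schwarzschild_radius(M)
    return M / (4.0 / 3.0 * math.pi * r**3)   # kg/m^3

print(f"{schwarzschild_radius(M_sun) / 1000:.2f} km")          # ~2.95 km
for m in (1, 1e8):
    print(f"{m:.0e} M_sun -> {mean_density(m * M_sun):.3g} kg/m^3")
# 1 M_sun   -> ~1.8e19 kg/m^3, far denser than an atomic nucleus
# 1e8 M_sun -> ~1.8e3  kg/m^3, comparable to water (~1e3 kg/m^3)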
At the ingoing section of the Cauchy horizon, radiation and matter that fall into the black hole would build up at the horizon, causing the curvature of spacetime to go to infinity. This would cause an infalling observer to experience tidal forces. This phenomenon is often called mass inflation, since it is associated with a parameter dictating the black hole's internal mass growing exponentially, and the buildup of tidal forces is called the mass-inflation singularity or Cauchy horizon singularity. Some physicists have argued that in realistic black holes, accretion and Hawking radiation would stop mass inflation from occurring. At the outgoing section of the inner horizon, infalling radiation would backscatter off the black hole's spacetime curvature and travel outward, building up at the outgoing Cauchy horizon. This would cause an infalling observer to experience a gravitational shock wave and tidal forces as the spacetime curvature at the horizon grew to infinity. This buildup of tidal forces is called the shock singularity. Both of these singularities are weak, meaning that an object crossing them would only be deformed a finite amount by tidal forces, even though the spacetime curvature would still be infinite at the singularity. This is in contrast to a strong singularity, where an object hitting the singularity would be stretched and squeezed by an infinite amount. They are also null singularities, meaning that a photon could travel parallel to them without ever being intercepted. Ignoring quantum effects, every black hole contains a singularity, a region where the curvature of spacetime becomes infinite and geodesics terminate within a finite proper time.: 205 For a non-rotating black hole, this region takes the shape of a single point; for a rotating black hole it is smeared out to form a ring singularity that lies in the plane of rotation.: 264 In both cases, the singular region has zero volume. All of the mass of the black hole ends up in the singularity.: 252 Since the singularity has nonzero mass in an infinitely small space, it can be thought of as having infinite density. Observers falling into a Schwarzschild black hole (i.e., non-rotating and not charged) cannot avoid being carried into the singularity once they cross the event horizon. As they fall further into the black hole, they will be torn apart by the growing tidal forces in a process sometimes referred to as spaghettification or the noodle effect. Eventually, they will reach the singularity and be crushed into an infinitely small point.: 182 However, any perturbations, such as those caused by matter or radiation falling in, would cause space to oscillate chaotically near the singularity. Any matter falling in would experience intense tidal forces rapidly changing in direction, all while being compressed into an increasingly small volume. Alternative forms of general relativity, including the addition of some quantum effects, can lead to regular, or nonsingular, black holes without singularities. For example, the fuzzball model, based on string theory, states that black holes are actually made up of quantum microstates and need not have a singularity or an event horizon. The theory of loop quantum gravity proposes that the curvature and density at the center of a black hole are large, but not infinite. Formation Black holes are formed by gravitational collapse of massive stars, either by direct collapse or during a supernova explosion in a process called fallback.
Black holes can result from the merger of two neutron stars or a neutron star and a black hole. Other more speculative mechanisms include primordial black holes created from density fluctuations in the early universe, the collapse of dark stars (hypothetical objects powered by the annihilation of dark matter), or collapse driven by hypothetical self-interacting dark matter. Gravitational collapse occurs when an object's internal pressure is insufficient to resist the object's own gravity. At the end of a star's life, it will run out of hydrogen to fuse, and will start fusing more and more massive elements, until it gets to iron. Since the fusion of elements heavier than iron would require more energy than it would release, nuclear fusion ceases. If the iron core of the star is too massive, the star will no longer be able to support itself and will undergo gravitational collapse. While most of the energy released during gravitational collapse is emitted very quickly, an outside observer does not actually see the end of this process. Even though the collapse takes a finite amount of time from the reference frame of infalling matter, a distant observer would see the infalling material slow and halt just above the event horizon, due to gravitational time dilation. Light from the collapsing material takes longer and longer to reach the observer, with the delay growing to infinity as the emitting material reaches the event horizon. Thus the external observer never sees the formation of the event horizon; instead, the collapsing material seems to become dimmer and increasingly red-shifted, eventually fading away. Observations of quasars at redshift z ∼ 7, less than a billion years after the Big Bang, have led to investigations of other ways to form black holes. The accretion process to build supermassive black holes has a limiting rate of mass accumulation, and a billion years is not enough time to reach quasar status. One suggestion is direct collapse of nearly pure hydrogen gas (low-metallicity) clouds characteristic of the young universe, forming a supermassive star which collapses into a black hole. It has been suggested that seed black holes with typical masses of ~10^5 M☉ could have formed in this way and then grown to ~10^9 M☉. However, the very large amount of gas required for direct collapse is not typically stable against fragmentation into multiple stars. Thus another approach suggests massive star formation followed by collisions that seed massive black holes which ultimately merge to create a quasar.: 85 A neutron star in a common envelope with a regular star can accrete sufficient material to collapse to a black hole, or two neutron stars can merge. These avenues for the formation of black holes are considered relatively rare. In the current epoch of the universe, conditions needed to form black holes are rare and are mostly only found in stars. However, in the early universe, conditions may have allowed black hole formation via other means. Fluctuations of spacetime soon after the Big Bang may have formed regions that were denser than their surroundings. Initially, these regions would not have been compact enough to form a black hole, but eventually the curvature of spacetime in such a region could become large enough to cause it to collapse into a black hole. Different models for the early universe vary widely in their predictions of the scale of these fluctuations.
Various models predict the creation of primordial black holes ranging from a Planck mass (~2.2×10^-8 kg) to hundreds of thousands of solar masses. Primordial black holes with masses less than 10^15 g would have evaporated by now due to Hawking radiation. Despite the early universe being extremely dense, it did not re-collapse into a black hole during the Big Bang, since the universe was expanding rapidly and did not have the gravitational differential necessary for black hole formation. Models for the gravitational collapse of objects of relatively constant size, such as stars, do not necessarily apply in the same way to rapidly expanding space such as the Big Bang. In principle, black holes could be formed in high-energy particle collisions that achieve sufficient density, although no such events have been detected. These hypothetical micro black holes, which could form from collisions of cosmic rays with Earth's atmosphere or in particle accelerators like the Large Hadron Collider, would not be able to aggregate additional mass. Instead, they would evaporate in about 10^-25 seconds, posing no threat to the Earth. Evolution Black holes can also merge with other objects such as stars or even other black holes. This is thought to have been important, especially in the early growth of supermassive black holes, which could have formed from the aggregation of many smaller objects. The process has also been proposed as the origin of some intermediate-mass black holes. Mergers of supermassive black holes may take a long time: as two supermassive black holes in a binary approach each other, most nearby stars are ejected, leaving little for the remaining black holes to interact with gravitationally that would allow them to get closer to each other. This phenomenon has been called the final parsec problem, as the distance at which this happens is usually around one parsec. When a black hole accretes matter, the gas in the inner accretion disk orbits at very high speeds because of its proximity to the black hole. The resulting friction heats the inner disk to temperatures at which it emits vast amounts of electromagnetic radiation (mainly X-rays) detectable by telescopes. By the time the matter of the disk reaches the ISCO, between 5.7% and 42% of its mass will have been converted to energy, depending on the black hole's spin. About 90% of this energy is released within about 20 black hole radii. In many cases, accretion disks are accompanied by relativistic jets that are emitted along the black hole's poles, which carry away much of the energy. The mechanism for the creation of these jets is currently not well understood, in part due to insufficient data. Many of the universe's most energetic phenomena have been attributed to the accretion of matter on black holes. Active galactic nuclei and quasars are believed to be the accretion disks of supermassive black holes. X-ray binaries are generally accepted to be binary systems in which one of the two objects is a compact object accreting matter from its companion. Ultraluminous X-ray sources may be the accretion disks of intermediate-mass black holes. At a certain rate of accretion, the outward radiation pressure will become as strong as the inward gravitational force, and the black hole should be unable to accrete any faster. This limit is called the Eddington limit. However, many black holes accrete beyond this rate due to their non-spherical geometry or instabilities in the accretion disk.
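The Eddington limit just introduced can be estimated with the standard expression L_Edd = 4πGMm_p c / σ_T (radiation pressure on electrons balancing gravity on protons). A minimal sketch, assuming a 10% radiative efficiency, which is an illustrative value and not taken from the text:

import math

G       = 6.674e-11     # m^3 kg^-1 s^-2
c       = 2.998e8       # m/s
m_p     = 1.673e-27     # proton mass, kg
sigma_T = 6.652e-29     # Thomson cross-section, m^2
M_sun   = 1.989e30      # kg
L_sun   = 3.828e26      # W
year    = 3.156e7       # s

def eddington_luminosity(M):
    return 4 * math.pi * G * M * m_p * c / sigma_T

M = 1e8 * M_sun                      # a quasar-scale black hole
L = eddington_luminosity(M)
mdot = L / (0.1 * c**2)              # accretion rate needed to power L at 10% efficiency
print(f"L_Edd ~ {L:.2e} W (~{L / L_sun:.1e} L_sun)")    # ~1.3e39 W, trillions of suns
print(f"Mdot  ~ {mdot * year / M_sun:.1f} M_sun per year")   # ~2 solar masses per year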
Accretion beyond the limit is called Super-Eddington accretion and may have been commonplace in the early universe. Stars have been observed to get torn apart by tidal forces in the immediate vicinity of supermassive black holes in galaxy nuclei, in what is known as a tidal disruption event (TDE). Some of the material from the disrupted star forms an accretion disk around the black hole, which emits observable electromagnetic radiation. The correlation between the masses of supermassive black holes in the centres of galaxies with the velocity dispersion and mass of stars in their host bulges suggests that the formation of galaxies and the formation of their central black holes are related. Black hole winds from rapid accretion, particularly when the galaxy itself is still accreting matter, can compress gas nearby, accelerating star formation. However, if the winds become too strong, the black hole may blow nearly all of the gas out of the galaxy, quenching star formation. Black hole jets may also energize nearby cavities of plasma and eject low-entropy gas from out of the galactic core, causing gas in galactic centers to be hotter than expected. If Hawking's theory of black hole radiation is correct, then black holes are expected to shrink and evaporate over time as they lose mass by the emission of photons and other particles. The temperature of this thermal spectrum (Hawking temperature) is proportional to the surface gravity of the black hole, which is inversely proportional to the mass. Hence, large black holes emit less radiation than small black holes.: Ch. 9.6 A stellar black hole of 1 M☉ has a Hawking temperature of 62 nanokelvins. This is far less than the 2.7 K temperature of the cosmic microwave background radiation. Stellar-mass or larger black holes receive more mass from the cosmic microwave background than they emit through Hawking radiation and thus will grow instead of shrinking. To have a Hawking temperature larger than 2.7 K (and be able to evaporate), a black hole would need a mass less than the Moon. Such a black hole would have a diameter of less than a tenth of a millimetre. The Hawking radiation for an astrophysical black hole is predicted to be very weak and would thus be exceedingly difficult to detect from Earth. A possible exception is the burst of gamma rays emitted in the last stage of the evaporation of primordial black holes. Searches for such flashes have proven unsuccessful and provide stringent limits on the possibility of existence of low mass primordial black holes, with modern research predicting that primordial black holes must make up less than a fraction of 10−7 of the universe's total mass. NASA's Fermi Gamma-ray Space Telescope, launched in 2008, has searched for these flashes, but has not yet found any. The properties of a black hole are constrained and interrelated by the theories that predict these properties. When based on general relativity, these relationships are called the laws of black hole mechanics. For a black hole that is not still forming or accreting matter, the zeroth law of black hole mechanics states the black hole's surface gravity is constant across the event horizon. The first law relates changes in the black hole's surface area, angular momentum, and charge to changes in its energy. The second law says the surface area of a black hole never decreases on its own. Finally, the third law says that the surface gravity of a black hole is never zero. These laws are mathematical analogs of the laws of thermodynamics. 
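Before continuing, the Hawking-temperature figures quoted above (about 62 nK for a one-solar-mass hole, and a mass below that of the Moon for evaporation to outpace the cosmic microwave background) can be verified directly from T_H = ħc³ / (8πGMk_B); a minimal sketch with rounded constants:

import math

hbar   = 1.055e-34   # J s
c      = 2.998e8     # m/s
G      = 6.674e-11   # m^3 kg^-1 s^-2
k_B    = 1.381e-23   # J/K
M_sun  = 1.989e30    # kg
M_moon = 7.35e22     # kg

def hawking_temperature(M):
    return hbar * c**3 / (8 * math.pi * G * M * k_B)

def mass_for_temperature(T):
    return hbar * c**3 / (8 * math.pi * G * k_B * T)

print(f"T_H(1 M_sun) = {hawking_temperature(M_sun) * 1e9:.0f} nK")   # ~62 nK
M_27 = mass_for_temperature(2.7)
print(f"M(T_H = 2.7 K) = {M_27:.2e} kg = {M_27 / M_moon:.2f} Moon masses")   # ~0.6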
They are not equivalent, however, because, according to general relativity without quantum mechanics, a black hole can never emit radiation, and thus its temperature must always be zero.: 11 Quantum mechanics predicts that a black hole will continuously emit thermal Hawking radiation, and therefore must always have a nonzero temperature. It also predicts that all black holes have entropy which scales with their surface area. When quantum mechanics is accounted for, the laws of black hole mechanics become equivalent to the classical laws of thermodynamics. However, these conclusions are derived without a complete theory of quantum gravity, although many potential theories do predict black holes having entropy and temperature. Thus, the true quantum nature of black hole thermodynamics continues to be debated.: 29 Observational evidence Millions of black holes with around 30 solar masses derived from stellar collapse are expected to exist in the Milky Way. Even a dwarf galaxy like Draco should have hundreds. Only a few of these have been detected. By nature, black holes do not themselves emit any electromagnetic radiation other than the hypothetical Hawking radiation, so astrophysicists searching for black holes must generally rely on indirect observations. The defining characteristic of a black hole is its event horizon. The horizon itself cannot be imaged, so all other possible explanations for these indirect observations must be considered and eliminated before concluding that a black hole has been observed.: 11 The Event Horizon Telescope (EHT) is a global system of radio telescopes capable of directly observing a black hole shadow. The angular resolution of a telescope is based on its aperture and the wavelengths it is observing. Because the angular diameters of Sagittarius A* and Messier 87* in the sky are very small, a single telescope would need to be about the size of the Earth to clearly distinguish their horizons using radio wavelengths. By combining data from several different radio telescopes around the world, the Event Horizon Telescope creates an effective aperture the diameter size of the Earth. The EHT team used imaging algorithms to compute the most probable image from the data in its observations of Sagittarius A* and M87*. Gravitational-wave interferometry can be used to detect merging black holes and other compact objects. In this method, a laser beam is split down two long arms of a tunnel. The laser beams reflect off of mirrors in the tunnels and converge at the intersection of the arms, cancelling each other out. However, when a gravitational wave passes, it warps spacetime, changing the lengths of the arms themselves. Since each laser beam is now travelling a slightly different distance, they do not cancel out and produce a recognizable signal. Analysis of the signal can give scientists information about what caused the gravitational waves. Since gravitational waves are very weak, gravitational-wave observatories such as LIGO must have arms several kilometers long and carefully control for noise from Earth to be able to detect these gravitational waves. Since the first measurements in 2016, multiple gravitational waves from black holes have been detected and analyzed. The proper motions of stars near the centre of the Milky Way provide strong observational evidence that these stars are orbiting a supermassive black hole. Since 1995, astronomers have tracked the motions of 90 stars orbiting an invisible object coincident with the radio source Sagittarius A*. 
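The aperture argument above can be made quantitative. A minimal sketch, assuming the EHT observing wavelength of 1.3 mm, an Earth-diameter baseline, approximate values for the Sgr A* mass and distance, and the standard Schwarzschild shadow radius of 3√3 GM/c² (the last figure is an assumption not stated in the text):

import math

G, c, M_sun = 6.674e-11, 2.998e8, 1.989e30
RAD_TO_MICROARCSEC = 180 / math.pi * 3600 * 1e6

wavelength = 1.3e-3              # m
D_earth    = 1.27e7              # m, Earth's diameter as the effective baseline
M          = 4.3e6 * M_sun       # Sgr A* (approximate)
d          = 8.2e3 * 3.086e16    # ~8.2 kpc in metres (approximate)

r_g = G * M / c**2
shadow_diameter = 2 * 3 * math.sqrt(3) * r_g     # twice the critical impact parameter
theta_shadow = shadow_diameter / d
theta_res    = wavelength / D_earth              # diffraction limit, lambda / D

print(f"shadow     ~ {theta_shadow * RAD_TO_MICROARCSEC:.0f} microarcseconds")  # ~50
print(f"resolution ~ {theta_res * RAD_TO_MICROARCSEC:.0f} microarcseconds")     # ~20

Since the diffraction limit of an Earth-sized aperture is only a factor of a few smaller than the shadow itself, nothing smaller than a planet-wide array of telescopes can resolve it at radio wavelengths.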
In 1998, by fitting the motions of the stars to Keplerian orbits, the astronomers were able to infer that a mass of 2.6×10^6 M☉ must be contained within a radius of 0.02 light-years. Since then, one of the stars, called S2, has completed a full orbit. From the orbital data, astronomers were able to refine the calculations of the mass of Sagittarius A* to 4.3×10^6 M☉, contained within a radius of less than 0.002 light-years. This upper limit on the radius is still larger than the Schwarzschild radius for the estimated mass, so the combination does not prove Sagittarius A* is a black hole. Nevertheless, these observations strongly suggest that the central object is a supermassive black hole as there are no other plausible scenarios for confining so much invisible mass into such a small volume. Additionally, there is some observational evidence that this object might possess an event horizon, a feature unique to black holes. The Event Horizon Telescope image of Sagittarius A*, released in 2022, provided further confirmation that it is indeed a black hole. X-ray binaries are binary systems that emit a majority of their radiation in the X-ray part of the electromagnetic spectrum. These X-ray emissions result when a compact object accretes matter from an ordinary star. The presence of an ordinary star in such a system provides an opportunity to study the central object and to determine whether it might be a black hole. By measuring the orbital period of the binary, the distance to the binary from Earth, and the mass of the companion star, scientists can estimate the mass of the compact object. The Tolman–Oppenheimer–Volkoff limit (TOV limit) sets the largest mass a nonrotating neutron star can have, and is estimated to be about two solar masses. While a rotating neutron star can be slightly more massive, if the compact object is much more massive than the TOV limit, it cannot be a neutron star and is generally expected to be a black hole. The first strong candidate for a black hole, Cygnus X-1, was discovered in this way by Charles Thomas Bolton, Louise Webster, and Paul Murdin in 1972. Observations of rotational broadening of the optical star reported in 1986 led to a compact-object mass estimate of 16 solar masses, with 7 solar masses as the lower bound. In 2011, this estimate was updated to 14.1±1.0 M☉ for the black hole and 19.2±1.9 M☉ for the optical stellar companion. X-ray binaries can be categorized as either low-mass or high-mass; this classification is based on the mass of the companion star, not the compact object itself. In a class of X-ray binaries called soft X-ray transients, the companion star is of relatively low mass, allowing for more accurate estimates of the black hole mass. These systems actively emit X-rays for only several months once every 10–50 years. During the period of low X-ray emission, called quiescence, the accretion disk is extremely faint, allowing detailed observation of the companion star. Numerous black hole candidates have been measured by this method. Black holes are also sometimes found in binaries with other compact objects, such as white dwarfs, neutron stars, and other black holes. The centre of nearly every galaxy contains a supermassive black hole. The close observational correlation between the mass of this hole and the velocity dispersion of the host galaxy's bulge, known as the M–sigma relation, strongly suggests a connection between the formation of the black hole and that of the galaxy itself.
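The orbit-fitting argument described above is, at its core, Kepler's third law. A minimal sketch using rounded, illustrative orbital elements for S2 (a semi-major axis of roughly 1,000 au and a period of roughly 16 years) and neglecting the star's own mass:

import math

G     = 6.674e-11    # m^3 kg^-1 s^-2
M_sun = 1.989e30     # kg
AU    = 1.496e11     # m
year  = 3.156e7      # s

a = 1000 * AU        # S2 semi-major axis, approximate
T = 16 * year        # S2 orbital period, approximate

# Kepler's third law solved for the central mass: M = 4*pi^2 * a^3 / (G * T^2)
M = 4 * math.pi**2 * a**3 / (G * T**2)
print(f"M(Sgr A*) ~ {M / M_sun:.1e} M_sun")   # on the order of 4e6 solar masses

Even with rounded inputs, the result lands at a few million solar masses, consistent with the refined value quoted above.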
Astronomers use the term active galaxy to describe galaxies with unusual characteristics, such as unusual spectral line emission and very strong radio emission. Theoretical and observational studies have shown that the high levels of activity in the centers of these galaxies, regions called active galactic nuclei (AGN), may be explained by accretion onto supermassive black holes. These AGN consist of a central black hole that may be millions or billions of times more massive than the Sun, a disk of interstellar gas and dust called an accretion disk, and two jets perpendicular to the accretion disk. Although supermassive black holes are expected to be found in most AGN, only some galaxies' nuclei have been more carefully studied in attempts to both identify and measure the actual masses of the central supermassive black hole candidates. Some of the most notable galaxies with supermassive black hole candidates include the Andromeda Galaxy, Messier 32, Messier 87, the Sombrero Galaxy, and the Milky Way itself. Another way black holes can be detected is through observation of effects caused by their strong gravitational field. One such effect is gravitational lensing: The deformation of spacetime around a massive object causes light rays to be deflected, making objects behind them appear distorted. When the lensing object is a black hole, this effect can be strong enough to create multiple images of a star or other luminous source. However, the distance between the lensed images may be too small for contemporary telescopes to resolve—this phenomenon is called microlensing. Instead of seeing two images of a lensed star, astronomers see the star brighten slightly as the black hole moves towards the line of sight between the star and Earth and then return to its normal luminosity as the black hole moves away. The turn of the millennium saw the first 3 candidate detections of black holes in this way, and in January 2022, astronomers reported the first confirmed detection of a microlensing event from an isolated black hole. This was also the first determination of an isolated black hole mass, 7.1±1.3 M☉. Alternatives While there is a strong case for supermassive black holes, the model for stellar-mass black holes assumes of an upper limit for the mass of a neutron star: objects observed to have more mass are assumed to be black holes. However, the properties of extremely dense matter are poorly understood. New exotic phases of matter could allow other kinds of massive objects. Quark stars would be made up of quark matter and supported by quark degeneracy pressure, a form of degeneracy pressure even stronger than neutron degeneracy pressure. This would halt gravitational collapse at a higher mass than for a neutron star. Even stronger stars called electroweak stars would convert quarks in their cores into leptons, providing additional pressure to stop the star from collapsing. If, as some extensions of the Standard Model posit, quarks and leptons are made up of the even-smaller fundamental particles called preons, a very compact star could be supported by preon degeneracy pressure. 
While none of these hypothetical models can explain all of the observations of stellar black hole candidates, a Q star is the only alternative which could significantly exceed the mass limit for neutron stars and thus provide an alternative for supermassive black holes.: 12 A few theoretical objects have been conjectured to match observations of astronomical black hole candidates identically or near-identically, but which function via a different mechanism. A dark energy star would convert infalling matter into vacuum energy; This vacuum energy would be much larger than the vacuum energy of outside space, exerting outwards pressure and preventing a singularity from forming. A black star would be gravitationally collapsing slowly enough that quantum effects would keep it just on the cusp of fully collapsing into a black hole. A gravastar would consist of a very thin shell and a dark-energy interior providing outward pressure to stop the collapse into a black hole or formation of a singularity; It could even have another gravastar inside, called a 'nestar'. Open questions According to the no-hair theorem, a black hole is defined by only three parameters: its mass, charge, and angular momentum. This seems to mean that all other information about the matter that went into forming the black hole is lost, as there is no way to determine anything about the black hole from outside other than those three parameters. When black holes were thought to persist forever, this information loss was not problematic, as the information can be thought of as existing inside the black hole. However, black holes slowly evaporate by emitting Hawking radiation. This radiation does not appear to carry any additional information about the matter that formed the black hole, meaning that this information is seemingly gone forever. This is called the black hole information paradox. Theoretical studies analyzing the paradox have led to both further paradoxes and new ideas about the intersection of quantum mechanics and general relativity. While there is no consensus on the resolution of the paradox, work on the problem is expected to be important for a theory of quantum gravity.: 126 Observations of faraway galaxies have found that ultraluminous quasars, powered by supermassive black holes, existed in the early universe as far as redshift z ≥ 7 {\displaystyle z\geq 7} . These black holes have been assumed to be the products of the gravitational collapse of large population III stars. However, these stellar remnants were not massive enough to produce the quasars observed at early times without accreting beyond the Eddington limit, the theoretical maximum rate of black hole accretion. Physicists have suggested a variety of different mechanisms by which these supermassive black holes may have formed. It has been proposed that smaller black holes may have also undergone mergers to produce the observed supermassive black holes. It is also possible that they were seeded by direct-collapse black holes, in which a large cloud of hot gas avoids fragmentation that would lead to multiple stars, due to low angular momentum or heating from a nearby galaxy. Given the right circumstances, a single supermassive star forms and collapses directly into a black hole without undergoing typical stellar evolution. Additionally, these supermassive black holes in the early universe may be high-mass primordial black holes, which could have accreted further matter in the centers of galaxies. 
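The timing problem sketched above can be made concrete. Under Eddington-limited accretion the mass e-folds on the Salpeter timescale, t_S = [η/(1−η)] σ_T c / (4πGm_p); the 10% radiative efficiency below is an assumed, illustrative value:

import math

G, c = 6.674e-11, 2.998e8
m_p, sigma_T = 1.673e-27, 6.652e-29
M_sun, Myr = 1.989e30, 3.156e13          # Myr in seconds

eta = 0.1                                 # assumed radiative efficiency
t_salpeter = eta / (1 - eta) * sigma_T * c / (4 * math.pi * G * m_p)

def growth_time(M_seed, M_final):
    """Time to grow from M_seed to M_final at the Eddington rate."""
    return t_salpeter * math.log(M_final / M_seed)

print(f"t_Salpeter ~ {t_salpeter / Myr:.0f} Myr")                     # ~50 Myr
print(f"100 M_sun -> 1e9 M_sun : {growth_time(100, 1e9) / Myr:.0f} Myr")   # ~800 Myr
print(f"1e5 M_sun -> 1e9 M_sun : {growth_time(1e5, 1e9) / Myr:.0f} Myr")   # ~460 Myr

Growing a billion-solar-mass quasar from a ~100 M☉ stellar remnant uses essentially all of the time available before z ≥ 7, which is why heavier direct-collapse or primordial seeds, or super-Eddington episodes, are invoked.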
Finally, certain mechanisms allow black holes to grow faster than the theoretical Eddington limit, for example when dense gas in the accretion disk traps the outgoing radiation that would otherwise limit the rate of infall. However, the formation of bipolar jets can prevent sustained super-Eddington rates. In fiction Black holes have been portrayed in science fiction in a variety of ways. Even before the advent of the term itself, objects with characteristics of black holes appeared in stories such as the 1928 novel The Skylark of Space with its "black Sun" and the "hole in space" in the 1935 short story Starship Invincible. As black holes grew to public recognition in the 1960s and 1970s, they began to be featured in films as well as novels, such as Disney's The Black Hole. Black holes have also been used in works of the 21st century, such as Christopher Nolan's science fiction epic Interstellar. Authors and screenwriters have exploited the relativistic effects of black holes, particularly gravitational time dilation. For example, Interstellar features a planet orbiting close to a black hole with a time dilation factor of over 60,000:1, while the 1977 novel Gateway depicts a spaceship approaching but never crossing the event horizon of a black hole from the perspective of an outside observer due to time dilation effects. Black holes have also been appropriated as wormholes or other methods of faster-than-light travel, such as in the 1974 novel The Forever War, where a network of black holes is used for interstellar travel. Additionally, black holes can feature as hazards to spacefarers and planets: a black hole threatens a deep-space outpost in the 1978 short story The Black Hole Passes, and a binary black hole dangerously alters the orbit of a planet in the 2018 Netflix reboot of Lost in Space.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Extraterrestrial_life#Search_for_basic_life] | [TOKENS: 11349]
Contents Extraterrestrial life Extraterrestrial life, or alien life (colloquially aliens), is life that originates from another world rather than on Earth. No extraterrestrial life has yet been scientifically or conclusively detected. Such life might range from simple forms such as prokaryotes to intelligent beings, possibly bringing forth civilizations that might be far more, or far less, advanced than humans. The Drake equation speculates about the existence of sapient life elsewhere in the universe. The science of extraterrestrial life is known as astrobiology. Speculation about inhabited worlds beyond Earth dates back to antiquity. Early Christian writers, including Augustine, discussed ideas from thinkers like Democritus and Epicurus about countless worlds in the vast universe. Pre-modern writers typically assumed extraterrestrial "worlds" were inhabited by living beings. William Vorilong, in the 15th century, acknowledged the possibility Jesus could have visited extraterrestrial worlds to redeem their inhabitants.: 26 In 1440, Nicholas of Cusa suggested Earth is a "brilliant star"; he theorized that all celestial bodies, even the Sun, could host life. Descartes wrote that there were no means to prove the stars were not inhabited by "intelligent creatures", but their existence was a matter of speculation.: 67 In comparison to the life-abundant Earth, the vast majority of intrasolar and extrasolar planets and moons have harsh surface conditions and disparate atmospheric chemistry, or lack an atmosphere. However, there are many extreme and chemically harsh ecosystems on Earth that do support forms of life and are often hypothesized to be the origin of life on Earth. Examples include life surrounding hydrothermal vents, acidic hot springs, and volcanic lakes, as well as halophiles and the deep biosphere. Since the mid-20th century, researchers have searched for extraterrestrial life and intelligence. Solar system studies focus on Venus, Mars, Europa, and Titan, while exoplanet discoveries now total 6,022 confirmed planets in 4,490 systems as of October 2025. Depending on the category of search, methods range from analysis of telescope and specimen data to radios used to detect and transmit interstellar communication. Interstellar travel remains largely hypothetical, with only the Voyager 1 and Voyager 2 probes confirmed to have entered the interstellar medium. The concept of extraterrestrial life, especially intelligent life, has greatly influenced culture and fiction. A key debate centers on contacting extraterrestrial intelligence: some advocate active attempts, while others warn it could be risky, given human history of exploiting other societies. Context Initially, after the Big Bang, the universe was too hot to allow life. It is estimated that the temperature of the universe was around 10 billion Kelvin at the one-second mark. Roughly 15 million years later, it cooled to temperate levels, though the elements of organic life were yet nonexistent. The only freely available elements at that point were hydrogen and helium. Carbon and oxygen (and later, water) would not appear until 50 million years later, created through stellar fusion. At that point, the difficulty for life to appear was not the temperature, but the scarcity of free heavy elements. Planetary systems emerged, and the first organic compounds may have formed in the protoplanetary disk of dust grains that would eventually create rocky planets like Earth. 
Although Earth was in a molten state after its birth and may have burned any organics that fell on it, it would have been more receptive once it cooled down. Once the right conditions on Earth were met, life started by a chemical process known as abiogenesis. Alternatively, life may have formed less frequently, then spread—by meteoroids, for example—between habitable planets in a process called panspermia. During most of their stellar evolution, stars fuse hydrogen nuclei into helium nuclei, and the slightly lower mass of the resulting helium is released as energy. The process continues until the star uses all of its available fuel, with the speed of consumption being related to the size of the star. During their last stages, stars start combining helium nuclei to form carbon nuclei. The larger stars can further fuse carbon into heavier elements such as oxygen, neon, silicon, and sulfur, and so on up to iron. Ultimately, the star blows much of its content back into the interstellar medium, where it joins clouds that will eventually become new generations of stars and planets. Many of those materials are the raw components of life on Earth. As this process takes place throughout the universe, these materials are ubiquitous in the cosmos and not unique to the Solar System. Earth is a planet in the Solar System, a planetary system formed by a star at the center, the Sun, and the objects that orbit it: other planets, moons, asteroids, and comets. The Sun is part of the Milky Way galaxy. The Milky Way is part of the Local Group, a galaxy group that is in turn part of the Laniakea Supercluster. The universe is composed of all similar structures in existence. The immense distances between celestial objects make the study of extraterrestrial life difficult. So far, humans have only set foot on the Moon and sent robotic probes to other planets and moons in the Solar System. Although probes can withstand conditions that may be lethal to humans, the distances cause time delays: the New Horizons probe took nine years after launch to reach Pluto. No probe has ever reached an extrasolar planetary system. Voyager 2 left the Solar System at a speed of 50,000 kilometers per hour; if it headed towards the Alpha Centauri system, the closest one to Earth at 4.4 light years, it would reach it in about 100,000 years. With current technology, such systems can only be studied by telescopes, which have limitations. Dark matter is estimated to contain more combined mass than stars and gas clouds, but as it plays no role in the evolution of stars and planets, it is usually not taken into account by astrobiology. There is an area around a star, the circumstellar habitable zone or "Goldilocks zone", wherein water may be at the right temperature to exist in liquid form at a planetary surface. This area is neither too close to the star, where water would become steam, nor too far away, where water would be frozen as ice. However, although useful as an approximation, planetary habitability is complex and defined by several factors. Being in the habitable zone is not enough for a planet to be habitable, or even to actually have such liquid water. Venus is located in the solar system's habitable zone, but does not have liquid water because of the conditions of its atmosphere. Jovian planets or gas giants are not considered habitable even if they orbit close enough to their stars as hot Jupiters, due to crushing atmospheric pressures. 
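A minimal Python sketch of the arithmetic behind two of the figures above: the Voyager 2 travel time quoted in this paragraph, and a commonly used rough scaling for where the habitable zone lies. The light-year conversion factor, the inverse-square flux argument, and the 0.95–1.37 AU solar boundaries are textbook approximations assumed here, not values taken from this article.

```python
# 1) Travel-time check for Voyager 2, using the figures quoted in the text.
KM_PER_LIGHT_YEAR = 9.461e12      # assumed conversion factor
HOURS_PER_YEAR = 24 * 365.25

speed_km_h = 50_000               # Voyager 2's speed leaving the Solar System
distance_ly = 4.4                 # distance to the Alpha Centauri system
years = distance_ly * KM_PER_LIGHT_YEAR / speed_km_h / HOURS_PER_YEAR
print(f"Travel time: {years:,.0f} years")   # ~95,000 years, i.e. on the order of 100,000

# 2) Rough habitable-zone scaling: to receive the same stellar flux as Earth,
#    a planet must orbit at d = sqrt(L / L_sun) astronomical units, so the zone's
#    boundaries shift with the square root of the star's luminosity.
def habitable_zone_au(luminosity_solar, inner_au=0.95, outer_au=1.37):
    """Approximate inner/outer habitable-zone radii for a star of the given
    luminosity; the solar boundaries are assumed, commonly cited estimates."""
    scale = luminosity_solar ** 0.5
    return inner_au * scale, outer_au * scale

print(habitable_zone_au(1.0))     # the Sun: roughly (0.95, 1.37) AU
print(habitable_zone_au(0.04))    # a dim red dwarf: zone pulled in to ~0.19-0.27 AU
```

The red-dwarf case in the sketch illustrates why, as the next paragraph notes, the actual distances for the habitable zone depend on the type of star.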
The actual distances for the habitable zones vary according to the type of star, and even the solar activity of each specific star influences the local habitability. The type of star also determines how long the habitable zone will exist, as its presence and limits change along with the star's stellar evolution. The Big Bang occurred 13.8 billion years ago, the Solar System was formed 4.6 billion years ago, and the first hominids appeared 6 million years ago. Life on other planets may have started, evolved, given birth to extraterrestrial intelligences, and perhaps even faced a planetary extinction event millions or billions of years ago. Considered from a cosmic perspective, the brief lifespans of Earth's species suggest that extraterrestrial life may be equally fleeting on such a scale. During a period of about 7 million years, from about 10 to 17 million years after the Big Bang, the background temperature was between 373 and 273 K (100 and 0 °C; 212 and 32 °F), allowing the possibility of liquid water if any planets existed. Avi Loeb (2014) speculated that primitive life might in principle have appeared during this window, which he called "the Habitable Epoch of the Early Universe". Life on Earth is quite ubiquitous across the planet and has adapted over time to almost all the available environments in it; extremophiles and the deep biosphere thrive in even the most hostile ones. As a result, it is inferred that life on other celestial bodies may be equally adaptive. However, the origin of life is unrelated to its ease of adaptation and may have stricter requirements. A celestial body may not have any life on it, even if it were habitable. Likelihood of existence Life in the cosmos beyond Earth has never been observed. The hypothesis of ubiquitous extraterrestrial life relies on three main ideas. The first is that the size of the universe allows for plenty of planets with a habitability similar to Earth's, and that the age of the universe gives enough time for a long process analogous to the history of Earth to happen there. The second is that the substances that make life, such as carbon and water, are ubiquitous in the universe. The third is that the physical laws are universal, which means that the forces that would facilitate or prevent the existence of life would be the same ones as on Earth. According to this argument, made by scientists such as Carl Sagan and Stephen Hawking, it would be improbable for life not to exist somewhere other than Earth. This argument is embodied in the Copernican principle, which states that Earth does not occupy a unique position in the Universe, and the mediocrity principle, which states that there is nothing special about life on Earth. Other authors consider instead that life in the cosmos, or at least multicellular life, may actually be rare. The Rare Earth hypothesis maintains that life on Earth is possible because of a series of factors that range from the location in the galaxy and the configuration of the Solar System to local characteristics of the planet, and that it is unlikely that another planet simultaneously meets all such requirements. The proponents of this hypothesis consider that very little evidence suggests the existence of extraterrestrial life and that, at this point, it is just a desired result and not a reasonable scientific explanation for any gathered data. 
In 1961, astronomer and astrophysicist Frank Drake devised the Drake equation as a way to stimulate scientific dialogue at a meeting on the search for extraterrestrial intelligence (SETI). The Drake equation is a probabilistic argument used to estimate the number of active, communicative extraterrestrial civilizations in the Milky Way galaxy. The Drake equation is N = R* · fp · ne · fl · fi · fc · L, where N is the number of detectable civilizations in the Milky Way, R* is the average rate of star formation, fp is the fraction of stars with planets, ne is the average number of potentially habitable planets per star that has planets, fl is the fraction of those planets that develop life, fi is the fraction of those that develop intelligent life, fc is the fraction of civilizations that release detectable signals, and L is the length of time over which such civilizations remain detectable. Drake's proposed estimates are as follows, though the numbers on the right side of the equation are agreed to be speculative and open to substitution: 10,000 = 5 · 0.5 · 2 · 1 · 0.2 · 1 · 10,000. The Drake equation has proved controversial since, although it is written as a mathematical equation, none of its values were known at the time. Although some values may eventually be measured, others are based on social sciences and are not knowable by their very nature. This prevents any firm conclusions from being drawn from the equation. Based on observations from the Hubble Space Telescope, there are nearly 2 trillion galaxies in the observable universe. It is estimated that at least ten percent of all Sun-like stars have a system of planets. In other words, there are 6.25×10^18 stars with planets orbiting them in the observable universe. Even if it is assumed that only one out of a billion of these stars has planets supporting life, there would be some 6.25 billion life-supporting planetary systems in the observable universe. A 2013 study based on results from the Kepler spacecraft estimated that the Milky Way contains at least as many planets as it does stars, resulting in 100–400 billion exoplanets. The nebular hypothesis, which explains the formation of the Solar System and other planetary systems, suggests that planetary systems can have many configurations, and not all of them will have rocky planets within the habitable zone. The apparent contradiction between high estimates of the probability of the existence of extraterrestrial civilisations and the lack of evidence for such civilisations is known as the Fermi paradox. Dennis W. Sciama claimed that life's existence in the universe depends on various fundamental constants. Zhi-Wei Wang and Samuel L. Braunstein suggest that a random universe capable of supporting life is likely to be just barely able to do so, offering a potential explanation of the Fermi paradox. Biochemical basis If extraterrestrial life exists, it could range from simple microorganisms and multicellular organisms similar to animals or plants, to complex alien intelligences akin to humans. When scientists talk about extraterrestrial life, they consider all those types. Although it is possible that extraterrestrial life may have other configurations, scientists use the hierarchy of lifeforms from Earth for simplicity, as it is the only one known to exist. The first basic requirement for life is an environment with non-equilibrium thermodynamics, which means that the thermodynamic equilibrium must be broken by a source of energy. The traditional sources of energy in the cosmos are stars, as for life on Earth, which depends on the energy of the Sun. However, there are other alternative energy sources, such as volcanoes, plate tectonics, and hydrothermal vents. There are ecosystems on Earth in deep areas of the ocean that do not receive sunlight, and take energy from black smokers instead. Magnetic fields and radioactivity have also been proposed as sources of energy, although they would be less efficient ones. 
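A minimal Python sketch of the two calculations above, using the illustrative Drake values and the observable-universe figures quoted in the text. The variable names are the conventional Drake-equation symbols; none of the numbers are measurements.

```python
# Drake equation with the illustrative values quoted above: N = R* fp ne fl fi fc L
R_star = 5        # average rate of star formation per year
f_p    = 0.5      # fraction of stars with planets
n_e    = 2        # potentially habitable planets per star that has planets
f_l    = 1        # fraction of those planets that actually develop life
f_i    = 0.2      # fraction of those that develop intelligent life
f_c    = 1        # fraction of civilizations that release detectable signals
L      = 10_000   # years such a civilization remains detectable

N = R_star * f_p * n_e * f_l * f_i * f_c * L
print(N)  # 10000.0 detectable civilizations in the Milky Way

# Back-of-envelope check of the observable-universe figure quoted above.
stars_with_planets = 6.25e18
life_bearing_fraction = 1e-9          # "one out of a billion"
print(stars_with_planets * life_bearing_fraction)  # 6.25e9 life-supporting systems
```

As the text notes, the point of the exercise is not the output number, which follows trivially from whatever inputs are chosen, but making explicit which factors the estimate depends on.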
Life on Earth requires water in a liquid state as a solvent in which biochemical reactions take place. It is highly unlikely that an abiogenesis process can start within a gaseous or solid medium: the speeds of the atoms, either too fast or too slow, make it difficult for specific ones to meet and start chemical reactions. A liquid medium also allows the transport of nutrients and substances required for metabolism. Sufficient quantities of carbon and other elements, along with water, might enable the formation of living organisms on terrestrial planets with a chemical make-up and temperature range similar to those of Earth. Life based on ammonia rather than water has been suggested as an alternative, though this solvent appears less suitable than water. It is also conceivable that there are forms of life whose solvent is a liquid hydrocarbon, such as methane, ethane or propane. Another unknown aspect of potential extraterrestrial life would be the chemical elements that would compose it. Life on Earth is largely composed of carbon, but there could be other hypothetical types of biochemistry. A replacement for carbon would need to be able to create complex molecules, store information required for evolution, and be freely available in the medium. To create DNA, RNA, or a close analog, such an element should be able to bind its atoms with many others, creating complex and stable molecules. It should be able to create at least three covalent bonds: two for making long strings and at least a third to add new links and allow for diverse information. Only nine elements meet this requirement: boron, nitrogen, phosphorus, arsenic, antimony (three bonds), carbon, silicon, germanium and tin (four bonds). As for abundance, carbon, nitrogen, and silicon are the most abundant ones in the universe, far more than the others. In Earth's crust the most abundant of those elements is silicon; in the hydrosphere it is carbon; and in the atmosphere, it is carbon and nitrogen. Silicon, however, has disadvantages over carbon. The molecules formed with silicon atoms are less stable, and more vulnerable to acids, oxygen, and light. An ecosystem of silicon-based lifeforms would require very low temperatures, high atmospheric pressure, an atmosphere devoid of oxygen, and a solvent other than water. The low temperatures required would add an extra problem: the difficulty of kick-starting an abiogenesis process to create life in the first place. Norman Horowitz, head of the Jet Propulsion Laboratory bioscience section for the Mariner and Viking missions from 1965 to 1976, considered that the great versatility of the carbon atom makes it the element most likely to provide solutions, even exotic solutions, to the problems of survival of life on other planets. However, he also considered that the conditions found on Mars were incompatible with carbon-based life. Even if extraterrestrial life is based on carbon and uses water as a solvent, like Earth life, it may still have a radically different biochemistry. Life is generally considered to be a product of natural selection. It has been proposed that to undergo natural selection a living entity must have the capacity to replicate itself, the capacity to avoid damage/decay, and the capacity to acquire and process resources in support of the first two capacities. Life on Earth may have started with an RNA world and later evolved to its current form, where some of the RNA tasks were transferred to DNA and proteins. 
Extraterrestrial life may still be stuck using RNA, or may have evolved into other configurations. It is unclear if our biochemistry is the most efficient one that could be generated, or which elements would follow a similar pattern. However, it is likely that, even if cells had a different composition from those on Earth, they would still have a cell membrane. Life on Earth jumped from prokaryotes to eukaryotes and from unicellular organisms to multicellular organisms through evolution. So far no alternative process to achieve such a result has been conceived, even hypothetically. Evolution requires life to be divided into individual organisms, and no alternative organisation has been satisfactorily proposed either. At the basic level, membranes define the limit of a cell, between it and its environment, while remaining partially open to exchange energy and resources with it. The evolution from simple cells to eukaryotes, and from them to multicellular lifeforms, is not guaranteed. The Cambrian explosion took place thousands of millions of years after the origin of life, and its causes are not fully known yet. On the other hand, the jump to multicellularity took place several times, which suggests that it could be a case of convergent evolution, and so likely to take place on other planets as well. Palaeontologist Simon Conway Morris considers that convergent evolution would lead to kingdoms similar to our plants and animals, and that many features are likely to develop in alien animals as well, such as bilateral symmetry, limbs, digestive systems and heads with sensory organs. Scientists from the University of Oxford analysed the question from the perspective of evolutionary theory and wrote in a study in the International Journal of Astrobiology that aliens may be similar to humans. The planetary context would also have an influence: a planet with higher gravity would have smaller animals, and other types of stars can lead to non-green photosynthesizers. The amount of energy available would also affect biodiversity, as an ecosystem sustained by black smokers or hydrothermal vents would have less energy available than those sustained by a star's light and heat, and so its lifeforms would not grow beyond a certain complexity. There is also research assessing the capacity of life to develop intelligence. It has been suggested that this capacity arises with the number of potential niches a planet contains, and that the complexity of life itself is reflected in the information density of planetary environments, which in turn can be computed from those niches. The conditions on other planets in the Solar System, and on planets in the many galaxies beyond the Milky Way, are generally very harsh and seem too extreme to harbor any life. The environments on these planets can combine intense UV radiation with extreme temperatures, a lack of water, and other factors that do not seem to favor the creation or maintenance of extraterrestrial life. However, there is considerable evidence that some of the earliest and most basic forms of life on Earth originated in extreme environments that at first glance seem unlikely to have harbored life. Fossil evidence and long-standing theories backed by years of research have marked environments such as hydrothermal vents and acidic hot springs as some of the first places where life could have originated on Earth. 
These environments can be considered extreme when compared to the typical ecosystems that the majority of life on Earth now inhabits, as hydrothermal vents are scorching hot due to the magma escaping from the Earth's mantle and meeting the much colder oceanic water. Even today, diverse populations of bacteria inhabit the areas surrounding these hydrothermal vents, which suggests that some form of life could be supported even in environments as harsh as those on other planets in the Solar System. What makes these harsh environments promising for the origin of life on Earth, as well as for the possible creation of life on other planets, is the chemical reactions that form spontaneously within them. For example, the hydrothermal vents found on the ocean floor are known to support many chemosynthetic processes which allow organisms to utilize energy through reduced chemical compounds that fix carbon. In turn, these reactions allow organisms to live in relatively low-oxygen environments while maintaining enough energy to support themselves. The early Earth environment was reducing, and therefore these carbon-fixing compounds were necessary for the survival and possible origin of life on Earth. From the little information that scientists have gathered about the atmospheres of other planets in the Milky Way galaxy and beyond, those atmospheres are most likely reducing or very low in oxygen, especially when compared with Earth's atmosphere. If there were the necessary elements and ions on these planets, the same carbon-fixing, reduced chemical compounds that occur around hydrothermal vents could also occur on these planets' surfaces and possibly result in the origin of extraterrestrial life. Planetary habitability in the Solar System The Solar System has a wide variety of planets, dwarf planets, and moons, and each one is studied for its potential to host life. Each one has its own specific conditions that may benefit or harm life. So far, the only lifeforms found are those from Earth. No extraterrestrial intelligence other than humans exists or has ever existed within the Solar System. Astrobiologist Mary Voytek points out that it would be unlikely to find large ecosystems, as they would have already been detected by now. The inner Solar System is likely devoid of life. However, Venus is still of interest to astrobiologists, as it is a terrestrial planet that was likely similar to Earth in its early stages and developed in a different way. It has a strong greenhouse effect, the hottest surface in the Solar System, sulfuric acid clouds, no remaining surface liquid water, and a thick carbon-dioxide atmosphere with enormous pressure. Comparing the two planets helps in understanding the precise differences that lead to beneficial or harmful conditions for life. Despite the conditions against life on Venus, there are suspicions that microbial life-forms may still survive in its high-altitude clouds. Mars is a cold and almost airless desert, inhospitable to life. However, recent studies revealed that water on Mars used to be quite abundant, forming rivers, lakes, and perhaps even oceans. Mars may have been habitable back then, and life on Mars may have been possible. But when the planetary core ceased to generate a magnetic field, solar winds removed the atmosphere and the planet became vulnerable to solar radiation. Ancient life-forms may still have left fossilised remains, and microbes may still survive deep underground. 
As mentioned, the gas giants and ice giants are unlikely to contain life. The most distant solar system bodies, found in the Kuiper Belt and outwards, are locked in permanent deep-freeze, but cannot be ruled out completely. Although the giant planets themselves are highly unlikely to have life, there is much hope of finding it on moons orbiting these planets. Europa, from the Jovian system, has a subsurface ocean below a thick layer of ice. Ganymede and Callisto also have subsurface oceans, but life is less likely in them because water is sandwiched between layers of solid ice. Europa would have contact between the ocean and the rocky surface, which helps the chemical reactions. It may be difficult to dig so deep in order to study those oceans, though. Enceladus, a tiny moon of Saturn with another subsurface ocean, may not need to be dug into, as it releases water into space in eruption columns. The space probe Cassini flew inside one of these, but could not make a full study because NASA did not expect this phenomenon and did not equip the probe to study ocean water. Still, Cassini detected complex organic molecules, salts, evidence of hydrothermal activity, hydrogen, and methane. Titan is the only celestial body in the Solar System besides Earth that has liquid bodies on the surface. It has rivers, lakes, and rain of hydrocarbons such as methane and ethane, and even a cycle similar to Earth's water cycle. This special context encourages speculations about lifeforms with different biochemistry, but the cold temperatures would make such chemistry take place at a very slow pace. Water is rock-solid on the surface, but Titan does have a subsurface water ocean like several other moons. However, it is of such a great depth that it would be very difficult to access it for study. Scientific search The science that searches for and studies life in the universe, both on Earth and elsewhere, is called astrobiology. Through the study of Earth's life, the only known form of life, astrobiology seeks to understand how life starts and evolves and the requirements for its continued existence. This helps to determine what to look for when searching for life on other celestial bodies. This is a complex area of study, and uses the combined perspectives of several scientific disciplines, such as astronomy, biology, chemistry, geology, oceanography, and atmospheric sciences. The scientific search for extraterrestrial life is being carried out both directly and indirectly. As of September 2017, 3,667 exoplanets in 2,747 systems have been identified, and other planets and moons in the Solar System hold the potential for hosting primitive life such as microorganisms. As of 8 February 2021, an updated status of studies on the possible detection of lifeforms on Venus (via phosphine) and Mars (via methane) had been reported. Scientists search for biosignatures within the Solar System by studying planetary surfaces and examining meteorites. Some claim to have identified evidence that microbial life has existed on Mars. In 1996, a controversial report stated that structures resembling nanobacteria were discovered in a meteorite, ALH84001, formed of rock ejected from Mars. Although all the unusual properties of the meteorite were eventually explained as the result of inorganic processes, the controversy over its discovery laid the groundwork for the development of astrobiology. 
An experiment on the two Viking Mars landers reported gas emissions from heated Martian soil samples that some scientists argue are consistent with the presence of living microorganisms. Lack of corroborating evidence from other experiments on the same samples suggests that a non-biological reaction is a more likely hypothesis. In February 2005 NASA scientists reported they may have found some evidence of extraterrestrial life on Mars. The two scientists, Carol Stoker and Larry Lemke of NASA's Ames Research Center, based their claim on methane signatures found in Mars's atmosphere resembling the methane production of some forms of primitive life on Earth, as well as on their own study of primitive life near the Rio Tinto river in Spain. NASA officials soon distanced NASA from the scientists' claims, and Stoker herself backed off from her initial assertions. In November 2011, NASA launched the Mars Science Laboratory that landed the Curiosity rover on Mars. It is designed to assess the past and present habitability on Mars using a variety of scientific instruments. The rover landed on Mars at Gale Crater in August 2012. A group of scientists at Cornell University started a catalog of microorganisms, with the way each one reacts to sunlight. The goal is to help with the search for similar organisms in exoplanets, as the starlight reflected by planets rich in such organisms would have a specific spectrum, unlike that of starlight reflected from lifeless planets. If Earth was studied from afar with this system, it would reveal a shade of green, as a result of the abundance of plants with photosynthesis. In August 2011, NASA studied meteorites found on Antarctica, finding adenine, guanine, hypoxanthine, and xanthine. Adenine and guanine are components of DNA, and the others are used in other biological processes. The studies ruled out pollution of the meteorites on Earth, as those components would not be freely available the way they were found in the samples. This discovery suggests that several organic molecules that serve as building blocks of life may be generated within asteroids and comets. In October 2011, scientists reported that cosmic dust contains complex organic compounds ("amorphous organic solids with a mixed aromatic-aliphatic structure") that could be created naturally, and rapidly, by stars. It is still unclear if those compounds played a role in the creation of life on Earth, but Sun Kwok, of the University of Hong Kong, thinks so. "If this is the case, life on Earth may have had an easier time getting started as these organics can serve as basic ingredients for life." In August 2012, and in a world first, astronomers at Copenhagen University reported the detection of a specific sugar molecule, glycolaldehyde, in a distant star system. The molecule was found around the protostellar binary IRAS 16293-2422, which is located 400 light years from Earth. Glycolaldehyde is needed to form ribonucleic acid, or RNA, which is similar in function to DNA. This finding suggests that complex organic molecules may form in stellar systems prior to the formation of planets, eventually arriving on young planets early in their formation. In December 2023, astronomers reported the first time discovery, in the plumes of Enceladus, moon of the planet Saturn, of hydrogen cyanide, a possible chemical essential for life as we know it, as well as other organic molecules, some of which are yet to be better identified and understood. 
According to the researchers, "these [newly discovered] compounds could potentially support extant microbial communities or drive complex organic synthesis leading to the origin of life." Although most searches are focused on the biology of extraterrestrial life, an extraterrestrial intelligence capable enough to develop a civilization may be detectable by other means as well. Technology may generate technosignatures, effects on the native planet that may not be caused by natural causes. There are three main types of techno-signatures considered: interstellar communications, effects on the atmosphere, and planetary-sized structures such as Dyson spheres. Organizations such as the SETI Institute search the cosmos for potential forms of communication. They started with radio waves, and now search for laser pulses as well. The challenge for this search is that there are natural sources of such signals as well, such as gamma-ray bursts and supernovae, and the difference between a natural signal and an artificial one would be in its specific patterns. Astronomers intend to use artificial intelligence for this, as it can manage large amounts of data and is devoid of biases and preconceptions. Besides, even if there is an advanced extraterrestrial civilization, there is no guarantee that it is transmitting radio communications in the direction of Earth. The length of time required for a signal to travel across space means that a potential answer may arrive decades or centuries after the initial message. The atmosphere of Earth is rich in nitrogen dioxide as a result of air pollution, which can be detectable. The natural abundance of carbon, which is also relatively reactive, makes it likely to be a basic component of the development of a potential extraterrestrial technological civilization, as it is on Earth. Fossil fuels may likely be generated and used on such worlds as well. The abundance of chlorofluorocarbons in the atmosphere can also be a clear technosignature, considering their role in ozone depletion. Light pollution may be another technosignature, as multiple lights on the night side of a rocky planet can be a sign of advanced technological development. However, modern telescopes are not strong enough to study exoplanets with the required level of detail to perceive it. The Kardashev scale proposes that a civilization may eventually start consuming energy directly from its local star. This would require giant structures built next to it, called Dyson spheres. Those speculative structures would cause an excess infrared radiation, that telescopes may notice. The infrared radiation is typical of young stars, surrounded by dusty protoplanetary disks that will eventually form planets. An older star such as the Sun would have no natural reason to have excess infrared radiation. The presence of heavy elements in a star's light-spectrum is another potential biosignature; such elements would (in theory) be found if the star were being used as an incinerator/repository for nuclear waste products. Some astronomers search for extrasolar planets that may be conducive to life, narrowing the search to terrestrial planets within the habitable zones of their stars. Since 1992, over four thousand exoplanets have been discovered (6,128 planets in 4,584 planetary systems including 1,017 multiple planetary systems as of 30 October 2025). The extrasolar planets so far discovered range in size from that of terrestrial planets similar to Earth's size to that of gas giants larger than Jupiter. 
The number of observed exoplanets is expected to increase greatly in the coming years. The Kepler space telescope has also detected a few thousand candidate planets, of which about 11% may be false positives. There is at least one planet on average per star. About 1 in 5 Sun-like stars have an "Earth-sized" planet in the habitable zone, with the nearest expected to be within 12 light-years of Earth. Assuming 200 billion stars in the Milky Way, that would be 11 billion potentially habitable Earth-sized planets in the Milky Way, rising to 40 billion if red dwarfs are included. The rogue planets in the Milky Way possibly number in the trillions. The nearest known exoplanet is Proxima Centauri b, located 4.2 light-years (1.3 pc) from Earth in the southern constellation of Centaurus. As of March 2014, the least massive exoplanet known is PSR B1257+12 A, which is about twice the mass of the Moon. The most massive planet listed on the NASA Exoplanet Archive is DENIS-P J082303.1−491201 b, about 29 times the mass of Jupiter, although according to most definitions of a planet, it is too massive to be a planet and may be a brown dwarf instead. Almost all of the planets detected so far are within the Milky Way, but there have also been a few possible detections of extragalactic planets. The study of planetary habitability also considers a wide range of other factors in determining the suitability of a planet for hosting life. One sign that a planet probably already contains life is the presence of an atmosphere with significant amounts of oxygen, since that gas is highly reactive and generally would not last long without constant replenishment. This replenishment occurs on Earth through photosynthetic organisms. One way to analyse the atmosphere of an exoplanet is through spectrography when it transits its star, though this might only be feasible with dim stars like white dwarfs. History and cultural impact The modern concept of extraterrestrial life is based on assumptions that were not commonplace during the early days of astronomy. The first explanations for the celestial objects seen in the night sky were based on mythology. Scholars from Ancient Greece were the first to consider that the universe is inherently understandable and rejected explanations based on incomprehensible supernatural forces, such as the myth of the Sun being pulled across the sky in the chariot of Apollo. They had not developed the scientific method yet and based their ideas on pure thought and speculation, but they developed precursor ideas to it, such as the idea that explanations had to be discarded if they contradicted observable facts. The discussions of those Greek scholars established many of the pillars that would eventually lead to the idea of extraterrestrial life, such as Earth being round and not flat. The cosmos was first structured in a geocentric model that considered that the sun and all other celestial bodies revolve around Earth. However, the Greeks did not consider these bodies to be worlds. In Greek understanding, the world was composed of both Earth and the celestial objects with noticeable movements. Anaximander thought that the cosmos was made from apeiron, a substance that created the world, and that the world would eventually return to the cosmos. 
Eventually two groups emerged, the atomists that thought that matter at both Earth and the cosmos was equally made of small atoms of the classical elements (earth, water, fire and air), and the Aristotelians who thought that those elements were exclusive of Earth and that the cosmos was made of a fifth one, the aether. Atomist Epicurus thought that the processes that created the world, its animals and plants should have created other worlds elsewhere, along with their own animals and plants. Aristotle thought instead that all the earth element naturally fell towards the center of the universe, and that would make it impossible for other planets to exist elsewhere. Under that reasoning, Earth was not only in the center, it was also the only planet in the universe. Cosmic pluralism, the plurality of worlds, or simply pluralism, describes the philosophical belief in numerous "worlds" in addition to Earth, which might harbor extraterrestrial life. The earliest recorded assertion of extraterrestrial human life is found in ancient scriptures of Jainism. There are multiple "worlds" mentioned in Jain scriptures that support human life. These include, among others, Bharat Kshetra, Mahavideh Kshetra, Airavat Kshetra, and Hari kshetra. Medieval Muslim writers like Fakhr al-Din al-Razi and Muhammad al-Baqir supported cosmic pluralism on the basis of the Qur'an. Chaucer's poem The House of Fame engaged in medieval thought experiments that postulated the plurality of worlds. However, those ideas about other worlds were different from the current knowledge about the structure of the universe, and did not postulate the existence of planetary systems other than the Solar System. When those authors talk about other worlds, they talk about places located at the center of their own systems, and with their own stellar vaults and cosmos surrounding them. The Greek ideas and the disputes between atomists and Aristotelians outlived the fall of the Greek empire. The Great Library of Alexandria compiled information about it, part of which was translated by Islamic scholars and thus survived the end of the Library. Baghdad combined the knowledge of the Greeks, the Indians, the Chinese and its own scholars, and the knowledge expanded through the Byzantine Empire. From there it eventually returned to Europe by the time of the Middle Ages. However, as the Greek atomist doctrine held that the world was created by random movements of atoms, with no need for a creator deity, it became associated with atheism, and the dispute intertwined with religious ones. Still, the Church did not react to those topics in a homogeneous way, and there were stricter and more permissive views within the church itself. The first known mention of the term 'panspermia' was in the writings of the 5th-century BC Greek philosopher Anaxagoras. He proposed the idea that life exists everywhere. By the time of the late Middle Ages there were many known inaccuracies in the geocentric model, but it was kept in use because naked eye observations provided limited data. Nicolaus Copernicus started the Copernican Revolution by proposing that the planets revolve around the sun rather than Earth. His proposal had little acceptance at first because, as he kept the assumption that orbits were perfect circles, his model led to as many inaccuracies as the geocentric one. Tycho Brahe improved the available data with naked-eye observatories, which worked with highly complex sextants and quadrants. 
Tycho could not make sense of his observations, but Johannes Kepler did: orbits were not perfect circles, but ellipses. This knowledge benefited the Copernican model, which now worked almost perfectly. The invention of the telescope a short time later, and its refinement by Galileo Galilei, dispelled the final doubts, and the paradigm shift was completed. Under this new understanding, the notion of extraterrestrial life became feasible: if Earth is just a planet orbiting around a star, there may be planets similar to Earth elsewhere. The astronomical study of distant bodies also proved that physical laws are the same elsewhere in the universe as on Earth, with nothing making the planet truly special. The new ideas were met with resistance from the Catholic church. Galileo was tried for the heliocentric model, which was considered heretical, and forced to recant it. The best-known early-modern proponent of ideas of extraterrestrial life was the Italian philosopher Giordano Bruno, who argued in the 16th century for an infinite universe in which every star is surrounded by its own planetary system. Bruno wrote that other worlds "have no less virtue nor a nature different to that of our earth" and, like Earth, "contain animals and inhabitants". Bruno's belief in the plurality of worlds was one of the charges leveled against him by the Roman Inquisition, which tried and executed him. The heliocentric model was further strengthened by the postulation of the theory of gravity by Sir Isaac Newton. This theory provided the mathematics that explains the motions of all things in the universe, including planetary orbits. By this point, the geocentric model was definitively discarded. By this time, the use of the scientific method had become a standard, and new discoveries were expected to provide evidence and rigorous mathematical explanations. Science also took a deeper interest in the mechanics of natural phenomena, trying to explain not just the way nature works but also the reasons it works that way. There was very little actual discussion about extraterrestrial life before this point, as the Aristotelian ideas remained influential while geocentrism was still accepted. When it was finally proved wrong, it not only meant that Earth was not the center of the universe, but also that the lights seen in the sky were not just lights, but physical objects. The notion that life might exist on them as well soon became an ongoing topic of discussion, although one with no practical ways to investigate. The possibility of extraterrestrials remained a subject of widespread speculation as scientific discovery accelerated. William Herschel, the discoverer of Uranus, was one of many 18th–19th-century astronomers who believed that the Solar System is populated by alien life. Other scholars of the period who championed "cosmic pluralism" included Immanuel Kant and Benjamin Franklin. At the height of the Enlightenment, even the Sun and Moon were considered candidates for extraterrestrial inhabitants. Speculation about life on Mars increased in the late 19th century, following telescopic observation of apparent Martian canals – which soon, however, turned out to be optical illusions. Despite this, in 1895, American astronomer Percival Lowell published his book Mars, followed by Mars and its Canals in 1906, proposing that the canals were the work of a long-gone civilisation. Spectroscopic analysis of Mars's atmosphere began in earnest in 1894, when U.S. 
astronomer William Wallace Campbell showed that neither water nor oxygen was present in the Martian atmosphere. By 1909 better telescopes and the best perihelic opposition of Mars since 1877 conclusively put an end to the canal hypothesis. As a consequence of the belief in spontaneous generation, there was little thought about the conditions of each celestial body: it was simply assumed that life would thrive anywhere. This theory was disproved by Louis Pasteur in the 19th century. Popular belief in thriving alien civilisations elsewhere in the solar system still remained strong until Mariner 4 and Mariner 9 provided close images of Mars, which debunked forever the idea of the existence of Martians and decreased the previous expectations of finding alien life in general. The end of the spontaneous generation belief forced investigation into the origin of life. Although abiogenesis is the more accepted theory, a number of authors reclaimed the term "panspermia" and proposed that life was brought to Earth from elsewhere. Some of those authors were Jöns Jacob Berzelius (1834), Kelvin (1871), Hermann von Helmholtz (1879) and, somewhat later, Svante Arrhenius (1903). The science fiction genre, although not so named at the time, developed during the late 19th century. The expansion of the genre of extraterrestrials in fiction influenced popular perception of the real-life topic, making people eager to jump to conclusions about the discovery of aliens. Science marched at a slower pace: some discoveries fueled expectations, and others dashed excessive hopes. For example, with the advent of telescopes, most structures seen on the Moon or Mars were immediately attributed to Selenites or Martians, and later instruments (such as more powerful telescopes) revealed that all such discoveries were natural features. A famous case is the Cydonia region of Mars, first imaged by the Viking 1 orbiter. The low-resolution photos showed a rock formation that resembled a human face, but later spacecraft took photos in higher detail that showed that there was nothing special about the site. The search for and study of extraterrestrial life became a science of its own, astrobiology. Also known as exobiology, this discipline is studied by NASA, ESA, INAF, and others. Astrobiology studies life from Earth as well, but with a cosmic perspective. For example, abiogenesis is of interest to astrobiology, not because of the origin of life on Earth itself, but for the chances of a similar process taking place on other celestial bodies. Many aspects of life, from its definition to its chemistry, are analyzed as either likely to be similar in all forms of life across the cosmos or only native to Earth. Astrobiology, however, remains constrained by the current lack of extraterrestrial life-forms to study, as all life on Earth comes from the same ancestor, and it is hard to infer general characteristics from a group with a single example to analyse. The 20th century came with great technological advances, speculations about future hypothetical technologies, and an increased basic knowledge of science by the general population thanks to science popularization through the mass media. The public interest in extraterrestrial life and the lack of discoveries by mainstream science led to the emergence of pseudosciences that provided affirmative, if questionable, answers about the existence of aliens. 
Ufology claims that many unidentified flying objects (UFOs) are spaceships of alien species, and the ancient astronauts hypothesis claims that aliens visited Earth in antiquity and prehistoric times but that people at the time failed to understand it. Most UFOs or UFO sightings can be readily explained as sightings of Earth-based aircraft (including top-secret aircraft), known astronomical objects or weather phenomena, or as hoaxes. Looking beyond the pseudosciences, Lewis White Beck strove to elevate the level of public discourse on the topic of extraterrestrial life by tracing the evolution of philosophical thought over the centuries from ancient times into the modern era. His review of the contributions made by Lucretius, Plutarch, Aristotle, Copernicus, Immanuel Kant, John Wilkins, Charles Darwin and Karl Marx demonstrated that even in modern times, humanity could be profoundly influenced in its search for extraterrestrial life by subtle and comforting archetypal ideas which are largely derived from firmly held religious, philosophical and existential belief systems. On a positive note, however, Beck further argued that even if the search for extraterrestrial life proves to be unsuccessful, the endeavor itself could have beneficial consequences by assisting humanity in its attempt to actualize superior ways of living here on Earth. By the 21st century, it was accepted that multicellular life in the Solar System could only exist on Earth, but the interest in extraterrestrial life increased regardless. This is a result of the advances in several sciences. The knowledge of planetary habitability allows the likelihood of finding life on each specific celestial body to be considered in scientific terms, as it is known which features are beneficial and which are harmful for life. Astronomy and telescopes have also improved to the point that exoplanets can be confirmed and even studied, increasing the number of places to search. Life may still exist elsewhere in the Solar System in unicellular form, and the advances in spacecraft allow robots to be sent to study samples in situ, with tools of growing complexity and reliability. Although no extraterrestrial life has been found and life may still turn out to be a rarity unique to Earth, there are scientific reasons to suspect that it can exist elsewhere, and technological advances that may detect it if it does. Many scientists are optimistic about the chances of finding alien life. In the words of SETI's Frank Drake, "All we know for sure is that the sky is not littered with powerful microwave transmitters". Drake noted that it is entirely possible that advanced technology results in communication being carried out in some way other than conventional radio transmission. At the same time, the data returned by space probes, and giant strides in detection methods, have allowed science to begin delineating habitability criteria on other worlds, and to confirm that at least other planets are plentiful, though aliens remain a question mark. The Wow! signal, detected in 1977 by a SETI project, remains a subject of speculative debate. On the other hand, other scientists are pessimistic. Jacques Monod wrote that "Man knows at last that he is alone in the indifferent immensity of the universe, whence he has emerged by chance". 
In 2000, geologist and paleontologist Peter Ward and astrobiologist Donald Brownlee published a book entitled Rare Earth: Why Complex Life is Uncommon in the Universe. In it, they discussed the Rare Earth hypothesis, in which they argue that Earth-like life is rare in the universe, whereas microbial life is common. Ward and Brownlee are open to the idea of evolution on other planets that is not based on essential Earth-like characteristics such as DNA and carbon. As for the possible risks, theoretical physicist Stephen Hawking warned in 2010 that humans should not try to contact alien life forms. He warned that aliens might pillage Earth for resources. "If aliens visit us, the outcome would be much as when Columbus landed in America, which didn't turn out well for the Native Americans", he said. Jared Diamond had earlier expressed similar concerns. On 20 July 2015, Hawking and Russian billionaire Yuri Milner, along with the SETI Institute, announced a well-funded effort, called the Breakthrough Initiatives, to expand efforts to search for extraterrestrial life. The group contracted the services of the 100-meter Robert C. Byrd Green Bank Telescope in West Virginia in the United States and the 64-meter Parkes Telescope in New South Wales, Australia. On 13 February 2015, scientists (including Geoffrey Marcy, Seth Shostak, Frank Drake and David Brin) at a convention of the American Association for the Advancement of Science discussed Active SETI and whether transmitting a message to possible intelligent extraterrestrials in the Cosmos was a good idea; one result was a statement, signed by many, that a "worldwide scientific, political and humanitarian discussion must occur before any message is sent". Government responses The 1967 Outer Space Treaty and the 1979 Moon Agreement define rules of planetary protection against potentially hazardous extraterrestrial life. COSPAR also provides guidelines for planetary protection. In 1977, a committee of the United Nations Office for Outer Space Affairs spent a year discussing strategies for interacting with extraterrestrial life or intelligence. The discussion ended without any conclusions. As of 2010, the UN lacked response mechanisms for the case of an extraterrestrial contact. One of the NASA divisions is the Office of Safety and Mission Assurance (OSMA), also known as the Planetary Protection Office. A part of its mission is to "rigorously preclude backward contamination of Earth by extraterrestrial life." In 2016, the Chinese Government released a white paper detailing its space program. According to the document, one of the research objectives of the program is the search for extraterrestrial life. It is also one of the objectives of the Chinese Five-hundred-meter Aperture Spherical Telescope (FAST) program. In 2020, Dmitry Rogozin, the head of the Russian space agency, said the search for extraterrestrial life is one of the main goals of deep space research. He also acknowledged the possibility of the existence of primitive life on other planets of the Solar System. The French space agency has an office for the study of "unidentified aerospace phenomena". The agency maintains a publicly accessible database of such phenomena, with over 1600 detailed entries. According to the head of the office, the vast majority of entries have a mundane explanation; but for about 25% of the entries, an extraterrestrial origin can neither be confirmed nor ruled out. 
In 2020, Isaac Ben-Israel, chairman of the Israel Space Agency, stated that the probability of detecting life in outer space is "quite large". But he disagrees with his former colleague Haim Eshed, who stated that there are contacts between an advanced alien civilisation and some of Earth's governments. In fiction Although the idea of extraterrestrial peoples became feasible once astronomy developed enough to understand the nature of planets, such beings were not at first thought of as being any different from humans. With no scientific explanation for the origin of mankind and its relation to other species, there was no reason to expect them to be any other way. This was changed by the 1859 book On the Origin of Species by Charles Darwin, which proposed the theory of evolution. Now with the notion that evolution on other planets may take other directions, science fiction authors created bizarre aliens, clearly distinct from humans. A common way to do that was to add body features from other animals, such as insects or octopuses. The feasibility of costuming and special effects, alongside budget considerations, forced films and TV series to tone down the fantasy, but these limitations have lessened since the 1990s with the advent of computer-generated imagery (CGI), and later as CGI became more effective and less expensive. Real-life events sometimes captivate people's imagination, and this influences works of fiction. For example, during the Barney and Betty Hill incident, the first recorded claim of an alien abduction, the couple reported that they were abducted and experimented on by aliens with oversized heads, big eyes, pale grey skin, and small noses, a description that eventually became the grey alien archetype once it was taken up in works of fiction.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Israel_Antiquities_Authority] | [TOKENS: 735]
Contents Israel Antiquities Authority The Israel Antiquities Authority (IAA, Hebrew: רשות העתיקות rashut ha-'atiqot; Arabic: دائرة آثار إسرائيل, before 1990, the Israel Department of Antiquities) is an independent Israeli governmental authority responsible for enforcing the 1978 Law of Antiquities. The IAA regulates excavation and conservation, and promotes research. The National Campus for the Archaeology of Israel is the new home of the IAA, located on Museum Hill in Jerusalem. In 2025, the almost completed Campus started offering guided tours. History The Israel Department of Antiquities and Museums (IDAM) of the Ministry of Education was founded on July 26, 1948, after the establishment of the State of Israel. It took over the functions of the Department of Antiquities of the British Mandate of Palestine. Originally, its activities were based on the British Mandate Department of Antiquities ordinances. IDAM was the statutory authority responsible for Israel's antiquities and for the administration of small museums. Its functions included curating and storing the state collection of antiquities, maintaining a list of registered antiquities sites, inspecting antiquities sites and registering newly discovered sites, conducting salvage and rescue operations at endangered antiquities sites, maintaining an archaeological library (the state library), and maintaining an archive. The Israel Antiquities Authority (IAA) was created from the IDAM by the Knesset (Israeli parliament) in a 1990 statute. Amir Drori became its first director. The IAA fulfilled the statutory obligations of the IDAM and in its early days was greatly expanded, from the core number of workers in IDAM to a much larger complement, and came to include the functions of the Archaeological Survey of Israel project, ending the activity of the Association for the Archaeological Survey of Israel (1964-1988). The period of expansion lasted for a number of years, but was followed by a period in which diminished fiscal resources and a reduction in funding led to large cutbacks in the size of its work force and its activities. Publications The IDAM and the IAA have published the results of excavations in a number of journals and other publications. The National Campus for the Archaeology of Israel The Jay and Jeanie Schottenstein National Campus for the Archaeology of Israel is the IAA's new headquarters building, intended to concentrate all centralized administrative offices into one structure. The campus, designed by architect Moshe Safdie, is planned to occupy 20,000 square meters between the Israel Museum and the Bible Lands Museum in Jerusalem. It also aims to exhibit approximately two million ancient artifacts and make them accessible to the public, as well as serve as a center for research, education, demonstration, display, and explanation of Israel's cultural heritage across its various cultural and religious spectrums, throughout human history. Restoration work The IAA's restoration team, which as of 2010 consisted of six members, restores potsherds, textiles, metal objects, and other items of the country's material culture discovered in archaeological excavations. Unlike their peers around the world, the team in Israel is barred by Israeli law from working with human remains. 31°46′27.54″N 35°12′7.94″E
========================================
[SOURCE: https://en.wikipedia.org/wiki/Rakugo] | [TOKENS: 2243]
Contents Rakugo Rakugo (落語; literally 'story with a fall') is a form of Japanese verbal comedy, traditionally performed in yose theatres. The lone storyteller (落語家, rakugoka) sits on a raised platform, a kōza (高座). Using only a paper fan (扇子, sensu) and a small cloth (手拭, tenugui) as props, and without standing up from the seiza sitting position, the rakugo artist depicts a long and complicated comical (or sometimes sentimental) story. The story always involves the dialogue of two or more characters. The differences between the characters are conveyed only through changes in pitch, tone, and a slight turn of the head. Description The speaker sits in the middle of the stage, and their purpose is to provoke general hilarity with tone and limited, yet specific, body gestures. The monologue always ends with a narrative stunt (punch line) known as ochi (落ち; lit. "fall") or sage (下げ; lit. "lowering"), consisting of a sudden interruption of the wordplay flow. Twelve kinds of ochi are codified and recognized, with more complex variations having evolved through time from the more basic forms. Early rakugo developed into various styles, including the shibaibanashi (芝居噺; theatre discourses), the ongyokubanashi (音曲噺; musical discourses), the kaidanbanashi (怪談噺; ghost discourses, see kaidan), and ninjōbanashi (人情噺; sentimental discourses). In many of these forms the ochi, which is essential to the original rakugo, is absent. Rakugo has been described as "a sitcom with one person playing all the parts" by Noriko Watanabe, assistant professor in the Department of Modern Languages and Comparative Literature at Baruch College. Lexical background The precursor of rakugo was called karukuchi (軽口; literally 'light-mouth'). The oldest appearance of the kanji which refers specifically to this type of performance dates back to 1787, but at the time the characters themselves (落とし噺) were normally read as otoshibanashi ("dropping story"). In the middle of the Meiji period (1868–1912) the expression rakugo first came into use, and it entered common usage only in the Shōwa period (1926–1989). History One of the predecessors of rakugo is considered to be the humorous stories found in setsuwa. The Konjaku Monogatarishū and the Uji Shūi Monogatari were setsuwa collections compiled from the Heian period (794–1185) to the Kamakura period (1185–1333); they contained many funny stories, and Japanese Buddhist monks preached Buddhism by quoting them. The Makura no Sōshi describes how such monks gained a reputation for their beautiful voices and narrative art. The direct ancestor of rakugo lies in the humorous stories narrated by otogishū in the Sengoku period (1467–1615). Otogishū were scholars, Buddhist monks, and tea masters who served daimyō (feudal lords); their duty was to give lectures on books to the daimyō and to serve as conversation partners. Anrakuan Sakuden, an otogishū and a monk of the Jōdo-shū, is often said to be the originator of rakugo, and his eight-volume Seisuishō contains 1,000 stories, including the original stories of rakugo. Around 1670 in the Edo period (1603–1867), three storytellers appeared who were regarded as the first rakugoka. Tsuyuno Gorobe in Kyoto, Yonezawa Hikohachi in Osaka, and Shikano Buzaemon in Edo built simple huts around the same time and began telling funny stories to the general public for a price. Rakugo of this period was called tsujibanashi, but once it lost popularity, the art declined for about 100 years.
In 1786, Utei Enba presided over a rakugo show at a ryōtei, a traditional Japanese catering venue, in Mukōjima. He is regarded as the father of the restoration of rakugo. His performances led to the establishment of the first theater dedicated to rakugo (yose) by Sanshōtei Karaku and Sanyūtei Enshō, and to the revival of rakugo. During the Edo period, thanks to the emergence of the merchant class of the chōnin, rakugo spread to the lower classes. Many groups of performers were formed, and collections of texts were finally printed. During the 17th century the actors were known as hanashika (found written as 噺家, 咄家, or 話家; "storyteller"), corresponding to the modern term rakugoka (落語家; "person of the falling word"). Before the advent of modern rakugo there were the kobanashi (小噺): short comical vignettes ending with an ochi, popular between the 17th and the 19th centuries. These were enacted in small public venues, or in the streets, and printed and sold as pamphlets. The origin of kobanashi is to be found in the Kinō wa kyō no monogatari (Yesterday Stories Told Today, c. 1620), the work of an unknown author collecting approximately 230 stories describing the common class. Types of ochi Niwaka ochi: An ochi using a pun; it is also called jiguchi ochi. Hyoshi ochi: An ochi that uses repeated punchlines. Sakasa ochi: An ochi with a twist punchline, one where roles are reversed. Kangae ochi: A punchline that is hard to understand, but which people will laugh at after pondering for a while. Mawari ochi: A punchline that ends the story by returning to the beginning. Mitate ochi: An ochi that uses unexpected punchlines. Manuke ochi: An ochi that ends the story with a dumb or ridiculous joke. Totan ochi: An ochi using a signature phrase. Buttsuke ochi: An ending with a punch line based on a misunderstanding. Shigusa ochi: A punchline that uses a physical gesture. Important contributors Many artists contributed to the development of rakugo. Some were simply performers, but many also composed original works. Among the more famous rakugoka of the Tokugawa period were performers like Anrakuan Sakuden (1554–1642), the author of the Seisuishō (Laughter to Chase Away Sleep, 1628), a collection of more than 1,000 stories. In Edo (today's Tokyo) there also lived Shikano Buzaemon (1649–1699), who wrote the Shikano Buzaemon kudenbanashi (Oral Instruction Discourses of Shikano Buzaemon) and the Shika no makifude (The Deer's Brush, 1686), a work containing 39 stories, eleven of which are about the kabuki milieu. Tatekawa Enba I (1743–1822) was the author of the Rakugo rokugi (The Six Meanings of Rakugo). Kyoto was the home of Tsuyu no Gorobei I (1643–1703), who is considered the father of the rakugo tradition of the Kamigata area (Kamigata rakugo (上方落語)). His works are included in the Karukuchi tsuyu ga hanashi (Jocular Tsuyu's Stories, date of composition unknown), containing many word games, episodes from the lives of famous literary authors, and plays on the different dialects from the Tokyo, Osaka, and Kyoto areas. Of a similar structure is the Karukuchi gozen otoko (One-liners: An Important Storyteller, date of publication unknown), which collects the stories of Yonezawa Hikohachi I, who lived in Osaka towards the end of the 17th century. An example from Yonezawa Hikohachi's collection: A man faints in a bathing tub.
In the great confusion that follows, a doctor arrives who takes his pulse and calmly gives the instruction: "Pull the plug and let the water out." Once the water has flowed completely out of the tub he says: "Fine. Now put a lid on it and carry the guy to the cemetery." For the poor man is already dead. The joke becomes clearer when one notes that a traditional Japanese bathing tub is shaped like a coffin. Current rakugo artists include Tachibanaya Enzō, Katsura Bunshi VI, Tachibanaya Takezō II, Tatekawa Shinosuke and Hayashiya Shōzō IX. Furthermore, many people regarded as more mainstream comedians originally trained as rakugoka apprentices, even adopting stage names given to them by their masters. Some examples include Akashiya Sanma, Shōfukutei Tsurube II, and Shōfukutei Shōhei. Another famous rakugo performer, Shijaku Katsura II, was known outside Japan for his performances of rakugo in English. English-language rakugo performances have been studied for how they convey traditional Japanese cultural values through adapted scripts, making the art form more accessible while preserving its original narrative style. Titles and repertoire Rakugo stories are generally divided into two categories: classical repertoire stories (koten rakugo, 古典落語) and original stories (shinsaku rakugo, 新作落語). Koten rakugo consists of traditional tales that, in principle, can be adapted and performed by any storyteller. In contrast, shinsaku rakugo refers to new, original works created by individual performers. As the copyright holders, these performers must grant permission before their stories can be performed by others. While some classical repertoire stories are attributed to specific authors, these authors have often been deceased for a considerable time, allowing the stories to enter the shared repertoire. Notable rakugoka See also References Further reading External links
========================================
[SOURCE: https://en.wikipedia.org/wiki/RBG_PAC] | [TOKENS: 8230]
Contents Political activities of Elon Musk Elon Musk has been actively involved in politics, particularly in the United States and Europe, throughout the majority of his business career. Although he historically donated to and voted for both Democrats and Republicans, his political contributions have since shifted almost entirely to right-wing candidates and politicians, and in 2022 he stated outright that he would no longer support Democrats. In the time since, Musk has become more vocal about his views, frequently promoting falsehoods about election fraud. As a result, he has been described as conservative, although he rejects the label and calls himself a moderate. Musk played a significant role in the 2024 United States presidential election by establishing a political action committee (PAC) in support of Donald Trump's campaign, making him the election's largest donor with over US$277 million. Following Trump's 2024 victory, Musk was appointed to co-run a new temporary government organization popularly known as the Department of Government Efficiency (DOGE), serving until his departure in May 2025. In 2024, he started supporting international far-right political parties, activists, and causes. An NBC News analysis found he had, since 2023, boosted far-right political movements seeking to cut immigration and curtail business regulation in at least 18 countries on six continents. His international political activities have been scrutinized, particularly in Europe, with some saying his actions and comments amount to "foreign interference" in domestic affairs. Musk's comments and actions have received increasing criticism from some of the governments and leaders of European countries, in particular regarding his support of Alternative for Germany during the 2025 German federal elections. United States Musk became a U.S. citizen in 2002, founding SpaceX that year using his share of the profits from PayPal's sale to eBay. From 2008 to 2013, Musk flew to Washington, D.C., forty times, according to biographer Ashlee Vance. At a Vanity Fair event with Y Combinator president Sam Altman in 2015, Musk said he was "involved in politics as little as possible". Prior to the 2016 United States presidential election, Musk donated to Hillary Clinton and later said he voted for her. In November 2016, Musk criticized Donald Trump as "not the right guy" in an interview with CNBC. The following month, Trump appointed Musk to his Strategic and Policy Forum. Musk later endorsed Trump for the 2024 United States presidential election. A 2012 report from the Sunlight Foundation, a nonpartisan group that tracks government spending, found that since 2002, SpaceX had spent more than $4 million on lobbying the United States Congress and more than $800,000 in political contributions to Democrats and Republicans. As for Musk specifically, the same report said that "SpaceX's campaign to win political support has been systematic and sophisticated," and that "unlike most tech-startups, SpaceX has maintained a significant lobbying presence in Washington almost since day 1" and that "Musk himself has donated roughly $725,000 to various campaigns since 2002. In 2004, he contributed $2,000 to President George W. Bush's reelection campaign, maxing out (over $100,000) to Barack Obama's reelection campaign and donated $5,000 to Republican Sen. Marco Rubio, who represents Florida, a state critical to the space industry. [...] All told, Musk and SpaceX gave out roughly $250,000 in the 2012 election cycle."
In 2017, Musk made his first public entry into major American politics, speaking at the National Governors Association at the invitation of Brian Sandoval. By January 2017, Musk had met with Trump at Trump Tower to argue for the SpaceX Mars colonization program. That month, he told Gizmodo that he was among several "voices of reason" for Trump. In June, he resigned from his positions on Trump's advisory councils after the United States withdrew from the Paris Agreement. Musk donated US$50,000 to an organization benefiting Republican members of the House of Representatives and an additional US$38,900 to a separate group ahead of the 2018 House of Representatives elections, defending his decision as a way to "maintain dialogue". The following year, he offered an endorsement of Andrew Yang's presidential campaign and tacitly supported Kanye West's campaign in the 2020 presidential election, but said in an interview with Maureen Dowd that he had attempted to convince West to postpone his campaign. Speaking to Kara Swisher in September 2020, Musk said he would consider voting for Trump if Joe Biden were not a viable candidate, though he later said he voted for Biden. In May 2022, Musk stated that the Democratic Party had become the "party of division and hate" and that he would vote for Republicans, later urging voters to vote for Republicans in that year's midterm elections in order to counter a Democratic presidency. In October 2024, The Wall Street Journal reported that Musk had given over US$50 million to Citizens for Sanity, a political organization that targeted Democrats on issues such as medical care for transgender children and illegal immigration. In 2023, Musk publicly endorsed Ron DeSantis's presidential campaign, giving $10 million to the campaign and hosting DeSantis's campaign announcement in a Twitter Spaces event that was marred by technical issues. In August 2023, entrepreneur Vivek Ramaswamy suggested that Musk should serve as a presidential advisor for Ramaswamy's own presidential bid. In January 2024, Musk participated in a forum with Democratic presidential primary challenger Representative Dean Phillips regarding Phillips's presidential bid. According to The Washington Post, Musk expressed support for Trump at an event hosted by businessman Nelson Peltz. In March 2024, The New York Times reported that he had met Trump at Mar-a-Lago; at the time, Musk believed that Biden should be defeated in the 2024 presidential election. Approximately thirty minutes after a would-be assassin shot and wounded Trump at a campaign rally near Butler, Pennsylvania, in July, Musk endorsed Trump. He praised Trump's decision to name Senator JD Vance of Ohio as his running mate. During the transition period after Trump was re-elected, Musk became omnipresent at Mar-a-Lago, where his activities led The Washington Post to refer to him as "somewhere between unofficial co-president and 'first buddy'". In December, after Musk helped to kill a government funding bill, his outsize influence led some politicians and political commentators to start calling him "President Musk", or variants like "Shadow President Elon Musk". AI-generated images using this theme circulated on social media, such as one showing Musk being inaugurated while Trump held the Bible on which Musk took his oath. Another, titled "President Musk", showed Trump as a puppet and Musk as the puppeteer.
Cartoonists also created works with Musk as president, accompanied by Trump. He has been regarded as an éminence grise by various sources. Musk attended Trump's inauguration in the United States Capitol rotunda with his son X Æ A-Xii, seated in close proximity to Trump. After the inauguration, he spoke at Trump's inaugural rally at Capital One Arena. During his speech, Musk clasped his right hand against his chest, thrust his arm out above his head with his palm facing down, and turned around to perform the gesture to the audience behind him before saying, "My heart goes out to you." The gesture led to an unresolved controversy over whether or not he was giving a Nazi salute. In February 2025, Musk attended the Conservative Political Action Conference. Argentine President Javier Milei gave Musk a chainsaw, a symbol that Milei has used in campaigns for cutting public spending. In April 2025, Musk engaged in a very public dispute over tariffs with other figures in the Trump administration, particularly Peter Navarro, whom Musk called "truly a moron" and "dumber than a sack of bricks." Navarro had labeled Tesla a "car assembler" as opposed to a car manufacturer and dismissed Musk's opinions on tariffs as self-interest. When asked to comment on the dispute, Press Secretary Karoline Leavitt responded, "Boys will be boys, and we will let their public sparring continue." In September 2025, Musk called the Anti-Defamation League a "hate group" and accused it of being anti-Christian in nature. In October 2025, Musk publicly described New York Democratic mayoral candidate Zohran Mamdani as "the future of the Democratic Party". In November he described Mamdani as "a charismatic swindler". In June 2024, The New York Times revealed that a super PAC known as America PAC had spent at least US$6.6 million since May 2024 to support Trump in the 2024 election. A website had been established collecting voters' information and encouraging them to vote. The Wall Street Journal reported in July that Musk said he would commit a monthly donation of US$45 million to the super PAC; after the Journal published its article, Musk disputed its accuracy. In October, a political action committee known as RBG PAC, named in reference to the deceased Supreme Court justice Ruth Bader Ginsburg, spent at least US$20 million on advertisements associating Trump with Ginsburg and her support for abortion rights, according to a Federal Election Commission (FEC) filing. Clara Spera, Ginsburg's granddaughter, criticized the group in a statement; Ginsburg had denounced Trump in an interview with The New York Times in 2016 and had not wanted to be replaced on the Court until Trump was out of office. An FEC filing in December revealed that Musk was RBG PAC's only donor, with a contribution of $20.5 million. By the end of Trump's presidential campaign, Musk had spent $277 million supporting Trump and allied Republicans, making him the largest individual political donor of the 2024 election and the largest individual political donor since at least 2010 outside of candidates funding their own campaigns. Musk's donations primarily went to his super PAC, America PAC. Musk also launched a $1 million-a-day giveaway for swing-state voters that the Justice Department warned could potentially violate federal election laws.
In March 2025, the Musk-backed Building America's Future poured considerable ad money into the Wisconsin Supreme Court race, favoring Trump-aligned Republican Brad Schimel over Democrat Susan Crawford, in a contest expected to be the most expensive state supreme court race ever. Crawford subsequently defeated Schimel in the election. On August 12, 2024, Musk held a discussion on X Spaces with Trump in which he suggested a "government efficiency commission" that he would serve on to ensure taxes were spent appropriately, an idea that Trump supported. The following month, The Washington Post reported that Trump had discussed the concept of a commission, led by business executives, to regulate government spending. At a speech before the Economic Club of New York that month, Trump publicly called for a commission to audit the federal government that would be led by Musk. After his victory in the 2024 election in November, Trump announced that he would establish the Department of Government Efficiency and appoint Musk and entrepreneur Vivek Ramaswamy to lead it. Prior to its formation, ideological differences between the men led Ramaswamy, who went on to enter the 2026 Ohio gubernatorial race, to exit the initiative. In the days leading up to Trump's inauguration, representatives from the Department of Government Efficiency were sent to several federal agencies. By the inauguration, Musk had received a government email address and was prepared to begin working in the West Wing, according to The New York Times; prior reporting had indicated Musk would operate the Department of Government Efficiency from the Eisenhower Executive Office Building, but he had objected to an apparent lessened level of access to Trump. In an executive order Trump signed after his inauguration, the United States Digital Service was renamed the United States DOGE Service and a temporary organization was established beneath the DOGE Service. Each federal agency was to be assigned a "DOGE team" of special government employees, a classification of temporary workers. DOGE reportedly took over agencies and gained access to sensitive data, including at the Treasury Department, the Office of Personnel Management, and the General Services Administration. DOGE also reportedly shut down USAID. Musk claimed credit for the shutdown of the agency, saying that he and his colleagues "spent the weekend feeding USAID into the woodchipper." Potentially violating the Impoundment Control Act, Musk also boasted on X of shutting down payments that had been approved by Congress. In a call hosted on X on February 3 with senators Joni Ernst and Mike Lee, Musk proposed the non-enforcement or repeal of all federal regulations as a default position of the Trump administration. After Musk took questions about DOGE in the Oval Office on February 11, sometimes holding his four-year-old son on his shoulders, cartoonists portrayed Musk holding Trump on his shoulders like a child. The February 24 Time magazine cover portrayed Musk sitting behind the Resolute desk as if he were president. Musk departed from the U.S. government on May 30, 2025. In July 2025, Musk announced the creation of a new political party, the America Party, widely seen as a rebuke to President Donald Trump amid their ongoing feud. Musk stated that the new party would focus on deficit reduction and initially target just a few significant political races. He has advocated for the party to be fiscally conservative.
Musk has remained vague on specifics about the platform, with pundits claiming the party's platform will simply reflect his personal views. In August 2025, Musk reportedly reconsidered the formation of the party and contemplated backing JD Vance in the 2028 presidential election. Following the 2024 United States presidential election, a feud between Musk and Trump began in June 2025 over provisions in the then-proposed One Big Beautiful Bill Act. The dispute escalated on June 5 after Trump publicly criticized Musk in a meeting with German chancellor Friedrich Merz. Amid a series of posts on X chastising Trump, Musk proposed a political party to represent "the 80 percent in the middle", attaching a survey allowing users to vote "yes" or "no". The poll ended the following day with 80.4% of the 5.6 million respondents voting yes. Minutes after declaring that a party should be established based on the poll, Musk named it the "America Party". Prior to the party's announcement, several political parties and individuals opposed to the two-party system attempted to garner Musk's support. In an interview with Politico Magazine, Andrew Yang sought to work with him to establish a political party together or to support Yang's Forward Party; Musk had previously endorsed Yang in his 2020 presidential campaign. Libertarian National Committee chair Steven Nekhaila proposed a partnership with Musk in July. According to Reuters, Musk was "serious" about establishing the America Party despite seeking a resolution with Trump and despite outreach from the Trump administration. As the One Big Beautiful Bill Act neared a vote in the Senate in late June, he vowed to form the America Party if the bill passed and promised to support primary challenges to Republicans who voted in favor of the bill; the One Big Beautiful Bill Act returned to the House of Representatives days later, where it passed on July 3 along largely party lines. Musk outlined a potential electoral strategy. He referred to both the Republican and Democratic parties as a "uniparty," criticizing them for failing to offer real alternatives. In the days leading up to his announcement, Musk had discussed a political party with allies in conceptual, but not pragmatic, talks, according to The New York Times. On July 4, Musk held a second survey on X, asking if the American people wanted "independence from the two-party (some would say uniparty) system". The poll ended the following day, July 5, with 65.4% voting in favor of the new party. On the same day, Musk announced that he had established the America Party. United Press International reported on July 5 that the party was not yet registered with the Federal Election Commission, which oversees U.S. federal elections. Several days later, Musk noted on X that one supposed filing was fake. On July 6, it was reported that Mark Cuban and Anthony Scaramucci were "interested" in supporting the party and in offering aid to get it on state ballots. That same day, Treasury Secretary Scott Bessent stated that "I imagine that those boards of directors [of his companies] did not like [Musk's] announcement yesterday, and will be encouraging him to focus on his business activities, not his political activities." In a July 7 post to Truth Social, Donald Trump said he was "saddened to watch Elon Musk go completely 'off the rails,' essentially becoming a TRAIN WRECK over the past five weeks."
On July 30, James Fishback, a Tesla investor, reported that the launch of the America Party had stalled and that Musk had neither filed with the FEC to form the new party nor endorsed any candidates on the America Party ballot line. The following month, Musk reportedly reconsidered forming the party, whose creation would have been damaging to his relationship with the vice president, JD Vance. Musk, who was born in South Africa, is ineligible to run for the presidency or the vice presidency of the United States under the provisions of the United States Constitution. He is eligible to run for other offices, such as United States senator or representative, as well as to be a political party chair. Musk has not stated who would chair the party or how it would be structured. Nate Cohn, a New York Times politics analyst, posited that the America Party could gain legitimacy in the 2028 elections if dissatisfaction with the two-party system mounted over dismal economic conditions attributed to Donald Trump and Joe Biden. Vox argued that Musk could negatively affect the Republican Party in the 2026 elections by focusing on competitive seats in the House of Representatives and the Senate and garnering a coalition of voters who lean futurist and technolibertarian. The New York Times pointed out that, while "opinion polling has long shown that Americans are hungry for an option beyond the two major political parties", this has not translated into genuine support for third parties. Efforts by idealistic reformists to create a third party, including Ross Perot and his Reform Party in the 1990s, and the more recent Unite America and No Labels movements, as well as Andrew Yang's Forward Party, have all failed. In the same article, The New York Times also highlighted that Musk's claims of fundamentally reshaping American politics with a third party "suggest he had spent little time studying state ballot-access and federal campaign-finance laws", noting the "labyrinthine system" required to get a name on the ballot in individual states, let alone all 50, and pointing out that New York bans parties from using the word "American" in their names. The New York Times concluded that it would have been easier, and cheaper, for Musk to attempt a hijacking of the Libertarian Party, which already has elements loyal to Musk, than to start his own political party. MSNBC likewise stated, with regard to Musk's party, that "Unlimited money won't make up for not understanding election laws, political science or American history", pointing out that the best third-party performance in recent years was Gary Johnson's in 2016, which earned 3.3% of the vote. MSNBC also cited Yang and the Forward Party, Perot and the Reform Party, and No Labels, as well as the failed Unity '08 and 2012 Americans Elect campaigns and Robert F. Kennedy Jr.'s 2024 bid for president. MSNBC concluded that Musk was centering his entire party on winning over Trump-supporting Republicans, and that this would never happen as long as Trump was his main opponent. Democratic support for Musk's third party includes Dean Phillips, who finished in second place among the candidates in the 2024 Democratic presidential primaries and was a U.S. representative from Minnesota's 3rd congressional district. Phillips has personally stated an interest in meeting with Musk to discuss his plans for a third party, having recently cited concerns that Democrats further to the left, such as Zohran Mamdani, are a "grave threat" to the party.
Alex Burns, writing for POLITICO, however, remained optimistic that Musk might find success if he staked out clear, partisan positions on issues, calling them "targets of opportunity." Burns suggested various policies that Musk could pursue, including opposing Trump's tariffs and "championing free trade", promoting fiscal conservatism, and advocating for technocratic interests. In early July, Musk stated that the America Party could focus on "two or three Senate seats" and "eight to ten House districts" to serve as the "deciding vote on contentious laws" and represent the general will. He later suggested that the party would run in the 2026 elections, comparing his strategy to that used by the Greek general Epaminondas in the Battle of Leuctra, a "concentrated force at a precise location on the battlefield". Although Epaminondas won the Battle of Leuctra, his success did not last, and he was later killed at Mantinea. Musk also stated that the party would caucus separately from both the Democratic and the Republican parties and that "legislative discussions would be had with both parties" afterward. The prospect of Musk establishing a viable third party is widely perceived as difficult. The Washington Post noted that, although Musk has sufficient wealth to establish a political party, complications in his business career, his inability to influence the 2025 Wisconsin Supreme Court election, and his declining popularity as a result of his work at the Department of Government Efficiency could hinder his efforts. The America Party would also face the practical challenge of ballot access, with each state having its own rules. A YouGov poll released on July 14 showed that although 45% of Americans thought a viable third party was "necessary", only 11% of Americans would support Musk's party. A July 15 poll by Echelon Insights, which asked which party respondents would vote for, found that only 4.95% of voters would choose Musk's party, and that this support was drawn almost exclusively from the Republican Party, allowing the Democrats to win by a comfortable margin. A Quinnipiac University poll released on July 16 showed that 77% of Americans would not consider joining the party, with only 17% saying they would. A CNN poll released on July 17 showed that the vast majority of both adults (74%) and voters in general (77%) opposed Musk's new party, and that only 17% of voters would consider joining it. CNN compared the results to Ross Perot's, noting that, unlike with Musk, more than 50% of Americans had supported Perot forming the Reform Party, while only 37% of voters opposed it. Argentina Following the 2023 Argentine general election, Musk congratulated Javier Milei on his victory, writing that "prosperity is ahead for Argentina". Prior to his inauguration, Musk met with Milei; in a televised interview, Milei said that Musk was "extremely interested in lithium". In February 2024, Ente Nacional de Comunicaciones authorized Starlink, among other satellite internet providers, to operate in the country. Milei has sought to encourage lithium investment in Argentina, including through a law that would give foreign investors in the mining industry tax cuts and various benefits for thirty years.
In April, Milei and Musk met at a Gigafactory in Austin, Texas, where they agreed to hold a "big event" in Argentina to promote freedom; The Wall Street Journal noted that the meeting followed praise that Musk had offered for Milei, including for his speech at the World Economic Forum, as Musk sought to strengthen relations to gain access to Argentina's lithium resources and the country's approval for Starlink. According to then-ambassador to the United States Gerardo Werthein, Milei and Musk discussed lithium during the meeting. In May, Musk posted "I recommend investing in Argentina" on X after meeting with Milei in Los Angeles for an investors conference. Canada In January 2025, Musk stepped into Canadian national politics, praising the Conservative Party's Pierre Poilievre and mocking outgoing Prime Minister Justin Trudeau of the Liberal Party. Canadians angry with Musk over his influence in Trump's administration, particularly the attack on Canadian sovereignty, have called for the government to revoke Musk's Canadian citizenship and passport. Musk responded in a since-deleted tweet that "Canada is not a real country." A petition to the federal government sponsored by MP Charlie Angus ran from February to June 2025 and garnered nearly 377,000 signatures from Canadian citizens. China and Taiwan Musk's companies, including Tesla and SpaceX, have faced stiff competition in China, where Musk has sought to develop his ventures. The Taiwanese government began discussing the use of Starlink in the country in 2019, but discussions later faltered after SpaceX representatives pressured government officials to change a majority-ownership law requiring telecommunications ventures to be owned by a local business. According to Bloomberg News, Musk sought to own the entire venture. In response to the instability of Starlink in the Russo-Ukrainian War, Taiwan began to develop its own communications network without SpaceX. Then-president Tsai Ing-wen pledged NT$40 billion for an Internet network managed by the Taiwanese government, later partnering with the Luxembourgish network company SES. In November 2023, Chunghwa Telecom announced a partnership with the French satellite operator Eutelsat. In September, Musk compared Taiwan to Hawaii, claiming that the island is an "integral part" of China; Joseph Wu, Taiwan's then-foreign minister, said that Taiwan was "certainly not for sale". Germany Musk has prominently endorsed the far-right German political party Alternative für Deutschland (AfD), in particular through activities on X. While initial endorsements of party-adjacent social media accounts started as early as late 2023, Musk began actively and openly promoting the party in late 2024, in the run-up to the 2025 German federal elections, through various activities. In December 2024, Musk responded to a tweet by German far-right influencer Naomi Seibt with "Only the AfD can save Germany". He reiterated his endorsement shortly after, adding the claim that other political parties had "utterly failed the people". Following the 2024 Magdeburg car attack, shortly after his initial endorsement of AfD, Musk called Olaf Scholz, chancellor of Germany at the time, a "fool" in a tweet on the platform and called for his resignation. A few days later, on New Year's Day, he also attacked Frank-Walter Steinmeier, calling him an "anti-democratic tyrant".
In January 2025, Musk agreed to host a Space with Alice Weidel, the leader of AfD and the party's expected chancellor candidate at the time, to promote the party and its positions. In it, responding to concerns about AfD being right-wing, Weidel advanced the thesis that Adolf Hitler was actually a communist, not right-wing. The claim, and Musk's failure to contradict it, were heavily criticized. The stream itself was monitored by European authorities and led to investigations into whether Musk's promotion of the party violated the Digital Services Act, specifically whether X's ranking algorithms unlawfully favored AfD-adjacent accounts and content. Prior to his official endorsement, there had been several other instances, dating to at least late 2023, of Musk interacting with X users who promoted AfD, including high-ranking party members. Notably, in April 2024, Björn Höcke, chair of the AfD's Thuringia branch, was facing trial for using slogans historically employed by Nazi Germany's Sturmabteilung, which is punishable under German law. Musk commented on a post Höcke made on X about the situation, and Höcke responded by further defending his use of the slogans. Höcke's original tweet received over 1,000 responses, a figure observers noted was considerably boosted by Musk's engagement. It was also noted that it was very unusual for Höcke to be tweeting in English instead of German, and that it was unlikely Musk would have noticed Höcke's tweets organically. A few weeks after this incident, in response to a tweet about the 2024 European Parliament elections, Musk voiced doubts about the classification of AfD as a far-right political party, stating that the policies he had heard about "don't sound extremist". In his New Year's Eve address, chancellor Olaf Scholz said that the election would "not be decided by the owners of social media channels". In Der Spiegel, deputy chancellor Robert Habeck accused Musk of mounting a "frontal attack on our democracy". Christiane Hoffmann, a deputy government spokeswoman, accused Musk of attempting to influence the federal election. Musk's endorsement has also been called into question by other business people, including Bill Gates and German investor Frank Thelen, who had previously been a strong supporter of Musk. His involvement in German politics has furthermore led several German companies and other organizations to end their business relationships with him and to close their social media accounts on X. Bundesdatenschau, a project that analyzes the influence of German political parties on X, has observed that AfD and the associated accounts of party leaders such as Alice Weidel gained considerably in reach on the platform in 2025, after Musk's endorsements, compared to 2024. In December 2024, the German daily newspaper of record Die Welt published an op-ed authored by Musk, in which he reiterated the endorsement of Alternative für Deutschland he had previously stated on X, arguing that "traditional parties have failed" and calling into doubt the classification of AfD as far-right. The op-ed was published along with a counterpoint op-ed authored by Jan Philipp Burgard, one of the editors-in-chief of the newspaper. In direct response to the publication, several journalists and Die Welt employees resigned, including Eva Marie Kogel, head of the op-ed department at the time.
Additionally, several editors had opposed the publication beforehand, concerned both that it amounted to paid electoral campaigning for AfD and that it could damage the brand of the newspaper and the core values of its publishing company, Axel Springer SE. The op-ed was widely criticized both by German media and internationally. Some German publications called into question whether the untranslated English version of the op-ed had been generated by xAI's own AI model Grok. In response to the criticism and resignations at Die Welt, Burgard and Ulf Poschardt, editor-in-chief at the time of the op-ed's publication, defended the publication as an expression of freedom of speech. During a campaign rally of Alternative für Deutschland in January 2025, Musk also joined by video call for a short speech. In it, he reiterated the statements of support for the party he had previously made on X and in his Die Welt op-ed, and called into question the European Union's regulatory reach. In addition, Musk criticized Germany's culture of remembrance, stating that there is "too much of a focus on past guilt and we need to move beyond that". Following Musk's speech, Weidel thanked him for his statements, praised the second presidency of Donald Trump, with which Musk is closely associated, and ended her statement with "Make Germany Great Again". Musk's statements on remembrance culture were widely criticized, in particular due to their timing shortly before the 80th anniversary of the liberation of the Auschwitz concentration camp. Olaf Scholz condemned Musk's comments as well as his broader promotion of European far-right parties. Donald Tusk, prime minister of Poland, called Musk's words on remembrance culture "too familiar and ominous". Steffen Seibert, German ambassador to Israel at the time, stated that "Musk doesn't seem to know [Germany] well at all" and that, contrary to Musk's claims, people are not made to feel guilty for crimes committed by the Nazis. Israel In September 2023, prime minister Benjamin Netanyahu met with Musk at a Tesla factory in Fremont, California. In November, Musk and Netanyahu toured Kfar Aza, a kibbutz that was attacked by Hamas during the October 7 attacks. Netanyahu invited Musk to attend his address to the United States Congress in July 2024. In December, president Isaac Herzog called Musk to discuss a resolution to the Gaza war hostage crisis. Iran In November 2024, The New York Times reported that Musk had met Amir Saeid Iravani in New York to discuss Iran–United States relations. In January 2025, Italian journalist Cecilia Sala was released from an Iranian prison; this was followed by the release from an Italian prison of Iranian engineer Mohammad Abedini Najafabadi, who had been detained at the request of the United States Department of Justice for allegedly producing drone technology used in the Tower 22 drone attack. According to The New York Times, Musk contacted Iravani to seek Sala's release. Italian prime minister Giorgia Meloni denied Musk's involvement in her release, citing a "diplomatic triangulation" between the U.S., Italy, and Iran. Italy In April 2025, Musk joined by video call a meeting of the populist right-wing Lega party, led by Deputy Prime Minister Matteo Salvini. Russia and Ukraine In February 2022, following the Russian invasion of Ukraine, Ukrainian minister Mykhailo Fedorov requested that Musk activate Starlink in the country. Additional Starlink terminals arrived two days later. In September, Musk refused to activate Starlink coverage over Crimea, causing several Ukrainian drone boats to lose connectivity.
By October, Musk publicly questioned the sustainability of financing Starlink in Ukraine, but said he would continue funding access. That month, Musk posted a poll on Twitter outlining a peace plan that included the cession of Ukrainian territory to Russia; in response, Ukrainian president Volodymyr Zelensky posted his own poll criticizing him. In February 2023, Musk stated that SpaceX would prohibit Starlink's use in long-range drone strikes. According to Fedorov, Musk provided his private messages to Walter Isaacson for Isaacson's biography Elon Musk. In October 2024, The Wall Street Journal reported that Musk had maintained regular contact with Russian president Vladimir Putin for at least two years. In one discussion, Putin asked Musk not to activate Starlink over Taiwan as a favor to Xi Jinping, the general secretary of the Chinese Communist Party. The following month, U.S. senators Jack Reed and Jeanne Shaheen of the Senate Committee on Armed Services sent a letter to U.S. attorney general Merrick Garland and Robert Storch, the inspector general of the Department of Defense, requesting that they reassess Musk's security clearance. United Kingdom Musk has been a primary backer of Rupert Lowe's Restore Britain movement. In December 2024, the leader of the right-wing populist party Reform UK, Nigel Farage, stated that the party was discussing a donation from Musk after they met at Mar-a-Lago with party treasurer Nick Candy. The negotiations were denounced by Conservative Party leader Kemi Badenoch, who stated that the donation would be "counterproductive". In an interview with The Telegraph that month, Farage argued that Musk could court the youth vote for the party and assist in defeating the Conservative Party. In January 2025, Musk expressed support for Tommy Robinson, a far-right anti-Islam activist who was sentenced to over a year in prison on contempt of court charges. In response, several politicians who supported Brexit urged allies of president-elect Donald Trump not to endorse Robinson. Musk subsequently paid Robinson's legal fees when he was prosecuted for failure to disclose his device security credentials under anti-terrorism law (Robinson was acquitted). After Farage disavowed Robinson, Musk abruptly disparaged him, saying the Reform UK leader did not have "what it takes". On July 29, 2024, a mass stabbing occurred at a dance studio in Southport in which three children were killed and ten other people were injured, later resulting in violent disorder across the country. After Prime Minister Keir Starmer said "large social media companies and those who run them" were contributing to the disorder, Musk criticised him for not condemning all participants in the riots and only blaming the far-right. Musk also responded to a tweet which said the riots were due to "mass migration and open borders" by tweeting, "Civil war is inevitable". His comments were condemned by Starmer's official spokesman. Musk further said Starmer was responsible for a "two-tier" policing system which did not protect all communities in the United Kingdom, and subsequently shared a conspiracy theory that Starmer's government was planning to build detainment camps in the Falkland Islands to hold far-right rioters. In response, Starmer said: "my focus is on ensuring our communities are safe. That is my sole focus. I think it's very important for us all to support the police in what they're doing".
In January 2025, Musk began commenting on several child abuse scandals in the UK, including the Rotherham child sexual exploitation scandal, in which many young girls were exploited by predominantly British Pakistani men. Musk said that Jess Phillips, Parliamentary Under-Secretary of State for Safeguarding and Violence Against Women and Girls, "deserves to be in prison" after she denied requests for a public inquiry into child sexual exploitation in Oldham, Greater Manchester. He claimed that Phillips was a "rape genocide apologist" for rejecting calls for a national investigation. In addition, Musk castigated Keir Starmer, who had served as the director of public prosecutions when the abuses were publicized, though Starmer had published guidelines about handling the sexual exploitation of children. During an event for the National Health Service, Starmer indirectly denounced Musk for "lies and misinformation". Musk has continued to amplify fears about immigrants raping children, comparing the convicted racist extremist Tommy Robinson to a protective warrior of Gondor whom the British people must get behind or die: "When Tolkien wrote about the hobbits, he was referring to the gentlefolk of the English shires, who don't realize the horrors that take place far away. They were able to live their lives in peace and tranquility, but only because they were protected by the hard men of Gondor. What happened to the nice man who was brutally murdered while walking his dog will happen to all of England if the tide of illegal immigration is not turned. It is time for the English to ally with the hard men, like Tommy Robinson, and fight for their survival or they shall surely all perish." Spain In early 2025, Elon Musk publicly endorsed Spain's far-right party, Vox, predicting its success in the country's upcoming elections and stating "Vox will win the next election." Musk expressed his support through a series of posts on X (formerly known as Twitter), where he praised the party's stances on immigration and economic deregulation and its opposition to European Union policies that he deemed restrictive to technological innovation. His endorsement sparked significant debate in Spain and across Europe, as political analysts viewed it as part of his broader engagement with right-wing movements globally. Musk's comment led to speculation about potential business interests tied to his endorsement, particularly regarding Spain's lithium mining industry, which is crucial for Tesla, Inc.'s electric vehicle battery production. Some analysts suggested that a Vox-led government might pursue policies more favorable to Musk's business operations in Europe, including regulatory adjustments or tax incentives for tech investments. However, neither Musk nor Vox confirmed any direct discussions about such agreements. The endorsement also drew criticism from political opponents, who accused Musk of using his vast online influence to interfere in European democratic processes. Spanish Prime Minister Pedro Sánchez dismissed Musk's comments as foreign meddling, while European Union officials expressed concerns about the growing role of tech billionaires in shaping political discourse. Despite the controversy, Vox leaders welcomed Musk's support, leveraging it in their campaign to appeal to younger, business-oriented voters. References
========================================
[SOURCE: https://en.wikipedia.org/wiki/Non-player_character#cite_note-10] | [TOKENS: 1785]
Contents Non-player character A non-player character (NPC) is a character in a game that is not controlled by a player. The term originated in traditional tabletop role-playing games, where it applies to characters controlled by the gamemaster, or referee, rather than by another player. In video games, this usually means a computer-controlled character that has a predetermined set of behaviors that potentially will impact gameplay, but will not necessarily be the product of true artificial intelligence. Role-playing games In traditional tabletop role-playing games (RPG) such as Dungeons & Dragons, an NPC is a character portrayed by the gamemaster (GM). While the player characters (PCs) form the narrative's protagonists, non-player characters can be thought of as the "supporting cast" or "extras" of a roleplaying narrative. Non-player characters populate the fictional world of the game, and can fill any role not occupied by a player character. Non-player characters might be allies, bystanders, or competitors to the PCs. NPCs can also be traders who exchange currency for things such as equipment or gear. NPCs vary in their level of detail. Some may be only a brief description ("You see a man in a corner of the tavern"), while others may have complete game statistics and backstories. There is some debate about how much work a gamemaster should put into an important NPC's statistics; some players prefer to have every NPC completely defined with stats, skills, and gear, while others define only what is immediately necessary and fill in the rest as the game proceeds. There is also some debate regarding the importance of fully defined NPCs in any given role-playing game, but there is consensus that the more "real" the NPCs feel, the more fun players will have interacting with them in character. In some games and in some circumstances, a player who is without a player character can temporarily take control of an NPC. Reasons for this vary, but often arise from the player not maintaining a PC within the group and playing an NPC for a session, or from the player's PC being unable to act for some time (for example, because the PC is injured or in another location). Although these characters are still designed and normally controlled by the gamemaster, when players are allowed to temporarily control these non-player characters, it gives them another perspective on the plot of the game. Some systems, such as Nobilis, encourage this in their rules. Many game systems have rules for characters maintaining allies in the form of NPC followers, hired hands, or other dependents subordinate to the PC (player character). Characters may sometimes help in the design, recruitment, or development of NPCs. In the Champions game (and related games using the Hero System), a character may have a DNPC, or "dependent non-player character". This is a character controlled by the GM, but for which the player character is responsible in some way, and who may be put in harm's way by the PC's choices. Video games The term "non-player character" is also used in video games to describe entities not under the direct control of a player. The term carries a connotation that the character is not hostile towards players; hostile characters are referred to as enemies, mobs, or creeps. NPC behavior in computer games is usually scripted and automatic, triggered by certain actions or dialogue with the player characters.
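The scripted, trigger-driven behavior described above can be illustrated with a minimal sketch. The event names, NPC data, and dialogue lines below are hypothetical and not taken from any particular game or engine; the sketch only shows the general pattern of mapping game events to predetermined responses.

```python
# Minimal sketch of scripted, trigger-driven NPC behavior (hypothetical example).
# Each NPC maps named game events ("triggers") to predetermined, scripted reactions.

class ScriptedNPC:
    def __init__(self, name, scripts):
        self.name = name
        self.scripts = scripts  # dict: event name -> canned reaction

    def on_event(self, event):
        """Return the scripted reaction for an event, or a default idle behavior."""
        reaction = self.scripts.get(event, "stands around idly")
        return f"{self.name} {reaction}"

# Hypothetical guard NPC: it reacts only to the events it has a script for.
guard = ScriptedNPC(
    "Town Guard",
    {
        "player_enters_gate": 'says: "Halt! State your business."',
        "player_draws_weapon": "raises the alarm and attacks",
        "player_offers_bribe": 'says: "Move along... I saw nothing."',
    },
)

print(guard.on_event("player_enters_gate"))  # scripted greeting
print(guard.on_event("player_dances"))       # unscripted event, falls back to idle
```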
In certain multiplayer games (Neverwinter Nights and the Vampire: The Masquerade series, for example) a player that acts as the GM can "possess" both player and non-player characters, controlling their actions to further the storyline. More complex games, such as the aforementioned Neverwinter Nights, allow the player to customize the NPCs' behavior by modifying their default scripts or creating entirely new ones. In some online games, such as massively multiplayer online role-playing games, NPCs may be entirely unscripted, and are essentially regular character avatars controlled by employees of the game company. These "non-players" are often distinguished from player characters by avatar appearance or other visual designation, and often serve as in-game support for new players. In other cases, these "live" NPCs are virtual actors, playing regular characters that drive a continuing storyline (as in Myst Online: Uru Live). In earlier RPGs, NPCs only had monologues. This is typically represented by a dialogue box, floating text, a cutscene, or other means of displaying the NPCs' speech or reaction to the player. NPC speeches of this kind are often designed to give an instant impression of the character of the speaker, providing character vignettes, but they may also advance the story or illuminate the world around the PC. Similar to this is the most common form of storytelling, non-branching dialogue, in which the means of displaying NPC speech are the same as above, but the player character or avatar responds to or initiates speech with NPCs. In addition to the purposes listed above, this enables the development of the player character. More advanced RPGs feature interactive dialogue, or branching dialogue (dialogue trees). Examples are the games produced by Black Isle Studios and White Wolf, Inc.; every one of their games features multiple-choice dialogue. When talking to an NPC, the player is presented with a list of dialogue options and may choose between them. Each choice may result in a different response from the NPC. These choices may affect the course of the game, as well as the conversation. At the least, they give the player a reference point for their character's relationship with the game world. Ultima is an example of a game series that has advanced from non-branching (Ultima III: Exodus and earlier) to branching dialogue (from Ultima IV: Quest of the Avatar on). Other role-playing games with branching dialogues include Cosmic Soldier, Megami Tensei, Fire Emblem, Metal Max, Langrisser, SaGa, Ogre Battle, Chrono, Star Ocean, Sakura Wars, Mass Effect, Dragon Age, Radiant Historia, and several Dragon Quest and Final Fantasy games. Certain video game genres revolve almost entirely around interactions with non-player characters, including visual novels such as Ace Attorney and dating sims such as Tokimeki Memorial, usually featuring complex branching dialogues and often presenting the player's possible responses word-for-word as the player character would say them. Games revolving around relationship-building, including visual novels, dating sims such as Tokimeki Memorial, and some role-playing games such as Persona, often give choices that carry different numbers of associated "mood points", which influence a player character's relationship and future conversations with a non-player character.
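As a rough illustration of the branching dialogue and "mood point" mechanics described above, the sketch below models a small dialogue tree in which each player choice produces a different NPC response and adjusts an accumulated relationship score. The node names, lines, and point values are invented for illustration and are not drawn from any specific game.

```python
# Hypothetical sketch of a branching dialogue tree with "mood points".
# Each node holds the NPC's line and the player's choices; every choice carries
# a mood delta and the id of the next node (None ends the conversation).

DIALOGUE = {
    "greet": {
        "npc": "You again. What do you want?",
        "choices": [
            ("Just saying hello.",         +1, "smalltalk"),
            ("Hand over your stock list.", -2, "offended"),
        ],
    },
    "smalltalk": {
        "npc": "Hmph. Polite, at least. Maybe I'll give you a discount.",
        "choices": [("Thanks, see you around.", +1, None)],
    },
    "offended": {
        "npc": "Find another merchant.",
        "choices": [("Fine.", 0, None)],
    },
}

def run_dialogue(start="greet", pick=lambda choices: 0, mood=0):
    """Walk the tree, applying mood deltas; `pick` selects a choice index."""
    node_id = start
    while node_id is not None:
        node = DIALOGUE[node_id]
        print("NPC:", node["npc"])
        text, delta, next_id = node["choices"][pick(node["choices"])]
        print("You:", text)
        mood += delta  # accumulated mood can influence later conversations
        node_id = next_id
    return mood

relationship = run_dialogue()  # this sketch always picks the first option
print("Relationship score:", relationship)
```

In a full game the returned relationship score would typically be stored and consulted when the same NPC is approached again, which is the role the "mood points" described above play in dating sims and relationship-driven RPGs.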
These games often feature a day-night cycle with a time scheduling system that provides context and relevance to character interactions, allowing players to choose when and if to interact with certain characters, which in turn influences their responses during later conversations (see the sketch at the end of this section). In 2023, Replica Studios unveiled its AI-developed NPCs for the Unreal Engine 5, in cooperation with OpenAI, which enable players to have an interactive conversation with non-playable characters. "NPC streaming" (livestreaming while mimicking the behaviors of an NPC) became popular on TikTok in 2023 and was largely popularized by livestreamer Pinkydoll. Other usage From around 2018, the term NPC became an insult, primarily online, to suggest that a person is unable to form thoughts or opinions of their own. This is sometimes illustrated with a grey-faced, expressionless version of the Wojak meme. Monetization NPC streaming is a type of livestream that allows users to participate in and shape the content they are viewing in real time. It has become widely popular as influencers and users of social media platforms such as TikTok utilize livestreams to act as non-player characters. "Viewers in NPC live streams take on the role of puppeteers, influencing the creator's next move." The phenomenon has been on the rise as viewers become actively involved in what they are watching by purchasing digital "gifts" and sending them directly to the streamer; in return, the streamer will briefly mimic a character or act. NPC streaming became a trend in July 2023, as influencers began to profit from this new kind of internet character. Pinkydoll, a TikTok influencer, gained 400,000 followers the same month that she started NPC streaming, while her livestreams began to earn her as much as $7,000 in a day. NPC streaming gives creators a new avenue to earn money online. Despite this, some creators have quit because of the stigma that comes with the strategy. For example, Malik Ambersley, a pioneer of the NPC trend, has been robbed, accosted by police, and gotten into fights due to the controversial nature of his act.
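The day-night scheduling mechanic referenced at the start of this passage can be sketched minimally as follows; the NPC names, availability windows, and interface are hypothetical and not taken from any particular game.

```python
# Minimal sketch of a day-night schedule that gates NPC interactions.
# NPC names, availability windows and the API are hypothetical.

SCHEDULE = {
    # npc -> list of (start_hour, end_hour) windows when they can be talked to
    "shopkeeper":  [(9, 18)],
    "night_guard": [(20, 24), (0, 6)],
}

def available(npc: str, hour: int) -> bool:
    """Return True if the NPC can be interacted with at this in-game hour."""
    return any(start <= hour < end for start, end in SCHEDULE.get(npc, []))

for hour in (10, 22):
    print(hour, {npc: available(npc, hour) for npc in SCHEDULE})
# Whether (and when) the player chooses to visit an NPC could then feed into
# later conversations, for example by recording kept or missed appointments.
```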
========================================
[SOURCE: https://en.wikipedia.org/wiki/Lithium-ion_battery] | [TOKENS: 15775]
Contents Lithium-ion battery A lithium-ion battery or Li-ion battery is a type of rechargeable battery that uses the reversible intercalation of Li+ ions into electronically conducting solids to store energy. Compared to other types of rechargeable batteries, they generally have higher specific energy, energy density, and energy efficiency and a longer cycle life and calendar life. In the three decades after Li-ion batteries were first sold in 1991, their volumetric energy density increased threefold while their cost dropped tenfold. In late 2024, global demand passed 1 terawatt-hour per year, while production capacity was more than twice that. The invention and commercialization of Li-ion batteries has had a large impact on technology, as recognized by the 2019 Nobel Prize in Chemistry. Li-ion batteries have enabled portable consumer electronics, laptop computers, cellular phones, and electric cars. They are used for grid-scale energy storage and in military and aerospace applications. M. Stanley Whittingham conceived intercalation electrodes in the 1970s and created the first rechargeable lithium-ion battery, based on a titanium disulfide cathode and a lithium-aluminium anode, although it suffered from safety problems and was never commercialized. John Goodenough expanded on this work in 1980 by using lithium cobalt oxide as a cathode. The first prototype of the modern Li-ion battery, which uses a carbonaceous anode rather than lithium metal, was developed by Akira Yoshino in 1985 and commercialized by a Sony and Asahi Kasei team led by Yoshio Nishi in 1991. Whittingham, Goodenough, and Yoshino were awarded the 2019 Nobel Prize in Chemistry for their contributions to the development of lithium-ion batteries. Lithium-ion batteries can be a fire or explosion hazard as they contain flammable electrolytes. Progress has been made in the development and manufacturing of safer lithium-ion batteries. Lithium-ion solid-state batteries are being developed to eliminate the flammable electrolyte. Recycled batteries can create toxic waste, including from toxic metals, and are a fire risk. Lithium and other minerals can have significant issues in mining, with lithium being water intensive in often arid regions and other minerals used in some Li-ion chemistries potentially being conflict minerals such as cobalt. Environmental issues have encouraged some researchers to improve mineral efficiency and find alternatives such as lithium iron phosphate lithium-ion chemistries or non-lithium-based battery chemistries such as sodium-ion and iron-air batteries. "Li-ion battery” encompasses battery types of at least 12 chemistries. Lithium-ion cells can be manufactured to optimize energy density or power density. Handheld electronics mostly use lithium polymer batteries (with a polymer gel as an electrolyte), a lithium cobalt oxide (LiCoO2) cathode material, and a graphite anode, which together offer high energy density. Lithium iron phosphate (LiFePO4), lithium manganese oxide (LiMn2O4 spinel, or Li2MnO3-based lithium-rich layered materials, LMR-NMC), and lithium nickel manganese cobalt oxide (LiNiMnCoO2 or NMC) may offer longer life and a higher discharge rate. NMC and its derivatives are widely used in the electrification of transport, one of the main technologies (combined with renewable energy) for reducing greenhouse gas emissions from vehicles. History One of the earliest examples of research into lithium-ion batteries is a CuF2/Li battery developed by NASA in 1965. 
The breakthrough that produced the earliest form of the modern Li-ion battery was made by British chemist M. Stanley Whittingham in 1974, who first used titanium disulfide (TiS2) as a cathode material, which has a layered structure that can take in lithium ions without significant changes to its crystal structure. Exxon tried to commercialize this battery in the late 1970s, but found the synthesis expensive and complex, as TiS2 is sensitive to moisture and releases toxic hydrogen sulfide (H2S) gas on contact with water. More prohibitively, the batteries were also prone to spontaneously catch fire due to the presence of metallic lithium in the cells. For this, and other reasons, Exxon discontinued the development of Whittingham's lithium-titanium disulfide battery. In 1980, working in separate groups Ned A. Godshall et al., and, shortly thereafter, Koichi Mizushima and John B. Goodenough, after testing a range of alternative materials, replaced TiS2 with lithium cobalt oxide (LiCoO2, or LCO), which has a similar layered structure but offers a higher voltage and is much more stable in air. This material would later be used in the first commercial Li-ion battery, although it did not, on its own, resolve the persistent issue of flammability. These early attempts to develop rechargeable Li-ion batteries used lithium metal anodes, which were ultimately abandoned due to safety concerns, as lithium metal is unstable and prone to dendrite formation, which can cause short-circuiting. The eventual solution was to use an intercalation anode, similar to that used for the cathode, which prevents the formation of lithium metal during battery charging. The first to demonstrate lithium ion reversible intercalation into graphite anodes was Jürgen Otto Besenhard in 1974. Besenhard used organic solvents such as carbonates, however these solvents decomposed rapidly providing short battery cycle life. Later, in 1980, Rachid Yazami used a solid organic electrolyte, polyethylene oxide, which was more stable. In 1985, Akira Yoshino at Asahi Kasei Corporation discovered that petroleum coke, a less graphitized form of carbon, can reversibly intercalate Li-ions at a low potential of ~0.5 V relative to Li+ /Li without structural degradation. Its structural stability originates from its amorphous carbon regions, which serve as covalent joints to pin the layers together. Although it has a lower capacity compared to graphite (~Li0.5C6, 186 mAh g–1), it became the first commercial intercalation anode for Li-ion batteries owing to its cycling stability. In 1987, Yoshino patented what would become the first commercial lithium-ion battery using this anode. He used Goodenough's previously reported LiCoO2 as the cathode and a carbonate ester-based electrolyte. The battery was assembled in the discharged state, which made it safer and cheaper to manufacture. In 1991, using Yoshino's design, Sony began producing and selling the world's first rechargeable lithium-ion batteries. The following year, a joint venture between Toshiba and Asahi Kasei Co. also released a lithium-ion battery. Significant improvements in energy density were achieved in the 1990s by replacing Yoshino's soft carbon anode first with hard carbon and later with graphite. In 1990, Jeff Dahn and two colleagues at Dalhousie University (Canada) reported reversible intercalation of lithium ions into graphite in the presence of ethylene carbonate solvent (which is solid at room temperature and is mixed with other solvents to make a liquid). 
This represented the final innovation of the era that created the basic design of the modern lithium-ion battery. In 2010, global lithium-ion battery production capacity was 20 gigawatt-hours. By 2016, it was 28 GWh, with 16.4 GWh in China. Global production capacity was 767 GWh in 2020, with China accounting for 75%. Production in 2021 is estimated by various sources to be between 200 and 600 GWh, and predictions for 2023 range from 400 to 1,100 GWh. In 2012, John B. Goodenough, Rachid Yazami and Akira Yoshino received the 2012 IEEE Medal for Environmental and Safety Technologies for developing the lithium-ion battery; Goodenough, Whittingham, and Yoshino were awarded the 2019 Nobel Prize in Chemistry "for the development of lithium-ion batteries". Jeff Dahn received the ECS Battery Division Technology Award (2011) and the Yeager award from the International Battery Materials Association (2016). In April 2025, CATL unveiled its Shenxing Plus battery, the first lithium iron phosphate (LFP) battery claiming a range of over 1,000 km (620 miles) on a single charge. The company also stated the battery supports 4C ultra-fast charging, allowing for 600 km of range to be added in 10 minutes, marking a significant advance in making long-range, fast-charging LFP batteries viable for the mass market. Design Generally, the negative electrode of a conventional lithium-ion cell is made from graphite. The positive electrode is typically a metal oxide or phosphate. The electrolyte is a lithium salt in an organic solvent. The negative electrode (which is the anode when the cell is discharging) and the positive electrode (which is the cathode when discharging) are prevented from shorting by a separator. The electrodes are connected to the powered circuit through two pieces of metal called current collectors. The negative and positive electrodes swap their electrochemical roles (anode and cathode) when the cell is charged. Despite this, in discussions of battery design the negative electrode of a rechargeable cell is often just called "the anode" and the positive electrode "the cathode". In its fully lithiated state of LiC6, graphite correlates to a theoretical capacity of 1339 coulombs per gram (372 mAh/g). The positive electrode is generally one of three materials: a layered oxide (such as lithium cobalt oxide), a polyanion (such as lithium iron phosphate) or a spinel (such as lithium manganese oxide). More experimental materials include graphene-containing electrodes, although these remain far from commercially viable due to their high cost. Lithium reacts vigorously with water to form lithium hydroxide (LiOH) and hydrogen gas. Thus, a non-aqueous electrolyte is typically used, and a sealed container rigidly excludes moisture from the battery pack. The non-aqueous electrolyte is typically a mixture of organic carbonates such as ethylene carbonate and propylene carbonate containing complexes of lithium ions. Ethylene carbonate is essential for making solid electrolyte interphase on the carbon anode, but since it is solid at room temperature, a liquid solvent (such as propylene carbonate or diethyl carbonate) is added. The electrolyte salt is almost always[citation needed] lithium hexafluorophosphate (LiPF6), which combines good ionic conductivity with chemical and electrochemical stability. The hexafluorophosphate anion is essential for passivating the aluminium current collector used for the positive electrode. A titanium tab is ultrasonically welded to the aluminium current collector. 
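As a quick check of the graphite figures quoted above: in fully lithiated LiC6, one lithium ion (and hence one electron) is stored per six carbon atoms, so the theoretical capacity per gram of the carbon host follows directly from the Faraday constant. This worked example uses standard constants and is illustrative rather than reproduced from the article.

```python
# Quick check of the graphite figures quoted above: one Li+ per six carbon atoms.
F = 96485.0            # Faraday constant, C per mol of electrons
M_C6 = 6 * 12.011      # molar mass of the C6 host, ~72.07 g/mol

coulombs_per_gram = F / M_C6              # ~1339 C/g
mah_per_gram = coulombs_per_gram / 3.6    # ~372 mAh/g (1 mAh = 3.6 C)
print(f"{coulombs_per_gram:.0f} C/g = {mah_per_gram:.0f} mAh/g")
```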
Other salts like lithium perchlorate (LiClO4), lithium tetrafluoroborate (LiBF4), and lithium bis(trifluoromethanesulfonyl)imide (LiC2F6NO4S2) are frequently used in research in tab-less coin cells, but are not usable in larger format cells, often because they are not compatible with the aluminium current collector. Copper (with a spot-welded nickel tab) is used as the current collector at the negative electrode. Current collector design and surface treatments may take various forms: foil, mesh, foam (dealloyed), etched (wholly or selectively), and coated (with various materials) to improve electrical characteristics. Depending on materials choices, the voltage, energy density, life, and safety of a lithium-ion cell can change dramatically. Current effort has been exploring the use of novel architectures using nanotechnology to improve performance. Areas of interest include nano-scale electrode materials and alternative electrode structures. The reactants in the electrochemical reactions in a lithium-ion cell are the materials of the electrodes, both of which are compounds containing lithium atoms. Although many thousands of different materials have been investigated for use in lithium-ion batteries, only a very small number are commercially usable. All commercial Li-ion cells use intercalation compounds as active materials. The negative electrode is usually graphite, although silicon is often mixed in to increase the capacity. The electrolyte is usually lithium hexafluorophosphate, dissolved in a mixture of organic carbonates. A number of different materials are used for the positive electrode, such as LiCoO2, LiFePO4, and lithium nickel manganese cobalt oxides. During cell discharge the negative electrode is the anode and the positive electrode the cathode: electrons flow from the anode to the cathode through the external circuit. An oxidation half-reaction at the anode produces positively charged lithium ions and negatively charged electrons. The oxidation half-reaction may also produce uncharged material that remains at the anode. Lithium ions move through the electrolyte; electrons move through the external circuit toward the cathode where they recombine with the cathode material in a reduction half-reaction. The electrolyte provides a conductive medium for lithium ions but does not partake in the electrochemical reaction. The reactions during discharge lower the chemical potential of the cell, so discharging transfers energy from the cell to wherever the electric current dissipates its energy, mostly in the external circuit. During charging these reactions and transports go in the opposite direction: electrons move from the positive electrode to the negative electrode through the external circuit. To charge the cell the external circuit has to provide electrical energy. This energy is then stored as chemical energy in the cell (with some loss, e. g., due to coulombic efficiency lower than 1). Both electrodes allow lithium ions to move in and out of their structures with a process called insertion (intercalation) or extraction (deintercalation), respectively. As the lithium ions "rock" back and forth between the two electrodes, these batteries are also known as "rocking-chair batteries" or "swing batteries" (a term given by some European industries). The following equations exemplify the chemistry (left to right: discharging, right to left: charging). 
The negative electrode half-reaction for the graphite is
LiC6 ⇌ C6 + Li+ + e−
The positive electrode half-reaction in the lithium-doped cobalt oxide substrate is
CoO2 + Li+ + e− ⇌ LiCoO2
The full reaction is
LiC6 + CoO2 ⇌ C6 + LiCoO2
The overall reaction has its limits. Overdischarging supersaturates lithium cobalt oxide, leading to the production of lithium oxide, possibly by the following irreversible reaction:
Li+ + e− + LiCoO2 → Li2O + CoO
Overcharging up to 5.2 volts leads to the synthesis of cobalt (IV) oxide, as evidenced by x-ray diffraction:
LiCoO2 → Li+ + e− + CoO2
The cell's energy is equal to the voltage times the charge. Each gram of lithium represents Faraday's constant/6.941, or 13,901 coulombs. At 3 V, this gives 41.7 kJ per gram of lithium, or 11.6 kWh per kilogram of lithium. This is slightly more than the heat of combustion of gasoline; however, lithium-ion batteries as a whole are still significantly heavier per unit of energy due to the additional materials used in production. Note that the cell voltages involved in these reactions are larger than the potential at which an aqueous solution would electrolyze. During discharge, lithium ions (Li+) carry the current within the battery cell from the negative to the positive electrode, through the non-aqueous electrolyte and separator diaphragm. During charging, an external electrical power source applies an over-voltage (a voltage greater than the cell's own voltage) to the cell, forcing electrons to flow from the positive to the negative electrode. The lithium ions also migrate (through the electrolyte) from the positive to the negative electrode, where they become embedded in the porous electrode material in a process known as intercalation. Energy losses arising from electrical contact resistance at interfaces between electrode layers and at contacts with current collectors can be as high as 20% of the entire energy flow of batteries under typical operating conditions. The charging procedures for single Li-ion cells and complete Li-ion batteries are slightly different: During the constant current phase, the charger applies a constant current to the battery at a steadily increasing voltage, until the top-of-charge voltage limit per cell is reached. During the balance phase, the charger/battery reduces the charging current (or cycles the charging on and off to reduce the average current) while the state of charge of individual cells is brought to the same level by a balancing circuit, until the battery is balanced. Balancing typically occurs whenever one or more cells reach their top-of-charge voltage before the other(s), as it is generally inaccurate to do so at other stages of the charge cycle. This is most commonly done by passive balancing, which dissipates excess charge as heat via resistors connected momentarily across the cells to be balanced. Active balancing is less common, more expensive, but more efficient, returning excess energy to other cells (or the entire pack) via a DC-DC converter or other circuitry. Balancing most often occurs during the constant voltage stage of charging, switching between charge modes until complete. The pack is usually fully charged only when balancing is complete, as even a single cell group lower in charge than the rest will limit the entire battery's usable capacity to that of its own. Balancing can last hours or even days, depending on the magnitude of the imbalance in the battery.
During the constant voltage phase, the charger applies a voltage equal to the maximum cell voltage times the number of cells in series to the battery, as the current gradually declines towards 0, until the current is below a set threshold of about 3% of initial constant charge current. Periodic topping charge about once per 500 hours. Top charging is recommended to be initiated when voltage goes below 4.05 V/cell. Failure to follow current and voltage limitations can result in excessive coulombic heating of the battery, and in the case of overcharge to voltages higher than designed can lead to an explosion. Charging temperature limits for Li-ion are stricter than the operating limits. Lithium-ion chemistry performs well at elevated temperatures but prolonged exposure to heat reduces battery life. Li‑ion batteries offer good charging performance at cooler temperatures and may even allow "fast-charging" within a temperature range of 5 to 45 °C (41 to 113 °F). Charging should be performed within this temperature range. At temperatures from 0 to 5 °C charging is possible, but the charge current should be reduced. During a low-temperature (under 0 °C) charge, the slight temperature rise above ambient due to the internal cell resistance is beneficial. High temperatures during charging may lead to battery degradation and charging at temperatures above 45 °C will degrade battery performance, whereas at lower temperatures the internal resistance of the battery may increase, resulting in slower charging and thus longer charging times.[better source needed] Batteries gradually self-discharge even if not connected and delivering current. Li-ion rechargeable batteries have a self-discharge rate typically stated by manufacturers to be 1.5–2% per month. The rate increases with temperature and state of charge. A 2004 study found that for most cycling conditions self-discharge was primarily time-dependent; however, after several months of stand on open circuit or float charge, state-of-charge dependent losses became significant. The self-discharge rate did not increase monotonically with state-of-charge, but dropped somewhat at intermediate states of charge. Self-discharge rates may increase as batteries age. In 1999, self-discharge per month was measured at 8% at 21 °C, 15% at 40 °C, 31% at 60 °C. By 2007, monthly self-discharge rate was estimated at 2% to 3%, and 2–3% by 2016. By comparison, the self-discharge rate for NiMH batteries dropped, as of 2017, from up to 30% per month for previously common cells to about 0.08–0.33% per month for low self-discharge NiMH batteries, and is about 10% per month in NiCd batteries. Transition metal oxides (TMOs) are widely used as cathode materials in lithium-ion batteries as the variable oxidation state of transition metal cations allows oxides of these metals to reversibly host lithium ions (Li⁺) and undergo efficient redox (reduction-oxidation) reactions. While Oxygen ions are commonly assumed to remain in a 2- oxidation state, the role of oxygen redox in facilitating the lithium insertion is now recognized as instrumental in the performance of lithium ion battery cathodes. The layered or framework structures of TMOs allow Li⁺ insertion/extraction during charging/discharging, while their transition metals and oxygen anions participate in electron transfer, enabling high energy density and stability. Three classes of cathode materials in lithium-ion batteries have been commercialized: (1) layered oxides, (2) spinel oxides and (3) oxoanion complexes. 
All of them were discovered by John Goodenough and his collaborators. LiCoO2 was used in the first commercial lithium-ion battery made by Sony in 1991. The layered oxides have a pseudo-tetrahedral structure comprising layers made of MO6 octahedra separated by interlayer spaces that allow for two-dimensional lithium-ion diffusion.[citation needed] The band structure of LixCoO2 allows for true electronic (rather than polaronic) conductivity. However, due to an overlap between the Co4+ t2g d-band with the O2- 2p-band, the x must be >0.5, otherwise O2 evolution occurs. This limits the charge capacity of this material to ~140 mA h g−1. Several other first-row (3d) transition metals also form layered LiMO2 salts. Some can be directly prepared from lithium oxide and M2O3 (e.g. for M=Ti, V, Cr, Co, Ni), while others (M= Mn or Fe) can be prepared by ion exchange from NaMO2. LiVO2, LiMnO2 and LiFeO2 suffer from structural instabilities (including mixing between M and Li sites) due to a low energy difference between octahedral and tetrahedral environments for the metal ion M. For this reason, they are not used in lithium-ion batteries. However, Na+ and Fe3+ have sufficiently different sizes that NaFeO2 can be used in sodium-ion batteries. Similarly, LiCrO2 shows reversible lithium (de)intercalation around 3.2 V with 170–270 mAh/g. However, its cycle life is short, because of disproportionation of Cr4+ followed by translocation of Cr6+ into tetrahedral sites. On the other hand, NaCrO2 shows a much better cycling stability. LiTiO2 shows Li+ (de)intercalation at a voltage of ~1.5 V, which is too low for a cathode material. These problems leave LiCoO2 and LiNiO2 as the only practical layered oxide materials for lithium-ion battery cathodes. The cobalt-based cathodes show high theoretical specific (per-mass) charge capacity, high volumetric capacity, low self-discharge, high discharge voltage, and good cycling performance. Unfortunately, they suffer from a high cost of the material. For this reason, the current trend among lithium-ion battery manufacturers is to switch to cathodes with higher Ni content and lower Co content. In addition to a lower (than cobalt) cost, nickel-oxide based materials benefit from the two-electron redox chemistry of Ni: in layered oxides comprising nickel (such as nickel-cobalt-manganese NCM and nickel-cobalt-aluminium oxides NCA), Ni cycles between the oxidation states +2 and +4 (in one step between +3.5 and +4.3 V), cobalt- between +2 and +3, while Mn (usually >20%) and Al (typically, only 5% is needed) remain in +4 and 3+, respectively. Thus increasing the Ni content increases the cyclable charge. For example, NCM111 shows 160 mAh/g, while LiNi0.8Co0.1Mn0.1O2 (NCM811) and LiNi0.8Co0.15Al0.05O2 (NCA) deliver a higher capacity of ~200 mAh/g. NCM and NCA batteries are collectively called Ternary Lithium Batteries. It is worth mentioning so-called "lithium-rich" cathodes that can be produced from traditional NCM (LiMO2, where M=Ni, Co, Mn) layered cathode materials upon cycling them to voltages/charges corresponding to Li:M<0.5. Under such conditions a new semi-reversible redox transition at a higher voltage with ca. 0.4-0.8 electrons/metal site charge appears. This transition involves non-binding electron orbitals centered mostly on O atoms. Despite significant initial interest, this phenomenon did not result in marketable products because of the fast structural degradation (O2 evolution and lattice rearrangements) of such "lithium-rich" phases. 
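The ~140 mAh/g figure quoted above for LixCoO2 can be checked with a back-of-envelope calculation: removing all of the lithium from LiCoO2 would correspond to roughly 274 mAh/g, and restricting cycling to x > 0.5 roughly halves that. The molar masses below are standard values; the calculation is illustrative, not taken from the article.

```python
# Back-of-envelope check of the ~140 mAh/g figure for LixCoO2 quoted above.
# Only the fraction of lithium between x = 1 and x = 0.5 is cycled.
F = 96485.0                           # C per mol of electrons
M_LICOO2 = 6.94 + 58.93 + 2 * 16.00   # g/mol, approximately 97.9

theoretical = F / (3.6 * M_LICOO2)    # all Li removed: ~274 mAh/g
practical = 0.5 * theoretical         # only half the Li is cyclable: ~137 mAh/g
print(f"theoretical {theoretical:.0f} mAh/g, practical ~{practical:.0f} mAh/g")
```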
LiMn2O4 adopts a cubic lattice, which allows for three-dimensional lithium-ion diffusion. Manganese cathodes are attractive because manganese is less expensive than cobalt or nickel. The operating voltage of a Li/LiMn2O4 battery is 4 V, and ca. one lithium per two Mn ions can be reversibly extracted from the tetrahedral sites, resulting in a practical capacity of <130 mAh/g. However, Mn3+ is not a stable oxidation state, as it tends to disproportionate into insoluble Mn4+ and soluble Mn2+. LiMn2O4 can also intercalate more than 0.5 Li per Mn at a lower voltage around +3.0 V. However, this results in an irreversible phase transition due to Jahn-Teller distortion in Mn3+ (t2g3eg1), as well as disproportionation and dissolution of Mn3+. An important improvement on the Mn spinel is the family of related cubic structures of the LiMn1.5Ni0.5O4 type, where Mn exists as Mn4+ and Ni cycles reversibly between the oxidation states +2 and +4. These materials show a reversible Li-ion capacity of ca. 135 mAh/g around 4.7 V. Although such a high voltage is beneficial for increasing the specific energy of batteries, the adoption of such materials is currently hindered by the lack of suitable high-voltage electrolytes. In general, materials with a high nickel content are favored in 2023, because of the possibility of two-electron cycling of Ni between the oxidation states +2 and +4. LiV2O4 (lithium vanadium oxide) operates at a lower voltage (ca. +3.0 V) than LiMn2O4, suffers from similar durability issues, is more expensive, and thus is not considered of practical interest. Around 1980 Manthiram discovered that oxoanions (molybdates and tungstates in that particular case) cause a substantial positive shift in the redox potential of the metal ion compared to oxides. In addition, these oxoanionic cathode materials offer better stability/safety than the corresponding oxides. However, they also suffer from poor electronic conductivity due to the long distance between redox-active metal centers, which slows down electron transport. This necessitates the use of small (less than 200 nm) cathode particles and coating each particle with a layer of electronically conducting carbon, which reduces the packing density of these materials. Although numerous combinations of oxoanions (sulfate, phosphate, silicate) with various metals (mostly Mn, Fe, Co, Ni) have been studied, LiFePO4 is the only one that has been commercialized. Although it was originally used primarily for stationary energy storage due to its lower energy density compared to layered oxides, it has begun to be widely used in electric vehicles since the 2020s. Negative electrode materials are traditionally constructed from graphite and other carbon materials, although newer silicon-based materials are being increasingly used (see Nanowire battery). In 2016, 89% of lithium-ion batteries contained graphite (43% artificial and 46% natural), 7% contained amorphous carbon (either soft carbon or hard carbon), 2% contained lithium titanate (LTO) and 2% contained silicon or tin-based materials. These materials are used because they are abundant, electrically conducting and can intercalate lithium ions to store electrical charge with modest volume expansion (~10%). Graphite is the dominant material because of its low intercalation voltage and excellent performance. Various alternative materials with higher capacities have been proposed, but they usually have higher voltages, which reduces energy density.
Low voltage is the key requirement for anodes; otherwise, the excess capacity is useless in terms of energy density. Pure Si can present a capacity density around 4200 mAh/g, but it undergoes a severe volume expansion (>300%), so it is often mixed with graphite. Another approach used carbon-coated 15 nm thick crystal silicon flakes. The tested half-cell achieved 1200 mAh/g over 800 cycles. As graphite is limited to a maximum capacity of 372 mAh/g, much research has been dedicated to the development of materials that exhibit higher theoretical capacities, and to overcoming the technical challenges that presently encumber their implementation. The extensive 2007 review article by Kasavajjula et al. summarizes early research on silicon-based anodes for lithium-ion secondary cells. In particular, Hong Li et al. showed in 2000 that the electrochemical insertion of lithium ions in silicon nanoparticles and silicon nanowires leads to the formation of an amorphous Li–Si alloy. The same year, Bo Gao and his doctoral advisor, Professor Otto Zhou, described the cycling of electrochemical cells with anodes comprising silicon nanowires, with a reversible capacity ranging from at least approximately 900 to 1500 mAh/g. Diamond-like carbon coatings can increase retention capacity by 40% and cycle life by 400% for lithium-based batteries. To improve the stability of the lithium anode, several approaches to installing a protective layer have been suggested. Silicon is beginning to be looked at as an anode material because it can accommodate significantly more lithium ions, storing up to 10 times the electric charge; however, this alloying between lithium and silicon results in significant volume expansion (ca. 400%), which causes catastrophic failure for the cell. Silicon has been used as an anode material, but the insertion and extraction of Li+ can create cracks in the material. These cracks expose the Si surface to the electrolyte, causing decomposition and the formation of a solid electrolyte interphase (SEI) on the new Si surface. This SEI will continue to grow thicker, deplete the available Li+, and degrade the capacity and cycling stability of the anode. In addition to carbon- and silicon-based anode materials for lithium-ion batteries, high-entropy metal oxide materials are being developed. These conversion (rather than intercalation) materials comprise an alloy (or subnanometer mixed phases) of several metal oxides performing different functions. For example, Zn and Co can act as electroactive charge-storing species, Cu can provide an electronically conducting support phase, and MgO can prevent pulverization. Liquid electrolytes in lithium-ion batteries consist of lithium salts, such as LiPF6, LiBF4 or LiClO4, in an organic solvent, such as ethylene carbonate, dimethyl carbonate, and diethyl carbonate. A liquid electrolyte acts as a conductive pathway for the movement of cations passing from the negative to the positive electrode during discharge. Typical conductivities of liquid electrolyte at room temperature (20 °C (68 °F)) are in the range of 10 mS/cm, increasing by approximately 30–40% at 40 °C (104 °F) and decreasing slightly at 0 °C (32 °F). The combination of linear and cyclic carbonates (e.g., ethylene carbonate (EC) and dimethyl carbonate (DMC)) offers high conductivity and solid electrolyte interphase (SEI)-forming ability.
While EC forms a stable SEI, it is not a liquid at room temperature, only becoming a liquid with the addition of other solvents such as the previously mentioned DMC, diethyl carbonate (DEC), or ethyl methyl carbonate (EMC). Organic solvents easily decompose on the negative electrodes during charge. When appropriate organic solvents are used as the electrolyte, the solvent decomposes on initial charging and forms a solid layer called the solid electrolyte interphase, which is electrically insulating, yet provides significant ionic conductivity, behaving as a solid electrolyte. The interphase prevents further decomposition of the electrolyte after the second charge, as it grows thick enough to prevent electron tunneling after the first charge cycle. For example, ethylene carbonate is decomposed at a relatively high voltage, 0.7 V vs. lithium, and forms a dense and stable interface. Composite electrolytes based on POE (poly(oxyethylene)) provide a relatively stable interface. POE can be either solid (high molecular weight), as applied in dry Li-polymer cells, or liquid (low molecular weight), as applied in regular Li-ion cells. Room-temperature ionic liquids (RTILs) are another approach to limiting the flammability and volatility of organic electrolytes. The term solid electrolyte interphase was first coined by Peled in 1979 to describe the layer of insoluble products deposited on alkali and alkaline earth metal electrodes in non-aqueous batteries (NAB). However, Dey and Sullivan had noted previously in 1970 that graphite, in a lithium metal half cell using propylene carbonate (PC), reduced the electrolyte during discharge at a rate which increased linearly with the current. They proposed that the following reaction was taking place:
C4H6O3 (propylene carbonate) + 2 Li+ + 2 e− → C3H6 (propylene) + Li2CO3
The same reaction was later proposed by Fong et al. in 1990, where they theorized that the carbonate ion was reacting with the lithium to form lithium carbonate, which was then forming a passivating layer on the surface of the graphite. PC is no longer used in batteries today because its molecules can intercalate into the graphite layers and react with the lithium there to form propylene, delaminating the graphite. The insulating properties of the SEI allow the battery to reach more extreme voltage gaps without simply reducing the electrolyte. This ability of the SEI to improve the voltage window of batteries was discovered almost by accident but plays a vital role in high-voltage batteries today. Recent advances in battery technology involve using a solid as the electrolyte material. The most promising of these are ceramics. Solid ceramic electrolytes are mostly lithium metal oxides, which allow lithium-ion transport through the solid more readily due to the intrinsic lithium. The main benefit of solid electrolytes is that there is no risk of leaks, which is a serious safety issue for batteries with liquid electrolytes. Solid electrolytes can be further broken down into two main categories: ceramic and glassy. Ceramic solid electrolytes are highly ordered compounds with crystal structures that usually have ion transport channels. Common ceramic electrolytes are lithium super ion conductors (LISICON) and perovskites. Glassy solid electrolytes are amorphous atomic structures made up of similar elements to ceramic solid electrolytes, but have higher conductivities overall due to higher conductivity at grain boundaries. Both glassy and ceramic electrolytes can be made more ionically conductive by substituting sulfur for oxygen.
The larger radius of sulfur and its higher polarizability allow higher conductivity of lithium. As a result, the conductivities of solid electrolytes are nearing parity with their liquid counterparts, with most on the order of 0.1 mS/cm and the best at 10 mS/cm. An efficient and economic way to tune targeted electrolyte properties is by adding a third component in small concentrations, known as an additive. By adding the additive in small amounts, the bulk properties of the electrolyte system will not be affected, whilst the targeted property can be significantly improved. The numerous additives that have been tested can be divided into the following three distinct categories: (1) those used for SEI chemistry modifications; (2) those used for enhancing the ion conduction properties; (3) those used for improving the safety of the cell (e.g. preventing overcharging).[citation needed] Electrolyte alternatives have also played a significant role, for example the lithium polymer battery. Polymer electrolytes are promising for minimizing the dendrite formation of lithium. Polymers are supposed to prevent short circuits and maintain conductivity. The ions in the electrolyte diffuse because there are small changes in the electrolyte concentration. Only linear diffusion is considered here. The change in concentration c, as a function of time t and distance x, is given by the one-dimensional diffusion equation
∂c/∂t = D ∂²c/∂x²
In this equation, D is the diffusion coefficient for the lithium ion. It has a value of 7.5×10−10 m2/s in the LiPF6 electrolyte. The value for ε, the porosity of the electrolyte, is 0.724. Dry electrode manufacturing is a solvent-free electrode preparation process that serves as an alternative to the traditional slurry coating method for lithium-ion batteries. Unlike conventional methods requiring liquid solvents such as N-methylpyrrolidone (NMP) to mix active materials, dry electrode technology relies on mechanical mixing, dry coating, and compaction to form a dense electrode structure. The typical preparation of dry electrodes involves three steps: (1) dry mixing, in which active materials, conductive agents, and binders are uniformly blended under solvent-free conditions; (2) dry coating, in which the powder mixture is evenly coated onto the current collector surface under shear force; and (3) compression (calendering), in which the coated layer is compressed to achieve the target thickness and sufficient mechanical strength. The dry electrode process eliminates the need for drying equipment, NMP recovery systems, or procedures for handling thick slurries, since it operates entirely without solvents. Its advantages include significantly reduced energy consumption and manufacturing costs, elimination of the toxic solvent NMP, enhanced environmental sustainability, and the ability to produce thicker electrodes with higher loading capacities. Dry electrodes typically utilize polytetrafluoroethylene (PTFE) as a binder. Under shear stress, PTFE forms a network of elongated fibers that permeates the entire electrode structure. This PTFE fiber network provides the electrode with exceptional mechanical strength, flexibility, and particle adhesion, thereby compensating for structural deficiencies in the absence of slurry mixing. Recent studies have introduced biomass additives such as starch, cellulose, and flour to enhance the pore structure and flexibility of electrodes. These additives promote intermolecular crosslinking, reduce tortuosity, and improve electrolyte wettability.
For instance, incorporating 1 wt.% flour into PTFE dry electrodes significantly increases mechanical strength, accelerates lithium-ion transport, and enhances high-rate performance. Leveraging the synergistic interaction between PTFE fiber networks and biomass additives, dry electrodes demonstrate: reduced bending, accelerated lithium-ion transport, enhanced rate performance in high-voltage cathodes (e.g., NCM811), superior cycling stability due to diminished particle cracking, and more uniform electrolyte permeation with improved interfacial stability. These findings indicate that dry electrode technology holds significant potential for scalable, sustainable battery manufacturing. Although dry electrode manufacturing excels in environmental friendliness and high energy density, it still faces challenges in industrialization and mass production. First, high-thickness electrodes may exhibit localized density variations during pressing, leading to reduced cycle life or unstable electrochemical performance. Second, PTFE binders carry higher costs, and while biomass additives help improve pore structure, their ratios require optimization to balance performance and processing stability. Furthermore, scaling up laboratory-scale preparation methods to industrial production necessitates addressing issues such as coating uniformity, consistent pressing, and quality control. Future development directions include: researching low-cost or biodegradable binder alternatives; developing thick electrode designs that balance high energy density with mechanical stability; introducing automated quality monitoring technologies to support mass production; and utilizing advanced characterization methods to optimize pore structure and ion transport properties. These improvements are expected to drive the widespread adoption of dry-process electrode technology in commercial lithium-ion batteries. Battery designs and formats Lithium-ion batteries may have multiple levels of structure. Small batteries consist of a single battery cell. Larger batteries connect cells in parallel into a module and connect modules in series and parallel into a pack. Multiple packs may be connected in series to increase the voltage. Batteries may be equipped with temperature sensors, heating/cooling systems, voltage regulator circuits, voltage taps, and charge-state monitors. These components address safety risks like overheating and short circuiting. On the macrostructral level (length scale 0.1–5 mm) almost all commercial lithium-ion batteries comprise foil current collectors (aluminium for cathode and copper for anode). Copper is selected for the anode, because lithium does not alloy with it. Aluminum is used for the cathode, because it passivates in LiPF6 electrolytes. Li-ion cells are available in various form factors, which can generally be divided into four types: Cells with a cylindrical shape are made in a characteristic "swiss roll" manner (known as a "jelly roll" in the US), which means it is a single long "sandwich" of the positive electrode, separator, negative electrode, and separator rolled into a single spool. The result is encased in a container. One advantage of cylindrical cells is faster production speed. One disadvantage can be a large radial temperature gradient at high discharge rates. 
The absence of a case gives pouch cells the highest gravimetric energy density; however, many applications require containment to prevent expansion when their state of charge (SOC) level is high, and for general structural stability. Both rigid plastic and pouch-style cells are sometimes referred to as prismatic cells due to their rectangular shapes. Three basic battery types are used in 2020s-era electric vehicles: cylindrical cells (e.g., Tesla), prismatic pouch (e.g., from LG), and prismatic can cells (e.g., from LG, Samsung, Panasonic, and others). Lithium-ion flow batteries have been demonstrated that suspend the cathode or anode material in an aqueous or organic solution. As of 2014, the smallest Li-ion cell was pin-shaped with a diameter of 3.5 mm and a weight of 0.6 g, made by Panasonic. A coin cell form factor is available for LiCoO2 cells, usually designated with a "LiR" prefix. The average voltage of LCO (lithium cobalt oxide) chemistry is 3.6 V if made with a hard carbon anode and 3.7 V if made with a graphite anode. Comparatively, the latter has a flatter discharge voltage curve. Uses Lithium-ion batteries are used in a multitude of applications, including consumer electronics, toys, power tools, and electric vehicles. More niche uses include backup power in telecommunications applications. Lithium-ion batteries are also frequently discussed as a potential option for grid energy storage, although as of 2020, they were not yet cost-competitive at scale. Some submarines have also been equipped with lithium-ion batteries. Performance Because lithium-ion batteries can have a variety of positive and negative electrode materials, the energy density and voltage vary accordingly. The open-circuit voltage is higher than in aqueous batteries (such as lead–acid, nickel–metal hydride and nickel–cadmium). Internal resistance increases with both cycling and age, although this depends strongly on the voltage and the temperature at which the batteries are stored. Rising internal resistance causes the voltage at the terminals to drop under load, which reduces the maximum current draw. Eventually, increasing resistance will leave the battery in a state such that it can no longer support the normal discharge currents requested of it without unacceptable voltage drop or overheating. Batteries with lithium iron phosphate positive and graphite negative electrodes have a nominal open-circuit voltage of 3.2 V and a typical charging voltage of 3.6 V. Lithium nickel manganese cobalt (NMC) oxide positives with graphite negatives have a 3.7 V nominal voltage with a 4.2 V maximum while charging. The charging procedure is performed at constant voltage with current-limiting circuitry (i.e., charging with constant current until a voltage of 4.2 V is reached in the cell and continuing with a constant voltage applied until the current drops close to zero). Typically, the charge is terminated at 3% of the initial charge current. In the past, lithium-ion batteries could not be fast-charged and needed at least two hours to fully charge. Current-generation cells can be fully charged in 45 minutes or less. In 2015 researchers demonstrated a small 600 mAh capacity battery charged to 68 percent capacity in two minutes and a 3,000 mAh battery charged to 48 percent capacity in five minutes. The latter battery has an energy density of 620 W·h/L. The device employed heteroatoms bonded to graphite molecules in the anode. Performance of manufactured batteries has improved over time.
For example, from 1991 to 2005 the energy capacity per price of lithium-ion batteries improved more than ten-fold, from 0.3 W·h per dollar to over 3 W·h per dollar. In the period from 2011 to 2017, progress averaged 7.5% annually. Overall, between 1991 and 2018, prices for all types of lithium-ion cells (in dollars per kWh) fell approximately 97%. Over the same time period, energy density more than tripled. Efforts to increase energy density contributed significantly to cost reduction. Energy density can also be increased by improvements in the chemistry of the cell, for instance by full or partial replacement of graphite with silicon. Silicon anodes enhanced with graphene nanotubes to eliminate the premature degradation of silicon open the door to reaching record-breaking battery energy density of up to 350 Wh/kg and lowering EV prices to be competitive with ICEs. Differently sized cells of the same format (shape) with the same chemistry may have different energy densities. Jelly roll cells usually have a higher energy density than coin or prismatic cells of the same Ah, because of a tighter, more compressed packing of the cell layers. Prismatic cells are physically larger and heavier than cylindrical cells, and a single prismatic cell can hold as much energy as more than 20 cylindrical cells. Among cylindrical cells, those with a larger size have a larger energy density, albeit the exact value strongly depends on the thickness of the electrode layers. The disadvantage of large cells is decreased heat transfer from the cell to its surroundings. An experimental evaluation of a "high-energy" type 3.0 Ah 18650 NMC cell in 2021 measured its round-trip efficiency, comparing the energy going into the cell with the energy extracted from the cell from 100% SoC (4.2 V) to 0% SoC (cut-off 2.0 V). Round-trip efficiency is the percentage of energy that can be used relative to the energy that went into charging the battery. Characterization of a cell in a different experiment in 2017 reported a round-trip efficiency of 85.5% at 2C and 97.6% at 0.1C. Lifespan The lifespan of a lithium-ion battery is typically defined as the number of full charge-discharge cycles to reach a failure threshold in terms of capacity loss or impedance rise. Manufacturers' datasheets typically use the term "cycle life" to specify lifespan as the number of cycles to reach 80% of the rated battery capacity. Simply storing lithium-ion batteries in the charged state also reduces their capacity (the amount of cyclable Li+) and increases the cell resistance (primarily due to the continuous growth of the solid electrolyte interphase on the anode). Calendar life is used to represent the whole life cycle of a battery, involving both cycling and inactive storage operations. Battery cycle life is affected by many different stress factors, including temperature, discharge current, charge current, and state of charge ranges (depth of discharge). Batteries are not fully charged and discharged in real applications such as smartphones, laptops and electric cars, and hence defining battery life via full discharge cycles can be misleading. To avoid this confusion, researchers sometimes use cumulative discharge, defined as the total amount of charge (Ah) delivered by the battery during its entire life, or equivalent full cycles, which represent the summation of the partial cycles as fractions of a full charge-discharge cycle.
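The cumulative-discharge and equivalent-full-cycle bookkeeping described above, together with the lifetime-energy and cost-per-kWh figures derived from it in the next paragraph, can be illustrated with a small calculation. The cell parameters, the log of partial cycles, and the price below are hypothetical assumptions, not data from the article.

```python
# Illustrative calculation of cumulative discharge, equivalent full cycles, and
# cost per kWh of energy delivered over a battery's life, as described above.
# The cell parameters, cycle log and price are hypothetical assumptions.

RATED_AH = 3.0        # rated capacity of the cell, Ah
NOMINAL_V = 3.7       # nominal voltage, V
CELL_PRICE = 4.0      # assumed purchase price of the cell, $

# depth of discharge of each partial cycle over the cell's (hypothetical) life
partial_cycles = [0.6, 0.3, 0.8, 0.5] * 500                  # 2000 shallow cycles

cumulative_ah = sum(d * RATED_AH for d in partial_cycles)    # total charge delivered, Ah
equivalent_full_cycles = cumulative_ah / RATED_AH            # partial cycles summed as fractions
lifetime_kwh = cumulative_ah * NOMINAL_V / 1000              # total energy delivered, kWh
cost_per_kwh = CELL_PRICE / lifetime_kwh                     # excluding the cost of charging

print(f"{equivalent_full_cycles:.0f} equivalent full cycles, "
      f"{lifetime_kwh:.1f} kWh delivered, ${cost_per_kwh:.2f}/kWh")
```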
Battery degradation during storage is affected by temperature and battery state of charge (SOC), and a combination of full charge (100% SOC) and high temperature (usually > 50 °C) can result in a sharp capacity drop and gas generation. Multiplying the battery's cumulative discharge by the rated nominal voltage gives the total energy delivered over the life of the battery. From this one can calculate the cost per kWh of the energy (including the cost of charging). Over their lifespan, batteries degrade gradually, leading to reduced cyclable charge (a.k.a. Ah capacity) and increased resistance (the latter translates into a lower operating cell voltage). Several degradation processes occur in lithium-ion batteries, some during cycling, some during storage, and some all the time. Degradation is strongly temperature-dependent: degradation at room temperature is minimal but increases for batteries stored or used in high temperature (usually > 35 °C) or low temperature (usually < 5 °C) environments; battery life is maximal at room temperature. High charge levels also hasten capacity loss. Frequent charge to > 90% and discharge to < 10% may also hasten capacity loss.[citation needed] Keeping the Li-ion battery's state of charge between about 60% and 80% can reduce capacity loss. In a study, scientists provided 3D imaging and model analysis to reveal the main causes, mechanics, and potential mitigations of the problematic degradation of the batteries over charge cycles. They found that "[p]article cracking increases and contact loss between particles and carbon-binder domain are observed to correlate with the cell degradation", and indicated that "the reaction heterogeneity within the thick cathode caused by the unbalanced electron conduction is the main cause of the battery degradation over cycling".[additional citation(s) needed] The most common degradation mechanisms in lithium-ion batteries are described in more detail below. A change from one main degradation mechanism to another appears as a knee (slope change) in the capacity vs. cycle number plot. Most studies of lithium-ion battery aging have been done at elevated (50–60 °C) temperatures in order to complete the experiments sooner. Under these storage conditions, fully charged nickel-cobalt-aluminum and lithium iron phosphate cells lose ca. 20% of their cyclable charge in 1–2 years. It is believed that the aforementioned anode aging is the most important degradation pathway in these cases. On the other hand, manganese-based cathodes show a (ca. 20–50%) faster degradation under these conditions, probably due to the additional mechanism of Mn ion dissolution. At 25 °C the degradation of lithium-ion batteries seems to follow the same pathway(s) as the degradation at 50 °C, but with half the speed. In other words, based on the limited extrapolated experimental data, lithium-ion batteries are expected to lose irreversibly ca. 20% of their cyclable charge in 3–5 years or 1000–2000 cycles at 25 °C. Lithium-ion batteries with titanate anodes do not suffer from SEI growth and last longer (>5000 cycles) than cells with graphite anodes. However, in complete cells other degradation mechanisms (i.e. the dissolution of Mn3+, the Ni2+/Li+ place exchange, decomposition of the PVDF binder, and particle detachment) show up after 1000–2000 days, and the use of a titanate anode does not improve full-cell durability in practice.
A more detailed description of some of these mechanisms is provided below: Depending on the electrolyte and additives, common components of the SEI layer that forms on the anode include a mixture of lithium oxide, lithium fluoride and semicarbonates (e.g., lithium alkyl carbonates). At elevated temperatures, alkyl carbonates in the electrolyte decompose into insoluble species such as Li2CO3 that increases the film thickness. This increases cell impedance and reduces cycling capacity. Gases formed by electrolyte decomposition can increase the cell's internal pressure and are a potential safety issue in demanding environments such as mobile devices. Below 25 °C, plating of metallic Lithium on the anodes and subsequent reaction with the electrolyte is leading to loss of cyclable Lithium. Extended storage can trigger an incremental increase in film thickness and capacity loss. Charging at greater than 4.2 V can initiate Li+ plating on the anode, producing irreversible capacity loss. Electrolyte degradation mechanisms include hydrolysis and thermal decomposition. At concentrations as low as 10 ppm, water begins catalyzing a number of degradation products that can affect the electrolyte, anode and cathode. LiPF6 participates in an equilibrium reaction with LiF and PF5. Under typical conditions, the equilibrium lies far to the left. However the presence of water generates substantial LiF, an insoluble, electrically insulating product. LiF binds to the anode surface, increasing film thickness. LiPF6 hydrolysis yields PF5, a strong Lewis acid that reacts with electron-rich species, such as water. PF5 reacts with water to form hydrofluoric acid (HF) and phosphorus oxyfluoride. Phosphorus oxyfluoride in turn reacts to form additional HF and difluorohydroxy phosphoric acid. HF converts the rigid SEI film into a fragile one. On the cathode, the carbonate solvent can then diffuse onto the cathode oxide over time, releasing heat and potentially causing thermal runaway. Decomposition of electrolyte salts and interactions between the salts and solvent start at as low as 70 °C. Significant decomposition occurs at higher temperatures. At 85 °C transesterification products, such as dimethyl-2,5-dioxahexane carboxylate (DMDOHC) are formed from EC reacting with DMC. Batteries generate heat when being charged or discharged, especially at high currents. Large battery packs, such as those used in electric vehicles, are generally equipped with thermal management systems that maintain a temperature between 15 °C (59 °F) and 35 °C (95 °F). Pouch and cylindrical cell temperatures depend linearly on the discharge current. Poor internal ventilation may increase temperatures. For large batteries consisting of multiple cells, non-uniform temperatures can lead to non-uniform and accelerated degradation. In contrast, the calendar life of LiFePO4 cells is not affected by high charge states. The IEEE standard 1188–1996 recommends replacing lithium-ion batteries in an electric vehicle, when their charge capacity drops to 80% of the nominal value. In what follows, we shall use the 20% capacity loss as a comparison point between different studies. We shall note, nevertheless, that the linear model of degradation (the constant % of charge loss per cycle or per calendar time) is not always applicable, and that a "knee point", observed as a change of the slope, and related to the change of the main degradation mechanism, is often observed. 
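The LiPF6 hydrolysis cascade described above can be summarized schematically as follows. The stoichiometry shown is the standard textbook formulation written out from the prose description, not a set of equations reproduced from the article.

```latex
\begin{align*}
\mathrm{LiPF_6} &\rightleftharpoons \mathrm{LiF} + \mathrm{PF_5} \\
\mathrm{PF_5} + \mathrm{H_2O} &\rightarrow \mathrm{POF_3} + 2\,\mathrm{HF} \\
\mathrm{POF_3} + \mathrm{H_2O} &\rightarrow \mathrm{HPO_2F_2} + \mathrm{HF}
\end{align*}
```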
Safety The problem of lithium-ion battery safety was recognized even before these batteries were first commercially released in 1991. The two main reasons for lithium-ion battery fires and explosions are related to processes on the negative electrode (anode when discharging, cathode when charging). During a normal battery charge, lithium ions intercalate into graphite. However, if the charge is too fast or the temperature is too low, lithium metal starts plating on the negative electrode, and the resulting dendrites can penetrate the battery separator, internally short-circuit the cell, and result in high electric current, heating and ignition. In another mechanism, an explosive reaction between the negative electrode material (LiC6) and the solvent (liquid organic carbonate) occurs even at open circuit, provided that the electrode temperature exceeds a certain threshold above 70 °C. Nowadays, all reputable manufacturers[who?] employ at least two safety devices in all their lithium-ion batteries of an 18650 format or larger: a current interrupt device (CID) and a positive temperature coefficient (PTC) device. The CID comprises two metal disks that make electric contact with each other. When the pressure inside the cell increases, the distance between the two disks increases too, and they lose electric contact, thus terminating the electric current through the battery. The PTC device is made of an electrically conducting polymer. When the current through the PTC device increases, the polymer gets hot and its electric resistance rises sharply, thus reducing the current through the battery. Lithium-ion batteries can be a safety hazard since they contain a flammable electrolyte and may become pressurized if they become damaged. A battery cell charged too quickly could cause a short circuit, leading to overheating, explosions, and fires. A Li-ion battery fire can be started in several ways, described below. Because of these risks, testing standards are more stringent than those for acid-electrolyte batteries, requiring both a broader range of test conditions and additional battery-specific tests, and there are shipping limitations imposed by safety regulators. There have been battery-related recalls by some companies, including the 2016 Samsung Galaxy Note 7 recall for battery fires. Lithium-ion batteries have a flammable liquid electrolyte, and a faulty battery can cause a serious fire. Faulty chargers can affect the safety of the battery because they can destroy the battery's protection circuit. While charging at temperatures below 0 °C, the negative electrode of the cells gets plated with pure lithium, which can compromise the safety of the whole pack. Short-circuiting a battery will cause the cell to overheat and possibly to catch fire. Smoke from thermal runaway in a Li-ion battery is both flammable and toxic. Batteries are tested according to the UL 9540A fire standard, and the TS-800 standard also tests fire propagation from one battery container to adjacent containers. Around 2010, large lithium-ion batteries were introduced in place of other chemistries to power systems on some aircraft; as of January 2014[update], there had been at least four serious lithium-ion battery fires, or smoke, on the Boeing 787 passenger aircraft, introduced in 2011, which did not cause crashes but had the potential to do so. UPS Airlines Flight 6 crashed in Dubai after its payload of batteries spontaneously ignited. To reduce fire hazards, research projects aim to develop non-flammable electrolytes.
If a lithium-ion battery is damaged, crushed, or subjected to a higher electrical load without having overcharge protection, problems may arise. An external short circuit can trigger a battery explosion. Such incidents can occur when lithium-ion batteries are not disposed of through the appropriate channels but are thrown away with other waste. The way they are treated by recycling companies can damage them and cause fires, which in turn can lead to large-scale conflagrations. Twelve such fires were recorded in Swiss recycling facilities in 2023. If overheated or overcharged, Li-ion batteries may suffer thermal runaway and cell rupture. During thermal runaway, internal degradation and oxidization processes can keep cell temperatures above 500 °C, with the possibility of igniting secondary combustibles, as well as leading to leakage, explosion or fire in extreme cases. To reduce these risks, many lithium-ion cells (and battery packs) contain fail-safe circuitry that disconnects the battery when its voltage is outside the safe range of 3–4.2 V per cell, or when it is overcharged or deeply discharged. Lithium battery packs, whether constructed by a vendor or the end-user, are susceptible to these issues if they lack effective battery management circuits. Poorly designed or implemented battery management circuits also may cause problems; it is difficult to be certain that any particular battery management circuitry is properly implemented. Lithium-ion cells are susceptible to stress from voltages outside the safe range, typically between 2.5 V and 3.65/4.1/4.2 or 4.35 V (depending on the components of the cell). Exceeding this voltage range results in premature aging and in safety risks due to the reactive components in the cells. When stored for long periods, the small current draw of the protection circuitry may drain the battery below its shutoff voltage; normal chargers may then be useless, since the battery management system (BMS) may retain a record of this battery (or charger) "failure". Many types of lithium-ion cells cannot be charged safely below 0 °C, as this can result in plating of lithium on the anode of the cell, which may cause complications such as internal short-circuit paths. Other safety features are required[by whom?] in each cell because the negative electrode produces heat during use, while the positive electrode may produce oxygen. However, these additional devices occupy space inside the cells, add points of failure, and may irreversibly disable the cell when activated. Further, these features increase costs compared to nickel metal hydride batteries, which require only a hydrogen/oxygen recombination device and a back-up pressure valve. Contaminants inside the cells can defeat these safety devices. Also, these features cannot be applied to all kinds of cells; for example, prismatic high-current cells cannot be equipped with a vent or thermal interrupt. High-current cells must not produce excessive heat or oxygen, lest there be a failure, possibly violent. Instead, they must be equipped with internal thermal fuses which act before the anode and cathode reach their thermal limits. Replacing the lithium cobalt oxide positive electrode material in lithium-ion batteries with a lithium metal phosphate such as lithium iron phosphate (LFP) improves cycle counts, shelf life and safety, but lowers capacity. As of 2006, these safer lithium-ion batteries were mainly used in electric cars and other large-capacity battery applications, where safety is critical.
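As a rough illustration of the protective limits discussed above, the following minimal sketch shows the kind of window check a battery management system might perform. The exact thresholds vary with cell chemistry; the values used here (2.5–4.2 V, no charging below 0 °C, a 60 °C over-temperature cut-off) are assumptions for illustration, not a real BMS implementation.

from dataclasses import dataclass

@dataclass
class CellLimits:
    v_min: float = 2.5            # V; below this the cell is deeply discharged
    v_max: float = 4.2            # V; above this aging and lithium plating accelerate
    t_min_charge_c: float = 0.0   # deg C; charging below this risks lithium plating
    t_max_c: float = 60.0         # deg C; generic over-temperature cut-off

def allow_operation(cell_voltage, cell_temp_c, charging, limits=CellLimits()):
    """Return True if the cell may stay connected under these conditions."""
    if not (limits.v_min <= cell_voltage <= limits.v_max):
        return False              # outside the safe voltage window
    if cell_temp_c > limits.t_max_c:
        return False              # over-temperature
    if charging and cell_temp_c < limits.t_min_charge_c:
        return False              # charging in the cold plates lithium
    return True

print(allow_operation(3.7, 25.0, charging=True))    # True
print(allow_operation(3.7, -5.0, charging=True))    # False: too cold to charge
print(allow_operation(4.35, 25.0, charging=False))  # False: over-voltage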
In 2016, an LFP-based energy storage system was chosen to be installed in Paiyun Lodge on Mt. Jade (Yushan), the highest lodge in Taiwan. As of June 2024, the system was still operating safely. In 2006, approximately 10 million Sony batteries used in laptops were recalled, including those in laptops from Dell, Sony, Apple, Lenovo, Panasonic, Toshiba, Hitachi, Fujitsu and Sharp. The batteries were found to be susceptible to internal contamination by metal particles during manufacture. Under some circumstances, these particles could pierce the separator, causing a dangerous short circuit. IATA estimates that over a billion lithium metal and lithium-ion cells are flown each year. Some kinds of lithium batteries may be prohibited aboard aircraft because of the fire hazard. Some postal administrations restrict air shipping (including EMS) of lithium and lithium-ion batteries, either separately or installed in equipment. In 2023, most commercial Li-ion batteries employed alkyl carbonate solvents to assure the formation of a solid electrolyte interphase on the negative electrode. Since such solvents are readily flammable, there has been active research to replace them with non-flammable solvents or to add fire suppressants. Another source of hazard is the hexafluorophosphate anion, which is needed to passivate the positive current collector made of aluminium. Hexafluorophosphate reacts with water and releases volatile and toxic hydrogen fluoride. Efforts to replace hexafluorophosphate have been less successful. Supply chain The electric vehicle supply chain comprises the mining and refining of raw materials and the manufacturing processes that produce batteries and other components for electric vehicles. Li-ion battery production is heavily concentrated, with 60% coming from China in 2024. In the 1990s, the United States was the world's largest miner of lithium minerals, contributing about one third of total production. By 2010 Chile had replaced the US as the leading miner, thanks to the development of lithium brines in the Salar de Atacama. By 2024, Australia and China had joined Chile as the top three producers. Extraction of lithium, nickel, and cobalt, manufacture of solvents, and mining byproducts present significant environmental and health hazards. Lithium extraction can be fatal to aquatic life due to water pollution. It is known to cause surface water contamination, drinking water contamination, respiratory problems, ecosystem degradation and landscape damage. It also leads to unsustainable water consumption in arid regions (1.9 million liters per ton of lithium). Massive byproduct generation from lithium extraction also presents unsolved problems, such as large amounts of magnesium and lime waste. Lithium mining takes place in North and South America, Asia, South Africa, Australia, and China. Cobalt for Li-ion batteries is largely mined in the Congo (see also Mining industry of the Democratic Republic of the Congo). Open-pit cobalt mining has led to deforestation and habitat destruction in the Democratic Republic of the Congo. Open-pit nickel mining has led to environmental degradation and pollution in developing countries such as the Philippines and Indonesia. In 2024, nickel mining and processing was one of the main causes of deforestation in Indonesia. Manufacturing a kilogram of Li-ion battery takes about 67 megajoules (MJ) of energy.
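To put the manufacturing-energy figure quoted above into perspective, the following sketch scales it up to a full pack. The pack capacity and specific energy used here are hypothetical assumptions, not values from this article.

# Scale the ~67 MJ/kg manufacturing-energy figure quoted above to an
# assumed electric-vehicle pack (illustrative numbers only).
PACK_CAPACITY_KWH = 60.0             # assumed mid-size EV pack
SPECIFIC_ENERGY_WH_PER_KG = 200.0    # assumed pack-level specific energy
MANUFACTURING_MJ_PER_KG = 67.0       # figure quoted in the text

pack_mass_kg = PACK_CAPACITY_KWH * 1000.0 / SPECIFIC_ENERGY_WH_PER_KG
manufacturing_energy_gj = MANUFACTURING_MJ_PER_KG * pack_mass_kg / 1000.0

print(f"pack mass ~ {pack_mass_kg:.0f} kg")                        # ~300 kg
print(f"manufacturing energy ~ {manufacturing_energy_gj:.0f} GJ")  # ~20 GJ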
The global warming potential of lithium-ion battery manufacturing strongly depends on the energy source used in mining and manufacturing operations and is difficult to estimate, but one 2019 study estimated 73 kg CO2e/kWh. Effective recycling can reduce the carbon footprint of production significantly. The recycling of lithium-ion batteries is a growing but underdeveloped industry. Despite their value, global recycling rates remain low; the International Energy Agency estimated in 2024 that only 5% of used electric vehicle batteries were recycled worldwide. Li-ion battery elements including iron, copper, nickel, and cobalt can be recycled, but mining new materials often remains cheaper and easier than collecting, transporting, and processing spent batteries. However, since 2018, recycling processes have improved significantly, and recovering lithium, manganese, aluminum, and graphite is now possible at industrial scales. Accumulation of battery waste presents technical challenges and health hazards. Since the environmental impact of electric cars is heavily affected by the production of lithium-ion batteries, the development of efficient ways to repurpose waste is crucial. Recycling is a multi-step process, starting with the storage of batteries before disposal, followed by manual testing, disassembly, and finally the chemical separation of battery components. Re-use of the battery is preferred over complete recycling as it involves less embodied energy. Because these batteries are much more reactive than classical vehicle waste like tire rubber, there are significant risks to stockpiling used batteries. The pyrometallurgical method uses a high-temperature furnace to reduce the component metal oxides in the battery to an alloy of Co, Cu, Fe, and Ni. This is the most common and commercially established method of recycling, and cells from other similar batteries can be combined in the same smelt to increase efficiency and improve thermodynamics. The metal current collectors aid the smelting process, allowing whole cells or modules to be melted at once. The product of this method is a collection of metallic alloy, slag, and gas. At high temperatures, the polymers used to hold the battery cells together burn off, and the metal alloy can be separated through a hydrometallurgical process into its separate components. The slag can be further refined or used in the cement industry. The process is relatively risk-free, and the exothermic reaction from polymer combustion reduces the required input energy. However, in the process the plastics, electrolytes, and lithium salts are lost. The hydrometallurgical method involves the use of aqueous solutions to leach the desired metals from the cathode. The most common reagent is sulfuric acid. Factors that affect the leaching rate include the concentration of the acid, time, temperature, the solid-to-liquid ratio, and the reducing agent. It has been experimentally proven that H2O2 acts as a reducing agent that speeds up the rate of leaching, for example for a LiCoO2 cathode: 2 LiCoO2(s) + 3 H2SO4 + H2O2 → 2 CoSO4(aq) + Li2SO4 + 4 H2O + O2. Once leached, the metals can be extracted through precipitation reactions controlled by changing the pH level of the solution. Cobalt, the most expensive metal, can then be recovered in the form of sulfate, oxalate, hydroxide, or carbonate. More recently,[when?] recycling methods have experimented with the direct reproduction of the cathode from the leached metals. In these procedures, the concentrations of the various leached metals are premeasured to match the target cathode, and the cathodes are then directly synthesized.
The main issues with this method, however, are the large volume of solvent required and the high cost of neutralization. Although it is easy to shred the battery, mixing the cathode and anode at the beginning complicates the process, so they also need to be separated. Unfortunately, the current design of batteries makes the process extremely complex, and it is difficult to separate the metals in a closed-loop battery system. Shredding and dissolving may occur at different locations. Direct recycling is the removal of the cathode or anode material from the electrode; the material is then reconditioned and reused in a new battery. Mixed metal oxides can be added to the new electrode with very little change to the crystal morphology. The process generally involves the addition of new lithium to replenish the lithium lost from the cathode due to degradation from cycling. Cathode strips are obtained from the dismantled batteries, soaked in NMP, and subjected to sonication to remove excess deposits. They are then treated hydrothermally with a solution containing LiOH/Li2SO4 before annealing. This method is extremely cost-effective for non-cobalt-based batteries, as the raw materials do not make up the bulk of the cost. Direct recycling avoids the time-consuming and expensive purification steps, which is advantageous for low-cost cathodes such as LiMn2O4 and LiFePO4. For these cheaper cathodes, most of the cost, embedded energy, and carbon footprint is associated with the manufacturing rather than the raw material. It has been experimentally shown that direct recycling can reproduce properties similar to those of pristine graphite. The drawback of the method lies in the condition of the retired battery. Where the battery is relatively healthy, direct recycling can cheaply restore its properties. However, for batteries with a low state of charge, direct recycling may not be worth the investment. The process must also be tailored to the specific cathode composition, and therefore it must be configured for one type of battery at a time. Lastly, in a time of rapidly developing battery technology, the design of a battery today may no longer be desirable a decade from now, rendering direct recycling ineffective. Physical materials separation recovers materials by mechanical crushing and by exploiting physical properties of the different components, such as particle size, density, ferromagnetism and hydrophobicity. Copper, aluminum and steel casings can be recovered by sorting. The remaining material, called "black mass", which is composed of nickel, cobalt, lithium and manganese, needs secondary treatment to be recovered. In biological metals reclamation, or bio-leaching, microorganisms are used to digest metal oxides selectively. Recyclers can then reduce these oxides to produce metal nanoparticles. Although bio-leaching has been used successfully in the mining industry, this process is still nascent in the recycling industry, and plenty of opportunities exist for further investigation. Electrolyte recycling consists of two phases. The collection phase extracts the electrolyte from the spent Li-ion battery. This can be achieved through mechanical processes, distillation, freezing, solvent extraction, and supercritical fluid extraction. Due to the volatility, flammability, and sensitivity of the electrolyte, its collection poses a greater difficulty than the collection of other components of a Li-ion battery. The next phase consists of separation and/or electrolyte regeneration.
Separation consists of isolating the individual components of the electrolyte. This approach is often used for the direct recovery of the Li salts from the organic solvents. By contrast, regeneration of the electrolyte aims to preserve the electrolyte composition by removing impurities, which can be achieved through filtration methods. The recycling of the electrolyte, which constitutes 10–15 wt.% of a Li-ion battery, provides both economic and environmental benefits. These benefits include the recovery of valuable Li-based salts and the prevention of hazardous compounds, such as volatile organic compounds (VOCs) and carcinogens, from being released into the environment. Compared to electrode recycling, less focus is placed on recycling the electrolyte of Li-ion batteries, which can be attributed to lower economic benefits and greater process challenges. Such challenges include the difficulty of handling different electrolyte compositions, removing side products that accumulate from electrolyte decomposition during the battery's service life, and recovering electrolyte adsorbed onto the electrodes. Due to these challenges, current pyrometallurgical methods of Li-ion battery recycling forgo electrolyte recovery, releasing hazardous gases upon heating. However, due to high energy consumption and environmental impact, future recycling methods are being directed away from this approach. Extraction of raw materials for lithium-ion batteries may present dangers to local people, especially land-based indigenous populations. Cobalt sourced from the Democratic Republic of the Congo is often mined by workers using hand tools with few safety precautions, resulting in frequent injuries and deaths. Pollution from these mines has exposed people to toxic chemicals that health officials believe cause birth defects and breathing difficulties. Human rights activists have alleged, and investigative journalism has reported confirmation, that child labor is used in these mines. A study of relationships between lithium extraction companies and indigenous peoples in Argentina indicated that the state may not have protected indigenous peoples' right to free, prior and informed consent, and that extraction companies generally controlled community access to information and set the terms for discussion of the projects and benefit sharing. Development of the Thacker Pass lithium mine in Nevada, USA, has met with protests and lawsuits from several indigenous tribes who have said they were not provided free, prior and informed consent and that the project threatens cultural and sacred sites. Links between resource extraction and missing and murdered indigenous women have also prompted local communities to express concerns that the project will create risks to indigenous women. Protestors have been occupying the site of the proposed mine since January 2021. Research Researchers are actively working to improve the power density, safety, cycle durability (battery life), recharge time, cost, flexibility, and other characteristics of these batteries, as well as research methods and uses. Solid-state batteries are being researched as a way past current technological barriers. Solid-state batteries are currently expected to be the most promising next-generation battery, and various companies are working to popularize them. Research areas for lithium-ion batteries include extending lifetime, increasing energy density, improving safety, reducing cost, and increasing charging speed, among others.
Research has been under way in the area of non-flammable electrolytes as a pathway to increased safety, given the flammability and volatility of the organic solvents used in the typical electrolyte. Strategies include aqueous lithium-ion batteries, ceramic solid electrolytes, polymer electrolytes, ionic liquids, and heavily fluorinated systems. One of the ways to improve batteries is to combine different cathode materials. This allows researchers to build on the strengths of a material while limiting its drawbacks. One possibility is coating lithium nickel manganese oxide with lithium iron phosphate through resonant acoustic mixing. The resulting material benefits from increased electrochemical performance and improved capacity retention. Similar work was done with iron(III) phosphate. As it is now accepted that not only transition metals but also anions in cathodes participate in the redox activity necessary for lithium insertion and removal, the design of cathode materials with diverse transition-metal cations increasingly considers oxygen redox reactions in lithium-ion battery cathodes and how these may enhance capacity beyond transition-metal limitations, with computational studies using density functional theory helping to optimize materials while minimizing structural degradation. Advances in the understanding of anionic redox have led to stabilization strategies such as surface fluorination, improving cycling stability and safety.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Sarugaku] | [TOKENS: 504]
Contents Sarugaku Sarugaku (猿楽; "monkey music") was a form of theatre popular in Japan during the 11th to 14th centuries. One of its predecessors was sangaku, a form of entertainment reminiscent of the modern-day circus, consisting mostly of acrobatics, juggling, and pantomime, sometimes combined with drum dancing. Sangaku came from China to Japan in the 8th century and there mingled with indigenous traditions, particularly the harvest celebrations of dengaku. In the 11th century, the form began to favor comic sketches while other elements faded away. By the late 12th century, the term "sarugaku" had come to include comic dialogues based on word play (toben), improvised comic party dances (ranbu), short plays involving several actors, and musical arrangements based on courtesan traditions. During the 13th century, there was increased standardization of words, gestures, musical arrangements, and program combinations, as well as the adoption of the guild (za) system to which all present-day Noh schools can be traced. Kyōgen also developed from sarugaku. Of particular significance is the development of sarugaku troupes in Yamato around Nara and Kyoto during the Kamakura and early Muromachi periods. In particular, the sarugaku Noh troupe Yuzaki, led by Kan'ami, performed in 1374 before the young shōgun Ashikaga Yoshimitsu. The success of this one performance and the resultant shogunal patronage lifted the artform permanently out of the mists of its plebeian past. From then, the term sarugaku gave way to the current nomenclature, Noh. The Japanese term "sarugaku" is also used in other contexts to refer to a job or profession that seems to debase the employee or to treat him or her as a source of entertainment rather than as a professional. According to William Scott Wilson, Sarugaku translates to "monkey music", is an ancient form of drama, and is the predecessor to Noh. Takuan Sōhō states that "the emperor's recitation is given like Sarugaku".
========================================
[SOURCE: https://en.wikipedia.org/wiki/British_Academy_of_Film_and_Television_Arts] | [TOKENS: 3922]
Contents British Academy of Film and Television Arts The British Academy of Film and Television Arts (BAFTA, /ˈbæftə/) is an independent trade association and charity that supports, develops, and promotes the arts of film, television and video games in the United Kingdom. In addition to its annual award ceremonies, BAFTA has an international programme of learning events and initiatives offering access to talent through workshops, masterclasses, scholarships, lectures, and mentoring schemes in the United Kingdom and the United States. BAFTA's annual film awards ceremony, the British Academy Film Awards, has been held since 1949, while its annual television awards ceremony, the British Academy Television Awards, has been held since 1955. Their third ceremony, the British Academy Games Awards, was first presented in 2004. Origins BAFTA started out as the British Film Academy, founded in 1947 by a group of directors: David Lean, Alexander Korda, Roger Manvell, Laurence Olivier, Emeric Pressburger, Michael Powell, Michael Balcon, Carol Reed, and other major figures of the British film industry. David Lean was the founding chairman. The first Film Awards ceremony took place in May 1949, honouring the films The Best Years of Our Lives, Odd Man Out and The World Is Rich. The Guild of Television Producers and Directors was set up in 1953 with the first awards ceremony in October 1954, and in 1958 merged with the British Film Academy to form the Society of Film and Television Arts, whose inaugural meeting was held at Buckingham Palace and presided over by the Duke of Edinburgh.[citation needed] 195 Piccadilly The Society of Film and Television Arts acquired the historic Prince's Hall facilities at 195 Piccadilly, following the Royal Institute of Painters in Water Colours' move to the Mall Galleries. Queen Elizabeth, The Duke of Edinburgh, The Princess Royal and The Earl Mountbatten of Burma opened the organisation's headquarters in 1976, and it became the British Academy of Film and Television Arts in March 1976. In 2016 BAFTA embarked on an extensive renovation of the Grade II listed property. Benedetti Architects oversaw a £33M+ remodel, doubling its original capacity with an additional floor, raising and restoring two large Victorian rooflight structures and decorative plasterwork, creating an entire floor devoted to BAFTA's learning and new talent programmes, and revamping the property's food, beverage, and events operations in order to maximize revenue to sustain property maintenance and operations. The new facilities were reopened in 2022. Organisational structure BAFTA is a membership organisation comprising approximately 8,000 individuals worldwide who are creatives and professionals working in and making a contribution to the film, television and games industries in the UK. In 2005, it placed an overall cap on worldwide voting membership which stood at approximately 6,500 as of 2017[update]. BAFTA does not receive any funding from the government; it relies on income from membership subscriptions, individual donations, trusts, foundations and corporate partnerships to support its ongoing outreach work. BAFTA has offices in Scotland and Wales in the UK, in Los Angeles and New York in the United States and runs events in Hong Kong and mainland China. Amanda Berry served as chief executive of the organisation between December 2000 and October 2022. Jane Millichip has held the position since October 2022. 
Initiatives In addition to its high-profile awards ceremonies, BAFTA manages a year-round programme of educational events and initiatives including film screenings and Q&As, tribute evenings, interviews, lectures, and debates with major industry figures. With over 250 events a year, BAFTA's stated aim is to inspire and inform the next generation of talent by providing a platform for some of the world's most talented practitioners to pass on their knowledge and experience.[peacock prose] BAFTA runs a number of scholarship programmes across the UK, United States and Asia. Launched in 2012, the UK programme enables talented British citizens who are in need of financial support to take an industry-recognised course in film, television or games in the UK. Each BAFTA Scholar receives up to £12,000 towards their annual course fees, mentoring support from a BAFTA member, and free access to BAFTA events around the UK. Since 2013, three students every year have received one of the Prince William Scholarships in Film, Television and Games, supported by BAFTA and Warner Bros. These scholarships are awarded in the name of Prince William, Duke of Cambridge, in his role as president of BAFTA. In the U.S., BAFTA Los Angeles offers financial support and mentorship to British graduate students studying in the US, as well as scholarships to provide financial aid to local LA students from the inner city. BAFTA New York's Media Studies Scholarship Program, set up in 2012, supports students pursuing media studies at undergraduate and graduate level institutions within the New York City area and includes financial aid and mentoring opportunities. Since 2015, BAFTA has been offering scholarships for British citizens to study in China, and vice versa. In 2011 BAFTA founded the organisation Albert, which promotes sustainable film and television production through its carbon footprint calculator and certification. All BBC, ITV, Channel 4, UKTV, Sky and Netflix productions in the UK are required to register their carbon footprint using the Albert carbon calculator, and the BBC requires all television commissions to be Albert certified. Awards BAFTA presents awards for film, television and games, including children's entertainment, at a number of annual ceremonies across the UK and in Los Angeles. The BAFTA award trophy is a mask, designed by American sculptor Mitzi Cunliffe. When the Guild merged with the British Film Academy to become the Society of Film and Television Arts, later the British Academy of Film and Television Arts, the first "BAFTA award" was presented to Sir Charles Chaplin on his Academy Fellowship that year. A BAFTA award – including the bronze mask and marble base – weighs 3.7 kg (8.2 lb) and measures 27 cm (11 in) high × 14 cm (5.5 in) wide × 8 cm (3.1 in) deep; the mask itself measures 16 cm (6.3 in) high × 14 cm (5.5 in) wide. They are made of phosphor bronze and cast in a Middlesex foundry. In 2017, the British Academy of Film and Television Arts introduced new entry rules for British films, starting from the 2018/19 season, to foster diversity. BAFTA's annual film awards ceremony is known as the British Academy Film Awards, or "the BAFTAs", and rewards the best work of any nationality seen on British cinema screens during the preceding year. In 1949 the British Film Academy, as it was then known, presented the first awards for films made in 1947 and 1948. From 2008 to 2016 the ceremony was held at the Royal Opera House in London's Covent Garden.
It had previously been held at the Odeon cinema on Leicester Square from 2000, and since 2017 the ceremony has been held at the Royal Albert Hall. The ceremony had been performed during April or May of each year, but since 2002 it has been held in February to precede the Academy Awards (Oscars) in the United States, making the BAFTA Film Awards a major precursor of the annual Oscar results. In order for a film to be considered for a BAFTA nomination, its first public exhibition must be in a cinema, and it must have a UK theatrical release of no fewer than seven days within the calendar year that corresponds to the upcoming awards. A film must be of feature length, and films from all countries are eligible in all categories, with the exception of the Alexander Korda Award for Outstanding British Film and Outstanding Debut, which are for British films or individuals only. The British Academy Television Awards ceremony usually takes place during April or May, with its sister ceremony, the British Academy Television Craft Awards, usually occurring within a few weeks of it. The Television Awards, celebrating the best TV programmes and performances of the past year, are also often referred to simply as "the BAFTAs" or, to differentiate them from the film awards, the "BAFTA Television Awards". They have been awarded annually since 1954. The first ever ceremony consisted of six categories. Until 1958, they were awarded by the Guild of Television Producers and Directors. From 1968 until 1997, BAFTA's Film and Television Awards were presented together, but from 1998 onwards they were presented at two separate ceremonies. The Television Craft Awards celebrate the talent behind the programmes, such as individuals working in visual effects, production, and costume design. Only British programmes are eligible – with the potential exception of the publicly voted Audience Award – but any cable, satellite, terrestrial or digital television station broadcasting in the UK is eligible to submit entries, as are independent production companies who have produced programming for those channels. Individual performances can either be entered by the performers themselves or by the broadcasters. The programmes being entered must have been broadcast on or between 1 January and 31 December of the year preceding the awards ceremony. Since 2014 the BAFTA Television Awards have been open to TV programmes which are only broadcast online. The British Academy Games Awards ceremony traditionally takes place in March, shortly after the Film Awards ceremony in February. BAFTA first recognised video games and other interactive media at its inaugural BAFTA Interactive Entertainment Awards ceremony in 1998, the first major change of its rules since the admittance of television thirty years earlier. Among the first winning games were GoldenEye 007, Gran Turismo and the interactive comedy MindGym, sharing the spotlight with the BBC News Online website, which won the news category four years consecutively. These awards allowed the academy to recognise new forms of entertainment that were engaging new audiences and challenging traditional expressions of creativity.
During 2003, the sheer ubiquity of interactive forms of entertainment and the breadth of genres and types of video games outgrew the combined ceremony, and the event was divided into the BAFTA Video Games Awards and the BAFTA Interactive Awards. Despite making headlines with high-profile winners like Halo 2 and Half-Life 2, the interactive division was discontinued and disappeared from BAFTA's publicity material after only two ceremonies. During 2006, BAFTA announced its decision "to give video games equal status with film and television", and the academy now advertises video games as its third major topic in recognition of their importance as an art form of moving images. The same year the ceremony was held at The Roundhouse by Chalk Farm Road in North London on 5 October and was televised for the first time on 17 October, broadcast on the digital channel E4. Between 2009 and 2019, the ceremonies were held at the London Hilton Park Lane and Tobacco Dock, and were hosted by Dara Ó Briain and Rufus Hound. In 2020, as a result of the COVID-19 pandemic, it was announced that the ceremony was changing format from a live red-carpet ceremony at the Queen Elizabeth Hall in London to an online show. The online show was presented by Dara Ó Briain from his home and was watched by 720,000 viewers globally. In 2021 the 17th British Academy Games Awards was hosted by arts and entertainment presenter Elle Osili-Wood and was watched by a global audience of 1.5 million. The British Academy Children's Awards are presented annually during November to reward excellence in the art forms of the moving image intended for children. They have been awarded annually since 1969, except for 2020 and 2021 due to the COVID-19 pandemic. The academy has a history of recognising and rewarding children's programming, presenting two awards at the 1969 ceremony – the Flame of Knowledge Award for Schools Programmes and the Harlequin Award for Children's Programmes. As of 2010[update] the awards ceremony includes 19 categories across film, television, video games and online content. Since 2007 the Children's Awards have included a Kids Vote award, voted for by children aged between seven and 14. The CBBC Me and My Movie award, a children's filmmaking initiative to inspire and enable children to make their own movies and tell their own stories, has been discontinued. BAFTA also hosts the annual BAFTA Student Film Awards as a showcase for rising industry talent. The animation award was sponsored in 2017 and 2018 by the animation studio Laika. Royal connections William, Prince of Wales has been the President of the Academy since February 2010. The Prince's appointment follows a long tradition of royal involvement with the academy. Prince Philip, Duke of Edinburgh, was the first president of the Society of Film and Television Arts (SFTA), from 1959 to 1965, followed by Earl Mountbatten of Burma and the Princess Royal, who was its president from 1972 to 2001. It was the Queen and the Duke of Edinburgh's generous donation of their share of profits from the film Royal Family that enabled the academy to move to its headquarters at 195 Piccadilly. The Prince of Wales succeeded Lord Attenborough to become the fifth President in the Academy's history. BAFTA Los Angeles BAFTA North America, founded in 1987 and currently chaired by Joyce Pierpoline, serves as the bridge between the Hollywood and British production and entertainment business communities.
The BAFTA Los Angeles location hosts a series of events, including the Britannia Awards, the Awards Season Film and Television Tea Parties in January and September, and the annual Garden Party. BAFTA Los Angeles provides access to screenings, Q&As with creative talent, produces seminars with UK film and television executives and the Heritage Archive, featuring interviews with British members of the film and television industries. The Los Angeles location also hosts the Student Film Awards and has an active Scholarship Program offering financial support and mentorship to UK students studying in the US. It created The Inner City Cinema, a screening program providing free screenings of theatrical films to inner-city areas not served by theatres. The success of Inner City Cinema has led to further free screening programs extended to multiple inner-city parks through the academy's work with both the County of Los Angeles Department of Parks and Recreation (Parks After Dark) and The City of Los Angeles Department of Recreation and Parks (Teen Summer Camps). The Britannia Awards are BAFTA Los Angeles' highest accolade, a "celebration of achievements honouring individuals and companies that have dedicated their careers to advancing the entertainment arts". The Awards began in 1989 and usually take place in October/November every year. There are no awards given to specific movies or TV programmes, only to individuals. During the first ten years, one award was given at each event, named the 'Britannia Award for Excellence in Film', but since 1999 the number of awards has increased. Awards given include "The Stanley Kubrick Britannia Award for Excellence in Film" (the original award was renamed during 2000 to honour director Stanley Kubrick), presented to an individual "upon whose work is stamped the indelible mark of authorship and commitment, and who has lifted the craft to new heights"; "The John Schlesinger Britannia Award for Artistic Excellence in Directing" (added during 2003 in honour of John Schlesinger); the "Britannia Award for British Artist of the Year"; and the "Albert R. Broccoli Britannia Award for Worldwide Contribution to Filmed Entertainment". In select years, the evening has included the "BAFTA Los Angeles Humanitarian Award". The show has been broadcast on TV around the world, including the TV Guide Network and BBC America in the United States. BAFTA Scotland BAFTA Scotland is a branch of the academy located in Glasgow, Scotland. Since 1986, BAFTA has continued to specifically champion the film, television and game industries in Scotland by celebrating excellence, championing new Scottish talent and reaching out to the public. The British Academy Scotland Awards are BAFTA Scotland's annual awards ceremony, celebrating and rewarding the highest achievements in Scottish film, television and games. BAFTA Scotland also produces the annual New Talent Awards ceremony, shining a spotlight on new and emerging Scottish talent in the art forms of moving image. Since they began in 1996, the annual New Talent Awards highlight the creativity that exists in Scotland by recognising and rewarding talented individuals who have started to work in the film, television and games industries. BAFTA Wales BAFTA Cymru is a branch of the academy located in Cardiff, Wales. Formed in 1991, BAFTA Cymru extends the charity's mission across the UK in support of Wales' creative communities. 
For over 25 years BAFTA Cymru has celebrated Welsh talent across film and television production, craft and performance roles with the British Academy Cymru Awards, its annual awards ceremony that takes place in Cardiff. In 2016, having reviewed the eligibility criteria for the previous year's awards, BAFTA Cymru began encouraging Welsh individuals who have worked on Welsh or UK productions – rather than solely Welsh productions – to enter any one of the 16 craft and performance categories, "[ensuring] that BAFTA in Wales can recognise the work of talented individuals who are working on network productions in craft or performance roles across the UK." BAFTA New York BAFTA New York, founded in 1996, recognises and promotes the achievements of British film and television in New York and along the East Coast of the US. It hosts feature film, television and documentary screenings, panel discussions, premieres and co-produced events with other established organisations in the film and television industry, and runs an educational outreach program aimed at underserved youth in New York City that includes the BAFTA New York Media Studies Scholarship Program. Controversy On 26 October 2017, Norwegian actress Natassia Malthe accused Harvey Weinstein of raping her in a London hotel after the 2008 BAFTA Awards. On 2 February 2018, BAFTA formally terminated Weinstein's membership. On 29 March 2021, Noel Clarke received the BAFTA Outstanding British Contribution to Cinema Award. This was suspended on 29 April 2021 following the publication by The Guardian of multiple detailed accounts of sexual misconduct. Clarke has denied all allegations except one, saying: "I vehemently deny any sexual misconduct or criminal wrongdoing." The allegations were made by twenty different women. BAFTA was informed about the existence of allegations of verbal abuse, bullying and sexual harassment against Clarke thirteen days before presenting Clarke with his award. BAFTA acknowledges that it received anonymous emails and reports of allegations via intermediaries but, as it was not given evidence that would allow it to investigate, was unable to take action. In March 2022, the Metropolitan Police stated that Clarke would not face a criminal investigation due to a lack of evidence of any crimes having been committed.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Computer#cite_note-Puers-73] | [TOKENS: 10628]
Contents Computer A computer is a machine that can be programmed to automatically carry out sequences of arithmetic or logical operations (computation). Modern digital electronic computers can perform generic sets of operations known as programs, which enable computers to perform a wide range of tasks. The term computer system may refer to a nominally complete computer that includes the hardware, operating system, software, and peripheral equipment needed and used for full operation, or to a group of computers that are linked and function together, such as a computer network or computer cluster. A broad range of industrial and consumer products use computers as control systems, including simple special-purpose devices like microwave ovens and remote controls, and factory devices like industrial robots. Computers are at the core of general-purpose devices such as personal computers and mobile devices such as smartphones. Computers power the Internet, which links billions of computers and users. Early computers were meant to be used only for calculations. Simple manual instruments like the abacus have aided people in doing calculations since ancient times. Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century. The first digital electronic calculating machines were developed during World War II, both electromechanical and using thermionic valves. The first semiconductor transistors in the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and monolithic integrated circuit chip technologies in the late 1950s, leading to the microprocessor and the microcomputer revolution in the 1970s. The speed, power, and versatility of computers have been increasing dramatically ever since then, with transistor counts increasing at a rapid pace (Moore's law noted that counts doubled every two years), leading to the Digital Revolution during the late 20th and early 21st centuries. Conventionally, a modern computer consists of at least one processing element, typically a central processing unit (CPU) in the form of a microprocessor, together with some type of computer memory, typically semiconductor memory chips. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joysticks, etc.), output devices (monitors, printers, etc.), and input/output devices that perform both functions (e.g. touchscreens). Peripheral devices allow information to be retrieved from an external source, and they enable the results of operations to be saved and retrieved. Etymology It was not until the mid-20th century that the word acquired its modern definition; according to the Oxford English Dictionary, the first known use of the word computer was in a different sense, in a 1613 book called The Yong Mans Gleanings by the English writer Richard Brathwait: "I haue [sic] read the truest computer of Times, and the best Arithmetician that euer [sic] breathed, and he reduceth thy dayes into a short number." This usage of the term referred to a human computer, a person who carried out calculations or computations. The word continued to have the same meaning until the middle of the 20th century. 
During the latter part of this period, women were often hired as computers because they could be paid less than their male counterparts. By 1943, most human computers were women. The Online Etymology Dictionary gives the first attested use of computer in the 1640s, meaning 'one who calculates'; this is an "agent noun from compute (v.)". The Online Etymology Dictionary states that the use of the term to mean "'calculating machine' (of any type) is from 1897." The Online Etymology Dictionary indicates that the "modern use" of the term, to mean 'programmable digital electronic computer' dates from "1945 under this name; [in a] theoretical [sense] from 1937, as Turing machine". The name has remained, although modern computers are capable of many higher-level functions. History Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was most likely a form of tally stick. Later record keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented counts of items, likely livestock or grains, sealed in hollow unbaked clay containers.[a] The use of counting rods is one example. The abacus was initially used for arithmetic tasks. The Roman abacus was developed from devices used in Babylonia as early as 2400 BCE. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money. The Antikythera mechanism is believed to be the earliest known mechanical analog computer, according to Derek J. de Solla Price. It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to approximately c. 100 BCE. Devices of comparable complexity to the Antikythera mechanism would not reappear until the fourteenth century. Many mechanical aids to calculation and measurement were constructed for astronomical and navigation use. The planisphere was a star chart invented by Abū Rayhān al-Bīrūnī in the early 11th century. The astrolabe was invented in the Hellenistic world in either the 1st or 2nd centuries BCE and is often attributed to Hipparchus. A combination of the planisphere and dioptra, the astrolabe was effectively an analog computer capable of working out several different kinds of problems in spherical astronomy. An astrolabe incorporating a mechanical calendar computer and gear-wheels was invented by Abi Bakr of Isfahan, Persia in 1235. Abū Rayhān al-Bīrūnī invented the first mechanical geared lunisolar calendar astrolabe, an early fixed-wired knowledge processing machine with a gear train and gear-wheels, c. 1000 AD. The sector, a calculating instrument used for solving problems in proportion, trigonometry, multiplication and division, and for various functions, such as squares and cube roots, was developed in the late 16th century and found application in gunnery, surveying and navigation. The planimeter was a manual instrument to calculate the area of a closed figure by tracing over it with a mechanical linkage. The slide rule was invented around 1620–1630, by the English clergyman William Oughtred, shortly after the publication of the concept of the logarithm. It is a hand-operated analog computer for doing multiplication and division. 
As slide rule development progressed, added scales provided reciprocals, squares and square roots, cubes and cube roots, as well as transcendental functions such as logarithms and exponentials, circular and hyperbolic trigonometry and other functions. Slide rules with special scales are still used for quick performance of routine calculations, such as the E6B circular slide rule used for time and distance calculations on light aircraft. In the 1770s, Pierre Jaquet-Droz, a Swiss watchmaker, built a mechanical doll (automaton) that could write holding a quill pen. By switching the number and order of its internal wheels, different letters, and hence different messages, could be produced. In effect, it could be mechanically "programmed" to read instructions. Along with two other complex machines, the doll is at the Musée d'Art et d'Histoire of Neuchâtel, Switzerland, and still operates. In 1831–1835, mathematician and engineer Giovanni Plana devised a Perpetual Calendar machine, which through a system of pulleys and cylinders could predict the perpetual calendar for every year from 0 CE (that is, 1 BCE) to 4000 CE, keeping track of leap years and varying day length. The tide-predicting machine invented by the Scottish scientist Sir William Thomson in 1872 was of great utility to navigation in shallow waters. It used a system of pulleys and wires to automatically calculate predicted tide levels for a set period at a particular location. The differential analyser, a mechanical analog computer designed to solve differential equations by integration, used wheel-and-disc mechanisms to perform the integration. In 1876, Sir William Thomson had already discussed the possible construction of such calculators, but he had been stymied by the limited output torque of the ball-and-disk integrators. In a differential analyzer, the output of one integrator drove the input of the next integrator, or a graphing output. The torque amplifier was the advance that allowed these machines to work. Starting in the 1920s, Vannevar Bush and others developed mechanical differential analyzers. In the 1890s, the Spanish engineer Leonardo Torres Quevedo began to develop a series of advanced analog machines that could solve real and complex roots of polynomials, which were published in 1901 by the Paris Academy of Sciences. Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer. Considered the "father of the computer", he conceptualized and invented the first mechanical computer in the early 19th century. After working on his difference engine, designed to aid in navigational calculations, he announced his invention in 1822 in a paper to the Royal Astronomical Society titled "Note on the application of machinery to the computation of astronomical and mathematical tables". In 1833 he realized that a much more general design, an analytical engine, was possible. The input of programs and data was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. The engine would incorporate an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.
The machine was about a century ahead of its time. All the parts for his machine had to be made by hand – this was a major problem for a device with thousands of parts. Eventually, the project was dissolved with the decision of the British Government to cease funding. Babbage's failure to complete the analytical engine can be chiefly attributed to political and financial difficulties as well as his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow. Nevertheless, his son, Henry Babbage, completed a simplified version of the analytical engine's computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in 1906. In his work Essays on Automatics, published in 1914, Leonardo Torres Quevedo wrote a brief history of Babbage's efforts at constructing a mechanical Difference Engine and Analytical Engine. The paper contains a design of a machine capable of calculating formulas like a^x(y − z)^2 for a sequence of sets of values. The whole machine was to be controlled by a read-only program, which was complete with provisions for conditional branching. He also introduced the idea of floating-point arithmetic. In 1920, to celebrate the 100th anniversary of the invention of the arithmometer, Torres presented in Paris the Electromechanical Arithmometer, which allowed a user to input arithmetic problems through a keyboard, and computed and printed the results, demonstrating the feasibility of an electromechanical analytical engine. During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers. The first modern analog computer was a tide-predicting machine, invented by Sir William Thomson (later to become Lord Kelvin) in 1872. The differential analyser, a mechanical analog computer designed to solve differential equations by integration using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson, the elder brother of the more famous Sir William Thomson. The art of mechanical analog computing reached its zenith with the differential analyzer, completed in 1931 by Vannevar Bush at MIT. By the 1950s, the success of digital electronic computers had spelled the end for most analog computing machines, but analog computers remained in use during the 1950s in some specialized applications such as education (slide rule) and aircraft (control systems).[citation needed] Claude Shannon's 1937 master's thesis laid the foundations of digital computing, with his insight of applying Boolean algebra to the analysis and synthesis of switching circuits being the basic concept which underlies all electronic digital computers. By 1938, the United States Navy had developed the Torpedo Data Computer, an electromechanical analog computer for submarines that used trigonometry to solve the problem of firing a torpedo at a moving target. During World War II, similar devices were developed in other countries. Early digital computers were electromechanical; electric switches drove mechanical relays to perform the calculation. These devices had a low operating speed and were eventually superseded by much faster all-electric computers, originally using vacuum tubes.
The Z2, created by German engineer Konrad Zuse in 1939 in Berlin, was one of the earliest examples of an electromechanical relay computer. In 1941, Zuse followed his earlier machine up with the Z3, the world's first working electromechanical programmable, fully automatic digital computer. The Z3 was built with 2000 relays, implementing a 22-bit word length that operated at a clock frequency of about 5–10 Hz. Program code was supplied on punched film while data could be stored in 64 words of memory or supplied from the keyboard. It was quite similar to modern machines in some respects, pioneering numerous advances such as floating-point numbers. Rather than the harder-to-implement decimal system (used in Charles Babbage's earlier design), using a binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time. The Z3 was not itself a universal computer but could be extended to be Turing complete. Zuse's next computer, the Z4, became the world's first commercial computer; after initial delay due to the Second World War, it was completed in 1950 and delivered to the ETH Zurich. The computer was manufactured by Zuse's own company, Zuse KG, which was founded in 1941 as the first company with the sole purpose of developing computers in Berlin. The Z4 served as the inspiration for the construction of the ERMETH, the first Swiss computer and one of the first in Europe. Purely electronic circuit elements soon replaced their mechanical and electromechanical equivalents, at the same time that digital calculation replaced analog. The engineer Tommy Flowers, working at the Post Office Research Station in London in the 1930s, began to explore the possible use of electronics for the telephone exchange. Experimental equipment that he built in 1934 went into operation five years later, converting a portion of the telephone exchange network into an electronic data processing system, using thousands of vacuum tubes. In the US, John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed and tested the Atanasoff–Berry Computer (ABC) in 1942, the first "automatic electronic digital computer". This design was also all-electronic and used about 300 vacuum tubes, with capacitors fixed in a mechanically rotating drum for memory. During World War II, the British code-breakers at Bletchley Park achieved a number of successes at breaking encrypted German military communications. The German encryption machine, Enigma, was first attacked with the help of the electro-mechanical bombes which were often run by women. To crack the more sophisticated German Lorenz SZ 40/42 machine, used for high-level Army communications, Max Newman and his colleagues commissioned Flowers to build the Colossus. He spent eleven months from early February 1943 designing and building the first Colossus. After a functional test in December 1943, Colossus was shipped to Bletchley Park, where it was delivered on 18 January 1944 and attacked its first message on 5 February. Colossus was the world's first electronic digital programmable computer. It used a large number of valves (vacuum tubes). It had paper-tape input and was capable of being configured to perform a variety of boolean logical operations on its data, but it was not Turing-complete. Nine Mk II Colossi were built (The Mk I was converted to a Mk II making ten machines in total). 
Colossus Mark I contained 1,500 thermionic valves (tubes), but Mark II, with 2,400 valves, was both five times faster and simpler to operate than Mark I, greatly speeding the decoding process. The ENIAC (Electronic Numerical Integrator and Computer) was the first electronic programmable computer built in the U.S. Although the ENIAC was similar to the Colossus, it was much faster, more flexible, and it was Turing-complete. Like the Colossus, a "program" on the ENIAC was defined by the states of its patch cables and switches, a far cry from the stored-program electronic machines that came later. Once a program was written, it had to be mechanically set into the machine with manual resetting of plugs and switches. The programmers of the ENIAC were six women, often known collectively as the "ENIAC girls". It combined the high speed of electronics with the ability to be programmed for many complex problems. It could add or subtract 5000 times a second, a thousand times faster than any other machine. It also had modules to multiply, divide, and square root. High speed memory was limited to 20 words (about 80 bytes). Built under the direction of John Mauchly and J. Presper Eckert at the University of Pennsylvania, ENIAC's development and construction lasted from 1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons, using 200 kilowatts of electric power and contained over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors. The principle of the modern computer was proposed by Alan Turing in his seminal 1936 paper, On Computable Numbers. Turing proposed a simple device that he called "Universal Computing machine" and that is now known as a universal Turing machine. He proved that such a machine is capable of computing anything that is computable by executing instructions (program) stored on tape, allowing the machine to be programmable. The fundamental concept of Turing's design is the stored program, where all the instructions for computing are stored in memory. Von Neumann acknowledged that the central concept of the modern computer was due to this paper. Turing machines are to this day a central object of study in theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine. Early computing machines had fixed programs. Changing their function required the re-wiring and re-structuring of the machine. With the proposal of the stored-program computer, this changed. A stored-program computer includes by design an instruction set and can store in memory a set of instructions (a program) that details the computation. The theoretical basis for the stored-program computer was laid out by Alan Turing in his 1936 paper. In 1945, Turing joined the National Physical Laboratory and began work on developing an electronic stored-program digital computer. His 1945 report "Proposed Electronic Calculator" was the first specification for such a device. John von Neumann at the University of Pennsylvania also circulated his First Draft of a Report on the EDVAC in 1945. The Manchester Baby was the world's first stored-program computer. It was built at the University of Manchester in England by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948.
It was designed as a testbed for the Williams tube, the first random-access digital storage device. Although the computer was described as "small and primitive" by a 1998 retrospective, it was the first working machine to contain all of the elements essential to a modern electronic computer. As soon as the Baby had demonstrated the feasibility of its design, a project began at the university to develop it into a practically useful computer, the Manchester Mark 1. The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the world's first commercially available general-purpose computer. Built by Ferranti, it was delivered to the University of Manchester in February 1951. At least seven of these later machines were delivered between 1953 and 1957, one of them to Shell labs in Amsterdam. In October 1947 the directors of British catering company J. Lyons & Company decided to take an active role in promoting the commercial development of computers. Lyons's LEO I computer, modelled closely on the Cambridge EDSAC of 1949, became operational in April 1951 and ran the world's first routine office computer job. The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John Bardeen and Walter Brattain, while working under William Shockley at Bell Labs, built the first working transistor, the point-contact transistor, in 1947, which was followed by Shockley's bipolar junction transistor in 1948. From 1955 onwards, transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Compared to vacuum tubes, transistors have many advantages: they are smaller, and require less power than vacuum tubes, so give off less heat. Junction transistors were much more reliable than vacuum tubes and had longer, indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialized applications. At the University of Manchester, a team under the leadership of Tom Kilburn designed and built a machine using the newly developed transistors instead of valves. Their first transistorized computer and the first in the world, was operational by 1953, and a second version was completed there in April 1955. However, the machine did make use of valves to generate its 125 kHz clock waveforms and in the circuitry to read and write on its magnetic drum memory, so it was not the first completely transistorized computer. That distinction goes to the Harwell CADET of 1955, built by the electronics division of the Atomic Energy Research Establishment at Harwell. The metal–oxide–silicon field-effect transistor (MOSFET), also known as the MOS transistor, was invented at Bell Labs between 1955 and 1960 and was the first truly compact transistor that could be miniaturized and mass-produced for a wide range of uses. With its high scalability, and much lower power consumption and higher density than bipolar junction transistors, the MOSFET made it possible to build high-density integrated circuits. In addition to data processing, it also enabled the practical use of MOS transistors as memory cell storage elements, leading to the development of MOS semiconductor memory, which replaced earlier magnetic-core memory in computers. 
The MOSFET led to the microcomputer revolution, and became the driving force behind the computer revolution. The MOSFET is the most widely used transistor in computers, and is the fundamental building block of digital electronics. The next great advance in computing power came with the advent of the integrated circuit (IC). The idea of the integrated circuit was first conceived by a radar scientist working for the Royal Radar Establishment of the Ministry of Defence, Geoffrey W.A. Dummer. Dummer presented the first public description of an integrated circuit at the Symposium on Progress in Quality Electronic Components in Washington, D.C., on 7 May 1952. The first working ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor. Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958. In his patent application of 6 February 1959, Kilby described his new device as "a body of semiconductor material ... wherein all the components of the electronic circuit are completely integrated". However, Kilby's invention was a hybrid integrated circuit (hybrid IC), rather than a monolithic integrated circuit (IC) chip. Kilby's IC had external wire connections, which made it difficult to mass-produce. Noyce also came up with his own idea of an integrated circuit half a year later than Kilby. Noyce's invention was the first true monolithic IC chip. His chip solved many practical problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium. Noyce's monolithic IC was fabricated using the planar process, developed by his colleague Jean Hoerni in early 1959. In turn, the planar process was based on Carl Frosch and Lincoln Derick work on semiconductor surface passivation by silicon dioxide. Modern monolithic ICs are predominantly MOS (metal–oxide–semiconductor) integrated circuits, built from MOSFETs (MOS transistors). The earliest experimental MOS IC to be fabricated was a 16-transistor chip built by Fred Heiman and Steven Hofstein at RCA in 1962. General Microelectronics later introduced the first commercial MOS IC in 1964, developed by Robert Norman. Following the development of the self-aligned gate (silicon-gate) MOS transistor by Robert Kerwin, Donald Klein and John Sarace at Bell Labs in 1967, the first silicon-gate MOS IC with self-aligned gates was developed by Federico Faggin at Fairchild Semiconductor in 1968. The MOSFET has since become the most critical device component in modern ICs. The development of the MOS integrated circuit led to the invention of the microprocessor, and heralded an explosion in the commercial and personal use of computers. While the subject of exactly which device was the first microprocessor is contentious, partly due to lack of agreement on the exact definition of the term "microprocessor", it is largely undisputed that the first single-chip microprocessor was the Intel 4004, designed and realized by Federico Faggin with his silicon-gate MOS IC technology, along with Ted Hoff, Masatoshi Shima and Stanley Mazor at Intel.[b] In the early 1970s, MOS IC technology enabled the integration of more than 10,000 transistors on a single chip. System on a Chip (SoCs) are complete computers on a microchip (or chip) the size of a coin. They may or may not have integrated RAM and flash memory. 
If not integrated, the RAM is usually placed directly above (known as Package on package) or below (on the opposite side of the circuit board) the SoC, and the flash memory is usually placed right next to the SoC. This is done to improve data transfer speeds, as the data signals do not have to travel long distances. Since ENIAC in 1945, computers have advanced enormously, with modern SoCs (such as the Snapdragon 865) being the size of a coin while also being hundreds of thousands of times more powerful than ENIAC, integrating billions of transistors, and consuming only a few watts of power. The first mobile computers were heavy and ran from mains power. The 50 lb (23 kg) IBM 5100 was an early example. Later portables such as the Osborne 1 and Compaq Portable were considerably lighter but still needed to be plugged in. The first laptops, such as the Grid Compass, removed this requirement by incorporating batteries – and with the continued miniaturization of computing resources and advancements in portable battery life, portable computers grew in popularity in the 2000s. The same developments allowed manufacturers to integrate computing resources into cellular mobile phones by the early 2000s. These smartphones and tablets run on a variety of operating systems and recently became the dominant computing device on the market. These are powered by systems on a chip (SoCs), which are complete computers on a microchip the size of a coin. Types Computers can be classified in a number of different ways. A computer does not need to be electronic, nor even have a processor, nor RAM, nor even a hard disk. While popular usage of the word "computer" is synonymous with a personal electronic computer,[c] a typical modern definition of a computer is: "A device that computes, especially a programmable [usually] electronic machine that performs high-speed mathematical or logical operations or that assembles, stores, correlates, or otherwise processes information." According to this definition, any device that processes information qualifies as a computer. Hardware The term hardware covers all of those parts of a computer that are tangible physical objects. Circuits, computer chips, graphic cards, sound cards, memory (RAM), motherboard, displays, power supplies, cables, keyboards, printers and "mice" input devices are all hardware. A general-purpose computer has four main components: the arithmetic logic unit (ALU), the control unit, the memory, and the input and output devices (collectively termed I/O). These parts are interconnected by buses, often made of groups of wires. Inside each of these parts are thousands to trillions of small electrical circuits which can be turned off or on by means of an electronic switch. Each circuit represents a bit (binary digit) of information so that when the circuit is on it represents a "1", and when off it represents a "0" (in positive logic representation). The circuits are arranged in logic gates so that one or more of the circuits may control the state of one or more of the other circuits. Input devices are the means by which the operations of a computer are controlled and by which it is provided with data. Examples include the keyboard, the mouse, and the microphone. Output devices are the means by which a computer provides the results of its calculations in a human-accessible form.
Examples include the monitor, the printer, and speakers. The control unit (often called a control system or central controller) manages the computer's various components; it reads and interprets (decodes) the program instructions, transforming them into control signals that activate other parts of the computer.[e] Control systems in advanced computers may change the order of execution of some instructions to improve performance. A key component common to all CPUs is the program counter, a special memory cell (a register) that keeps track of which location in memory the next instruction is to be read from.[f] The control system's function is as follows (this is a simplified description, and some of these steps may be performed concurrently or in a different order depending on the type of CPU): it reads the instruction from the memory cell indicated by the program counter, decodes the instruction into control signals for the other units, fetches any data the instruction requires from memory, has the ALU or other hardware carry out the requested operation, writes the result back to a register or memory cell, advances the program counter, and repeats the cycle. Since the program counter is (conceptually) just another set of memory cells, it can be changed by calculations done in the ALU. Adding 100 to the program counter would cause the next instruction to be read from a place 100 locations further down the program. Instructions that modify the program counter are often known as "jumps" and allow for loops (instructions that are repeated by the computer) and often conditional instruction execution (both examples of control flow). The sequence of operations that the control unit goes through to process an instruction is in itself like a short computer program, and indeed, in some more complex CPU designs, there is another yet smaller computer called a microsequencer, which runs a microcode program that causes all of these events to happen. The control unit, ALU, and registers are collectively known as a central processing unit (CPU). Early CPUs were composed of many separate components. Since the 1970s, CPUs have typically been constructed on a single MOS integrated circuit chip called a microprocessor. The ALU is capable of performing two classes of operations: arithmetic and logic. The set of arithmetic operations that a particular ALU supports may be limited to addition and subtraction, or might include multiplication, division, trigonometric functions such as sine and cosine, and square roots. Some can operate only on whole numbers (integers) while others use floating point to represent real numbers, albeit with limited precision. However, any computer that is capable of performing just the simplest operations can be programmed to break down the more complex operations into simple steps that it can perform. Therefore, any computer can be programmed to perform any arithmetic operation—although it will take more time to do so if its ALU does not directly support the operation. An ALU may also compare numbers and return Boolean truth values (true or false) depending on whether one is equal to, greater than or less than the other ("is 64 greater than 65?"). Logic operations involve Boolean logic: AND, OR, XOR, and NOT. These can be useful for creating complicated conditional statements and processing Boolean logic. Superscalar computers may contain multiple ALUs, allowing them to process several instructions simultaneously. Graphics processors and computers with SIMD and MIMD features often contain ALUs that can perform arithmetic on vectors and matrices. A computer's memory can be viewed as a list of cells into which numbers can be placed or read. Each cell has a numbered "address" and can store a single number.
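To make the instruction cycle just described concrete, the following sketch (Python; the single-accumulator, five-instruction machine is invented purely for illustration and matches no historical design) treats memory as a flat list of numbered cells and shows that writing a new value into the program counter is all a "jump" amounts to.

    # A toy fetch-decode-execute loop. Memory is a flat list of numbered cells
    # holding both instructions and data; the program counter (pc) selects the
    # next instruction, and a jump simply overwrites the pc.

    def run(memory):
        pc = 0
        acc = 0                       # a single accumulator register
        while True:
            op, arg = memory[pc]      # fetch the instruction at the pc
            pc += 1                   # advance to the following cell by default
            if op == "LOAD":          # acc = contents of cell arg
                acc = memory[arg]
            elif op == "ADD":         # acc = acc + contents of cell arg
                acc += memory[arg]
            elif op == "STORE":       # cell arg = acc
                memory[arg] = acc
            elif op == "JUMP_IF_POS": # conditional branch: change the pc itself
                if acc > 0:
                    pc = arg
            elif op == "HALT":
                return acc

    program = [
        ("LOAD", 6),          # cell 0: acc = memory[6]  (value 5)
        ("ADD", 7),           # cell 1: acc += memory[7] (value -1), counting down
        ("STORE", 6),         # cell 2: write the running value back
        ("JUMP_IF_POS", 0),   # cell 3: loop while the value is still positive
        ("HALT", 0),          # cell 4: stop
        None,                 # cell 5: unused
        5,                    # cell 6: a data cell, initially 5
        -1,                   # cell 7: a data cell holding the decrement
    ]
    print(run(program))       # counts 5 down to 0, then halts with acc == 0

Every pass around the loop is driven by nothing more than the control loop re-reading whichever cell the program counter happens to point at.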
The computer can be instructed to "put the number 123 into the cell numbered 1357" or to "add the number that is in cell 1357 to the number that is in cell 2468 and put the answer into cell 1595." The information stored in memory may represent practically anything. Letters, numbers, even computer instructions can be placed into memory with equal ease. Since the CPU does not differentiate between different types of information, it is the software's responsibility to give significance to what the memory sees as nothing but a series of numbers. In almost all modern computers, each memory cell is set up to store binary numbers in groups of eight bits (called a byte). Each byte is able to represent 256 different numbers (2^8 = 256), either from 0 to 255 or −128 to +127. To store larger numbers, several consecutive bytes may be used (typically, two, four or eight). When negative numbers are required, they are usually stored in two's complement notation. Other arrangements are possible, but are usually not seen outside of specialized applications or historical contexts. A computer can store any kind of information in memory if it can be represented numerically. Modern computers have billions or even trillions of bytes of memory. The CPU contains a special set of memory cells called registers that can be read and written to much more rapidly than the main memory area. There are typically between two and one hundred registers depending on the type of CPU. Registers are used for the most frequently needed data items to avoid having to access main memory every time data is needed. As data is constantly being worked on, reducing the need to access main memory (which is often slow compared to the ALU and control units) greatly increases the computer's speed. Computer main memory comes in two principal varieties: random-access memory (RAM) and read-only memory (ROM). RAM can be read and written to anytime the CPU commands it, but ROM is preloaded with data and software that never changes, so the CPU can only read from it. ROM is typically used to store the computer's initial start-up instructions. In general, the contents of RAM are erased when the power to the computer is turned off, but ROM retains its data indefinitely. In a PC, the ROM contains a specialized program called the BIOS that orchestrates loading the computer's operating system from the hard disk drive into RAM whenever the computer is turned on or reset. In embedded computers, which frequently do not have disk drives, all of the required software may be stored in ROM. Software stored in ROM is often called firmware, because it is notionally more like hardware than software. Flash memory blurs the distinction between ROM and RAM, as it retains its data when turned off but is also rewritable. It is typically much slower than conventional ROM and RAM, however, so its use is restricted to applications where high speed is unnecessary.[g] In more sophisticated computers there may be one or more RAM cache memories, which are slower than registers but faster than main memory. Generally computers with this sort of cache are designed to move frequently needed data into the cache automatically, often without the need for any intervention on the programmer's part. I/O is the means by which a computer exchanges information with the outside world. Devices that provide input or output to the computer are called peripherals. On a typical personal computer, peripherals include input devices like the keyboard and mouse, and output devices such as the display and printer.
Hard disk drives, floppy disk drives and optical disc drives serve as both input and output devices. Computer networking is another form of I/O. I/O devices are often complex computers in their own right, with their own CPU and memory. A graphics processing unit might contain fifty or more tiny computers that perform the calculations necessary to display 3D graphics.[citation needed] Modern desktop computers contain many smaller computers that assist the main CPU in performing I/O. A 2016-era flat screen display contains its own computer circuitry. While a computer may be viewed as running one gigantic program stored in its main memory, in some systems it is necessary to give the appearance of running several programs simultaneously. This is achieved by multitasking, i.e. having the computer switch rapidly between running each program in turn. One means by which this is done is with a special signal called an interrupt, which can periodically cause the computer to stop executing instructions where it was and do something else instead. By remembering where it was executing prior to the interrupt, the computer can return to that task later. If several programs are running "at the same time", then the interrupt generator might be causing several hundred interrupts per second, causing a program switch each time. Since modern computers typically execute instructions several orders of magnitude faster than human perception, it may appear that many programs are running at the same time, even though only one is ever executing in any given instant. This method of multitasking is sometimes termed "time-sharing" since each program is allocated a "slice" of time in turn. Before the era of inexpensive computers, the principal use for multitasking was to allow many people to share the same computer. Seemingly, multitasking would cause a computer that is switching between several programs to run more slowly, in direct proportion to the number of programs it is running, but most programs spend much of their time waiting for slow input/output devices to complete their tasks. If a program is waiting for the user to click on the mouse or press a key on the keyboard, then it will not take a "time slice" until the event it is waiting for has occurred. This frees up time for other programs to execute so that many programs may be run simultaneously without unacceptable speed loss. Some computers are designed to distribute their work across several CPUs in a multiprocessing configuration, a technique once employed in only large and powerful machines such as supercomputers, mainframe computers and servers. Multiprocessor and multi-core (multiple CPUs on a single integrated circuit) personal and laptop computers are now widely available, and are being increasingly used in lower-end markets as a result. Supercomputers in particular often have unique architectures that differ significantly from the basic stored-program architecture and from general-purpose computers.[h] They often feature thousands of CPUs, customized high-speed interconnects, and specialized computing hardware. Such designs tend to be useful for only specialized tasks due to the large scale of program organization required to use most of the available resources at once. Supercomputers usually see usage in large-scale simulation, graphics rendering, and cryptography applications, as well as with other so-called "embarrassingly parallel" tasks.
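A very loose sketch of the time-slicing idea described above (Python generators stand in for programs and a plain loop stands in for the interrupt-driven scheduler; real operating systems are far more elaborate):

    # Cooperative "time-sharing": each task runs for one slice, then control
    # switches to the next. Finished tasks are simply dropped from the queue.

    def counter(name, limit):
        for i in range(limit):
            print(f"{name}: step {i}")
            yield                      # end of this task's time slice

    def run_round_robin(tasks):
        while tasks:
            task = tasks.pop(0)        # take the next task in line
            try:
                next(task)             # let it run for one slice
                tasks.append(task)     # not finished: back of the queue
            except StopIteration:
                pass                   # finished tasks are dropped

    run_round_robin([counter("A", 3), counter("B", 2)])
    # Interleaves A and B, giving the appearance that both run "at the same time".

Each task runs for one slice and then goes to the back of the queue, which is enough to interleave their output even though only one of them is ever executing at any instant.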
Software Software is the part of a computer system that consists of the encoded information that determines the computer's operation, such as data or instructions on how to process the data. In contrast to the physical hardware from which the system is built, software is immaterial. Software includes computer programs, libraries and related non-executable data, such as online documentation or digital media. It is often divided into system software and application software. Computer hardware and software require each other and neither is useful on its own. When software is stored in hardware that cannot easily be modified, such as with BIOS ROM in an IBM PC compatible computer, it is sometimes called "firmware". The defining feature of modern computers which distinguishes them from all other machines is that they can be programmed. That is to say that some type of instructions (the program) can be given to the computer, and it will process them. Modern computers based on the von Neumann architecture often have machine code in the form of an imperative programming language. In practical terms, a computer program may be just a few instructions or extend to many millions of instructions, as do the programs for word processors and web browsers for example. A typical modern computer can execute billions of instructions per second (gigaflops) and rarely makes a mistake over many years of operation. Large computer programs consisting of several million instructions may take teams of programmers years to write, and due to the complexity of the task almost certainly contain errors. This section applies to most common RAM machine–based computers. In most cases, computer instructions are simple: add one number to another, move some data from one location to another, send a message to some external device, etc. These instructions are read from the computer's memory and are generally carried out (executed) in the order they were given. However, there are usually specialized instructions to tell the computer to jump ahead or backwards to some other place in the program and to carry on executing from there. These are called "jump" instructions (or branches). Furthermore, jump instructions may be made to happen conditionally so that different sequences of instructions may be used depending on the result of some previous calculation or some external event. Many computers directly support subroutines by providing a type of jump that "remembers" the location it jumped from and another instruction to return to the instruction following that jump instruction. Program execution might be likened to reading a book. While a person will normally read each word and line in sequence, they may at times jump back to an earlier place in the text or skip sections that are not of interest. Similarly, a computer may sometimes go back and repeat the instructions in some section of the program over and over again until some internal condition is met. This is called the flow of control within the program and it is what allows the computer to perform tasks repeatedly without human intervention. Comparatively, a person using a pocket calculator can perform a basic arithmetic operation such as adding two numbers with just a few button presses. But to add together all of the numbers from 1 to 1,000 would take thousands of button presses and a lot of time, with a near certainty of making a mistake. On the other hand, a computer may be programmed to do this with just a few simple instructions. 
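A minimal sketch of such a routine, summing the integers from 1 to 1,000 (the particular register names and labels are illustrative choices, not a canonical listing):

            addi $t0, $zero, 0       # running sum = 0
            addi $t1, $zero, 1       # current number = 1
    loop:   slti $t2, $t1, 1001      # $t2 = 1 while the number is still <= 1000
            beq  $t2, $zero, done    # once it passes 1000, leave the loop
            add  $t0, $t0, $t1       # add the current number to the sum
            addi $t1, $t1, 1         # move on to the next number
            j    loop                # repeat
    done:   add  $v0, $t0, $zero     # leave the result (500500) in $v0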
The example above is written in the MIPS assembly language. Once told to run this program, the computer will perform the repetitive addition task without further human intervention. It will almost never make a mistake and a modern PC can complete the task in a fraction of a second. In most computers, individual instructions are stored as machine code with each instruction being given a unique number (its operation code or opcode for short). The command to add two numbers together would have one opcode; the command to multiply them would have a different opcode, and so on. The simplest computers are able to perform any of a handful of different instructions; the more complex computers have several hundred to choose from, each with a unique numerical code. Since the computer's memory is able to store numbers, it can also store the instruction codes. This leads to the important fact that entire programs (which are just lists of these instructions) can be represented as lists of numbers and can themselves be manipulated inside the computer in the same way as numeric data. The fundamental concept of storing programs in the computer's memory alongside the data they operate on is the crux of the von Neumann, or stored program, architecture. In some cases, a computer might store some or all of its program in memory that is kept separate from the data it operates on. This is called the Harvard architecture after the Harvard Mark I computer. Modern von Neumann computers display some traits of the Harvard architecture in their designs, such as in CPU caches. While it is possible to write computer programs as long lists of numbers (machine language) and while this technique was used with many early computers,[i] it is extremely tedious and potentially error-prone to do so in practice, especially for complicated programs. Instead, each basic instruction can be given a short name that is indicative of its function and easy to remember – a mnemonic such as ADD, SUB, MULT or JUMP. These mnemonics are collectively known as a computer's assembly language. Converting programs written in assembly language into something the computer can actually understand (machine language) is usually done by a computer program called an assembler. A programming language is a notation system for writing the source code from which a computer program is produced. Programming languages provide various ways of specifying programs for computers to run. Unlike natural languages, programming languages are designed to permit no ambiguity and to be concise. They are purely written languages and are often difficult to read aloud. They are generally either translated into machine code by a compiler or an assembler before being run, or translated directly at run time by an interpreter. Sometimes programs are executed by a hybrid method of the two techniques. There are thousands of programming languages—some intended for general purpose programming, others useful for only highly specialized applications. Machine languages and the assembly languages that represent them (collectively termed low-level programming languages) are generally unique to the particular architecture of a computer's central processing unit (CPU).
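As a toy illustration of what an assembler does, the sketch below (Python; the mnemonics and opcode numbers are invented for this example and belong to no real CPU) translates a few mnemonic instructions into the numeric codes a machine would actually store:

    # A toy assembler: translate mnemonic instructions into numeric machine code.
    # The opcode numbers here are invented; every real CPU family defines its own.

    OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "JUMP": 0x04, "HALT": 0xFF}

    def assemble(source):
        machine_code = []
        for line in source.strip().splitlines():
            mnemonic, *operand = line.split()
            machine_code.append(OPCODES[mnemonic])                   # the operation code
            machine_code.append(int(operand[0]) if operand else 0)   # its argument
        return machine_code

    program = """
    LOAD 6
    ADD 7
    STORE 6
    HALT
    """
    print(assemble(program))   # [1, 6, 2, 7, 3, 6, 255, 0]

Because every CPU family defines its own opcode numbering, the same mnemonic program would assemble to different machine code on different architectures.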
For instance, an ARM architecture CPU (such as may be found in a smartphone or a hand-held videogame) cannot understand the machine language of an x86 CPU that might be in a PC.[j] Historically, a significant number of other CPU architectures were created and saw extensive use, notably including the MOS Technology 6502 and 6510 in addition to the Zilog Z80. Although considerably easier than in machine language, writing long programs in assembly language is often difficult and is also error-prone. Therefore, most practical programs are written in more abstract high-level programming languages that are able to express the needs of the programmer more conveniently (and thereby help reduce programmer error). High level languages are usually "compiled" into machine language (or sometimes into assembly language and then into machine language) using another computer program called a compiler.[k] High level languages are less related to the workings of the target computer than assembly language, and more related to the language and structure of the problem(s) to be solved by the final program. It is therefore often possible to use different compilers to translate the same high level language program into the machine language of many different types of computer. This is part of the means by which software like video games may be made available for different computer architectures such as personal computers and various video game consoles. Program design of small programs is relatively simple and involves the analysis of the problem, collection of inputs, using the programming constructs within languages, devising or using established procedures and algorithms, providing data for output devices and solutions to the problem as applicable. As problems become larger and more complex, features such as subprograms, modules, formal documentation, and new paradigms such as object-oriented programming are encountered. Large programs involving thousands of lines of code and more require formal software methodologies. The task of developing large software systems presents a significant intellectual challenge. Producing software with an acceptably high reliability within a predictable schedule and budget has historically been difficult; the academic and professional discipline of software engineering concentrates specifically on this challenge. Errors in computer programs are called "bugs". They may be benign and not affect the usefulness of the program, or have only subtle effects. However, in some cases they may cause the program or the entire system to "hang", becoming unresponsive to input such as mouse clicks or keystrokes, to completely fail, or to crash. Otherwise benign bugs may sometimes be harnessed for malicious intent by an unscrupulous user writing an exploit, code designed to take advantage of a bug and disrupt a computer's proper execution. Bugs are usually not the fault of the computer. Since computers merely execute the instructions they are given, bugs are nearly always the result of programmer error or an oversight made in the program's design.[l] Admiral Grace Hopper, an American computer scientist and developer of the first compiler, is credited with having first used the term "bugs" in computing after a dead moth was found shorting a relay in the Harvard Mark II computer in September 1947. Networking and the Internet Computers have been used to coordinate information between multiple physical locations since the 1950s. The U.S. 
military's SAGE system was the first large-scale example of such a system, which led to a number of special-purpose commercial systems such as Sabre. In the 1970s, computer engineers at research institutions throughout the United States began to link their computers together using telecommunications technology. The effort was funded by ARPA (now DARPA), and the computer network that resulted was called the ARPANET. Logic gates are a common abstraction which can apply to most of the above digital or analog paradigms. The ability to store and execute lists of instructions called programs makes computers extremely versatile, distinguishing them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a minimum capability (being Turing-complete) is, in principle, capable of performing the same tasks that any other computer can perform. Therefore, any type of computer (netbook, supercomputer, cellular automaton, etc.) is able to perform the same computational tasks, given enough time and storage capacity. In the 20th century, artificial intelligence systems were predominantly symbolic: they executed code that was explicitly programmed by software developers. Machine learning models, however, have a set of parameters that are adjusted throughout training, so that the model learns to accomplish a task based on the provided data. The efficiency of machine learning (and in particular of neural networks) has rapidly improved with progress in hardware for parallel computing, mainly graphics processing units (GPUs). Some large language models are able to control computers or robots. AI progress may lead to the creation of artificial general intelligence (AGI), a type of AI that could accomplish virtually any intellectual task at least as well as humans. Professions and organizations As the use of computers has spread throughout society, there are an increasing number of careers involving computers. The need for computers to work well together and to be able to exchange information has spawned the need for many standards organizations, clubs and societies of both a formal and informal nature. See also Notes References Sources External links
========================================
[SOURCE: https://en.wikipedia.org/wiki/Non-player_character#cite_note-12] | [TOKENS: 1785]
Contents Non-player character A non-player character (NPC) is a character in a game that is not controlled by a player. The term originated in traditional tabletop role-playing games where it applies to characters controlled by the gamemaster, or referee, rather than by another player. In video games, this usually means a computer-controlled character that has a predetermined set of behaviors that potentially will impact gameplay, but will not necessarily be the product of true artificial intelligence. Role-playing games In traditional tabletop role-playing games (RPGs) such as Dungeons & Dragons, an NPC is a character portrayed by the gamemaster (GM). While the player characters (PCs) form the narrative's protagonists, non-player characters can be thought of as the "supporting cast" or "extras" of a roleplaying narrative. Non-player characters populate the fictional world of the game, and can fill any role not occupied by a player character. Non-player characters might be allies, bystanders, or competitors to the PCs. NPCs can also be traders who trade currency for things such as equipment or gear. NPCs thus vary in their level of detail. Some may be only a brief description ("You see a man in a corner of the tavern"), while others may have complete game statistics and backstories. There is some debate about how much work a gamemaster should put into an important NPC's statistics; some players prefer to have every NPC completely defined with stats, skills, and gear, while others define only what is immediately necessary and fill in the rest as the game proceeds. There is also some debate regarding the importance of fully defined NPCs in any given role-playing game, but there is consensus that the more "real" the NPCs feel, the more fun players will have interacting with them in character. In some games and in some circumstances, a player who is without a player character can temporarily take control of an NPC. Reasons for this vary, but often arise from the player not maintaining a PC within the group and playing the NPC for a session or from the player's PC being unable to act for some time (for example, because the PC is injured or in another location). Although these characters are still designed and normally controlled by the gamemaster, when players are allowed to temporarily control these non-player characters, it gives them another perspective on the plot of the game. Some systems, such as Nobilis, encourage this in their rules.[citation needed] Many game systems have rules for characters sustaining positive allies in the form of NPC followers, hired hands, or other dependents of lesser stature than the PC (player character). Characters may sometimes help in the design, recruitment, or development of NPCs. In the Champions game (and related games using the Hero System), a character may have a DNPC, or "dependent non-player character". This is a character controlled by the GM, but for which the player character is responsible in some way, and who may be put in harm's way by the PC's choices. Video games The term "non-player character" is also used in video games to describe entities not under the direct control of a player. The term carries a connotation that the character is not hostile towards players; hostile characters are referred to as enemies, mobs, or creeps. NPC behavior in computer games is usually scripted and automatic, triggered by certain actions or dialogue with the player characters.
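As a rough sketch of what "scripted and automatic" behaviour means in practice (Python; the triggers and lines are invented for this example), an NPC of this kind is little more than a lookup table of canned reactions:

    # A minimal scripted NPC: it reacts to named triggers with predetermined
    # responses, with no artificial intelligence involved.

    class ScriptedNPC:
        def __init__(self, name, script):
            self.name = name
            self.script = script          # maps a trigger to a canned reaction

        def on_event(self, trigger):
            reaction = self.script.get(trigger, "...")   # default: no reaction
            print(f"{self.name}: {reaction}")

    guard = ScriptedNPC("Gate Guard", {
        "player_approaches": "Halt! State your business.",
        "player_shows_pass": "Very well, you may enter.",
        "player_attacks": "Guards! To arms!",
    })

    guard.on_event("player_approaches")
    guard.on_event("player_shows_pass")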
In certain multiplayer games (Neverwinter Nights and Vampire: The Masquerade series, for example), a player that acts as the GM can "possess" both player and non-player characters, controlling their actions to further the storyline. More complex games, such as the aforementioned Neverwinter Nights, allow the player to customize the NPCs' behavior by modifying their default scripts or creating entirely new ones. In some online games, such as massively multiplayer online role-playing games, NPCs may be entirely unscripted, and are essentially regular character avatars controlled by employees of the game company. These "non-players" are often distinguished from player characters by avatar appearance or other visual designation, and often serve as in-game support for new players. In other cases, these "live" NPCs are virtual actors, playing regular characters that drive a continuing storyline (as in Myst Online: Uru Live). In earlier RPGs, NPCs only had monologues. This is typically represented by a dialogue box, floating text, cutscene, or other means of displaying the NPCs' speech or reaction to the player.[citation needed] NPC speeches of this kind are often designed to give an instant impression of the character of the speaker, providing character vignettes, but they may also advance the story or illuminate the world around the PC. Similar to this is the most common form of storytelling, non-branching dialogue, in which the means of displaying NPC speech are the same as above, but the player character or avatar responds to or initiates speech with NPCs. In addition to the purposes listed above, this enables the development of the player character. More advanced RPGs feature interactive dialogue, or branching dialogue (dialogue trees). Examples are the games produced by Black Isle Studios and White Wolf, Inc.; every one of their games uses multiple-choice roleplaying. When talking to an NPC, the player is presented with a list of dialogue options and may choose between them. Each choice may result in a different response from the NPC. These choices may affect the course of the game, as well as the conversation. At the least, they give the player a reference point for their character's relationship with the game world. Ultima is an example of a game series that has advanced from non-branching dialogue (Ultima III: Exodus and earlier) to branching dialogue (from Ultima IV: Quest of the Avatar onward). Other role-playing games with branching dialogues include Cosmic Soldier, Megami Tensei, Fire Emblem, Metal Max, Langrisser, SaGa, Ogre Battle, Chrono, Star Ocean, Sakura Wars, Mass Effect, Dragon Age, Radiant Historia, and several Dragon Quest and Final Fantasy games. Certain video game genres revolve almost entirely around interactions with non-player characters, including visual novels such as Ace Attorney and dating sims such as Tokimeki Memorial, usually featuring complex branching dialogues and often presenting the player's possible responses word-for-word as the player character would say them. Games revolving around relationship-building, including visual novels, dating sims such as Tokimeki Memorial, and some role-playing games such as Persona, often give choices that have a different number of associated "mood points" that influence a player character's relationship and future conversations with a non-player character.
These games often feature a day-night cycle with a time scheduling system that provides context and relevance to character interactions, allowing players to choose when and if to interact with certain characters, which in turn influences their responses during later conversations. In 2023, Replica Studios unveiled its AI-developed NPCs for the Unreal Engine 5, in cooperation with OpenAI, which enable players to hold interactive conversations with non-playable characters. "NPC streaming"—livestreaming while mimicking the behaviors of an NPC—became popular on TikTok in 2023 and was largely popularized by livestreamer Pinkydoll. Other usage From around 2018, the term NPC became an insult, primarily online, to suggest that a person is unable to form thoughts or opinions of their own. This is sometimes illustrated with a grey-faced, expressionless version of the Wojak meme. Monetization NPC streaming is a type of livestream that allows users to participate in and shape the content they are viewing in real time. It has become widely popular as influencers and users of social media platforms such as TikTok utilize livestreams to act as non-player characters. "Viewers in NPC live streams take on the role of puppeteers, influencing the creator's next move." This phenomenon has been on the rise as viewers are actively involved in what they are watching, by purchasing digital "gifts" and sending them directly to the streamer. In return, the streamer will briefly mimic a character or act. The practice has been a trend since July 2023, as influencers make money from this new internet character. Pinkydoll, a TikTok influencer, gained 400,000 followers the same month that she started NPC streaming, while her livestreams began to earn her as much as $7,000 in a day. NPC streaming gives creators a new avenue to earn money online. Despite this, some creators have quit because of the stigma that comes with the strategy. For example, Malik Ambersley, a pioneer of the NPC trend, has been robbed, accosted by police, and drawn into fights due to the controversial nature of his act. See also References
========================================
[SOURCE: https://en.wikipedia.org/wiki/Lod#cite_note-105] | [TOKENS: 4733]
Contents Lod Lod (Hebrew: לוד, fully vocalized: לֹד), also known as Lydda (Ancient Greek: Λύδδα) and Lidd (Arabic: اللِّدّ, romanized: al-Lidd, or اللُّدّ, al-Ludd), is a city 15 km (9+1⁄2 mi) southeast of Tel Aviv and 40 km (25 mi) northwest of Jerusalem in the Central District of Israel. It is situated between the lower Shephelah on the east and the coastal plain on the west. The city had a population of 90,814 in 2023. Lod has been inhabited since at least the Neolithic period. It is mentioned a few times in the Hebrew Bible and in the New Testament. Between the 5th century BCE and up until the late Roman period, it was a prominent center for Jewish scholarship and trade. Around 200 CE, the city became a Roman colony and was renamed Diospolis (Ancient Greek: Διόσπολις, lit. 'city of Zeus'). Tradition identifies Lod as the 4th century martyrdom site of Saint George; the Church of Saint George and Mosque of Al-Khadr located in the city is believed to have housed his remains. Following the Arab conquest of the Levant, Lod served as the capital of Jund Filastin; however, a few decades later, the seat of power was transferred to Ramla, and Lod slipped in importance. Under Crusader rule, the city was a Catholic diocese of the Latin Church and it remains a titular see to this day.[citation needed] Lod underwent a major change in its population in the mid-20th century. Exclusively Palestinian Arab in 1947, Lod was part of the area designated for an Arab state in the United Nations Partition Plan for Palestine; however, in July 1948, the city was occupied by the Israel Defense Forces, and most of its Arab inhabitants were expelled in the Palestinian expulsion from Lydda and Ramle. The city was largely resettled by Jewish immigrants, most of them expelled from Arab countries. Today, Lod is one of Israel's mixed cities, with an Arab population of 30%. Lod is one of Israel's major transportation hubs. The main international airport, Ben Gurion Airport, is located 8 km (5 miles) north of the city. The city is also a major railway and road junction. Religious references The Hebrew name Lod appears in the Hebrew Bible as a town of Benjamin, founded along with Ono by Shamed or Shamer (1 Chronicles 8:12; Ezra 2:33; Nehemiah 7:37; 11:35). In Ezra 2:33, it is mentioned as one of the cities whose inhabitants returned after the Babylonian captivity. Lod is not mentioned among the towns allocated to the tribe of Benjamin in Joshua 18:11–28. The name Lod derives from a tri-consonantal root not extant in Northwest Semitic, but only in Arabic (“to quarrel; withhold, hinder”). An Arabic etymology of such an ancient name is unlikely (the earliest attestation is from the Achaemenid period). In the New Testament, the town appears in its Greek form, Lydda, as the site of Peter's healing of Aeneas in Acts 9:32–38. The city is also mentioned in an Islamic hadith as the location of the battlefield where the false messiah (al-Masih ad-Dajjal) will be slain before the Day of Judgment. History The first occupation dates to the Neolithic in the Near East and is associated with the Lodian culture. Occupation continued in the Levant Chalcolithic. Pottery finds have dated the initial settlement in the area now occupied by the town to 5600–5250 BCE. In the Early Bronze, it was an important settlement in the central coastal plain between the Judean Shephelah and the Mediterranean coast, along Nahal Ayalon. Other important nearby sites were Tel Dalit, Tel Bareqet, Khirbat Abu Hamid (Shoham North), Tel Afeq, Azor and Jaffa. 
Two architectural phases belong to the late EB I in Area B. The first phase had a mudbrick wall, while the late phase included a circular stone structure. Later excavations have produced an occupation layer, Stratum IV. It consists of two phases: Stratum IVb, with a mudbrick wall on stone foundations and rounded exterior corners. In Stratum IVa there was a mudbrick wall with no stone foundations, with imported Egyptian pottery and local pottery imitations. Another excavation revealed nine occupation strata. Strata VI-III belonged to Early Bronze IB. The material culture showed Egyptian imports in strata V and IV. Occupation continued into Early Bronze II with four strata (V-II). There was continuity in the material culture and indications of centralized urban planning. North of the tell were scattered MB II burials. The earliest written record is in a list of Canaanite towns drawn up by the Egyptian pharaoh Thutmose III at Karnak in 1465 BCE. From the fifth century BCE until the Roman period, the city was a centre of Jewish scholarship and commerce. According to British historian Martin Gilbert, during the Hasmonean period, Jonathan Maccabee and his brother, Simon Maccabaeus, enlarged the area under Jewish control, which included conquering the city. The Jewish community in Lod during the Mishnah and Talmud era is described in a significant number of sources, including information on its institutions, demographics, and way of life. The city reached its height as a Jewish center between the First Jewish-Roman War and the Bar Kokhba revolt, and again in the days of Judah ha-Nasi and the start of the Amoraim period. The city was then the site of numerous public institutions, including schools, study houses, and synagogues. In 43 BC, Cassius, the Roman governor of Syria, sold the inhabitants of Lod into slavery, but they were set free two years later by Mark Antony. During the First Jewish–Roman War, the Roman proconsul of Syria, Cestius Gallus, razed the town on his way to Jerusalem in Tishrei 66 CE. According to Josephus, "[he] found the city deserted, for the entire population had gone up to Jerusalem for the Feast of Tabernacles. He killed fifty people whom he found, burned the town and marched on". Lydda was occupied by Emperor Vespasian in 68 CE. In the period following the destruction of Jerusalem in 70 CE, Rabbi Tarfon, who appears in many Tannaitic and Jewish legal discussions, served as a rabbinic authority in Lod. During the Kitos War, 115–117 CE, the Roman army laid siege to Lod, where the rebel Jews had gathered under the leadership of Julian and Pappos. Torah study was outlawed by the Romans and pursued mostly underground. The distress became so great that the patriarch Rabban Gamaliel II, who was shut up there and died soon afterwards, permitted fasting on Ḥanukkah. Other rabbis disagreed with this ruling. Lydda was next taken and many of the Jews were executed; the "slain of Lydda" are often mentioned in words of reverential praise in the Talmud. In 200 CE, emperor Septimius Severus elevated the town to the status of a city, calling it Colonia Lucia Septimia Severa Diospolis. The name Diospolis ("City of Zeus") may have been bestowed earlier, possibly by Hadrian. At that point, most of its inhabitants were Christian. The earliest known bishop is Aëtius, a friend of Arius. During the following century (200–300 CE), Joshua ben Levi is said to have founded a yeshiva in Lod. In December 415, the Council of Diospolis was held here to try Pelagius; he was acquitted.
In the sixth century, the city was renamed Georgiopolis after St. George, a soldier in the guard of the emperor Diocletian, who was born there between 256 and 285 CE. The Church of Saint George and Mosque of Al-Khadr is named for him. The 6th-century Madaba map shows Lydda as an unwalled city with a cluster of buildings under a black inscription reading "Lod, also Lydea, also Diospolis". An isolated large building with a semicircular colonnaded plaza in front of it might represent the St George shrine. After the Muslim conquest of Palestine by Amr ibn al-'As in 636 CE, Lod which was referred to as "al-Ludd" in Arabic served as the capital of Jund Filastin ("Military District of Palaestina") before the seat of power was moved to nearby Ramla during the reign of the Umayyad Caliph Suleiman ibn Abd al-Malik in 715–716. The population of al-Ludd was relocated to Ramla, as well. With the relocation of its inhabitants and the construction of the White Mosque in Ramla, al-Ludd lost its importance and fell into decay. The city was visited by the local Arab geographer al-Muqaddasi in 985, when it was under the Fatimid Caliphate, and was noted for its Great Mosque which served the residents of al-Ludd, Ramla, and the nearby villages. He also wrote of the city's "wonderful church (of St. George) at the gate of which Christ will slay the Antichrist." The Crusaders occupied the city in 1099 and named it St Jorge de Lidde. It was briefly conquered by Saladin, but retaken by the Crusaders in 1191. For the English Crusaders, it was a place of great significance as the birthplace of Saint George. The Crusaders made it the seat of a Latin Church diocese, and it remains a titular see. It owed the service of 10 knights and 20 sergeants, and it had its own burgess court during this era. In 1226, Ayyubid Syrian geographer Yaqut al-Hamawi visited al-Ludd and stated it was part of the Jerusalem District during Ayyubid rule. Sultan Baybars brought Lydda again under Muslim control by 1267–8. According to Qalqashandi, Lydda was an administrative centre of a wilaya during the fourteenth and fifteenth century in the Mamluk empire. Mujir al-Din described it as a pleasant village with an active Friday mosque. During this time, Lydda was a station on the postal route between Cairo and Damascus. In 1517, Lydda was incorporated into the Ottoman Empire as part of the Damascus Eyalet, and in the 1550s, the revenues of Lydda were designated for the new waqf of Hasseki Sultan Imaret in Jerusalem, established by Hasseki Hurrem Sultan (Roxelana), the wife of Suleiman the Magnificent. By 1596 Lydda was a part of the nahiya ("subdistrict") of Ramla, which was under the administration of the liwa ("district") of Gaza. It had a population of 241 households and 14 bachelors who were all Muslims, and 233 households who were Christians. They paid a fixed tax-rate of 33,3 % on agricultural products, including wheat, barley, summer crops, vineyards, fruit trees, sesame, special product ("dawalib" =spinning wheels), goats and beehives, in addition to occasional revenues and market toll, a total of 45,000 Akçe. All of the revenue went to the Waqf. In 1051 AH/1641/2, the Bedouin tribe of al-Sawālima from around Jaffa attacked the villages of Subṭāra, Bayt Dajan, al-Sāfiriya, Jindās, Lydda and Yāzūr belonging to Waqf Haseki Sultan. The village appeared as Lydda, though misplaced, on the map of Pierre Jacotin compiled in 1799. Missionary William M. 
Thomson visited Lydda in the mid-19th century, describing it as a "flourishing village of some 2,000 inhabitants, imbosomed in noble orchards of olive, fig, pomegranate, mulberry, sycamore, and other trees, surrounded every way by a very fertile neighbourhood. The inhabitants are evidently industrious and thriving, and the whole country between this and Ramleh is fast being filled up with their flourishing orchards. Rarely have I beheld a rural scene more delightful than this presented in early harvest ... It must be seen, heard, and enjoyed to be appreciated." In 1869, the population of Ludd was given as 55 Catholics, 1,940 "Greeks", 5 Protestants and 4,850 Muslims. In 1870, the Church of Saint George was rebuilt. In 1892, the first railway station in the entire region was established in the city. In the second half of the 19th century, Jewish merchants migrated to the city, but left after the 1921 Jaffa riots. In 1882, the Palestine Exploration Fund's Survey of Western Palestine described Lod as "A small town, standing among enclosure of prickly pear, and having fine olive groves around it, especially to the south. The minaret of the mosque is a very conspicuous object over the whole of the plain. The inhabitants are principally Moslim, though the place is the seat of a Greek bishop resident of Jerusalem. The Crusading church has lately been restored, and is used by the Greeks. Wells are found in the gardens...." From 1918, Lydda was under the administration of the British Mandate in Palestine, established under a League of Nations mandate that followed the Great War. During the Second World War, the British set up supply posts in and around Lydda and its railway station, and also built an airport that was renamed Ben Gurion Airport after the death of Israel's first prime minister in 1973. At the time of the 1922 census of Palestine, Lydda had a population of 8,103 inhabitants (7,166 Muslims, 926 Christians, and 11 Jews); of the Christians, 921 were Orthodox, 4 Roman Catholic, and 1 Melkite. This had increased by the 1931 census to 11,250 (10,002 Muslims, 1,210 Christians, 28 Jews, and 10 Bahais), in a total of 2,475 residential houses. In 1938, Lydda had a population of 12,750. In 1945, Lydda had a population of 16,780 (14,910 Muslims, 1,840 Christians, 20 Jews and 10 "other"). Until 1948, Lydda was an Arab town with a population of around 20,000 (18,500 Muslims and 1,500 Christians). In 1947, the United Nations proposed dividing Mandatory Palestine into two states, one Jewish and one Arab; Lydda was to form part of the proposed Arab state. In the ensuing war, Israel captured Arab towns outside the area the UN had allotted it, including Lydda. In December 1947, thirteen Jewish passengers in a seven-car convoy to the Ben Shemen Youth Village were ambushed and murdered. In a separate incident, three Jewish youths, two men and a woman, were captured, then raped and murdered in a neighbouring village; their bodies were paraded in Lydda's principal street. The Israel Defense Forces entered Lydda on 11 July 1948. The following day, under the impression that it was under attack, the 3rd Battalion was ordered to shoot anyone "seen on the streets". According to Israel, 250 Arabs were killed. Other estimates are higher: the Arab historian Aref al-Aref estimated 400, and Nimr al-Khatib 1,700. In 1948, the population rose to 50,000 during the Nakba, as Arab refugees fleeing other areas made their way there.
A key event of the war was the expulsion of 50,000-70,000 Palestinians from Lydda and Ramle by the Israel Defense Forces. All but 700 to 1,056 of the inhabitants were expelled by order of the Israeli high command and forced to walk 17 km (10.5 mi) to the Jordanian Arab Legion lines. Estimates of those who died from exhaustion and dehydration vary from a handful to 355. The town was subsequently sacked by the Israeli army. Some scholars, including Ilan Pappé, characterize this as ethnic cleansing. The few hundred Arabs who remained in the city were soon outnumbered by the influx of Jews who immigrated to Lod from August 1948 onward, most of them from Arab countries. As a result, Lod became a predominantly Jewish town. After the establishment of the state, the biblical name Lod was readopted. The Jewish immigrants who settled Lod came in waves, first from Morocco and Tunisia, later from Ethiopia, and then from the former Soviet Union. Since 2008, many urban development projects have been undertaken to improve the image of the city. Upscale neighbourhoods have been built, among them Ganei Ya'ar and Ahisemah, expanding the city to the east. According to a 2010 report in The Economist, a three-metre-high wall was built between Jewish and Arab neighbourhoods, and construction in Jewish areas was given priority over construction in Arab neighbourhoods. The newspaper says that violent crime in the Arab sector revolves mainly around family feuds over turf and honour crimes. In 2010, the Lod Community Foundation organised an event for representatives of bicultural youth movements, volunteer aid organisations, educational start-ups, businessmen, sports organisations, and conservationists working on programmes to better the city. In the 2021 Israel–Palestine crisis, a state of emergency was declared in Lod after Arab rioting led to the death of an Israeli Jew. The Mayor of Lod, Yair Revivio, urged Prime Minister of Israel Benjamin Netanyahu to deploy the Israel Border Police to restore order in the city. This was the first time since 1966 that Israel had declared this kind of emergency lockdown. International media noted that both Jewish and Palestinian mobs were active in Lod, but that the "crackdown came for one side" only. Demographics In the 19th century and until the Lydda Death March, Lod was an almost exclusively Muslim and Christian town, with an estimated 6,850 inhabitants, of whom approximately 2,000 (29%) were Christian. According to the Israel Central Bureau of Statistics (CBS), the population of Lod in 2010 was 69,500. According to the 2019 census, the population of Lod was 77,223, of whom 53,581 people (69.4% of the city's population) were classified as "Jews and Others" and 23,642 people (30.6%) as "Arab". Education According to the CBS, there are 38 schools and 13,188 pupils in the city: 26 elementary schools with 8,325 pupils, and 13 high schools with 4,863 pupils. About 52.5% of 12th-grade pupils were entitled to a matriculation certificate in 2001. Economy The airport and related industries are a major source of employment for the residents of Lod. Other important employers in the city are the communication-equipment company "Talard", "Cafe-Co" (a subsidiary of the Strauss Group) and "Kashev", the computer centre of Bank Leumi. A Jewish Agency Absorption Centre is also located in Lod. According to CBS figures for 2000, 23,032 people were salaried workers and 1,405 were self-employed.
The mean monthly wage for a salaried worker was NIS 4,754, a real change of 2.9% over the course of 2000. Salaried men had a mean monthly wage of NIS 5,821 (a real change of 1.4%) versus NIS 3,547 for women (a real change of 4.6%). The mean income for the self-employed was NIS 4,991. About 1,275 people were receiving unemployment benefits and 7,145 were receiving an income supplement. Art and culture In 2009–2010, Dor Guez held an exhibit focusing on Lod, Georgeopolis, at the Petach Tikva art museum. Archaeology A well-preserved mosaic floor dating to the Roman period was excavated in 1996 as part of a salvage dig conducted on behalf of the Israel Antiquities Authority and the Municipality of Lod, prior to the widening of HeHalutz Street. According to Jacob Fisch, executive director of the Friends of the Israel Antiquities Authority, a worker at the construction site noticed the tail of a tiger in the mosaic and halted work. The mosaic was initially covered over with soil at the conclusion of the excavation for lack of funds to conserve and develop the site. It is now part of the Lod Mosaic Archaeological Center. The floor, with its colorful display of birds, fish, exotic animals and merchant ships, is believed to have been commissioned by a wealthy resident of the city for his private home. The Lod Community Archaeology Program, which operates in ten Lod schools, five Jewish and five Israeli Arab, combines archaeological studies with participation in digs in Lod. Sports The city's major football club, Hapoel Bnei Lod, plays in Liga Leumit (the second division). Its home ground is the Lod Municipal Stadium. The club was formed by a merger of Bnei Lod and Rakevet Lod in the 1980s. Two other clubs in the city play in the regional leagues: Hapoel MS Ortodoxim Lod in Liga Bet and Maccabi Lod in Liga Gimel. Hapoel Lod played in the top division during the 1960s and 1980s, and won the State Cup in 1984. The club folded in 2002. A new club, Hapoel Maxim Lod (named after former mayor Maxim Levy), was established soon after, but folded in 2007.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Crossover_(automobile)] | [TOKENS: 4531]
Contents Crossover SUV A crossover, crossover SUV, or crossover utility vehicle (CUV) is a type of automobile with an increased ride height that is built on unibody chassis construction shared with passenger cars, as opposed to traditional sport utility vehicles (SUVs), which are built on body-on-frame chassis construction similar to pickup trucks. Originating in North America, the term crossover was initially used for any vehicle that blends characteristics of two different kinds of vehicles; over time, crossover came to refer predominantly to unibody-based SUVs. The term SUV is often used as an umbrella term for both crossovers and traditional SUVs due to the similarities between them. Compared to traditional SUVs, crossovers are generally less capable in off-road conditions or at hauling heavy loads, while offering other advantages such as improved fuel economy and handling. Compared to traditional cars with lower ride height and lower roofs, such as sedans and hatchbacks, crossovers offer larger cabin space and a higher driving position. The 1977 Lada Niva is the world's first mass-produced unibody off-road vehicle and has been credited as a forerunner of crossovers before that term was used, with the AMC Eagle, introduced in 1979, being the first US example. The Toyota RAV4, introduced in 1994, has also been described as initiating the modern concept of a crossover. In the US, the market share of crossovers grew from under 4% in 2000 to nearly 40% in 2018. Definition The difference between crossover SUVs and other SUVs, as generally defined by journalists and manufacturers, is that a crossover is built on a unibody platform, while an SUV is built on a body-on-frame platform. However, these definitions are often blurred in practice, since unibody vehicles are also often referred to simply as SUVs. "Crossover" is a relatively recent term, and early unibody SUVs (such as the 1984 Jeep Cherokee (XJ)) are rarely called crossovers. Due to these inconsistencies, the term "SUV" is often used as an umbrella term for both crossovers and traditional SUVs. The U.S. magazine MotorTrend mentioned in 2005 that the term "crossover" had become "blurred as manufacturers apply it to everything from the Chrysler Pacifica to the Ford Five Hundred sedan"; at that time, the publication proposed that the term "soft-roader" was more appropriate. Some regions outside North America do not distinguish between a crossover SUV and a body-on-frame SUV, calling both of them SUVs. Several government bodies in the United States, including the United States Environmental Protection Agency (EPA), also do not acknowledge the crossover distinction. In some jurisdictions, crossovers are classified as light trucks, as are traditional SUVs and pickup trucks. Outside the United States, the term "crossover" tends to be used for C-segment (compact) or smaller vehicles, with large unibody vehicles, such as the Audi Q7, Range Rover, Porsche Cayenne, and Volkswagen Touareg, usually referred to as SUVs rather than crossovers. In the United Kingdom, a crossover is sometimes defined as a hatchback with raised ride height and SUV-like styling features. The Sunday Times noted the number of "soft-roader" cars on sale in 2019. Characteristics Crossovers' driving characteristics and performance are similar to those of traditional passenger cars, while providing more passenger and cargo space with relatively minor trade-offs in fuel economy and running costs.
According to Consumer Reports, the three top-selling crossovers in the US in 2018 (the Toyota RAV4, Honda CR-V, and Nissan Rogue) returned on average about 10% lower fuel economy than the three top-selling mid-size sedan equivalents (the Toyota Camry, Honda Accord, and Nissan Altima), but provided almost 1.5 times the cargo space. Furthermore, the average mid-size crossover in the US costs less than 5% more than the average mid-size car. Compared to truck-based SUVs, crossovers typically offer greater interior comfort, a more comfortable ride, better fuel economy, and lower manufacturing costs, but inferior off-road and towing capabilities. Many crossovers lack an all-wheel-drive or four-wheel-drive drivetrain, which, in combination with their inferior off-road capability, causes many journalists and consumers to question their classification as "sport utility vehicles". This has led some to describe crossovers as pseudo-SUVs. History Introduced in 1979, prior to the terms "SUV" or "crossover" being coined, the AMC Eagle is retroactively considered to be the first dedicated crossover automobile. The mass-market Eagle model line was based on a unibody passenger-car platform, with fully automatic four-wheel drive and a raised ride height. A writer for Motor Trend characterized the 1963 Studebaker Wagonaire as the "first crossover" because the innovative station wagon with a sliding roof "mashed up various vehicle types"; it was available only with conventional rear-wheel drive. Others cite the front-wheel-drive 1977 Matra Rancho as a slightly earlier forerunner of the modern crossover. Marketed as a "lifestyle" vehicle, it was not available with four-wheel drive. In 1981, American Motors Corporation (AMC) introduced four-wheel-drive subcompact models built on the two-door AMC Spirit, the "Eagle SX/4" and "Eagle Kammback". These low-priced models joined the compact AMC Eagle line and foreshadowed the market segment of comfortable cars with utility and foul-weather capabilities. The first-generation Toyota RAV4, released in 1994, has been credited as the model that expanded the concept of the modern crossover. The RAV4 was based on a modified platform used by the Toyota Corolla and Toyota Carina. At its release, Toyota in Japan used the term "4x4 vehicle" to describe the model, while Toyota in the US called the vehicle a "new concept SUV". By the early 2000s, Toyota was leading the market in its development of car-based trucks in North America with the release of other crossover models such as the Highlander and the Lexus RX. In North America, crossovers increased in popularity during the 2000s, when fuel-efficiency standards for light trucks, which had been stuck at 20.7 miles per US gallon (11.4 L/100 km; 8.8 km/L) since 1996, moved upwards by 2005. With increasing fuel prices, traditional SUVs began to lose market share to crossovers. In the United States, crossover models comprised more than 50% of the overall SUV market as of 2006. Crossovers have also become increasingly popular in Europe since the early 2010s. In the first quarter of 2023, the Tesla Model Y crossover became the best-selling vehicle in the world. Size categories Depending on the market, crossovers are divided into several size categories. Since there is no official distinction, the size category of some crossover models is ambiguous.
Aspects used to determine the size category of a vehicle may include its length and width, its positioning in its brand's line-up, its platform, and its interior space. Crossover city cars (also called A-SUVs, city SUVs, city crossovers, or A-segment SUVs) are crossovers that generally ride on the platform of a city car (A-segment). The crossover city car is a relatively new segment; the first vehicle in the segment was the Fiat Panda Cross, though the Suzuki Ignis helped bring the segment more attention. Cars in this segment are generally styled as hatchbacks. Since the late 2010s, the segment has received significantly more attention. As of 2023, examples include the Toyota Aygo X, Hyundai Casper, Suzuki Ignis, Renault Kwid, VinFast VF 5, Suzuki Xbee, and the Fiat Panda Cross/City Cross. Subcompact crossover SUVs (also called B-segment crossover SUVs, B-SUVs, or small SUVs) are crossovers that are usually based on the platform of a subcompact (also known as supermini or B-segment) passenger car, although some high-end subcompact crossover models are based on a compact car (C-segment). The segment's name may differ depending on the market; in several regions, the category may be known as "compact crossover" or "compact SUV" instead. This category is particularly popular in Europe, India, and Brazil, where it accounted for 37%, 75%, and 69% respectively of total SUV sales in 2018. In the United States, it accounted for 7% of total SUV sales in 2018. The best-selling vehicle in the segment in 2019 was the Honda HR-V, with 622,154 units sold worldwide. A compact crossover SUV (also called a C-segment SUV or C-SUV) is a vehicle that is usually based on the platform of a compact car (C-segment), while some models are based on a mid-size car (D-segment) or a subcompact (B-segment) platform. Most compact crossovers have two-row seating, but some have three rows. The naming of the segment may differ depending on the market; in several regions outside North America, the category may be known as "mid-size crossover" or "mid-size SUV", not to be confused with the North American definition of a mid-size crossover SUV, which is a larger, D-segment crossover SUV. The first compact crossover was the 1980 AMC Eagle, which was based on the compact-sized Concord line. Its four-wheel-drive system was an almost unheard-of feature on regular passenger cars at the time, and it came with full-time all-wheel drive, automatic transmission, power steering, and power front disc brakes as standard, along with numerous convenience and comfort options. Later models included the 1994 Toyota RAV4, 1995 Honda CR-V, 1997 Subaru Forester, 2000 Nissan X-Trail, 2000 Mazda Tribute, and the 2001 Ford Escape. Between 2005 and 2010, the market share of compact crossovers in the US increased from 6% to more than 11%. In 2014, for the first time, sales of compact crossovers outpaced those of mid-size sedans in the United States. In 2019, the American magazine Car and Driver stated that "so many of these vehicles are crowding the marketplace, simply sorting through them can be a daunting task". Owing to the segment's popularity, and to cater to customers' needs, many manufacturers offer more than one compact crossover, usually in slightly different sizes at different price points. By the late 2010s, the segment had become the most popular in several regions. For example, nearly one in every four cars (24.2%) sold in the United States in 2019 was a compact crossover.
It also comprised 5.6% of the total European car market. The best-selling vehicle in the segment in 2019 was the Toyota RAV4, with 961,918 units sold globally. In late 2020, the Volkswagen ID.4 and Ford Mustang Mach-E debuted as battery-electric compact crossover SUVs. A mid-size crossover SUV is a class of crossover SUV that is larger than compact crossover SUVs but smaller than full-size crossover SUVs. Mid-size crossover SUVs are usually based on the platform of a mid-size (also known as a large family car or D-segment) passenger car, while some models are based on a full-size car (F-segment) or a compact (C-segment) platform. Some mid-size crossovers have three rows of seats, while others have two, which has led several brands to offer multiple models to cater to both sub-segments (for example, the Honda Pilot three-row crossover and its two-row derivative, the Honda Passport). In Australia, American mid-size crossovers are classified as large SUVs. The first mid-size crossovers included the Toyota Highlander and Pontiac Aztek, both introduced in 2000 for the 2001 model year. The segment is most popular in North America and China, where larger vehicles are preferred. It makes up 15.8% of the total United States car market. In Europe, the segment covered 2.1% of the total market in 2019, with luxury crossover SUVs accounting for most of that share. The Toyota Highlander/Kluger was the best-selling vehicle in the category in 2018, with 387,869 sold worldwide. Full-size crossover SUVs are the largest crossovers and offer exclusively three-row seating. The first full-size crossovers included the Ford Freestyle, GMC Acadia, Saturn Outlook, and Buick Enclave; older full-size SUVs had been built mostly on a body-on-frame chassis. The full-size crossover SUV class is sometimes considered to include three-row mid-size crossovers, as in the case of the Jeep Grand Cherokee L. Full-size crossover SUVs are not necessarily based on full-size cars, even if an automaker still has a full-size passenger car in its lineup. For instance, the Mercedes-Benz GLS shares its architecture with the mid-size Mercedes-Benz GLE and not with the full-size Mercedes-Benz S-Class sedan, even though the GLS is marketed as the crossover SUV counterpart to the S-Class. Body style categories While three-door body-on-frame SUVs are not uncommon, crossover SUVs with three doors (including the tailgate door) are less prevalent. The general decline of two- and three-door vehicles has led to the decline of this category. Crossover SUVs with a sloping rear roofline may be marketed as a "coupé crossover SUV" or "coupé SUV". Although the term "coupé" itself is supposed to refer to a passenger car with a sloping or truncated rear roofline and usually two doors (although some manufacturers have created four-door coupés), most coupé crossover SUVs are equipped with five doors. The sloping roofline is designed to offer a styling advantage over the standard crossover counterpart, although some find it less attractive. The body style has attracted criticism for being more expensive and less practical than normal crossovers. The BMW X6 is credited as the first coupé crossover. The first crossover convertible was the AMC Eagle, marketed by AMC dealers during the 1981 and 1982 model years as the Sundancer, a factory-authorized conversion of the all-wheel-drive two-door sedans.
Several convertible crossover SUVs have entered mass production, including the Toyota RAV4 convertible, which was released in North America in the 1998 model year and offered through the 1999 model year. Other examples include the Nissan Murano CrossCabriolet, Range Rover Evoque Convertible, and Volkswagen T-Roc Cabriolet. This category was heavily criticized by journalists, enthusiasts, and analysts for numerous reasons, such as its design and high price tag. Some also questioned its purpose, as the practicality that crossovers usually offer did not carry over to the convertible versions, which could only have two doors and little luggage space. Crossover-styled cars Many manufacturers have capitalized on the SUV trend by offering versions of station wagons, hatchbacks, or MPVs with a raised ride height and the addition of rugged-looking accessories such as black plastic wheel-arch extensions, body cladding, skid plates, and roof rails. Due to their raised ground clearance, these vehicles may then be marketed as more capable off-road. Some of them may also be equipped with all-wheel drive. This strategy has been used by manufacturers to move models upmarket or to help fill an absence in a crossover SUV segment. These vehicles have been described as pseudo-crossovers. Many manufacturers have released "off-road" versions of station wagons, with larger cargo space and greater practicality, that are marketed as more capable in soft off-road or all-weather situations due to their raised ground clearance, making them a "crossover" between a station wagon and an SUV. In North America, some manufacturers sell station wagons with crossover styling due to the former's unpopularity, the Subaru Outback being the most popular model. An early example of the off-road-styled station wagon was the Subaru Legacy Outback (later Outback) in 1994. At the time, Subaru was absent from the growing SUV segment; lacking the finances to design an SUV from the ground up, Subaru added a two-tone paint scheme, body cladding, and a suspension lift to the Legacy wagon, marketing it as a capable and more efficient alternative to larger truck-based SUVs. Another example is the Volvo V70 XC (also called the V70 Cross Country), first introduced in 1999 and renamed the XC70 in 2002. Audi has been making Allroad versions of its station wagons since 1999, and the Volkswagen Alltrack and Škoda Scout are equivalent variants. Crossover-styled variants of hatchbacks or city cars with the same body were introduced either as substitutes for or complements to the subcompact crossover SUV. Most crossover-styled hatchbacks do not have all-wheel drive. Forerunners of the SUV-themed hatchback include the 1983 Fiat Panda 4x4, the 1994 Subaru Outback Sport, the 1996 Toyota Starlet Remix, and the 2003 Rover Streetwise. The Volkswagen Golf Country, a conversion by Steyr-Daimler-Puch, was also sold between 1990 and 1991, offered with part-time four-wheel drive and off-road exterior cladding. In the 2000s, the Volkswagen CrossPolo started the modern crossover-styled hatchback trend and was marketed as an SUV-like "lifestyle" vehicle. The Dacia/Renault Sandero Stepway, the crossover-styled version of the Sandero launched in 2009, is an example of a well-received crossover-styled hatchback, making up 65% of Sandero sales. One of the first MPVs with a crossover-styled variant was the Renault Scénic RX4, introduced in 2000. It featured a lifted ride height, rugged body cladding, a tailgate-mounted spare wheel, and optional part-time four-wheel drive.
Another example is the Volkswagen CrossTouran, launched in 2006 as a lifted version of the Touran and marketed as a "lifestyle" vehicle. Apart from crossover-styled variants equipped with exterior accessories, the growing crossover market share has led many manufacturers to develop MPVs with crossover characteristics from the ground up, often marketing them either purely as MPVs or as "crossover MPVs", such as the fifth-generation Renault Espace. The innovative unibody all-wheel-drive AMC Eagle was available in two- and four-door sedan versions when introduced in 1979. Some examples of sedans with crossover characteristics are the Subaru Legacy SUS (short for "Sport Utility Sedan"), Volvo S60 Cross Country, Polestar 2, Toyota Crown Crossover, Citroën C4 X and C3L in China, Renault/Dacia Logan Stepway, and the Qoros 3 GT. Sales Since the early 2010s, sales of crossover-type vehicles have been increasing in Europe, and by 2017 European sales of compact and mid-size crossover models continued to surge. Analysis of the European new-car market by the data firm JATO Dynamics revealed that SUVs, which mostly consisted of crossovers, took almost 40% of the market in 2019, with the crossover segment being a key driver of growth for volume and profits. Sales of crossovers increased by 30% between 2003 and 2005. By 2006, the segment came into strong visibility in the U.S., when crossover sales "made up more than 50% of the overall SUV market". Sales increased by 16% in 2007. In 2013, the Audi Q5 became Audi's second best-selling vehicle in the United States market after the Audi A4 sedan. Between the late 1990s and 2012, around half of Lexus's sales came from its SUVs. American manufacturers were initially slow to switch from their emphasis on light-truck-based SUVs, and foreign automakers developed crossovers targeting the U.S. market as an alternative to station wagons, which were unpopular there. By the early 2000s, American car manufacturers had caught up. The crossover segment has been the fastest-growing category in China's passenger-car market since the early 2010s. From 2011 to 2021, the market share of crossovers in China's passenger-vehicle sales surged by nearly 35 percentage points, rising from approximately 13% in 2011 to 47.89% in 2021. A 2017 survey by Mintel indicated that Chinese consumers were willing to pay a premium for SUVs, with the median first-time buyer spending RMB 205,000. By 2023, several crossover models dominated the list of overall best-selling vehicles in China. The Tesla Model Y emerged as the best-selling car across all segments; other popular crossover models included the BYD Song Plus and Yuan Plus, which ranked 4th and 5th respectively, and the Haval H6, which took the 10th spot. In 2023, Land Rover experienced significant growth in the Chinese market: combined sales of the Range Rover and Range Rover Sport surpassed 25,000 units, representing a 31% year-over-year increase, whilst sales of the Land Rover Discovery grew by 14% year-over-year.
========================================