[SOURCE: https://en.wikipedia.org/wiki/Special:BookSources/0-939923-04-1] | [TOKENS: 380] |
Book sources This page allows users to search multiple sources for a book, given its 10- or 13-digit International Standard Book Number (ISBN); spaces and dashes in the ISBN do not matter. The page links to catalogs of libraries, booksellers, and other book sources where you can search for the book by its ISBN. Online text Google Books and other retail sources may be helpful if you want to verify citations in Wikipedia articles, because they often let you search an online version of the book for specific words or phrases, or browse through the book (although for copyright reasons the entire book is usually not available). At the Open Library (part of the Internet Archive) you can borrow and read entire books online. Non-English book sources If the book you are looking for is in a language other than English, you might find it helpful to look at the equivalent pages on other Wikipedias; they are more likely to have sources appropriate for that language. Find other editions The WorldCat xISBN tool for finding other editions is no longer available; however, there is often a "view all editions" link on the results page from an ISBN search, and Google Books often lists other editions of a book and related books under the "about this book" link. You can convert between 10- and 13-digit ISBNs with online tools. |
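The 10-to-13-digit conversion mentioned above is mechanical: take the first nine digits of the ISBN-10 (dropping its old check digit), prepend the prefix 978, and recompute the check digit, which for ISBN-13 weights the digits alternately by 1 and 3. A minimal sketch in Python (the function name is illustrative, not taken from any tool named here):

```python
def isbn10_to_isbn13(isbn10: str) -> str:
    """Convert a 10-digit ISBN to its 13-digit form (978 prefix)."""
    digits = isbn10.replace("-", "").replace(" ", "")  # dashes/spaces don't matter
    if len(digits) != 10:
        raise ValueError("expected a 10-digit ISBN")
    body = "978" + digits[:9]  # drop the old ISBN-10 check digit
    # ISBN-13 check digit: weight digits alternately 1, 3, take mod 10
    total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(body))
    check = (10 - total % 10) % 10
    return body + str(check)

# The ISBN cited in this page's source header:
print(isbn10_to_isbn13("0-939923-04-1"))  # -> 9780939923045
```

Converting in the other direction is simply stripping the 978 prefix and recomputing the ISBN-10 check digit, which instead uses weights 10 down to 2 modulo 11.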
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Siddur] | [TOKENS: 3330] |
Siddur A siddur (Hebrew: סִדּוּר sīddūr, [siˈduʁ, 'sɪdəʁ]; plural siddurim סִדּוּרִים [siduˈʁim]) is a Jewish prayer book containing a set order of daily prayers. The word siddur comes from the Hebrew root ס־ד־ר, meaning 'order.' Other terms for prayer books are tefillot (תְּפִלּוֹת) among Sephardi Jews, tefillah among German Jews, and tiklāl (תכלאל) among Yemenite Jews. History The earliest parts of Jewish prayer books are the Shema Yisrael ("Hear O Israel") (Deuteronomy 6:4 et seq) and the Priestly Blessing (Numbers 6:24-26), which are in the Torah. A set of eighteen (currently nineteen) blessings called the Shemoneh Esreh or the Amidah (Hebrew, "standing [prayer]") is traditionally ascribed to the Great Assembly in the time of Ezra, at the end of the biblical period. The name Shemoneh Esreh, literally "eighteen", is a historical anachronism, since it now contains nineteen blessings. It was only near the end of the Second Temple period that the eighteen prayers of the weekday Amidah became standardized. Even at that time their precise wording and order were not yet fixed, and varied from locale to locale. Many modern scholars believe that parts of the Amidah came from the Hebrew apocryphal work Ben Sira. According to the Talmud, soon after the destruction of the Temple in Jerusalem a formal version of the Amidah was adopted at a rabbinical council in Yavne, under the leadership of Gamaliel II and his colleagues. However, the precise wording was still left open: the order, general ideas, and opening and closing lines were fixed, while most of the wording was left to the individual reader. It was not until several centuries later that the prayers began to be formally fixed. By the Middle Ages the texts of the prayers were nearly fixed, and in the form in which they are still used today, albeit with significant variations across communities.
Scholars generally trace the formation of fixed Jewish prayer to the Geonic period (7th–11th centuries), when rabbinic leaders in Babylonia created more standardized forms of liturgy to meet the needs of their communities. During this time, the first complete siddurim appeared, including the important 10th-century prayer book compiled by Saadia Gaon. In the centuries that followed, different Jewish communities developed their own rites (nuschaot), shaped by local customs and halakhic interpretations. The Siddur was printed by Soncino in Italy as early as 1486, though a Siddur was first mass-distributed only in 1865. The Siddur began appearing in the vernacular as early as 1538. The first English translation was published in London in 1738 by an author writing under the pseudonym Gamaliel ben Pedahzur; a different translation was released in the United States in 1837. Readings from the Torah (five books of Moses) and the Nevi'im ("Prophets") form part of the prayer services. To this framework various Jewish sages added, from time to time, various prayers, and, for festivals especially, numerous hymns. The earliest existing codification of the prayerbook was drawn up by Amram ben Sheshna of Sura Academy in Sawad, the Abbasid Caliphate, an area known as "Babylonia" in Jewish texts, about 850 CE (Seder Rav ʿAmram). Half a century later, Saadia Gaon, also of Sura, composed a siddur (see Siddur of Saadia Gaon), in which the rubrical matter is in Judeo-Arabic. These were the basis of Simhah ben Samuel of Vitry's 11th century Machzor Vitry, which was based on the ideas of his teacher, Rashi. Another formulation of the prayers was that appended by Maimonides to the Book of Love in his Mishneh Torah: this forms the basis of the Yemenite liturgy, and has had some influence on other rites. From this point forward all Jewish prayerbooks had the same basic order and contents. 
Two authoritative versions of the Ashkenazi siddur were those of Shabbetai Sofer in the 16th century and Seligman Baer in the 19th century; siddurim have also been published reflecting the views of Jacob Emden and the Vilna Gaon. There are differences between, among others, the Sephardic (including Spanish and Portuguese and Mizrachim), Teimani (Yemenite), Hasidic, Ashkenazic (divided into German, Polish and other European and eastern-European rites), Bené Roma or Italkim, Romaniote (Greek, once extending to Turkey, Crimea and the southern Italian peninsula) and also Persian, Kurdish, Bukharian, Georgian, Mountain Jewish, Ethiopian and Cochin-Jewish liturgies. Most of these are slight differences in the wording of the prayers; for instance, Oriental Sephardic and some Hasidic prayer books state "חננו מאתך חכמה בינה ודעת", "Graciously bestow upon us from You wisdom (ḥochmah), understanding (binah) and knowledge (daat)", in allusion to the Kabbalistic sefirot of those names, while the Nusach Ashkenaz, as well as Western Sephardic and other Hasidic versions, retain the older wording "חננו מאתך דעה בינה והשכל", "Graciously bestow upon us from You knowledge, understanding, and reason". In some cases, however, the order of the preparation for the Amidah is drastically different, reflecting the different halakhic and kabbalistic formulae that the various scholars relied on in assembling their prayer books, as well as the minhagim, or customs, of their locales. Some forms of the Sephardic rite are considered to be very overtly kabbalistic, depending on how far they reflect the ritual of Isaac Luria (see Lurianic Kabbalah). This is partly because the Tetragrammaton frequently appears with varying vowel points beneath the letters (unpronounced, but to be meditated upon) and different Names of God appear in small print within the final hei (ה) of the Tetragrammaton.
In some editions, there is a Psalm in the preparations for the Amidah that is printed in the outline of a menorah, and the worshipper meditates on this shape as he recites the psalm. While the Ashkenazic rite does contain some kabbalistic elements, such as acrostics and allusions to the sefirot ("To You, God, is the greatness [gedullah], and the might [gevurah], and the glory [tiferet], longevity [netzach],..." etc.), these are not easily seen unless the reader is already initiated. It is notable that although many other traditions avoid using the poem Anim Zemiroth on the Sabbath, for fear that its holiness would be less appreciated due to the frequency of the Sabbath, the poem is usually sung by Ashkenazi congregations before concluding the Sabbath Musaf service with the daily psalm. The ark is opened for the duration of the song. Hasidim, though usually ethnically Ashkenazi, usually use liturgies with varying degrees of Sephardic influence, such as Nusach Sefard and Nusach Ari, in order to follow the order of the prayers set by Rabbi Isaac Luria, often called "Ari HaKadosh", or "The Holy Lion". Although the Ari himself was born Ashkenazi, he borrowed many elements from Sephardi and other traditions, since he felt that they followed Kabbalah and Halacha more faithfully. The Ari did not publish any siddur, but orally transmitted his particular usages to his students with interpretations and certain meditations. Many siddurim containing some form of the Sephardic rite together with the usages of the Ari were published, both by actual Sephardic communities and for the use of Hasidim and other Ashkenazim interested in Kabbalah. In 1803, Rabbi Schneur Zalman of Liadi compiled an authoritative siddur from the sixty siddurim that he checked for compliance with Hebrew grammar, Jewish law, and Kabbalah: some call this siddur "Nusach Ari", and it is used by Lubavitch Hasidim.
Those that use Nusach HaAri claim that it is an all-encompassing nusach that is valid for any Jew, no matter what his ancestral tribe or identity, a view attributed to the Maggid of Mezeritch. The Mahzor of each rite is distinguished by hymns (piyyutim). The most important writers are Jose ben Jose, probably in the 4th-5th century CE, chiefly known for his compositions for Rosh Hashanah and Yom Kippur; Yannai; Eleazar Kalir, the founder of the payyetanic style, perhaps in the 7th century; Saadia Gaon; the Spanish school, consisting of Joseph ibn Abitur (died in 970), ibn Gabirol, Isaac Gayyath, Moses ibn Ezra, Abraham ibn Ezra and Judah ha-Levi, Moses ben Nahman (Nahmanides) and Isaac Luria; and the Ashkenazic and French schools, including Shimon bar Yitzchak, Meir bar Yitzchak and many others. The Ari recited only early piyyutim, such as those by Eleazar Kalir, but did not like the Sephardic piyyutim. Therefore, on holidays he would daven (recite the prescribed liturgical prayers) with Ashkenazim—as opposed to his practice the rest of the year to daven with Sephardim—in order to recite their piyyutim, which include many more of the early piyyutim. For this reason, many Hasidim (such as Belz and Viznitz) recite many piyyutim on Yom Tov and the sabbaths of the four special portions preceding Passover, in accordance with the practice of the Ari. However, the Sephardic communities which accepted most of the practices of the Ari never accepted the Ashkenazic piyyutim. Complete and weekday siddurim Some siddurim have only prayers for weekdays; others have prayers for weekdays and Shabbat. Many have prayers for weekdays, Shabbat, and the three Biblical festivals, Sukkot (the feast of Tabernacles), Shavuot (the feast of weeks) and Pesach (Passover). The latter are referred to as a Siddur Shalem ("complete siddur"). Variations and additions on holidays Popular siddurim Below are listed many popular siddurim used by religious Jews.
This list mostly excludes prayer books specifically for the High Holidays; see Machzor (Popular versions). These siddurim follow the halakha of Rabbi Ovadia Yosef (1920–2013), a Talmudic scholar, authority on Jewish religious law, and spiritual leader of Israel's ultra-orthodox Shas party. Yosef believed that the Sephardic halakhic tradition favoured leniency, and these principles are reflected in his siddurim. Note that these siddurim are also used by the Edot Ha-mizrach communities. Notable editions range from those characterised by a relative absence of Kabbalistic elements to those characterised by their presence; among the latter, the Moroccan siddurim generally contain fewer Kabbalistic elements. The Baladi Jews (from Arabic balad, country) follow the legal rulings of the Rambam (Maimonides) as codified in his work the Mishneh Torah. Rabbi Yiḥye Tsalaḥ (Maharits) revised this liturgy to end friction between traditionalists (who followed Rambam's rulings and the siddur as it developed in Yemen) and Kabbalists who followed the innovations of the Ari. This prayer book makes very few additions or changes and substantially follows the older Yemenite tradition as it had existed prior to this conflict. The Shami Jews (from Arabic ash-Sham, the north, referring to Palestine or Damascus) represent those who accepted the Sephardic rite after being exposed to new inexpensive, typeset prayer books brought from Israel and the Sephardic diaspora by envoys and merchants in the late 17th century and 18th century. The "local rabbinic leadership resisted the new versions....Nevertheless, the new prayer books were widely accepted." As part of that process, the Shami modified their rites to accommodate the usages of the Ari to the maximum extent.
The text of the Shami siddur now largely follows the Sephardic tradition, though the pronunciation, chant and customs are still Yemenite in flavour. Several Reform prayer books are published by the Central Conference of American Rabbis. Reconstructionist prayer books include those edited by Rabbi Mordecai Kaplan and others, as well as the Kol Haneshamah Prayerbook series, ed. David Teutsch. Feminist siddurim Siddur Nashim, by Margaret Wenig and Naomi Janowitz in 1976, was the first Jewish prayer book to refer to God using female pronouns and imagery. Reconstructionist Rabbi Rebecca Alpert (Reform Judaism, Winter 1991) commented: The experience of praying with Siddur Nashim... transformed my relationship with God. For the first time, I understood what it meant to be made in God's image. To think of God as a woman like myself, to see Her as both powerful and nurturing, to see Her imaged with a woman's body, with womb, with breasts – this was an experience of ultimate significance. Was this the relationship that men have had with God for all these millennia? How wonderful to gain access to those feelings and perceptions. Following in the footsteps of feminist prayerbooks, liberal prayerbooks increasingly tend to avoid male-specific words and pronouns, so that all references to God in translations are made in gender-neutral language. For example, the UK Liberal movement's Siddur Lev Chadash (1995) does so, as does the UK Reform Movement's Forms of Prayer (2008). In Mishkan T'filah, the American Reform Jewish prayer book released in 2007, references to God as "He" have been removed, and whenever Jewish patriarchs are named (Abraham, Isaac, and Jacob), so also are the matriarchs (Sarah, Rebecca, Rachel, and Leah). Humanistic and atheist siddurim Yoreh writes about his work: "I think prayer is communal and private expression of hopes, fears, an appreciation of aesthetic beauty, good attributes. But that has nothing to do with God."
Other siddurim There are also some Karaite, Samaritan and Sabbatean prayer books. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Social_norms] | [TOKENS: 5433] |
Social norm A social norm or norm is a shared standard of acceptable behavior by a group. Social norms can be informal understandings that govern the behavior of members of a society, and they can also be codified into rules and laws. Social normative influences, or social norms, are regarded as powerful drivers of human behavioural change, and they are well integrated into the major theories that explain human behaviour. Institutions are composed of multiple norms. Norms are shared social beliefs about behavior; thus, they are distinct from "ideas", "attitudes", and "values", which can be held privately, and which do not necessarily concern behavior. Norms are contingent on context, social group, and historical circumstances. Scholars distinguish between regulative norms (which constrain behavior), constitutive norms (which shape interests), and prescriptive norms (which prescribe what actors ought to do). The effects of norms can be determined by a logic of appropriateness and a logic of consequences; the former entails that actors follow norms because it is socially appropriate, and the latter entails that actors follow norms because of cost-benefit calculations. Others see social norms emerging as part of an evolutionarily stable strategy, where they stabilize through third-party punishment. Three stages have been identified in the life cycle of a norm: (1) Norm emergence – norm entrepreneurs seek to persuade others of the desirability and appropriateness of certain behaviors; (2) Norm cascade – when a norm obtains broad acceptance; and (3) Norm internalization – when a norm acquires a "taken-for-granted" quality. Norms are robust to various degrees: some norms are often violated, whereas other norms are so deeply internalized that violations are infrequent. Evidence for the existence of norms can be detected in the patterns of behavior within and across groups, as well as in the articulation of norms in group discourse.
Definition There are varied definitions of social norms, but scholars broadly agree on their core features. In 1965, Jack P. Gibbs identified three basic normative dimensions under which all concepts of norms could be subsumed. According to Ronald Jepperson, Peter Katzenstein and Alexander Wendt, "norms are collective expectations about proper behavior for a given identity." Wayne Sandholtz argues against this definition, as he writes that shared expectations are an effect of norms, not an intrinsic quality of norms. Sandholtz, Martha Finnemore and Kathryn Sikkink define norms instead as "standards of appropriate behavior for actors with a given identity." In this definition, norms have an "oughtness" quality to them. Michael Hechter and Karl-Dieter Opp define norms as "cultural phenomena that prescribe and proscribe behavior in specific circumstances." Sociologists Christine Horne and Stefanie Mollborn define norms as "group-level evaluations of behavior." This entails that norms are widespread expectations of social approval or disapproval of behavior. Scholars debate whether social norms are individual constructs or collective constructs. Economist and game theorist Peyton Young defines norms as "patterns of behavior that are self-enforcing within a group." He emphasizes that norms are driven by shared expectations: "Everyone conforms, everyone is expected to conform, and everyone wants to conform when they expect everyone else to conform." He characterizes norms as devices that "coordinate people's expectations in interactions that possess multiple equilibria." Concepts such as "conventions", "customs", "morals", "mores", "rules", and "laws" have been characterized as equivalent to norms. Institutions can be considered collections or clusters of multiple norms. Rules and norms are not necessarily distinct phenomena: both are standards of conduct that can have varying levels of specificity and formality. Laws are a highly formal version of norms.
Laws, rules and norms may be at odds; for example, a law may prohibit something but norms still allow it. Norms are not the equivalent of an aggregation of individual attitudes. Ideas, attitudes and values are not necessarily norms, as these concepts do not necessarily concern behavior and may be held privately. "Prevalent behaviors" and behavioral regularities are not necessarily norms. Instinctual or biological reactions, personal tastes, and personal habits are not necessarily norms. Emergence and transmission Groups may adopt norms in a variety of ways. Some stable and self-reinforcing norms may emerge spontaneously without conscious human design. Peyton Young goes as far as to say that "norms typically evolve without top-down direction... through interactions of individuals rather than by design." Norms may develop informally, emerging gradually as a result of repeated use of discretionary stimuli to control behavior. Though not necessarily laws set in writing, informal norms represent generally accepted and widely sanctioned routines that people follow in everyday life. These informal norms, if broken, may not invite formal legal punishments or sanctions, but instead encourage reprimands, warnings, or othering; incest, for example, is generally thought of as wrong in society, but many jurisdictions do not legally prohibit it. Norms may also be created and advanced through conscious human design by norm entrepreneurs. Norms can arise formally, where groups explicitly outline and implement behavioral expectations. Legal norms typically arise from design. We follow a large number of these norms 'naturally', such as driving on the right side of the road in the US and on the left side in the UK, or not speeding in order to avoid a ticket.
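Peyton Young's characterization of norms as devices that "coordinate people's expectations in interactions that possess multiple equilibria" can be made concrete with the driving-side example: in a two-player game where drivers benefit only from matching sides, both "everyone drives left" and "everyone drives right" are self-enforcing conventions. A minimal sketch (the payoff numbers and function name are illustrative assumptions, not from any cited study):

```python
# Two drivers each choose a side; they "win" only by matching (a coordination game).
PAYOFF = {("left", "left"): (1, 1), ("right", "right"): (1, 1),
          ("left", "right"): (0, 0), ("right", "left"): (0, 0)}

def is_nash(a: str, b: str) -> bool:
    """A profile is self-enforcing if neither driver gains by unilaterally switching."""
    pa, pb = PAYOFF[(a, b)]
    other = {"left": "right", "right": "left"}
    return pa >= PAYOFF[(other[a], b)][0] and pb >= PAYOFF[(a, other[b])][1]

equilibria = [(a, b) for a in ("left", "right") for b in ("left", "right") if is_nash(a, b)]
print(equilibria)  # -> [('left', 'left'), ('right', 'right')]
```

The game has two equilibria, and nothing in the payoffs favors one over the other; which convention a society lands on (right in the US, left in the UK) is settled by history and expectation rather than by design, which is exactly the sense in which the norm coordinates expectations.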
Martha Finnemore and Kathryn Sikkink identify three stages in the life cycle of a norm: emergence, cascade, and internalization. They argue that several factors may raise the influence of certain norms. Christine Horne and Stefanie Mollborn have identified two broad categories of arguments for the emergence of norms: consequentialist and relational. Per consequentialism, norms contribute to the collective good. However, per relationalism, norms do not necessarily contribute to the collective good; norms may even be harmful to the collective. Some scholars have characterized norms as essentially unstable, thus creating possibilities for norm change. According to Wayne Sandholtz, actors are more likely to persuade others to modify existing norms if they possess power, can reference existing foundational meta-norms, and can reference precedents. Social closeness between actors has been characterized as a key component in sustaining social norms. Individuals may also import norms from a previous organization to their new group, and these can be adopted over time. Without a clear indication of how to act, people typically rely on their history to determine the best course forward; what was successful before may serve them well again. In a group, individuals may all import different histories or scripts about appropriate behaviors; common experience over time will lead the group to define as a whole its take on the right action, usually with the integration of several members' schemas. Under the importation paradigm, norm formation occurs subtly and swiftly, whereas formal or informal development of norms may take longer. Groups internalize norms by accepting them as reasonable and proper standards for behavior within the group. Once firmly established, a norm becomes a part of the group's operational structure and hence more difficult to change. While it is possible for newcomers to a group to change its norms, it is much more likely that the new individual will adopt the group's norms, values, and perspectives, rather than the other way around.
Deviance from social norms Deviance is defined as "nonconformity to a set of norms that are accepted by a significant number of people in a community or society". More simply put, if group members do not follow a norm, they become tagged as deviant. In the sociological literature, this can often lead to their being considered outcasts of society. Yet deviant behavior amongst children is somewhat expected; when deviance manifests as a criminal act, however, the social tolerance extended in the example of the child is quickly withdrawn from the criminal. Crime is considered one of the most extreme forms of deviancy according to scholar Clifford R. Shaw. What is considered "normal" is relative to the culture in which the social interaction is taking place. In psychology, an individual who routinely disobeys group norms runs the risk of becoming the "institutionalized deviant". Similar to the sociological definition, institutionalized deviants may be judged by other group members for their failure to adhere to norms. At first, group members may increase pressure on a non-conformist, attempting to engage the individual in conversation or explicate why he or she should follow their behavioral expectations. Whether one decides to conform is determined largely by how one's actions will affect others. Especially with new members, who perhaps do not know any better, groups may use discretionary stimuli to bring an individual's behavior back into line. Over time, however, if members continue to disobey, the group will give up on them as a lost cause; while the group may not necessarily revoke their membership, it may give them only superficial consideration. If a worker is late to a meeting, for example, violating the office norm of punctuality, a supervisor or other co-worker may wait for the individual to arrive and pull him aside later to ask what happened.
If the behavior continues, eventually the group may begin meetings without him since the individual "is always late." The group generalizes the individual's disobedience and promptly dismisses it, thereby reducing the member's influence and footing in future group disagreements. Group tolerance for deviation varies across membership; not all group members receive the same treatment for norm violations. Individuals may build up a "reserve" of good behavior through conformity, which they can borrow against later. These idiosyncrasy credits provide a theoretical currency for understanding variations in group behavioral expectations. A teacher, for example, may more easily forgive a straight-A student—who has past "good credit" saved up—for misbehaving than a repeatedly disruptive student. While past performance can help build idiosyncrasy credits, some group members have a higher balance to start with. Individuals can import idiosyncrasy credits from another group; childhood movie stars, for example, who enroll in college may experience more leeway in adopting school norms than other incoming freshmen. Finally, leaders or individuals in other high-status positions may begin with more credits and appear to be "above the rules" at times. Even their idiosyncrasy credits are not bottomless, however; while held to a more lenient standard than the average member, leaders may still face group rejection if their disobedience becomes too extreme. Deviance also triggers multiple emotions in a person who goes against a norm. One emotion widely attributed to deviance is guilt. Guilt is connected to the ethics of duty, which in turn becomes a primary object of moral obligation. Guilt follows an action that is questioned after the fact; it can be described as something negative to the self as well as a negative state of feeling. In both senses it is an unpleasant feeling and a form of self-punishment.
Using the metaphor of "dirty hands", guilt is the staining or tainting of oneself, and therefore the need to self-cleanse away the filth. It is a form of reparation that confronts oneself, as well as a submission to the possibility of anger and punishment from others. Guilt is a point in both action and feeling that acts as a stimulus for further "honorable" actions. A 2023 study found that non-industrial societies varied in their punishments of norm violations. Punishment varied based on the types of norm violations and the socio-economic system of the society. The study "found evidence that reputational punishment was associated with egalitarianism and the absence of food storage; material punishment was associated with the presence of food storage; physical punishment was moderately associated with greater dependence on hunting; and execution punishment was moderately associated with social stratification." Behavior Whereas ideas in general do not necessarily have behavioral implications, Martha Finnemore notes that "norms by definition concern behavior. One could say that they are collectively held ideas about behavior." Norms running counter to the behaviors of the overarching society or culture may be transmitted and maintained within small subgroups of society. For example, Crandall (1988) noted that certain groups (e.g., cheerleading squads, dance troupes, sports teams, sororities) have a rate of bulimia, a publicly recognized life-threatening disease, that is much higher than society as a whole. Social norms have a way of maintaining order and organizing groups. In the field of social psychology, the roles of norms are emphasized—they can guide behavior in a certain situation or environment as "mental representations of appropriate behavior". It has been shown that normative messages can promote pro-social behavior, including decreasing alcohol use, increasing voter turnout, and reducing energy use.
According to the psychological definition of social norms' behavioral component, norms have two dimensions: how much a behavior is exhibited, and how much the group approves of that behavior. Social control Although not considered formal laws within society, norms still work to promote a great deal of social control. They are statements that regulate conduct. The norm, as a cultural phenomenon, prescribes acceptable behavior in specific instances. Though norms vary with culture, race, religion, and geographical location, they are the foundation of widely shared standards of conduct: do not injure others, follow the golden rule, and keep promises that have been pledged. Without them, there would be a world without consensus, common ground, or restrictions. Even though the law and a state's legislation are not intended to control social norms, society and the law are inherently linked, and one dictates the other. This is why it has been said that the language used in some legislation is controlling and dictating of what should or should not be accepted. For example, the criminalization of familial sexual relations is said to protect those who are vulnerable; however, even consenting adults cannot have sexual relationships with their relatives. The language surrounding these laws conveys the message that such acts are supposedly immoral and should be condemned, even though there is no actual victim in these consenting relationships. Social norms can be enforced formally (e.g., through sanctions) or informally (e.g., through body language and non-verbal communication cues). Because individuals often derive physical or psychological resources from group membership, groups are said to control discretionary stimuli; groups can withhold or give out more resources in response to members' adherence to group norms, effectively controlling member behavior through rewards and operant conditioning.
Social psychology research has found that the more an individual values group-controlled resources, or the more an individual sees group membership as central to his definition of self, the more likely he is to conform. Social norms also allow an individual to assess what behaviors the group deems important to its existence or survival, since they represent a codification of belief; groups generally do not punish members or create norms over actions which they care little about. Norms in every culture create conformity that allows people to become socialized to the culture in which they live. As social beings, individuals learn when and where it is appropriate to say certain things, use certain words, discuss certain topics or wear certain clothes, and when it is not. Thus, knowledge about cultural norms is important for impression management, which is an individual's regulation of their nonverbal behavior. One also comes to know through experience what types of people he or she can and cannot discuss certain topics with, or wear certain types of dress around. Typically, this knowledge is derived through experience (i.e., social norms are learned through social interaction). Wearing a suit to a job interview in order to give a great first impression represents a common example of a social norm in the white-collar workforce. In his work "Order without Law: How Neighbors Settle Disputes", Robert Ellickson studies various interactions between members of neighbourhoods and communities to show how societal norms create order within a small group of people. He argues that, in a small community or neighborhood, many rules and disputes can be settled without a central governing body, simply by the interactions within these communities. Sociology In sociology, norms are seen as rules that bind an individual's actions to a specific sanction in one of two forms: a punishment or a reward.
Through regulation of behavior, social norms create unique patterns that allow for distinguishing characteristics to be made between social systems. This creates a boundary that allows for a differentiation between those that belong in a specific social setting and those that do not. Research in social psychology distinguishes between descriptive norms (what people commonly do) and injunctive norms (what people ought to do). In a field experiment at Petrified Forest National Park, Cialdini and colleagues found that signs emphasizing descriptive norms (e.g., "Many past visitors have removed petrified wood") inadvertently increased theft, whereas signs stating injunctive norms (e.g., "Please don't remove the petrified wood") reduced it. This demonstrates that norms can be leveraged for social influence, but the type of norm communicated matters critically for behavioral outcomes. For Talcott Parsons of the functionalist school, norms dictate the interactions of people in all social encounters. On the other hand, Karl Marx believed that norms are used to promote the creation of roles in society which allows for people of different levels of social class structure to be able to function properly. Marx claims that this power dynamic creates social order. James Coleman used both micro and macro conditions for his theory. For Coleman, norms start out as goal-oriented actions by actors on the micro level. If the benefits do not outweigh the costs of the action for the actors, then a social norm would emerge. The norm's effectiveness is then determined by its ability to enforce its sanctions against those who would not contribute to the "optimal social order." Heinrich Popitz is convinced that the establishment of social norms, which make the future actions of alter foreseeable for ego, solves the problem of contingency (Niklas Luhmann).
In this way, ego can count on those actions as if they had already been performed and does not have to wait for their actual execution; social interaction is thus accelerated. Important factors in the standardization of behavior are sanctions and social roles. The probability of these behaviours occurring again is discussed in the theories of B. F. Skinner, who states that operant conditioning plays a role in the process of social norm development. Operant conditioning is the process by which behaviours are changed as a function of their consequences. The probability that a behaviour will occur can be increased or decreased depending on the consequences of said behaviour. In the case of social deviance, an individual who has gone against a norm will contact the negative contingencies associated with deviance; this may take the form of formal or informal rebuke, social isolation or censure, or more concrete punishments such as fines or imprisonment. If one reduces the deviant behavior after receiving a negative consequence, then they have learned via punishment. If they have engaged in a behavior consistent with a social norm after having an aversive stimulus reduced, then they have learned via negative reinforcement. Reinforcement increases behavior, while punishment decreases behavior. As an example of this, consider a child who has painted on the walls of her house; if she has never done this before, she may immediately seek a reaction from her mother or father. The form of reaction taken by the mother or father will affect whether the behaviour is likely to occur again in the future. If her parent is positive and approving of the behaviour, it will likely recur (reinforcement); however, if the parent offers an aversive consequence (physical punishment, time-out, anger, etc.), then the child is less likely to repeat the behaviour in the future (punishment).
Skinner also states that humans are conditioned from a very young age on how to behave and how to act with those around them, given the outside influences of the society and location they are in. Because people are conditioned to blend into the ambiance and attitudes around them, deviance is a frowned-upon action. Focus theory of normative conduct Cialdini, Reno, and Kallgren developed the focus theory of normative conduct to describe how individuals implicitly juggle multiple behavioral expectations at once. Expanding on conflicting prior beliefs about whether cultural, situational or personal norms motivate action, the researchers suggested that the focus of an individual's attention will dictate which behavioral expectation they follow. Types There is no clear consensus on how the term norm should be used. Martha Finnemore and Kathryn Sikkink distinguish between three types of norms. Finnemore, Sikkink, Jeffrey W. Legro and others have argued that the robustness (or effectiveness) of norms can be measured by several factors. Christine Horne argues that the robustness of a norm is shaped by the degree of support for the actors who sanction deviant behaviors; she refers to norms regulating how to enforce norms as "metanorms." According to Beth G. Simmons and Hyeran Jo, diversity of support for a norm can be a strong indicator of robustness. They add that institutionalization of a norm raises its robustness. It has also been posited that norms that exist within broader clusters of distinct but mutually reinforcing norms may be more robust. Jeffrey Checkel argues that there are two common types of explanations for the efficacy of norms. Peyton Young describes several mechanisms that support normative behavior. Descriptive norms depict what happens, while injunctive norms describe what should happen. Cialdini, Reno, and Kallgren (1990) define a descriptive norm as people's perceptions of what is commonly done in specific situations; it signifies what most people do, without assigning judgment.
The absence of trash on the ground in a parking lot, for example, transmits the descriptive norm that most people there do not litter. An injunctive norm, on the other hand, transmits group approval about a particular behavior; it dictates how an individual should behave. Watching another person pick up trash off the ground and throw it out, a group member may pick up on the injunctive norm that he ought not to litter. Prescriptive norms are unwritten rules that are understood and followed by society and indicate what we should do. Expressing gratitude or writing a thank-you card when someone gives you a gift represents a prescriptive norm in American culture. Proscriptive norms, in contrast, comprise the other end of the same spectrum; they are similarly society's unwritten rules about what one should not do. These norms can vary between cultures; while kissing someone you just met on the cheek is an acceptable greeting in some European countries, it is not acceptable, and thus represents a proscriptive norm, in the United States. Subjective norms are determined by beliefs about the extent to which important others want a person to perform a behavior. When combined with attitude toward behavior, subjective norms shape an individual's intentions. Social influences are conceptualized in terms of the pressure that people perceive from important others to perform, or not to perform, a behavior. Social psychologist Icek Ajzen theorized that subjective norms are determined by the strength of a given normative belief, weighted by the significance of the social referent, as represented in the following equation: SN ∝ Σᵢ nᵢmᵢ, where nᵢ is a normative belief and mᵢ is the motivation to comply with that belief. Mathematical representations Over the last few decades, several theorists have attempted to explain social norms from a more theoretical point of view.
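Ajzen's subjective-norm expression SN ∝ Σᵢ nᵢmᵢ can be sketched numerically. In the following snippet, the referents, belief strengths, and motivations to comply are invented example values on assumed rating scales, not data from any study:

```python
# Illustrative sketch of Ajzen's subjective norm: SN ∝ sum of n_i * m_i.
# Referents and numeric values below are hypothetical, for demonstration only.

def subjective_norm(beliefs):
    """Sum normative belief strength (n) times motivation to comply (m)
    across all social referents."""
    return sum(n * m for n, m in beliefs.values())

# Each referent maps to (n_i, m_i): belief that the referent approves the
# behavior (e.g., scaled -3..+3) and motivation to comply (e.g., 1..7).
beliefs = {
    "parents":  (3, 6),   # strongly approve, high motivation to comply
    "friends":  (-1, 4),  # mildly disapprove, moderate motivation
    "employer": (2, 2),   # approve, low motivation
}

print(subjective_norm(beliefs))  # 3*6 + (-1)*4 + 2*2 = 18
```

Because each belief is weighted by motivation to comply, a referent the individual cares little about (low mᵢ) contributes little to the overall subjective norm even if that referent's approval or disapproval is strong.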
By quantifying behavioral expectations graphically or attempting to plot the logic behind adherence, theorists hoped to be able to predict whether or not individuals would conform. The return potential model and game theory provide a slightly more economic conceptualization of norms, suggesting individuals can calculate the cost or benefit behind possible behavioral outcomes. Under these theoretical frameworks, choosing to obey or violate norms becomes a more deliberate, quantifiable decision. Developed in the 1960s, the return potential model provides a method for plotting and visualizing group norms. In the regular coordinate plane, the amount of behavior exhibited is plotted on the X-axis (label a in Figure 1) while the amount of group acceptance or approval gets plotted on the Y-axis (b in Figure 1). The graph represents the potential return or positive outcome to an individual for a given behavioral norm. Theoretically, one could plot, for each increment of behavior, how much the group likes or dislikes that action. For example, it may be the case that among first-year graduate students, strong social norms exist around how many daily cups of coffee a student drinks. If the return curve in Figure 1 correctly displays the example social norm, we can see that if someone drinks 0 cups of coffee a day, the group strongly disapproves. The group disapproves of the behavior of any member who drinks fewer than four cups of coffee a day; the group also disapproves of drinking more than seven cups, shown by the approval curve dipping back below zero. As seen in this example, the return potential model displays how much group approval one can expect for each increment of behavior. Another general formal framework that can be used to represent the essential elements of the social situation surrounding a norm is the repeated game of game theory. Rational choice, a branch of game theory, deals with the relations and actions socially committed among rational agents.
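The coffee example can be sketched as a toy return-potential curve. The functional form and constants below are illustrative assumptions chosen only to reproduce the qualitative shape described (approval between four and seven cups, disapproval outside that range), not a fitted model:

```python
# Toy return-potential curve: behavior (daily cups of coffee) on the x-axis,
# group approval on the y-axis. Shape and numbers are invented for illustration.

def approval(cups):
    """Hypothetical group approval for a given daily coffee intake."""
    # Peak approval near 5.5 cups; the curve dips below zero (disapproval)
    # for fewer than 4 cups and for more than 7 cups.
    peak = 5.5
    return 2.0 - (cups - peak) ** 2 / 1.5

for cups in range(10):
    a = approval(cups)
    verdict = "approves" if a > 0 else "disapproves"
    print(f"{cups} cups -> approval {a:+.2f} ({verdict})")
```

Reading the sign of the curve at each increment recovers the verbal description: the group disapproves at 0–3 cups, approves at 4–7, and disapproves again at 8 or more.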
A norm gives a person a rule of thumb for how they should behave. However, a rational person acts according to the rule only if it is beneficial for them. The situation can be described as follows. A norm gives an expectation of how other people act in a given situation (macro). A person acts optimally given the expectation (micro). For a norm to be stable, people's actions must reconstitute the expectation without change (micro-macro feedback loop). A set of such correct stable expectations is known as a Nash equilibrium. Thus, a stable norm must constitute a Nash equilibrium. In a Nash equilibrium, no actor has a positive incentive to deviate individually from a given action. In most game-theoretic approaches, a social norm is sustained when the actions it prescribes are supported by a Nash equilibrium. From a game-theoretical point of view, there are two explanations for the vast variety of norms that exist throughout the world. One is the difference in games. Different parts of the world may give different environmental contexts and different people may have different values, which may result in a difference in games. The other is equilibrium selection not explicable by the game itself. Equilibrium selection is closely related to coordination. For a simple example, driving is common throughout the world, but in some countries people drive on the right and in other countries people drive on the left (see coordination game). A framework called comparative institutional analysis is proposed to deal with the game theoretical structural understanding of the variety of social norms.
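The driving-side coordination game can be checked mechanically for Nash equilibria. The payoffs below are an illustrative assumption (1 for matching sides, i.e., no collision; 0 otherwise):

```python
# Best-response check for a two-player driving-side coordination game.
# Payoff assumption: both drivers earn 1 if they choose the same side, else 0.

SIDES = ("left", "right")

def payoff(mine, other):
    """Payoff to a driver choosing `mine` when the other driver chooses `other`."""
    return 1 if mine == other else 0

def is_nash(a, b):
    """(a, b) is a Nash equilibrium if neither driver gains by deviating alone."""
    a_ok = all(payoff(a, b) >= payoff(alt, b) for alt in SIDES)
    b_ok = all(payoff(b, a) >= payoff(alt, a) for alt in SIDES)
    return a_ok and b_ok

for a in SIDES:
    for b in SIDES:
        print(a, b, "-> Nash equilibrium" if is_nash(a, b) else "-> not stable")
```

Both (left, left) and (right, right) come out as equilibria: two equally stable norms. This illustrates the equilibrium-selection point in the text, since nothing inside the game determines which of the two a society actually adopts.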
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/United_States#cite_note-138] | [TOKENS: 17273] |
Contents United States The United States of America (USA), also known as the United States (U.S.) or America, is a country primarily located in North America. It is a federal republic of 50 states and a federal capital district, Washington, D.C. The 48 contiguous states border Canada to the north and Mexico to the south, with the semi-exclave of Alaska in the northwest and the archipelago of Hawaii in the Pacific Ocean. The United States also asserts sovereignty over five major island territories and various uninhabited islands in Oceania and the Caribbean.[j] It is a megadiverse country, with the world's third-largest land area[c] and third-largest population, exceeding 341 million.[k] Paleo-Indians first migrated from North Asia to North America at least 15,000 years ago, and formed various civilizations. Spanish colonization established Spanish Florida in 1513, the first European colony in what is now the continental United States. British colonization followed with the 1607 settlement of Virginia, the first of the Thirteen Colonies. Enslavement of Africans was practiced in all colonies by 1770 and supplied most of the labor for the Southern Colonies' plantation economy. Clashes with the British Crown began as a civil protest over the illegality of taxation without representation in Parliament and the denial of other English rights. They evolved into the American Revolution, which led to the Declaration of Independence and a society based on universal rights. Victory in the 1775–1783 Revolutionary War brought international recognition of U.S. sovereignty and fueled westward expansion, further dispossessing native inhabitants. As more states were admitted, a North–South division over slavery led the Confederate States of America to declare secession and fight the Union in the 1861–1865 American Civil War. With the United States' victory and reunification, slavery was abolished nationally. By the late 19th century, the U.S. 
economy outpaced the French, German and British economies combined. As of 1900, the country had established itself as a great power, a status solidified after its involvement in World War I. Following Japan's attack on Pearl Harbor in 1941, the U.S. entered World War II. Its aftermath left the U.S. and the Soviet Union as rival superpowers, competing for ideological dominance and international influence during the Cold War. The Soviet Union's collapse in 1991 ended the Cold War, leaving the U.S. as the world's sole superpower. The U.S. federal government is a representative democracy with a president and a constitution that grants separation of powers under three branches: legislative, executive, and judicial. The United States Congress is a bicameral national legislature composed of the House of Representatives (a lower house based on population) and the Senate (an upper house based on equal representation for each state). Federalism grants substantial autonomy to the 50 states. In addition, 574 Native American tribes have sovereignty rights, and there are 326 Native American reservations. Since the 1850s, the Democratic and Republican parties have dominated American politics. American ideals and values are based on a democratic tradition inspired by the American Enlightenment movement. A developed country, the U.S. ranks high in economic competitiveness, innovation, and higher education. Accounting for over a quarter of nominal global GDP, its economy has been the world's largest since about 1890. It is the wealthiest country, with the highest disposable household income per capita among OECD members, though its wealth inequality is highly pronounced. Shaped by centuries of immigration, the culture of the U.S. is diverse and globally influential. Making up more than a third of global military spending, the country has one of the strongest armed forces and is a designated nuclear state. A member of numerous international organizations, the U.S. 
plays a major role in global political, cultural, economic, and military affairs. Etymology Documented use of the phrase "United States of America" dates back to January 2, 1776. On that day, Stephen Moylan, a Continental Army aide to General George Washington, wrote a letter to Joseph Reed, Washington's aide-de-camp, seeking to go "with full and ample powers from the United States of America to Spain" to seek assistance in the Revolutionary War effort. The first known public usage is an anonymous essay published in the Williamsburg newspaper The Virginia Gazette on April 6, 1776. Sometime on or after June 11, 1776, Thomas Jefferson wrote "United States of America" in a rough draft of the Declaration of Independence, which was adopted by the Second Continental Congress on July 4, 1776. The term "United States" and its initialism "U.S.", used as nouns or as adjectives in English, are common short names for the country. The initialism "USA", a noun, is also common. "United States" and "U.S." are the established terms throughout the U.S. federal government, with prescribed rules.[l] "The States" is an established colloquial shortening of the name, used particularly from abroad; "stateside" is the corresponding adjective or adverb. "America" is the feminine form of the first word of Americus Vespucius, the Latinized name of Italian explorer Amerigo Vespucci (1454–1512);[m] it was first used as a place name by the German cartographers Martin Waldseemüller and Matthias Ringmann in 1507.[n] Vespucci first proposed that the West Indies discovered by Christopher Columbus in 1492 were part of a previously unknown landmass and not among the Indies at the eastern limit of Asia. In English, the term "America" usually does not refer to topics unrelated to the United States, despite the usage of "the Americas" to describe the totality of the continents of North and South America.
History The first inhabitants of North America migrated from Siberia approximately 15,000 years ago, either across the Bering land bridge or along the now-submerged Ice Age coastline. Small isolated groups of hunter-gatherers are said to have migrated alongside herds of large herbivores far into Alaska, with ice-free corridors developing along the Pacific coast and valleys of North America in c. 16,500 – c. 13,500 BCE (c. 18,500 – c. 15,500 BP). The Clovis culture, which appeared around 11,000 BCE, is believed to be the first widespread culture in the Americas. Over time, Indigenous North American cultures grew increasingly sophisticated, and some, such as the Mississippian culture, developed agriculture, architecture, and complex societies. In the post-archaic period, the Mississippian cultures were located in the midwestern, eastern, and southern regions, and the Algonquian in the Great Lakes region and along the Eastern Seaboard, while the Hohokam culture and Ancestral Puebloans inhabited the Southwest. Native population estimates of what is now the United States before the arrival of European colonizers range from around 500,000 to nearly 10 million. Christopher Columbus began exploring the Caribbean for Spain in 1492, leading to Spanish-speaking settlements and missions from what are now Puerto Rico and Florida to New Mexico and California. The first Spanish colony in the present-day continental United States was Spanish Florida, chartered in 1513. After several settlements failed there due to starvation and disease, Spain's first permanent town, Saint Augustine, was founded in 1565. France established its own settlements in French Florida in 1562, but they were either abandoned (Charlesfort, 1578) or destroyed by Spanish raids (Fort Caroline, 1565). Permanent French settlements were founded much later along the Great Lakes (Fort Detroit, 1701), the Mississippi River (Saint Louis, 1764) and especially the Gulf of Mexico (New Orleans, 1718). 
Early European colonies also included the thriving Dutch colony of New Netherland (settled 1626, present-day New York) and the small Swedish colony of New Sweden (settled 1638 in what became Delaware). British colonization of the East Coast began with the Virginia Colony (1607) and the Plymouth Colony (Massachusetts, 1620). The Mayflower Compact in Massachusetts and the Fundamental Orders of Connecticut established precedents for local representative self-governance and constitutionalism that would develop throughout the American colonies. While European settlers in what is now the United States experienced conflicts with Native Americans, they also engaged in trade, exchanging European tools for food and animal pelts.[o] Relations ranged from close cooperation to warfare and massacres. The colonial authorities often pursued policies that forced Native Americans to adopt European lifestyles, including conversion to Christianity. Along the eastern seaboard, settlers trafficked Africans through the Atlantic slave trade, largely to provide manual labor on plantations. The original Thirteen Colonies[p] that would later found the United States were administered as possessions of the British Empire by Crown-appointed governors, though local governments held elections open to most white male property owners. The colonial population grew rapidly from Maine to Georgia, eclipsing Native American populations; by the 1770s, the natural increase of the population was such that only a small minority of Americans had been born overseas. The colonies' distance from Britain facilitated the entrenchment of self-governance, and the First Great Awakening, a series of Christian revivals, fueled colonial interest in guaranteed religious liberty. Following its victory in the French and Indian War, Britain began to assert greater control over local affairs in the Thirteen Colonies, resulting in growing political resistance.
One of the primary grievances of the colonists was the denial of their rights as Englishmen, particularly the right to representation in the British government that taxed them. To demonstrate their dissatisfaction and resolve, the First Continental Congress met in 1774 and passed the Continental Association, a colonial boycott of British goods enforced by local "committees of safety" that proved effective. The British attempt to then disarm the colonists resulted in the 1775 Battles of Lexington and Concord, igniting the American Revolutionary War. At the Second Continental Congress, the colonies appointed George Washington commander-in-chief of the Continental Army, and created a committee that named Thomas Jefferson to draft the Declaration of Independence. Two days after the Second Continental Congress passed the Lee Resolution to create an independent, sovereign nation, the Declaration was adopted on July 4, 1776. The political values of the American Revolution evolved from an armed rebellion demanding reform within an empire to a revolution that created a new social and governing system founded on the defense of liberty and the protection of inalienable natural rights; sovereignty of the people; republicanism over monarchy, aristocracy, and other hereditary political power; civic virtue; and an intolerance of political corruption. The Founding Fathers of the United States, who included Washington, Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, James Madison, Thomas Paine, and many others, were inspired by Classical, Renaissance, and Enlightenment philosophies and ideas. Though in practical effect since its drafting in 1777, the Articles of Confederation was ratified in 1781 and formally established a decentralized government that operated until 1789. After the British surrender at the siege of Yorktown in 1781, American sovereignty was internationally recognized by the Treaty of Paris (1783), through which the U.S. 
gained territory stretching west to the Mississippi River, north to present-day Canada, and south to Spanish Florida. The Northwest Ordinance (1787) established the precedent by which the country's territory would expand with the admission of new states, rather than the expansion of existing states. The U.S. Constitution was drafted at the 1787 Constitutional Convention to overcome the limitations of the Articles. It went into effect in 1789, creating a federal republic governed by three separate branches that together formed a system of checks and balances. George Washington was elected the country's first president under the Constitution, and the Bill of Rights was adopted in 1791 to allay skeptics' concerns about the power of the more centralized government. His resignation as commander-in-chief after the Revolutionary War and his later refusal to run for a third term as the country's first president established a precedent for the supremacy of civil authority in the United States and the peaceful transfer of power. In the late 18th century, American settlers began to expand westward in larger numbers, many with a sense of manifest destiny. The Louisiana Purchase of 1803 from France nearly doubled the territory of the United States. Lingering issues with Britain remained, leading to the War of 1812, which was fought to a draw. Spain ceded Florida and its Gulf Coast territory in 1819. The Missouri Compromise of 1820, which admitted Missouri as a slave state and Maine as a free state, attempted to balance the desire of northern states to prevent the expansion of slavery into new territories with that of southern states to extend it there. Primarily, the compromise prohibited slavery in all other lands of the Louisiana Purchase north of the 36°30′ parallel. As Americans expanded further into territory inhabited by Native Americans, the federal government implemented policies of Indian removal or assimilation. 
The most significant such legislation was the Indian Removal Act of 1830, a key policy of President Andrew Jackson. It resulted in the Trail of Tears (1830–1850), in which an estimated 60,000 Native Americans living east of the Mississippi River were forcibly removed and displaced to lands far to the west, causing 13,200 to 16,700 deaths along the forced march. Settler expansion as well as this influx of Indigenous peoples from the East resulted in the American Indian Wars west of the Mississippi. During the colonial period, slavery became legal in all the Thirteen Colonies, but by 1770 it provided the main labor force in the large-scale, agriculture-dependent economies of the Southern Colonies from Maryland to Georgia. The practice began to be significantly questioned during the American Revolution, and spurred by an active abolitionist movement that had reemerged in the 1830s, states in the North enacted laws to prohibit slavery within their boundaries. At the same time, support for slavery had strengthened in Southern states, with widespread use of inventions such as the cotton gin (1793) having made slavery immensely profitable for Southern elites. The United States annexed the Republic of Texas in 1845, and the 1846 Oregon Treaty led to U.S. control of the present-day American Northwest. A dispute with Mexico over Texas led to the Mexican–American War (1846–1848). After the victory of the U.S., Mexico recognized U.S. sovereignty over Texas, New Mexico, and California in the 1848 Mexican Cession; the cession's lands also included the future states of Nevada, Colorado and Utah. The California gold rush of 1848–1849 spurred a huge migration of white settlers to the Pacific coast, leading to even more confrontations with Native populations. One of the most violent, the California genocide of thousands of Native inhabitants, lasted into the mid-1870s. Additional western territories and states were created.
Throughout the 1850s, the sectional conflict regarding slavery was further inflamed by national legislation in the U.S. Congress and decisions of the Supreme Court. In Congress, the Fugitive Slave Act of 1850 mandated the forcible return to their owners in the South of slaves taking refuge in non-slave states, while the Kansas–Nebraska Act of 1854 effectively gutted the anti-slavery requirements of the Missouri Compromise. In its Dred Scott decision of 1857, the Supreme Court ruled against a slave brought into non-slave territory, simultaneously declaring the entire Missouri Compromise to be unconstitutional. These and other events exacerbated tensions between North and South that would culminate in the American Civil War (1861–1865). Beginning with South Carolina, 11 slave-state governments voted to secede from the United States in 1861, joining to create the Confederate States of America. All other state governments remained loyal to the Union.[q] War broke out in April 1861 after the Confederacy bombarded Fort Sumter. Following the Emancipation Proclamation on January 1, 1863, many freed slaves joined the Union army. The war began to turn in the Union's favor following the 1863 Siege of Vicksburg and Battle of Gettysburg, and the Confederates surrendered in 1865 after the Union's victory in the Battle of Appomattox Court House. Efforts toward reconstruction in the secessionist South had begun as early as 1862, but it was only after President Lincoln's assassination that the three Reconstruction Amendments to the Constitution were ratified to protect civil rights. The amendments codified nationally the abolition of slavery and involuntary servitude except as punishment for crimes, promised equal protection under the law for all persons, and prohibited discrimination on the basis of race or previous enslavement. As a result, African Americans took an active political role in ex-Confederate states in the decade following the Civil War. 
The former Confederate states were readmitted to the Union, beginning with Tennessee in 1866 and ending with Georgia in 1870. National infrastructure, including transcontinental telegraph and railroads, spurred growth in the American frontier. This was accelerated by the Homestead Acts, through which nearly 10 percent of the total land area of the United States was given away free to some 1.6 million homesteaders. From 1865 through 1917, an unprecedented stream of immigrants arrived in the United States, including 24.4 million from Europe. Most came through the Port of New York, as New York City and other large cities on the East Coast became home to large Jewish, Irish, and Italian populations. Many Northern Europeans as well as significant numbers of Germans and other Central Europeans moved to the Midwest. At the same time, about one million French Canadians migrated from Quebec to New England. During the Great Migration, millions of African Americans left the rural South for urban areas in the North. Alaska was purchased from Russia in 1867. The Compromise of 1877 is generally considered the end of the Reconstruction era, as it resolved the electoral crisis following the 1876 presidential election and led President Rutherford B. Hayes to reduce the role of federal troops in the South. Immediately, the Redeemers began evicting the Carpetbaggers and quickly regained local control of Southern politics in the name of white supremacy. African Americans endured a period of heightened, overt racism following Reconstruction, a time often considered the nadir of American race relations. A series of Supreme Court decisions, including Plessy v. Ferguson, emptied the Fourteenth and Fifteenth Amendments of their force, allowing Jim Crow laws in the South to remain unchecked, sundown towns in the Midwest, and segregation in communities across the country, which would be reinforced in part by the policy of redlining later adopted by the federal Home Owners' Loan Corporation. 
An explosion of technological advancement, accompanied by the exploitation of cheap immigrant labor, led to rapid economic expansion during the Gilded Age of the late 19th century. It continued into the early 20th century, when the United States already outpaced the economies of Britain, France, and Germany combined. This fostered the amassing of power by a few prominent industrialists, largely by their formation of trusts and monopolies to prevent competition. Tycoons led the nation's expansion in the railroad, petroleum, and steel industries. The United States emerged as a pioneer of the automotive industry. These changes resulted in significant increases in economic inequality, slum conditions, and social unrest, creating the environment for labor unions and socialist movements to begin to flourish. This period eventually ended with the advent of the Progressive Era, which was characterized by significant economic and social reforms. Pro-American elements in Hawaii overthrew the Hawaiian monarchy; the islands were annexed in 1898. That same year, Puerto Rico, the Philippines, and Guam were ceded to the U.S. by Spain after the latter's defeat in the Spanish–American War. (The Philippines was granted full independence from the U.S. on July 4, 1946, following World War II. Puerto Rico and Guam have remained U.S. territories.) American Samoa was acquired by the United States in 1900 after the Second Samoan Civil War. The U.S. Virgin Islands were purchased from Denmark in 1917. The United States entered World War I alongside the Allies in 1917, helping to turn the tide against the Central Powers. In 1920, a constitutional amendment granted nationwide women's suffrage. During the 1920s and 1930s, radio for mass communication and early television transformed communications nationwide. The Wall Street Crash of 1929 triggered the Great Depression, to which President Franklin D.
Roosevelt responded with the New Deal plan of "reform, recovery and relief", a series of unprecedented and sweeping recovery programs and employment relief projects combined with financial reforms and regulations. Initially neutral during World War II, the U.S. began supplying war materiel to the Allies of World War II in March 1941 and entered the war in December after Japan's attack on Pearl Harbor. Agreeing to a "Europe first" policy, the U.S. concentrated its wartime efforts on Japan's allies Italy and Germany until their final defeat in May 1945. The U.S. developed the first nuclear weapons and used them against the Japanese cities of Hiroshima and Nagasaki in August 1945, ending the war. The United States was one of the "Four Policemen" who met to plan the post-war world, alongside the United Kingdom, the Soviet Union, and China. The U.S. emerged relatively unscathed from the war, with even greater economic power and international political influence. The end of World War II in 1945 left the U.S. and the Soviet Union as superpowers, each with its own political, military, and economic sphere of influence. Geopolitical tensions between the two superpowers soon led to the Cold War. The U.S. implemented a policy of containment intended to limit the Soviet Union's sphere of influence; engaged in regime change against governments perceived to be aligned with the Soviets; and prevailed in the Space Race, which culminated with the first crewed Moon landing in 1969. Domestically, the U.S. experienced economic growth, urbanization, and population growth following World War II. The civil rights movement emerged, with Martin Luther King Jr. becoming a prominent leader in the early 1960s. The Great Society plan of President Lyndon B. Johnson's administration resulted in groundbreaking and broad-reaching laws, policies and a constitutional amendment to counteract some of the worst effects of lingering institutional racism. The counterculture movement in the U.S. 
brought significant social changes, including the liberalization of attitudes toward recreational drug use and sexuality. It also encouraged open defiance of the military draft (leading to the end of conscription in 1973) and wide opposition to U.S. intervention in Vietnam, which ended with the full withdrawal of U.S. forces in 1975. A shift in the societal roles of women was largely responsible for the great increase in women's paid labor participation starting in the 1970s; by 1985, a majority of American women aged 16 and older were employed. The fall of communism and the dissolution of the Soviet Union from 1989 to 1991 marked the end of the Cold War and left the United States as the world's sole superpower. This cemented the United States' global influence, reinforcing the concept of the "American Century" as the U.S. dominated international political, cultural, economic, and military affairs. The 1990s saw the longest recorded economic expansion in American history, a dramatic decline in U.S. crime rates, and advances in technology. Throughout this decade, technological innovations such as the World Wide Web, the evolution of the Pentium microprocessor in accordance with Moore's law, rechargeable lithium-ion batteries, the first gene therapy trial, and cloning either emerged in the U.S. or were improved upon there. The Human Genome Project was formally launched in 1990, while Nasdaq became the first stock market in the United States to trade online in 1998. In the Gulf War of 1991, an American-led international coalition expelled an Iraqi invasion force that had occupied neighboring Kuwait. The September 11 attacks on the United States in 2001 by the pan-Islamist militant organization al-Qaeda led to the war on terror and subsequent military interventions in Afghanistan and Iraq. The U.S. housing bubble culminated in 2007 with the Great Recession, the largest economic contraction since the Great Depression.
In the 2010s and early 2020s, the United States has experienced increased political polarization and democratic backsliding. The country's polarization was violently reflected in the January 2021 Capitol attack, when a mob of insurrectionists entered the U.S. Capitol and sought to prevent the peaceful transfer of power in an attempted self-coup d'état. Geography The United States is the world's third-largest country by total area behind Russia and Canada.[c] The 48 contiguous states and the District of Columbia have a combined area of 3,119,885 square miles (8,080,470 km2). In 2021, the United States had 8% of the Earth's permanent meadows and pastures and 10% of its cropland. Starting in the east, the coastal plain of the Atlantic seaboard gives way to inland forests and rolling hills in the Piedmont plateau region. The Appalachian Mountains and the Adirondack Massif separate the East Coast from the Great Lakes and the grasslands of the Midwest. The Mississippi River System, the world's fourth-longest river system, runs predominantly north–south through the center of the country. The flat and fertile prairie of the Great Plains stretches to the west, interrupted by a highland region in the southeast. The Rocky Mountains, west of the Great Plains, extend north to south across the country, peaking at over 14,000 feet (4,300 m) in Colorado. The supervolcano underlying Yellowstone National Park in the Rocky Mountains, the Yellowstone Caldera, is the continent's largest volcanic feature. Farther west are the rocky Great Basin and the Chihuahuan, Sonoran, and Mojave deserts. In the northwest corner of Arizona, carved by the Colorado River, is the Grand Canyon, a steep-sided canyon and popular tourist destination known for its overwhelming visual size and intricate, colorful landscape. The Cascade and Sierra Nevada mountain ranges run close to the Pacific coast. 
The lowest and highest points in the contiguous United States are in the State of California, about 84 miles (135 km) apart. At an elevation of 20,310 feet (6,190.5 m), Alaska's Denali (also called Mount McKinley) is the highest peak in the country and on the continent. Active volcanoes are common throughout Alaska's Alexander and Aleutian Islands. Located entirely outside North America, the archipelago of Hawaii consists of volcanic islands, physiographically and ethnologically part of the Polynesian subregion of Oceania. In addition to its total land area, the United States has one of the world's largest marine exclusive economic zones, spanning approximately 4.5 million square miles (11.7 million km2) of ocean. With its large size and geographic variety, the United States includes most climate types. East of the 100th meridian, the climate ranges from humid continental in the north to humid subtropical in the south. The western Great Plains are semi-arid. Many mountainous areas of the American West have an alpine climate. The climate is arid in the Southwest, Mediterranean in coastal California, and oceanic in coastal Oregon, Washington, and southern Alaska. Most of Alaska is subarctic or polar. Hawaii, the southern tip of Florida, and U.S. territories in the Caribbean and Pacific are tropical. The United States experiences more high-impact extreme weather events than any other country. States bordering the Gulf of Mexico are prone to hurricanes, and most of the world's tornadoes occur in the country, mainly in Tornado Alley. Due to climate change, extreme weather has become more frequent in the U.S. in the 21st century, with three times the number of reported heat waves compared to the 1960s. Since the 1990s, droughts in the American Southwest have become more persistent and more severe. The regions most attractive to new residents are often also the most vulnerable to these hazards. The U.S.
is one of 17 megadiverse countries containing large numbers of endemic species: about 17,000 species of vascular plants occur in the contiguous United States and Alaska, and over 1,800 species of flowering plants are found in Hawaii, few of which occur on the mainland. The United States is home to 428 mammal species, 784 birds, 311 reptiles, 295 amphibians, and around 91,000 insect species. There are 63 national parks, and hundreds of other federally managed monuments, forests, and wilderness areas, administered by the National Park Service and other agencies. About 28% of the country's land is publicly owned and federally managed, primarily in the Western states. Most of this land is protected, though some is leased for commercial use, and less than one percent is used for military purposes. Environmental issues in the United States include debates on non-renewable resources and nuclear energy, air and water pollution, biodiversity, logging and deforestation, and climate change. The U.S. Environmental Protection Agency (EPA) is the federal agency charged with addressing most environment-related issues. The idea of wilderness has shaped the management of public lands since the Wilderness Act of 1964. The Endangered Species Act of 1973 provides a way to protect threatened and endangered species and their habitats; the United States Fish and Wildlife Service implements and enforces the Act. In 2024, the U.S. ranked 35th among 180 countries in the Environmental Performance Index. Government and politics The United States is a federal republic of 50 states and a federal capital district, Washington, D.C. The U.S. asserts sovereignty over five unincorporated territories and several uninhabited island possessions. It is the world's oldest surviving federation, and its presidential system of federal government has been adopted, in whole or in part, by many newly independent states worldwide following their decolonization.
The Constitution of the United States serves as the country's supreme legal document. Most scholars describe the United States as a liberal democracy.[r] Composed of three branches, all headquartered in Washington, D.C., the federal government is the national government of the United States. The U.S. Constitution establishes a separation of powers intended to provide a system of checks and balances to prevent any of the three branches from becoming supreme. The three-branch system is known as the presidential system, in contrast to the parliamentary system where the executive is part of the legislative body. Many countries around the world adopted this aspect of the 1789 Constitution of the United States, especially in the postcolonial Americas. In the U.S. federal system, sovereign powers are shared between three levels of government specified in the Constitution: the federal government, the states, and Indian tribes. The U.S. also asserts sovereignty over five permanently inhabited territories: American Samoa, Guam, the Northern Mariana Islands, Puerto Rico, and the U.S. Virgin Islands. Residents of the 50 states are governed by their elected state government, under state constitutions compatible with the national constitution, and by elected local governments that are administrative divisions of a state. States are subdivided into counties or county equivalents, and (except for Hawaii) further divided into municipalities, each administered by elected representatives. The District of Columbia is a federal district containing the U.S. capital, Washington, D.C. The federal district is an administrative division of the federal government. Indian country is made up of 574 federally recognized tribes and 326 Indian reservations. They hold a government-to-government relationship with the U.S. federal government in Washington and are legally defined as domestic dependent nations with inherent tribal sovereignty rights. In addition to the five major territories, the U.S. 
also asserts sovereignty over the United States Minor Outlying Islands in the Pacific Ocean and the Caribbean. The seven undisputed islands without permanent populations are Baker Island, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Atoll, and Palmyra Atoll. U.S. sovereignty over the unpopulated Bajo Nuevo Bank, Navassa Island, Serranilla Bank, and Wake Island is disputed. The Constitution is silent on political parties. However, they developed independently in the 18th century with the Federalist and Anti-Federalist parties. Since then, the United States has operated as a de facto two-party system, though the parties have changed over time. Since the mid-19th century, the two main national parties have been the Democratic Party and the Republican Party; the former is perceived as relatively liberal in its political platform, while the latter is perceived as relatively conservative. The United States has an established structure of foreign relations, with the world's second-largest diplomatic corps as of 2024. It is a permanent member of the United Nations Security Council and home to the United Nations headquarters. The United States is a member of the G7, G20, and OECD intergovernmental organizations. Almost all countries have embassies and many have consulates (official representatives) in the country. Likewise, nearly all countries maintain formal diplomatic relations with the United States; the exceptions are Iran, North Korea, and Bhutan. Though Taiwan does not have formal diplomatic relations with the U.S., it maintains close unofficial relations. The United States regularly supplies Taiwan with military equipment to deter potential Chinese aggression. American geopolitical attention has also turned to the Indo-Pacific, with the United States joining the Quadrilateral Security Dialogue with Australia, India, and Japan.
The United States has a "Special Relationship" with the United Kingdom and strong ties with Canada, Australia, New Zealand, the Philippines, Japan, South Korea, Israel, and several European Union countries such as France, Italy, Germany, Spain, and Poland. The U.S. works closely with its NATO allies on military and national security issues, and with countries in the Americas through the Organization of American States and the United States–Mexico–Canada Agreement (USMCA). The U.S. exercises full international defense authority and responsibility for Micronesia, the Marshall Islands, and Palau through the Compact of Free Association. It has increasingly conducted strategic cooperation with India, while its ties with China have steadily deteriorated. Beginning in 2014, the U.S. became a key ally of Ukraine. After Donald Trump was elected U.S. president in 2024, he sought to negotiate an end to the Russo-Ukrainian War. He paused all military aid to Ukraine in March 2025, although the aid later resumed. Trump also ended U.S. intelligence sharing with the country, but this too was eventually restored. The president is the commander-in-chief of the United States Armed Forces and appoints its leaders, the secretary of defense and the Joint Chiefs of Staff. The Department of Defense, headquartered at the Pentagon near Washington, D.C., administers five of the six service branches: the Army, Marine Corps, Navy, Air Force, and Space Force. The Coast Guard is administered by the Department of Homeland Security in peacetime and can be transferred to the Department of the Navy in wartime. The total strength of the military is about 1.3 million active-duty personnel, with an additional 400,000 in reserve. The United States spent $997 billion on its military in 2024, by far the largest amount of any country, making up 37% of global military spending and accounting for 3.4% of the country's GDP. The U.S.
possesses 42% of the world's nuclear weapons—the second-largest stockpile after that of Russia. The U.S. military is widely regarded as the most powerful and advanced in the world. The United States has the third-largest combined armed forces in the world, behind the Chinese People's Liberation Army and the Indian Armed Forces. The U.S. military operates about 800 bases and facilities abroad and maintains deployments of more than 100 active-duty personnel in 25 foreign countries. The United States has engaged in over 400 military interventions since its founding in 1776, with over half of these occurring between 1950 and 2019 and 25% occurring in the post-Cold War era. State defense forces (SDFs) are military units that operate under the sole authority of a state government. SDFs are authorized by state and federal law but are under the command of the state's governor. By contrast, the 54 U.S. National Guard organizations[t] fall under the dual control of state or territorial governments and the federal government; their units can become federalized, but SDFs cannot. The National Guard personnel of a state or territory can be federalized by the president under the National Defense Act Amendments of 1933; this legislation created the Guard and provides for the integration of Army National Guard and Air National Guard units and personnel into the U.S. Army and (since 1947) the U.S. Air Force. The total number of National Guard members is about 430,000, while the estimated combined strength of SDFs is less than 10,000. There are about 18,000 police agencies in the United States, ranging from the local to the national level. Law in the United States is mainly enforced by local police departments and sheriff's departments in their municipal or county jurisdictions. State police departments have authority in their respective states, and federal agencies such as the Federal Bureau of Investigation (FBI) and the U.S.
Marshals Service have national jurisdiction and specialized duties, such as protecting civil rights, safeguarding national security, enforcing federal courts' rulings and federal laws, and combating interstate criminal activity. State courts conduct almost all civil and criminal trials, while federal courts adjudicate the much smaller number of civil and criminal cases that relate to federal law. There is no unified "criminal justice system" in the United States. The American prison system is highly heterogeneous, with thousands of relatively independent systems operating across federal, state, local, and tribal levels. In 2025, "these systems hold nearly 2 million people in 1,566 state prisons, 98 federal prisons, 3,116 local jails, 1,277 juvenile correctional facilities, 133 immigration detention facilities, and 80 Indian country jails, as well as in military prisons, civil commitment centers, state psychiatric hospitals, and prisons in the U.S. territories." Despite these disparate systems of confinement, four main institutions dominate: federal prisons, state prisons, local jails, and juvenile correctional facilities. Federal prisons are run by the Federal Bureau of Prisons and hold pretrial detainees as well as people convicted of federal crimes. State prisons, run by each state's department of corrections, hold people serving prison sentences (usually longer than one year) for felony offenses. Local jails are county or municipal facilities that incarcerate defendants prior to trial; they also hold those serving short sentences (typically under a year). Juvenile correctional facilities are operated by local or state governments and serve as longer-term placements for minors adjudicated as delinquent and ordered by a judge to be confined.
In January 2023, the United States had the sixth-highest per capita incarceration rate in the world—531 people per 100,000 inhabitants—and the largest prison and jail population in the world, with more than 1.9 million people incarcerated. An analysis of the World Health Organization Mortality Database from 2010 showed U.S. homicide rates "were 7 times higher than in other high-income countries, driven by a gun homicide rate that was 25 times higher". Economy The U.S. has a highly developed mixed economy that has been the world's largest by nominal GDP since about 1890. Its 2024 gross domestic product (GDP)[e] of more than $29 trillion constituted over 25% of nominal global economic output, or 15% at purchasing power parity (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks first in the world by nominal GDP, second when adjusted for PPP, and ninth by PPP-adjusted GDP per capita. In February 2024, the total U.S. federal government debt was $34.4 trillion. Of the world's 500 largest companies by revenue, 138 were headquartered in the U.S. in 2025, the highest number of any country. The U.S. dollar is the currency most used in international transactions and the world's foremost reserve currency, backed by the country's dominant economy, its military, the petrodollar system, its large U.S. treasuries market, and the linked eurodollar market. Several countries use it as their official currency, and in others it is the de facto currency. The U.S. is party to free trade agreements with several countries, including the USMCA with Canada and Mexico. Although the United States has reached a post-industrial level of economic development and is often described as having a service economy, it remains a major industrial power; in 2024, the U.S. manufacturing sector was the world's second-largest by value output after China's.
New York City is the world's principal financial center, and its metropolitan area is the world's largest metropolitan economy. The New York Stock Exchange and Nasdaq, both located in New York City, are the world's two largest stock exchanges by market capitalization and trade volume. The United States is at the forefront of technological advancement and innovation in many economic fields, especially in artificial intelligence; electronics and computers; pharmaceuticals; and medical, aerospace and military equipment. The country's economy is fueled by abundant natural resources, a well-developed infrastructure, and high productivity. The largest trading partners of the United States are the European Union, Mexico, Canada, China, Japan, South Korea, the United Kingdom, Vietnam, India, and Taiwan. The United States is the world's largest importer and second-largest exporter.[u] It is by far the world's largest exporter of services. Americans have the highest average household and employee income among OECD member states, and the fourth-highest median household income in 2023, up from sixth-highest in 2013. With personal consumption expenditures of over $18.5 trillion in 2023, the U.S. has a heavily consumer-driven economy and is the world's largest consumer market. The U.S. ranked first in the number of dollar billionaires and millionaires in 2023, with 735 billionaires and nearly 22 million millionaires. Wealth in the United States is highly concentrated; in 2011, the richest 10% of the adult population owned 72% of the country's household wealth, while the bottom 50% owned just 2%. U.S. wealth inequality increased substantially since the late 1980s, and income inequality in the U.S. reached a record high in 2019. In 2024, the country had some of the highest wealth and income inequality levels among OECD countries. Since the 1970s, there has been a decoupling of U.S. wage gains from worker productivity. 
In 2016, the top fifth of earners took home more than half of all income, giving the U.S. one of the widest income distributions among OECD countries. There were 771,480 homeless persons counted in the U.S. in 2024. In 2022, 6.4 million children experienced food insecurity. Feeding America estimates that around one in five children, approximately 13 million, experience hunger in the U.S. and do not know where or when they will get their next meal. Also in 2022, about 37.9 million people, or 11.5% of the U.S. population, were living in poverty. The United States has a smaller welfare state and redistributes less income through government action than most other high-income countries. It is the only advanced economy that does not guarantee its workers paid vacation nationally, and it is one of the few countries in the world without federal paid family leave as a legal right. The United States has a higher percentage of low-income workers than almost any other developed country, largely because of a weak collective bargaining system and a lack of government support for at-risk workers. The United States has been a leader in technological innovation since the late 19th century and in scientific research since the mid-20th century. Methods for producing interchangeable parts and the establishment of a machine tool industry enabled the large-scale manufacturing of U.S. consumer products in the late 19th century. By the early 20th century, factory electrification, the introduction of the assembly line, and other labor-saving techniques created the system of mass production. In the 21st century, the United States continues to be one of the world's foremost scientific powers, though China has emerged as a major competitor in many fields. The U.S. has the highest research and development expenditures of any country, though it ranks ninth as a percentage of GDP. In 2022, the United States was (after China) the country with the second-highest number of published scientific papers.
In 2021, the U.S. ranked second (also after China) by the number of patent applications, and third by trademark and industrial design applications (after China and Germany), according to World Intellectual Property Indicators. In 2025 the United States ranked third (after Switzerland and Sweden) in the Global Innovation Index. The United States is considered to be a world leader in the development of artificial intelligence technology. In 2023, the United States was ranked the second most technologically advanced country in the world (after South Korea) by Global Finance magazine. The United States has maintained a space program since the late 1950s, beginning with the establishment of the National Aeronautics and Space Administration (NASA) in 1958. NASA's Apollo program (1961–1972) achieved the first crewed Moon landing with the 1969 Apollo 11 mission; it remains one of the agency's most significant milestones. Other major endeavors by NASA include the Space Shuttle program (1981–2011), the Voyager program (1972–present), the Hubble and James Webb space telescopes (launched in 1990 and 2021, respectively), and the multi-mission Mars Exploration Program (Spirit and Opportunity, Curiosity, and Perseverance). NASA is one of five agencies collaborating on the International Space Station (ISS); U.S. contributions to the ISS include several modules, including Destiny (2001), Harmony (2007), and Tranquility (2010), as well as ongoing logistical and operational support. The United States private sector dominates the global commercial spaceflight industry. Prominent American spaceflight contractors include Blue Origin, Boeing, Lockheed Martin, Northrop Grumman, and SpaceX. NASA programs such as the Commercial Crew Program, Commercial Resupply Services, Commercial Lunar Payload Services, and NextSTEP have facilitated growing private-sector involvement in American spaceflight. 
In 2023, the United States received approximately 84% of its energy from fossil fuels; its largest source of energy was petroleum (38%), followed by natural gas (36%), renewable sources (9%), coal (9%), and nuclear power (9%). In 2022, the United States constituted about 4% of the world's population but consumed around 16% of the world's energy. The U.S. ranks as the second-highest emitter of greenhouse gases, behind China. The U.S. is the world's largest producer of nuclear power, generating around 30% of the world's nuclear electricity, and has the highest number of nuclear power reactors of any country. As of 2024, the U.S. plans to triple its nuclear power capacity by 2050. The United States' 4 million miles (6.4 million kilometers) of road network, owned almost entirely by state and local governments, is the longest in the world. The extensive Interstate Highway System that connects all major U.S. cities is funded mostly by the federal government but maintained by state departments of transportation. The system is further extended by state highways and some private toll roads. In 2022, the U.S. was among the top ten countries in vehicle ownership per capita, with 850 vehicles per 1,000 people. A 2022 study found that 76% of U.S. commuters drive alone and 14% ride a bicycle, including bike owners and users of bike-sharing networks. About 11% use some form of public transportation. Public transportation in the United States is well developed in the largest urban areas, notably New York City, Washington, D.C., Boston, Philadelphia, Chicago, and San Francisco; otherwise, coverage is generally less extensive than in most other developed countries, and many American localities remain relatively car-dependent. Long-distance intercity travel is provided primarily by airlines, but travel by rail is more common along the Northeast Corridor, home to the only high-speed rail in the U.S. that meets international standards.
Amtrak, the country's government-sponsored national passenger rail company, has a relatively sparse network compared to that of Western European countries. Service is concentrated in the Northeast, California, the Midwest, the Pacific Northwest, and Virginia/Southeast. The United States has an extensive air transportation network. U.S. civilian airlines are all privately owned. The three largest airlines in the world, by total number of passengers carried, are U.S.-based; American Airlines became the global leader after its 2013 merger with US Airways. Of the 50 busiest airports in the world, 16 are in the United States, as well as five of the top 10. The world's busiest airport by passenger volume is Hartsfield–Jackson Atlanta International in Atlanta, Georgia. In 2022, most of the 19,969 U.S. airports were owned and operated by local government authorities, and there are also some private airports. Some 5,193 are designated as "public use", including for general aviation. The Transportation Security Administration (TSA) has provided security at most major airports since 2001. The country's rail transport network, the longest in the world at 182,412.3 mi (293,564.2 km), handles mostly freight (in contrast to more passenger-centered rail in Europe). Because they are often privately owned operations, U.S. railroads lag behind those of the rest of the world in terms of electrification. The country's inland waterways are the world's fifth-longest, totaling 25,482 mi (41,009 km). They are used extensively for freight, recreation, and a small amount of passenger traffic. Of the world's 50 busiest container ports, four are located in the United States, with the busiest in the country being the Port of Los Angeles. Demographics The U.S. Census Bureau reported 331,449,281 residents on April 1, 2020,[v] making the United States the third-most-populous country in the world, after India and China. 
The Census Bureau's official 2025 population estimate was 341,784,857, an increase of 3.1% since the 2020 census. According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S. population had a net gain of one person every 16 seconds, or about 5,400 people per day. In 2023, 51% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 34% had never been married. In 2023, the total fertility rate for the U.S. stood at 1.6 children per woman, and, at 23%, the country had the world's highest rate of children living in single-parent households in 2019. Most Americans live in the suburbs of major metropolitan areas. The United States has a diverse population; 37 ancestry groups have more than one million members. White Americans with ancestry from Europe, the Middle East, or North Africa form the largest racial and ethnic group, at 57.8% of the United States population. Hispanic and Latino Americans form the second-largest group, at 18.7%. African Americans constitute the country's third-largest ancestry group, at 12.1% of the total U.S. population. Asian Americans are the country's fourth-largest group, at 5.9%. The country's 3.7 million Native Americans account for about 1%, and some 574 native tribes are recognized by the federal government. In 2024, the median age of the United States population was 39.1 years. While many languages and dialects are spoken in the United States, English is by far the most commonly spoken and written. English is the de facto national language, and in 2025 Executive Order 14224 declared it the official language of the United States. However, the U.S. has never had a de jure official language established by statute, as Congress has never passed a law designating English as official for all three federal branches. Some laws, such as U.S. naturalization requirements, nonetheless standardize English.
Twenty-eight states and the United States Virgin Islands have laws that designate English as the sole official language; 19 states and the District of Columbia have no official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English: Hawaii (Hawaiian), Alaska (twenty Native languages),[w] South Dakota (Sioux), American Samoa (Samoan), Puerto Rico (Spanish), Guam (Chamorro), and the Northern Mariana Islands (Carolinian and Chamorro). In total, 169 Native American languages are spoken in the United States. In Puerto Rico, Spanish is more widely spoken than English. According to the American Community Survey (2020), some 245.4 million people in the U.S. age five and older spoke only English at home. About 41.2 million spoke Spanish at home, making it the second most commonly used language. Other languages spoken at home by one million people or more include Chinese (3.40 million), Tagalog (1.71 million), Vietnamese (1.52 million), Arabic (1.39 million), French (1.18 million), Korean (1.07 million), and Russian (1.04 million). German, spoken by 1 million people at home in 2010, fell to 857,000 total speakers in 2020. America's immigrant population is by far the world's largest in absolute terms. In 2022, there were 87.7 million immigrants and U.S.-born children of immigrants in the United States, accounting for nearly 27% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 6% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants. In 2019, the top countries of origin for immigrants were Mexico (24% of immigrants), India (6%), China (5%), the Philippines (4.5%), and El Salvador (3%). In fiscal year 2022, over one million immigrants (most of whom entered through family reunification) were granted legal residence. 
The undocumented immigrant population in the U.S. reached a record high of 14 million in 2023. The First Amendment guarantees the free exercise of religion in the country and forbids Congress from passing laws respecting its establishment. Religious practice is widespread and among the most diverse in the world. The country has the world's largest Christian population, which includes the fourth-largest population of Catholics. Other notable faiths include Judaism, Buddhism, Hinduism, Islam, New Age, and Native American religions. Religious practice varies significantly by region. "Ceremonial deism" is common in American culture. The overwhelming majority of Americans believe in a higher power or spiritual force, engage in spiritual practices such as prayer, and consider themselves religious or spiritual. In the Southern United States' "Bible Belt", evangelical Protestantism plays a significant cultural role; New England and the Western United States tend to be more secular. Mormonism, a Restorationist movement founded in the U.S. in 1830, is the predominant religion in Utah and a major religion in Idaho. About 82% of Americans live in metropolitan areas, particularly in suburbs; about half of those reside in cities with populations over 50,000. In 2022, 333 incorporated municipalities had populations over 100,000, nine cities had more than one million residents, and four cities—New York City, Los Angeles, Chicago, and Houston—had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West. According to the Centers for Disease Control and Prevention (CDC), average U.S. life expectancy at birth reached 79.0 years in 2024, its highest recorded level. This was an increase of 0.6 years over 2023.
The CDC attributed the improvement to a significant fall in the number of fatal drug overdoses in the country, noting that "heart disease continues to be the leading cause of death in the United States, followed by cancer and unintentional injuries." In 2024, life expectancy at birth for American men rose to 76.5 years (+0.7 years compared to 2023), while life expectancy for women was 81.4 years (+0.3 years). Starting in 1998, life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has widened since. The Commonwealth Fund reported in 2020 that the U.S. had the highest suicide rate among high-income countries. Approximately one-third of the U.S. adult population is obese and another third is overweight. The U.S. healthcare system far outspends that of any other country, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes when compared to peer countries, for reasons that are debated. The United States is the only developed country without a system of universal healthcare, and a significant proportion of its population does not carry health insurance. Government-funded healthcare coverage for the poor (Medicaid) and for those age 65 and older (Medicare) is available to Americans who meet the programs' income or age qualifications. In 2010, President Obama signed the Patient Protection and Affordable Care Act into law.[x] Abortion in the United States is not federally protected and is illegal or restricted in 17 states. American primary and secondary education, known in the U.S. as K–12 ("kindergarten through 12th grade"), is decentralized. School systems are operated by state, territorial, and sometimes municipal governments and regulated by the U.S. Department of Education.
In general, children are required to attend school or an approved homeschool from the age of five or six (kindergarten or first grade) until they are 18 years old. This often brings students through the 12th grade, the final year of a U.S. high school, but some states and territories allow them to leave school earlier, at age 16 or 17. The U.S. spends more on education per student than any other country, an average of $18,614 per public elementary and secondary school student in the 2020–2021 school year. Among Americans age 25 and older, 92.2% graduated from high school, 62.7% attended some college, 37.7% earned a bachelor's degree, and 14.2% earned a graduate degree. The U.S. literacy rate is near-universal. The U.S. has produced the most Nobel laureates of any country, with 411 winners of 413 awards. U.S. tertiary or higher education has earned a global reputation. Many of the world's top universities, as listed by various ranking organizations, are in the United States, including 19 of the top 25. American higher education is dominated by state university systems, although the country's many private universities and colleges enroll about 20% of all American students. Local community colleges generally offer open admissions, lower tuition, and coursework leading to a two-year associate degree or a non-degree certificate. In public expenditure on higher education, the U.S. spends more per student than the OECD average, and in combined public and private spending Americans spend more than any other nation. Colleges and universities directly funded by the federal government do not charge tuition and are limited to military personnel and government employees, including the U.S. service academies, the Naval Postgraduate School, and military staff colleges. Despite some student loan forgiveness programs, student loan debt increased by 102% between 2010 and 2020 and exceeded $1.7 trillion in 2022.
Culture and society The United States is home to a wide variety of ethnic groups, traditions, and customs. The country has been described as having the values of individualism and personal autonomy, as well as a strong work ethic and competitiveness. Voluntary altruism also plays a major role; according to a 2016 study by the Charities Aid Foundation, Americans donated 1.44% of total GDP to charity—the highest rate in the world by a large margin. Americans have traditionally been characterized by a unifying political belief in an "American Creed" emphasizing consent of the governed, liberty, equality under the law, democracy, social equality, property rights, and a preference for limited government. The U.S. has acquired significant hard and soft power through its diplomatic influence, economic power, military alliances, and cultural exports such as American movies, music, video games, sports, and food. The influence that the United States exerts on other countries through soft power is referred to as Americanization. Nearly all present-day Americans or their ancestors came from Europe, Africa, or Asia (the "Old World") within the past five centuries. Mainstream American culture is a Western culture largely derived from the traditions of European immigrants with influences from many other sources, such as traditions brought by slaves from Africa. More recent immigration from Asia and especially Latin America has added to a cultural mix that has been described as both a homogenizing melting pot and a heterogeneous salad bowl, with immigrants contributing to, and often assimilating into, mainstream American culture. Under the First Amendment to the Constitution, the United States is considered to have the strongest protections of free speech of any country. Flag desecration, hate speech, blasphemy, and lese majesty are all forms of protected expression.
A 2016 Pew Research Center poll found that Americans were the most supportive of free expression of any polity measured. Additionally, they are the "most supportive of freedom of the press and the right to use the Internet without government censorship". The U.S. is a socially progressive country with permissive attitudes surrounding human sexuality. LGBTQ rights in the United States are among the most advanced by global standards. The American Dream, or the perception that Americans enjoy high levels of social mobility, plays a key role in attracting immigrants. Whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a classless society, scholars identify significant differences between the country's social classes, affecting socialization, language, and values. Americans tend to greatly value socioeconomic achievement, but being ordinary or average is promoted by some as a noble condition as well. The National Foundation on the Arts and the Humanities is an agency of the United States federal government that was established in 1965 to "develop and promote a broadly conceived national policy of support for the humanities and the arts in the United States, and for institutions which preserve the cultural heritage of the United States." It is composed of four sub-agencies. Colonial American authors were influenced by John Locke and other Enlightenment philosophers. The American Revolutionary Period (1765–1783) is notable for the political writings of Benjamin Franklin, Alexander Hamilton, Thomas Paine, and Thomas Jefferson. Shortly before and after the Revolutionary War, the newspaper rose to prominence, filling a demand for anti-British national literature. An early novel is William Hill Brown's The Power of Sympathy, published in 1789.
Writer and critic John Neal in the early- to mid-19th century helped advance America toward a unique literature and culture by criticizing predecessors such as Washington Irving for imitating their British counterparts, and by influencing writers such as Edgar Allan Poe, who took American poetry and short fiction in new directions. Ralph Waldo Emerson and Margaret Fuller pioneered the influential Transcendentalism movement; Henry David Thoreau, author of Walden, was influenced by this movement. The conflict surrounding abolitionism inspired writers, like Harriet Beecher Stowe, and authors of slave narratives, such as Frederick Douglass. Nathaniel Hawthorne's The Scarlet Letter (1850) explored the dark side of American history, as did Herman Melville's Moby-Dick (1851). Major American poets of the 19th century American Renaissance include Walt Whitman, Melville, and Emily Dickinson. Mark Twain was the first major American writer to be born in the West. Henry James achieved international recognition with novels like The Portrait of a Lady (1881). As literacy rates rose, periodicals published more stories centered around industrial workers, women, and the rural poor. Naturalism, regionalism, and realism were the major literary movements of the period. While modernism generally took on an international character, modernist authors working within the United States more often rooted their work in specific regions, peoples, and cultures. Following the Great Migration to northern cities, African-American and black West Indian authors of the Harlem Renaissance developed an independent tradition of literature that rebuked a history of inequality and celebrated black culture. An important cultural export during the Jazz Age, these writings were a key influence on Négritude, a philosophy emerging in the 1930s among francophone writers of the African diaspora. 
In the 1950s, an ideal of homogeneity led many authors to attempt to write the Great American Novel, while the Beat Generation rejected this conformity, using styles that elevated the impact of the spoken word over mechanics to describe drug use, sexuality, and the failings of society. Contemporary literature is more pluralistic than in previous eras, with the closest thing to a unifying feature being a trend toward self-conscious experiments with language. Twelve American laureates have won the Nobel Prize in Literature. Media in the United States is broadly uncensored, with the First Amendment providing significant protections, as reiterated in New York Times Co. v. United States. The four major broadcasters in the U.S. are the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), American Broadcasting Company (ABC), and Fox Broadcasting Company (Fox). The four major broadcast television networks are all commercial entities. The U.S. cable television system offers hundreds of channels catering to a variety of niches. In 2021, about 83% of Americans over age 12 listened to broadcast radio, while about 40% listened to podcasts. In the prior year, there were 15,460 licensed full-power radio stations in the U.S. according to the Federal Communications Commission (FCC). Much of the public radio broadcasting is supplied by National Public Radio (NPR), incorporated in February 1970 under the Public Broadcasting Act of 1967. U.S. newspapers with a global reach and reputation include The Wall Street Journal, The New York Times, The Washington Post, and USA Today. About 800 publications are produced in Spanish. With few exceptions, newspapers are privately owned, either by large chains such as Gannett or McClatchy, which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in an increasingly rare situation, by individuals or families. 
Major cities often have alternative newspapers to complement the mainstream daily papers, such as The Village Voice in New York City and LA Weekly in Los Angeles. The five most-visited websites in the world are Google, YouTube, Facebook, Instagram, and ChatGPT—all of them American-owned. Other popular platforms used include X (formerly Twitter) and Amazon. In 2025, the U.S. was the world's second-largest video game market by revenue (after China). In 2015, the U.S. video game industry consisted of 2,457 companies that supported around 220,000 jobs and generated $30.4 billion in revenue. There are 444 game publishers, developers, and hardware companies in California alone. According to the Game Developers Conference (GDC), the U.S. is the top location for video game development, with 58% of the world's game developers based there in 2025. The United States is well known for its theater. Mainstream theater in the United States derives from the old European theatrical tradition and has been heavily influenced by the British theater. By the middle of the 19th century, America had created new distinct dramatic forms in the Tom Shows, the showboat theater and the minstrel show. The central hub of the American theater scene is the Theater District in Manhattan, with its divisions of Broadway, off-Broadway, and off-off-Broadway. Many movie and television celebrities have gotten their big break working in New York productions. Outside New York City, many cities have professional regional or resident theater companies that produce their own seasons. The biggest-budget theatrical productions are musicals. U.S. theater has an active community theater culture. The Tony Awards recognize excellence in live Broadway theater and are presented at an annual ceremony in Manhattan. The awards are given for Broadway productions and performances. One is also given for regional theater.
Several discretionary non-competitive awards are given as well, including a Special Tony Award, the Tony Honors for Excellence in Theatre, and the Isabelle Stevenson Award. Folk art in colonial America grew out of artisanal craftsmanship in communities that allowed commonly trained people to individually express themselves. It was distinct from Europe's tradition of high art, which was less accessible and generally less relevant to early American settlers. Cultural movements in art and craftsmanship in colonial America generally lagged behind those of Western Europe. For example, the prevailing medieval style of woodworking and primitive sculpture became integral to early American folk art, despite the emergence of Renaissance styles in England in the late 16th and early 17th centuries. The new English styles would have been early enough to make a considerable impact on American folk art, but American styles and forms had already been firmly adopted. Not only did styles change slowly in early America, but there was a tendency for rural artisans there to continue their traditional forms longer than their urban counterparts did—and far longer than those in Western Europe. The Hudson River School was a mid-19th-century movement in the visual arts tradition of European naturalism. The 1913 Armory Show in New York City, an exhibition of European modernist art, shocked the public and transformed the U.S. art scene. American Realism and American Regionalism sought to reflect and give America new ways of looking at itself. Georgia O'Keeffe, Marsden Hartley, and others experimented with new and individualistic styles, which would become known as American modernism. Major artistic movements such as the abstract expressionism of Jackson Pollock and Willem de Kooning and the pop art of Andy Warhol and Roy Lichtenstein developed largely in the United States. 
Major photographers include Alfred Stieglitz, Edward Steichen, Dorothea Lange, Edward Weston, James Van Der Zee, Ansel Adams, and Gordon Parks. The tide of modernism and then postmodernism has brought global fame to American architects, including Frank Lloyd Wright, Philip Johnson, and Frank Gehry. The Metropolitan Museum of Art in Manhattan is the largest art museum in the United States and the fourth-largest in the world. American folk music encompasses numerous music genres, variously known as traditional music, traditional folk music, contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and sometimes trace back to such origins as the British Isles, mainland Europe, or Africa. The rhythmic and lyrical styles of African-American music in particular have influenced American music. Banjos were brought to America through the slave trade. Minstrel shows incorporating the instrument into their acts led to its increased popularity and widespread production in the 19th century. The electric guitar, first invented in the 1930s, and mass-produced by the 1940s, had an enormous influence on popular music, in particular due to the development of rock and roll. The synthesizer, turntablism, and electronic music were also largely developed in the U.S. Elements from folk idioms such as the blues and old-time music were adopted and transformed into popular genres with global audiences. Jazz grew from blues and ragtime in the early 20th century, developing from the innovations and recordings of composers such as W.C. Handy and Jelly Roll Morton. Louis Armstrong and Duke Ellington increased its popularity early in the 20th century. Country music developed in the 1920s, bluegrass and rhythm and blues in the 1940s, and rock and roll in the 1950s. In the 1960s, Bob Dylan emerged from the folk revival to become one of the country's most celebrated songwriters. 
The musical forms of punk and hip hop both originated in the United States in the 1970s. The United States has the world's largest music market, with a total retail value of $15.9 billion in 2022. Most of the world's major record companies are based in the U.S.; they are represented by the Recording Industry Association of America (RIAA). Mid-20th-century American pop stars, such as Frank Sinatra and Elvis Presley, became global celebrities and best-selling music artists, as have artists of the late 20th century, such as Michael Jackson, Madonna, Whitney Houston, and Mariah Carey, and of the early 21st century, such as Eminem, Britney Spears, Lady Gaga, Katy Perry, Taylor Swift, and Beyoncé. The United States has the world's largest apparel market by revenue. Apart from professional business attire, American fashion is eclectic and predominantly informal. Americans' diverse cultural roots are reflected in their clothing; however, sneakers, jeans, T-shirts, and baseball caps are emblematic of American styles. New York, with its Fashion Week, is considered to be one of the "Big Four" global fashion capitals, along with Paris, Milan, and London. A study found that proximity to Manhattan's Garment District has been synonymous with American fashion since the industry's emergence there in the early 20th century. A number of well-known designer labels, among them Tommy Hilfiger, Ralph Lauren, Tom Ford and Calvin Klein, are headquartered in Manhattan. Labels cater to niche markets, such as preteens. New York Fashion Week is one of the most influential fashion shows in the world, and is held twice each year in Manhattan; the annual Met Gala, also in Manhattan, has been called the fashion world's "biggest night". The U.S. film industry has a worldwide influence and following. Hollywood, a district in central Los Angeles, the nation's second-most populous city, is also metonymous for the American filmmaking industry.
The major film studios of the United States are the primary source of the world's most commercially successful films. Largely centered in the New York City region from its beginnings in the late 19th century through the first decades of the 20th century, the U.S. film industry has since been primarily based in and around Hollywood. Nonetheless, American film companies have been subject to the forces of globalization in the 21st century, and an increasing number of films are made elsewhere. The Academy Awards, popularly known as "the Oscars", have been held annually by the Academy of Motion Picture Arts and Sciences since 1929, and the Golden Globe Awards have been held annually since January 1944. The industry peaked in what is commonly referred to as the "Golden Age of Hollywood", from the early sound period until the early 1960s, with screen actors such as John Wayne and Marilyn Monroe becoming iconic figures. In the 1970s, "New Hollywood", or the "Hollywood Renaissance", was defined by grittier films influenced by French and Italian realist pictures of the post-war period. The 21st century has been marked by the rise of American streaming platforms, which came to rival traditional cinema. Early settlers were introduced by Native Americans to foods such as turkey, sweet potatoes, corn, squash, and maple syrup. Among the most enduring and pervasive examples are variations of the native dish succotash. Early settlers and later immigrants combined these with foods they were familiar with, such as wheat flour, beef, and milk, to create a distinctive American cuisine. New World crops, especially pumpkin, corn, potatoes, and turkey as the main course are part of a shared national menu on Thanksgiving, when many Americans prepare or purchase traditional dishes to celebrate the occasion.
Characteristic American dishes such as apple pie, fried chicken, doughnuts, french fries, macaroni and cheese, ice cream, hamburgers, hot dogs, and American pizza derive from the recipes of various immigrant groups. Mexican dishes such as burritos and tacos preexisted the United States in areas later annexed from Mexico, and adaptations of Chinese cuisine as well as pasta dishes freely adapted from Italian sources are all widely consumed. American chefs have had a significant impact on society both domestically and internationally. In 1946, the Culinary Institute of America was founded by Katharine Angell and Frances Roth. It went on to become one of the United States' most prestigious culinary schools, where many leading American chefs have trained. The United States restaurant industry was projected at $899 billion in sales for 2020, and employed more than 15 million people directly, representing 10% of the nation's workforce. It is the country's second-largest private employer and the third-largest employer overall. The United States is home to over 220 Michelin star-rated restaurants, 70 of which are in New York City. Wine has been produced in what is now the United States since the 1500s, with the first widespread production beginning in what is now New Mexico in 1628. In the modern U.S., wine production is undertaken in all fifty states, with California producing 84 percent of all U.S. wine. With more than 1,100,000 acres (4,500 km2) under vine, the United States is the fourth-largest wine-producing country in the world, after Italy, Spain, and France. The classic American diner, a casual restaurant type originally intended for the working class, emerged during the 19th century from converted railroad dining cars made stationary. The diner soon evolved into purpose-built structures whose number expanded greatly in the 20th century. The American fast-food industry developed alongside the nation's car culture.
American restaurants developed the drive-in format in the 1920s, which they began to replace with the drive-through format by the 1940s. American fast-food restaurant chains, such as McDonald's, Burger King, Chick-fil-A, Kentucky Fried Chicken, Dunkin' Donuts and many others, have numerous outlets around the world. The most popular spectator sports in the U.S. are American football, basketball, baseball, soccer, and ice hockey. Their premier leagues are, respectively, the National Football League, the National Basketball Association, Major League Baseball, Major League Soccer, and the National Hockey League. All these leagues enjoy wide-ranging domestic media coverage and, except for the MLS, all are considered the preeminent leagues in their respective sports worldwide. While most major U.S. sports such as baseball and American football have evolved out of European practices, basketball, volleyball, skateboarding, and snowboarding are American inventions, many of which have become popular worldwide. Lacrosse and surfing arose from Native American and Native Hawaiian activities that predate European contact. The market for professional sports in the United States was approximately $69 billion in July 2013, roughly 50% larger than that of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States. Although American football does not have a substantial following in other nations, the NFL does have the highest average attendance (67,254) of any professional sports league in the world. In 2024, the NFL generated over $23 billion in revenue, making it the most valuable professional sports league in the United States and the world. Baseball has been regarded as the U.S. "national sport" since the late 19th century. The most-watched individual sports in the U.S. are golf and auto racing, particularly NASCAR and IndyCar.
On the collegiate level, earnings for the member institutions exceed $1 billion annually, and college football and basketball attract large audiences, as the NCAA March Madness tournament and the College Football Playoff are some of the most watched national sporting events. In the U.S., the intercollegiate sports level serves as the main feeder system for professional and Olympic sports, with significant exceptions such as Minor League Baseball. This differs greatly from practices in nearly all other countries, where publicly and privately funded sports organizations serve this function. Eight Olympic Games have taken place in the United States. The 1904 Summer Olympics in St. Louis, Missouri, were the first-ever Olympic Games held outside of Europe. The Olympic Games will be held in the U.S. for a ninth time when Los Angeles hosts the 2028 Summer Olympics. U.S. athletes have won a total of 2,968 medals (1,179 gold) at the Olympic Games, the most of any country. In other international competition, the United States is the home of a number of prestigious events, including the America's Cup, World Baseball Classic, the U.S. Open, and the Masters Tournament. The U.S. men's national soccer team has qualified for eleven World Cups, while the women's national team has won the FIFA Women's World Cup and Olympic soccer tournament four and five times, respectively. The 1999 FIFA Women's World Cup was hosted by the United States. Its final match was attended by 90,185, setting the world record for largest women's sporting event crowd at the time. The United States hosted the 1994 FIFA World Cup and will co-host, along with Canada and Mexico, the 2026 FIFA World Cup. This article incorporates text from World Food and Agriculture – Statistical Yearbook 2023 (FAO), a free content work licensed under CC BY-SA IGO 3.0.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Intelligence] | [TOKENS: 2472] |
Intelligence Intelligence (/ˌɪntɛlɪˈdʒəns/) has been defined in many ways: the capacity for abstraction, logic, understanding, self-awareness, learning, emotional knowledge, reasoning, planning, creativity, critical thinking, and problem-solving. It can be described as the ability to perceive or infer information and to retain it as knowledge to be applied to adaptive behaviors within an environment or context. The term rose to prominence during the early 1900s. Most psychologists believe that intelligence can be divided into various domains or competencies. Intelligence has long been studied in humans across numerous disciplines. It has also been observed in the cognition of non-human animals. Some researchers have suggested that plants exhibit forms of intelligence, though this remains controversial. Etymology The word intelligence derives from the Latin nouns intelligentia or intellēctus, which in turn stem from the verb intelligere, to comprehend or perceive. In the Middle Ages, the word intellectus became the scholarly technical term for understanding and a translation for the Greek philosophical term nous. This term, however, was strongly linked to the metaphysical and cosmological theories of teleological scholasticism, including theories of the immortality of the soul, and the concept of the active intellect (also known as the active intelligence). This approach to the study of nature was strongly rejected by early modern philosophers such as Francis Bacon, Thomas Hobbes, John Locke, and David Hume, all of whom preferred "understanding" (in place of "intellectus" or "intelligence") in their English philosophical works. Hobbes, for example, in his Latin De Corpore, used "intellectus intelligit", translated in the English version as "the understanding understandeth", as a typical example of a logical absurdity.
"Intelligence" has therefore become less common in English language philosophy, but it has later been taken up (with the scholastic theories that it now implies) in more contemporary psychology. Definitions There is controversy over how to define intelligence. Scholars describe its constituent abilities in various ways, and differ in the degree to which they conceive of intelligence as quantifiable. A consensus report called Intelligence: Knowns and Unknowns, published in 1995 by the Board of Scientific Affairs of the American Psychological Association, states: Individuals differ from one another in their ability to understand complex ideas, to adapt effectively to the environment, to learn from experience, to engage in various forms of reasoning, to overcome obstacles by taking thought. Although these individual differences can be substantial, they are never entirely consistent: a given person's intellectual performance will vary on different occasions, in different domains, as judged by different criteria. Concepts of "intelligence" are attempts to clarify and organize this complex set of phenomena. Although considerable clarity has been achieved in some areas, no such conceptualization has yet answered all the important questions, and none commands universal assent. Indeed, when two dozen prominent theorists were recently asked to define intelligence, they gave two dozen, somewhat different, definitions. Psychologists and learning researchers also have suggested definitions of intelligence such as the following: "Intelligence is a force, F, that acts so as to maximize future freedom of action. It acts to maximize future freedom of action, or keep options open, with some strength T, with the diversity of possible accessible futures, S, up to some future time horizon, τ. In short, intelligence doesn't like to get trapped". 
Human Human intelligence is the intellectual power of humans, which is marked by complex cognitive feats and high levels of motivation and self-awareness. Intelligence enables humans to remember descriptions of things and use those descriptions in future behaviors. It gives humans the cognitive abilities to learn, form concepts, understand, and reason, including the capacities to recognize patterns, innovate, plan, solve problems, and employ language to communicate. These cognitive abilities can be organized into frameworks like fluid vs. crystallized and the Unified Cattell-Horn-Carroll model, which contains abilities like fluid reasoning, perceptual speed, verbal abilities, and others. Intelligence is different from learning. Learning refers to the act of retaining facts and information or abilities and being able to recall them for future use. Intelligence, on the other hand, is the cognitive ability of someone to perform these and other processes. There have been various attempts to quantify intelligence via psychometric testing. Prominent among these are the various Intelligence Quotient (IQ) tests, which were first developed in the early 20th century to screen children for intellectual disability. Over time, IQ tests became more pervasive, being used to screen immigrants, military recruits, and job applicants. As the tests became more popular, belief that IQ tests measure a fundamental and unchanging attribute that all humans possess became widespread. An influential theory that promoted the idea that IQ measures a fundamental quality possessed by every person is the theory of General Intelligence, or g factor. The g factor is a construct that summarizes the correlations observed between an individual's scores on a range of cognitive tests. Today, most psychologists agree that IQ measures at least some aspects of human intelligence, particularly the ability to thrive in an academic context. 
However, many psychologists question the validity of IQ tests as a measure of intelligence as a whole. There is debate about the heritability of IQ, that is, what proportion of the differences in IQ test performance between individuals is explained by genetic or environmental factors. The scientific consensus is that genetics does not explain average differences in IQ test performance between racial groups. Emotional intelligence is thought to be the ability to convey emotion to others in an understandable way as well as to read the emotions of others accurately. Some theories imply that heightened emotional intelligence could also lead to faster generation and processing of emotions, in addition to greater accuracy. In addition, higher emotional intelligence is thought to help us manage emotions, which is beneficial for our problem-solving skills. Emotional intelligence is important to our mental health and has ties to social intelligence. Social intelligence is the ability to understand the social cues and motivations of others and oneself in social situations. It is thought to be distinct from other types of intelligence, but is related to emotional intelligence. Research on social intelligence has overlapped with studies of how we make judgements of others, the accuracy with which we do so, and why people are viewed as having positive or negative social character. There is debate as to whether these studies and social intelligence come from the same theories or whether there is a distinction between them; they are generally thought to belong to two different schools of thought. Moral intelligence is the capacity to understand right from wrong and to behave based on the value that is believed to be right. It is considered a distinct form of intelligence, independent of both emotional and cognitive intelligence.
Concepts of "book smarts" and "street smart" are contrasting views based on the premise that some people have knowledge gained through academic study, but may lack the experience to sensibly apply that knowledge, while others have knowledge gained through practical experience, but may lack accurate information usually gained through study by which to effectively apply that knowledge. Artificial intelligence researcher Hector Levesque has noted that: Given the importance of learning through text in our own personal lives and in our culture, it is perhaps surprising how utterly dismissive we tend to be of it. It is sometimes derided as being merely "book knowledge", and having it is being "book smart". In contrast, knowledge acquired through direct experience and apprenticeship is called "street knowledge", and having it is being "street smart". Nonhuman animal Although humans have been the primary focus of intelligence researchers, scientists have also attempted to investigate animal intelligence, or more broadly, animal cognition. These researchers are interested in studying both mental ability in a particular species, and comparing abilities between species. They study various measures of problem solving, as well as numerical and verbal reasoning abilities. Some challenges include defining intelligence so it has the same meaning across species, and operationalizing a measure that accurately compares mental ability across species and contexts. Wolfgang Köhler's research on the intelligence of apes is an example of research in this area, as is Stanley Coren's book, The Intelligence of Dogs. Non-human animals particularly noted and studied for their intelligence include chimpanzees, bonobos (notably the language-using Kanzi) and other great apes, dolphins, elephants and to some extent parrots, rats and ravens. Cephalopod intelligence provides an important comparative study. 
Cephalopods appear to exhibit characteristics of significant intelligence, yet their nervous systems differ radically from those of backboned animals. Vertebrates such as mammals, birds, reptiles and fish have shown a fairly high degree of intellect that varies according to each species. The same is true of arthropods. Evidence of a general factor of intelligence has been observed in non-human animals. First described in humans, the g factor has since been identified in a number of non-human species. Cognitive ability and intelligence cannot be measured using the same, largely verbally dependent, scales developed for humans. Instead, intelligence is measured using a variety of interactive and observational tools focusing on innovation, habit reversal, social learning, and responses to novelty. Studies have shown that g is responsible for 47% of the individual variance in cognitive ability measures in primates and between 55% and 60% of the variance in mice (Locurto). These values are similar to the accepted variance in IQ explained by g in humans (40–50%). Plant It has been argued that plants should also be classified as intelligent based on their ability to sense and model external and internal environments and adjust their morphology, physiology and phenotype accordingly to ensure self-preservation and reproduction. A counterargument is that intelligence is commonly understood to involve the creation and use of persistent memories, as opposed to computation that does not involve learning. If this is accepted as definitive of intelligence, then it includes the artificial intelligence of robots capable of "machine learning", but excludes those purely autonomic sense-reaction responses that can be observed in many plants. Plants are not limited to automated sensory-motor responses, however; they are capable of discriminating positive and negative experiences and of "learning" (registering memories) from their past experiences.
They are also capable of communication, accurately computing their circumstances, using sophisticated cost–benefit analysis and taking tightly controlled actions to mitigate and control the diverse environmental stressors. Artificial Scholars studying artificial intelligence have proposed definitions of intelligence that include the intelligence demonstrated by machines. Some of these definitions are meant to be general enough to encompass human and other animal intelligence as well. An intelligent agent can be defined as a system that perceives its environment and takes actions which maximize its chances of success. Kaplan and Haenlein define artificial intelligence as "a system's ability to correctly interpret external data, to learn from such data, and to use those learnings to achieve specific goals and tasks through flexible adaptation". Progress in artificial intelligence can be demonstrated in benchmarks ranging from games to practical tasks such as protein folding. Existing AI lags humans in terms of general intelligence, which is sometimes defined as the "capacity to learn how to carry out a huge range of tasks". Mathematician Olle Häggström defines intelligence in terms of "optimization power", an agent's capacity for efficient cross-domain optimization of the world according to the agent's preferences, or more simply the ability to "steer the future into regions of possibility ranked high in a preference ordering". In this optimization framework, Deep Blue has the power to "steer a chessboard's future into a subspace of possibility which it labels as 'winning', despite attempts by Garry Kasparov to steer the future elsewhere." Hutter and Legg, after surveying the literature, define intelligence as "an agent's ability to achieve goals in a wide range of environments". 
While cognitive ability is sometimes measured as a one-dimensional parameter, it could also be represented as a "hypersurface in a multidimensional space" to compare systems that are good at different intellectual tasks. Some skeptics believe that there is no meaningful way to define intelligence, aside from "just pointing to ourselves".
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Mars#cite_note-plumes-136] | [TOKENS: 11899] |
Mars Mars is the fourth planet from the Sun. It is also known as the "Red Planet", for its orange-red appearance. Mars is a desert-like rocky planet with a tenuous atmosphere that is primarily carbon dioxide (CO2). At the average surface level the atmospheric pressure is a few thousandths of Earth's, atmospheric temperature ranges from −153 to 20 °C (−243 to 68 °F), and cosmic radiation is high. Mars retains some water, in the ground as well as thinly in the atmosphere, forming cirrus clouds, fog, frost, larger polar regions of permafrost and ice caps (with seasonal CO2 snow), but no bodies of liquid surface water. Its surface gravity is roughly a third of Earth's or double that of the Moon. Its diameter, 6,779 km (4,212 mi), is about half the Earth's, or twice the Moon's, and its surface area is about the size of all the dry land of Earth. Fine dust is prevalent across the surface and the atmosphere, being picked up and spread at the low Martian gravity even by the weak wind of the tenuous atmosphere. The terrain of Mars roughly follows a north-south divide, the Martian dichotomy, with the northern hemisphere mainly consisting of relatively flat, low-lying plains, and the southern hemisphere of cratered highlands. Geologically, the planet is fairly active, with marsquakes trembling underneath the ground, but it also hosts many enormous extinct volcanoes (the tallest is Olympus Mons, 21.9 km or 13.6 mi tall), as well as one of the largest canyons in the Solar System (Valles Marineris, 4,000 km or 2,500 mi long). Mars has two natural satellites that are small and irregular in shape: Phobos and Deimos. With a significant axial tilt of 25 degrees, Mars experiences seasons, like Earth (which has an axial tilt of 23.5 degrees). A Martian solar year is equal to 1.88 Earth years (687 Earth days), and a Martian solar day (sol) is equal to 24.6 hours. Mars formed along with the other planets approximately 4.5 billion years ago.
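The day- and year-length figures above determine how many sols make up a Martian year. A quick sanity check using the rounded values quoted in the text (687 Earth days per Martian year, a 24.6-hour sol):

```python
# Length of the Martian year expressed in sols, using the rounded
# figures from the text: 687 Earth days per Martian year, 24.6 h per sol.
EARTH_DAYS_PER_MARS_YEAR = 687
HOURS_PER_SOL = 24.6

sols_per_mars_year = EARTH_DAYS_PER_MARS_YEAR * 24 / HOURS_PER_SOL
print(f"{sols_per_mars_year:.1f} sols per Martian year")  # ≈ 670 sols
```

With unrounded values (about 686.98 Earth days and a 24.66-hour sol) the accepted figure is roughly 668.6 sols; the small discrepancy here is purely from the rounding in the quoted numbers.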
During the Martian Noachian period (4.5 to 3.5 billion years ago), its surface was marked by meteor impacts, valley formation, erosion, the possible presence of water oceans and the loss of its magnetosphere. The Hesperian period (beginning 3.5 billion years ago and ending 3.3–2.9 billion years ago) was dominated by widespread volcanic activity and flooding that carved immense outflow channels. The Amazonian period, which continues to the present, now dominates the geological processes shaping the planet. Because of Mars's geological history, the possibility of past or present life on Mars remains an area of active scientific investigation, with some possible traces needing further examination. Being visible with the naked eye in Earth's sky as a red wandering star, Mars has been observed throughout history, acquiring diverse associations in different cultures. In 1963 the first flight to Mars took place with Mars 1, but communication was lost en route. The first successful flyby exploration of Mars was conducted in 1965 with Mariner 4. In 1971 Mariner 9 entered orbit around Mars, being the first spacecraft to orbit any body other than the Moon, Sun or Earth; following in the same year were the first uncontrolled impact (Mars 2) and first successful landing (Mars 3) on Mars. Probes have been active on Mars continuously since 1997. At times, more than ten probes have simultaneously operated in orbit or on the surface, more than at any other planet beyond Earth. Mars is an often proposed target for future crewed exploration missions, though no such mission is currently planned. Natural history Scientists have theorized that during the Solar System's formation, Mars was created as the result of a random process of run-away accretion of material from the protoplanetary disk that orbited the Sun. Mars has many distinctive chemical features caused by its position in the Solar System.
Elements with comparatively low boiling points, such as chlorine, phosphorus, and sulfur, are much more common on Mars than on Earth; these elements were probably pushed outward by the young Sun's energetic solar wind. After the formation of the planets, the inner Solar System may have been subjected to the so-called Late Heavy Bombardment. About 60% of the surface of Mars shows a record of impacts from that era, whereas much of the remaining surface is probably underlain by immense impact basins caused by those events. However, more recent modeling has disputed the existence of the Late Heavy Bombardment. There is evidence of an enormous impact basin in the Northern Hemisphere of Mars, spanning 10,600 by 8,500 kilometres (6,600 by 5,300 mi), or roughly four times the size of the Moon's South Pole–Aitken basin, which would be the largest impact basin yet discovered if confirmed. It has been hypothesized that the basin was formed when Mars was struck by a Pluto-sized body about four billion years ago. The event, thought to be the cause of the Martian hemispheric dichotomy, created the smooth Borealis basin that covers 40% of the planet. A 2023 study shows evidence, based on the orbital inclination of Deimos (a small moon of Mars), that Mars may once have had a ring system 3.5 to 4 billion years ago. This ring system may have been formed from a moon, 20 times more massive than Phobos, orbiting Mars billions of years ago, with Phobos a remnant of that ring. Epochs: The geological history of Mars can be split into many periods, but the three primary periods are the Noachian, the Hesperian, and the Amazonian. Geological activity is still taking place on Mars. The Athabasca Valles is home to sheet-like lava flows created about 200 million years ago. Water flows in the grabens called the Cerberus Fossae occurred less than 20 million years ago, indicating equally recent volcanic intrusions. The Mars Reconnaissance Orbiter has captured images of avalanches.
Physical characteristics Mars is approximately half the diameter of Earth or twice that of the Moon, with a surface area only slightly less than the total area of Earth's dry land. Mars is less dense than Earth, having about 15% of Earth's volume and 11% of Earth's mass, resulting in about 38% of Earth's surface gravity. Mars is the only presently known example of a desert planet, a rocky planet with a surface akin to that of Earth's deserts. The red-orange appearance of the Martian surface is caused by iron(III) oxide (nanophase Fe2O3) and the iron(III) oxide-hydroxide mineral goethite. It can look like butterscotch; other common surface colors include golden, brown, tan, and greenish, depending on the minerals present. Like Earth, Mars is differentiated into a dense metallic core overlaid by less dense rocky layers. The outermost layer is the crust, which is on average about 42–56 kilometres (26–35 mi) thick, with a minimum thickness of 6 kilometres (3.7 mi) in Isidis Planitia, and a maximum thickness of 117 kilometres (73 mi) in the southern Tharsis plateau. For comparison, Earth's crust averages 27.3 ± 4.8 km in thickness. The most abundant elements in the Martian crust are silicon, oxygen, iron, magnesium, aluminum, calcium, and potassium. Mars is confirmed to be seismically active; in 2019, it was reported that InSight had detected and recorded over 450 marsquakes and related events. Beneath the crust is a silicate mantle responsible for many of the tectonic and volcanic features on the planet's surface. The upper Martian mantle is a low-velocity zone, where the velocity of seismic waves is lower than surrounding depth intervals. The mantle appears to be rigid down to the depth of about 250 km, giving Mars a very thick lithosphere compared to Earth. Below this the mantle gradually becomes more ductile, and the seismic wave velocity starts to grow again. 
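The bulk figures quoted at the start of this section (about 11% of Earth's mass, roughly half its diameter) are consistent with the stated ~38% surface gravity, since surface gravity scales as g ∝ M/R². A minimal check, assuming Earth's mean diameter of 12,742 km (a value not given in the text):

```python
# Surface gravity ratio from g ∝ M / R², using the article's figures.
MASS_RATIO = 0.11          # Mars mass as a fraction of Earth's (text: ~11%)
MARS_DIAMETER_KM = 6779    # from the text
EARTH_DIAMETER_KM = 12742  # Earth's mean diameter (assumed here)

radius_ratio = MARS_DIAMETER_KM / EARTH_DIAMETER_KM
gravity_ratio = MASS_RATIO / radius_ratio**2
print(f"surface gravity ≈ {gravity_ratio:.0%} of Earth's")  # ≈ 39%
```

The slight overshoot relative to the quoted ~38% comes from rounding the mass ratio; with the more precise value of about 10.7%, the same formula gives roughly 38%.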
The Martian mantle does not appear to have a thermally insulating layer analogous to Earth's lower mantle; instead, below 1050 km in depth, it becomes mineralogically similar to Earth's transition zone. At the bottom of the mantle lies a basal liquid silicate layer approximately 150–180 km thick. The Martian mantle appears to be highly heterogeneous, with dense fragments up to 4 km across, likely injected deep into the planet by colossal impacts ~4.5 billion years ago; high-frequency waves from eight marsquakes slowed as they passed these localized regions, and modeling indicates the heterogeneities are compositionally distinct debris preserved because Mars lacks plate tectonics and has a sluggishly convecting interior that prevents complete homogenization. Mars's iron and nickel core is at least partially molten, and may have a solid inner core. It is around half of Mars's radius, approximately 1650–1675 km, and is enriched in light elements such as sulfur, oxygen, carbon, and hydrogen. The temperature of the core is estimated to be 2000–2400 K, compared to 5400–6230 K for Earth's solid inner core. In 2025, based on data from the InSight lander, a group of researchers reported the detection of a solid inner core 613 kilometres (381 mi) ± 67 kilometres (42 mi) in radius. Mars is a terrestrial planet with a surface that consists of minerals containing silicon and oxygen, metals, and other elements that typically make up rock. The Martian surface is primarily composed of tholeiitic basalt, although parts are more silica-rich than typical basalt and may be similar to andesitic rocks on Earth, or silica glass. Regions of low albedo suggest concentrations of plagioclase feldspar, with northern low albedo regions displaying higher than normal concentrations of sheet silicates and high-silicon glass. Parts of the southern highlands include detectable amounts of high-calcium pyroxenes. Localized concentrations of hematite and olivine have been found.
Much of the surface is deeply covered by finely grained iron(III) oxide dust. The Phoenix lander returned data showing Martian soil to be slightly alkaline and containing elements such as magnesium, sodium, potassium and chlorine. These nutrients are found in soils on Earth, and are necessary for plant growth. Experiments performed by the lander showed that the Martian soil has a basic pH of 7.7, and contains 0.6% perchlorate by weight, concentrations that are toxic to humans. Streaks are common across Mars and new ones appear frequently on steep slopes of craters, troughs, and valleys. The streaks are dark at first and get lighter with age. The streaks can start in a tiny area, then spread out for hundreds of metres. They have been seen to follow the edges of boulders and other obstacles in their path. The commonly accepted hypotheses include that they are dark underlying layers of soil revealed after avalanches of bright dust or dust devils. Several other explanations have been put forward, including those that involve water or even the growth of organisms. Environmental radiation levels on the surface average 0.64 millisieverts per day, significantly less than the 1.84 millisieverts per day or 22 millirads per day experienced during the flight to and from Mars. For comparison, the radiation levels in low Earth orbit, where Earth's space stations orbit, are around 0.5 millisieverts per day. Hellas Planitia has the lowest surface radiation at about 0.342 millisieverts per day, featuring lava tubes southwest of Hadriacus Mons with levels potentially as low as 0.064 millisieverts per day, comparable to radiation levels during flights on Earth. Although Mars has no evidence of a structured global magnetic field, observations show that parts of the planet's crust have been magnetized, suggesting that alternating polarity reversals of its dipole field have occurred in the past.
This paleomagnetism of magnetically susceptible minerals is similar to the alternating bands found on Earth's ocean floors. One hypothesis, published in 1999 and re-examined in October 2005 (with the help of the Mars Global Surveyor), is that these bands suggest plate tectonic activity on Mars four billion years ago, before the planetary dynamo ceased to function and the planet's magnetic field faded. Geography and features Although better remembered for mapping the Moon, Johann Heinrich von Mädler and Wilhelm Beer were the first areographers. They began by establishing that most of Mars's surface features were permanent and by more precisely determining the planet's rotation period. In 1840, Mädler combined ten years of observations and drew the first map of Mars. Features on Mars are named from a variety of sources. Albedo features are named for classical mythology. Craters larger than roughly 50 km are named for deceased scientists and writers and others who have contributed to the study of Mars. Smaller craters are named for towns and villages of the world with populations of less than 100,000. Large valleys are named for the word "Mars" or "star" in various languages; smaller valleys are named for rivers. Large albedo features retain many of the older names but are often updated to reflect new knowledge of the nature of the features. For example, Nix Olympica (the snows of Olympus) has become Olympus Mons (Mount Olympus). The surface of Mars as seen from Earth is divided into two kinds of areas, with differing albedo. The paler plains covered with dust and sand rich in reddish iron oxides were once thought of as Martian "continents" and given names like Arabia Terra (land of Arabia) or Amazonis Planitia (Amazonian plain). The dark features were thought to be seas, hence their names Mare Erythraeum, Mare Sirenum and Aurorae Sinus. The largest dark feature seen from Earth is Syrtis Major Planum. The permanent northern polar ice cap is named Planum Boreum. 
The southern cap is called Planum Australe. Mars's equator is defined by its rotation, but the location of its Prime Meridian was specified, as was Earth's (at Greenwich), by choice of an arbitrary point; Mädler and Beer selected a line for their first maps of Mars in 1830. After the spacecraft Mariner 9 provided extensive imagery of Mars in 1972, a small crater (later called Airy-0), located in the Sinus Meridiani ("Middle Bay" or "Meridian Bay"), was chosen by Merton E. Davies, Harold Masursky, and Gérard de Vaucouleurs for the definition of 0.0° longitude to coincide with the original selection. Because Mars has no oceans, and hence no "sea level", a zero-elevation surface had to be selected as a reference level; this is called the areoid of Mars, analogous to the terrestrial geoid. Zero altitude was defined by the height at which there is 610.5 Pa (6.105 mbar) of atmospheric pressure. This pressure corresponds to the triple point of water, and it is about 0.6% of the sea level surface pressure on Earth (0.006 atm). For mapping purposes, the United States Geological Survey divides the surface of Mars into thirty cartographic quadrangles, each named for a classical albedo feature it contains. In April 2023, The New York Times reported an updated global map of Mars based on images from the Hope spacecraft. A related, but much more detailed, global Mars map was released by NASA on 16 April 2023. The vast upland region Tharsis contains several massive volcanoes, which include the shield volcano Olympus Mons. The edifice is over 600 km (370 mi) wide. Because the mountain is so large, with complex structure at its edges, giving a definite height to it is difficult. Its local relief, from the foot of the cliffs which form its northwest margin to its peak, is over 21 km (13 mi), a little over twice the height of Mauna Kea as measured from its base on the ocean floor. 
The total elevation change from the plains of Amazonis Planitia, over 1,000 km (620 mi) to the northwest, to the summit approaches 26 km (16 mi), roughly three times the height of Mount Everest, which in comparison stands at just over 8.8 kilometres (5.5 mi). Consequently, Olympus Mons is either the tallest or second-tallest mountain in the Solar System; the only known mountain which might be taller is the Rheasilvia peak on the asteroid Vesta, at 20–25 km (12–16 mi). The dichotomy of Martian topography is striking: northern plains flattened by lava flows contrast with the southern highlands, pitted and cratered by ancient impacts. It is possible that, four billion years ago, the Northern Hemisphere of Mars was struck by an object one-tenth to two-thirds the size of Earth's Moon. If this is the case, the Northern Hemisphere of Mars would be the site of an impact crater 10,600 by 8,500 kilometres (6,600 by 5,300 mi) in size, or roughly the area of Europe, Asia, and Australia combined, surpassing Utopia Planitia and the Moon's South Pole–Aitken basin as the largest impact crater in the Solar System. Mars is scarred by 43,000 impact craters with a diameter of 5 kilometres (3.1 mi) or greater. The largest exposed crater is Hellas, which is 2,300 kilometres (1,400 mi) wide and 7,000 metres (23,000 ft) deep, and is a light albedo feature clearly visible from Earth. There are other notable impact features, such as Argyre, which is around 1,800 kilometres (1,100 mi) in diameter, and Isidis, which is around 1,500 kilometres (930 mi) in diameter. Due to the smaller mass and size of Mars, the probability of an object colliding with the planet is about half that of Earth. Mars is located closer to the asteroid belt, so it has an increased chance of being struck by materials from that source. Mars is more likely to be struck by short-period comets, i.e., those that lie within the orbit of Jupiter. 
Martian craters can have a morphology that suggests the ground became wet after the meteor impact. The large canyon, Valles Marineris (Latin for 'Mariner Valleys', also known as Agathodaemon in the old canal maps), has a length of 4,000 kilometres (2,500 mi) and a depth of up to 7 kilometres (4.3 mi). The length of Valles Marineris is equivalent to the length of Europe and extends across one-fifth the circumference of Mars. By comparison, the Grand Canyon on Earth is only 446 kilometres (277 mi) long and nearly 2 kilometres (1.2 mi) deep. Valles Marineris was formed due to the swelling of the Tharsis area, which caused the crust in the area of Valles Marineris to collapse. In 2012, it was proposed that Valles Marineris is not just a graben, but a plate boundary where 150 kilometres (93 mi) of transverse motion has occurred, making Mars a planet with possibly a two-tectonic plate arrangement. Images from the Thermal Emission Imaging System (THEMIS) aboard NASA's Mars Odyssey orbiter have revealed seven possible cave entrances on the flanks of the volcano Arsia Mons. The caves, named after loved ones of their discoverers, are collectively known as the "seven sisters". Cave entrances measure from 100 to 252 metres (328 to 827 ft) wide and they are estimated to be at least 73 to 96 metres (240 to 315 ft) deep. Because light does not reach the floor of most of the caves, they may extend much deeper than these lower estimates and widen below the surface. "Dena" is the only exception; its floor is visible and was measured to be 130 metres (430 ft) deep. The interiors of these caverns may be protected from micrometeoroids, UV radiation, solar flares and high energy particles that bombard the planet's surface. Martian geysers (or CO2 jets) are putative sites of small gas and dust eruptions that occur in the south polar region of Mars during the spring thaw.
"Dark dune spots" and "spiders" – or araneiforms – are the two most visible types of features ascribed to these eruptions. Similarly sized dust will settle from the thinner Martian atmosphere sooner than it would on Earth. For example, the dust suspended by the 2001 global dust storms on Mars only remained in the Martian atmosphere for 0.6 years, while the dust from Mount Pinatubo took about two years to settle. However, under current Martian conditions, the mass movements involved are generally much smaller than on Earth. Even the 2001 global dust storms on Mars moved only the equivalent of a very thin dust layer – about 3 μm thick if deposited with uniform thickness between 58° north and south of the equator. Dust deposition at the two rover sites has proceeded at a rate of about the thickness of a grain every 100 sols. Atmosphere Mars lost its magnetosphere 4 billion years ago, possibly because of numerous asteroid strikes, so the solar wind interacts directly with the Martian ionosphere, lowering the atmospheric density by stripping away atoms from the outer layer. Both Mars Global Surveyor and Mars Express have detected ionized atmospheric particles trailing off into space behind Mars, and this atmospheric loss is being studied by the MAVEN orbiter. Compared to Earth, the atmosphere of Mars is quite rarefied. Atmospheric pressure on the surface today ranges from a low of 30 Pa (0.0044 psi) on Olympus Mons to over 1,155 Pa (0.1675 psi) in Hellas Planitia, with a mean pressure at the surface level of 600 Pa (0.087 psi). The highest atmospheric density on Mars is equal to that found 35 kilometres (22 mi) above Earth's surface. The resulting mean surface pressure is only 0.6% of Earth's 101.3 kPa (14.69 psi). The scale height of the atmosphere is about 10.8 kilometres (6.7 mi), which is higher than Earth's 6 kilometres (3.7 mi), because the surface gravity of Mars is only about 38% of Earth's. 
The atmosphere of Mars consists of about 96% carbon dioxide, 1.93% argon and 1.89% nitrogen, along with traces of oxygen and water. The atmosphere is quite dusty, containing particulates about 1.5 μm in diameter which give the Martian sky a tawny color when seen from the surface. It may take on a pink hue due to iron oxide particles suspended in it. Despite repeated detections of methane on Mars, there is no scientific consensus as to its origin. One suggestion is that methane exists on Mars and that its concentration fluctuates seasonally. The methane could be produced by non-biological processes such as serpentinization, involving water, carbon dioxide, and the mineral olivine, which is known to be common on Mars, or by Martian life. Compared to Earth, the higher concentration of atmospheric CO2 and lower surface pressure may be why sound is attenuated more on Mars, where natural sources are rare apart from the wind. Using acoustic recordings collected by the Perseverance rover, researchers concluded that the speed of sound there is approximately 240 m/s for frequencies below 240 Hz, and 250 m/s for those above. Auroras have been detected on Mars. Because Mars lacks a global magnetic field, the types and distribution of auroras there differ from those on Earth; rather than being mostly restricted to polar regions as is the case on Earth, a Martian aurora can encompass the planet. In September 2017, NASA reported that radiation levels on the surface of Mars were temporarily doubled, and were associated with an aurora 25 times brighter than any observed earlier, due to a massive, and unexpected, solar storm in the middle of the month. Mars has seasons that alternate between its northern and southern hemispheres, much as Earth's do. 
Additionally, the orbit of Mars has, compared to Earth's, a large eccentricity; it approaches perihelion when it is summer in its southern hemisphere and winter in its northern, and aphelion when it is winter in its southern hemisphere and summer in its northern. As a result, the seasons in its southern hemisphere are more extreme and the seasons in its northern are milder than would otherwise be the case. The summer temperatures in the south can be warmer than the equivalent summer temperatures in the north by up to 30 °C (54 °F). Martian surface temperatures vary from lows of about −110 °C (−166 °F) to highs of up to 35 °C (95 °F) in equatorial summer. The wide range in temperatures is due to the thin atmosphere, which cannot store much solar heat, the low atmospheric pressure (about 1% that of Earth's atmosphere), and the low thermal inertia of Martian soil. The planet is 1.52 times as far from the Sun as Earth, resulting in just 43% of the amount of sunlight. Mars has the largest dust storms in the Solar System, reaching speeds of over 160 km/h (100 mph). These can vary from a storm over a small area to gigantic storms that cover the entire planet. They tend to occur when Mars is closest to the Sun, and have been shown to increase the global temperature. Seasonal changes also deposit a layer of carbon dioxide dry ice on the polar caps. Hydrology Mars contains water in large amounts, though most of it is dust-covered water ice at the Martian polar ice caps. The volume of water ice in the south polar ice cap, if melted, would be enough to cover most of the surface of the planet to a depth of 11 metres (36 ft). Water in its liquid form cannot persist on the surface due to Mars's low atmospheric pressure, which is less than 1% that of Earth. Only at the lowest elevations are the pressure and temperature high enough for liquid water to exist for short periods. 
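The 43% sunlight figure above follows from the inverse-square law for radiative flux (an illustrative check, not a statement from the source):

```python
# Solar flux scales as 1/d^2. With Mars at about 1.52 times Earth's
# distance from the Sun (from the text):
distance_ratio = 1.52
flux_fraction = 1 / distance_ratio ** 2
print(f"{flux_fraction:.0%}")  # ~43% of the sunlight Earth receives
```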
Although little water is present in the atmosphere, there is enough to produce clouds of water ice and occasional snow and frost, often mixed with frozen carbon dioxide (dry ice). Landforms visible on Mars strongly suggest that liquid water has existed on the planet's surface. Huge linear swathes of scoured ground, known as outflow channels, cut across the surface in about 25 places. These are thought to be a record of erosion caused by the catastrophic release of water from subsurface aquifers, though some of these structures have been hypothesized to result from the action of glaciers or lava. One of the larger examples, Ma'adim Vallis, is 700 kilometres (430 mi) long, much greater than the Grand Canyon, with a width of 20 kilometres (12 mi) and a depth of 2 kilometres (1.2 mi) in places. It is thought to have been carved by flowing water early in Mars's history. The youngest of these channels is thought to have formed only a few million years ago. Elsewhere, particularly on the oldest areas of the Martian surface, finer-scale, dendritic networks of valleys are spread across significant proportions of the landscape. Features of these valleys and their distribution strongly imply that they were carved by runoff resulting from precipitation in early Mars history. Subsurface water flow and groundwater sapping may play important subsidiary roles in some networks, but precipitation was probably the root cause of the incision in almost all cases. Along crater and canyon walls, there are thousands of features that appear similar to terrestrial gullies. The gullies tend to be in the highlands of the Southern Hemisphere and face the Equator; all are poleward of 30° latitude. A number of authors have suggested that their formation process involves liquid water, probably from melting ice, although others have argued for formation mechanisms involving carbon dioxide frost or the movement of dry dust. 
The gullies show no partial degradation by weathering, and no superimposed impact craters have been observed on them, indicating that these are young features, possibly still active. Other geological features, such as deltas and alluvial fans preserved in craters, are further evidence for warmer, wetter conditions at an interval or intervals in earlier Mars history. Such conditions necessarily require the presence of crater lakes across a large proportion of the surface, for which there is independent mineralogical, sedimentological and geomorphological evidence. Further evidence that liquid water once existed on the surface of Mars comes from the detection of specific minerals such as hematite and goethite, both of which sometimes form in the presence of water. The chemical signature of water vapor on Mars was first unequivocally demonstrated in 1963 by spectroscopy using an Earth-based telescope. In 2004, Opportunity detected the mineral jarosite. This forms only in the presence of acidic water, showing that water once existed on Mars. The Spirit rover found concentrated deposits of silica in 2007 that indicated wet conditions in the past, and in December 2011 the mineral gypsum, which also forms in the presence of water, was found on the surface by NASA's Mars rover Opportunity. It is estimated that the amount of water in the upper mantle of Mars, represented by hydroxyl ions contained within Martian minerals, is equal to or greater than that of Earth, at 50–300 parts per million of water, which is enough to cover the entire planet to a depth of 200–1,000 metres (660–3,280 ft). On 18 March 2013, NASA reported evidence from instruments on the Curiosity rover of mineral hydration, likely hydrated calcium sulfate, in several rock samples, including the broken fragments of "Tintina" rock and "Sutton Inlier" rock, as well as in veins and nodules in other rocks like "Knorr" rock and "Wernicke" rock. 
Analysis using the rover's DAN instrument provided evidence of subsurface water, amounting to as much as 4% water content, down to a depth of 60 centimetres (24 in), during the rover's traverse from the Bradbury Landing site to the Yellowknife Bay area in the Glenelg terrain. In September 2015, NASA announced that they had found strong evidence of hydrated brine flows in recurring slope lineae, based on spectrometer readings of the darkened areas of slopes. These streaks flow downhill in Martian summer, when the temperature is above −23 °C, and freeze at lower temperatures. These observations supported earlier hypotheses, based on timing of formation and their rate of growth, that these dark streaks resulted from water flowing just below the surface. However, later work suggested that the lineae may be dry, granular flows instead, with at most a limited role for water in initiating the process. A definitive conclusion about the presence, extent, and role of liquid water on the Martian surface remains elusive. Researchers suspect much of the low northern plains of the planet were covered with an ocean hundreds of meters deep, though this theory remains controversial. In March 2015, scientists stated that such an ocean might have been the size of Earth's Arctic Ocean. This finding was derived from the ratio of protium to deuterium in the modern Martian atmosphere compared to that ratio on Earth. The amount of Martian deuterium (D/H = (9.3 ± 1.7) × 10⁻⁴) is five to seven times the amount on Earth (D/H = 1.56 × 10⁻⁴), suggesting that ancient Mars had significantly higher levels of water. Results from the Curiosity rover had previously found a high ratio of deuterium in Gale Crater, though not significantly high enough to suggest the former presence of an ocean. Other scientists caution that these results have not been confirmed, and point out that Martian climate models have not yet shown that the planet was warm enough in the past to support bodies of liquid water. 
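The "five to seven times" enrichment quoted above follows directly from the two D/H ratios; a quick check (illustrative only):

```python
# Deuterium-to-hydrogen ratios from the text
dh_mars = 9.3e-4        # with a quoted uncertainty of ±1.7e-4
dh_earth = 1.56e-4

enrichment = dh_mars / dh_earth
low = (9.3 - 1.7) * 1e-4 / dh_earth
high = (9.3 + 1.7) * 1e-4 / dh_earth
print(round(enrichment, 1), round(low, 1), round(high, 1))  # ~6.0, spanning roughly 4.9 to 7.1
```

The central value is about 6, and propagating the quoted uncertainty gives roughly 5 to 7, matching the text.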
Near the northern polar cap is the 81.4 kilometres (50.6 mi) wide Korolev Crater, which the Mars Express orbiter found to be filled with approximately 2,200 cubic kilometres (530 cu mi) of water ice. In November 2016, NASA reported finding a large amount of underground ice in the Utopia Planitia region. The volume of water detected has been estimated to be equivalent to the volume of water in Lake Superior (which is 12,100 cubic kilometers). During observations from 2018 through 2021, the ExoMars Trace Gas Orbiter spotted indications of water, probably subsurface ice, in the Valles Marineris canyon system. Orbital motion Mars's average distance from the Sun is roughly 230 million km (143 million mi), and its orbital period is 687 (Earth) days. The solar day (or sol) on Mars is only slightly longer than an Earth day: 24 hours, 39 minutes, and 35.244 seconds. A Martian year is equal to 1.8809 Earth years, or 1 year, 320 days, and 18.2 hours. The gravitational potential difference, and thus the delta-v needed to transfer between Earth and Mars, is the second lowest of any planet from Earth. The axial tilt of Mars is 25.19° relative to its orbital plane, which is similar to the axial tilt of Earth. As a result, Mars has seasons like Earth, though on Mars they are nearly twice as long because its orbital period is that much longer. In the present day, the orientation of the north pole of Mars is close to the star Deneb. Mars has a relatively pronounced orbital eccentricity of about 0.09; of the seven other planets in the Solar System, only Mercury has a larger orbital eccentricity. It is known that in the past, Mars has had a much more circular orbit. At one point, 1.35 million Earth years ago, Mars had an eccentricity of roughly 0.002, much less than that of Earth today. Mars's cycle of eccentricity is 96,000 Earth years, compared to Earth's cycle of 100,000 years. Mars makes its closest approach to Earth around opposition, which recurs with a synodic period of 779.94 days. 
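The 779.94-day synodic period can be recovered from the two sidereal orbital periods via the standard relation 1/S = 1/P_inner − 1/P_outer (illustrative sketch; the period values used are standard figures, not given in this excerpt):

```python
def synodic_period(p_inner_days: float, p_outer_days: float) -> float:
    """Synodic period of two co-orbiting bodies: 1/S = 1/P_inner - 1/P_outer."""
    return 1 / (1 / p_inner_days - 1 / p_outer_days)

# Sidereal orbital periods (standard values): Earth ~365.256 d, Mars ~686.98 d
s = synodic_period(365.256, 686.98)
print(round(s, 1))  # ~779.9 days, matching the quoted 779.94 to within input rounding
```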
It should not be confused with a Mars solar conjunction, when Earth and Mars are on opposite sides of the Solar System, forming a straight line through the Sun. The average time between successive oppositions of Mars, its synodic period, is 780 days, but the number of days between successive oppositions can range from 764 to 812. The distance at close approach varies between about 54 and 103 million km (34 and 64 million mi) due to the planets' elliptical orbits, which causes comparable variation in angular size. At their furthest, Mars and Earth can be as far as 401 million km (249 million mi) apart. Mars comes into opposition from Earth every 2.1 years. Oppositions occur near Mars's perihelion in some years, as in 2003, 2018 and 2035, with the 2020 and 2033 events also being close to perihelic opposition. The mean apparent magnitude of Mars is +0.71, with a standard deviation of 1.05. Because the orbit of Mars is eccentric, the magnitude at opposition can range from about −3.0 to −1.4. The minimum brightness is magnitude +1.86, when the planet is near aphelion and in conjunction with the Sun. At its brightest, Mars (along with Jupiter) is second only to Venus in apparent brightness. Mars usually appears distinctly yellow, orange, or red. When farthest away from Earth, it is more than seven times farther away than when it is closest. Mars is usually close enough for particularly good viewing once or twice at 15-year or 17-year intervals. Optical ground-based telescopes are typically limited to resolving features about 300 kilometres (190 mi) across when Earth and Mars are closest, because of Earth's atmosphere. As Mars approaches opposition, it begins a period of retrograde motion, which means it appears to move backwards in a looping curve with respect to the background stars. This retrograde motion lasts for about 72 days, and Mars reaches its peak apparent brightness in the middle of this interval. 
Moons Mars has two relatively small (compared to Earth's) natural moons, Phobos (about 22 km (14 mi) in diameter) and Deimos (about 12 km (7.5 mi) in diameter), which orbit at 9,376 km (5,826 mi) and 23,460 km (14,580 mi) around the planet. The origin of both moons is unclear, although a popular theory states that they were asteroids captured into Martian orbit. Both satellites were discovered in 1877 by Asaph Hall and were named after the characters Phobos (the deity of panic and fear) and Deimos (the deity of terror and dread), twins from Greek mythology who accompanied their father Ares, god of war, into battle. Mars was the Roman equivalent to Ares. In modern Greek, the planet retains its ancient name Ares (Aris: Άρης). From the surface of Mars, the motions of Phobos and Deimos appear different from that of the Earth's satellite, the Moon. Phobos rises in the west, sets in the east, and rises again in just 11 hours. Deimos, being only just outside synchronous orbit – where the orbital period would match the planet's period of rotation – rises as expected in the east, but slowly. Because the orbit of Phobos is below a synchronous altitude, tidal forces from Mars are gradually lowering its orbit. In about 50 million years, it could either crash into Mars's surface or break up into a ring structure around the planet. The origin of the two satellites is not well understood. Their low albedo and carbonaceous chondrite composition have been regarded as similar to asteroids, supporting a capture theory. The unstable orbit of Phobos would seem to point toward a relatively recent capture. But both have circular orbits near the equator, which is unusual for captured objects, and the required capture dynamics are complex. Accretion early in the history of Mars is plausible, but would not account for a composition resembling asteroids rather than Mars itself, if that is confirmed. 
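The roughly 11-hour interval between successive risings of Phobos mentioned above can be sketched from Kepler's third law plus the planet's rotation. The gravitational parameter and sidereal rotation period below are standard values and assumptions, not from this excerpt:

```python
import math

GM_MARS = 4.2828e13        # m^3/s^2, standard gravitational parameter of Mars (assumption)
ROTATION_H = 24.6229       # sidereal rotation period of Mars in hours (assumption)

# Kepler's third law: T = 2*pi*sqrt(a^3 / GM)
a_m = 9376e3               # Phobos's orbital radius from the text, in metres
t_orbit_h = 2 * math.pi * math.sqrt(a_m**3 / GM_MARS) / 3600

# Phobos orbits faster than Mars rotates, so for a surface observer the
# rise-to-rise interval is 1 / (1/T_orbit - 1/T_rotation)
rise_to_rise_h = 1 / (1 / t_orbit_h - 1 / ROTATION_H)
print(round(t_orbit_h, 2), round(rise_to_rise_h, 1))  # orbit ~7.7 h, rises ~11 h apart
```

Because the orbital period is shorter than the rotation period, Phobos outruns the surface below it, which is why it rises in the west and sets in the east.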
Mars may have yet-undiscovered moons, smaller than 50 to 100 metres (160 to 330 ft) in diameter, and a dust ring is predicted to exist between Phobos and Deimos. A third possibility for their origin as satellites of Mars is the involvement of a third body or a type of impact disruption. More recent lines of evidence for Phobos having a highly porous interior, and suggesting a composition containing mainly phyllosilicates and other minerals known from Mars, point toward an origin of Phobos from material ejected by an impact on Mars that reaccreted in Martian orbit, similar to the prevailing theory for the origin of Earth's satellite. Although the visible and near-infrared (VNIR) spectra of the moons of Mars resemble those of outer-belt asteroids, the thermal infrared spectra of Phobos are reported to be inconsistent with chondrites of any class. It is also possible that Phobos and Deimos were fragments of an older moon, formed by debris from a large impact on Mars, and then destroyed by a more recent impact upon the satellite. More recently, a study conducted by a team of researchers from multiple countries suggests that a lost moon, at least fifteen times the size of Phobos, may have existed in the past. Analysis of rocks that record tidal processes on the planet suggests that these tides may have been regulated by such a past moon. Human observations and exploration The history of observations of Mars is marked by the oppositions of Mars, when the planet is closest to Earth and hence is most easily visible, which occur every couple of years. Even more notable are the perihelic oppositions of Mars, which are distinguished because Mars is close to perihelion, making it even closer to Earth. The ancient Sumerians named Mars Nergal, the god of war and plague. During Sumerian times, Nergal was a minor deity of little significance, but, during later times, his main cult center was the city of Nineveh. 
In Mesopotamian texts, Mars is referred to as the "star of judgement of the fate of the dead". The existence of Mars as a wandering object in the night sky was also recorded by the ancient Egyptian astronomers and, by 1534 BCE, they were familiar with the retrograde motion of the planet. By the period of the Neo-Babylonian Empire, the Babylonian astronomers were making regular records of the positions of the planets and systematic observations of their behavior. For Mars, they knew that the planet made 37 synodic periods, or 42 circuits of the zodiac, every 79 years. They invented arithmetic methods for making minor corrections to the predicted positions of the planets. In Ancient Greece, the planet was known as Πυρόεις. More commonly, the Greek name for the planet now known as Mars was Ares. It was the Romans who named the planet Mars, for their god of war, often represented by the sword and shield of the planet's namesake. In the fourth century BCE, Aristotle noted that Mars disappeared behind the Moon during an occultation, indicating that the planet was farther away than the Moon. Ptolemy, a Greek living in Alexandria, attempted to address the problem of the orbital motion of Mars. Ptolemy's model and his collective work on astronomy were presented in the multi-volume collection later called the Almagest (from the Arabic for "greatest"), which became the authoritative treatise on Western astronomy for the next fourteen centuries. Literature from ancient China confirms that Mars was known to Chinese astronomers by no later than the fourth century BCE. In East Asian cultures, Mars is traditionally referred to as the "fire star" (火星), based on the Wuxing system. In 1609, Johannes Kepler published a ten-year study of the Martian orbit, using the diurnal parallax of Mars, measured by Tycho Brahe, to make a preliminary calculation of the relative distance to the planet. 
From Brahe's observations of Mars, Kepler deduced that the planet orbited the Sun not in a circle, but in an ellipse. Moreover, Kepler showed that Mars sped up as it approached the Sun and slowed down as it moved farther away, in a manner that later physicists would explain as a consequence of the conservation of angular momentum. In 1610, the Italian astronomer Galileo Galilei made the first use of a telescope for astronomical observation, including of Mars. With the telescope, the diurnal parallax of Mars was again measured in an effort to determine the Sun-Earth distance; this was first performed by Giovanni Domenico Cassini in 1672. The early parallax measurements were hampered by the quality of the instruments. The only observed occultation of Mars by Venus was that of 13 October 1590, seen by Michael Maestlin at Heidelberg. By the 19th century, the resolution of telescopes had reached a level sufficient for surface features to be identified. On 5 September 1877, a perihelic opposition of Mars occurred. The Italian astronomer Giovanni Schiaparelli used a 22-centimetre (8.7 in) telescope in Milan to help produce the first detailed map of Mars. These maps notably contained features he called canali, which, with the possible exception of the natural canyon Valles Marineris, were later shown to be an optical illusion. These canali were supposedly long, straight lines on the surface of Mars, to which he gave the names of famous rivers on Earth. His term, which means "channels" or "grooves", was popularly mistranslated in English as "canals". Influenced by the observations, the orientalist Percival Lowell founded an observatory which had 30- and 45-centimetre (12- and 18-in) telescopes. The observatory was used for the exploration of Mars during the last good opportunity in 1894, and the following less favorable oppositions. He published several books on Mars and life on the planet, which had a great influence on the public. 
The canali were independently observed by other astronomers, like Henri Joseph Perrotin and Louis Thollon in Nice, using one of the largest telescopes of that time. The seasonal changes (consisting of the diminishing of the polar caps and the dark areas formed during Martian summers), in combination with the canals, led to speculation about life on Mars, and it was a long-held belief that Mars contained vast seas and vegetation. As bigger telescopes were used, fewer long, straight canali were observed. During observations in 1909 by Antoniadi with an 84-centimetre (33 in) telescope, irregular patterns were observed, but no canali were seen. The first spacecraft from Earth to visit Mars was Mars 1 of the Soviet Union, which flew by in 1963, but contact was lost en route. NASA's Mariner 4 followed and became the first spacecraft to successfully transmit from Mars; launched on 28 November 1964, it made its closest approach to the planet on 15 July 1965. Mariner 4 detected the weak Martian radiation belt, measured at about 0.1% that of Earth, and captured the first images of another planet from deep space. Once spacecraft visited the planet during the 1960s and 1970s, many previous conceptions of Mars were overturned. After the results of the Viking life-detection experiments, the hypothesis of a dead planet was generally accepted. The data from Mariner 9 and Viking allowed better maps of Mars to be made. Between Viking 1's shutdown in 1982 and 1997, Mars was visited only by three unsuccessful probes: two that flew past without making contact (Phobos 1, 1988; Mars Observer, 1993) and one (Phobos 2, 1989) that malfunctioned in orbit before reaching its destination, the moon Phobos. In 1997, Mars Pathfinder became the first successful rover mission beyond the Moon and, together with Mars Global Surveyor (operated until late 2006), began an uninterrupted active robotic presence at Mars that has lasted until today. 
It produced complete, extremely detailed maps of the Martian topography, magnetic field and surface minerals. Starting with these missions, a range of new, improved crewless spacecraft, including orbiters, landers, and rovers, have been sent to Mars, with successful missions by NASA (United States), JAXA (Japan), ESA (Europe), the United Kingdom, ISRO (India), Roscosmos (Russia), the United Arab Emirates, and CNSA (China) to study the planet's surface, climate, and geology, uncovering the different elements of the history and dynamics of the hydrosphere of Mars and possible traces of ancient life. As of 2023, Mars is host to nine functioning spacecraft. Seven are in orbit: 2001 Mars Odyssey, Mars Express, Mars Reconnaissance Orbiter, MAVEN, ExoMars Trace Gas Orbiter, the Hope orbiter, and the Tianwen-1 orbiter. Another two are on the surface: the Mars Science Laboratory Curiosity rover and the Perseverance rover. Collected maps are available online at websites including Google Mars. NASA provides two online tools: Mars Trek, which provides visualizations of the planet using data from 50 years of exploration, and Experience Curiosity, which simulates traveling on Mars in 3-D with Curiosity. As of February 2024, debris from these types of missions has reached over seven tons. Most of it consists of crashed and inactive spacecraft as well as discarded components. In April 2024, NASA selected several companies to begin studies on providing commercial services to further enable robotic science on Mars. Key areas include establishing telecommunications, payload delivery and surface imaging. Habitability and habitation During the late 19th century, it was widely accepted in the astronomical community that Mars had life-supporting qualities, including the presence of oxygen and water. However, in 1894 W. W. 
Campbell at Lick Observatory observed the planet and found that "if water vapor or oxygen occur in the atmosphere of Mars it is in quantities too small to be detected by spectroscopes then available". That observation contradicted many of the measurements of the time and was not widely accepted. Campbell and V. M. Slipher repeated the study in 1909 using better instruments, but with the same results. It was not until the findings were confirmed by W. S. Adams in 1925 that the myth of the Earth-like habitability of Mars was finally broken. However, even in the 1960s, articles were published on Martian biology, putting aside explanations other than life for the seasonal changes on Mars. The current understanding of planetary habitability – the ability of a world to develop environmental conditions favorable to the emergence of life – favors planets that have liquid water on their surface. Most often this requires the orbit of a planet to lie within the habitable zone, which for the Sun is estimated to extend from within the orbit of Earth to about that of Mars. During perihelion, Mars dips inside this region, but Mars's thin (low-pressure) atmosphere prevents liquid water from existing over large regions for extended periods. The past flow of liquid water demonstrates the planet's potential for habitability. Recent evidence has suggested that any water on the Martian surface may have been too salty and acidic to support regular terrestrial life. The environmental conditions on Mars are a challenge to sustaining organic life: the planet has little heat transfer across its surface, it has poor insulation against bombardment by the solar wind due to the absence of a magnetosphere and has insufficient atmospheric pressure to retain water in a liquid form (water instead sublimes to a gaseous state). 
Mars is nearly, or perhaps totally, geologically dead; the end of volcanic activity has apparently stopped the recycling of chemicals and minerals between the surface and interior of the planet. Evidence suggests that the planet was once significantly more habitable than it is today, but whether living organisms ever existed there remains unknown. The Viking probes of the mid-1970s carried experiments designed to detect microorganisms in Martian soil at their respective landing sites and had positive results, including a temporary increase in CO2 production on exposure to water and nutrients. This sign of life was later disputed by scientists, resulting in a continuing debate, with NASA scientist Gilbert Levin asserting that Viking may have found life. A 2014 analysis of Martian meteorite EETA79001 found chlorate, perchlorate, and nitrate ions in sufficiently high concentrations to suggest that they are widespread on Mars. UV and X-ray radiation would turn chlorate and perchlorate ions into other, highly reactive oxychlorines, indicating that any organic molecules would have to be buried under the surface to survive. Small quantities of methane and formaldehyde detected by Mars orbiters are both claimed to be possible evidence for life, as these chemical compounds would quickly break down in the Martian atmosphere. Alternatively, these compounds may instead be replenished by volcanic or other geological means, such as serpentinization. Impact glass, formed in meteor impacts, which on Earth can preserve signs of life, has been found on the surfaces of impact craters on Mars and could likewise have preserved signs of life there, if life existed at the site. The Cheyava Falls rock discovered on Mars in June 2024 has been designated by NASA as a "potential biosignature" and was core sampled by the Perseverance rover for possible return to Earth and further examination. 
Although the find is highly intriguing, no definitive determination of a biological or abiotic origin for this rock can be made with the data currently available. Several plans for a human mission to Mars have been proposed, but none has come to fruition. The NASA Authorization Act of 2017 directed NASA to study the feasibility of a crewed Mars mission in the early 2030s; the resulting report concluded that this would be unfeasible. In addition, in 2021, China announced plans to send a crewed Mars mission in 2033. Privately held companies such as SpaceX have also proposed plans to send humans to Mars, with the eventual goal of settling on the planet. As of 2024, SpaceX has proceeded with the development of the Starship launch vehicle with the goal of Mars colonization. In plans shared with the company in April 2024, Elon Musk envisioned the beginning of a Mars colony within the next twenty years. This would be enabled by the planned mass manufacturing of Starship and sustained initially by resupply from Earth and by in situ resource utilization on Mars, until the colony reaches full self-sustainability. Any future human mission to Mars will likely take place within the optimal Mars launch window, which occurs every 26 months. The moon Phobos has been proposed as an anchor point for a space elevator. Besides national space agencies and space companies, groups such as the Mars Society and The Planetary Society advocate for human missions to Mars. In culture Mars is named after the Roman god of war (Greek Ares), but was also associated with the demi-god Heracles (Roman Hercules) by ancient Greek astronomers, as detailed by Aristotle. This association between Mars and war dates back at least to Babylonian astronomy, in which the planet was named for the god Nergal, deity of war and destruction. It persisted into modern times, as exemplified by Gustav Holst's orchestral suite The Planets, whose famous first movement labels Mars "The Bringer of War". 
The planet's symbol, a circle with a spear pointing out to the upper right, is also used as a symbol for the male gender. The symbol dates from at least the 11th century, though a possible predecessor has been found in the Greek Oxyrhynchus Papyri. The idea that Mars was populated by intelligent Martians became widespread in the late 19th century. Schiaparelli's "canali" observations, combined with Percival Lowell's books on the subject, put forward the standard notion of a planet that was a drying, cooling, dying world with ancient civilizations constructing irrigation works. Many other observations and proclamations by notable personalities added to what has been termed "Mars Fever". In the present day, high-resolution mapping of the surface of Mars has revealed no artifacts of habitation, but pseudoscientific speculation about intelligent life on Mars continues. Reminiscent of the canali observations, these speculations are based on small-scale features perceived in spacecraft images, such as "pyramids" and the "Face on Mars". In his book Cosmos, planetary astronomer Carl Sagan wrote: "Mars has become a kind of mythic arena onto which we have projected our Earthly hopes and fears." The depiction of Mars in fiction has been stimulated by its dramatic red color and by nineteenth-century scientific speculations that its surface conditions might support not just life but intelligent life. This gave rise to many science fiction stories involving these concepts, such as H. G. Wells's The War of the Worlds, in which Martians seek to escape their dying planet by invading Earth; Ray Bradbury's The Martian Chronicles, in which human explorers accidentally destroy a Martian civilization; as well as Edgar Rice Burroughs's Barsoom series, C. S. Lewis's novel Out of the Silent Planet (1938), and a number of Robert A. Heinlein stories before the mid-sixties. Since then, depictions of Martians have also extended to animation. 
A comic figure of an intelligent Martian, Marvin the Martian, appeared in Haredevil Hare (1948) as a character in the Looney Tunes animated cartoons of Warner Brothers, and has continued as part of popular culture to the present. After the Mariner and Viking spacecraft had returned pictures of Mars as a lifeless and canal-less world, these ideas about Mars were abandoned; for many science-fiction authors, the new discoveries initially seemed like a constraint, but eventually the post-Viking knowledge of Mars became itself a source of inspiration for works like Kim Stanley Robinson's Mars trilogy.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Weak_artificial_intelligence#cite_ref-4] | [TOKENS: 594] |
Weak artificial intelligence Weak artificial intelligence (weak AI) is artificial intelligence that implements a limited part of the mind; narrow AI, or artificial narrow intelligence (ANI), is AI focused on one narrow task. Weak AI is contrasted with strong AI, which can be interpreted in various ways: Narrow AI can be classified as being "limited to a single, narrowly defined task. Most modern AI systems would be classified in this category." Artificial general intelligence is, conversely, the opposite. Applications and risks Some examples of narrow AI are AlphaGo, self-driving cars, robot systems used in the medical field, and diagnostic systems used by doctors. Narrow AI systems can be dangerous if unreliable, and their behavior can become inconsistent. It can be difficult for such an AI to grasp complex patterns and arrive at a solution that works reliably in various environments; this "brittleness" can cause it to fail in unpredictable ways. Narrow AI failures can sometimes have significant consequences: they could disrupt the electric grid, damage nuclear power plants, cause global economic problems, and misdirect autonomous vehicles. Medicines could be incorrectly sorted and distributed, and a faulty or biased medical-diagnosis AI can have serious, sometimes deadly, consequences. Simple AI programs have already worked their way into society, often unnoticed by the public: autocorrection for typing, speech recognition for speech-to-text programs, and vast expansions in the data science fields are examples. Narrow AI has also been the subject of some controversy, including cases of unfair prison sentencing, hiring discrimination against women, and deaths caused by autonomous driving. Despite being "narrow" AI, recommender systems are effective at predicting user reactions based on their posts, patterns, or trends. 
For instance, TikTok's "For You" algorithm can determine a user's interests or preferences in less than an hour. Some other social media AI systems are used to detect bots that may be involved in propaganda or other potentially malicious activities. Weak AI versus strong AI John Searle contests the possibility of strong AI (by which he means conscious AI). He further believes that the Turing test (created by Alan Turing and originally called the "imitation game", used to assess whether a machine can converse indistinguishably from a human) is not accurate or appropriate for testing whether an AI is "strong". Scholars such as Antonio Lieto have argued that current research on both AI and cognitive modelling is perfectly aligned with the weak-AI hypothesis (which should not be confused with the "general" versus "narrow" AI distinction), and that the popular assumption that cognitively inspired AI systems espouse the strong-AI hypothesis is ill-posed and problematic, since "artificial models of brain and mind can be used to understand mental phenomena without pretending that they are the real phenomena that they are modelling" (as, on the other hand, implied by the strong AI assumption).
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Smalltalk] | [TOKENS: 9732] |
Smalltalk Smalltalk is a purely object-oriented programming language that was originally created in the 1970s for educational use, specifically for constructionist learning, but later found use in business. It was created at Xerox PARC by Learning Research Group (LRG) scientists, including Alan Kay, Dan Ingalls, Adele Goldberg, Ted Kaehler, Diana Merry, and Scott Wallace. In Smalltalk, executing programs are built of opaque, atomic objects, which are instances of template code stored in classes. These objects intercommunicate by message passing, via an intermediary virtual machine environment (VM). A relatively small number of objects, called primitives, are not amenable to live redefinition, sometimes being defined independently of the Smalltalk programming environment. Having undergone significant industry development toward other uses, including business and database functions, Smalltalk is still in use today. When first publicly released, Smalltalk-80 presented numerous foundational ideas for the nascent field of object-oriented programming. From its inception, the language provided interactive programming via an integrated development environment, which requires reflection and late binding in the execution of code. Later development has led to at least one Smalltalk execution environment that lacks such an integrated graphical user interface or front end. Smalltalk-like languages are in active development and have gathered communities of users around them. American National Standards Institute (ANSI) Smalltalk was ratified in 1998 and represents the standard version of Smalltalk. Smalltalk took second place for "most loved programming language" in the Stack Overflow Developer Survey in 2017, but it was not among the 26 most loved programming languages of the 2018 survey. History There are a large number of Smalltalk variants. 
The unqualified word Smalltalk is often used to indicate the Smalltalk-80 language and compatible VM, the first version to be made publicly available, created in 1980. The first hardware environments to run the Smalltalk VMs were Xerox Alto computers. Smalltalk was the product of research led by Alan Kay at Xerox Palo Alto Research Center (PARC); Alan Kay designed most of the early Smalltalk versions, Adele Goldberg wrote most of the documentation, and Dan Ingalls implemented most of the early versions. Smalltalk-71 was an unpublished language design by Kay (circa 1971). In September 1972, Kay made a bet that the core of a programming language based on the idea of message passing could be implemented in "a page of code"; by about the eighth morning, a working interpreter scheme had emerged, forming the basis for what is now termed Smalltalk-72. Its syntax and execution model were very different from modern Smalltalk variants. The first Smalltalk interpreter actually implemented was for Smalltalk-72; it was written by Dan Ingalls in about 700 lines of BASIC in October 1972 for the Data General Nova. This version was demonstrated at the MIT AI Lab by Alan Kay in November that year; published accounts of the Actor model cite this period and Smalltalk-72's message-passing ideas as an influence on the model's development. The first bitmap line-drawing routines were implemented by Ted Kaehler in late December 1972. Smalltalk-72 was ported to the Xerox Alto in April 1973, the same month the first units began operation. After significant revisions that froze some aspects of execution semantics to gain performance (by adopting a Simula-like class-inheritance model of execution), Smalltalk-76 was created. This system had a development environment featuring most of the now-familiar tools, including a class library code browser/editor. 
Smalltalk-80 added metaclasses, to help maintain the "everything is an object" (except variables) paradigm by associating properties and behavior with individual classes, and even primitives such as integer and Boolean values (for example, to support different ways to create instances). Smalltalk-80 was the first language variant made available outside of PARC. In 1981, it was shared with Tektronix, Hewlett-Packard, Apple Computer, and DEC for review and debugging on their platforms. The August 1981 issue of Byte Magazine was devoted to Smalltalk-80 and brought its ideas to a large audience. Several books on Smalltalk-80 were also published. Smalltalk-80 became the basis for all future commercial versions of Smalltalk. The final release of Smalltalk-80 Version 1 was in November 1981. Xerox only distributed Version 1 to Apple, DEC, HP, and Tektronix, but these companies were allowed unrestricted redistribution via any system they built. This encouraged the wide spread of Smalltalk. Later, in 1983, Xerox released Smalltalk-80 Version 2. This version was generally available to the public, although under a restrictive license. Versions 1 and 2 were fairly similar, although Version 2 did have some added features such as a spelling corrector. Each release consisted of a virtual image (platform-independent file with object definitions) and a virtual machine specification. ANSI Smalltalk has been the standard language reference since 1998. Two currently popular Smalltalk implementation variants are descendants of those original Smalltalk-80 images. Squeak is an open source implementation derived from Smalltalk-80 Version 1 by way of Apple Smalltalk. VisualWorks is derived from Smalltalk-80 version 2 by way of Smalltalk-80 2.5 and ObjectWorks (both products of ParcPlace Systems, a Xerox PARC spin-off company formed to bring Smalltalk to the market). 
As an interesting link between generations, in 2001, Vassili Bykov implemented Hobbes, a virtual machine running Smalltalk-80 inside VisualWorks. (Dan Ingalls later ported Hobbes to Squeak.) During the late 1980s to mid-1990s, Smalltalk environments, including support, training and add-ons, were sold by two competing organizations: ParcPlace Systems and Digitalk, both California-based. ParcPlace Systems tended to focus on the Unix/Sun Microsystems market, while Digitalk focused on Intel-based PCs running Microsoft Windows or IBM's OS/2. Both firms struggled to take Smalltalk mainstream due to Smalltalk's substantial memory needs, limited run-time performance, and initial lack of supported connectivity to SQL-based relational database servers. While the high price of ParcPlace Smalltalk limited its market penetration to mid-sized and large commercial organizations, the Digitalk products initially tried to reach a wider audience with a lower price. IBM initially supported the Digitalk product, but then entered the market with a Smalltalk product in 1995 named VisualAge/Smalltalk. Easel introduced Enfin at this time on Windows and OS/2. Enfin became far more popular in Europe, as IBM introduced it into IT shops before their development of IBM Smalltalk (later VisualAge). Enfin was later acquired by Cincom Systems, and is now sold under the name ObjectStudio, and is part of the Cincom Smalltalk product suite. In 1995, ParcPlace and Digitalk merged into ParcPlace-Digitalk and then rebranded in 1997 as ObjectShare, located in Irvine, California. ObjectShare (NASDAQ: OBJS) was traded publicly until 1999, when it was delisted and dissolved. The merged firm never managed to find an effective response to Java in terms of market positioning, and by 1997 its owners were looking to sell the business. 
In 1999, Seagull Software acquired the ObjectShare Java development lab (including the original Smalltalk/V and Visual Smalltalk development team), and still owns Visual Smalltalk, although worldwide distribution rights for the Smalltalk product remained with ObjectShare, who then sold them to Cincom. VisualWorks was sold to Cincom and is now part of Cincom Smalltalk. Cincom has backed Smalltalk strongly, releasing multiple new versions of VisualWorks and ObjectStudio each year since 1999. Cincom, GemTalk, and Instantiations continue to sell Smalltalk environments. IBM ended VisualAge Smalltalk, having decided in the late 1990s to back Java instead; as of 2005, the product is supported by Instantiations, Inc., which renamed it VA Smalltalk (VAST Platform) and continues to release new versions yearly. The open Squeak implementation has an active community of developers, including many of the original Smalltalk community, and was used to provide the Etoys environment on the One Laptop per Child (OLPC) project, the Croquet Project (a toolkit for developing collaborative applications), and the Open Cobalt virtual world application. GNU Smalltalk is a free software implementation of a derivative of Smalltalk-80 from the GNU project. Pharo Smalltalk is a fork of Squeak oriented toward research and use in commercial environments. As of 2016, a significant development that has spread across all Smalltalk environments is the increasing use of two web frameworks, Seaside and AIDA/Web, to simplify the building of complex web applications. Seaside has seen considerable market interest, with Cincom, GemStone, and Instantiations incorporating and extending it. Influences Smalltalk was one of many object-oriented programming languages based on Simula. 
Smalltalk is also one of the most influential programming languages. Virtually all of the object-oriented languages that came after—Flavors, CLOS, Objective-C, Java, Python, Ruby, and many others—were influenced by Smalltalk. Smalltalk was also one of the most popular languages for agile software development methods, rapid application development (RAD) or prototyping, and software design patterns. The highly productive environment provided by Smalltalk platforms made them ideal for rapid, iterative development. Smalltalk emerged from a larger program of Advanced Research Projects Agency-funded research that in many ways defined the modern world of computing. In addition to Smalltalk, working prototypes of things such as hypertext, GUIs, multimedia, the mouse, telepresence, and the Internet were developed by ARPA researchers in the 1960s. Alan Kay, one of the inventors of Smalltalk, also described a tablet computer he named the Dynabook, which resembles modern tablet computers. Smalltalk environments were often the first to develop what are now common object-oriented software design patterns. One of the most popular is the model–view–controller (MVC) pattern for user interface design. The MVC pattern enables developers to have multiple consistent views of the same underlying data. It is ideal for software development environments, where there are various views (e.g., entity-relation, dataflow, object model, etc.) of the same underlying specification, and for simulations or games where the underlying model may be viewed from various angles and levels of abstraction. In addition to the MVC pattern, the Smalltalk language and environment were influential in the history of the graphical user interface (GUI) and the what-you-see-is-what-you-get (WYSIWYG) user interface, font editors, and desktop metaphors for UI design. 
The powerful built-in debugging and object inspection tools that came with Smalltalk environments set the standard for all the integrated development environments, starting with Lisp Machine environments, that came after. Smalltalk uses several collection filter operators that rhyme with the "-ect" suffix, collect:, select:, inject:into:, et al. This was inspired by a line from the 1967 Arlo Guthrie monologue "Alice's Restaurant Massacree," in which Guthrie underwent a battery of being "injected, inspected, detected, infected, neglected and selected." Object-oriented programming As in other object-oriented languages, the central concept in Smalltalk-80 (but not in Smalltalk-72) is that of an object. An object is always an instance of a class. Classes are "blueprints" that describe the properties and behavior of their instances. For example, a GUI's window class might declare that windows have properties such as the label, the position and whether the window is visible or not. The class might also declare that instances support operations such as opening, closing, moving and hiding. Each particular window object would have its own values of those properties, and each of them would be able to perform operations defined by its class. A Smalltalk object can do exactly three things: hold state (references to other objects), receive a message from itself or another object, and, in the course of processing a message, send messages to itself or another object. The state an object holds is always private to that object. Other objects can query or change that state only by sending requests (messages) to the object to do so. Any message can be sent to any object: when a message is received, the receiver determines whether that message is appropriate. If the message is not understood by the object then the virtual machine sends the doesNotUnderstand: message with the original message as an argument, and the default implementation of doesNotUnderstand: raises an exception that if not caught opens the system's debugger. 
Alan Kay has commented that despite the attention given to objects, messaging is the most important concept in Smalltalk: "The big idea is 'messaging'—that is what the kernel of Smalltalk/Squeak is all about (and it's something that was never quite completed in our Xerox PARC phase)." Unlike most other languages, Smalltalk code can be modified while the system is running. Live coding and applying fixes 'on-the-fly' is a dominant programming methodology for Smalltalk and is one of the main reasons for its productivity. Smalltalk is a "pure" object-oriented programming language, meaning that, unlike C++ and Java, there are no primitive types. All values are represented as objects and computation on integers uses message sending just like any other object. In Smalltalk, types such as integers, Booleans and characters are also objects, in the sense that they are instances of corresponding classes, and operations on them are invoked by sending messages. For efficiency and generality, integers are implemented by four classes: Integer, the abstract superclass of all integers; SmallInteger, whose instances fit in a machine word, for example having a 61-bit signed range in a 64-bit implementation; and LargePositiveInteger and LargeNegativeInteger, the last two being vectors of bytes. Consequently, Smalltalk can evaluate 52 factorial to produce 80658175170943878571660636856403766975289505440883277824000000000000. The transition from small to large integers is transparent to the programmer; variables do not require type declarations. This makes the system both concise and flexible. A programmer can change or extend (through subclassing) the classes that implement what in other languages would be primitive values, so that new behavior can be defined for their instances—for example, to implement new control structures—or even so that their existing behavior will be changed. 
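The transparent transition between integer classes described above can be sketched as follows (standard Smalltalk-80 behavior; the printed results are noted in comments):

```smalltalk
"Integer arithmetic overflows transparently into large integers."
52 factorial.
"=> 80658175170943878571660636856403766975289505440883277824000000000000"

"No type declarations: the same variable can hold either representation."
| n |
n := 2 raisedTo: 30.   "fits in a SmallInteger on a 64-bit system"
n := n * n * n.        "silently becomes a LargePositiveInteger"
```

No conversion messages are needed at the boundary; the virtual machine chooses the representation.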
This fact is summarized in the commonly heard phrase "In Smalltalk everything is an object", which may be more accurately expressed as "all values are objects", as variables are not. Since all values are objects, classes are also objects. Each class is an instance of the metaclass of that class. Metaclasses in turn are also objects, and are all instances of a class named Metaclass. Classes contain method dictionaries that map selectors (the equivalent of function or procedure names in other languages) to method objects, objects that are executed to evaluate messages. Classes inherit from other classes, with either Object or ProtoObject at the root of the class hierarchy. Sending a message to an object at the most abstract involves fetching the class of the receiver (the object being sent the message) and looking up the message's selector in the class's method dictionary, followed by the superclass and so on until the method is found or doesNotUnderstand is sent. Smalltalk virtual machines use various techniques to speed up message lookup so the system provides both a simple consistent message binding mechanism and good efficiency. Code blocks—Smalltalk's way of expressing anonymous functions—are also objects. They have a very lightweight syntax and are used throughout the system to implement control structures, especially for the Collection hierarchy. Reflection Reflection is a feature of having a meta-model as Smalltalk does. The meta-model is the part of the system that implements the programming system itself, and developers can use the meta-model to do things like walk through, examine, and modify code in the running system, or find all the instances of a certain kind of structure (e.g., all instances of the Method class in the meta-model). Smalltalk-80 is a totally reflective system. Smalltalk-80 provides both structural and computational reflection. Smalltalk is a structurally reflective system whose structure is defined by Smalltalk-80 objects. 
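The class and metaclass relationships just described can be explored interactively; the expressions below are standard Smalltalk-80, with the usual printed results shown in comments:

```smalltalk
3 class.                 "=> SmallInteger"
3 class class.           "=> SmallInteger class   (the metaclass of SmallInteger)"
3 class class class.     "=> Metaclass            (every metaclass is an instance of Metaclass)"
SmallInteger superclass. "=> Integer"
```

Evaluating such expressions in a workspace is the normal way to inspect the meta-model.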
The classes and methods that define the system are also objects and fully part of the system that they help define. The Smalltalk compiler, which is itself written in Smalltalk and exists alongside all the other code in the system, compiles textual source code into method objects, typically instances of CompiledMethod. These get added to classes by storing them in a class's method dictionary. The part of the class hierarchy that defines classes can add new classes to the system. The system is extended by running Smalltalk-80 code that creates or defines classes and methods. In this way a Smalltalk-80 system is a "living" system, carrying around the ability to extend itself at run time. One can even extend the compiler at run time; indeed this is how the compiler is developed and maintained. Since the classes are objects, they can be asked questions such as "what methods do you implement?" or "what fields/slots/instance variables do you define?". So objects can easily be inspected, copied, (de)serialized and so on with generic code that applies to any object in the system. Smalltalk-80 also provides computational reflection, the ability to observe the computational state of the system. In languages derived from the original Smalltalk-80, the current activation of a method is accessible as an object named via a pseudo-variable (one of the six reserved words), thisContext, which corresponds to a stack frame in conventional language implementations, and is called a "context". Sending a message is done within some context, and to evaluate the message another context is created, with the original context recorded as the sender of the new one. In this way the stack is a linked list of context objects, and the debugger is essentially an inspector of this spaghetti stack. By sending messages to thisContext a method activation can ask questions like "who sent this message to me?". 
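A small illustration of this computational reflection via thisContext (the method name exampleMethod is hypothetical; exact printed forms vary by dialect):

```smalltalk
"Inside any method, the current activation is an ordinary object."
exampleMethod
    Transcript show: thisContext printString; cr.        "the current context"
    Transcript show: thisContext sender printString; cr. "the activation that sent this message"
    ^ self
```

Walking the sender chain in this way is exactly what the debugger does when it displays the stack.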
These facilities make it possible to implement coroutines or Prolog-like back-tracking without modifying the virtual machine. The exception system is implemented using this facility. One of the more interesting uses of this is in the Seaside web framework, which relieves the programmer of dealing with the complexity of a web browser's back button by storing continuations for each edited page and switching between them as the user navigates a web site. Programming the web server using Seaside can then be done in a more conventional programming style. As with message sending, Smalltalk-80 virtual machines optimize away the expensive use of contexts internally, providing the illusion and flexibility of a spaghetti stack without most of its costs. Essentially, context objects are created lazily as required, for example when a message is sent to the thisContext variable. An example of how Smalltalk can use reflection is the mechanism for handling errors. When an object is sent a message that it does not implement, the virtual machine sends the object the doesNotUnderstand: message with a reification of the message as an argument. The message (another object, an instance of Message) contains the selector of the message and an Array of its arguments. In an interactive Smalltalk system the default implementation of doesNotUnderstand: is one that opens an error window (a notifier) reporting the error to the user. Through these reflective facilities the user can examine the context in which the error occurred, redefine the offending code, and continue, all within the system. By creating a class that understands (implements) only doesNotUnderstand:, one can create an instance that can intercept any message sent to it via its doesNotUnderstand: method. Such instances are called transparent proxies. 
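A minimal transparent-proxy sketch along these lines, assuming Squeak/Pharo-style class-definition messages; the class name LoggingProxy and its instance variable target are hypothetical:

```smalltalk
"Subclassing ProtoObject keeps the proxy nearly method-free,
so almost every message falls through to doesNotUnderstand:."
ProtoObject subclass: #LoggingProxy
    instanceVariableNames: 'target'
    classVariableNames: ''
    package: 'Examples'.

"Intercept any message, log its selector, then forward it to the wrapped object."
LoggingProxy >> doesNotUnderstand: aMessage
    Transcript show: 'intercepted: ', aMessage selector; cr.
    ^ aMessage sendTo: target
```

A database or remote-object layer would replace the logging line with code that faults the target in or marshals the message across the wire.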
Such proxies can then be used to implement a number of facilities such as distributed Smalltalk where messages are exchanged between multiple Smalltalk systems, database interfaces where objects are transparently faulted out of a database, promises, etc. The design of distributed Smalltalk influenced such systems as CORBA. Syntax Smalltalk-80 syntax is rather minimalist, based on only a handful of declarations. In fact, there are only five "keywords" in Smalltalk, names of pseudo-variables with a special meaning: true, false, nil, self, and super. These are properly termed pseudo-variables, identifiers that follow the rules for variable identifiers but denote bindings that a programmer cannot change. The true, false, and nil pseudo-variables are singleton instances. self and super refer to the receiver of a message within a method activated in response to that message, but sends to super are looked up in the superclass of the method's defining class rather than the class of the receiver, which allows methods in subclasses to invoke methods of the same name in superclasses. The only built-in language constructs are message sends, assignment, method return, and literal syntax for some objects, including block literals (closures). From its origins as a language for children of all ages, standard Smalltalk syntax uses punctuation in a manner more like English than mainstream coding languages. The remainder of the language, including control structures for conditional evaluation and iteration, is implemented on top of the built-in constructs by the standard Smalltalk class library (for performance reasons, implementations may recognize and treat as special some of those messages; however, this is only an optimization and is not coded into the language syntax). The pseudo-variable thisContext may have been added in some implementations, but is not mentioned in either Smalltalk-80 or the ANSI standard. 
Pseudo-variables in general make available values supplied by the system, such as the arguments passed to messages or blocks; the content of these variables is read-only and cannot be modified. The adage that "Smalltalk syntax fits on a postcard" may have originated in Alan Kay's original conception of the language, as he has related in many public lectures, or it may refer to a code snippet by Ralph Johnson demonstrating all the basic standard syntactic elements of methods. The most common objects can be written as literal values in Smalltalk-80 methods. Numbers may be written in any radix, including binary and hexadecimal: the number before the 'r' is the radix or base, and the base does not have to be a power of two; for example, 36rSMALLTALK is a valid number equal to 80738163270632 decimal. Characters are written by preceding them with a dollar sign. Strings are sequences of characters enclosed in single quotes; to include a quote in a string, escape it using a second quote. Double quotes do not need escaping, since single quotes delimit a string. Two equal strings (strings are equal if they contain all the same characters) can be different objects residing in different places in memory. In addition to strings, Smalltalk has a class of character sequence objects named Symbol. Symbols are guaranteed to be unique—there can be no two equal symbols which are different objects. Because of that, symbols are very cheap to compare and are often used for language artifacts such as message selectors (see below). Symbols are written as # followed by a string literal; if the sequence does not include whitespace or punctuation characters, the quotes can be omitted. Arrays are written as a literal list preceded by #; one can, for example, define an array of four integers, or a seven-element array whose first element is a literal array, second element a byte array, third element the string 'four', and so on. 
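The literal forms described above can be illustrated as follows; these are reconstructed examples consistent with the descriptions, not the article's original snippets (the contents of the seven-element array beyond its first three elements are an assumption):

```smalltalk
42.                "a decimal integer"
2r1010.            "binary: radix 2, equal to 10"
16rFF.             "hexadecimal: radix 16, equal to 255"
36rSMALLTALK.      "radix 36, equal to 80738163270632"
$a.                "the character a"
'It''s a string'.  "a quote inside a string is escaped by doubling it"
'He said "hi"'.    "double quotes need no escaping"
#size.             "a symbol: unique, cheap to compare"
#'hello world'.    "a symbol containing whitespace, written with quotes"
#(1 2 3 4).        "a literal array of four integers"
#(#(1 2) #[1 2] 'four' $a 5 6.0 #seven).
                   "a seven-element array: array, byte array, string, ..."
[:a :b | a < b].   "a block comparing any two objects that understand <"
```

Each of these expressions evaluates to an ordinary object that can receive messages like any other.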
Many implementations also support a literal syntax for ByteArrays, written as a bracketed list of integers preceded by #. And last but not least there are blocks (anonymous function literals); a block taking two arguments can, for example, compare any two objects which understand "less than", such as numbers and strings. Blocks are explained in detail further in the text. Many Smalltalk dialects implement additional syntaxes for other objects, but the ones above are the essentials supported by all. The two kinds of variables commonly used in Smalltalk are instance variables and temporary variables. Other variables and related terminology depend on the particular implementation. For example, VisualWorks has class shared variables and namespace shared variables, while Squeak and many other implementations have class variables, pool variables and global variables. Temporary variables in Smalltalk are declared inside a method (see below), at the top of the method, as names separated by spaces and enclosed by vertical bars. A single declaration introduces a temporary variable, say index, which initially contains the value nil; multiple variables, say index and vowels, may be declared within one set of bars. All variables are initialized: to nil, except the indexed variables of strings, which are initialized to the null character, and those of ByteArrays, which are initialized to 0. A variable is assigned a value via the ':=' syntax; for example, the string 'aeiou' can be assigned to the previously declared vowels variable. The string is an object (a sequence of characters between single quotes is the syntax for literal strings), created by the compiler at compile time. In the original ParcPlace image, the glyph of the underscore character ⟨_⟩ appeared as a left-facing arrow ⟨←⟩ (as in the 1963 version of the ASCII code). Smalltalk originally accepted this left arrow as the only assignment operator. 
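The declaration and assignment forms just described look like this; these are reconstructed fragments (each line stands alone, not a single method body):

```smalltalk
"A temporary variable, initially nil:"
| index |

"Multiple temporaries in one set of bars:"
| index vowels |

"Assignment with := (historically written as a left arrow):"
vowels := 'aeiou'.
```

In a real method the bar-delimited declaration appears once, at the top, before any statements.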
Some modern code still contains what appear to be underscores acting as assignments, harking back to this original usage. Most modern Smalltalk implementations accept either the underscore or the colon-equals syntax. The message is the most fundamental language construct in Smalltalk; even control structures are implemented as message sends. Smalltalk adopts by default a dynamic dispatch and single dispatch strategy (as opposed to multiple dispatch, used by some other object-oriented languages). There are three kinds of message sends: unary messages, which take no arguments and are written as a single identifier, such as class and size; binary messages, which for example are used for arithmetic, such as a < b and a ~= b; and keyword messages, where a keyword followed by a colon precedes each argument in the message, so that a between: b and: c sends the #between:and: message to a with arguments b and c. Unary messages have higher precedence than binary messages, which have higher precedence than keyword messages, and evaluation is strictly left-to-right. There is no arithmetic precedence: 1 + 2 * 3 evaluates to 9, not to 7. Consider sending the message 'factorial' to the number 42. In this situation 42 is called the message receiver, while 'factorial' is the message selector. The receiver responds to the message by returning a value (in this case the factorial of 42). Among other things, the result of the message can be assigned to a variable. "factorial" here is what is called a unary message because only one object, the receiver, is involved. Messages can carry additional objects as arguments; in one such expression two objects are involved: 2 as the receiver and 4 as the message argument, and the message result, or in Smalltalk parlance the answer, is 16. Such messages are called keyword messages. A message can have more arguments still, as in a send which answers the index of the character 'o' in the receiver string, starting the search from index 6. 
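Reconstructed versions of the sends described in this passage; the receiver string 'hello world' and the selector raisedTo: are assumptions consistent with the stated answers:

```smalltalk
42 factorial.                              "a unary message"
| x |
x := 42 factorial.                         "the answer assigned to a variable"
2 raisedTo: 4.                             "a keyword message; answers 16"
'hello world' indexOf: $o startingAt: 6.   "two keywords; answers 8"
```

In the last send the receiver is the string, $o is the first argument, and 6 is the second.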
The selector of this message is "indexOf:startingAt:", consisting of two pieces, or keywords. Such interleaving of keywords and arguments is meant to improve readability of code, since arguments are explained by their preceding keywords. For example, an expression to create a rectangle using a C++ or Java-like syntax might be written as: It's unclear which argument is which. By contrast, in Smalltalk, this code would be written as: The receiver in this case is "Rectangle", a class, and the answer will be a new instance of the class with the specified width and height. Finally, most of the special (non-alphabetic) characters can be used as what are called binary messages. These allow mathematical and logical operators to be written in their traditional form: which sends the message "+" to the receiver 3 with 4 passed as the argument (the answer of which will be 7). Similarly, is the message ">" sent to 3 with argument 4 (the answer of which will be false). The programmer is free to define new binary selectors just as they are free to define novel unary and keyword messages. Notice that the Smalltalk-80 language itself does not imply the meaning of those operators. The outcome of the above is only defined by how the receiver of the message (in this case a Number instance) responds to messages "+" and ">". A side effect of this mechanism is operator overloading. A message ">" can also be understood by other objects, allowing the use of expressions of the form "a > b" to compare them. Smalltalk is an expression-based language. Every statement, including control constructs, has a value, which is some object. An expression can include multiple message sends. In this case expressions are parsed according to a simple order of precedence. Unary messages have the highest precedence, followed by binary messages, followed by keyword messages. For example: is evaluated as follows: The answer of the last message sent is the result of the entire expression. 
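The comparison drawn above can be sketched as follows (the exact argument values are illustrative):

```smalltalk
"C++/Java-like syntax, where it is unclear which argument is which:
     new Rectangle(100, 200);"

Rectangle width: 100 height: 200   "Smalltalk: each argument labeled by its keyword"

3 + 4   "binary message + sent to 3 with argument 4; answers 7"
3 > 4   "binary message >; answers false"
```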
Parentheses can alter the order of evaluation when needed. For example, will change the meaning so that the expression first computes "3 factorial + 4" yielding 10. That 10 then receives the second "factorial" message, yielding 3628800. 3628800 then receives "between:and:", answering false. Because the meaning of binary messages is not coded into Smalltalk-80 syntax, all of them are considered to have equal precedence and are evaluated simply from left to right. Because of this, the meaning of Smalltalk expressions using binary messages can be different from their "traditional" interpretation: is evaluated as "(3 + 4) * 5", producing 35. To obtain the expected answer of 23, parentheses must be used to explicitly define the order of operations: Unary messages can be chained by writing them one after another: which sends "factorial" to 3, then "factorial" to the result (6), then "log" to the result (720), producing the result 2.85733. A series of expressions can be written as in the following (hypothetical) example, each separated by a period (period is a statement separator, not a statement terminator). This example first creates a new instance of class Window, stores it in a variable, and then sends two messages to it. If a series of messages are sent to the same receiver as in the example above, they can also be written as a cascade with individual messages separated by semicolons: This rewrite of the earlier example as a single expression avoids the need to store the new window in a temporary variable. According to the usual precedence rules, the unary message "new" is sent first, and then "label:" and "open" are sent to the receiver of "new". A block of code (an anonymous function) can be expressed as a literal value (which is an object, since all values are objects). This is achieved with square brackets: Where :params is the list of parameters the code can take. 
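The precedence, chaining, cascade, and block-literal forms discussed above can be sketched as follows (the Window class and its label: and open selectors are the hypothetical example the text describes):

```smalltalk
3 factorial + 4 factorial between: 10 and: 100
    "unary messages first: 6 + 24 = 30; 30 between: 10 and: 100 answers true"

(3 factorial + 4) factorial between: 10 and: 100
    "(6 + 4) factorial = 3628800, which is not between 10 and 100: false"

3 + 4 * 5      "left to right: (3 + 4) * 5 = 35"
3 + (4 * 5)    "parentheses restore arithmetic order: 23"

3 factorial factorial log   "chained unary messages"

| window |
window := Window new.
window label: 'Hello'.
window open.

Window new label: 'Hello'; open.   "the same, written as a cascade"

[:params | "message expressions" ]   "general shape of a block literal"
```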
This means that the Smalltalk code: can be understood as: or, expressed in lambda terms, as: and can be evaluated as: or, in lambda terms, as: The resulting block object can form a closure: it can access the variables of its enclosing lexical scopes at any time. Blocks are first-class objects. Blocks can be executed by sending them the value message. Compound variations exist to provide parameters to the block, e.g. value:value: and valueWithArguments:. The literal representation of blocks was an innovation which allowed certain code to be significantly more readable: it allowed algorithms involving iteration to be coded in a clear and concise way. Code that would typically be written with loops in some languages can be written concisely in Smalltalk using blocks, sometimes in a single line. But more importantly, blocks allow control structures to be expressed using messages and polymorphism, since blocks defer computation and polymorphism can be used to select alternatives. So if-then-else in Smalltalk is written and implemented as ordinary methods for evaluation in the classes True and False. This is related to functional programming, wherein patterns of computation (here selection) are abstracted into higher-order functions. For example, the message select: on a Collection is equivalent to the higher-order function filter on an appropriate functor. Control structures Control structures do not have special syntax in Smalltalk. They are instead implemented as messages sent to objects. For example, conditional execution is implemented by sending the message ifTrue: to a Boolean object, passing as an argument the block of code to be executed if and only if the Boolean receiver is true. The two subclasses of Boolean both implement ifTrue:, where the implementation in subclass True always evaluates the block and the implementation in subclass False never evaluates the block. 
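A sketch of the block-evaluation example and the Boolean implementation of if-then-else described above (the block [:x | x + 1] is an assumed stand-in for the elided example, and the ClassName >> selector notation is a common convention for showing which class defines a method):

```smalltalk
| addOne |
addOne := [:x | x + 1].    "a block; in lambda terms, λx. x + 1"
addOne value: 3.           "evaluates the block with argument 3; answers 4"

"True and False each implement the conditional selectors themselves:"
True >> ifTrue: trueBlock ifFalse: falseBlock
    ^ trueBlock value      "a True receiver always evaluates the true block"

False >> ifTrue: trueBlock ifFalse: falseBlock
    ^ falseBlock value     "a False receiver always evaluates the false block"
```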
The following code demonstrates this: Blocks are also used to implement user-defined control structures, enumerators, visitors, exception handling, pluggable behavior and many other patterns. For example: In the last line, the string is sent the message select: with an argument that is a code block literal. The code block literal will be used as a predicate function that should answer true if and only if an element of the String should be included in the Collection of characters that satisfy the test represented by the code block that is the argument to the "select:" message. A String object responds to the "select:" message by iterating through its members (by sending itself the message "do:"), evaluating the selection block ("aBlock") once with each character it contains as the argument. When evaluated (by being sent the message "value: each"), the selection block (referenced by the parameter "aBlock", and defined by the block literal "[:aCharacter | aCharacter isVowel]"), answers a Boolean, which is then sent "ifTrue:". If the Boolean is the object true, the character is added to a string to be returned. Because the "select:" method is defined in the abstract class Collection, it can also be used like this: The exception handling mechanism uses blocks as handlers (similar to CLOS-style exception handling): The exception handler's "ex" argument provides access to the state of the suspended operation (stack frame, line-number, receiver and arguments etc.) and is also used to control how the computation is to proceed (by sending one of "ex proceed", "ex reject", "ex restart" or "ex return"). Classes This is a stock class definition: Often, most of this definition will be filled in by the environment. Notice that this is a message to the Object class to create a subclass named MessagePublisher. In other words: classes are first-class objects in Smalltalk which can receive messages just like any other object and can be created dynamically at execution time. 
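Sketches of the three examples discussed above (select: with a block predicate, block-based exception handling, and a stock class definition), in standard Smalltalk-80 style; the receiver string, the doSomethingRisky selector, and the category name are illustrative assumptions:

```smalltalk
| aString vowels |
aString := 'This is a string'.
vowels := aString select: [:aCharacter | aCharacter isVowel].
    "answers the collection of vowels in aString"

"Exception handling: a protected block and a handler block"
[ self doSomethingRisky ]
    on: Error
    do: [:ex | Transcript show: ex messageText. ex return].

"A stock class definition: a message sent to the class Object"
Object subclass: #MessagePublisher
    instanceVariableNames: ''
    classVariableNames: ''
    category: 'Smalltalk Examples'
```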
When an object receives a message, a method matching the message name is invoked. The following code defines a method publish, and so defines what will happen when this object receives the 'publish' message. The following method demonstrates receiving multiple arguments and returning a value: The method's name is #quadMultiply:and:. The return value is specified with the ^ operator. Objects are responsible for determining dynamically at runtime which method to execute in response to a message, while in many languages this may be (sometimes, or even always) determined statically at compile time. The following code: creates (and returns) a new instance of the MessagePublisher class. This is typically assigned to a variable: However, it is also possible to send a message to a temporary, anonymous object: Integrated Development Environment Smalltalk is one of the first systems to be based around an Integrated Development Environment. There is a rich variety of tools to support code development and other activities, such as graphics and music. Smalltalk was the first system in which the modern desktop paradigm of Windows, Icons, Menus, and Pointers (WIMP) was created. Although pointers had already been invented, Smalltalk was the first system to implement overlapping windows and pop-up menus. While there are several programming tools, we shall describe the following five major ones. The images of the tools are from a 2024 Squeak system. Smalltalk-80 derived systems organize classes within "system categories", such as Kernel-Numbers, Kernel-Objects, Collections-Abstract, Collections-Sequenceable, etc., and within classes methods are organized in named categories such as accessing, arithmetic, instance creation, etc. 
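A sketch of the method definitions and instance-creation expressions described above (the method bodies follow the text; the Transcript greeting in publish is an assumption):

```smalltalk
publish
    "Defines what happens when a MessagePublisher receives #publish"
    Transcript show: 'Hello World!'

quadMultiply: i1 and: i2
    "Multiplies the two arguments, then the result by 4; ^ returns the value"
    | mul |
    mul := i1 * i2.
    ^mul * 4

MessagePublisher new                    "creates (and returns) a new instance"

| publisher |
publisher := MessagePublisher new.      "typically assigned to a variable"

MessagePublisher new publish            "message to a temporary, anonymous object"
```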
From this follows the classic five-paned browser, with four panes in the upper half of the window containing, from left to right, the list of system categories; when one is selected, the second pane displays the list of classes in that category; when a class is selected, the third pane displays the list of message categories in the selected class; and when a message category is selected, the fourth pane displays the selectors of the methods in the selected category in the selected class. When one of the selectors in the fourth pane is selected, the source for that method is displayed in the fifth pane. If only a category is selected and not a method, the fifth pane shows a template for defining a new method. If a system category is selected but no class, a template for creating a class in the category is displayed. Various pop-up menus allow one to query the tool, such as searching for a class by name, finding all senders of a selected message, or all implementors of the message, and so on. In this way the browser is both a code reading and system exploration tool and a code authoring tool. A Workspace is a simple text editor editing a single string. One can type arbitrary text in the workspace, including Smalltalk expressions. On the pop-up menu, "do it" (evaluate the selected expression), "print it" (evaluate the selected expression and insert the print string of the result immediately after the selection), and "inspect it" (open an inspector on the result of the evaluation of the selected expression, see "Inspector" below) are three oft-used actions. Note that the fifth pane in the Browser is also a workspace, so that one can evaluate expressions and insert their results while editing method definitions, and a common practice is to include evaluable expressions, typically examples, in comments in a method, because almost everywhere the text of a method is shown (for example in the debugger) code is executable as in a workspace. 
Both workspaces and the browser's text panes are typically syntax highlighted. By using blocks to separate different expressions, one can have several syntax highlighted expressions, each with their own temporaries, in a single workspace. The Transcript is a special workspace associated with the global Transcript. So evaluating Transcript print: 52 factorial; cr; flush causes 80658175170943878571660636856403766975289505440883277824000000000000 followed by a newline to appear on the Transcript window. The Transcript therefore serves as a place to emit logging messages, although it can also function as a Workspace. There are various inspectors, some tailored to displaying different kinds of object. The most basic inspector has two panes. To the left is a list of the object itself (with the label "self"), followed by the instance variables in the object, which will include numbered instance variables in sequences such as strings and arrays. To the right is a workspace pane. Selecting a name in the list replaces the workspace's contents with a print string of the selected variable. Editing and "accepting" the text in the workspace pane when an instance variable is selected will assign the result of the evaluation to the selected variable. One can "drill down" by using the "inspect" command on the list menu, which will apply to the selected instance variable. More sophisticated inspectors (e.g. explorers) support finder-like tree access so that object structure can be traversed without opening additional windows. The default response to an unhandled exception is to open a Notifier, which is a window containing a stack backtrace of the first few activations, and buttons such as "Debug", "Proceed", "Close", etc. If the programmer chooses "Debug" then the full debugger opens. This has six panes. At the top is the stack window, containing a list of the contexts in the stack. 
Selecting a context causes the middle pane to display the text of the context's method, and to highlight the current expression within the method. Selecting the top context will display the method raising the exception and the message raising the exception will be highlighted. Selecting a context causes the bottom four panes to be updated. The bottom left two panes are the receiver inspector, that inspect the receiver of the selected message. The bottom right two panes are the context inspector that show the argument and temporary variable names in the selected context and allow display and modification of these variables. Sending the message self halt causes an exception which opens a notifier, providing a simple breakpoint facility (typically breakpoint facilities provide more than just the simple halt, but it was the first such facility). Workspaces also provide a "debug it" evaluator which opens a debugger on the selected expression positioned at the first message send within the expression. So selecting 52 factorial and choosing "debug it" from the pop-up menu opens a debugger with the "doit context" selected and the factorial selector highlighted. The debugger provides buttons to do "step into", "step over", etc. Hence by choosing "step into" one can explore the evaluation of 52 factorial. In this way the debugger provides an inspector of a process, allowing one to explore a halted computation. If an exception results from a doesNotUnderstand:, or subclassResponsibility send, then the notifier will include a "Create" button, allowing the programmer to choose where in the receiver's hierarchy to define an "initial draft" of the method to be implemented. Redefining a method in the debugger causes the selected context to reset back to the first statement (arguments are not modifiable in Smalltalk so this gets the execution state back to the start of a method). In this way the debugger supports live programming, defining methods as the computation proceeds. 
This is an extremely productive and enjoyable way to program: everything in the system is at your fingertips. One has the full power of workspaces to evaluate subexpressions, and the browser to search for supporting code, as one programs. Clicking on the Debug button opens the Notifier into a Debugger, allowing inspecting the call stack and editing and continuing from any method activation. In this case the Notifier has created a template of the missing method that the programmer can edit, compile, and then continue the computation. Hello World example The Hello world program is used by virtually all introductory texts on new programming languages as the first program learned, to show the most basic syntax and environment of the language. For Smalltalk, the program is extremely simple to write. In the following code, the message "show:" is sent to the object "Transcript" with the String literal 'Hello, world!' as its argument. Invocation of the "show:" method causes the characters of its argument (the String literal 'Hello, world!') to be displayed in the transcript ("terminal") window. To see the results of this example, a Transcript window must be open. Image-based persistence Most popular programming systems separate static program code (in the form of class definitions, functions or procedures) from dynamic, or run time, program state (such as objects or other forms of program data). They load program code when a program starts, and any prior program state must be recreated explicitly from configuration files or other data sources. Any settings the program (and programmer) does not explicitly save must be set up again for each restart. A traditional program also loses much useful document information each time it saves a file, quits, and reloads, losing details such as undo history or cursor position. Image-based systems do not lose all that just because a computer is turned off or an OS is updated. 
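The Hello world code itself did not survive extraction; in Smalltalk it is the single expression:

```smalltalk
Transcript show: 'Hello, world!'.
```

Evaluating this (for example with "do it" in a Workspace) displays the greeting in the open Transcript window.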
Many Smalltalk systems, however, do not differentiate between program data (objects) and code (classes). In fact, classes are objects. Thus, most Smalltalk systems store the entire program state (including both Class and non-Class objects) in an image file. The image can then be loaded by the Smalltalk virtual machine to restore a Smalltalk system to a prior state. This was inspired by FLEX, a language created by Alan Kay and described in his M.Sc. thesis. Smalltalk images are similar to (restartable) core dumps and can provide the same functionality as core dumps, such as delayed or remote debugging with full access to the program state at the time of error. Other languages that model application code as a form of data, such as Lisp, often use image-based persistence as well (see Emacs, for example). This method of persistence is powerful for rapid development because all the development information (e.g. parse trees of the program) is saved, which facilitates debugging. However, it also has serious drawbacks as a true persistence mechanism. For one thing, developers may often want to hide implementation details and not make them available in a run time environment. For reasons of legality and maintenance, allowing anyone to modify a program at run time inevitably introduces complexity and potential errors that would not be possible with a compiled system that exposes no source code in the run time environment. Also, while the persistence mechanism is easy to use, it lacks the true persistence abilities needed for most multi-user systems. The most obvious is the ability to do transactions with multiple users accessing the same database in parallel. Level of access Everything in Smalltalk-80, unless customised to avoid the possibility, is available for modification from within a running program. 
This means that, for example, the IDE can be changed in a running system without restarting it. In some implementations, the syntax of the language or the garbage collection implementation can also be changed on the fly. Even the statement true become: false is valid in Smalltalk, although executing it is not recommended except for demonstration purposes (see virtual machine, image-based persistence, and backups). Just-in-time compilation Smalltalk programs are usually compiled to bytecode, which is then interpreted by a virtual machine or dynamically translated into native machine code. The results of previous message lookups are cached in self-modifying machine code, resulting in very high-performance sends that can outperform the indirect function calls used for C++ virtual method calls. List of implementations OpenSmalltalk VM (OS VM) is a relatively high-performance implementation of the Smalltalk virtual machine on which several modern open-source Smalltalk dialects are based. The OS VM derives from the original Back-to-the-Future (BTTF) Squeak interpreter implemented by Dan Ingalls, Ted Kaehler, John Maloney and many other contributors. As with the BTTF VM, the OS VM is transpiled from the Smalltalk system in which it is developed (using a subset of Smalltalk named Slang) to native C language source code, which is in turn compiled for the specific platform and hardware architecture, in practice enabling cross-platform execution of Smalltalk images. The OS VM differs from the BTTF VM in several respects. The notable Smalltalk dialects based on the OS VM are: |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Maor_Farid#cite_note-11] | [TOKENS: 1458] |
Contents Maor Farid Dr. Maor Farid (Hebrew: מאור פריד; born April 20, 1992) is an Israeli scientist, engineer and artificial intelligence researcher at Massachusetts Institute of Technology, social activist, and author. He is the founder and CEO of Learn to Succeed (Hebrew: ללמוד להצליח), for the empowerment of youths from the Israeli socio-economic periphery and youths at risk, a regional manager of the Israeli center of ScienceAbroad at MIT, and an activist in the American Technion Society. He is an alumnus of Unit 8200, and a fellow of the Fulbright Program and the Israel Scholarship Educational Foundation [he]. Dr. Farid was elected to the Forbes 30 Under 30 list of 2019, and won the Moskowitz Prize for Zionism. Early life Maor was born in Ness Ziona, a city in central Israel, as the eldest son of parents from immigrant families of Mizrahi Jews from Iraq and Libya. Maor suffered from attention deficit hyperactivity disorder (ADHD) from a young age, and was classified as a problematic and violent student. His ADHD was diagnosed only after he began his university studies. However, inspired by his parents' background, he aspired to excel at school for a better future for his family. During elementary school, Maor competed in local quizzes about Jewish history and Zionism, which significantly shaped his identity and national perspective. Farid graduated high school with the highest GPA in his school. Later he was recruited to the Israel Defense Forces and drafted to the Brakim Program [he] – an excellence program of the Israeli Intelligence Corps for training leading R&D officers for the Israeli military and defense industry. Maor graduated the program with honors and was selected by the Israeli Prime Minister's Office and Unit 8200, where he served as an artificial intelligence researcher, officer, and commander. 
During his military service, he received various honors and awards, such as the Excellent Scientist Award, given to the top three academics serving in the Israel Defense Forces. In 2019, Farid completed his military service at the rank of captain. Education and academic career As part of the four-year Brakim Program, Maor completed his Bachelor's and Master's degrees at the Technion in Mechanical Engineering with honors. Then, he initiated his Ph.D. research as a collaboration with the Israel Atomic Energy Commission (IAEC) in parallel with his military service. The main goals of his Ph.D. research were predicting irreversible effects of major earthquakes on Israel's nuclear facilities, and improving their seismic resistance using energy absorption technologies. The mathematical models developed by Farid were able to forecast earthquake effects on facilities with major hazard potential, and predicted the failure of liquid storage tanks due to earthquakes that took place in Italy (2012) and Mexico (2017). The energy absorption technologies used increased the seismic resistance of those sensitive facilities by up to 90%. The research results were published in multiple papers in peer-reviewed academic journals and presented at international academic conferences. Later, this research expanded into an official collaboration between the Technion and the Shimon Peres Negev Nuclear Research Center, which aims to implement the findings on existing sensitive systems, and won funding of 1.5 million NIS from the Pazy foundation of the Israel Atomic Energy Commission and the Council for Higher Education. In 2017, Farid completed his Ph.D. as the youngest graduate at the Technion for that year, at the age of 24. At the graduation ceremonies, he honored his parents by having them receive the diplomas on his behalf. 
In the same year, he served as a lecturer at Ben-Gurion University in an original course he developed to address knowledge gaps he had identified in the Israeli defense industry. In 2018, Dr. Farid served as an artificial intelligence researcher on a data science team of Unit 8200, where he developed machine learning-based solutions for military and operational needs. In 2019, Farid won the Fulbright and the Israel Scholarship Educational Foundation scholarships, and was accepted to a post-doctoral position at Massachusetts Institute of Technology, where he develops real-time methods for predicting earthquake effects using machine learning techniques. In 2020, Farid was accepted to the Emerging Leaders Program at Harvard Kennedy School in Cambridge, Massachusetts. In the same year, he received the excellence research grant of the Israel Academy of Sciences and Humanities for leading his research in collaboration between MIT and the Technion. Social activism Farid's social activism focuses on empowering youths from disadvantaged backgrounds from an early age. From 2010 to 2015, he served as a mentor of a robotics team from Dimona in the FIRST Robotics Competition, a mathematics tutor in the "Aharai!" [he] program for high-school students at risk in Dimona and Be'er Sheva, and a mentor and private tutor of adolescents and reserve-duty soldiers from disadvantaged backgrounds. In 2010, he initiated the "Learn to Succeed" (Hebrew: ללמוד להצליח) project to mitigate social gaps in Israeli society by empowering youths from the social, economic, and geographical periphery toward excellence, self-fulfillment and formal education. In 2018, Learn to Succeed became an official non-profit organization. In the same year, Farid led a 150,000 NIS crowdfunding campaign to expand the organization to a national scale. 
In 2019, he published the book "Learn to Succeed", in which he describes his struggle with ADHD, the violent environment in which he grew up, and the transformation he went through from being a violent teenager to becoming the youngest Ph.D. graduate at the Technion. The book was given to more than two thousand youths at risk and became a top seller in Israel shortly after its publication. Maor dedicated the book to his parents and to the memory of his friend Captain Tal Nachman, who was killed in operational activity during his military service in 2014. The organization consists of hundreds of volunteers; gives full scholarships to STEM students from the periphery who serve as mentors of youths, both Jews and Arabs, from disadvantaged backgrounds; runs a hotline which gives online practical and mental support to hundreds of youths, parents and educators; initiates inspirational activities with a military orientation to increase the motivation of its teenage members for significant military service; and gives inspirational lectures to more than 5,000 youths each year. In 2019, Maor initiated a collaboration with Unit 8200 in which dozens of the program's members are interviewed by the unit. This opportunity is usually given to students with the highest grades in the matriculation exams in each class. In 2020, Dr. Farid established the ScienceAbroad center at MIT, aiming to strengthen the connections between Israeli researchers at the institute and the State of Israel. Moreover, he serves as a volunteer in the American Technion Society. Personal life Farid is married to Michal. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/J._Robert_Oppenheimer] | [TOKENS: 17071] |
Contents J. Robert Oppenheimer J. Robert Oppenheimer (born Julius Robert Oppenheimer /ˈɒpənhaɪmər/ ⓘ OP-ən-hy-mər; April 22, 1904 – February 18, 1967) was an American theoretical physicist who served as the director of the Manhattan Project's Los Alamos Laboratory during World War II. He is often called the "father of the atomic bomb" for his role in overseeing the development of the first nuclear weapons. Born in New York City, Oppenheimer obtained a degree in chemistry from Harvard University in 1925 and a doctorate in physics from the University of Göttingen in Germany in 1927, studying under Max Born. After research at other institutions, he joined the physics faculty at the University of California, Berkeley, where he was made a full professor in 1936. Oppenheimer made significant contributions to physics in the fields of quantum mechanics and nuclear physics, including the Born–Oppenheimer approximation for molecular wave functions; work on the theory of positrons, quantum electrodynamics, and quantum field theory; and the Oppenheimer–Phillips process in nuclear fusion. With his students, he also made major contributions to astrophysics, including the theory of cosmic ray showers, and the theory of neutron stars and black holes. In 1941, Oppenheimer was briefed about nuclear weapon design by Australian physicist Mark Oliphant. In 1942, Oppenheimer was recruited to work on the Manhattan Project, and in 1943 was appointed director of the project's Los Alamos Laboratory in New Mexico, tasked with developing the first nuclear weapons. His leadership and scientific expertise were instrumental in the project's success, and on July 16, 1945, he was present at the first test of the atomic bomb, Trinity. In August, the weapons were used on Japan in the atomic bombings of Hiroshima and Nagasaki, to date the only uses of nuclear weapons in conflict. 
In 1947, Oppenheimer was appointed director of the Institute for Advanced Study in Princeton, New Jersey, and chairman of the General Advisory Committee of the new United States Atomic Energy Commission (AEC). He lobbied for international control of nuclear power and weapons in order to avert an arms race with the Soviet Union, and later opposed the development of the hydrogen bomb, partly on ethical grounds. During the Second Red Scare, his stances, together with his past associations with the Communist Party USA, led to an AEC security hearing in 1954 and the revocation of his security clearance. He continued to lecture, write, and work in physics, and in 1963 received the Enrico Fermi Award for contributions to theoretical physics. The 1954 decision was vacated in 2022. Early life Oppenheimer was born Julius Robert Oppenheimer[note 1] into a non-observant Jewish family in New York City on April 22, 1904, to Ella (née Friedman), a painter, and Julius Seligmann Oppenheimer, a successful textile importer. Robert had a younger brother, Frank, who also became a physicist. His father was born in Hanau, when it was still part of the Hesse-Nassau province of the Kingdom of Prussia, and as a teenager made his way to the United States in 1888, without money, higher education, or English. He was hired by a textile company and within a decade was an executive there, eventually becoming wealthy. In 1912, the family moved to an apartment on Riverside Drive near West 88th Street, Upper West Side, New York. Their art collection included works by Pablo Picasso, Édouard Vuillard, and Vincent van Gogh. Oppenheimer was initially educated at Alcuin Preparatory School. In 1911, he entered the Ethical Culture Society School, founded by Felix Adler to promote training based on the Ethical movement, whose motto was "Deed before Creed". Oppenheimer's father had been a member of the Society for many years, serving on its board of trustees. 
Oppenheimer was a versatile student, interested in English and French literature, and particularly mineralogy. He completed third and fourth grades in one year and skipped half of eighth grade. He took private music lessons from the famous French flutist Georges Barrère. During his final year of school, Oppenheimer became interested in chemistry. He graduated in 1921, but his further education was delayed a year by an attack of colitis contracted while prospecting in Jáchymov during a family vacation in Czechoslovakia. He recovered in New Mexico, where he developed a love for horseback riding and the southwestern United States.

Oppenheimer entered Harvard College in 1922 at age 18. He majored in chemistry; Harvard also required studies in history, literature, and philosophy or mathematics. To compensate for the delay caused by his illness, he took six courses each term instead of the usual four. He was admitted to the undergraduate honor society Phi Beta Kappa and was granted graduate standing in physics on the basis of independent study, allowing him to bypass basic courses in favor of advanced ones. He was attracted to experimental physics by a course on thermodynamics taught by Percy Bridgman. Oppenheimer graduated from Harvard in 1925 with a Bachelor of Arts, summa cum laude, after only three years of study.

After being accepted at Christ's College, Cambridge, in 1924, Oppenheimer wrote to Ernest Rutherford requesting permission to work at the Cavendish Laboratory, though Bridgman's letter of recommendation said that Oppenheimer's clumsiness in the laboratory suggested that theoretical, rather than experimental, physics would be his forte. Rutherford was unimpressed, but Oppenheimer went to Cambridge nonetheless; J. J. Thomson ultimately accepted him on the condition that he complete a basic laboratory course. Oppenheimer was very unhappy at Cambridge and wrote to a friend: "I am having a pretty bad time.
The lab work is a terrible bore, and I am so bad at it that it is impossible to feel that I am learning anything." He developed an antagonistic relationship with his tutor, Patrick Blackett, a future Nobel laureate. According to Oppenheimer's friend Francis Fergusson, Oppenheimer once confessed to leaving a poisoned apple on Blackett's desk, and Oppenheimer's parents convinced the university authorities not to expel him. There are no records of either a poisoning incident or probation, but Oppenheimer had regular sessions with a psychiatrist in Harley Street, London. According to his grandson Charles Oppenheimer, the poisoned-apple story is unsubstantiated, and the biography American Prometheus conceded that it was unproven.

Oppenheimer was a tall, thin chain smoker who often neglected to eat during periods of intense concentration. Many friends said he could be self-destructive. Fergusson once tried to distract Oppenheimer from apparent depression by telling him about his girlfriend, Frances Keeley, and how he had proposed to her. Oppenheimer jumped on Fergusson and tried to strangle him. Oppenheimer was plagued by periods of depression throughout his life, and once told his brother, "I need physics more than friends."

In 1926, Oppenheimer left Cambridge for the University of Göttingen to study under Max Born; Göttingen was one of the world's leading centers for theoretical physics. Oppenheimer made friends who went on to great success, including Werner Heisenberg, Pascual Jordan, Wolfgang Pauli, Paul Dirac, Enrico Fermi, and Edward Teller. He was enthusiastic in discussions, to the point of sometimes taking them over. Maria Goeppert presented Born with a petition signed by herself and others threatening a boycott of the class unless he made Oppenheimer quiet down. Born left it out on his desk where Oppenheimer could read it, and it was effective without a word being said. Oppenheimer obtained his Doctor of Philosophy degree in March 1927 at age 23, supervised by Born.
After the oral exam, James Franck, the professor administering it, reportedly said, "I'm glad that's over. He was on the point of questioning me." Oppenheimer published more than a dozen papers while in Europe, including many important contributions to the new field of quantum mechanics. He and Born published a famous paper on the Born–Oppenheimer approximation, which separates nuclear motion from electronic motion in the mathematical treatment of molecules, allowing nuclear motion to be neglected to simplify calculations. It remains his most cited work.

Early career

Oppenheimer was awarded a United States National Research Council fellowship to the California Institute of Technology (Caltech) in September 1927. Bridgman also wanted him at Harvard, so a compromise was reached whereby he split his fellowship for the 1927–28 academic year between Harvard in 1927 and Caltech in 1928. At Caltech, he struck up a close friendship with Linus Pauling; they planned to mount a joint attack on the nature of the chemical bond, a field in which Pauling was a pioneer, with Oppenheimer supplying the mathematics and Pauling interpreting the results. The collaboration, and their friendship, ended after Oppenheimer invited Pauling's wife, Ava Helen Pauling, to join him on a tryst in Mexico. Oppenheimer later invited Pauling to be head of the Chemistry Division of the Manhattan Project, but Pauling refused, saying he was a pacifist. In the autumn of 1928, Oppenheimer visited Paul Ehrenfest's institute at the University of Leiden in the Netherlands, where he made an impression by giving lectures in Dutch despite having little experience with the language. There, he was given the nickname of Opje, later anglicized by his students as "Oppie". From Leiden, he continued on to the Swiss Federal Institute of Technology in Zurich to work with Wolfgang Pauli on quantum mechanics and the continuous spectrum.
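The Born–Oppenheimer separation mentioned earlier can be sketched in a single line; this is the standard modern textbook statement of the idea, not the notation of the original 1927 paper:

```latex
% Because nuclei are far heavier than electrons (m_e / M << 1), the total
% molecular wave function approximately factorizes into an electronic part,
% solved at fixed nuclear positions R, and a nuclear part:
\Psi_{\mathrm{total}}(\mathbf{r},\mathbf{R})
  \approx \psi_{\mathrm{el}}(\mathbf{r};\mathbf{R}) \, \chi_{\mathrm{nuc}}(\mathbf{R})
```

The electronic energy obtained at each fixed nuclear geometry then serves as the potential governing the much slower nuclear motion, which is what makes molecular calculations tractable.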
Oppenheimer respected and liked Pauli and may have emulated his personal style as well as his critical approach to problems. On returning to the United States, Oppenheimer accepted an associate professorship from the University of California, Berkeley, where Raymond Thayer Birge wanted him so badly that he expressed a willingness to share him with Caltech. Before he began his Berkeley professorship, Oppenheimer was diagnosed with a mild case of tuberculosis and spent some weeks with his brother Frank at a New Mexico ranch, which he leased and eventually purchased. When he heard the ranch was available for lease, he exclaimed, "Hot dog!", and he later called it Perro Caliente ("hot dog" in Spanish). Later, he used to say that "physics and desert country" were his "two great loves". He recovered from tuberculosis and returned to Berkeley, where he prospered as an advisor and collaborator to a generation of physicists who admired him for his intellectual virtuosity and broad interests. His students and colleagues saw him as mesmerizing: hypnotic in private interaction, but often frigid in more public settings. His associates fell into two camps: one saw him as an aloof and impressive genius and aesthete, the other as a pretentious and insecure poseur. His students almost always fell into the former category, adopting his walk, speech, and other mannerisms, and even his inclination for reading entire texts in their original languages. Hans Bethe said of him: Probably the most important ingredient he brought to his teaching was his exquisite taste. He always knew what were the important problems, as shown by his choice of subjects. He truly lived with those problems, struggling for a solution, and he communicated his concern to the group. In its heyday, there were about eight or ten graduate students in his group and about six Post-doctoral Fellows. 
He met this group once a day in his office and discussed with one after another the status of the student's research problem. He was interested in everything, and in one afternoon they might discuss quantum electrodynamics, cosmic rays, electron pair production, and nuclear physics.

Oppenheimer worked closely with Nobel Prize-winning experimental physicist Ernest Lawrence and his cyclotron pioneers, helping them understand the data that their machines were producing at Berkeley's Radiation Laboratory, which eventually developed into today's Lawrence Berkeley National Laboratory. In 1936, Berkeley promoted him to full professor at an annual salary of $3,300 (equivalent to $77,000 in 2025). In return, he was asked to curtail his teaching at Caltech, so a compromise was reached whereby Berkeley released him for six weeks each year, enough to teach one term at Caltech. Oppenheimer repeatedly attempted to get Robert Serber a position at Berkeley but was blocked by Birge, who felt that "one Jew in the department was enough".

Oppenheimer did important research in astrophysics (especially as related to general relativity and nuclear theory), nuclear physics, spectroscopy, and quantum field theory, including its extension into quantum electrodynamics. His most significant work involved predictions about neutron stars, which were not observed until 1967. Initially, his major interest was the theory of the continuous spectrum. His first published paper, in 1926, concerned the quantum theory of molecular band spectra. He developed a method to carry out calculations of their transition probabilities. He calculated the photoelectric effect for hydrogen and X-rays, obtaining the absorption coefficient at the K-edge. His calculations accorded with observations of the X-ray absorption of the Sun, but not helium. Years later, it was realized that the Sun was largely composed of hydrogen and that his calculations were correct.
Oppenheimer made important contributions to the theory of cosmic ray showers. He also worked on the problem of field electron emission. This work contributed to the development of the concept of quantum tunneling. In 1931, he co-wrote a paper, "Relativistic Theory of the Photoelectric Effect," with his student Harvey Hall, in which, based on empirical evidence, he correctly disputed Paul Dirac's assertion that two of the energy levels of the hydrogen atom have the same energy. Subsequently, one of his doctoral students, Willis Lamb, determined that this was a consequence of what became known as the Lamb shift, for which Lamb was awarded the Nobel Prize in Physics in 1955. With Melba Phillips, the first graduate student to begin her PhD under Oppenheimer's supervision,[note 2] Oppenheimer worked on calculations of artificial radioactivity under bombardment by deuterons. When Ernest Lawrence and Edwin McMillan bombarded nuclei with deuterons they found the results agreed closely with the predictions of George Gamow, but when higher energies and heavier nuclei were involved, the results did not conform to the predictions. In 1935, Oppenheimer and Phillips worked out a theory—subsequently known as the Oppenheimer–Phillips process—to explain the results. This theory is still in use today.[note 3] As early as 1930, Oppenheimer wrote a paper that essentially predicted the existence of the positron. This was after a paper by Dirac proposed that electrons could have both a positive charge and negative energy. Dirac's paper introduced an equation, later known as the Dirac equation, that unified quantum mechanics, special relativity and the then-new concept of electron spin, to explain the Zeeman effect. Drawing on the body of experimental evidence, Oppenheimer rejected the idea that the predicted positively charged electrons were protons. 
He argued that they would have to have the same mass as an electron, whereas experiments showed that protons were much heavier than electrons. Two years later, Carl David Anderson discovered the positron, for which he received the 1936 Nobel Prize in Physics. In the late 1930s, Oppenheimer became interested in astrophysics, most likely through his friendship with Richard Tolman, resulting in a series of papers. In the first of these, "On the Stability of Stellar Neutron Cores" (1938), co-written with Serber, Oppenheimer explored the properties of white dwarfs. This was followed by a paper co-written with one of his students, George Volkoff, "On Massive Neutron Cores," which demonstrated that there was a limit, known as the Tolman–Oppenheimer–Volkoff limit, to the mass of stars beyond which they would not remain stable as neutron stars and would undergo gravitational collapse. In 1939, Oppenheimer and another of his students, Hartland Snyder, produced the paper "On Continued Gravitational Contraction", which predicted the existence of what later became termed black holes. After the Born–Oppenheimer approximation paper, these papers remain his most cited, and were key factors in the rejuvenation of astrophysical research in the United States in the 1950s, mainly by John A. Wheeler. Oppenheimer's papers were considered difficult to understand even by the standards of the abstract topics he was expert in. He was fond of using elegant, if extremely complex, mathematical techniques to demonstrate physical principles, though he was sometimes criticized for making mathematical mistakes, presumably out of haste. "His physics was good", said his student Snyder, "but his arithmetic awful." After World War II, Oppenheimer published only five scientific papers, one of them in biophysics, and none after 1950. 
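The stability limit in the neutron-core papers comes from general-relativistic hydrostatic equilibrium. As a sketch, the Tolman–Oppenheimer–Volkoff equation in its modern textbook form (again, not the 1939 notation):

```latex
% Pressure gradient inside a static, spherically symmetric star in
% general relativity; m(r) is the mass enclosed within radius r.
\frac{dP}{dr} = -\,\frac{G}{r^{2}}
  \left( \rho + \frac{P}{c^{2}} \right)
  \left( m(r) + \frac{4\pi r^{3} P}{c^{2}} \right)
  \left( 1 - \frac{2 G m(r)}{r c^{2}} \right)^{-1}
```

Each relativistic correction factor reduces to 1 in the Newtonian limit; together they strengthen gravity faster than pressure can respond, which is why a maximum stable mass exists at all. Oppenheimer and Volkoff's original estimate of the limit, using a free neutron gas as the equation of state, was roughly 0.7 solar masses; modern equations of state push it above 2.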
Murray Gell-Mann, a later Nobelist who, as a visiting scientist, worked with him at the Institute for Advanced Study in 1951, offered this opinion: He didn't have Sitzfleisch, "sitting flesh," when you sit on a chair. As far as I know, he never wrote a long paper or did a long calculation, anything of that kind. He didn't have patience for that; his own work consisted of little aperçus, but quite brilliant ones. But he inspired other people to do things, and his influence was fantastic. Private and political life Oppenheimer's mother died in 1931, after which he grew closer to his father, who, though still residing in New York, became a frequent visitor to California. When his father died in 1937, leaving $392,602 (equivalent to $8.6 million in 2024) to be divided between Oppenheimer and his brother Frank, Oppenheimer promptly wrote a will bequeathing his estate to the University of California to fund graduate scholarships. During the 1920s, Oppenheimer remained uninformed about world affairs. He claimed that he did not read newspapers or popular magazines and only learned of the Wall Street crash of 1929 while he was on a walk with Ernest Lawrence six months after the crash occurred. He once remarked that he never cast a vote until the 1936 presidential election. From 1934 on, he became increasingly concerned about politics and international affairs. In 1934, he earmarked three percent of his annual salary—about $100 (equivalent to $2,400 in 2025)—for two years to support German physicists fleeing Nazi Germany. During the 1934 West Coast Waterfront Strike, he and some of his students, including Melba Phillips and Serber, attended a longshoremen's rally. After the Spanish Civil War broke out in 1936, Oppenheimer hosted fundraisers for the Spanish Republican cause. In 1939, he joined the American Committee for Democracy and Intellectual Freedom, which campaigned against the persecution of Jewish scientists in Nazi Germany. 
Like most liberal groups of the era, the committee was later branded a communist front. Many of Oppenheimer's closest associates were active in the Communist Party in the 1930s or 1940s, including his brother Frank, Frank's wife Jackie, his future wife Kitty, Jean Tatlock, his landlady Mary Ellen Washburn, and several of his graduate students at Berkeley. Whether Oppenheimer was a party member has been debated. Cassidy states that he never openly joined the Communist Party USA (CPUSA), but Haynes, Klehr, and Vassiliev state that he "was, in fact, a concealed member of the CPUSA in the late 1930s". From 1937 to 1942, Oppenheimer was a member at Berkeley of what he called a "discussion group", which fellow members Haakon Chevalier and Gordon Griffiths later said was a "closed" (secret) unit of the Communist Party for Berkeley faculty.

The Federal Bureau of Investigation (FBI) opened a file on Oppenheimer in March 1941. It recorded that he attended a meeting in December 1940 at Chevalier's home that was also attended by the Communist Party's California state secretary, William Schneiderman, and its treasurer, Isaac Folkoff. The FBI noted that Oppenheimer was on the executive committee of the American Civil Liberties Union, which it considered a communist front organization. Shortly thereafter, the FBI added Oppenheimer to its Custodial Detention Index, for arrest in case of national emergency. When he joined the Manhattan Project in 1942, Oppenheimer wrote on his personal security questionnaire that he had been "a member of just about every Communist Front organization on the West Coast." Years later, he claimed that he did not remember writing this, that it was not true, and that if he had written anything along those lines, it was "a half-jocular overstatement". He was a subscriber to the People's World, a Communist Party organ, and testified in 1954, "I was associated with the communist movement."
In 1953, Oppenheimer was on the sponsoring committee for a conference on "Science and Freedom" organized by the Congress for Cultural Freedom, an anti-communist cultural organization. At his 1954 security clearance hearings, Oppenheimer denied being a member of the Communist Party but identified himself as a fellow traveler, which he defined as someone who agrees with many of communism's goals but is not willing to blindly follow orders from any Communist Party apparatus. According to biographer Ray Monk: "He was, in a very practical and real sense, a supporter of the Communist Party. Moreover, in terms of the time, effort and money spent on party activities, he was a very committed supporter." In 1936, Oppenheimer became involved with Jean Tatlock, the daughter of a Berkeley literature professor and a student at Stanford University School of Medicine. The two had similar political views; she wrote for the Western Worker, a Communist Party newspaper. In 1939, after a tempestuous relationship, Tatlock broke up with Oppenheimer. In August of that year, he met Katherine ("Kitty") Puening, a former Communist Party member. Kitty's first marriage had lasted only a few months. Her second, common-law, husband from 1934 to 1937 was Joe Dallet, an active member of the Communist Party killed in 1937 in the Spanish Civil War. Kitty returned from Europe to the U.S., where she obtained a Bachelor of Arts degree in botany from the University of Pennsylvania. In 1938 she married Richard Harrison, a physician and medical researcher, and in June 1939 moved with him to Pasadena, California, where he became chief of radiology at a local hospital and she enrolled as a graduate student at the University of California, Los Angeles. She and Oppenheimer created a minor scandal by sleeping together after one of Tolman's parties, and in the summer of 1940 she stayed with Oppenheimer at his ranch in New Mexico. When she became pregnant, Kitty asked Harrison for a divorce and he agreed to it. 
On November 1, 1940, she obtained a quick divorce in Reno, Nevada, and married Oppenheimer. Their first child, Peter, was born in May 1941, and their second, Katherine ("Toni"), was born in Los Alamos, New Mexico, on December 7, 1944. During his marriage, Oppenheimer rekindled his affair with Tatlock. Later, their continued contact became an issue in his security clearance hearings because of Tatlock's communist associations. Throughout the development of the atomic bomb, Oppenheimer was under investigation by both the FBI and the Manhattan Project's internal security arm for his past left-wing associations. He was followed by Army security agents during a trip to California in June 1943 to visit Tatlock, who was suffering from depression. Oppenheimer spent the night in her apartment. Tatlock killed herself on January 4, 1944, leaving Oppenheimer deeply grieved.

At Los Alamos, Oppenheimer began an emotional affair with Ruth Tolman, a psychologist and the wife of his friend Richard Tolman. The affair ended after Oppenheimer returned east to become director of the Institute for Advanced Study but, after Richard's death in August 1948, they reconnected and saw each other occasionally until Ruth's death in 1957. Few of their letters survive, but those that do reflect a close and affectionate relationship, with Oppenheimer calling her "My Love".

Oppenheimer worked very hard [in the spring of 1929] but had a gift of concealing his assiduous application with an air of easy nonchalance. Actually, he was engaged in a very difficult calculation of the opacity of surfaces of stars to their internal radiation, an important constant in the theoretical construction of stellar models. He spoke little of these problems and seemed to be much more interested in literature, especially the Hindu classics and the more esoteric Western writers. Pauli once remarked to me that Oppenheimer seemed to treat physics as an avocation and psychoanalysis as a vocation.
Oppenheimer's diverse interests sometimes interrupted his focus on science. He liked things that were difficult, and since much of the scientific work appeared easy for him, he developed an interest in the mystical and the cryptic. After going to Harvard, he began to acquaint himself with the classical Hindu texts through their English translations. He also had an interest in languages and learned Sanskrit[note 4] under Arthur W. Ryder at Berkeley in 1933. He eventually read literary works such as the Bhagavad Gita and Meghaduta in the original Sanskrit, and deeply pondered them. He later cited the Gita as one of the books that most shaped his philosophy of life. He wrote to his brother that the Gita was "very easy and quite marvelous". He later called it "the most beautiful philosophical song existing in any known tongue", gave copies of it as presents to his friends, and kept a personal, worn-out copy on the bookshelf by his desk. He kept referring to it while directing the Los Alamos Laboratory, and quoted a passage from the Gita at the memorial service for President Franklin Roosevelt in Los Alamos. He nicknamed his car Garuda, after the bird mount of the Hindu god Vishnu.

Oppenheimer never became a Hindu in the traditional sense; he did not join any temple nor pray to any god. He "was really taken by the charm and the general wisdom of the Bhagavad-Gita," his brother said. It is speculated that Oppenheimer's interest in Hindu thought started during his earlier association with Niels Bohr. Both Bohr and Oppenheimer had been very analytical and critical about the ancient Hindu mythological stories and the metaphysics embedded in them. In one conversation with David Hawkins before the war, while talking about the literature of ancient Greece, Oppenheimer remarked, "I have read the Greeks; I find the Hindus deeper." Oppenheimer sat on the Board of Editors of the book series World Perspectives, which published a variety of books on philosophy.
During the 1930s, while teaching at Berkeley, Oppenheimer became part of a group in the Bay Area that psychologist Siegfried Bernfeld convened to discuss psychoanalysis. His close confidant and colleague Isidor Isaac Rabi, who had seen Oppenheimer throughout his Berkeley, Los Alamos, and Princeton years, wondering "why men of Oppenheimer's gifts do not discover everything worth discovering", reflected that: Oppenheimer was overeducated in those fields which lie outside the scientific tradition, such as his interest in religion, in the Hindu religion in particular, which resulted in a feeling for the mystery of the universe that surrounded him almost like a fog. He saw physics clearly, looking toward what had already been done, but at the border he tended to feel there was much more of the mysterious and novel than there actually was ... [he turned] away from the hard, crude methods of theoretical physics into a mystical realm of broad intuition.... In Oppenheimer the element of earthiness was feeble. Yet it was essentially this spiritual quality, this refinement as expressed in speech and manner, that was the basis of his charisma. He never expressed himself completely. He always left a feeling that there were depths of sensibility and insight not yet revealed. These may be the qualities of the born leader who seems to have reserves of uncommitted strength. In spite of this, observers such as physicists Luis Alvarez and Jeremy Bernstein have suggested that if Oppenheimer had lived long enough to see his predictions substantiated by experiment, he might have won a Nobel Prize for his work on gravitational collapse, concerning neutron stars and black holes. In retrospect, some physicists and historians consider this his most important contribution, though it was not taken up by other scientists in his lifetime. 
The physicist and historian Abraham Pais once asked Oppenheimer what he considered his most important scientific contributions—Oppenheimer cited his work on electrons and positrons, not his work on gravitational contraction. Oppenheimer was nominated for the Nobel Prize in Physics four times, in 1946, 1951, 1955, and 1967, but never won.

Manhattan Project

During his visit to the University of California, Berkeley, in September 1941, Australian physicist Mark Oliphant briefed Oppenheimer about the U.K.'s atomic bomb program and its MAUD report. On October 9, 1941, two months before the United States entered World War II, President Franklin D. Roosevelt approved a crash program to develop an atomic bomb. On October 21, Ernest Lawrence brought Oppenheimer into what became the Manhattan Project. On May 18, 1942, Gregory Breit, who had been overseeing the project's bomb-design research under Arthur Compton at the Metallurgical Laboratory, resigned due to security concerns and skepticism towards the project. Not long after that, Compton asked Oppenheimer to take over Breit's work on fast neutron calculations, a task Oppenheimer threw himself into with full vigor. He was given the title "Coordinator of Rapid Rupture"; "rapid rupture" is a technical term that refers to the propagation of a fast neutron chain reaction in an atomic bomb. One of his first acts was to host a summer school for atomic bomb theory in Berkeley. The mix of European physicists and his own students—a group including Serber, Emil Konopinski, Felix Bloch, Hans Bethe, and Edward Teller—kept themselves busy by calculating what needed to be done, and in what order, to make the bomb.

In June 1942, the U.S. Army established the Manhattan Engineer District to handle its part in the atom bomb project, beginning the process of transferring responsibility from the Office of Scientific Research and Development to the military. In September, Brigadier General Leslie R.
Groves Jr. was appointed director of what became known as the Manhattan Project. By October 12, 1942, Groves and Oppenheimer had decided that for security and cohesion, they needed to establish a centralized, secret research laboratory in a remote location. Groves selected Oppenheimer to head the project's secret weapons laboratory, although it is not known precisely when. On October 15, 1942, after a meeting in Chicago on the Manhattan Project, Groves invited Oppenheimer to join him, James C. Marshall, and Kenneth Nichols on their return trip to New York on the 20th Century Limited. After dinner on the train, they discussed the project. After Oppenheimer left the train, none of the three could name another suitable scientist to head the project. Shortly afterwards, Oppenheimer was appointed to head the Los Alamos Laboratory.

This decision surprised many, because Oppenheimer had left-wing political views and no record as a leader of large projects. Groves worried that because Oppenheimer did not have a Nobel Prize, he might not have the prestige to direct fellow scientists, but he was impressed by Oppenheimer's singular grasp of the practical aspects of the project and by the breadth of his knowledge. As a military engineer, Groves knew that this would be vital in an interdisciplinary project that would involve not just physics but also chemistry, metallurgy, ordnance, and engineering. Groves also detected in Oppenheimer something that many others did not, an "overweening ambition", which Groves reckoned would supply the drive necessary to push the project to a successful conclusion. Oppenheimer's past associations were not overlooked, but on July 20, 1943, Groves directed that he receive a security clearance "without delay irrespective of the information which you have concerning Mr Oppenheimer. He is absolutely essential to the project."
Rabi considered Oppenheimer's appointment "a real stroke of genius on the part of General Groves, who was not generally considered to be a genius". Oppenheimer favored a location for the laboratory in New Mexico, not far from his ranch. On November 16, 1942, he, Groves and others toured a prospective site. Oppenheimer feared that the high cliffs surrounding it would feel claustrophobic, and there was concern about possible flooding. He then suggested a site he knew well: a flat mesa near Santa Fe, New Mexico, which was the site of a private boys' school, the Los Alamos Ranch School. The engineers were concerned about the poor access road and the water supply but otherwise felt that it was ideal. The Los Alamos Laboratory was built on the site of the school, taking over some of its buildings, while many new buildings were erected in great haste. At the laboratory, Oppenheimer assembled a group of the top physicists of the time, whom he called the "luminaries". Los Alamos was initially supposed to be a military laboratory, and Oppenheimer and other researchers were to be commissioned into the Army. He went so far as to order himself a lieutenant colonel's uniform and take the Army physical test, which he failed. Army doctors considered him underweight at 128 pounds (58 kg), diagnosed his chronic cough as tuberculosis, and were concerned about his chronic lumbosacral joint pain. The plan to commission scientists fell through when Rabi and Robert Bacher balked at the idea. James B. Conant, Groves, and Oppenheimer devised a compromise whereby the University of California operated the laboratory under contract to the War Department. It soon turned out that Oppenheimer had hugely underestimated the magnitude of the project: Los Alamos grew from a few hundred people in 1943 to over 6,000 in 1945. Scientists were paid at the salary they were already receiving. 
However, this meant that Oppenheimer, who had been paid by a state university, originally received much less than some of his subordinates. Groves decided to make an exception and, without consulting him, increased Oppenheimer's salary to equal that of the others. Oppenheimer at first had difficulty with the organizational division of large groups but rapidly learned the art of large-scale administration after he took up permanent residence at Los Alamos. He was noted for his mastery of all scientific aspects of the project and for his efforts to control the inevitable cultural conflicts between scientists and the military. Victor Weisskopf wrote:

Oppenheimer directed these studies, theoretical and experimental, in the real sense of the words. Here his uncanny speed in grasping the main points of any subject was a decisive factor; he could acquaint himself with the essential details of every part of the work. He did not direct from the head office. He was intellectually and physically present at each decisive step. He was present in the laboratory or in the seminar rooms, when a new effect was measured, when a new idea was conceived. It was not that he contributed so many ideas or suggestions; he did so sometimes, but his main influence came from something else. It was his continuous and intense presence, which produced a sense of direct participation in all of us; it created that unique atmosphere of enthusiasm and challenge that pervaded the place throughout its time.

At this point in the war, there was considerable anxiety among the scientists that the German nuclear weapons program might be progressing faster than the Manhattan Project. In a letter dated May 25, 1943, Oppenheimer responded to a proposal by Fermi to use radioactive materials to poison German food supplies. Oppenheimer asked Fermi whether he could produce enough strontium without letting too many in on the secret.
Oppenheimer continued, "I think we should not attempt a plan unless we can poison food sufficient to kill a half a million men." In 1943, development efforts were directed to a plutonium gun-type fission weapon called "Thin Man". Initial research on the properties of plutonium was done using cyclotron-generated plutonium-239, which was extremely pure but could be created only in tiny amounts. When Los Alamos received the first sample of plutonium from the X-10 Graphite Reactor in April 1944, a problem was discovered: reactor-bred plutonium had a higher concentration of plutonium-240 (five times that of "cyclotron" plutonium), making it unsuitable for use in a gun-type weapon. In July 1944, Oppenheimer abandoned the Thin Man gun design in favor of an implosion-type weapon. Using chemical explosive lenses, a sub-critical sphere of fissile material could be squeezed into a smaller and denser form. The metal needed to travel only very short distances, so the critical mass would be assembled in much less time. In August 1944, Oppenheimer implemented a sweeping reorganization of the Los Alamos laboratory to focus on implosion. He concentrated development of the gun-type device, a smaller and simpler design derived from Thin Man that only had to work with highly enriched uranium, in a single group; this device became Little Boy in February 1945. After a mammoth research effort, the more complex design of the implosion device, known as the "Christy gadget" after Robert Christy, another student of Oppenheimer's, was finalized as Fat Man in a meeting in Oppenheimer's office on February 28, 1945. In May 1945, an Interim Committee was created to advise and report on wartime and postwar policies regarding the use of nuclear energy. The Interim Committee established a scientific panel consisting of Oppenheimer, Arthur Compton, Fermi, and Lawrence to advise it on scientific issues.
In its presentation to the Interim Committee, the panel offered its opinion not just on an atomic bomb's likely physical effects but also on its likely military and political impact. This included opinions on such sensitive issues as whether the Soviet Union should be advised of the weapon in advance of its use against Japan. In the early morning hours of July 16, 1945, near Alamogordo, New Mexico, the work at Los Alamos culminated in the test of the world's first nuclear weapon. Oppenheimer had code-named the site "Trinity" in mid-1944, saying later that the name came from John Donne's Holy Sonnets; he had been introduced to Donne's work in the 1930s by Jean Tatlock, who killed herself in January 1944. Brigadier General Thomas Farrell, who was present in the control bunker with Oppenheimer, recalled: Dr. Oppenheimer, on whom had rested a very heavy burden, grew tenser as the last seconds ticked off. He scarcely breathed. He held on to a post to steady himself. For the last few seconds, he stared directly ahead and then when the announcer shouted "Now!" and there came this tremendous burst of light followed shortly thereafter by the deep growling roar of the explosion, his face relaxed into an expression of tremendous relief. Oppenheimer's brother Frank recalled Oppenheimer's first words as "I guess it worked." According to a 1949 magazine profile, while witnessing the explosion Oppenheimer thought of verses from the Bhagavad Gita: "If the radiance of a thousand suns were to burst at once into the sky, that would be like the splendor of the mighty one ... Now I am become Death, the shatterer of worlds." In 1965 he recalled the moment this way: We knew the world would not be the same. A few people laughed, a few people cried. Most people were silent. 
I remembered the line from the Hindu scripture, the Bhagavad Gita; Vishnu is trying to persuade the Prince that he should do his duty and, to impress him, takes on his multi-armed form and says, "Now I am become Death, the destroyer of worlds." I suppose we all thought that, one way or another. Rabi described seeing Oppenheimer somewhat later: "I'll never forget his walk ... like High Noon ... this kind of strut. He had done it." Despite many scientists' opposition to using the bomb on Japan, Compton, Fermi, and Oppenheimer believed that a test explosion would not convince Japan to surrender. At an August 6 assembly at Los Alamos, the evening of the atomic bombing of Hiroshima, Oppenheimer took to the stage and clasped his hands together "like a prize-winning boxer" while the crowd cheered. He expressed regret that the weapon was ready too late for use against Nazi Germany. On August 17, however, Oppenheimer traveled to Washington to hand-deliver a letter to Secretary of War Henry L. Stimson expressing his revulsion and his wish to see nuclear weapons banned. In October he met with President Harry S. Truman, who dismissed Oppenheimer's concern about an arms race with the Soviet Union and belief that atomic energy should be under international control. Truman became infuriated when Oppenheimer said, "Mr. President, I feel I have blood on my hands", responding that he (Truman) bore sole responsibility for the decision to use atomic weapons against Japan, and later said, "I don't want to see that son of a bitch in this office ever again." For his services as director of Los Alamos, Oppenheimer was awarded the Medal for Merit by Truman in 1946. Postwar activities Once the public learned of the Manhattan Project after the bombings of Hiroshima and Nagasaki, Oppenheimer—suddenly a household name as the "father of the atomic bomb"—became a national spokesman for science, emblematic of a new type of technocratic power; he appeared on the covers of Life and Time.
Nuclear physics became a powerful force as nations realized the strategic and political power that atomic weapons conferred. Like many scientists of his generation, Oppenheimer felt that security from atomic bombs could come only from a transnational organization such as the newly formed United Nations, which could institute a program to stifle a nuclear arms race. In November 1945, Oppenheimer left Los Alamos to return to Caltech, but soon found that his heart was no longer in teaching. In 1947, he accepted an offer from Lewis Strauss to take up the directorship of the Institute for Advanced Study in Princeton, New Jersey. This meant moving back east and leaving Ruth Tolman, the wife of his friend Richard Tolman, with whom he had begun an affair after leaving Los Alamos. The job came with a salary of $20,000 per annum, plus rent-free accommodation in the director's house, a 17th-century manor with a cook and groundskeeper, surrounded by 265 acres (107 ha) of woodlands. He collected European furniture, and French Post-Impressionist and Fauvist artworks. His art collection included works by Cézanne, Derain, Despiau, de Vlaminck, Picasso, Rembrandt, Renoir, Van Gogh and Vuillard. Oppenheimer brought together intellectuals at the height of their powers and from a variety of disciplines to answer the most pertinent questions of the age. He directed and encouraged the research of many well-known scientists, including Freeman Dyson, and the duo of Chen Ning Yang and Tsung-Dao Lee, who won a Nobel Prize for their discovery of parity non-conservation. He also instituted temporary memberships for scholars from the humanities, such as T. S. Eliot and George F. Kennan. Some of these activities were resented by a few members of the mathematics faculty, who wanted the institute to stay a bastion of pure scientific research. 
Abraham Pais said that Oppenheimer himself thought that one of his failures at the institute was being unable to bring together scholars from the natural sciences and the humanities. During a series of conferences in New York—the Shelter Island Conference in 1947, the Pocono Conference in 1948, and the Oldstone Conference in 1949—physicists transitioned from war work back to theoretical issues. Under Oppenheimer's direction, physicists tackled the greatest outstanding problem of the pre-war years: infinite, divergent, and seemingly nonsensical expressions in the quantum electrodynamics of elementary particles. Julian Schwinger, Richard Feynman and Shin'ichiro Tomonaga attacked the problem of regularization, and developed techniques that became known as renormalization. Freeman Dyson was able to prove that their procedures gave similar results. The problem of meson absorption and Hideki Yukawa's theory of mesons as the carrier particles of the strong nuclear force were also addressed. Probing questions from Oppenheimer prompted Robert Marshak's innovative two-meson hypothesis: that there are actually two types of mesons, pions and muons. This led to Cecil Frank Powell's breakthrough and subsequent Nobel Prize for the discovery of the pion. Oppenheimer served as director of the institute until 1966, when he gave up the position due to his failing health. As of 2023, he is the longest-serving director of the institute. As a member of the Board of Consultants to a committee appointed by Truman, Oppenheimer strongly influenced the 1946 Acheson–Lilienthal Report. In this report, the committee advocated the creation of an international Atomic Development Authority, which would own all fissionable material and the means of its production, such as mines and laboratories, and atomic power plants where it could be used for peaceful energy production.
Bernard Baruch was appointed to translate this report into a proposal to the United Nations, resulting in the Baruch Plan of 1946. The Baruch Plan introduced many additional provisions regarding enforcement, in particular requiring inspection of the Soviet Union's uranium resources. It was seen as an attempt to maintain the United States' nuclear monopoly and rejected by the Soviets. With this, it became clear to Oppenheimer that an arms race was unavoidable, due to the mutual suspicion of the United States and the Soviet Union, which even Oppenheimer was starting to distrust. After the Atomic Energy Commission (AEC) came into being in 1947 as a civilian agency in control of nuclear research and weapons issues, Oppenheimer was appointed as the chairman of its General Advisory Committee (GAC). From this position, he advised on a number of nuclear-related issues, including project funding, laboratory construction and even international policy—though the GAC's advice was not always heeded. As chairman of the GAC, Oppenheimer lobbied vigorously for international arms control and funding for basic science, and attempted to influence policy away from a heated arms race. The first atomic bomb test by the Soviet Union in August 1949 came earlier than Americans expected, and over the next several months, there was an intense debate within the U.S. government, military, and scientific communities over whether to proceed with the development of the far more powerful, nuclear fusion–based hydrogen bomb, then known as "the Super". Oppenheimer had been aware of the possibility of a thermonuclear weapon since the days of the Manhattan Project and had allocated a limited amount of theoretical research work toward the possibility at the time, but nothing more than that, given the pressing need to develop a fission weapon. 
Immediately following the end of the war, Oppenheimer argued against continuing work on the Super, due to both lack of need and the enormous human casualties that would result from its use. In October 1949, Oppenheimer and the GAC recommended against the development of the Super. He and the other GAC members were motivated partly by ethical concerns, feeling that such a weapon could only be strategically used, resulting in millions of deaths: "Its use therefore carries much further than the atomic bomb itself the policy of exterminating civilian populations." They also had practical qualms, as there was no workable design for a hydrogen bomb at the time. Regarding the possibility of the Soviet Union developing a thermonuclear weapon, the GAC felt that the United States could have an adequate stockpile of atomic weapons to retaliate against any thermonuclear attack. In that connection, Oppenheimer and the others were concerned about the opportunity costs that would be incurred if nuclear reactors were diverted from producing materials for atomic bombs to producing materials such as tritium for a thermonuclear weapon. A majority of the AEC subsequently endorsed the GAC recommendation, and Oppenheimer thought that the fight against the Super had been won, but proponents of the weapon lobbied the White House vigorously. On January 31, 1950, Truman, who was predisposed to proceed with the development of the weapon anyway, made the formal decision to do so. Oppenheimer and other GAC opponents of the project, especially James B. Conant, felt disheartened and considered resigning from the committee. They stayed on, though their views on the hydrogen bomb were well known. In 1951, Teller and mathematician Stanislaw Ulam developed the Teller–Ulam design for a hydrogen bomb.
This new design seemed technically feasible and Oppenheimer officially acceded to the weapon's development, while still looking for ways in which its testing or deployment or use could be questioned. As he later recalled: The program we had in 1949 was a tortured thing that you could well argue did not make a great deal of technical sense. It was therefore possible to argue also that you did not want it even if you could have it. The program in 1951 was technically so sweet that you could not argue about that. The issues became purely the military, the political and the humane problem of what you were going to do about it once you had it. Oppenheimer, Conant, and Lee DuBridge, another member who had opposed the H-bomb decision, left the GAC when their terms expired in August 1952. Truman had declined to reappoint them, as he wanted new voices on the committee who were more in support of H-bomb development. In addition, various opponents of Oppenheimer had communicated to Truman their desire that Oppenheimer leave the committee. Oppenheimer played a role on a number of government panels and study projects during the late 1940s and early 1950s, some of which thrust him into controversies and power struggles. In 1948, Oppenheimer chaired the Department of Defense's Long-Range Objectives Panel, a body created by AEC liaison Donald F. Carpenter. It looked at the military utility of nuclear weapons, including how they might be delivered. After a year's worth of study, in spring 1952, Oppenheimer wrote the draft report of Project GABRIEL, which examined the dangers of nuclear fallout. Oppenheimer was also a member of the Science Advisory Committee of the Office of Defense Mobilization. 
Oppenheimer participated in Project Charles during 1951, which examined the possibility of creating an effective air defense of the United States against atomic attack, and in the follow-on Project East River in 1952, which, with Oppenheimer's input, recommended building a warning system that would provide one-hour notice of an impending atomic attack against American cities. Those two projects led to Project Lincoln in 1952, a large effort on which Oppenheimer was one of the senior scientists. Undertaken at the MIT Lincoln Laboratory, which had recently been founded to study issues of air defense, this in turn led to the Lincoln Summer Study Group, in which Oppenheimer became a key figure. Oppenheimer's and other scientists' urging that resources be allocated to air defense in preference to large retaliatory strike capabilities brought an immediate response of objection from the United States Air Force (USAF), and debate ensued about whether Oppenheimer and allied scientists, or the Air Force, was embracing an inflexible "Maginot Line" philosophy. In any case, the Summer Study Group's work eventually led to the building of the Distant Early Warning Line. Teller, who had been so uninterested in work on the atomic bomb at Los Alamos during the war that Oppenheimer had given him time instead to work on his own project of the hydrogen bomb, left Los Alamos in 1951 to help found, in 1952, a second laboratory at what would become the Lawrence Livermore National Laboratory. Oppenheimer had defended the history of work done at Los Alamos and opposed the creation of the second laboratory. Project Vista looked at improving U.S. tactical warfare capabilities. Oppenheimer was a late addition to the project in 1951 but wrote a key chapter of the report that challenged the doctrine of strategic bombardment and advocated smaller tactical nuclear weapons which would be more useful in a limited theater conflict against enemy forces. 
Strategic thermonuclear weapons delivered by long-range jet bombers would necessarily be under the control of the U.S. Air Force, whereas the Vista conclusions recommended an increased role for the U.S. Army and U.S. Navy as well. The Air Force reaction to this was immediately hostile, and it succeeded in getting the Vista report suppressed. During 1952, Oppenheimer chaired the five-member State Department Panel of Consultants on Disarmament, which first urged that the United States postpone its planned first test of the hydrogen bomb and seek a thermonuclear test ban with the Soviet Union, on the grounds that avoiding a test might forestall the development of a catastrophic new weapon and open the way for new arms agreements between the two nations. But the panel lacked political allies in Washington, and the Ivy Mike shot went ahead as scheduled. The panel then issued a final report in January 1953, which, influenced by many of Oppenheimer's deeply felt beliefs, presented a pessimistic vision of the future in which neither the United States nor the Soviet Union could establish effective nuclear superiority but both sides could inflict terrible damage on the other. One of the panel's recommendations, which Oppenheimer felt was especially important, was that the U.S. government practice less secrecy and more openness toward the American people about the realities of the nuclear balance and the dangers of nuclear warfare. This notion found a receptive audience in the new Eisenhower administration and led to the creation of Operation Candor. Oppenheimer subsequently presented his view on the lack of utility of ever-larger nuclear arsenals to the American public in a June 1953 article in Foreign Affairs, and it received attention in major American newspapers. Thus by 1953, Oppenheimer had reached another peak of influence, being involved in multiple different government posts and projects and having access to crucial strategic plans and force levels. 
But at the same time, he had become the enemy of the proponents of strategic bombardment, who viewed his opposition to the H-bomb, followed by these accumulated positions and stances, with a combination of bitterness and distrust. This view was paired with their fear that Oppenheimer's fame and powers of persuasion had made him dangerously influential in government, military, and scientific circles. The FBI under J. Edgar Hoover had been following Oppenheimer since before the war, when he showed Communist sympathies as a professor at Berkeley and had been close to members of the Communist Party, including his wife and brother. They strongly suspected that he himself was a member of the party, based on wiretaps in which party members referred to him or appeared to refer to him as a communist, as well as reports from informers within the party. He had been under close surveillance since the early 1940s, his home and office bugged, his phone tapped and his mail opened. In August 1943, Oppenheimer told Manhattan Project security agents that George Eltenton, whom he did not know, had solicited three men at Los Alamos for nuclear secrets on behalf of the Soviet Union. When pressed on the issue in later interviews, Oppenheimer admitted that the only person who had approached him was his friend Haakon Chevalier, a Berkeley professor of French literature, who had mentioned the matter privately at a dinner at Oppenheimer's house. The FBI furnished Oppenheimer's political enemies with evidence that intimated communist ties. These enemies included Strauss, an AEC commissioner who had long harbored resentment against Oppenheimer both for his activity in opposing the hydrogen bomb and for his humiliation of Strauss before Congress some years earlier. Strauss had expressed opposition to exporting radioactive isotopes to other nations, and Oppenheimer had called them "less important than electronic devices but more important than, let us say, vitamins." 
On June 7, 1949, Oppenheimer testified before the House Un-American Activities Committee that he had associations with the Communist Party USA in the 1930s. He testified that some of his students, including David Bohm, Giovanni Rossi Lomanitz, Philip Morrison, Bernard Peters, and Joseph Weinberg, had been communists at the time they had worked with him at Berkeley. Frank Oppenheimer and his wife Jackie testified before HUAC that they had been members of the Communist Party USA. Frank was subsequently fired from his University of Minnesota position. Unable to find work in physics for many years, he became a cattle rancher in Colorado. He later taught high school physics and was the founder of the San Francisco Exploratorium. The triggering event for the security hearing happened on November 7, 1953, when William Liscum Borden, who until earlier in the year had been the executive director of the United States Congress Joint Committee on Atomic Energy, sent Hoover a letter saying that "more probably than not J. Robert Oppenheimer is an agent of the Soviet Union." Eisenhower never fully believed the allegations in the letter but felt compelled to move forward with an investigation, and on December 3, he ordered that a "blank wall" be placed between Oppenheimer and any government or military secrets. On December 21, 1953, Strauss told Oppenheimer that his security clearance had been suspended, pending resolution of a series of charges outlined in a letter, and gave him the option of resigning by requesting termination of his consulting contract with the AEC. Oppenheimer chose not to resign and requested a hearing instead. The charges were outlined in a letter from Kenneth D. Nichols, general manager of the AEC. Nichols, who had thought highly of Oppenheimer's work on the earlier Long-Range Objectives Panel, said that "in spite of [Oppenheimer's] record he is loyal to the United States."
He nonetheless drafted the letter, but later wrote that he was "not happy with the inclusion of a reference concerning Oppenheimer's opposition to the hydrogen bomb development." The hearing that followed in April–May 1954, which was held in secret, focused on Oppenheimer's past communist ties and his association during the Manhattan Project with suspected disloyal or communist scientists. It then continued with an examination of Oppenheimer's opposition to the H-bomb and stances in subsequent projects and study groups. A transcript of the hearings was published in June 1954, with some redactions. In 2014, the U.S. Department of Energy made the full transcript public. One of the key elements in this hearing was Oppenheimer's earlier testimony about George Eltenton's approach to various Los Alamos scientists, a story that Oppenheimer confessed he had fabricated to protect his friend Haakon Chevalier. Unknown to Oppenheimer, both versions were recorded during his interrogations of a decade before. He was surprised on the witness stand with transcripts of these, which he had not been given a chance to review. In fact, Oppenheimer had never told Chevalier that he had finally named him, and the testimony had cost Chevalier his job. Both Chevalier and Eltenton confirmed mentioning that they had a way to get information to the Soviets, Eltenton admitting he said this to Chevalier and Chevalier admitting he mentioned it to Oppenheimer, but both put the matter in terms of gossip and denied any thought or suggestion of treason or espionage, either in planning or in deed. Neither was ever convicted of any crime. Teller testified that he considered Oppenheimer loyal to the U.S. government, but that: In a great number of cases, I have seen Dr. Oppenheimer act—I understand that Dr. Oppenheimer acted—in a way which for me was exceedingly hard to understand.
I thoroughly disagreed with him in numerous issues and his actions frankly appeared to me confused and complicated. To this extent I feel that I would like to see the vital interests of this country in hands which I understand better, and therefore trust more. In this very limited sense I would like to express a feeling that I would feel personally more secure if public matters would rest in other hands. Teller's testimony outraged the scientific community, and he was virtually ostracized from academic science. Ernest Lawrence refused to testify, pleading an attack of ulcerative colitis, but an interview in which Lawrence condemned Oppenheimer was submitted in evidence. Many top scientists, as well as government and military figures, testified on Oppenheimer's behalf. Physicist Isidor Isaac Rabi said that the suspension of the security clearance was unnecessary: "he is a consultant, and if you don't want to consult the guy, you don't consult him, period." But Groves testified that, under the stricter security criteria in effect in 1954, he "would not clear Dr. Oppenheimer today". At the conclusion of the hearings, the board revoked Oppenheimer's clearance by a 2–1 vote. It unanimously cleared him of disloyalty, but a majority found that 20 of the 24 charges were either true or substantially true and that Oppenheimer would represent a security risk. Then on June 29, 1954, the AEC upheld the findings of the Personnel Security Board, by a 4–1 decision, with Strauss writing the majority opinion. In that opinion, he stressed Oppenheimer's "defects of character", "falsehoods, evasions and misrepresentations", and past associations with Communists and people close to Communists as the primary reasons for his determination. He did not comment on Oppenheimer's loyalty. During his hearing, Oppenheimer testified on the left-wing activities of ten of his colleagues and previous acquaintances, mostly in reference to activities in the late 1930s. 
These ten people's activities were already public knowledge through prior hearings and activities (such as Addis, Chevalier, Lambert, May, Pitman, and I. Folkoff) or already known to the FBI. Some believe that had his clearance not been stripped, he might have been remembered as someone who "named names" to save his own reputation, but as it happened, most in the scientific community saw him as a martyr to McCarthyism, an eclectic liberal unjustly attacked by warmongering enemies, symbolic of the shift of scientific work from academia into the military. Wernher von Braun told a Congressional committee: "In England, Oppenheimer would have been knighted." In a seminar at The Wilson Center in 2009, based on an extensive analysis of the Vassiliev notebooks taken from the KGB archives, John Earl Haynes, Harvey Klehr and Alexander Vassiliev confirmed that Oppenheimer was never involved in espionage for the Soviet Union, though Soviet intelligence tried repeatedly to recruit him. Further, he had several persons with sympathies to the Soviet Union removed from the Manhattan Project. For their part, Jerrold and Leona Schecter conclude that, based on the Merkulov Letter, Oppenheimer must have been only a "facilitator", not a spy in the strict sense (although he would fall under that legal category in the U.S.). On December 16, 2022, United States Secretary of Energy Jennifer Granholm vacated the 1954 revocation of Oppenheimer's security clearance. Her statement said, "In 1954, the Atomic Energy Commission revoked Dr. Oppenheimer's security clearance through a flawed process that violated the Commission's own regulations. As time has passed, more evidence has come to light of the bias and unfairness of the process that Dr. Oppenheimer was subjected to while the evidence of his loyalty and love of country have only been further affirmed." Granholm's decision has drawn criticism.
Final years The frontiers of science are separated now by long years of study, by specialized vocabularies, arts, techniques, and knowledge from the common heritage even of a most civilized society; and anyone working at the frontier of such science is in that sense a very long way from home, a long way too from the practical arts that were its matrix and origin, as indeed they were of what we today call art. Starting in 1954, Oppenheimer lived for several months of each year on the island of Saint John in the U.S. Virgin Islands. In 1957, he purchased a two-acre (0.8-hectare) tract of land on Gibney Beach, where he built a spartan home. He spent considerable time sailing with his daughter Toni and wife Kitty. Oppenheimer's first public appearance following the stripping of his security clearance was a lecture titled "Prospects in the Arts and Sciences" for the Columbia University Bicentennial radio show Man's Right to Knowledge, in which he outlined his philosophy and his thoughts on the role of science in the modern world. He had been selected for the final episode of the lecture series two years prior to the security hearing, and the university insisted that he remain the speaker even after the controversy. In February 1955, the president of the University of Washington, Henry Schmitz, abruptly canceled an invitation to Oppenheimer to deliver a series of lectures there. Schmitz's decision caused an uproar among the students; 1,200 of them signed a petition protesting the decision, and Schmitz was burned in effigy. While they marched in protest, the state of Washington outlawed the Communist Party and required all government employees to swear a loyalty oath. Edwin Albrecht Uehling, the chairman of the physics department and a colleague of Oppenheimer's from Berkeley, appealed to the university senate, and Schmitz's decision was overturned by a vote of 56 to 40.
Oppenheimer stopped briefly in Seattle to change planes on a trip to Oregon and was joined for coffee during his layover by several University of Washington faculty, but he never lectured there. He gave two lectures on the "Constitution of Matter" at Oregon State University during this trip. Oppenheimer was increasingly concerned about the danger that scientific inventions could pose to humanity. He joined with Albert Einstein, Bertrand Russell, Joseph Rotblat, and other eminent scientists and academics to establish what would eventually, in 1960, become the World Academy of Art and Science. Significantly, after his public humiliation, he did not sign the major open protests against nuclear weapons of the 1950s, including the Russell–Einstein Manifesto of 1955, nor, though invited, did he attend the first Pugwash Conferences on Science and World Affairs in 1957. In his speeches and public writings, Oppenheimer continually stressed the difficulty of managing the power of knowledge in a world in which the freedom of science to exchange ideas was more and more hobbled by political concerns. Oppenheimer delivered the Reith Lectures on the BBC in 1953, which were subsequently published as Science and the Common Understanding. In 1955, Oppenheimer published The Open Mind, a collection of eight lectures that he had given since 1946 on the subject of nuclear weapons and popular culture. Oppenheimer rejected the idea of nuclear gunboat diplomacy. "The purposes of this country in the field of foreign policy", he wrote, "cannot in any real or enduring way be achieved by coercion." In 1957, the philosophy and psychology departments at Harvard invited Oppenheimer to deliver the William James Lectures. An influential group of Harvard alumni led by Edwin Ginn that included Archibald Roosevelt protested the decision. Some 1,200 people attended Oppenheimer's six lectures, "The Hope of Order", in Sanders Theatre.
In 1962, Oppenheimer delivered the Whidden Lectures at McMaster University, which were published in 1964 as The Flying Trapeze: Three Crises for Physicists. Deprived of political influence, Oppenheimer continued to lecture, write, and work on physics. He toured Europe and Japan, giving talks about the history of science, the role of science in society, and the nature of the universe. Oppenheimer was warmly received during his three-week lecture tour in Japan in 1960, just 15 years after the bombings of Hiroshima and Nagasaki. He indicated interest in seeing Hiroshima, but the Japan Committee for Intellectual Interchange, which sponsored the tour, decided it would be best not to stop at Hiroshima or Nagasaki. In 1963 he spoke about the importance of studying the history of science at the dedication of the Niels Bohr Library and Archives of the American Institute of Physics. Oppenheimer continued to visit academic institutions throughout his final years, where he remained a controversial figure to students, faculty, and communities. In November 1955, Oppenheimer became the inaugural week-long visiting fellow at the Phillips Exeter Academy in Exeter, New Hampshire. In September 1957, France made Oppenheimer an Officer of the Legion of Honor, and on May 3, 1962, he was elected a Foreign Member of the Royal Society in Britain. In 1959, then-Senator John F. Kennedy voted to deny Lewis Strauss, Oppenheimer's greatest detractor in his security hearings, confirmation as Secretary of Commerce, effectively ending Strauss's political career. In 1962, Kennedy, by then President of the United States, invited Oppenheimer to a ceremony honoring 49 Nobel Prize winners. At the event, AEC chairman Glenn Seaborg asked Oppenheimer whether he wanted another security hearing. Oppenheimer declined. In March 1963, the General Advisory Committee of the AEC selected Oppenheimer to receive its Enrico Fermi Award, an award Congress had created in 1954. 
Kennedy was assassinated before he could present the award to Oppenheimer, but his successor, Lyndon Johnson, did so in a December 1963 ceremony at which he cited Oppenheimer's "contributions to theoretical physics as a teacher and originator of ideas, [and] leadership of the Los Alamos Laboratory and the atomic energy program during critical years." He called the signing of the award one of Kennedy's greatest acts as president. Oppenheimer told Johnson, "I think it is just possible, Mr. President, that it has taken some charity and some courage for you to make this award today." Kennedy's widow, Jackie, made a point of attending the ceremony so she could tell Oppenheimer how much her husband had wanted him to have the medal. Also present were Teller, who had recommended Oppenheimer receive the award in hopes that it would heal the rift between them, and Henry D. Smyth, who in 1954 had been the lone dissenter from the AEC's 4–1 decision to define Oppenheimer as a security risk. But congressional hostility to Oppenheimer lingered. Senator Bourke B. Hickenlooper formally protested Oppenheimer's selection just eight days after Kennedy was killed, and several Republican members of the House AEC Committee boycotted the ceremony. The rehabilitation represented by the award was symbolic, as Oppenheimer still lacked a security clearance and could have no effect on official policy, but the award came with a $50,000 tax-free stipend. Death In late 1965, Oppenheimer was diagnosed with throat cancer, likely caused by chain-smoking cigarettes for much of his life. After inconclusive surgery, he underwent unsuccessful radiation treatment and chemotherapy late in 1966. On February 18, 1967, he died in his sleep at his home in Princeton, aged 62. A memorial service was held a week later at Alexander Hall on the campus of Princeton University. 
The service was attended by 600 of his scientific, political, and military associates, including Bethe, Groves, Kennan, Lilienthal, Rabi, Smyth, and Wigner. His brother Frank and the rest of his family were there, as were the historian Arthur M. Schlesinger Jr., the novelist John O'Hara, and George Balanchine, the director of the New York City Ballet. Bethe, Kennan and Smyth gave brief eulogies. Oppenheimer's body was cremated and his ashes placed in an urn, which Kitty dropped into the sea within sight of the Saint John beach house. In October 1972, Kitty died from an intestinal infection complicated by a pulmonary embolism. She was 62. Oppenheimer's ranch in New Mexico was then inherited by their son Peter, and the beach property was inherited by their daughter Katherine "Toni" Oppenheimer Silber. Toni's two marriages ended in divorce. She obtained a temporary position as a translator at the United Nations in 1969, but the position required an FBI security clearance, which never came through due to the old charges against her father. She moved to the family beach house on Saint John and committed suicide by hanging there in 1977. She left the property to "the people of Saint John." The house was built too close to the coast and was destroyed by a hurricane. As of 2007, the Virgin Islands government maintained a community center nearby. Legacy When Oppenheimer was stripped of his political influence in 1954, he symbolized for many the folly of scientists who believed they could control the use of their research, and the dilemmas of moral responsibility presented by science in the nuclear age. The hearings were motivated by politics and personal enmities, and reflected a stark divide in the nuclear weapons community. One group passionately feared the Soviet Union as a mortal enemy, and believed having the most powerful weaponry capable of providing the most massive retaliation was the best strategy to combat that threat. 
The other group thought developing the H-bomb would not improve Western security and that using the weapon against large civilian populations would be genocide; they advocated instead a more flexible response to the Soviets involving tactical nuclear weapons, strengthened conventional forces, and arms control agreements. The first of these groups was the more powerful in political terms, and Oppenheimer became its target. Rather than consistently oppose the "Red-baiting" of the late 1940s and early 1950s, Oppenheimer testified against former colleagues and students, before and during his hearing. In one incident, his damning testimony against former student Bernard Peters was selectively leaked to the press. Historians have interpreted this as an attempt by Oppenheimer to please his colleagues in the government and perhaps to divert attention from his own previous left-wing ties and those of his brother. In the end it became a liability: if Oppenheimer had really doubted Peters's loyalty, his recommending him for the Manhattan Project had been reckless, and if he had not, his testimony was contradictory. Popular depictions of Oppenheimer view his security struggles as a confrontation between right-wing militarists (represented by Teller) and left-wing intellectuals (represented by Oppenheimer) over the moral question of weapons of mass destruction. Biographers and historians have often viewed Oppenheimer's story as a tragedy. National security advisor and academic McGeorge Bundy, who worked with Oppenheimer on the State Department Panel of Consultants, wrote: "Quite aside from Oppenheimer's extraordinary rise and fall in prestige and power, his character has fully tragic dimensions in its combination of charm and arrogance, intelligence and blindness, awareness and insensitivity, and perhaps above all daring and fatalism. All these, in different ways, were turned against him in the hearings." 
The question of scientists' responsibility toward humanity inspired Bertolt Brecht's drama Life of Galileo (1955), left its imprint on Friedrich Dürrenmatt's The Physicists, and is the basis of John Adams's 2005 opera Doctor Atomic, which was commissioned to portray Oppenheimer as a modern-day Faust. Heinar Kipphardt's play In the Matter of J. Robert Oppenheimer, after appearing on West German television, had its theatrical premiere in Berlin and Munich in October 1964. The 1967 Finnish television film Oppenheimerin tapaus (The Case of Oppenheimer), produced by the Yleisradio company, was based on the same play. The play premiered in New York in 1968, with Joseph Wiseman as Oppenheimer. New York Times theater critic Clive Barnes called it an "angry play and a partisan play" that sided with Oppenheimer but portrayed him as a "tragic fool and genius." Oppenheimer had difficulty with this portrayal. After reading a transcript of Kipphardt's play soon after it began to be performed, Oppenheimer threatened to sue Kipphardt, decrying "improvisations which were contrary to history and to the nature of the people involved." His objections resulted in an exchange of correspondence with Kipphardt, in which Kipphardt offered to make corrections but defended the play. Later Oppenheimer told an interviewer: The whole damn thing [his security hearing] was a farce, and these people are trying to make a tragedy out of it. ... I had never said that I had regretted participating in a responsible way in the making of the bomb. I said that perhaps he [Kipphardt] had forgotten Guernica, Coventry, Hamburg, Dresden, Dachau, Warsaw, and Tokyo; but I had not, and that if he found it so difficult to understand, he should write a play about something else. Oppenheimer is the subject of many biographies, including American Prometheus (2005) by Kai Bird and Martin J. Sherwin, which won the 2006 Pulitzer Prize for Biography or Autobiography. 
The 1980 BBC TV serial Oppenheimer, starring Sam Waterston, won three BAFTA Television Awards. The Day After Trinity, a 1980 documentary about Oppenheimer and the atomic bomb, was nominated for an Academy Award and received a Peabody Award. Oppenheimer's life is explored in Tom Morton-Smith's 2015 play Oppenheimer, and in the 1989 film Fat Man and Little Boy, in which he was portrayed by Dwight Schultz. Also in 1989, David Strathairn played Oppenheimer in the TV film Day One. In the 2023 American film Oppenheimer, directed by Christopher Nolan and based on American Prometheus, Oppenheimer is portrayed by Cillian Murphy. The film won the Academy Award for Best Picture, and Murphy won for Best Actor. A centennial conference about Oppenheimer's legacy was held in 2004 at the University of California, Berkeley, alongside a digital exhibition on his life, with the conference proceedings published in 2005 as Reappraising Oppenheimer: Centennial Studies and Reflections. His papers are in the Library of Congress. As a scientist, Oppenheimer was remembered by his students and colleagues as a brilliant researcher and engaging teacher who founded modern theoretical physics in the United States. "More than any other man", Bethe wrote, "he was responsible for raising American theoretical physics from a provincial adjunct of Europe to world leadership." Because his scientific attentions often changed rapidly, he never worked on any one topic long enough to carry it to fruition and merit the Nobel Prize, though his investigations contributing to the theory of black holes might have warranted the prize had he lived long enough to see them brought to fruition by later astrophysicists. An asteroid, 67085 Oppenheimer, was named in his honor on January 4, 2000, as was the lunar crater Oppenheimer in 1970. 
As a military and public policy advisor, Oppenheimer was a leader in the shift toward technocracy in the interactions between science and the military, and in the emergence of "big science". During World War II, scientists became involved in military research to an unprecedented degree. Because of the threat fascism posed to Western civilization, they volunteered in great numbers to lend technological and organizational assistance to the Allied effort, resulting in powerful tools such as radar, the proximity fuze, and operations research. As a cultured, intellectual, theoretical physicist who became a disciplined military organizer, Oppenheimer represented the shift away from the idea that scientists had their "heads in the clouds" and that knowledge of esoteric subjects like the composition of the atomic nucleus had no "real-world" applications. Two days before the Trinity test, Oppenheimer expressed his hopes and fears in a quotation from Bhartṛhari's Śatakatraya: In battle, in the forest, at the precipice in the mountains, On the dark great sea, in the midst of javelins and arrows, In sleep, in confusion, in the depths of shame, The good deeds a man has done before defend him. Publications Notes References Sources Further reading External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Isabel_Kershner] | [TOKENS: 253] |
Contents Isabel Kershner Isabel Kershner is a British-born Israeli journalist and author, who began reporting from Jerusalem for The New York Times in 2007. Kershner had previously worked as senior Middle East editor for The Jerusalem Report magazine. She has also written for The New Republic and has provided commentary on Middle East affairs on BBC Radio and elsewhere. Her latest book is "The Land of Hope and Fear: Israel's Battle for its Inner Soul", published in 2023. Career Kershner was born in Manchester, England. She completed a degree in Oriental Studies at the University of Oxford. In April 1992, she married author Hirsh Goodman, a fellow immigrant to Israel; the couple have two children, Gavriel and Lev. Kershner speaks Hebrew and Arabic. Criticism In her role reporting on Israeli-Palestinian issues, she has been accused of conflict of interest, as her son has served in the Israel Defense Forces, and her husband is an employee of the Institute for National Security Studies, which is involved in promoting a positive image of Israel, and which Kershner often relies on as a source. Bibliography Notes |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Category:All_Wikipedia_articles_needing_clarification] | [TOKENS: 133] |
Category:All Wikipedia articles needing clarification This is a category to help keep count of the total number of articles which need clarification indicated with one of the following templates: {{Ambiguous}}, {{Clarify}}, {{Clarify span}}, {{Confusing}}, {{Definition}}, {{Definition span}}, {{Sentence fragment}}, {{Unclear date}}, {{Vague}} or {{Why}}. They should all be in one of the dated categories. See also Category:Wikipedia articles needing clarification. Pages in category "All Wikipedia articles needing clarification" The following 200 pages are in this category, out of approximately 6,017 total. This list may not reflect recent changes. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Harvard_Computers] | [TOKENS: 3648] |
Contents Harvard Computers The Harvard Computers were a team of women who processed astronomical data as skilled workers at the Harvard College Observatory in Cambridge, Massachusetts, United States. The team was directed by Edward Charles Pickering (from 1877 to 1919) and, following his death in 1919, by Annie Jump Cannon. The women were challenged to make sense of the patterns recorded in stellar spectra by devising a scheme for sorting the stars into categories. Annie Jump Cannon's success at this activity made her famous in her own lifetime, and she produced a stellar classification system that is still in use today. Antonia Maury discerned in the spectra a way to assess the relative sizes of stars, and Henrietta Leavitt showed how the cyclic changes of certain variable stars could serve as distance markers in space. Other computers on the team included Mary Anna Draper, Williamina Fleming, Anna Winlock, and Florence Cushman. Although these women started primarily as calculators, they made significant contributions to astronomy, many of which they published in research articles. History In the 19th century, the Harvard College Observatory faced the challenge of working through an overwhelming amount of astronomical data due to improvements in photographic technology. Harvard Observatory's director, Edward Charles Pickering, hired a group of women to analyze the astronomical data recorded on the growing collection of plate negatives. While Pickering was the director of the Harvard Observatory, he hired over eighty women. These women were known as computers. Although Pickering believed that gathering data at astronomical observatories was not the most appropriate work for women, it seems that several factors contributed to his decision to hire women instead of men. Among them was that men were paid much more than women, so he could employ more staff with the same budget. This was relevant in a time when the amount of astronomical data was surpassing the capacity of the observatories to process it. 
Although some of Pickering's female staff were astronomy graduates, their wages were similar to those of unskilled workers. They usually earned between 25 and 50 cents per hour (between $8 and $16 in 2025), more than a factory worker but less than a clerical one. Most of the women depended financially on their friends and family members and lived with coworkers to combat the low wages. Although the wages Pickering provided were low, it was common to pay women less than men during the 20th century, and this does not discount his advocacy for women in astronomy. In describing the dedication and efficiency with which the Harvard Computers, including Cushman, undertook this effort, Edward Pickering said, "a loss of one minute in the reduction of each estimate would delay the publication of the entire work by the equivalent of the time of one assistant for two years." Another reason why Pickering decided to hire women over men was that he thought allowing women to conduct astronomical research would show the general public that women were capable of higher thinking and worthy of higher education. The first female computer to be hired at the Harvard Observatory was Anna Winlock. Pickering's first hire was Williamina Fleming six years later in 1881. Together, Fleming and Pickering continued to hire female computers into the twentieth century. At times women offered to work at the observatory for free in order to gain experience in a field that was difficult to get into. The computer position was one of the lower-class positions at the observatory due to the pay and little chance for promotion. Under the Henry Draper Memorial project, the women were often tasked with measuring the brightness, position, and color of stars. The goal of the project was to photograph the stars and classify their spectra. Their work was often segregated from men, so teams of male astronomers would take photographs of the stars in the evening and send them to the women at Harvard for analysis. 
The work included such tasks as classifying stars by calculating their exact positions and movements, predicting the return of comets, comparing the photographs to known catalogs, and reducing the photographs while accounting for things like atmospheric refraction, parallax, and error in various instruments in order to render the clearest possible image. While the work was repetitive, it still required attention and accuracy. Fleming herself described the work as "so nearly alike that there will be little to describe outside ordinary routine work of measurement, examination of photographs, and of work involved in the reduction of these observations". The work would not have been possible without photographic plate technology. With such technology, dry, color-sensitive plates are used to capture photovisual and photo-red magnitudes. The dry plates allowed for longer exposure over longer time intervals, increasing the accuracy of the photographs and the range of stars capable of being photographed. The plate technology allowed the women to classify stars more accurately than before. The observatory, with the help of the computers, made several breakthroughs in classifying and cataloging the stars. One such accomplishment was the Henry Draper Catalogue. Following the death of Henry Draper (1882), Mary Anna Palmer Draper funded the Henry Draper Memorial at the observatory. The work on the catalogue was led by Williamina Fleming. Following the initial classifications done by Fleming (1890), Antonia Maury helped place stars in their correct positions and did further research on the spectra of the stars with Pickering (1901). Henrietta Leavitt discovered a relationship between a Cepheid variable's brightness and its pulsation period (1908). Annie Jump Cannon and her team classified an average of 5,000 stars per month from the years 1912–1915. Florence Cushman helped organize and process the data. The catalog was published between 1918 and 1924. 
Following the death of Pickering (1919), Cannon took control of the projects. An extension to the original works was published between 1925 and 1936, where over 46,850 stars were classified. In the later years of the program, following the publication of the catalog, several women joined and continued to make contributions. Margaret Walton Mayall contributed to the classification of stellar spectra. She later went on to lead the American Association of Variable Star Observers. Helen Sawyer Hogg specialized in cataloging variable stars within globular clusters. Her work helped lay the foundation for understanding stellar evolution and the structure of the universe. Cecilia Payne-Gaposchkin proved that stars are composed primarily of hydrogen and helium. Muriel Mussells Seyfert discovered three new ring nebulae on photographic plates, expanding the catalog of known planetary nebulae. Notable members Mary Anna Palmer Draper (1839–1914) was an American astronomer who helped found the Mount Wilson Observatory. Draper was the widow of Dr. Henry Draper, an astronomer who died before completing his work on the chemical composition of stars. She was very involved in her husband's work and wanted to finish his classification of stars after he died. Mary Draper quickly realized the task facing her was far too daunting for one person. She had received correspondence from Mr. Pickering, a close friend of hers and her husband's. Pickering offered to help finish her husband's work, and encouraged her to publish his findings up to the time of his death. Draper agreed to give Pickering the plates her husband had been working on, but took them to Harvard University herself since the plates were very small. While at the university, Draper met the Harvard Observatory's current computers and was able to observe some of the observatory's current projects. 
After much deliberation, Draper decided in 1886 to donate money and a telescope of her husband's to the Harvard Observatory in order to photograph the spectra of stars. She had decided this would be the best way to continue her husband's work and secure his legacy in astronomy. She was very insistent on funding the memorial project with her own inheritance, as it would carry on her husband's legacy. She was a dedicated follower of the observatory and a great friend of Pickering's. In 1900, she funded an expedition to see the total solar eclipse occurring that year. Williamina Fleming (1857–1911) was a Scottish immigrant astronomer who helped with the photographic classification of stellar spectra. Fleming had no prior relation to Harvard, as she was a Scottish immigrant working as Pickering's housemaid. Her first assignment was to improve an existing catalog of stellar spectra, which later led to her appointment as head of the Henry Draper Catalogue project. Fleming went on to help develop a classification of stars based on their hydrogen content, as well as play a major role in discovering the strange nature of white dwarf stars. Williamina continued her career in astronomy when she was appointed Harvard's Curator of Astronomical Photographs in 1899, also known as Curator of the Photographic Plates. At the age of 42, Fleming became the first woman at the observatory to hold a title of such nature. She remained the only woman curator until the 1950s. Her work also led to her becoming the first female American citizen to be elected to the Royal Astronomical Society in 1907. Throughout her career, Fleming was able to classify 10,000 spectra and found over 50 nebulae and over 300 stars. Fleming never retired from the observatory; she died of pneumonia at age 54. Antonia Maury (1866–1952) was an American astronomer who worked on calculating the orbit of a spectroscopic binary. 
Maury was the niece of Henry Draper, and after recommendation from Mrs. Draper, was hired as a computer at the age of 22. She was a graduate of Vassar College with honors in physics, astronomy, and philosophy. Pickering was uncomfortable paying the average computer salary to someone with Antonia Maury's achievements, but ultimately ended up hiring her. Maury was first tasked with the spectral measurement of some of the brightest stars. Pickering then tasked Maury with reclassifying some of the stars after the publication of the Henry Draper Catalog. In 1889, Maury studied images of Mizar and found that it was actually two stars, based on a doubling of its K line that became visible every few weeks. Antonia took it upon herself to improve and redesign the system of classification, which was later adopted by the International Astronomical Union. Maury left the observatory in 1891 to begin teaching at the Gilman School in Cambridge, Massachusetts. Later, Maury would return to the observatory in 1893 and 1895 to publish many of her observations of stellar spectra. Her work was finished with the help of Pickering and the computing staff and was published in 1897. Maury would return to Harvard College Observatory in 1918 as an adjunct professor. During this time, Maury's work began to be published under her own name due in part to the director Harlow Shapley. She would remain at the observatory until she retired in 1948. Anna Winlock (1857–1904) was an American astronomer who helped catalog stars for the Henry Draper Catalogue. Some of the first women who were hired to work as computers had familial connections to the Harvard Observatory's male staff. For instance, Winlock, one of the first of the Harvard Computers, was the daughter of Joseph Winlock, the third director of the observatory and Pickering's immediate predecessor. Anna Winlock joined the observatory in 1875 to assist in supporting her family after her father's unexpected passing. 
She tackled her father's unfinished data analysis, performing the arduous work of mathematically reducing meridian circle observations, which rescued a decade's worth of numbers that had been left in a useless state. Winlock also worked on a stellar cataloging section called the "Cambridge Zone". Working over twenty years on the project, the work done by her team on the Cambridge Zone contributed significantly to the Astronomische Gesellschaft Katalog, which contains information on more than one hundred thousand stars and is used worldwide by many observatories and their researchers. Within a year of Anna Winlock's hiring, three other women joined the staff: Selina Bond, Rhoda Sauders, and a third, who was likely a relative of an assistant astronomer. In 1886, Anna's younger sister, Louisa Winlock, joined her in the computing room. Annie Jump Cannon (1863–1941) was an American astronomer who made a catalog of the stars, classifying and recording them. Following the death of Pickering in 1919, she took charge of the observatory's classification work. Pickering hired Cannon, a graduate of Wellesley College, to classify the southern stars. While at Wellesley, she took astronomy courses from one of Pickering's star students, Sarah Frances Whiting. She became the first female assistant to study variable stars at night. She studied the light curves of variable stars, which could help suggest the type and causation of variation. Cannon, adding to work done by fellow computer Antonia Maury, greatly simplified Pickering and Fleming's temperature-based star classification system, and in 1922 the International Astronomical Union adopted Cannon's system as the official classification system for stars. During Pickering's 42-year tenure at the Harvard Observatory, which ended only a year before he died, in 1919, he received many awards, including the Bruce Medal, the Astronomical Society of the Pacific's highest honor. Craters on the moon and on Mars are named after him. 
And Annie Jump Cannon’s enduring achievement was dubbed the Harvard—not the Cannon—system of spectral classification. Cannon's Harvard Classification Scheme is the basis of the today's familiar O B A F G K M system. She also categorized the variable stars into tables so they could be identified and compared more easily. These systems connect the color of stars to their temperature. According to Rebecca Dinerstein Knight, Cannon was able to work at a pace of classifying the spectra of 300 stars an hour and therefore was able to classify over 350,000 stars in her lifetime. Cannon was the first female scientist to be recognized for many awards and titles in her field of study. She was the first woman to receive an honorary doctorate from the University of Oxford and the Henry Draper Medal from the National Academy of Sciences, and the first female officer in the American Astronomical Society. Cannon went on to establish her own Annie Jump Cannon Award for women in postdoctoral work. In 1934, Cannon awarded the first Annie Jump Cannon Award to Cecilia Payne-Gaposchkin for her contributions in analyzing stars and the stellar spectrum. The award was given out at an American Astronomical Society meeting, and for winning, Cannon awarded Gaposchkin $50 and a gold pin. Henrietta Swan Leavitt (1868-1921) was an American astronomer who worked to measure the distances between galaxies and determine the scale of modeling. Leavitt arrived at the observatory in 1893. She had experience through her college studies, traveling abroad, and teaching. In academia, Leavitt excelled in mathematics courses at Cambridge. When she began working at the observatory she was tasked with measuring star brightness through photometry. She found hundreds of new variable stars after starting to analyze the Great Nebula in Orion and her work was expanded to study the variables of the entire sky with Annie Jump Cannon and Evelyn Leland. 
With skills gained in photometry, Leavitt compared stars in different exposures. Studying Cepheid variables in the Small Magellanic Cloud, she discovered that their apparent brightness was dependent on their period. Since all those stars were approximately the same distance from Earth, that meant their absolute brightness must depend on their period as well, allowing the use of Cepheid variables as a standard candle for determining cosmic distances. That, in turn, led directly to the modern understanding of the true size of the universe, and Cepheid variables are still an essential rung in the cosmic distance ladder. Pickering published her work with his name as co-author. The legacy she left allowed future scientists to make further discoveries in space. Astronomer Edwin Hubble used Leavitt's method to calculate the distance to the Andromeda Galaxy, the nearest large spiral galaxy to Earth. This led to the realization that there are even more galaxies than previously thought. Florence Cushman (1860–1940) was an American astronomer at the Harvard College Observatory who worked on the Henry Draper Catalogue. Cushman was born in Boston, Massachusetts in 1860 and received her early education at Charlestown High School, where she graduated in 1877. In 1888, she began work at the Harvard College Observatory as an employee of Edward Pickering. Her classifications of stellar spectra contributed to the Henry Draper Catalogue between 1918 and 1934. She stayed as an astronomer at the Observatory until 1937 and died in 1940 at the age of 80. Cushman worked at the Harvard College Observatory from 1888 to 1937. Over the course of her nearly fifty-year career, she employed the objective prism method to analyze, classify, and catalog the optical spectra of hundreds of thousands of stars. In the 19th century, the photographic revolution enabled more detailed analysis of the night sky than had been possible with solely eye-based observations. 
To obtain optical spectra for measurement, male astronomers at the Harvard College Observatory exposed glass plates, capturing astronomical images at night. During the daytime, female assistants like Cushman analyzed the resultant spectra by reducing values, computing magnitudes, and cataloging their findings. She is credited with determining the positions and magnitudes of the stars listed in the 1918 edition of the Henry Draper Catalogue, which featured the spectra of roughly 222,000 stars.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Israel_Atomic_Energy_Commission] | [TOKENS: 245] |
Israel Atomic Energy Commission The Israel Atomic Energy Commission (IAEC; Hebrew: הוועדה לאנרגיה אטומית) is the governmental authority responsible for the State of Israel's activities in the nuclear field. History The establishment of the Israel Atomic Energy Commission was announced on 13 June 1952 by Prime Minister David Ben-Gurion. The prime minister appointed Professor Ernst David Bergmann to be its first director-general. Initially the commission was housed in temporary structures near Rehovot; it is now located in Ramat Aviv. It oversaw the establishment of the Soreq Nuclear Research Center, whose construction started in 1958, and the Negev Nuclear Research Center, whose construction began in late 1959. Functions The IAEC advises the government of Israel in areas of nuclear policy and in setting priorities in nuclear research and development. The commission implements governmental policies and represents Israel in international organizations in the nuclear field, such as the International Atomic Energy Agency. The IAEC maintains relationships with relevant national authorities of other countries.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Python_(programming_language)#cite_note-AutoNT-74-155] | [TOKENS: 4314] |
Python (programming language) Python is a high-level, general-purpose programming language. Its design philosophy emphasizes code readability with the use of significant indentation. Python is dynamically type-checked and garbage-collected. It supports multiple programming paradigms, including structured (particularly procedural), object-oriented and functional programming. Guido van Rossum began working on Python in the late 1980s as a successor to the ABC programming language. Python 3.0, released in 2008, was a major revision and not completely backward-compatible with earlier versions. Beginning with Python 3.5, capabilities and keywords for typing were added to the language, allowing optional static typing. As of 2026[update], the Python Software Foundation supports Python 3.10, 3.11, 3.12, 3.13, and 3.14, following the project's annual release cycle and five-year support policy. Python 3.15 is currently in the alpha development phase, and the stable release is expected to come out in October 2026. Earlier versions in the 3.x series have reached end-of-life and no longer receive security updates. Python has gained widespread use in the machine learning community. It is widely taught as an introductory programming language. Since 2003, Python has consistently ranked in the top ten of the most popular programming languages in the TIOBE Programming Community Index, which ranks languages based on searches across 24 platforms. History Python was conceived in the late 1980s by Guido van Rossum at Centrum Wiskunde & Informatica (CWI) in the Netherlands. It was designed as a successor to the ABC programming language, which was inspired by SETL, capable of exception handling and interfacing with the Amoeba operating system. Python implementation began in December 1989. Van Rossum first released it in 1991 as Python 0.9.0.
Van Rossum assumed sole responsibility for the project, as the lead developer, until 12 July 2018, when he announced his "permanent vacation" from responsibilities as Python's "benevolent dictator for life" (BDFL); this title was bestowed on him by the Python community to reflect his long-term commitment as the project's chief decision-maker. (He has since come out of retirement and is self-titled "BDFL-emeritus".) In January 2019, active Python core developers elected a five-member Steering Council to lead the project. The name Python derives from the British comedy series Monty Python's Flying Circus. (See § Naming.) Python 2.0 was released on 16 October 2000, introducing new features such as list comprehensions, cycle-detecting garbage collection, reference counting, and Unicode support. Python 2.7's end-of-life was initially set for 2015, and then postponed to 2020 out of concern that a large body of existing code could not easily be forward-ported to Python 3. It no longer receives security patches or updates. While Python 2.7 and older versions are officially unsupported, a different unofficial Python implementation, PyPy, continues to support Python 2, i.e., "2.7.18+" (plus 3.11), with the plus signifying (at least some) "backported security updates". Python 3.0 was released on 3 December 2008, and was a major revision not completely backward-compatible with earlier versions, with some new semantics and changed syntax. Python 2.7.18, released in 2020, was the last release of Python 2. Several releases in the Python 3.x series have added new syntax to the language, and made a few (considered very minor) backward-incompatible changes. As of January 2026[update], Python 3.14.3 is the latest stable release. Older 3.x branches received final security updates as they reached end-of-life; Python 3.9.25 was the last release in the 3.9 series. Python 3.10 has been, since November 2025, the oldest supported branch.
Python 3.15 has had an alpha release, and an official downloadable Python 3.14 executable is available for Android. Releases receive two years of full support followed by three years of security support. Design philosophy and features Python is a multi-paradigm programming language. Object-oriented programming and structured programming are fully supported, and many of their features support functional programming and aspect-oriented programming – including metaprogramming and metaobjects. Many other paradigms are supported via extensions, including design by contract and logic programming. Python is often referred to as a 'glue language' because it is purposely designed to be able to integrate components written in other languages. Python uses dynamic typing and a combination of reference counting and a cycle-detecting garbage collector for memory management. It uses dynamic name resolution (late binding), which binds method and variable names during program execution. Python's design offers some support for functional programming in the "Lisp tradition". It has filter, map, and reduce functions; list comprehensions, dictionaries, sets, and generator expressions. The standard library has two modules (itertools and functools) that implement functional tools borrowed from Haskell and Standard ML. Python's core philosophy is summarized in the Zen of Python (PEP 20) written by Tim Peters, which includes aphorisms such as "Beautiful is better than ugly", "Explicit is better than implicit", "Simple is better than complex", and "Readability counts". However, Python has received criticism for violating these principles and adding unnecessary language bloat. Responses to these criticisms note that the Zen of Python is a guideline rather than a rule. The addition of some new features had been controversial: Guido van Rossum resigned as Benevolent Dictator for Life after conflict about adding the assignment expression operator in Python 3.8. Nevertheless, rather than building all functionality into its core, Python was designed to be highly extensible via modules.
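The functional-programming facilities mentioned above (filter, map, reduce, comprehensions, and the itertools and functools modules) can be illustrated with a short sketch; this example is illustrative and not part of the original article:

```python
from functools import reduce
from itertools import count, islice

nums = [1, 2, 3, 4, 5]

# map and filter return lazy iterators in Python 3; wrap in list() to realize them
squares = list(map(lambda n: n * n, nums))        # [1, 4, 9, 16, 25]
evens = list(filter(lambda n: n % 2 == 0, nums))  # [2, 4]

# reduce was moved into functools in Python 3
total = reduce(lambda a, b: a + b, nums)          # 15

# the same results, written as a list comprehension and a generator expression
squares_lc = [n * n for n in nums]
total_gen = sum(n for n in nums)

# itertools supports lazy, potentially infinite sequences
first_four_odds = list(islice(count(1, 2), 4))    # [1, 3, 5, 7]
```

Note that map and filter produce lazy iterators rather than lists, in keeping with the generator-based style the article describes.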
Python's compact modularity has made it particularly popular as a means of adding programmable interfaces to existing applications. Van Rossum's vision of a small core language with a large standard library and easily extensible interpreter stemmed from his frustrations with ABC, which represented the opposite approach. Python strives for a simpler, less-cluttered syntax and grammar, while giving developers a choice in their coding methodology. Python lacks do..while loops, which Van Rossum considered harmful. In contrast to Perl's motto "there is more than one way to do it", Python advocates an approach where "there should be one – and preferably only one – obvious way to do it". In practice, however, Python provides many ways to achieve a given goal. There are at least three ways to format a string literal, with no certainty as to which one a programmer should use. Alex Martelli, a Fellow of the Python Software Foundation and author of Python books, wrote that "To describe something as 'clever' is not considered a compliment in the Python culture." Python's developers typically prioritize readability over performance. For example, they reject patches to non-critical parts of the CPython reference implementation that would offer increases in speed that do not justify the cost of clarity and readability.[failed verification] Execution speed can be improved by moving speed-critical functions to extension modules written in languages such as C, or by using a just-in-time compiler like PyPy. It is also possible to transpile to other languages, but this approach either fails to achieve the expected speed-up, since Python is a very dynamic language, or compiles only a restricted subset of Python (with potential minor semantic changes). Python is meant to be a fun language to use. This goal is reflected in the name – a tribute to the British comedy group Monty Python – and in playful approaches to some tutorials and reference materials.
For instance, some code examples use the terms "spam" and "eggs" (in reference to a Monty Python sketch), rather than the typical terms "foo" and "bar". A common neologism in the Python community is pythonic, which has a broad range of meanings related to program style: Pythonic code may use Python idioms well; be natural or show fluency in the language; or conform with Python's minimalist philosophy and emphasis on readability. Syntax and semantics Python is meant to be an easily readable language. Its formatting is visually uncluttered and often uses English keywords where other languages use punctuation. Unlike many other languages, it does not use curly brackets to delimit blocks, and semicolons after statements are allowed but rarely used. It has fewer syntactic exceptions and special cases than C or Pascal. Python uses whitespace indentation, rather than curly brackets or keywords, to delimit blocks. An increase in indentation comes after certain statements; a decrease in indentation signifies the end of the current block. Thus, the program's visual structure accurately represents its semantic structure. This feature is sometimes termed the off-side rule. Some other languages use indentation this way; but in most, indentation has no semantic meaning. The recommended indent size is four spaces. Python's statements include the following: The assignment statement (=) binds a name as a reference to a separate, dynamically allocated object. Variables may subsequently be rebound at any time to any object. In Python, a variable name is a generic reference holder without a fixed data type; however, it always refers to some object with a type. This is called dynamic typing—in contrast to statically-typed languages, where each variable may contain only a value of a certain type. Python does not support tail call optimization or first-class continuations; according to Van Rossum, the language never will. 
However, better support for coroutine-like functionality is provided by extending Python's generators. Before 2.5, generators were lazy iterators; data was passed unidirectionally out of the generator. From Python 2.5 on, it is possible to pass data back into a generator function; and from version 3.3, data can be passed through multiple stack levels. Python's expressions include the following: In Python, a distinction between expressions and statements is rigidly enforced, in contrast to languages such as Common Lisp, Scheme, or Ruby. This distinction leads to duplicating some functionality, for example, list comprehensions versus for-loops. A statement cannot be part of an expression; because of this restriction, expressions such as list and dict comprehensions (and lambda expressions) cannot contain statements. As a particular case, an assignment statement such as a = 1 cannot be part of the conditional expression of a conditional statement. Python uses duck typing, and it has typed objects but untyped variable names. Type constraints are not checked at definition time; rather, operations on an object may fail at usage time, indicating that the object is not of an appropriate type. Despite being dynamically typed, Python is strongly typed, forbidding operations that are poorly defined (e.g., adding a number and a string) rather than quietly attempting to interpret them. Python allows programmers to define their own types using classes, most often for object-oriented programming. New instances of classes are constructed by calling the class, for example, SpamClass() or EggsClass(); the classes are instances of the metaclass type (which is an instance of itself), thereby allowing metaprogramming and reflection. Before version 3.0, Python had two kinds of classes, both using the same syntax: old-style and new-style. Current Python versions support the semantics of only the new style. Python supports optional type annotations.
These annotations are not enforced by the language, but may be used by external tools such as mypy to catch errors. Python includes a module, typing, that provides several type names for use in annotations. Also, mypy supports a Python compiler called mypyc, which leverages type annotations for optimization. Python includes conventional symbols for arithmetic operators (+, -, *, /), the floor-division operator //, and the modulo operator %. (With the modulo operator, a remainder can be negative, e.g., 4 % -3 == -2.) Python also offers the ** symbol for exponentiation, e.g. 5**3 == 125 and 9**0.5 == 3.0, as well as the matrix-multiplication operator @. These operators work as in traditional mathematics; with the same precedence rules, the infix operators + and - can also be unary, to represent positive and negative numbers respectively. Division between integers produces floating-point results. The behavior of division has changed significantly over time: In Python terms, the / operator represents true division (or simply division), while the // operator represents floor division. Before version 3.0, the / operator represented classic division. Rounding towards negative infinity, though a different method than in most languages, adds consistency to Python. For instance, this rounding implies that the equation (a + b)//b == a//b + 1 is always true. Also, the rounding implies that the equation b*(a//b) + a%b == a is valid for both positive and negative values of a. As expected, the result of a%b lies in the half-open interval [0, b), where b is a positive integer; however, maintaining the validity of the equation requires that the result lie in the interval (b, 0] when b is negative. Python provides a round function for rounding a float to the nearest integer. For tie-breaking, Python 3 uses the round-to-even method: round(1.5) and round(2.5) both produce 2.
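The division, modulo, exponentiation, and rounding behaviors described above can be checked directly; this snippet is illustrative, not part of the original article:

```python
# True division always yields a float; floor division rounds toward negative infinity
assert 7 / 2 == 3.5
assert 7 // 2 == 3
assert -7 // 2 == -4          # floor(-3.5) is -4, not -3

# The remainder takes the sign of the divisor
assert 4 % 3 == 1
assert 4 % -3 == -2

# Exponentiation, including fractional exponents
assert 5 ** 3 == 125
assert 9 ** 0.5 == 3.0

# The invariant b*(a//b) + a%b == a holds for positive and negative operands
for a in (-7, -1, 1, 7):
    for b in (-3, 3):
        assert b * (a // b) + a % b == a

# Python 3 breaks ties by rounding to the nearest even integer
assert round(1.5) == 2 and round(2.5) == 2
```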
Python versions before 3 used the round-away-from-zero method: round(0.5) is 1.0, and round(-0.5) is −1.0. Python allows Boolean expressions that contain multiple equality relations to be consistent with general usage in mathematics. For example, the expression a < b < c tests whether a is less than b and b is less than c. C-derived languages interpret this expression differently: in C, the expression would first evaluate a < b, resulting in 0 or 1, and that result would then be compared with c. Python uses arbitrary-precision arithmetic for all integer operations. The Decimal type/class in the decimal module provides decimal floating-point numbers to a pre-defined arbitrary precision with several rounding modes. The Fraction class in the fractions module provides arbitrary precision for rational numbers. Due to Python's extensive mathematics library and the third-party library NumPy, the language is frequently used for scientific scripting in tasks such as numerical data processing and manipulation. Functions are created in Python by using the def keyword. A function is defined similarly to how it is called, by first providing the function name and then the required parameters. Here is an example of a function that prints its inputs: To assign a default value to a function parameter in case no actual value is provided at run time, variable-definition syntax can be used inside the function header. Code examples "Hello, World!" program: Program to calculate the factorial of a non-negative integer: Libraries Python's large standard library is commonly cited as one of its greatest strengths. For Internet-facing applications, many standard formats and protocols such as MIME and HTTP are supported. The language includes modules for creating graphical user interfaces, connecting to relational databases, generating pseudorandom numbers, arithmetic with arbitrary-precision decimals, manipulating regular expressions, and unit testing. 
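The code examples referenced above (a function that prints its inputs, a "Hello, World!" program, and a factorial program) did not survive extraction; minimal versions consistent with the surrounding text might look like this:

```python
# A function that prints its inputs; `b` demonstrates a default parameter value
def show(a, b="default"):
    print(a, b)

# "Hello, World!" program
print("Hello, World!")

# Factorial of a non-negative integer
def factorial(n):
    if n < 0:
        raise ValueError("n must be non-negative")
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

# Chained comparison and arbitrary-precision integers, as described earlier
assert 0 < 1 < 2
assert factorial(25) == 15511210043330985984000000  # exceeds the 64-bit range
```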
Some parts of the standard library are covered by specifications—for example, the Web Server Gateway Interface (WSGI) implementation wsgiref follows PEP 333—but most parts are specified by their code, internal documentation, and test suites. However, because most of the standard library is cross-platform Python code, only a few modules must be altered or rewritten for variant implementations. As of 13 March 2025,[update] the Python Package Index (PyPI), the official repository for third-party Python software, contains over 614,339 packages. Development environments Most[which?] Python implementations (including CPython) include a read–eval–print loop (REPL); this permits the environment to function as a command line interpreter, with which users enter statements sequentially and receive results immediately. Also, CPython is bundled with an integrated development environment (IDE) called IDLE, which is oriented toward beginners.[citation needed] Other shells, including IDLE and IPython, add additional capabilities such as improved auto-completion, session-state retention, and syntax highlighting. Standard desktop IDEs include PyCharm, Spyder, and Visual Studio Code; there are web browser-based IDEs, such as the following environments: Implementations CPython is the reference implementation of Python. This implementation is written in C, meeting the C11 standard since version 3.11. Older versions use the C89 standard with several select C99 features, but third-party extensions are not limited to older C versions—e.g., they can be implemented using C11 or C++. CPython compiles Python programs into an intermediate bytecode, which is then executed by a virtual machine. CPython is distributed with a large standard library written in a combination of C and native Python. CPython is available for many platforms, including Windows and most modern Unix-like systems, including macOS (and Apple M1 Macs, since Python 3.9.1, using an experimental installer). 
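CPython's compilation of source code to intermediate bytecode, mentioned above, can be observed with the standard dis module; this is an illustrative sketch, and the exact opcodes vary between CPython versions:

```python
import dis

def add(a, b):
    return a + b

# Every function object carries its compiled bytecode
bytecode = add.__code__.co_code
assert isinstance(bytecode, bytes)

# dis decodes the raw bytecode into named instructions
ops = [ins.opname for ins in dis.get_instructions(add)]
print(ops)  # opcode names such as LOAD_FAST; the exact list depends on the version
```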
Starting with Python 3.9, the Python installer intentionally fails to install on Windows 7 and 8; Windows XP was supported until Python 3.5, and there was unofficial support for platforms such as VMS. Platform portability was one of Python's earliest priorities. During development of Python 1 and 2, even OS/2 and Solaris were supported; since that time, support has been dropped for many platforms. All current Python versions (since 3.7) support only operating systems that feature multithreading; Python now supports far fewer operating systems than in the past, many outdated platforms having been dropped. All alternative implementations have at least slightly different semantics. For example, an alternative may include unordered dictionaries, in contrast to other current Python versions. As another example in the larger Python ecosystem, PyPy does not support the full CPython C API. Creating an executable with Python is often done by bundling an entire Python interpreter into the executable, which makes binaries very large even for small programs, yet there exist implementations that are capable of truly compiling Python. Alternative implementations include the following: Stackless Python is a significant fork of CPython that implements microthreads. This implementation uses the call stack differently, thus allowing massively concurrent programs. PyPy also offers a stackless version. Just-in-time Python compilers have been developed, but are now unsupported: There are several compilers/transpilers to high-level object languages; the source language is unrestricted Python, a subset of Python, or a language similar to Python: There are also specialized compilers: Some older projects existed, as well as compilers not designed for use with Python 3.x and related syntax: A performance comparison among various Python implementations, using a non-numerical (combinatorial) workload, was presented at EuroSciPy '13.
In addition, Python's performance relative to other programming languages is benchmarked by The Computer Language Benchmarks Game. There are several approaches to optimizing Python performance, despite the inherent slowness of an interpreted language. These approaches include the following strategies or tools: Language Development Python's development is conducted mostly through the Python Enhancement Proposal (PEP) process; this process is the primary mechanism for proposing major new features, collecting community input on issues, and documenting Python design decisions. Python coding style is covered in PEP 8. Outstanding PEPs are reviewed and commented on by the Python community and the steering council. Enhancement of the language corresponds with development of the CPython reference implementation. The mailing list python-dev is the primary forum for the language's development. Specific issues were originally discussed in the Roundup bug tracker hosted by the foundation. In 2022, all issues and discussions were migrated to GitHub. Development originally took place on a self-hosted source-code repository running Mercurial, until Python moved to GitHub in January 2017. CPython's public releases have three types, distinguished by which part of the version number is incremented: Many alpha, beta, and release-candidates are also released as previews and for testing before final releases. Although there is a rough schedule for releases, they are often delayed if the code is not ready yet. Python's development team monitors the state of the code by running a large unit test suite during development. The major academic conference on Python is PyCon. Also, there are special Python mentoring programs, such as PyLadies. Naming Python's name is inspired by the British comedy group Monty Python, whom Python creator Guido van Rossum enjoyed while developing the language. 
Monty Python references appear frequently in Python code and culture; for example, the metasyntactic variables often used in Python literature are spam and eggs, rather than the traditional foo and bar. Also, the official Python documentation contains various references to Monty Python routines. Python users are sometimes referred to as "Pythonistas".
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_note-234] | [TOKENS: 10515] |
Elon Musk Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026,[update] Forbes estimates his net worth to be around US$852 billion. Born into a wealthy family in Pretoria, South Africa, Musk emigrated in 1989 to Canada; he holds Canadian citizenship because his mother was born there. He received bachelor's degrees in 1997 from the University of Pennsylvania before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002. Musk also became an American citizen in 2002. In 2002, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research, but later left; growing discontent with the organization's direction and its leadership in the AI boom in the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, a Tesla pay package worth $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Musk was the largest donor in the 2024 U.S. presidential election, where he supported Donald Trump.
After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published in 2025 and 2026 and became a topic of worldwide debate. Early life Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa. Musk therefore holds both South African and Canadian citizenship from birth. His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer, who partly owned a rental lodge at Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa.
Despite both Elon and Errol previously stating that Errol was a part owner of a Zambian emerald mine, in 2023, Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party and has said that his children shared their father's dislike of apartheid. After his parents divorced in 1979, Elon, aged around 9, chose to live with his father because Errol Musk had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies" where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten severely, leading to him being hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?" Elon was an enthusiastic reader of books, and had attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself how to program from the VIC-20 user manual. At age twelve, Elon sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, where he graduated. Musk was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification. 
Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, as well as to ease his path to immigration to the United States. While waiting for his application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill. In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997 – a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School. He reportedly hosted large, ticketed house parties to help pay for tuition, and wrote a business plan for an electronic book-scanning service similar to Google Books. In 1994, Musk held two internships in Silicon Valley: one at energy storage startup Pinnacle Research Institute, which investigated electrolytic supercapacitors for energy storage, and another at Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided to join the Internet boom of the 1990s, applying for a job at Netscape, to which he reportedly never received a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa transitioned to an H1-B. According to numerous former business associates and shareholders, Musk said he was on a student visa at the time. 
Business career In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors. They housed the venture at a small rented office in Palo Alto. Replying to Rolling Stone, Musk denounced the notion that they started their company with funds borrowed from Errol Musk, but in a tweet, he recognized that his father contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune, and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590,000,000 in 2025), and Musk received $22 million (equivalent to $43,000,000 in 2025) for his 7-percent share. In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and, in its initial months of operation, over 200,000 customers joined the service. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with online bank Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign. 
Due to resulting technological issues and lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000.[b] Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2,700,000,000 in 2025) in stock, of which Musk—the largest shareholder with 11.72% of shares—received $175.8 million (equivalent to $320,000,000 in 2025). In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value. In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth-chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune (equivalent to $180,000,000 in 2025), Musk founded SpaceX in May 2002 and became the company's CEO and chief engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, SpaceX was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2,400,000,000 in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, in 2015 SpaceX successfully landed the first stage of a Falcon 9 on a land platform.
Later landings were achieved on autonomous spaceport drone ships, an ocean-based recovery platform. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, the Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million (equivalent to $865,000,000 in 2025) contract to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan. In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large constellation was deployed in May 2019. As of May 2025, over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 to be $10 billion (equivalent to $12,000,000,000 in 2025).[c] During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440,000,000 in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response. Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement.
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11,000,000 in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm. Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass production all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several commercially successful electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In late 2018, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over his tweeting that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in summer 2020, and it entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1,200,000,000,000 in 2025), the sixth company in U.S. history to do so. Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second largest provider of solar power systems in the United States.
In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2,700,000,000 in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor. In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence (AI) by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions like spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials. Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trials—which have caused the deaths of some monkeys—have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. 
Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink. In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018. It used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021. Local officials have approved further expansions of the tunnel system. In early 2017, Musk expressed interest in buying Twitter and questioned the platform's commitment to freedom of speech. By 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder.[d] Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter. By the end of April, Musk had concluded his bid at approximately $44 billion, which included approximately $12.5 billion in loans and $21 billion in equity financing. After attempting to back out of the deal, Musk completed the purchase of the company on October 27, 2022.
Immediately after the acquisition, Musk fired several top Twitter executives, including CEO Parag Agrawal, and became CEO himself. Under Musk, Twitter instituted monthly subscriptions for a "blue check" and laid off a significant portion of the company's staff. Musk scaled back content moderation, and hate speech on the platform increased after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of the Hunter Biden laptop controversy in the lead-up to the 2020 presidential election. Musk promised to step down as CEO after a Twitter poll, and five months later he did so, transitioning to the roles of executive chairman and chief technology officer (CTO). Since Musk stepped down as CEO, X has continued to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence some of his critics, such as Twitch streamer Asmongold, who criticized him during one of his streams, by removing their accounts' blue checkmarks, which hinders visibility and is considered a form of shadow banning, or by suspending their accounts without justification. Other activities In August 2013, Musk announced plans for a version of a vactrain and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances. In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence, intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company, and initially gave $50 million. In 2018, Musk left the OpenAI board.
Since 2018, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings like OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI. Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets and the consequent fossil fuel usage have received criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter and temporarily banned the accounts of journalists who posted stories about the incident, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept. In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content but framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example, the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies. Politics Musk is an outlier among business leaders, who typically avoid partisan political advocacy. Musk was a registered independent voter when he lived in California.
Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 Texas's 34th congressional district special election. In 2024, he started supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income; he also endorsed Kanye West's 2020 presidential campaign. In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign, and hosted DeSantis's campaign announcement on a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips. In October 2025, former vice president Kamala Harris commented that it was a mistake on the Democratic side not to invite Musk to a White House electric vehicle event organized in August 2021 and featuring executives from General Motors, Ford and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space."
Fortune remarked that this was a nod to United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation, and speculated that the non-invitation affected Musk's perspective. Fortune noted that, at the time, Musk said, "Yeah, seems odd that Tesla wasn't invited." A month later, he criticized Biden as "not the friendliest administration." Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen, and other capitalists actually flourished under Biden, but the tech leaders chose Trump for their common ground on cultural issues. By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying, "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a campaign rally, and promoted conspiracy theories and falsehoods about Democrats, election fraud, and immigration in support of Trump. Musk was the largest individual donor of the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers. In 2023, Musk said he shunned the World Economic Forum because it was boring. The organization commented that it had not invited him since 2015. He has, however, participated in Dialog, an event dubbed "Tech Bilderberg" and organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain and the United Kingdom, particularly due to his position in the U.S. government as well as his ownership of X.
An NBC News analysis found he had boosted far-right political movements to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023. During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or a fascist Roman salute.[e] He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided on partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries. The concept of DOGE emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, with Musk accepting the offer. In November and December 2024, Musk suggested that the organization could help to cut the U.S. federal budget, consolidate the number of federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role is not clear. 
In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE. A federal judge has ruled that Musk acted as the de facto leader of DOGE. Musk's role in the second Trump administration, particularly in connection with DOGE, has attracted public backlash. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He has prioritized secrecy within the organization and has accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by this time, most of them of children. By November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when his 130-day term as a special government employee expired, with a White House official confirming that Musk's offboarding from the Trump administration was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025. After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit.
A feud began between Musk and Trump, most notably when Musk alleged on X (formerly Twitter) on June 5, 2025, that Trump had ties to sex offender Jeffrey Epstein. Trump responded on Truth Social, stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away, and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk, and added that Musk would face "very serious consequences" if he funded Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have received criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates up until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth taxes, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration and regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and he identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially the colonization of Mars. He has repeatedly pushed for humanity to colonize Mars in order to become an interplanetary species and lower the risk of human extinction.
Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. While describing himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdown restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and was described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024. Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was subsequently criticized for spreading misinformation and amplifying the far right. He has also voiced support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S.
Securities and Exchange Commission (SEC) for a tweet stating that funding had been secured for potentially taking Tesla private.[f] The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were fined $20 million each, and Musk was forced to step down for three years as Tesla chairman but was able to remain as CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the previous agreement details, including a list of topics about which Musk needed preclearance. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla stock price ("too high imo") violated the agreement. Freedom of Information Act (FOIA)-released records showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting regarding "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded. 
McCormick called the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger's syndrome in a TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ... but then that turned out to be wrong — [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated he uses doctor-prescribed ketamine for occasional depression and that he doses "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine and other drugs. Musk at first refused to comment on his alleged drug use, before responding that he had not tested positive for drugs, and that if drugs somehow improved his productivity, "I would definitely take them!". 
The New York Times' investigations revealed Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concerns from close associates who have become troubled by his public behavior as he became more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he has stated have a "restoring effect" that helps his "mental calibration". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings. Musk has justified the boosting by claiming that all top accounts do it, so he has to as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its developer Ubisoft for "woke" content. Musk posted to X that "DEI kills art" and specified the inclusion of the historical figure Yasuke in the Assassin's Creed game as offensive; he also called the game "terrible". Ubisoft responded by saying that Musk's comments were "just feeding hatred" and that they were focused on producing a game, not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest that the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000.
In 2002, their first child Nevada Musk died of sudden infant death syndrome at the age of 10 weeks. After his death, the couple used in vitro fertilization (IVF) to continue their family; they had twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and have shared custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year. After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters that are not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the pregnancy, Musk confirmed reports that the couple were "semi-separated" in September 2021; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Elon Musk has taken X Æ A-Xii to multiple official events in Washington, D.C. during Trump's second term in office. 
In July 2022, The Wall Street Journal reported that Musk allegedly had an affair with Nicole Shanahan, the wife of Google co-founder Sergey Brin, in 2021, leading to their divorce the following year. Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported that Musk bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born in 2024 via surrogacy, and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, which media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St. Clair had filed for sole custody of her five-month-old son and for Musk to be recognized as the child's father. On March 31, 2025, Musk wrote that, while he was unsure whether he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] The Wall Street Journal later reported that $1 million of these payments to St. Clair was structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island. 
In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place after Epstein later cancelled the plans.[k] On Christmas Day in 2012, Musk emailed Epstein asking, "Do you have any parties planned? I've been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I'm looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable, to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein sent an email asking Musk if he had any plans to come to New York for the opening of the United Nations General Assembly, where many "interesting people" would be coming to his house; Musk responded that "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein replied, "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 if he introduced Epstein to Mark Zuckerberg, Musk responded: "I don't recall introducing Epstein to anyone, as I don't know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred. 
Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l] Wealth Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. Musk was first listed on the Forbes Billionaires List in 2012; in November 2020, around 75% of his wealth was derived from Tesla stock, although he described himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021; $400 billion in December 2024; $500 billion in October 2025; $600 billion in mid-December 2025; $700 billion later that month; and $800 billion in February 2026. In November 2025, a Tesla pay package worth potentially $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Public image Although his ventures have been highly influential within their separate industries starting in the 2000s, Musk only became a public figure in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions and often makes controversial statements, in contrast to other billionaires, who prefer reclusiveness to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance described people's opinions of Musk as polarized due to his "part philosopher, part troll" persona on Twitter. He has been denounced for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to matters like British survivors of grooming gangs. 
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president", or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021. Musk was selected as Time's "Person of the Year" for 2021. Time's then editor-in-chief Edward Felsenthal wrote that "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too."
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_note-235] | [TOKENS: 10515] |
Elon Musk Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026, Forbes estimates his net worth to be around US$852 billion. Born into a wealthy family in Pretoria, South Africa, Musk emigrated to Canada in 1989; he holds Canadian citizenship through his Canadian-born mother. He received bachelor's degrees in 1997 from the University of Pennsylvania before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002. Musk also became an American citizen in 2002. In 2002, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research, but later left; growing discontent with the organization's direction and its leadership in the AI boom of the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, a Tesla pay package worth $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Musk was the largest donor in the 2024 U.S. presidential election, where he supported Donald Trump. 
After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published in 2025 and 2026 and became a topic of worldwide debate. Early life Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa. Musk therefore holds both South African and Canadian citizenship from birth. His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer, who partly owned a rental lodge at Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa. 
Despite both Elon and Errol previously stating that Errol was a part owner of a Zambian emerald mine, in 2023, Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party and has said that his children shared their father's dislike of apartheid. After his parents divorced in 1979, Elon, aged around 9, chose to live with his father because Errol Musk had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies" where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten severely, leading to him being hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?" Elon was an enthusiastic reader of books and has attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself how to program from the VIC-20 user manual. At age twelve, Elon sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, where he graduated. Musk was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification. 
Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, as well as to ease his path to immigration to the United States. While waiting for his application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill. In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997 – a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School. He reportedly hosted large, ticketed house parties to help pay for tuition, and wrote a business plan for an electronic book-scanning service similar to Google Books. In 1994, Musk held two internships in Silicon Valley: one at energy storage startup Pinnacle Research Institute, which investigated electrolytic supercapacitors for energy storage, and another at Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided to join the Internet boom of the 1990s, applying for a job at Netscape, to which he reportedly never received a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa transitioned to an H-1B. However, according to numerous former business associates and shareholders, Musk himself had said he was on a student visa at the time. 
Business career In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors. They housed the venture at a small rented office in Palo Alto. In an interview with Rolling Stone, Musk rejected the notion that they started their company with funds borrowed from Errol Musk, but in a tweet he acknowledged that his father contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune, and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590,000,000 in 2025), and Musk received $22 million (equivalent to $43,000,000 in 2025) for his 7-percent share. In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and, in its initial months of operation, over 200,000 customers joined the service. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with online bank Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign. 
Due to resulting technological issues and lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000.[b] Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2,700,000,000 in 2025) in stock, of which Musk—the largest shareholder with 11.72% of shares—received $175.8 million (equivalent to $320,000,000 in 2025). In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value. In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth-chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune (equivalent to $180,000,000 in 2025), Musk founded SpaceX in May 2002 and became the company's CEO and chief engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, SpaceX was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2,400,000,000 in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, in 2015 SpaceX successfully landed the first stage of a Falcon 9 on a land platform. 
Later landings were achieved on autonomous spaceport drone ships, an ocean-based recovery platform. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, the Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million (equivalent to $865,000,000 in 2025) contract to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan. In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large constellation was deployed in May 2019. As of May 2025, over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 to be $10 billion (equivalent to $12,000,000,000 in 2025).[c] During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440,000,000 in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response. Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement. 
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11,000,000 in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm. Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass-produced all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several well-selling electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In late 2018, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over his tweet that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in summer 2020, and it entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1,200,000,000,000 in 2025), the sixth company in U.S. history to do so. Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second largest provider of solar power systems in the United States. 
In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2,700,000,000 in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor. In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence (AI) by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions like spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials. Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trials—which have caused the deaths of some monkeys—have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. 
Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink. In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018. It used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021. Local officials have approved further expansions of the tunnel system. In early 2017, Musk expressed interest in buying Twitter and had questioned the platform's commitment to freedom of speech. By 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder.[d] Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter. By the end of April, Musk had successfully concluded his bid for approximately $44 billion. This included approximately $12.5 billion in loans and $21 billion in equity financing. After attempting to back out of the deal, Musk completed the purchase on October 27, 2022. 
Immediately after the acquisition, Musk fired several top Twitter executives, including CEO Parag Agrawal, and became CEO himself. Under Musk, Twitter instituted monthly subscriptions for a "blue check" and laid off a significant portion of the company's staff. Musk reduced content moderation, and hate speech increased on the platform after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of the Hunter Biden laptop controversy in the lead-up to the 2020 presidential election. Musk also promised to step down as CEO after a Twitter poll, and five months later he did so, transitioning to the roles of executive chairman and chief technology officer (CTO). Despite Musk stepping down as CEO, X continues to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence some of his critics, such as Twitch streamer Asmongold, who criticized him during one of his streams, by removing their accounts' blue checkmarks (which hinders visibility and is considered a form of shadow banning) or by suspending their accounts without justification. Other activities In August 2013, Musk announced plans for a version of a vactrain, and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances. In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence, intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company, and initially gave $50 million. In 2018, Musk left the OpenAI board. 
Since 2018, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings like OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI. Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets and the consequent fossil fuel usage have received criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter and temporarily banned the accounts of journalists who posted stories about the incident, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept. In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content but framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example, the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies. Politics Musk is an outlier among business leaders, who typically avoid partisan political advocacy. Musk was a registered independent voter when he lived in California. 
Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 Texas's 34th congressional district special election. In 2024, he started supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income; he also endorsed Kanye West's 2020 presidential campaign. In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign, and hosted DeSantis's campaign announcement on a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips. In October 2025, former vice president Kamala Harris commented that it was a mistake on the Democratic side not to invite Musk to a White House electric vehicle event organized in August 2021 and featuring executives from General Motors, Ford, and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space." 
Fortune remarked that this was a nod to United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation, and guessed that the non-invitation affected Musk's perspective. Fortune noted that, at the time, Musk said, "Yeah, seems odd that Tesla wasn't invited." A month later, he criticized Biden as "not the friendliest administration." Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen, and other capitalists actually flourished under Biden, but that the tech leaders chose Trump for their common ground on cultural issues. By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying, "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a campaign rally and promoted conspiracy theories and falsehoods about Democrats, election fraud, and immigration in support of Trump. Musk was the largest individual donor of the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers. In 2023, Musk said he shunned the World Economic Forum because it was boring; the organization commented that it had not invited him since 2015. He has, however, participated in Dialog, an event dubbed "Tech Bilderberg" that is organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain, and the United Kingdom, particularly due to his position in the U.S. government as well as his ownership of X. 
An NBC News analysis found he had boosted far-right political movements to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023. During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or a fascist Roman salute.[e] He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided on partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries. The concept of DOGE emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, with Musk accepting the offer. In November and December 2024, Musk suggested that the organization could help to cut the U.S. federal budget, consolidate the number of federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role is not clear. 
In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE. A federal judge has ruled that Musk acted as the de facto leader of DOGE. Musk's role in the second Trump administration, particularly his work with DOGE, has attracted public backlash. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He has prioritized secrecy within the organization and has accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by this time, most of them children. By November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when the 130-day limit on his status as a special government employee expired, with a White House official confirming that Musk's offboarding from the Trump administration was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025. After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit. 
A feud began between Musk and Trump, with its most notable event being Musk alleging on X (formerly Twitter) on June 5, 2025, that Trump had ties to sex offender Jeffrey Epstein. Trump responded on Truth Social, stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away, and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk, and added that Musk would face "very serious consequences" if he funded Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have received criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates up until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth taxes, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration, and regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially the colonization of Mars, and has repeatedly argued that humanity must become an interplanetary species to lower the risk of human extinction. 
Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. While describing himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdown restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and was described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024. Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was in turn accused of spreading misinformation and amplifying the far-right. He has also voiced support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S. 
Securities and Exchange Commission (SEC) for a tweet stating that funding had been secured for potentially taking Tesla private.[f] The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were fined $20 million each, and Musk was required to step down as Tesla chairman for three years but was allowed to remain CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the details of the previous agreement, including a list of topics about which Musk needed preclearance. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla's stock price ("too high imo") violated the agreement. Freedom of Information Act (FOIA)-released records showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting regarding "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded. 
McCormick called the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger's syndrome in a TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ... but then that turned out to be wrong — [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated he uses doctor-prescribed ketamine for occasional depression and that he doses "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine and other drugs. Musk at first refused to comment on his alleged drug use, before responding that he had not tested positive for drugs, and that if drugs somehow improved his productivity, "I would definitely take them!". 
The New York Times' investigations revealed Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concerns from close associates who had become troubled by his public behavior as he grew more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label, Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he has said have a "restoring effect" that helps his "mental calibration". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings. Musk has justified the boosting by claiming that all top accounts do it, so he has to as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its creator Ubisoft for "woke" content. Musk posted to X that "DEI kills art" and specified the inclusion of the historical figure Yasuke in the Assassin's Creed game as offensive; he also called the game "terrible". Ubisoft responded by saying that Musk's comments were "just feeding hatred" and that it was focused on producing a game, not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest that the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000. 
In 2002, their first child Nevada Musk died of sudden infant death syndrome at the age of 10 weeks. After his death, the couple used in vitro fertilization (IVF) to continue their family; they had twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and have shared custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year. After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters that are not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the pregnancy, Musk confirmed reports that the couple were "semi-separated" in September 2021; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Elon Musk has taken X Æ A-Xii to multiple official events in Washington, D.C. during Trump's second term in office. 
In July 2022, The Wall Street Journal reported that Musk allegedly had an affair with Nicole Shanahan, the wife of Google co-founder Sergey Brin, in 2021, leading to their divorce the following year. Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported that Musk had bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born in 2024 via surrogacy, and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, which media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St. Clair had filed for sole custody of her five-month-old son and for Musk to be recognized as the child's father. On March 31, 2025, Musk wrote that, while he was unsure if he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] Later reporting from The Wall Street Journal indicated that $1 million of these payments to St. Clair was structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island. 
In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place after Epstein later cancelled the plans.[k] On Christmas Day in 2012, Musk emailed Epstein asking, "Do you have any parties planned? I’ve been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I’m looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable, to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein sent an email asking Musk if he had any plans to come to New York for the opening of the United Nations General Assembly, where many "interesting people" would be coming to his house, to which Musk responded that "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein responded by stating, "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 if he had introduced Epstein to Mark Zuckerberg, Musk responded: "I don’t recall introducing Epstein to anyone, as I don’t know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred. 
Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l] Wealth Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. Musk was first listed on the Forbes Billionaires List in 2012; by November 2020, around 75% of his wealth was derived from Tesla stock, although he describes himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021; $400 billion in December 2024; $500 billion in October 2025; $600 billion in mid-December 2025; $700 billion later that month; and $800 billion in February 2026. In November 2025, a Tesla pay package worth potentially $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Public image Although his ventures have been highly influential within their separate industries starting in the 2000s, Musk only became a public figure in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions, while also often making controversial statements, contrary to other billionaires who prefer reclusiveness to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance described people's opinions of Musk as polarized due to his "part philosopher, part troll" persona on Twitter. He has drawn criticism for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to matters like British survivors of grooming gangs. 
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president", or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021. Musk was selected as Time's "Person of the Year" for 2021. Then-Time editor-in-chief Edward Felsenthal wrote that "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too." |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/The_Hertz_Corporation] | [TOKENS: 5080] |
Hertz Global Holdings Hertz Global Holdings, Inc. (formerly The Hertz Corporation), known as Hertz, is an American car rental company based in Estero, Florida. The company operates its namesake Hertz brand, along with the brands Dollar Rent A Car, Firefly Car Rental, and Thrifty Car Rental. It is one of the three big rental car holding companies in the United States, having a 36% market share, placing it ahead of Avis Budget Group and second to Enterprise Holdings. As one of the largest worldwide vehicle rental companies by sales, locations, and fleet size, Hertz operates in 160 countries in North America, Europe, Latin America, Africa, Asia, Australia, the Caribbean, the Middle East, and New Zealand. Hertz was ranked 326th in the 2020 Fortune 500 list. The company filed for bankruptcy on May 22, 2020, citing a sharp decline in revenue and future bookings caused by the COVID-19 pandemic. As of December 31, 2021, the company had revenues of $7.3 billion, assets of $19.7 billion, and 23,000 employees. As of July 1, 2021, the company is no longer in Chapter 11 bankruptcy. Company History John D. Hertz began selling cars in 1905 for Walden W. Shaw, whose family owned a Chicago bakery and a dealership for Berliet automobiles imported from France. The car business was losing money, and Hertz agreed to help Shaw in exchange for part ownership of the dealership. Hertz introduced liberal trade-in terms, which helped to sell new vehicles but left Shaw with many used cars. Hertz convinced Shaw to monetize the used cars by offering them for hire as taxis, named for the meters that showed the fare based on distance traveled. Shaw and Hertz began building their own taxis, with Hertz naming the new company Yellow Truck and Coach Manufacturing Company. In 1916, the Walden W. Shaw Livery Company, the Shaw Cab Manufacturing Company, and the Yellow Cab Corporation were established as Shaw Corporation of New York. 
Shaw resigned as president of the company in 1920 and was succeeded by Hertz. Hertz developed an interest in Rent-a-Car Inc., which had been founded by Walter L. Jacobs in 1918, and purchased the company in 1923. It was then renamed Hertz Drive-Ur-Self System. This small car rental operation began with a dozen Model T Ford cars. Within five years, the fleet expanded to 600 vehicles, generating annual revenues of approximately US$1 million. Jacobs continued to serve as president and chief operating officer of the "Hertz Drive-Ur-Self System" until 1961. Hertz initially planned to rename cars produced by Walden W. Shaw Livery from "Shaw" to "Ambassador". Instead, the cars were built from 1924 under the "Hertz" name for the you-drive-it business. The Hertz Model D-1 was available as a tourer or sedan, selling for $1,675. Hertz continued the car hire business but ended production of the Model D-1 in 1927, with vehicles from that point on provided by other manufacturers. Total D-1 production was approximately 4,000 vehicles. After three years of ownership, Hertz sold the rental car brand to General Motors Corporation in 1927. GM purchased the rest of Yellow Truck and Coach Manufacturing Company in 1943. Under the ownership of General Motors, the company released the first rental car charge card in 1926, opened its first rental car location at Chicago's Midway Airport in 1932, and introduced the first one-way rental plan in 1933. Hertz Drive-Ur-Self System expanded services to Canada in 1938 and Europe (France) in 1950. John Hertz repurchased the brand from General Motors in 1953 through his other company, The Omnibus Corporation, which he renamed The Hertz Corporation. In 1954, its stock began trading on the New York Stock Exchange. It also purchased a New York-based truck leasing company, Metropolitan Distributors, which included a fleet of 4,000 trucks. 
This acquisition increased The Hertz Corporation's fleet to 15,500 trucks and 12,900 passenger cars. During General Motors' ownership, the company had sold many local Hertz franchises to independent business entities. With the financial backing of investment bankers Lehman Brothers, led by Lehman partner and Hertz director Frank J. Manheim, Hertz initiated a buyback program for its franchises in the US, which it then expanded globally. Manheim predicted the global growth and devised the strategy that made "Hertz" a household name and led to dynamic growth in the company's market capitalization from $7 million in 1953 to $100 million in 1965. The company expanded to South America in 1961. In 1967, The Hertz Corporation became a subsidiary of Radio Corporation of America. In 1985, the car rental company was sold to UAL Corporation, later known as Allegis Corporation, for a cash deal of US$587.5 million. This acquisition expanded Hertz's vehicle renting and leasing, with nearly 400,000 cars and trucks in 120 countries across the globe. In the summer of 1987, Allegis Corporation chairman and president Frank Olson announced the company would be selling Hertz due to internal changes. Park Ridge Corporation, which was owned and operated under Ford Motor Company, purchased Hertz in October 1987 for US$1.3 billion, and Hertz relocated its headquarters from Midtown Manhattan to Park Ridge, New Jersey in 1988. In 2002, Hertz became the first international car rental company to open in China. In 2013, Hertz began partnering with China's largest car rental company, China Auto Rental. In 2016, it reduced its ownership stake but announced a continuing commercial relationship through 2023. By the second quarter of 2005, Hertz produced about ten percent of Ford's overall pre-tax profit. However, after 18 years of ownership, the Ford Motor Company announced it would be selling the Hertz brand with the intent to focus more on building Ford cars and trucks. 
Private equity firms Clayton, Dubilier & Rice, The Carlyle Group, and Merrill Lynch Global Private Equity agreed to purchase all shares of common stock in Hertz for an estimated US$15 billion, including debt, and the business itself for US$5.6 billion in 2005. The Clayton, Dubilier & Rice consortium took Hertz Global Holdings public again on the New York Stock Exchange in November 2006, and Hertz began to expand through Europe. Hertz launched the subbrand "Simply Wheelz" in September 2007 for economy-minded and leisure-market audiences. By 2008, the service expanded to airports in California, Florida, and the McCarran International Airport in Las Vegas, Nevada. Once an online reservation was made, customers were able to choose one of six types of vehicles at self-service rental kiosks. Simply Wheelz was rebranded as Advantage Rent-a-Car in the fall of 2009. In late December 2009, Clayton, Dubilier & Rice announced the acquisition of used car dealer British Car Auctions (BCA) from London-based equity firm Montagu Private Equity for an estimated £390 million. BCA was subsequently sold to Haversham Holdings, an investment business, without ever being integrated into Hertz. In November 2012, Hertz Global Holdings Chairman and CEO Mark P. Frissora announced the company's purchase of Dollar Thrifty Automotive Group, a U.S.-based car rental brand with headquarters in Tulsa, Oklahoma, for US$2.3 billion. The business transaction included Hertz paying $87.50 per share of the Dollar Thrifty stock. The deal was finalized on November 19, 2012, and resulted in a combined 10,400 locations in approximately 150 countries. Before the merger, Dollar Thrifty was the fourth-largest car rental company. In December 2012, Hertz announced it would sell its Advantage Rent a Car unit to Franchise Services of North America and Macquarie Capital after the acquisition of Dollar Thrifty Automotive Group was finalized. 
In May 2013, Frissora and Florida Governor Rick Scott announced Hertz Global Holdings would relocate its worldwide headquarters from Park Ridge, New Jersey to Estero, Florida. Relocation to Southwest Florida was influenced by the state's travel and tourism industry and its proximity to Orlando and Miami, and was intended to consolidate corporate offices and increase the efficiency of Hertz Global brands. A temporary office building in Naples, Florida housed 640 employees until construction of a new facility was completed in 2015. John P. Tague replaced Frissora as chief executive officer and president of The Hertz Corporation in November 2014. Kathryn V. Marinello, former CEO of Stream Global Services, was appointed president and chief executive officer of The Hertz Corporation on January 2, 2017, following John Tague's retirement. As a result of the COVID-19 pandemic, on April 30, 2020, Hertz announced that it had missed lease payments on its fleet and was seeking support from its lenders, including activist investor Carl Icahn, in an attempt to avoid bankruptcy. Marinello resigned as CEO on May 18, 2020, and Hertz announced that Paul Stone would be the new president and chief executive. Stone previously served as Hertz's executive vice president and chief retail operations officer for North America. On May 22, 2020, the Wall Street Journal reported that Hertz was preparing to file for bankruptcy because it had not reached an agreement with its top lenders. That same day, the company filed for Chapter 11 bankruptcy. On October 5, 2021, Hertz announced it had named Mark Fields interim CEO and Paul Stone president and chief operations officer. In February 2022, Stephen M. Scherr was named chief executive officer of Hertz.
Carl Icahn held 39% of Hertz's shares when it filed for bankruptcy on May 22, 2020, and he controlled three board seats. He had invested a total of $2.3 billion in Hertz shares from 2014 to 2020. Hertz financed itself mostly through loans secured by its fleet of cars; if the cars fell in value, Hertz's lenders had the right to demand an immediate payment that reduced the loan balance enough to keep it comfortably covered by the cars' now-lower value. Because of the crisis, used-car values and sales volumes fell just as Hertz lost most of its customers. The bankruptcy filing started a 60-day clock during which Hertz's secured lenders had to wait before they could foreclose on the 400,000 U.S. cars financed through such arrangements. Despite the bankruptcy filing, Hertz announced on June 11, 2020, that it was seeking to raise up to $1 billion in new equity (with disclaimers that there was a "risk that the common stock could ultimately be worthless"). The Wall Street Journal characterized the potential stock sale as a "seemingly unprecedented move for a large bankrupt company eager to capitalize on market anomalies," as the stock price had risen nearly 1,000% from a post-filing low of 59 cents to $5.50 a share. Hertz's stock was heavily traded by retail investors, becoming one of the most-traded stocks. Hertz sold $29 million in stock before the Securities and Exchange Commission halted further sales. The stock was delisted from the New York Stock Exchange in October 2020. In December 2020, the company's UK division filed for Chapter 15 bankruptcy. A large fire at Southwest Florida International Airport in April 2020 destroyed approximately 1,200 Hertz cars. Multiple car rental companies, including Hertz, had stored thousands of cars in a temporary storage lot on a grass field after the COVID-19 pandemic impacted the rental car business.
Investigators determined that heat from a car's exhaust system ignited dry grass in the field, starting a large fire. The fire led to the total loss of the Hertz cars and over 2,000 vehicles owned by other companies, at a total cost of over $100 million. On July 1, 2021, Hertz emerged from Chapter 11 bankruptcy and changed its ticker symbol from HTZGQ to HTZZ. In October 2021, Mark Fields was named interim CEO of Hertz with a focus on forward-looking investments. On October 25, 2021, Fields announced that Hertz would buy 100,000 Tesla vehicles, citing his goal of fleet electrification and describing Tesla as the "only manufacturer that can produce EVs at scale". In November 2021, the company went public again on the Nasdaq under the symbol HTZ. In February 2022, Stephen M. Scherr was named CEO of Hertz. He stepped down in March 2024 and was replaced by Gil West. Since 2019, Hertz has been embroiled in a mounting controversy in which numerous reports have emerged of Hertz falsely accusing its customers of stealing automobiles that were under an active rental agreement. Many of the victims report having been arrested at gunpoint and in full view of their families. Victims reported being jailed for months before being released, resulting in loss of licensure, professional credentials, and employment while the falsely filed felony charges moved slowly through the system, depriving them of income and their rights. Hertz has blamed outdated computer systems, with local and corporate systems "not communicating correctly", but victim profiles indicate that these false arrests are more likely to happen to members of minority groups. In February 2022, a judge forced Hertz to reveal that it reports about 3,365 of its customers to the police every year for theft. Hertz claims that such instances are a "rare situation", amounting to only 0.014% of its 25 million annual transactions in the United States.
According to reports, 47 customers have filed lawsuits against Hertz for misreporting cars as stolen, resulting in many false arrests and imprisonments. In December 2022, Hertz announced the settlement of 364 pending claims relating to vehicle theft reporting, bringing resolution to more than 95% of its pending theft reporting claims. In this settlement, Hertz agreed to pay approximately $168 million by year-end to resolve these disputes. Car rental locations and operation Hertz has approximately 12,000 corporate and franchisee locations in 160 countries throughout North America, Europe, Latin America, Africa, Asia, Australia, the Caribbean, the Middle East and New Zealand. Wilford Gwilliam of Overland West purchased a portion of a Hertz franchise in 1941. Overland West is the largest Hertz franchise licensee in North America, operating 27 car rental and four car sales locations in eight states. Gwilliam sold the business to Devere J. Sparrow, who led the organization until selling it to his son-in-law, Jerry H. Petersen, in 1976. As the current owner, president and CEO, Petersen oversees its franchises and employees. Rental fleet The Hertz rental fleet has included vehicles from a variety of manufacturers, including BYD, Mercedes, Infiniti, Cadillac, Land Rover, BMW, Porsche, Jaguar, Mazda, Volvo, Toyota, Jeep, and Lincoln, among others. By December 2012, the company had over 490,000 cars in the United States. As of 2014, 78 percent of Hertz's fleet consisted of vehicles that achieve 28 miles per gallon or more on the highway. In 1966, Hertz engaged racing and automotive designer Carroll Shelby to develop an exclusive version of his modified Ford Mustang. The objective was to attract more customers to Hertz, while the fleet would tempt car renters to buy a Mustang or a Shelby Mustang. One thousand GT350H Mustangs were built as rental cars, although urban legend maintains that many were missing their original engines when returned.
The "Rent-a-Racer" program was available in selected locations during the late 1960s for a limited time. The fleet has included Corvettes, Jaguar XK-Es, and AMC AMXs. In 2006, Hertz partnered with Shelby to rent specially modified Ford Mustang GT-H coupes as a tribute to the original 1966 Ford Mustang Shelby GT350-H. The 500 coupes offered in 2006 were followed by convertibles in 2007. In 2008, Hertz began to rent modified Chevrolet Corvette (C6) "ZH-Z" coupes, followed by convertibles in 2009. Hertz reintroduced the program in 2016 with vehicles such as the Hendrick Motorsports-modified Chevrolet Corvette (C7) Stingray Z/06, Chevrolet Camaro SS and ZL-1, and a Shelby-modified Ford Mustang GT. In 2000, Hertz introduced SiriusXM Satellite Radio to its North America rental fleet. In 2007, the company began testing hourly car rentals at three locations in New York City. It launched a global carsharing service under the name Connect by Hertz in December 2008, serving customers who paid a fee to rent cars by the hour in Park Ridge, New York, Orlando, London, Paris, and Sydney. The service, later branded as Hertz on Demand and then Hertz 24/7, ceased operation in the United States in September 2015. In 2009, Hertz began testing a photo system to record damage to its rental cars. It introduced ExpressRent kiosks at various rental locations in November 2011, the first large-scale deployment of car rental kiosks in the United States to use a live agent via video chat. Hertz launched its Green Collection of rental cars in September 2006. This fleet of environmentally friendly vehicles has included the Toyota Prius, Ford Fusion, Buick LaCrosse, Toyota Camry, and Hyundai Sonata. In the US, vehicles in this group carry Environmental Protection Agency (EPA) highway fuel efficiency ratings of 28 mpg‑US (8.4 L/100 km; 33.6 mpg‑imp) or better; different models and different standards apply in other markets. The Green Collection was introduced in Singapore in 2014.
In October 2021, Hertz announced it was investing in electric vehicles with the purchase of 100,000 Teslas, primarily Tesla Model 3 sedans. In March 2022, Tesla Model Y crossovers were added to the order. The Hertz Tesla fleet would have access to Tesla's Supercharger network while Hertz developed its own vehicle charging infrastructure. Hertz and Uber announced a partnership in October 2022 under which Hertz would offer up to 50,000 Tesla vehicles for rent to Uber drivers. In April 2022, Hertz announced its intent to purchase up to 65,000 Polestar 2 vehicles over five years from electric carmaker Polestar. In September 2022, Hertz unveiled a deal with General Motors to purchase up to 175,000 electric vehicles from Chevrolet, Buick, GMC, Cadillac, and BrightDrop over a period of five years. As part of its investment in electrification, Hertz announced in September 2022 a memorandum of understanding (MOU) for the development of a national network of EV charging stations powered by bp pulse, bp's global electrification and charging solution brand. In February 2023, Hertz stated that it planned to make a quarter of its total vehicle fleet electric. In January 2024, Hertz announced plans to sell a third of its EV fleet through outlets such as Auto Trader and EV.com, and to reinvest in petrol/gas-powered cars, due to weak demand and high repair costs for its battery-powered vehicles. The company placed much of the blame on Tesla, whose price cuts forced the company to write down the value of its EVs more quickly than anticipated; Hertz's CEO, Stephen Scherr, said Tesla was less willing than other automakers to give volume discounts on replacement parts. At the time of the announcement, Hertz offered EVs from Tesla as well as GM, Kia, Polestar, and Volkswagen. Scherr also noted that EVs were involved in accidents at higher rates than other cars, with consumers less experienced in operating them.
He noted, however, that this was a partial reversal, not a complete abandonment, of its fleet electrification. Analysts suggested the move would hurt the second-hand market for used EVs and might discourage some consumers from purchasing these vehicles. Scherr also said that the decision was based on declining resale values of its EV fleet, which affected the price Hertz could recoup at the end of the fleet's lifetime, and that dropping EVs from the company's catalogue was "the consequence of a material price decline in Teslas and EVs more generally." Hertz followed in February 2024 with an announcement that it was pausing purchases of Polestar vehicles, citing the loss in value of EVs. Hertz opened a heavy equipment rental division in 1965, the Hertz Equipment Rental Corporation, with its first location in Houston, Texas and headquarters in Park Ridge, New Jersey. The division became an independent, publicly traded company called Herc Rentals on July 1, 2016. The company is headquartered in Bonita Springs, Florida. Herc Rentals was not affected by its former parent's bankruptcy on May 22, 2020. Advertising In December 1959, a 100-foot-long by 23-foot-high Hertz Rent-A-Car advertising sign was installed diagonally on the roof of the Texas School Book Depository in Dallas, Texas. The sign included the words "HERTZ RENT A CAR" (red text on yellow background) and "CHEVROLETS" (yellow text on black background) made up of 128 metal plates, and an electronic display that alternately showed the time and temperature (one of the first billboards to do so). The building was Lee Harvey Oswald's vantage point from which he assassinated United States President John F. Kennedy on November 22, 1963. Many witnesses recalled the approximate time of the assassination (12:30) from the digital clock on the billboard. The aging sign was removed in 1979, and its metal plates are part of the collection of the Sixth Floor Museum at Dealey Plaza.
In 1959, the advertising firm of Norman, Craig & Kummel (NCK) was selected as the new advertising agency for Hertz. NCK developed the slogan "Hertz puts you in the driver's seat", first used commercially in September 1959 and changed to "Let Hertz put you in the driver's seat" by October 1959. The popular a cappella quartet The Hi-Lo's sang the Hertz song for the commercials. Hertz used the line in the early 1960s in print, signs, and television. The series is listed as number 65 in the top 100 advertising campaigns of the 20th century by Advertising Age magazine. In the 1980s and 1990s, former football player O. J. Simpson appeared as a spokesperson in Hertz ads. Simpson's prominence in the campaign is said to have opened the door for Black athletes to be featured in film and television. One spot from the mid-1970s showed Simpson, at that point an active American football player, running through an airport terminal, dressed in business attire, leaping over rows of seats to get to his Hertz rental car, while a woman yelled, "Go, O. J., Go!" The tagline of the ad, as spoken by Simpson, was "Hertz, the superstar in rent-a-car". The ad campaign was highly successful for the first five years it ran and helped Simpson secure high-profile sponsorship deals with other companies. Through the 1980s and 1990s, Simpson appeared alongside golfer Arnold Palmer and actress Jamie Lee Curtis. After Simpson's Ford Bronco chase and his murder trial, Hertz cut all ties with him.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Template_talk:Orion_(constellation)] | [TOKENS: 71] |
Template talk:Orion (constellation) Move discussion in progress There is a move discussion in progress on Template talk:Stars of Andromeda which affects this page. Please participate on that page and not in this talk page section. Thank you. —RMCD bot 05:05, 3 February 2019 (UTC)
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Snap!_(programming_language)] | [TOKENS: 1122] |
Snap! (programming language) Snap! (formerly Build Your Own Blocks) is a free block-based educational graphical programming language and online community. Snap! allows students to explore, create, and remix interactive animations, games, stories, and more, while learning about mathematical and computational ideas. While inspired by Scratch, Snap! has many advanced features. The Snap! editor, and programs created in it, are web applications that run in the browser (like Scratch) without requiring installation.[Note 2] It is built on top of Morphic.js, a Morphic GUI library, written by Jens Mönig as a 'middle layer' between Snap! itself and 'bare' JavaScript. User interface In Snap!, the screen is organized in three resizable columns containing five regions: the block group selector (top of left column), the blocks palette (left column), the main area (middle column), and the stage area (top of right column), with the sprite selector (also called the sprite corral) showing sprite thumbnails below it.[Note 3] The interactively resizable stage area shows the graphical results of scripts running in the script area and/or of individual blocks double-clicked in any palette. Individual blocks can be dragged from the palette onto the scripts area to be associated with the selected sprite. Snap!'s blocks are divided into eight groups: Motion, Looks, Sound, Pen, Control, Sensing, Operators, and Variables. The layout of these groups in the block group selector is shown in the table below. The central area can show the scripts, costumes/backdrops, or sounds associated with the selected sprite, depending on the selected tab. Features The most important features that Snap! offers, but Scratch does not, include: Mascot Alonzo, the mascot of Snap!, is a modified version of Gobo from Scratch, with permission from the Scratch Team.
He is named after Alonzo Church, the inventor of lambda calculus, and his hair is based on the Greek letter lambda. Special-purpose blocks (libraries) Extended sets of blocks can be found in Snap! libraries, such as the 'streams' library, which enables one to work with the complete, infinite Fibonacci sequence, for example, using the special blocks ('stream', 'show stream', 'tail of stream', and 'map ( ) over stream') from the library. Many other libraries are available for other special purposes, such as the 'list utilities' library, the 'words, sentences' library, the 'iterations' library, the 'animation' library, the 'frequency distribution' library, the 'audio computation' library, the 'bar charts' library, the 'world map' library, the 'colors and crayons' library, the 'strings and multi-line input' library, the 'parallelization' library, etc. Limitations While the software itself has few restraints, it does have some limitations. These include: History The web-based Snap! and the older desktop-based BYOB were both developed by Jens Mönig for Windows, OS X and Linux, with design ideas and documentation provided by Brian Harvey from the University of California, Berkeley, and have been used to teach "The Beauty and Joy of Computing" introductory course in computer science (CS) for non-CS-major students. Jens was a member of the Scratch Team before creating Snap!. BYOB is still available for downloading. License The source code of Snap! is GNU Affero General Public License (AGPL) licensed and is hosted on GitHub. The earlier, desktop-based 3.x version's code is available under a license that allows modification for only non-commercial uses and can be downloaded from the UC Berkeley website or CNET's download.com and TechTracker download page. Platforms Snap! runs in the major web browsers on Windows, iOS, MacOS and Linux devices. Implementation Snap!
is built on top of Morphic.js, a Morphic GUI library, which serves as a 'middle layer' between Snap! itself and 'bare' JavaScript. It uses the HTML5 Canvas application programming interface (API). All things visible in Snap! are morphs themselves, i.e. all buttons, sliders, dialog boxes, menus, entry fields, text rendering, blinking cursors, etc. are created with morphic.js rather than with HTML DOM elements. Snap! caches the shapes of sprites so that a sprite does not have to be re-drawn onto a new Canvas element every time the mouse moves over its bounding box. It does not cache blocks, however; instead it manages the insides of C-shaped blocks through the morphic "holes" mechanism. All user interaction is triggered by events, which are passed on from the root element, "the world", to its submorphs. Dropping a morph causes it to become embedded in a new 'owner' ('parent') morph. In Morphic, the preferred way to run an animation is to register it with the World by adding it to the World's animation queue. The World steps each registered animation once per display cycle, independently of the Morphic stepping mechanism. Recognition Snap! has been recognized by the Logo Foundation and reviewed in an online magazine for programmers. As of December 2014, 100 New York City (NYC) high schools had introduced the University of California, Berkeley's "Beauty and Joy of Computing" as a new AP Computer Science Principles course, using Snap!. Jens and Brian received the National Technology Leadership Summit (NTLS) 2020 Educational Leadership Award for lifetime achievement, based in part on Snap!.
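The streams library mentioned above treats infinite sequences, such as the Fibonacci numbers, as lazy lists that are only evaluated on demand. The same idea can be sketched in a text language with Python generators (an analogy only; Snap! blocks are graphical, and the function names below simply mirror the block names mentioned above rather than any real API):

```python
# A lazy "stream" in the spirit of Snap!'s streams library, sketched with
# Python generators: elements are produced only when demanded.

def fibonacci_stream():
    """Yield the infinite Fibonacci sequence lazily, one element on demand."""
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

def map_over_stream(fn, stream):
    """Counterpart of the 'map ( ) over stream' block: a new lazy stream."""
    for item in stream:
        yield fn(item)

def show_stream(stream, n):
    """Counterpart of 'show stream': materialize the first n elements."""
    return [next(stream) for _ in range(n)]

print(show_stream(fibonacci_stream(), 8))
# -> [0, 1, 1, 2, 3, 5, 8, 13]
print(show_stream(map_over_stream(lambda x: x * x, fibonacci_stream()), 5))
# -> [0, 1, 1, 4, 9]
```

Because the stream is lazy, the "complete, infinite" sequence can be defined once and consumed a prefix at a time, which is exactly what the Snap! blocks make visible to students.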
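The Morphic mechanics described above (events flowing from "the world" down to submorphs, and a world-level animation queue stepped once per display cycle) can be sketched as follows. This is an illustration of the pattern only, not morphic.js itself; all class and method names are invented for the sketch:

```python
# Minimal sketch of the Morphic pattern: every visible thing is a morph,
# user-interaction events travel from the root "world" morph down to its
# submorphs, and the world steps its registered animations each cycle.

class Morph:
    def __init__(self, name):
        self.name = name
        self.submorphs = []
        self.received = []        # events this morph has handled

    def add(self, child):
        # Dropping a morph embeds it in a new 'owner' (parent) morph.
        self.submorphs.append(child)

    def handle(self, event):
        self.received.append(event)
        for child in self.submorphs:   # pass the event on down the tree
            child.handle(event)

class World(Morph):
    """The root element; also owns the animation queue."""
    def __init__(self):
        super().__init__("world")
        self.animations = []      # callables registered with the World

    def step_animations(self):
        # One step per display cycle, independent of normal stepping.
        for animation in self.animations:
            animation()

world = World()
button = Morph("button")
world.add(button)
world.handle("mouseMove")          # the event reaches both world and button
ticks = []
world.animations.append(lambda: ticks.append("tick"))
world.step_animations()
print(button.received, ticks)
# -> ['mouseMove'] ['tick']
```

The point of the design, as in morphic.js, is that one uniform tree of morphs handles both rendering and event dispatch, so no HTML DOM elements are needed.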
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Template:Orion_(constellation)] | [TOKENS: 239] |
Template:Orion (constellation) {{Orion (constellation)}} is a {{navbox}}-based template to be used at the bottom of articles about astronomical objects beyond the Solar System, located in the constellation of Orion. The template is divided into categories of stars, star clusters, nebulae, exoplanets, galaxies, galaxy clusters, and a miscellaneous "other" category. Red links Please refrain from adding red links and redirects to this navbox unless you intend to create an article on the object linked. Per guidelines on red links, "Red links may be used on navigation templates with links to existing articles, but they cannot be excessive. Editors who add excessive red links to navboxes are expected to actively work on building those articles, or they may be removed from the template." Collapsible option This template's initial visibility currently defaults to autocollapse, meaning that if there is another collapsible item on the page (a navbox, sidebar, or table with the collapsible attribute), it is hidden apart from its title bar; if not, it is fully visible. To change this template's initial visibility, the |state= parameter may be used. Related templates
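The |state= parameter is used like any other {{navbox}} parameter when transcluding the template. For example (illustrative wikitext, using the standard navbox state values):

```wikitext
{{Orion (constellation)}}                  <!-- default: autocollapse -->
{{Orion (constellation)|state=collapsed}}  <!-- always start collapsed -->
{{Orion (constellation)|state=expanded}}   <!-- always start fully visible -->
```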
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/OpenAI#cite_note-210] | [TOKENS: 8773] |
OpenAI OpenAI is an American artificial intelligence research organization comprising both a non-profit foundation and a controlled for-profit public benefit corporation (PBC), headquartered in San Francisco. It aims to develop "safe and beneficial" artificial general intelligence (AGI), which it defines as "highly autonomous systems that outperform humans at most economically valuable work". OpenAI is widely recognized for its development of the GPT family of large language models, the DALL-E series of text-to-image models, and the Sora series of text-to-video models, which have influenced industry research and commercial applications. Its release of ChatGPT in November 2022 has been credited with catalyzing widespread interest in generative AI. The organization was founded in Delaware in 2015 but has since evolved a complex corporate structure. As of October 2025, following a restructuring approved by California and Delaware regulators, the non-profit OpenAI Foundation holds 26% of the for-profit OpenAI Group PBC, with Microsoft holding 27% and employees and other investors holding 47%. Under its governance arrangements, the OpenAI Foundation holds the authority to appoint the board of the for-profit OpenAI Group PBC, a mechanism designed to align the entity's strategic direction with the Foundation's charter. Microsoft has invested over $13 billion into OpenAI and provides Azure cloud computing resources. In October 2025, OpenAI conducted a $6.6 billion share sale that valued the company at $500 billion. In 2023 and 2024, OpenAI faced multiple lawsuits alleging copyright infringement, brought by authors and media companies whose work was used to train some of OpenAI's products. In November 2023, OpenAI's board removed Sam Altman as CEO, citing a lack of confidence in him, but reinstated him five days later following a reconstruction of the board.
Throughout 2024, roughly half of OpenAI's then-employed AI safety researchers left the company, citing its prominent role in an industry-wide problem. Founding In December 2015, OpenAI was founded as a not-for-profit organization by Sam Altman, Elon Musk, Ilya Sutskever, Greg Brockman, Trevor Blackwell, Vicki Cheung, Andrej Karpathy, Durk Kingma, John Schulman, Pamela Vagata, and Wojciech Zaremba, with Sam Altman and Elon Musk as the co-chairs. A total of $1 billion in capital was pledged by Sam Altman, Greg Brockman, Elon Musk, Reid Hoffman, Jessica Livingston, Peter Thiel, Amazon Web Services (AWS), and Infosys. However, the capital actually collected significantly lagged the pledges; according to company disclosures, only $130 million had been received by 2019. In its founding charter, OpenAI stated an intention to collaborate openly with other institutions by making certain patents and research publicly available, but it later restricted access to its most capable models, citing competitive and safety concerns. OpenAI was initially run from Brockman's living room. It was later headquartered at the Pioneer Building in the Mission District, San Francisco. According to OpenAI's charter, its founding mission is "to ensure that artificial general intelligence (AGI)—by which we mean highly autonomous systems that outperform humans at most economically valuable work—benefits all of humanity." Musk and Altman stated in 2015 that they were partly motivated by concerns about AI safety and existential risk from artificial general intelligence. OpenAI stated that "it's hard to fathom how much human-level AI could benefit society", and that it is equally difficult to comprehend "how much it could damage society if built or used incorrectly".
The startup also wrote that AI "should be an extension of individual human wills and, in the spirit of liberty, as broadly and evenly distributed as possible", and that "because of AI's surprising history, it's hard to predict when human-level AI might come within reach. When it does, it'll be important to have a leading research institution which can prioritize a good outcome for all over its own self-interest." Co-chair Sam Altman expected a decades-long project that would eventually surpass human intelligence. Brockman met with Yoshua Bengio, one of the "founding fathers" of deep learning, and drew up a list of great AI researchers; he was able to hire nine of them as the first employees in December 2015. OpenAI did not pay AI researchers salaries comparable to those of Facebook or Google, nor did it offer the stock options that AI researchers typically receive. Nevertheless, OpenAI spent $7 million on its first 52 employees in 2016. OpenAI's potential and mission drew these researchers to the firm; a Google employee said he was willing to leave Google for OpenAI "partly because of the very strong group of people and, to a very large extent, because of its mission." OpenAI co-founder Wojciech Zaremba stated that he turned down "borderline crazy" offers of two to three times his market value to join OpenAI instead. In April 2016, OpenAI released a public beta of "OpenAI Gym", its platform for reinforcement learning research. Nvidia gifted its first DGX-1 supercomputer to OpenAI in August 2016 to help it train larger and more complex AI models, with the capability of reducing processing time from six days to two hours. In December 2016, OpenAI released "Universe", a software platform for measuring and training an AI's general intelligence across the world's supply of games, websites, and other applications. Corporate structure In 2019, OpenAI transitioned from non-profit to "capped" for-profit, with profit capped at 100 times any investment.
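The arithmetic of such a cap can be sketched as follows (a simplified illustration with hypothetical figures; the actual terms of OpenAI's arrangement varied by investor and funding round):

```python
# Simplified illustration of a "capped" profit structure in which an
# investor's return is limited to a fixed multiple of the investment,
# with any excess flowing to the controlling nonprofit.

def capped_return(investment, gross_return, cap_multiple=100):
    """Split a gross return between the investor (up to the cap)
    and the nonprofit (everything above the cap)."""
    cap = investment * cap_multiple
    investor_share = min(gross_return, cap)
    excess_to_nonprofit = max(0.0, gross_return - cap)
    return investor_share, excess_to_nonprofit

# A $10M investment is capped at a $1B return under a 100x multiple,
# even if the stake grows to $1.5B; the remaining $0.5B goes to the nonprofit.
print(capped_return(10e6, 1.5e9))
# -> (1000000000.0, 500000000.0)
```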
According to OpenAI, the capped-profit model allows OpenAI Global, LLC to legally attract investment from venture funds and, in addition, to grant employees stakes in the company. Many top researchers work for Google Brain, DeepMind, or Facebook, which offer equity that a nonprofit would be unable to match. Before the transition, OpenAI was legally required to publicly disclose the compensation of its top employees. The company then distributed equity to its employees and partnered with Microsoft, announcing an investment package of $1 billion into the company. Since then, OpenAI systems have run on an Azure-based supercomputing platform from Microsoft. OpenAI Global, LLC then announced its intention to commercially license its technologies. It planned to spend $1 billion "within five years, and possibly much faster". Altman stated that even a billion dollars may turn out to be insufficient, and that the lab may ultimately need "more capital than any non-profit has ever raised" to achieve artificial general intelligence. The nonprofit, OpenAI, Inc., is the sole controlling shareholder of OpenAI Global, LLC, which, despite being a for-profit company, retains a formal fiduciary responsibility to OpenAI, Inc.'s nonprofit charter. A majority of OpenAI, Inc.'s board is barred from having financial stakes in OpenAI Global, LLC. In addition, minority members with a stake in OpenAI Global, LLC are barred from certain votes due to conflict of interest. Some researchers have argued that OpenAI Global, LLC's switch to for-profit status is inconsistent with OpenAI's claims to be "democratizing" AI. On February 29, 2024, Elon Musk filed a lawsuit against OpenAI and CEO Sam Altman, accusing them of shifting focus from public benefit to profit maximization—a case OpenAI dismissed as "incoherent" and "frivolous," though Musk later revived legal action against Altman and others in August. 
On April 9, 2024, OpenAI countersued Musk in federal court, alleging that he had engaged in "bad-faith tactics" to slow the company's progress and seize its innovations for his personal benefit. OpenAI also argued that Musk had previously supported the creation of a for-profit structure and had expressed interest in controlling OpenAI himself. The countersuit seeks damages and legal measures to prevent further alleged interference. On February 10, 2025, a consortium of investors led by Elon Musk submitted a $97.4 billion unsolicited bid to buy the nonprofit that controls OpenAI, declaring willingness to match or exceed any better offer. The offer was rejected on February 14, 2025, with OpenAI stating that it was not for sale, but the offer complicated Altman's restructuring plan by suggesting a lower bar for how much the nonprofit should be valued. OpenAI, Inc. was originally designed as a nonprofit in order to ensure that AGI "benefits all of humanity" rather than "the private gain of any person". In 2019, it created OpenAI Global, LLC, a capped-profit subsidiary controlled by the nonprofit. In December 2024, OpenAI proposed a restructuring plan to convert the capped-profit into a Delaware-based public benefit corporation (PBC), and to release it from the control of the nonprofit. The nonprofit would sell its control and other assets, getting equity in return, and would use it to fund and pursue separate charitable projects, including in science and education. OpenAI's leadership described the change as necessary to secure additional investments, and claimed that the nonprofit's founding mission to ensure AGI "benefits all of humanity" would be better fulfilled. The plan has been criticized by former employees. A legal letter named "Not For Private Gain" asked the attorneys general of California and Delaware to intervene, stating that the restructuring is illegal and would remove governance safeguards from the nonprofit and the attorneys general.
The letter argues that OpenAI's complex structure was deliberately designed to keep the company accountable to its mission, free of the conflicting pressure to maximize profits. It contends that the nonprofit is best positioned to advance its mission of ensuring AGI benefits all of humanity by continuing to control OpenAI Global, LLC, regardless of the amount of equity it might receive in exchange. PBCs can choose how they balance their mission with profit-making, and controlling shareholders have a large influence on how closely a PBC sticks to its mission. On October 28, 2025, OpenAI announced that it had adopted the new PBC corporate structure after receiving approval from the attorneys general of California and Delaware. Under the new structure, OpenAI's for-profit branch became a public benefit corporation known as OpenAI Group PBC, while the nonprofit was renamed the OpenAI Foundation. The OpenAI Foundation holds a 26% stake in the PBC, while Microsoft holds a 27% stake and the remaining 47% is owned by employees and other investors. All members of the OpenAI Group PBC board of directors will be appointed by the OpenAI Foundation, which can remove them at any time. Members of the Foundation's board will also serve on the for-profit board. The new structure allows the for-profit PBC to raise investor funds like most traditional tech companies, including through an initial public offering, which Altman said was the most likely path forward. In January 2023, OpenAI Global, LLC was in talks for funding that would value the company at $29 billion, double its 2021 valuation. On January 23, 2023, Microsoft announced a new US$10 billion multiyear investment in OpenAI Global, LLC, part of which was reportedly provided as credits for Microsoft's cloud-computing service Azure. From September to December 2023, Microsoft rebranded the variants of its Copilot assistant as Microsoft Copilot, added it to many installations of Windows, and released Microsoft Copilot mobile apps.
Following OpenAI's 2025 restructuring, Microsoft owns a 27% stake in the for-profit OpenAI Group PBC, valued at $135 billion. In a deal announced the same day, OpenAI agreed to purchase $250 billion of Azure services, with Microsoft ceding its right of first refusal over OpenAI's future cloud computing purchases. As part of the deal, OpenAI will continue to share 20% of its revenue with Microsoft until it achieves AGI, a milestone that must now be verified by an independent panel of experts. The deal also loosened restrictions on both companies working with third parties, allowing Microsoft to pursue AGI independently and allowing OpenAI to develop products with other companies. In 2017, OpenAI spent $7.9 million, a quarter of its functional expenses, on cloud computing alone. In comparison, DeepMind's total expenses in 2017 were $442 million. In the summer of 2018, training OpenAI's Dota 2 bots required renting 128,000 CPUs and 256 GPUs from Google for multiple weeks. In October 2024, OpenAI completed a $6.6 billion capital raise at a $157 billion valuation, including investments from Microsoft, Nvidia, and SoftBank. On January 21, 2025, Donald Trump announced The Stargate Project, a joint venture between OpenAI, Oracle, SoftBank and MGX to build an AI infrastructure system in conjunction with the US government. The project takes its name from OpenAI's existing "Stargate" supercomputer project and is estimated to cost $500 billion. The partners planned to fund the project over the next four years. In July 2025, the United States Department of Defense announced that OpenAI had received a $200 million contract for AI in the military, along with Anthropic, Google, and xAI. In the same month, the company made a deal with the UK Government to use ChatGPT and other AI tools in public services. OpenAI subsequently began a $50 million fund to support nonprofit and community organizations.
In April 2025, OpenAI raised $40 billion at a $300 billion post-money valuation, the highest-value private technology deal in history. The financing round was led by SoftBank, with other participants including Microsoft, Coatue, Altimeter and Thrive. In July 2025, the company reported annualized revenue of $12 billion, up from $3.7 billion in 2024. The growth was driven by ChatGPT subscriptions, which reached 20 million paid subscribers by April 2025 (up from 15.5 million at the end of 2024), alongside a rapidly expanding enterprise customer base that grew to five million business users. The company's cash burn remains high because of the intensive computational costs required to train and operate large language models; it projects an $8 billion operating loss in 2025. OpenAI reports revised long-term spending projections totaling approximately $115 billion through 2029, with annual expenditures projected to escalate significantly, reaching $17 billion in 2026, $35 billion in 2027, and $45 billion in 2028. These expenditures are primarily allocated toward expanding compute infrastructure, developing proprietary AI chips, constructing data centers, and funding intensive model training, with more than half of the spending through the end of the decade expected to support research-intensive compute for model training and development. The company's financial strategy prioritizes market expansion and technological advancement over near-term profitability, with OpenAI targeting cash-flow-positive operations by 2029 and projecting revenue of approximately $200 billion by 2030. This spending trajectory reflects both the enormous capital requirements of scaling cutting-edge AI and OpenAI's intent to maintain its leading position in the industry. In October 2025, OpenAI completed an employee share sale of up to $10 billion to existing investors, valuing the company at $500 billion.
The deal made OpenAI the world's most valuable privately held company, surpassing SpaceX. On November 17, 2023, Sam Altman was removed as CEO after the board of directors (composed of Helen Toner, Ilya Sutskever, Adam D'Angelo and Tasha McCauley) cited a lack of confidence in him. Chief Technology Officer Mira Murati took over as interim CEO. Greg Brockman, the president of OpenAI, was also removed as chairman of the board and resigned from the company's presidency shortly thereafter. Three senior OpenAI researchers subsequently resigned: director of research and GPT-4 lead Jakub Pachocki, head of AI risk Aleksander Mądry, and researcher Szymon Sidor. On November 18, 2023, there were reportedly talks of Altman returning as CEO amid pressure placed upon the board by investors such as Microsoft and Thrive Capital, who objected to Altman's departure. Although Altman himself spoke in favor of returning to OpenAI, he has since stated that he considered starting a new company and bringing former OpenAI employees with him if talks to reinstate him did not work out. The board members agreed "in principle" to resign if Altman returned. On November 19, 2023, negotiations with Altman to return failed and Murati was replaced by Emmett Shear as interim CEO. The board initially contacted Anthropic CEO Dario Amodei (a former OpenAI executive) about replacing Altman, and proposed a merger of the two companies, but both offers were declined. On November 20, 2023, Microsoft CEO Satya Nadella announced that Altman and Brockman would be joining Microsoft to lead a new advanced AI research team, but added that they were still committed to OpenAI despite recent events. Before the partnership with Microsoft was finalized, Altman gave the board another opportunity to negotiate with him.
About 738 of OpenAI's 770 employees, including Murati and Sutskever, signed an open letter stating they would quit their jobs and join Microsoft if the board did not rehire Altman and then resign. This prompted OpenAI investors to consider legal action against the board as well. In response, OpenAI management sent an internal memo to employees stating that negotiations with Altman and the board had resumed and would take some time. On November 21, 2023, after continued negotiations, Altman and Brockman returned to the company in their prior roles along with a reconstituted board made up of new members Bret Taylor (as chairman) and Lawrence Summers, with D'Angelo remaining. According to subsequent reporting, shortly before Altman's firing, some employees raised concerns to the board about how he had handled the safety implications of a recent internal AI capability discovery. On November 29, 2023, OpenAI announced that Microsoft would hold a non-voting observer seat on the board; Microsoft relinquished the seat in July 2024. In February 2024, the Securities and Exchange Commission subpoenaed OpenAI's internal communications to determine whether Altman's alleged lack of candor had misled investors. In 2024, following the temporary removal of Sam Altman and his return, many employees gradually left OpenAI, including most of the original leadership team and a significant number of AI safety researchers. In August 2023, it was announced that OpenAI had acquired the New York-based start-up Global Illumination, a company that deploys AI to develop digital infrastructure and creative tools. In June 2024, OpenAI acquired Multi, a startup focused on remote collaboration. In March 2025, OpenAI reached a deal with CoreWeave to acquire $350 million worth of CoreWeave shares and access to AI infrastructure, in return for $11.9 billion paid over five years. Microsoft was already CoreWeave's biggest customer in 2024.
Alongside their other business dealings, OpenAI and Microsoft were renegotiating the terms of their partnership to facilitate a potential future initial public offering by OpenAI, while ensuring Microsoft's continued access to advanced AI models. On May 21, OpenAI announced the $6.5 billion acquisition of io, an AI hardware start-up founded by former Apple designer Jony Ive in 2024. In September 2025, OpenAI agreed to acquire the product testing startup Statsig for $1.1 billion in an all-stock deal and appointed Statsig's founding CEO Vijaye Raji as OpenAI's chief technology officer of applications. The company also announced development of an AI-driven hiring service designed to rival LinkedIn. OpenAI acquired personal finance app Roi in October 2025. In October 2025, OpenAI acquired Software Applications Incorporated, the developer of Sky, a macOS-based natural language interface designed to operate across desktop applications. The Sky team joined OpenAI, and the company announced plans to integrate Sky's capabilities into ChatGPT. In December 2025, it was announced that OpenAI had agreed to acquire Neptune, an AI tooling startup that helps companies track and manage model training, for an undisclosed amount. In January 2026, it was announced that OpenAI had acquired healthcare technology startup Torch for approximately $60 million. The acquisition followed the launch of OpenAI's ChatGPT Health product and was intended to strengthen the company's medical data and healthcare artificial intelligence capabilities. OpenAI has been criticized for outsourcing the annotation of data sets to Sama, a company based in San Francisco that employed workers in Kenya. These annotations were used to train an AI model to detect toxicity, which could then be used to moderate toxic content, notably in ChatGPT's training data and outputs. The text snippets to be annotated usually contained detailed descriptions of various types of violence, including sexual violence.
The investigation uncovered that OpenAI began sending snippets of data to Sama as early as November 2021. The four Sama employees interviewed by Time described themselves as mentally scarred. OpenAI paid Sama $12.50 per hour of work, of which Sama passed on the equivalent of between $1.32 and $2.00 per hour post-tax to its annotators. Sama's spokesperson said that the $12.50 also covered other implicit costs, among them infrastructure expenses, quality assurance and management. In 2024, OpenAI began collaborating with Broadcom to design a custom AI chip capable of both training and inference, targeted for mass production in 2026 and to be manufactured by TSMC on a 3 nm process node. The initiative was intended to reduce OpenAI's dependence on Nvidia GPUs, which are costly and face high demand in the market. In January 2024, Arizona State University purchased ChatGPT Enterprise in OpenAI's first deal with a university. In June 2024, Apple Inc. signed a contract with OpenAI to integrate ChatGPT features into its products as part of its new Apple Intelligence initiative. In June 2025, OpenAI began renting Google Cloud's Tensor Processing Units (TPUs) to support ChatGPT and related services, marking its first meaningful use of non-Nvidia AI chips. In September 2025, it was revealed that OpenAI had signed a contract with Oracle to purchase $300 billion in computing power over the next five years. That same month, OpenAI and NVIDIA announced a memorandum of understanding that included a potential deployment of at least 10 gigawatts of NVIDIA systems and a $100 billion investment from NVIDIA in OpenAI. OpenAI expected the negotiations to be completed within weeks; as of January 2026, the deal had not been finalized, and the two sides were rethinking the future of their partnership. In October 2025, OpenAI announced a multi-billion dollar deal with AMD, committing to purchase six gigawatts' worth of AMD chips, starting with the MI450.
OpenAI will have the option to buy up to 160 million shares of AMD, about 10% of the company, depending on development, performance and share-price targets. In December 2025, Disney said it would make a $1 billion investment in OpenAI and signed a three-year licensing deal that will let users generate videos using Sora, OpenAI's short-form AI video platform. More than 200 Disney, Marvel, Star Wars and Pixar characters will be available to OpenAI users. In early 2026, Amazon entered advanced discussions to invest up to $50 billion in OpenAI as part of a potential artificial intelligence partnership. Under the proposed agreement, OpenAI's models could be integrated into Amazon's digital assistant Alexa and other internal projects. OpenAI provides LLMs to the Artificial Intelligence Cyber Challenge and to the Advanced Research Projects Agency for Health. In October 2024, The Intercept revealed that OpenAI's tools are considered "essential" for AFRICOM's mission and included in an "Exception to Fair Opportunity" contractual agreement between the United States Department of Defense and Microsoft. In December 2024, OpenAI said it would partner with defense-tech company Anduril to build drone defense technologies for the United States and its allies. In 2025, OpenAI's Chief Product Officer, Kevin Weil, was commissioned as a lieutenant colonel in the U.S. Army to join Detachment 201 as a senior advisor. In June 2025, the U.S. Department of Defense awarded OpenAI a $200 million one-year contract to develop AI tools for military and national security applications. OpenAI announced a new program, OpenAI for Government, to give federal, state, and local governments access to its models, including ChatGPT.

Services

In February 2019, GPT-2 was announced, which gained attention for its ability to generate human-like text. In 2020, OpenAI announced GPT-3, a language model trained on large internet datasets.
GPT-3 is aimed at answering questions in natural language, but it can also translate between languages and coherently generate improvised text. OpenAI also announced that an associated API, named simply "the API", would form the heart of its first commercial product. Eleven employees left OpenAI, mostly between December 2020 and January 2021, to establish Anthropic. In 2021, OpenAI introduced DALL-E, a specialized deep learning model adept at generating complex digital images from textual descriptions, utilizing a variant of the GPT-3 architecture. In December 2022, OpenAI received widespread media coverage after launching a free preview of ChatGPT, its new AI chatbot based on GPT-3.5. According to OpenAI, the preview received over a million signups within the first five days. According to anonymous sources cited by Reuters in December 2022, OpenAI Global, LLC was projecting $200 million of revenue in 2023 and $1 billion in revenue in 2024. After ChatGPT was launched, Google announced a similar chatbot, Bard, amid internal concerns that ChatGPT could threaten Google's position as a primary source of online information. On February 7, 2023, Microsoft announced that it was building AI technology based on the same foundation as ChatGPT into Microsoft Bing, Edge, Microsoft 365 and other products. On March 14, 2023, OpenAI released GPT-4, both as an API (with a waitlist) and as a feature of ChatGPT Plus. On November 6, 2023, OpenAI launched GPTs, allowing individuals to create customized versions of ChatGPT for specific purposes, further expanding the possibilities of AI applications across various industries. On November 14, 2023, OpenAI announced that it had temporarily suspended new sign-ups for ChatGPT Plus due to high demand. Access for new subscribers re-opened a month later, on December 13. In December 2024, the company launched the Sora model. It also launched OpenAI o1, an early reasoning model internally codenamed Strawberry.
Additionally, ChatGPT Pro, a $200/month subscription service offering unlimited o1 access and enhanced voice features, was introduced, and preliminary benchmark results for the upcoming OpenAI o3 models were shared. On January 23, 2025, OpenAI released Operator, an AI agent and web automation tool for accessing websites to execute goals defined by users. The feature was only available to Pro users in the United States. Nine days later, OpenAI released its deep research agent, which scored 27% accuracy on the benchmark Humanity's Last Exam (HLE). Altman later stated that GPT-4.5 would be the last model without full chain-of-thought reasoning. In July 2025, reports indicated that AI models by both OpenAI and Google DeepMind solved mathematics problems at the level of top-performing students in the International Mathematical Olympiad. OpenAI's large language model was able to achieve gold medal-level performance, reflecting significant progress in AI's reasoning abilities. On October 6, 2025, OpenAI unveiled its Agent Builder platform during the company's DevDay event. The platform includes a visual drag-and-drop interface that lets developers and businesses design, test, and deploy agentic workflows with limited coding. On October 21, 2025, OpenAI introduced ChatGPT Atlas, a browser integrating the ChatGPT assistant directly into web navigation, to compete with existing browsers such as Google Chrome and Apple Safari. On December 11, 2025, OpenAI announced GPT-5.2, which it said is better at creating spreadsheets, building presentations, perceiving images, writing code, and understanding long context. On January 27, 2026, OpenAI introduced Prism, a LaTeX-native workspace meant to assist scientists with research and writing. The platform uses GPT-5.2 as a backend to automate the drafting of scientific papers, including features for managing citations, formatting complex equations, and real-time collaborative editing.
In March 2023, the company was criticized for disclosing particularly few technical details about products like GPT-4, contradicting its initial commitment to openness and making it harder for independent researchers to replicate its work and develop safeguards. OpenAI cited competitiveness and safety concerns to justify this change of approach. OpenAI's former chief scientist Ilya Sutskever argued in 2023 that open-sourcing increasingly capable models was increasingly risky, and that the safety reasons for not open-sourcing the most potent AI models would become "obvious" in a few years. In September 2025, OpenAI published a study on how people use ChatGPT for everyday tasks. The study found that "non-work tasks" (according to an LLM-based classifier) account for more than 72 percent of all ChatGPT usage, with only a minority of overall usage related to business productivity. In July 2023, OpenAI launched the superalignment project, aiming to determine within four years how to align future superintelligent systems. OpenAI promised to dedicate 20% of its computing resources to the project, although the team later said it had received nothing close to that share. OpenAI ended the project in May 2024 after its co-leaders Ilya Sutskever and Jan Leike left the company. In August 2025, OpenAI was criticized after thousands of private ChatGPT conversations were inadvertently exposed to public search engines like Google through an experimental "share with search engines" feature. The opt-in toggle, intended to let users make specific chats discoverable, resulted in some discussions, including personal details such as names, locations, and intimate topics, appearing in search results when users accidentally enabled it while sharing links. OpenAI announced the feature's permanent removal on August 1, 2025, and began coordinating with search providers to remove the exposed content, emphasizing that it was not a security breach but a design flaw that heightened privacy risks.
CEO Sam Altman acknowledged the issue in a podcast, noting that users often treat ChatGPT as a confidant for deeply personal matters, which amplified concerns about AI handling sensitive data.

Management

In 2018, Musk resigned from his board seat, citing "a potential future conflict [of interest]" with his role as CEO of Tesla due to Tesla's AI development for self-driving cars. OpenAI stated that Musk's financial contributions were below $45 million. On March 3, 2023, Reid Hoffman resigned from his board seat, citing a desire to avoid conflicts of interest with his investments in AI companies via Greylock Partners and his co-founding of the AI startup Inflection AI. Hoffman remained on the board of Microsoft, a major investor in OpenAI. In May 2024, Chief Scientist Ilya Sutskever resigned and was succeeded by Jakub Pachocki. Co-leader Jan Leike also departed amid concerns over safety and trust. OpenAI then signed deals with Reddit, News Corp, Axios, and Vox Media. Paul Nakasone then joined the board of OpenAI. In August 2024, cofounder John Schulman left OpenAI to join Anthropic, and OpenAI's president Greg Brockman took extended leave until November. In September 2024, CTO Mira Murati left the company. In November 2025, Lawrence Summers resigned from the board of directors.

Governance and legal issues

In May 2023, Sam Altman, Greg Brockman and Ilya Sutskever posted recommendations for the governance of superintelligence. They stated that superintelligence could happen within the next 10 years, allowing a "dramatically more prosperous future", and that "given the possibility of existential risk, we can't just be reactive". They proposed creating an international watchdog organization similar to the IAEA to oversee AI systems above a certain capability threshold, while suggesting that relatively weak AI systems below that threshold should not be overly regulated.
They also called for more technical safety research on superintelligence, and asked for more coordination, for example through governments launching a joint project which "many current efforts become part of". In July 2023, the FTC issued a civil investigative demand to OpenAI to investigate whether the company's data security and privacy practices in developing ChatGPT were unfair or harmed consumers (including by reputational harm) in violation of Section 5 of the Federal Trade Commission Act. Such demands are typically preliminary, nonpublic investigative matters, but the FTC's document was leaked. The investigation concerned allegations that the company had scraped public data and published false and defamatory information. The FTC asked OpenAI for comprehensive information about its technology and privacy safeguards, as well as any steps taken to prevent the recurrence of situations in which its chatbot generated false and derogatory content about people. The agency also raised concerns about "circular" spending arrangements, for example Microsoft extending Azure credits to OpenAI while both companies shared engineering talent, and warned that such structures could negatively affect the public. In September 2024, OpenAI's global affairs chief endorsed the UK's "smart" AI regulation during testimony to a House of Lords committee. In February 2025, OpenAI CEO Sam Altman stated that the company was interested in collaborating with the People's Republic of China, despite regulatory restrictions imposed by the U.S. government. This shift came in response to the growing influence of the Chinese artificial intelligence company DeepSeek, which has disrupted the AI market with open models, including DeepSeek V3 and DeepSeek R1. Following DeepSeek's market emergence, OpenAI enhanced security protocols to protect proprietary development techniques from industrial espionage.
Some industry observers noted similarities between DeepSeek's model distillation approach and OpenAI's methodology, though no formal intellectual property claim was filed. According to Oliver Roberts, as of March 2025 the United States had 781 state AI bills or laws. OpenAI advocated for preempting state AI laws with federal legislation. According to Scott Kohler, OpenAI has opposed California's AI legislation, arguing that such state bills encroach on matters better handled at the federal level. Public Citizen opposed federal preemption of state AI laws and pointed to OpenAI's growth and valuation as evidence that existing state laws have not hampered innovation. Before May 2024, OpenAI required departing employees to sign a lifelong non-disparagement agreement forbidding them from criticizing OpenAI and from acknowledging the existence of the agreement. Daniel Kokotajlo, a former employee, publicly stated that he forfeited his vested equity in OpenAI in order to leave without signing the agreement. Sam Altman stated that he was unaware of the equity cancellation provision, and that OpenAI never enforced it to cancel any employee's vested equity; however, leaked documents and emails contradicted this claim. On May 23, 2024, OpenAI sent a memo releasing former employees from the agreement. OpenAI was sued for copyright infringement by authors Sarah Silverman, Matthew Butterick, Paul Tremblay and Mona Awad in July 2023. In September 2023, 17 authors, including George R. R. Martin, John Grisham, Jodi Picoult and Jonathan Franzen, joined the Authors Guild in filing a class action lawsuit against OpenAI, alleging that the company's technology was illegally using their copyrighted work. The New York Times also sued the company in late December 2023. In May 2024, it was revealed that OpenAI had destroyed its Books1 and Books2 training datasets, which were used in the training of GPT-3 and which the Authors Guild believed to have contained over 100,000 copyrighted books.
In 2021, OpenAI developed a speech recognition tool called Whisper, which it used to transcribe more than one million hours of YouTube videos into text for training GPT-4. The automated transcription of YouTube videos raised concerns among OpenAI employees regarding potential violations of YouTube's terms of service, which prohibit the use of videos for applications independent of the platform, as well as any type of automated access to its videos. Despite these concerns, the project proceeded with notable involvement from OpenAI's president, Greg Brockman, and the resulting dataset proved instrumental in training GPT-4. In February 2024, The Intercept, Raw Story and Alternate Media Inc. filed lawsuits against OpenAI alleging copyright infringement. The lawsuits are said to have charted a new legal strategy for digital-only publishers to sue OpenAI. On April 30, 2024, eight newspapers filed a lawsuit in the Southern District of New York against OpenAI and Microsoft, claiming illegal harvesting of their copyrighted articles. The suing publications included The Mercury News, The Denver Post, The Orange County Register, St. Paul Pioneer Press, Chicago Tribune, Orlando Sentinel, Sun Sentinel, and New York Daily News. In June 2023, a lawsuit claimed that OpenAI had scraped 300 billion words online without consent and without registering as a data broker. It was filed in San Francisco, California, by sixteen anonymous plaintiffs, who also claimed that OpenAI and its partner and customer Microsoft continued to unlawfully collect and use personal data from millions of consumers worldwide to train artificial intelligence models. On May 22, 2024, OpenAI entered into an agreement with News Corp to integrate news content from The Wall Street Journal, the New York Post, The Times, and The Sunday Times into its AI platform.
Meanwhile, other publications like The New York Times chose to sue OpenAI and Microsoft for copyright infringement over the use of their content to train AI models. In November 2024, a coalition of Canadian news outlets, including the Toronto Star, Metroland Media, Postmedia, The Globe and Mail, The Canadian Press and CBC, sued OpenAI for using their news articles to train its software without permission. In October 2024, in a New York Times interview, Suchir Balaji accused OpenAI of violating copyright law in developing its commercial LLMs, which he had helped engineer. He was a likely witness in a major copyright trial against the AI company, and was one of several of its current or former employees named in court filings as potentially having documents relevant to the case. On November 26, 2024, Balaji died by suicide. His death prompted the circulation of conspiracy theories alleging that he had been deliberately silenced; California Congressman Ro Khanna endorsed calls for an investigation. On April 24, 2025, Ziff Davis sued OpenAI in Delaware federal court for copyright infringement. Ziff Davis is known for publications such as ZDNet, PCMag, CNET, IGN and Lifehacker. In April 2023, the EU's European Data Protection Board (EDPB) formed a dedicated task force on ChatGPT "to foster cooperation and to exchange information on possible enforcement actions conducted by data protection authorities", based on the "enforcement action undertaken by the Italian data protection authority against OpenAI about the ChatGPT service". In late April 2024, NOYB filed a complaint with the Austrian Datenschutzbehörde against OpenAI for violating the European General Data Protection Regulation: a text created with ChatGPT gave a false date of birth for a living person without giving the individual the option to see the personal data used in the process, and a request to correct the mistake was denied.
Additionally, OpenAI claimed that neither the recipients of ChatGPT's output nor the sources it used could be disclosed. OpenAI was criticized for lifting its ban on using ChatGPT for "military and warfare". Until January 10, 2024, its "usage policies" included a ban on "activity that has high risk of physical harm, including", specifically, "weapons development" and "military and warfare". Its new policies prohibit "[using] our service to harm yourself or others" and using it to "develop or use weapons". In August 2025, the parents of a 16-year-old boy who died by suicide filed a wrongful death lawsuit against OpenAI (and CEO Sam Altman), alleging that months of conversations with ChatGPT about mental health and methods of self-harm contributed to their son's death and that safeguards were inadequate for minors. OpenAI expressed condolences and said it was strengthening protections, including updated crisis-response behavior and parental controls. Coverage described it as a first-of-its-kind wrongful death case targeting the company's chatbot. The complaint was filed in California state court in San Francisco. In November 2025, the Social Media Victims Law Center and Tech Justice Law Project filed seven lawsuits against OpenAI, four of which alleged wrongful death. The suits were filed on behalf of Zane Shamblin, 23, of Texas; Amaurie Lacey, 17, of Georgia; Joshua Enneking, 26, of Florida; and Joe Ceccanti, 48, of Oregon, who each died by suicide after prolonged ChatGPT usage. In December 2025, the estate of Suzanne Adams sued OpenAI over her death; Adams was allegedly murdered by her son, Stein-Erik Soelberg, 56, who had often discussed his paranoid delusions with ChatGPT in the months prior. The estate claimed that the company shared responsibility due to the risk of so-called chatbot psychosis, which is not a recognized medical diagnosis. OpenAI responded that it would make ChatGPT safer for users disconnected from reality.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/SNOBOL]
SNOBOL

SNOBOL (StriNg Oriented and symBOlic Language) is a series of programming languages developed between 1962 and 1967 at AT&T Bell Laboratories by David J. Farber, Ralph Griswold and Ivan P. Polonsky, culminating in SNOBOL4. It was one of a number of text-string-oriented languages developed during the 1950s and 1960s; others included COMIT and TRAC. Despite the similar name, it is entirely unlike COBOL.

SNOBOL4 stands apart from most programming languages of its era by having patterns as a first-class data type (a data type whose values can be manipulated in all ways permitted to any other data type in the language) and by providing operators for pattern concatenation and alternation. SNOBOL4 patterns are a kind of object and admit various manipulations, much as later object-oriented languages such as JavaScript treat their patterns, known as regular expressions. In addition, SNOBOL4 strings generated during execution can be treated as programs and either interpreted or compiled and executed (as with the eval function of other languages).

SNOBOL4 was quite widely taught in larger U.S. universities in the late 1960s and early 1970s and was widely used in the 1970s and 1980s as a text manipulation language in the humanities. In the 1980s and 1990s, its use faded as newer languages such as AWK and Perl made string manipulation by means of regular expressions fashionable. SNOBOL4 patterns include a way to express BNF grammars, which are equivalent to context-free grammars and more powerful than regular expressions. The "regular expressions" in current versions of AWK and Perl are in fact extensions of regular expressions in the traditional sense, but regular expressions, unlike SNOBOL4 patterns, are not recursive, which gives a distinct computational advantage to SNOBOL4 patterns. (Recursive expressions did appear in Perl 5.10, released in December 2007.)
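The first-class patterns and the alternation and concatenation operators mentioned above can be illustrated with a small sketch (reconstructed SNOBOL4, with illustrative variable names; not an example from the original article):

```snobol
* Patterns are first-class values: built with alternation (|) and
* concatenation (juxtaposition), stored in variables, and reused.
          VOWEL = "A" | "E" | "I" | "O" | "U"
* PAIR matches any two adjacent vowels.
          PAIR = VOWEL VOWEL
* Matching is unanchored by default, so this succeeds on the
* "UE" inside "QUEUE" and branches to YES.
          "QUEUE" PAIR                             :S(YES)F(NO)
YES       OUTPUT = "adjacent vowels found"         :(END)
NO        OUTPUT = "no adjacent vowels"
END
```

Because patterns are ordinary values, PAIR could itself be embedded in still larger patterns, which is how BNF-style grammars are expressed.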
The later SL5 (1977) and Icon (1978) languages were designed by Griswold to combine the backtracking of SNOBOL4 pattern matching with more standard ALGOL-like structuring.

Development

The initial SNOBOL language was created as a tool to be used by its authors for the symbolic manipulation of polynomials. It was written in assembly language for the IBM 7090. It had a simple syntax, only one datatype (the string), no functions, no declarations, and very little error control. However, despite its simplicity and its "personal" nature, its use began to spread to other groups. As a result, the authors decided to extend it and tidy it up.

SNOBOL2 did exist, but it was a short-lived intermediate development version without user-defined functions and was never released. SNOBOL was rewritten to add functions, both standard and user-defined, and the result was released as SNOBOL3. SNOBOL3 became quite popular and was rewritten for computers other than the IBM 7090 by other programmers. As a result, several incompatible dialects arose.

As SNOBOL3 became more popular, the authors received more and more requests for extensions to the language. They also began to receive complaints about incompatibility and bugs in versions that they hadn't written. To address this, and to take advantage of the new computers being introduced in the late 1960s, the decision was taken to develop SNOBOL4, with many extra datatypes and features, but based on a virtual machine to allow improved portability across computers. The SNOBOL4 language translator was still written in assembly language. However, the macro features of the assembler were used to define the virtual machine instructions of the SNOBOL Implementation Language (SIL). This greatly improved the portability of the language: the virtual machine hosting the translator could be ported relatively easily by recreating its virtual instructions on any machine that included a macro assembler or, indeed, a high-level language.
The machine-independent language SIL arose as a generalization of string manipulation macros by Douglas McIlroy, which were used extensively in the initial SNOBOL implementation. In 1969, McIlroy influenced the language again by insisting on the addition of the table type to SNOBOL4.

SNOBOL4 features

SNOBOL is distinctive in format and programming style, which are radically different from contemporary procedural languages such as Fortran and ALGOL. SNOBOL4 supports a number of built-in data types, such as integers, limited-precision real numbers, strings, patterns, arrays, and tables (associative arrays), and also allows the programmer to define additional data types and new functions. SNOBOL4's programmer-defined data type facility was advanced at the time; it is similar to the records of the earlier COBOL and the later Pascal programming languages.

All SNOBOL command lines are of the form

    label subject pattern = object :(transfer)

Each of the five elements is optional. In general, the subject is matched against the pattern. If the object is present, any matched portion is replaced by the object via rules for replacement. The transfer can be an absolute branch or a conditional branch dependent upon the success or failure of the subject evaluation, the pattern evaluation, the pattern match, the object evaluation or the final assignment. It can also be a transfer to code created and compiled by the program itself during a run.

A SNOBOL pattern can be very simple or extremely complex. A simple pattern is just a text string (e.g. "ABCD"), but a complex pattern may be a large structure describing, for example, the complete grammar of a computer language. It is possible to implement a language interpreter in SNOBOL almost directly from a Backus–Naur form expression of it, with few changes. Creating a macro assembler and an interpreter for a completely theoretical piece of hardware could take as little as a few hundred lines, with a new instruction being added with a single line.
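The five statement elements (label, subject, pattern, object, transfer) can be shown together in one hedged sketch (literal values chosen purely for illustration):

```snobol
          LINE = "RED ROSES"
* Statement TRY has all five elements: subject LINE, pattern "RED",
* object "BLUE", and a transfer that goes to DONE on a successful
* match (after replacement) or to FAIL otherwise.
TRY       LINE "RED" = "BLUE"                      :S(DONE)F(FAIL)
FAIL      OUTPUT = "no match"                      :(END)
DONE      OUTPUT = LINE
END
```

Here the unanchored match finds "RED" in the subject and replaces it, so the program prints "BLUE ROSES"; dropping the `= object` part would turn the statement into a pure test.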
Complex SNOBOL patterns can do things that would be impractical or impossible using the more primitive regular expressions used in most other pattern-matching languages. Some of this power derives from the so-called "SPITBOL extensions" (which have since been incorporated into essentially all modern implementations of the original SNOBOL4 language), although it is possible to achieve the same power without them. Part of this power comes from the side effects that can be produced during the pattern-matching operation, including saving numerous intermediate/tentative matching results and the ability to invoke user-written functions during the pattern match; such functions can perform nearly any desired processing, then influence the ongoing direction the interrupted pattern match takes, or even change the pattern itself during the matching operation.

Patterns can be saved like any other first-class data item and can be concatenated, used within other patterns, and used to create very complex and sophisticated pattern expressions. It is possible to write, for example, a SNOBOL4 pattern which matches "a complete name and international postal mailing address", which is well beyond anything that is practical to even attempt using regular expressions.

SNOBOL4 pattern-matching uses a backtracking algorithm similar to that used in the logic programming language Prolog, which provides pattern-like constructs via DCGs. This algorithm makes it easier to use SNOBOL as a logic programming language than is the case for most languages. SNOBOL stores variables, strings and data structures in a single garbage-collected heap.

Example programs

The "Hello, World!" program might be as follows...

A simple program to ask for a user's name and then use it in an output sentence...

Use :S (branch on successful match) to choose among three possible outputs...

To continue requesting input until no more is forthcoming...
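These examples can be sketched as follows (reconstructed SNOBOL4; the prompt wording and variable names are illustrative, not necessarily the article's originals). The "Hello, World!" program is a single assignment to the special OUTPUT variable:

```snobol
          OUTPUT = "Hello, World!"
END
```

A program that asks for the user's name, extended with :S branches to choose among three possible outputs:

```snobol
          OUTPUT = "What is your name?"
          Username = INPUT
* Branch to a label on a successful pattern match against the name.
          Username "J"                               :S(LOVE)
          Username "K"                               :S(HATE)
MEH       OUTPUT = "Hi, " Username                   :(END)
LOVE      OUTPUT = "How nice to meet you, " Username :(END)
HATE      OUTPUT = "Oh dear, it's you, " Username
END
```

Continuing to request input until no more is forthcoming follows the same shape: read INPUT in a loop, test each line with a pattern such as LEN(1), and branch back with :S while the test succeeds.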
Implementations

The classic implementation was on the PDP-10; it has been used to study compilers, formal grammars, and artificial intelligence, especially machine translation and machine comprehension of natural languages. The original implementation was on an IBM 7090 at Bell Labs, Holmdel, N.J. SNOBOL4 was specifically designed for portability; the first implementation was started on an IBM 7094 in 1966 but completed on an IBM 360 in 1967. It was rapidly ported to many other platforms. It is normally implemented as an interpreter because of the difficulty of implementing some of its very high-level features, but there is a compiler, the SPITBOL compiler, which provides nearly all the facilities that the interpreter provides.

The classic implementation on the PDP-10 was quite slow, and in 1972 James Gimpel of Bell Labs, Holmdel, N.J. designed a native implementation of SNOBOL4 for the PDP-10 that he named SITBOL. He used the design as the basis of a graduate class in string processing that he taught that year at Stevens Institute of Technology (hence the name SITBOL). Students were given sections to implement (in PDP-10 assembler), and the entire semester was focused on implementing SITBOL. It was over 80% complete by the end of the semester and was subsequently completed by Professor Gimpel and several students over the summer. SITBOL was a full-featured, high-performance SNOBOL4 interpreter.

The GNAT Ada compiler comes with a package (GNAT.Spitbol) that implements all of the SPITBOL string manipulation semantics; it can be called from within an Ada program. The file editor for the Michigan Terminal System (MTS) provided pattern matching based on SNOBOL4 patterns.

Several implementations are currently available. Macro SNOBOL4 in C, written by Phil Budne, is a free, open-source implementation capable of running on almost any platform.
Catspaw, Inc. provided a commercial implementation of the SNOBOL4 language for many different computer platforms, including DOS, Macintosh, Sun, RS/6000, and others; these implementations are now available free from Catspaw. Minnesota SNOBOL4, by Viktors Berstis, the closest PC implementation to the original IBM mainframe version (even including Fortran-like FORMAT statement support), is also free.

Although SNOBOL itself has no structured programming features, a SNOBOL preprocessor called Snostorm was designed and implemented during the 1970s by Fred G. Swartz for use under the Michigan Terminal System (MTS) at the University of Michigan. Snostorm was used at the eight to fifteen sites that ran MTS. It was also available at University College London (UCL) between 1982 and 1984. Snocone, by Andrew Koenig, adds block-structured constructs to the SNOBOL4 language; Snocone is a self-contained programming language rather than a proper superset of SNOBOL4.

The SPITBOL implementation also introduced a number of features which, while not using traditional structured programming keywords, can nevertheless be used to provide many of the capabilities normally thought of as "structured programming", most notably nested if/then/else-type constructs. These features have since been added to most recent SNOBOL4 implementations. After many years as a commercial product, in April 2009 SPITBOL was released as free software under the GNU General Public License.

Naming

According to Dave Farber, he, Griswold and Polonsky "finally arrived at the name Symbolic EXpression Interpreter SEXI."

All went well until one day I was submitting a batch job to assemble the system and as normal on my JOB card — the first card in the deck, I, in BTL standards, punched my job and my name — SEXI Farber. One of the Comp Center girls looked at it and said, "That's what you think" in a humorous way. That made it clear that we needed another name!!
We sat and talked and drank coffee and shot rubber bands and after much too much time someone said — most likely Ralph — "We don't have a Snowball's chance in hell of finding a name". All of us yelled at once, "WE GOT IT — SNOBOL" in the spirit of all the BOL languages. We then stretched our mind to find what it stood for.

Common backronyms of "SNOBOL" are 'String Oriented Symbolic Language' or (as a quasi-initialism) 'StriNg Oriented symBOlic Language'.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Caliphate]
Caliphate

A caliphate (Arabic: خلافة, romanized: khilāfa [xiˈlaːfa]) is an institution or public office under the leadership of an Islamic steward with the title of caliph (/ˈkælɪf, ˈkeɪ-/; خليفة khalīfa [xaˈliːfa]), a person considered a political–religious successor to the Islamic prophet Muhammad and a leader of the entire Muslim world (ummah). Historically, the caliphates were polities based on Islam which developed into multi-ethnic transnational empires.

During the medieval period, three major caliphates succeeded each other: the Rashidun Caliphate (632–661), the Umayyad Caliphate (661–750), and the Abbasid Caliphate (750–1517). In the fourth major caliphate, the Ottoman Caliphate, the rulers of the Ottoman Empire claimed caliphal authority from 1517 until the Ottoman Caliphate was formally abolished as part of the 1924 secularisation of Turkey. The Sharif of Mecca then claimed the title, but this caliphate fell quickly after its conquest by the Sultanate of Nejd (the predecessor of modern-day Saudi Arabia), leaving the claim in dormancy. Throughout the history of Islam, a few other Muslim states, almost all of which were hereditary monarchies, have claimed to be caliphates; not all Muslim states have had caliphates.

The Sunni branch of Islam stipulates that, as a head of state, a caliph should be elected by Muslims or their representatives. Shia Muslims, however, believe a caliph should be an imam chosen by God from the Ahl al-Bayt (the 'Household of the Prophet'). Some caliphates in history have been led by Shia Muslims, like the Fatimid Caliphate (909–1171).
From the late 20th century to the early 21st century, in the wake of the Soviet invasion of Afghanistan, the war on terror and the Arab Spring, various Islamist groups have claimed the caliphate, although these claims have usually been widely rejected among Muslims; the most notable example is the Islamic State led by Abu Bakr al-Baghdadi, which declared a caliphate in its territory in Iraq and Syria in 2014.

Etymology

Before the advent of Islam, Arabian monarchs traditionally used the title malik 'king', or another from the same Semitic root. The term caliph (/ˈkeɪlɪf, ˈkælɪf/) derives from the Arabic word khalīfah (خليفة), meaning 'successor', 'steward', or 'deputy', and has traditionally been considered a shortening of Khalīfah rasūl Allāh 'successor of the messenger of God'. However, studies of pre-Islamic texts suggest that the original meaning of the phrase was 'successor selected by God'.

History

In the immediate aftermath of the death of Muhammad, a gathering of the Ansar (natives of Medina) took place in the saqifa (courtyard) of the Banu Sa'ida clan. The meeting was thought to have convened so the Ansar could decide on a new leader of the Muslim community among themselves, with the intentional exclusion of the Muhajirun (migrants from Mecca), though this later became the subject of debate. Nevertheless, Abu Bakr and Umar, both prominent companions of Muhammad, became concerned about a potential coup upon learning of the meeting and hastened to the gathering. Upon arriving, Abu Bakr warned the assembled men that an attempt to elect a leader outside of Muhammad's own tribe, the Quraysh, would likely result in dissension, as only the Quraysh could command the necessary respect among the community. He then took Umar and another companion, Abu Ubayda ibn al-Jarrah, by the hand and offered them to the Ansar as potential choices.
He was countered with the suggestion that the Quraysh and the Ansar choose a leader each from among themselves, who would then rule jointly. The group grew heated upon hearing this proposal and began to argue among themselves. Umar hastily took Abu Bakr's hand and swore his own allegiance to the latter, an example followed by the gathered men. Abu Bakr was near-universally accepted as head of the Muslim community (under the title of caliph) as a result of Saqifah, though he did face contention as a result of the rushed nature of the event. Several companions, most prominent among them being Ali ibn Abi Talib, initially refused to acknowledge his authority. Ali may have been reasonably expected to assume leadership, being both cousin and son-in-law to Muhammad. The theologian Ibrahim al-Nakha'i stated that Ali also had support among the Ansar for his succession, explained by the genealogical links he shared with them. Whether his candidacy for the succession was raised during Saqifah is unknown, though it is not unlikely. Abu Bakr later sent Umar to confront Ali to gain his allegiance, resulting in an altercation which may have involved violence. However, after six months, the group made peace with Abu Bakr and Ali offered him his fealty. Abu Bakr nominated Umar as his successor on his deathbed. Umar, the second caliph, was killed by a Persian slave called Abu Lu'lu'a Firuz. His successor, Uthman, was elected by a council of electors (majlis). Uthman was killed by members of a disaffected group. Ali then took control but was not universally accepted as caliph by the governors of Egypt and later by some of his own guard. He faced two major rebellions and was assassinated by Abd-al-Rahman ibn Muljam, a Khawarij. Ali's tumultuous rule lasted only five years. This period is known as the Fitna, or the first Islamic civil war. The followers of Ali later became the Shi'a ("shiaat Ali", partisans of Ali.) 
minority sect of Islam and reject the legitimacy of the first three caliphs. The followers of all four Rāshidun Caliphs (Abu Bakr, Umar, Uthman and Ali) became the majority Sunni sect.

Under the Rāshidun, each region (Sultanate, Wilayah, or Emirate) of the caliphate had its own governor (Sultan, Wāli or Emir). Muāwiyah, a relative of Uthman and governor (wali) of Syria, succeeded Ali as caliph. Muāwiyah transformed the caliphate into a hereditary office, thus founding the Umayyad dynasty. In areas which were previously under Sasanian Empire or Byzantine rule, the caliphs lowered taxes, provided greater local autonomy (to their delegated governors), greater religious freedom for Jews and some indigenous Christians, and brought peace to peoples demoralised and disaffected by the casualties and heavy taxation that had resulted from the decades of Byzantine–Persian warfare.

Ali's reign was plagued by turmoil and internal strife. Persian agents, taking advantage of this turmoil, are said to have infiltrated both armies and incited attacks, sowing chaos and mutual hatred between the companions at the Battle of Siffin. The battle lasted several months, resulting in a stalemate. To avoid further bloodshed, Ali agreed to negotiate with Mu'awiyah. This caused a faction of approximately 4,000 people, who would come to be known as the Kharijites, to abandon the fight. After defeating the Kharijites at the Battle of Nahrawan, Ali was later assassinated by the Kharijite Ibn Muljam. Ali's son Hasan was elected as the next caliph, but abdicated in favour of Mu'awiyah a few months later to avoid any conflict within the Muslims. Mu'awiyah became the sixth caliph, establishing the Umayyad dynasty, named after the great-grandfather of Uthman and Mu'awiyah, Umayya ibn Abd Shams. Beginning with the Umayyads, the title of the caliph became hereditary.
Under the Umayyads, the caliphate grew rapidly in territory, incorporating the Caucasus, Transoxiana, Sindh, the Maghreb and most of the Iberian Peninsula (Al-Andalus) into the Muslim world. At its greatest extent, the Umayyad Caliphate covered approximately 5 million square miles (approximately 13 million square kilometres), making it larger than the earlier Roman Empire or contemporary Tang China. Geographically, the empire was divided into several provinces, the borders of which changed a number of times during the Umayyad reign.[citation needed] Each province had a governor appointed by the caliph.

However, for a variety of reasons, including that they were not elected by Shura and suggestions of impious behaviour, the Umayyad dynasty was not universally supported within the Muslim community. Some supported prominent early Muslims like Zubayr ibn al-Awwam; others felt that only members of Muhammad's clan, the Banu Hashim, or his own lineage, the descendants of Ali, should rule. There were multiple rebellions against the Umayyads, as well as splits within the Umayyad ranks (notably, the rivalry between Yaman and Qays). At the command of Yazid, son of Muawiya, an army led by Umar ibn Saad, including a commander by the name of Shimr Ibn Thil-Jawshan, killed Ali's son Hussein and his family at the Battle of Karbala in 680, solidifying the Shia–Sunni split. Eventually, supporters of the Banu Hashim and the supporters of the lineage of Ali united to bring down the Umayyads in 750. However, the Shi‘at ‘Alī, "the Party of Ali", were again disappointed when the Abbasid dynasty took power, as the Abbasids were descended from Muhammad's uncle, ‘Abbas ibn ‘Abd al-Muttalib, and not from Ali.

In 750, the Umayyad dynasty was overthrown by another family of Meccan origin, the Abbasids. Their time represented a scientific, cultural and religious flowering. Islamic art and music also flourished significantly during their reign.
Their major city and capital, Baghdad, began to flourish as a center of knowledge, culture and trade. This period of cultural fruition ended in 1258 with the sack of Baghdad by the Mongols under Hulagu Khan. The Abbasid Caliphate had, however, already lost its effective power outside Iraq by c. 920. By 945, the loss of power became official when the Buyids conquered Baghdad and all of Iraq. The empire fell apart and its parts were ruled for the next century by local dynasties.

In the ninth century, the Abbasids created an army loyal only to their caliphate, composed predominantly of soldiers of Turkic, Cuman, Circassian and Georgian slave origin, known as Mamluks.[better source needed] By 1250 the Mamluks came to power in Egypt. The Mamluk army, though often viewed negatively, both helped and hurt the caliphate. Early on, it provided the government with a stable force to address domestic and foreign problems. However, the creation of this foreign army and al-Mu'tasim's transfer of the capital from Baghdad to Samarra created a division between the caliphate and the peoples it claimed to rule. In addition, the power of the Mamluks steadily grew until Ar-Radi (934–941) was constrained to hand over most of the royal functions to Muhammad ibn Ra'iq.

In 1261, following the Mongol conquest of Baghdad, the Mamluk rulers of Egypt tried to gain legitimacy for their rule by declaring the re-establishment of the Abbasid Caliphate in Cairo.[citation needed] The Abbasid caliphs in Egypt had no political power; they continued to maintain the symbols of authority, but their sway was confined to religious matters.[citation needed] The first Abbasid caliph of Cairo was Al-Mustansir (r. June–November 1261).
The Abbasid Caliphate of Cairo lasted until the time of Al-Mutawakkil III, who ruled as caliph from 1508 to 1516, then he was deposed briefly in 1516 by his predecessor Al-Mustamsik, but was restored again to the caliphate in 1517.[citation needed] The Ottoman sultan Selim I defeated the Mamluk Sultanate and made Egypt part of the Ottoman Empire in 1517. Al-Mutawakkil III was captured together with his family and transported to Constantinople as a prisoner where he had a ceremonial role. He died in 1543, following his return to Cairo. The Abbasid dynasty lost effective power over much of the Muslim realm by the first half of the tenth century. The Umayyad dynasty, which had survived and come to rule over Al-Andalus, reclaimed the title of caliph in 929, lasting until it was overthrown in 1031. During the Umayyad dynasty, the Iberian Peninsula was an integral province of the Umayyad Caliphate ruling from Damascus. The Umayyads lost the position of caliph in Damascus in 750, and Abd al-Rahman I became Emir of Córdoba in 756 after six years in exile. Intent on regaining power, he defeated the existing Islamic rulers of the area who defied Umayyad rule and united various local fiefdoms into an emirate. Rulers of the emirate used the title "emir" or "sultan" until the tenth century, when Abd al-Rahman III was faced with the threat of invasion by the Fatimid Caliphate. To aid his fight against the invading Fatimids, who claimed the caliphate in opposition to the generally recognised Abbasid caliph of Baghdad, Al-Mu'tadid, Abd al-Rahman III claimed the title of caliph himself. This helped Abd al-Rahman III gain prestige with his subjects, and the title was retained after the Fatimids were repulsed. The rule of the caliphate is considered as the heyday of Muslim presence in the Iberian Peninsula, before it fragmented into various taifas in the eleventh century. 
This period was characterised by a flourishing in technology, trade and culture; a number of the buildings of al-Andalus were constructed in this period. The Almohad Caliphate (Berber languages: Imweḥḥden, from الموحدون al-Muwaḥḥidun, "the Monotheists" or "the Unifiers") was a Moroccan Berber Muslim movement founded in the 12th century. The Almohad movement was started by Ibn Tumart among the Masmuda tribes of southern Morocco. The Almohads first established a Berber state in Tinmel in the Atlas Mountains in roughly 1120. The Almohads succeeded in overthrowing the Almoravid dynasty in governing Morocco by 1147, when Abd al-Mu'min (r. 1130–1163) conquered Marrakech and declared himself caliph. They then extended their power over all of the Maghreb by 1159. Al-Andalus followed the fate of Africa, and all Islamic Iberia was under Almohad rule by 1172. The Almohad dominance of Iberia continued until 1212, when Muhammad al-Nasir (1199–1214) was defeated at the Battle of Las Navas de Tolosa in the Sierra Morena by an alliance of the Christian princes of Castile, Aragon, Navarre and Portugal.[citation needed] Nearly all of the Moorish dominions in Iberia were lost soon after, with the great Moorish cities of Córdoba and Seville falling to the Christians in 1236 and 1248, respectively. The Almohads continued to rule in northern Africa until the piecemeal loss of territory through the revolt of tribes and districts enabled the rise of their most effective enemies, the Marinid dynasty, in 1215.[citation needed] The last representative of the line, Idris al-Wathiq, was reduced to the possession of Marrakesh, where he was murdered by a slave in 1269; the Marinids seized Marrakesh, ending the Almohad domination of the Western Maghreb. The Hafsids had been governors of Ifriqiya for the Almohads but declared their independence in 1229. They claimed the succession of the Almohad Empire and their descent from Rashidun caliph Omar. 
The second independent Hafsid ruler Muhammad I al-Mustansir (r. 1249–1277) declared himself caliph in 1253 and under his reign, the caliphate reached a peak. After the fall of Baghdad, Marinid sultan Abu Yusuf Yaqub of Morocco, Sharif of Mecca Abu Numayy and Emir of Granada Muhammad recognized the Hafsids in 1258, 1259 and 1264 respectively. They were also potentially recognised by the Mamluks in 1260. In the mid-14th century, Morocco invaded the caliphate. The Hafsids would return to great power with Abu Faris Abd al-Aziz II (r. 1394–1434) and Abu 'Amr 'Uthman (r. 1435–1488). Between 1535 and 1574, the caliphate was caught between the Ottoman and Spanish fronts and became a protectorate until it was conquered by the Ottoman Caliphate. The Fatimid Caliphate was an Isma'ili Shi'i caliphate, originally based in Tunisia, that extended its rule across the Mediterranean coast of Africa and ultimately made Egypt the centre of its caliphate. At its height, in addition to Egypt, the caliphate included varying areas of the Maghreb, Sicily, the Levant and the Hejaz. The Fatimids established the Tunisian city of Mahdia and made it their capital city, before conquering Egypt and building the city of Cairo there in 969. Thereafter, Cairo became the capital of the caliphate, with Egypt becoming the political, cultural and religious centre of the state. Islam scholar Louis Massignon dubbed the fourth century AH /tenth century CE as the "Ismaili century in the history of Islam". The term Fatimite is sometimes used to refer to the citizens of this caliphate. The ruling elite of the state belonged to the Ismaili branch of Shi'ism. The leaders of the dynasty were Ismaili imams and had a religious significance to Ismaili Muslims. They are also part of the chain of holders of the office of the caliphate, as recognised by some Muslims. 
Therefore, this constitutes a rare period in history in which the descendants of Ali (hence the name Fatimid, referring to Ali's wife Fatima) and the caliphate were united to any degree, excepting the final period of the Rashidun Caliphate under Ali himself. The caliphate was reputed to exercise a degree of religious tolerance towards non-Ismaili sects of Islam as well as towards Jews, Maltese Christians and Copts. The Shia Ubayd Allah al-Mahdi Billah of the Fatimid dynasty, who claimed descent from Muhammad through his daughter, claimed the title of caliph in 909, creating a separate line of caliphs in North Africa. Initially controlling Algeria, Tunisia and Libya, the Fatimid caliphs extended their rule for the next 150 years, taking Egypt and Palestine, before the Abbasid dynasty was able to turn the tide, limiting Fatimid rule to Egypt. The Fatimid dynasty finally ended in 1171, when it was overthrown by Saladin of the Ayyubid dynasty.

The caliphate was claimed by the sultans of the Ottoman Empire beginning with Murad I (reigned 1362 to 1389), while recognising no authority on the part of the Abbasid caliphs of Mamluk-ruled Cairo. Hence the seat of the caliphate moved to the Ottoman capital of Edirne. In 1453, after Mehmed the Conqueror's conquest of Constantinople, the seat of the Ottomans moved to Constantinople, present-day Istanbul. In 1517, the Ottoman sultan Selim I defeated and annexed the Mamluk Sultanate of Cairo into his empire. In claiming the caliphate, the Ottomans defied the hadiths stating that the imam or caliph must be from the Quraysh tribe, a norm observed until their overthrow of the Abbasid caliphate. Through conquering and unifying Muslim lands, Selim I became the defender of the holy cities of Mecca and Medina, which further strengthened the Ottoman claim to the caliphate in the Muslim world. The Ottomans gradually came to be viewed as the de facto leaders and representatives of the Islamic world.
However, the earlier Ottoman caliphs did not officially bear the title of caliph in their documents of state, inscriptions, or coinage. It was only in the late eighteenth century that the claim to the caliphate was discovered by the sultans to have a practical use, since it allowed them to counter Russian claims to protect Ottoman Christians with their own claim to protect Muslims under Russian rule. The outcome of the Russo-Turkish War of 1768–1774 was disastrous for the Ottomans. Large territories, including those with large Muslim populations, such as Crimea, were lost to the Russian Empire. However, the Ottomans under Abdul Hamid I claimed a diplomatic victory by being allowed to remain the religious leaders of Muslims in the now-independent Crimea as part of the peace treaty; in return Russia became the official protector of Christians in Ottoman territory. According to Barthold, the first time the title of "caliph" was used as a political instead of symbolic religious title by the Ottomans was the Treaty of Küçük Kaynarca with the Russian Empire in 1774, when the Empire retained moral authority on territory whose sovereignty was ceded to the Russian Empire. The British would tactfully affirm the Ottoman claim to the caliphate and proceed to have the Ottoman caliph issue orders to the Muslims living in British India to comply with the British government. The British supported and propagated the view that the Ottomans were caliphs of Islam among Muslims in British India, and the Ottoman sultans helped the British by issuing pronouncements to the Muslims of India telling them to support British rule from Sultan Selim III and Sultan Abdulmejid I. Around 1880, Sultan Abdul Hamid II reasserted the title as a way of countering Russian expansion into Muslim lands. His claim was most fervently accepted by the Sunni Muslims of British India. 
By the eve of World War I, the Ottoman state, despite its weakness relative to Europe, represented the largest and most powerful independent Islamic political entity. The sultan also enjoyed some authority beyond the borders of his shrinking empire as caliph of Muslims in Egypt, India and Central Asia. In 1899, U.S. Secretary of State John Hay asked the American ambassador to Ottoman Turkey, Oscar Straus, to approach Sultan Abdul Hamid II to use his position as caliph to order the Tausūg people of the Sultanate of Sulu in the Philippines to submit to American suzerainty and American military rule; the Sultan obliged and wrote a letter, which was sent to Sulu via Mecca. As a result, the "Sulu Mohammedans ... refused to join the insurrectionists and had placed themselves under the control of our army, thereby recognizing American sovereignty."

After the Armistice of Mudros of October 1918, with the military occupation of Constantinople and the Treaty of Versailles (1919), the position of the Ottomans was uncertain. The movement to protect or restore the Ottomans gained force after the Treaty of Sèvres (August 1920), which imposed the partitioning of the Ottoman Empire and gave Greece a powerful position in Anatolia, to the distress of the Turks. They called for help, and the movement was the result; it had collapsed by late 1922. On 3 March 1924, the first president of the Turkish Republic, Mustafa Kemal Atatürk, as part of his reforms, constitutionally abolished the institution of the caliphate. Atatürk offered the caliphate to Ahmed Sharif as-Senussi, on the condition that he reside outside Turkey; Senussi declined the offer and confirmed his support for Abdulmejid. The title was then claimed by Hussein bin Ali, Sharif of Mecca and Hejaz, leader of the Arab Revolt, but his kingdom was defeated and annexed by Ibn Saud in 1925.
Egyptian scholar Ali Abdel Raziq published his 1925 book Islam and the Foundations of Governance. Its argument has been summarised as "Islam does not advocate a specific form of government". He directed his criticism both at those who use religious law as contemporary political prescription and at the history of rulers claiming legitimacy through the caliphate. Raziq wrote that past rulers spread the notion of religious justification for the caliphate "so that they could use religion as a shield protecting their thrones against the attacks of rebels". A summit was convened at Cairo in 1926 to discuss the revival of the caliphate, but most Muslim countries did not participate, and no action was taken to implement the summit's resolutions. Though the title Ameer al-Mumineen was adopted by the King of Morocco and by Mohammed Omar, former head of the Taliban of Afghanistan, neither claimed any legal standing or authority over Muslims outside the borders of their respective countries. Since the end of the Ottoman Empire, occasional demonstrations have been held calling for the re-establishment of the caliphate. Organisations which call for its re-establishment include Hizb ut-Tahrir and the Muslim Brotherhood. The AKP government in Turkey, a former Muslim Brotherhood ally which has adopted Neo-Ottomanist policies throughout its rule, has been accused of intending to restore the caliphate.

The Khilafat Movement was launched by Muslims in British India in 1920 to defend the Ottoman Caliphate at the end of the First World War, and it spread throughout the British colonial territories. It was strongest in British India, where it formed a rallying point for some Indian Muslims as one of many anti-British Indian political movements. Its leaders included Mohammad Ali Jouhar, his brother Shawkat Ali, Maulana Abul Kalam Azad, Dr. Mukhtar Ahmed Ansari, Hakim Ajmal Khan and Barrister Muhammad Jan Abbasi.
For a time it was supported by Mohandas Karamchand Gandhi, who was a member of the Central Khilafat Committee. However, the movement lost its momentum after the abolition of the caliphate in 1924. After further arrests, the flight of its leaders, and a series of offshoots splintering off from the main organisation, the movement eventually died down and disbanded.

After the Umayyad campaigns in India and the conquest of small territories in the western part of the Indian peninsula, early Indian Muslim dynasties were founded by the Ghurid dynasty and the Ghaznavids, most notably the Delhi Sultanate. The Indian sultanates did not extensively strive for a caliphate, since the Ottoman Empire already held it. The emperors of the Mughal Empire, the only Sunni rulers whose territory and wealth could compete with the Ottomans', assumed the title of caliph and, like their Timurid ancestors, called their capital the Dar-ul-khilafat ('abode of the caliphate') from the time of the third emperor, Akbar. A gold coin struck under Akbar called him the "great sultan, the exalted khalifah". Although the Mughals did not acknowledge Ottoman overlordship, they nevertheless used the title of caliph to honour the Ottomans in diplomatic exchanges. Akbar's letter to Suleiman the Magnificent addressed the latter as having attained the rank of the caliphate, while calling Akbar's empire the "Khilafat of realms of Hind and Sind". The fifth emperor, Shah Jahan, also laid claim to the caliphate. Although the Mughal Empire is not recognised as a caliphate, its sixth emperor, Aurangzeb, has often been regarded as one of the few Islamic caliphs to have ruled the Indian peninsula. He received support from Ottoman sultans such as Suleiman II and Mehmed IV.
A memoriser of the Quran, Aurangzeb fully established sharia in South Asia via his Fatawa 'Alamgiri. He re-introduced the jizya and banned Islamically unlawful activities. Aurangzeb's personal expenses, however, were covered by his own income, which came from sewing caps and selling his handwritten copies of the Quran. He has thus been compared to the second caliph, Umar bin Khattab, and the Kurdish conqueror Saladin. The Mughal emperors continued to be addressed as caliphs until the reign of Shah Alam II. Other notable rulers, such as Muhammad bin Bakhtiyar Khalji, Alauddin Khilji, Firuz Shah Tughlaq, Shamsuddin Ilyas Shah, Babur, Sher Shah Suri, Nasir I of Kalat, Tipu Sultan, the Nawabs of Bengal, and Khwaja Salimullah, were popularly given the title khalifa.

Several rulers of West Africa adopted the title of caliph. Mai Ali I Gaji (r. c. 1470 – c. 1503) was the first ruler of the Bornu Empire to assume the title; Askia Mohammad I of the Songhai Empire assumed it around the same time. The Bornu mais (emperors) held the title of caliph until 1846, when the mais were replaced by a new lineage of rulers who adopted the title shehu (sheikh). The Sokoto Caliphate (1804–1903) was an Islamic state in what is now Nigeria, led by Usman dan Fodio. Founded during the Fulani War in the early nineteenth century, it controlled one of the most powerful empires in sub-Saharan Africa prior to European conquest and colonisation, which culminated in the Adamawa Wars and the Battle of Kano. The caliphate remained extant through the colonial period and afterwards, though with reduced power. The current head of the Sokoto Caliphate is Sa'adu Abubakar. The Toucouleur Empire (1848–1893), also known as the Tukular Empire, was one of the Fulani jihad states in sub-Saharan Africa. It was eventually pacified and annexed by the French Republic and incorporated into French West Africa.
Additionally, the Massina Empire (1818–1862) joined these jihad states in West Africa and claimed to be a caliphate. The Indonesian sultan of Yogyakarta historically used Khalifatullah (Caliph of God) as one of his titles. In 2015, Sultan Hamengkubuwono X renounced any claim to the caliphate to facilitate his daughter's inheritance of the throne, as the theological opinion of the time was that a woman may hold the secular office of sultan but not the spiritual office of caliph. France planned to appoint Sultan Yusef of Morocco, who served as the leader of the French protectorate in Morocco, as its "Caliph of the West" to strengthen its control over its colonies in Africa and the Middle East after the 1914 Ottoman jihad proclamation. As part of the Alawi dynasty, he claimed to be a descendant of Fatima. France abandoned the plan in the Sykes–Picot Agreement of 1916, which gave Britain a free hand in creating its own caliphate in Arabia, which also never came to fruition.

The Sharifian Caliphate (Arabic: خلافة شريفية) was an Arab caliphate proclaimed in 1924 by the Sharifian rulers of Hejaz, previously the Vilayet of Hejaz, declaring independence from the Ottoman Caliphate. The idea of a Sharifian caliphate had been circulating since at least the fifteenth century. In the Arab world, it represented the culmination of a long struggle to reclaim the caliphate from Ottoman hands. The first Arab revolts challenging the validity of the Ottoman Caliphate and demanding that an Arab Sayyid be chosen as caliph can be traced back to 1883, when Sheikh Hamat-al-Din seized Sanaa and called for the caliphate as a Sayyid. However, it was not until the end of the Ottoman Caliphate, abolished by the Kemalists, that Hussein bin Ali was proclaimed caliph, in March 1924.
His stance towards the Ottoman Caliphate was ambiguous: although he was hostile to it, he preferred to wait for its official abolition before assuming the title, so as not to split the Ummah by creating a second caliph alongside the Ottoman caliph. He also financially supported the exiled Ottoman dynasty, to save it from ruin. His caliphate was opposed by the British Empire, Zionists, and Wahhabis, but he received support from a large part of the Muslim population of the time, as well as from Mehmed VI. Although he lost the Hejaz and was exiled, then imprisoned by the British on Cyprus, Hussein continued to use the title until his death in 1931.

Non-political caliphates

Though non-political, some Sufi orders and the Ahmadiyya movement define themselves as caliphates. Their leaders are thus commonly referred to as khalifas (caliphs). In Sufism, tariqas (orders) are led by spiritual leaders (khilafah ruhaniyyah), the main khalifas, who nominate local khalifas to organise zaouias. Sufi caliphates are not necessarily hereditary. Khalifas serve the silsilah, carrying out spiritual responsibilities and propagating the teachings of the tariqa. The Ahmadiyya Muslim Community is a self-proclaimed Islamic revivalist movement founded in 1889 by Mirza Ghulam Ahmad of Qadian, India, who claimed to be the promised Messiah and Mahdi awaited by Muslims. He also claimed to be a follower-prophet subordinate to Muhammad, the prophet of Islam. The group is traditionally shunned by the majority of Muslims. After Ahmad's death in 1908, his first successor, Hakeem Noor-ud-Din, became the caliph of the community and assumed the title of Khalifatul Masih (Successor or Caliph of the Messiah). After Hakeem Noor-ud-Din, the title of Ahmadiyya caliph passed to Mirza Mahmud Ahmad, who led the community for over 50 years.
He was followed by Mirza Nasir Ahmad and then Mirza Tahir Ahmad, the third and fourth caliphs respectively. The current caliph is Mirza Masroor Ahmad, who lives in London.

Period of dormancy

Once the subject of intense conflict and rivalry among Muslim rulers, the caliphate has lain dormant and largely unclaimed since the 1920s. For the majority of Muslims the caliph, as leader of the ummah, "is cherished both as memory and ideal" of a time when Muslims "enjoyed scientific and military superiority globally". Muhammad is reported to have prophesied:

Prophethood will remain with you for as long as Allah wills it to remain, then Allah will raise it up whenever he wills to raise it up. Afterwards, there will be a Caliphate that follows the guidance of Prophethood remaining with you for as long as Allah wills it to remain. Then, He will raise it up whenever He wills to raise it up. Afterwards, there will be a reign of violently oppressive rule and it will remain with you for as long as Allah wills it to remain. Then, there will be a reign of tyrannical rule and it will remain for as long as Allah wills it to remain. Then, Allah will raise it up whenever He wills to raise it up. Then, there will be a Caliphate that follows the guidance of Prophethood. — As-Silsilah As-Sahihah, vol. 1, no. 5

A contemporary effort to re-establish the caliphate by supporters of armed jihad, one that predates Abu Bakr al-Baghdadi and the Islamic State and was much less successful, was "the forgotten caliphate" of Muhammad bin ʿIssa bin Musa al Rifaʿi, "known to his followers as Abu ʿIssa". This "microcaliphate" was founded on 3 April 1993 on the Pakistan–Afghanistan border, when Abu Issa's small band of "Afghan Arab" followers swore loyalty (bay'ah) to him. Abu Issa was born in the city of Zarqa, Jordan, and like his followers had come to Afghanistan to wage jihad against the Soviets.
Unlike them, however, he had ancestors in the tribe of Quraysh, a traditional requirement for a caliph. The caliphate was ostensibly an attempt to unite the other jihadis, who were not his followers and were quarrelling among themselves. It was not successful: Abu Issa's efforts to compel them to unite under his command were met "with mockery and then force", and local Afghans also despised him and his followers. Like the later Islamic State, he tried to abolish infidel currency and rejected nationalism. According to scholar Kevin Jackson, Abu ʿIssa issued "'sad and funny' fatwas, as Abu al-Walid puts it, notably sanctioning the use of drugs. A nexus had been forged between [Abu Issa's group] and local drug smugglers." (The fatwa led one jihadist author to dismiss Abu Issa as the "caliph of the Muslims among drug traffickers and takfir".) Abu ʿIssa also prohibited the use of paper currency and ordered his men to burn their passports. The territory under his control "did not extend beyond a few small towns" in Afghanistan's Kunar province, and he lost even this area when the Taliban took it over in the late 1990s. The caliphate then moved to London, where its members "preach[ed] to a mostly skeptical jihadi intelligentsia about the obligation of establishing a caliphate". They succeeded in attracting some jihadis (Yahya al-Bahrumi, Abu Umar al Kuwaiti) who later joined the Islamic State. Abu Issa died in 2014, "after spending most of his final years in prison in London". Abu Umar al Kuwaiti became a judge for the Islamic State but was later executed for extremism after he "took takfir to new levels ... pronouncing death sentences for apostasy on those who were ignorant of scripture – and then pronouncing takfir on those too reluctant to pronounce takfir."

A network of Islamist militants formed the Al-Qaeda in Iraq affiliate during the Iraq War (2003–2011).
The group eventually expanded into Syria and rose to prominence as the Islamic State of Iraq and the Levant (ISIL) during the Syrian Civil War. In the summer of 2014, the group launched its Northern Iraq offensive, seizing the city of Mosul. On 29 June 2014 the group declared itself a caliphate under Abu Bakr al-Baghdadi and renamed itself the "Islamic State". ISIL's claim to be the highest authority of Muslims has been widely rejected: no prominent Muslim scholar has supported its declaration of a caliphate, and even Salafi jihadist preachers accused the group of engaging in political showmanship and bringing disrepute to the notion of an Islamic state. In its efforts to establish a de facto state on Iraqi and Syrian territory, ISIL has been at war with armed forces including the Iraqi Army, the Syrian Army, the Free Syrian Army, the Al-Nusra Front, the Syrian Democratic Forces, and Iraqi Kurdistan's Peshmerga and People's Protection Units (YPG), along with a 60-nation coalition. At its height in 2014, the Islamic State held "about a third of Syria and 40 percent of Iraq". By December 2017 it had lost 95% of that territory, including Mosul, Iraq's second-largest city, and the northern Syrian city of Raqqa, its capital. Its "last holdout", the town of Al-Baghuz Fawqani, fell to the Syrian Democratic Forces on 23 March 2019, and its caliph, al-Baghdadi, was killed in a raid by U.S. forces on 26 October 2019.

The members of the Ahmadiyya community believe that the Ahmadiyya Caliphate is the continuation of the Islamic caliphate, the first being the Rāshidūn (rightly guided) Caliphate of the Righteous Caliphs. This caliphate is believed to have been suspended with Ali, the son-in-law of Muhammad, and re-established with the appearance of Mirza Ghulam Ahmad (1835–1908), the founder of the movement, whom Ahmadis identify as the Promised Messiah and Mahdi.
Ahmadis maintain that, in accordance with Quranic verses (such as 24:55) and a number of ahadith on the issue, a caliphate can only be established by God himself and is a divine blessing given to "those who believe and work righteousness" and uphold the unity of God. Any movement to establish a caliphate centred on human endeavours alone is therefore bound to fail, particularly when the condition of the people diverges from the "precepts of prophethood" and they are as a result disunited; their inability to establish a caliphate is caused fundamentally by the lack of righteousness in them. Although the caliph is elected, it is believed that God himself directs the hearts of believers towards an individual. Thus the caliph is designated neither necessarily by right (i.e. the rightful or competent one in the eyes of the people at that time) nor merely by election, but primarily by God. According to Ahmadiyya thought, a khalifa need not be the head of a state; rather, the Ahmadiyya community emphasises the spiritual and organisational significance of the Khilāfah. It is primarily a religious and spiritual office, with the purpose of upholding, strengthening and spreading Islam and of maintaining the high spiritual and moral standards within the global community established by Muhammad, who was not merely a political leader but primarily a religious leader. If a khalifa does happen to bear governmental authority as a head of state, that authority is incidental and subsidiary to his overall function as khalifa, which applies to believers transnationally and is not limited to one particular state. Ahmadi Muslims believe that God has assured them that this caliphate will endure to the end of time, depending on their righteousness and faith in God. The Khalifa provides unity, security, moral direction and progress for the community.
The Khalifa is required to carry out his duties through consultation, taking into consideration the views of the members of the Shura (consultative body), though he is not bound to always accept their views and recommendations. The Khalifatul Masih has overall authority over all religious and organisational matters and is bound to decide and act in accordance with the Qur'an and sunnah.

A number of Islamist political parties and mujahideen have called for the restoration of the caliphate by uniting Muslim nations, either through political action (e.g. Hizb ut-Tahrir) or through force (e.g. al-Qaeda). Various Islamist movements have gained momentum in recent years with the ultimate aim of establishing a caliphate, and in 2014 ISIL/ISIS made a claim to re-establishing it. Those advocating the re-establishment of a caliphate differ in their methodology and approach: some are locally oriented, mainstream political parties with no apparent transnational objectives. Abul A'la Maududi believed the caliph was not just an individual ruler who had to be restored, but man's representation of God's authority on Earth:

Khilafa means representative. Man, according to Islam, is the representative of "people", His (God's) vicegerent; that is to say, by virtue of the powers delegated to him, and within the limits prescribed by the Qur'an and the teaching of the Prophet, the caliph is required to exercise Divine authority.

The Muslim Brotherhood advocates pan-Islamic unity and the implementation of Islamic law; its founder Hassan al-Banna wrote about the restoration of the caliphate. One transnational group whose ideology is based specifically on restoring the caliphate as a pan-Islamic state is Hizb ut-Tahrir (literally, 'Party of Liberation'). It is particularly strong in Central Asia and Europe and is growing in strength in the Arab world.
It is based on the claim that Muslims can prove that God exists and that the Qur'an is the word of God. Hizb ut-Tahrir's stated strategy is a non-violent political and intellectual struggle. In Southeast Asia, groups such as Jemaah Islamiyah aimed to establish a caliphate across Indonesia, Malaysia, Brunei and parts of Thailand, the Philippines and Cambodia. Al-Qaeda has as one of its clearly stated goals the re-establishment of a caliphate. Its former leader, Osama bin Laden, called for Muslims to "establish the righteous caliphate of our umma". Al-Qaeda chiefs released a statement in 2005 according to which, in what they call "phase five", there will be "an Islamic state, or caliphate". Al-Qaeda named its Internet newscast from Iraq "The Voice of the Caliphate". According to author and Egyptian native Lawrence Wright, Ayman al-Zawahiri, bin Laden's mentor and al-Qaeda's second-in-command until 2011, once "sought to restore the caliphate ... which had formally ended in 1924 following the dissolution of the Ottoman Empire but which had not exercised real power since the thirteenth century". Zawahiri believes that once the caliphate is re-established, Egypt would become a rallying point for the rest of the Islamic world, leading the jihad against the West. "Then history would make a new turn, God willing", Zawahiri later wrote, "in the opposite direction against the empire of the United States and the world's Jewish government". Scholar Olivier Roy writes that "early on, Islamists replace the concept of the caliphate ... with that of the emir." There were a number of reasons for this, including "that according to the classical authors, a caliph must be a member of the tribe of the Prophet (the Quraysh) ... moreover, caliphs ruled societies that the Islamists do not consider to have been Islamic (the Ottoman Empire)."
This is not the view of the majority of Islamist groups, however, as both the Muslim Brotherhood and Hizb ut-Tahrir view the Ottoman state as a caliphate.

Religious basis

The Quran uses the term khalifa twice. First, in Surah Al-Baqara 2:30, it refers to God creating humanity as his khalifa on Earth. Second, in Surah Sad 38:26, it addresses King David as God's khalifa and reminds him of his obligation to rule with justice. In addition, the following excerpt from the Quran, known as the 'Istikhlaf Verse', is used by some to argue for a Quranic basis for a caliphate:

Allah has promised those of you who believe and do good that He will certainly make them successors in the land, as He did with those before them; and will surely establish for them their faith which He has chosen for them; and will indeed change their fear into security—˹provided that˺ they worship Me, associating nothing with Me. But whoever disbelieves after this ˹promise˺, it is they who will be the rebellious. — Surah An-Nur 24:55

Several schools of jurisprudence and thought within Sunni Islam argue that to govern a state by Sharia is, by definition, to rule via the caliphate, and use the following verses to sustain their claim:

And judge between them ˹O Prophet˺ by what Allah has revealed, and do not follow their desires. And beware, so they do not lure you away from some of what Allah has revealed to you. If they turn away ˹from Allah's judgment˺, then know that it is Allah's Will to repay them for some of their sins, and that many people are indeed rebellious. — Surah Al-Ma'idah 5:49

O believers! Obey Allah and obey the Messenger and those in authority among you. Should you disagree on anything, then refer it to Allah and His Messenger, if you ˹truly˺ believe in Allah and the Last Day. This is the best and fairest resolution.
— Surah An-Nisa 4:59

The following hadith from Musnad Ahmad ibn Hanbal can be understood to prophesy two eras of the caliphate (both on the lines or precepts of prophethood). Hadhrat Huzaifa narrated that the Messenger of Allah said:

Prophethood will remain among you as long as Allah wills. Then Caliphate (Khilafah) on the lines of Prophethood shall commence, and remain as long as Allah wills. Then corrupt/erosive monarchy would take place, and it will remain as long as Allah wills. After that, despotic kingship would emerge, and it will remain as long as Allah wills. Then, the Caliphate (Khilafah) shall come once again based on the precept of Prophethood.

In the above, the first era of the caliphate is commonly accepted by Muslims to be that of the Rashidun Caliphate. Nafi reported:

It has been reported on the authority of Nafi that 'Abdullah b. Umar paid a visit to Abdullah b. Muti' in the days when atrocities were perpetrated on the people of Medina at Harra, in the time of Yazid b. Mu'awiya. Ibn Muti' said: Place a pillow for Abu 'Abd al-Rahman (family name of 'Abdullah b. 'Umar). But the latter said: I have not come to sit with you. I have come to you to tell you a tradition I heard from the Messenger of Allah. I heard him say: One who withdraws his hand from obedience (to the Amir) will find no argument (in his defence) when he stands before Allah on the Day of Judgment, and one who dies without having bound himself by an oath of allegiance (to an Amir) will die the death of one belonging to the days of Jahiliyyah. — Sahih Muslim 1851a

Hisham ibn Urwah reported on the authority of Abu Saleh, on the authority of Abu Hurairah, that Muhammad said:

Leaders will take charge of you after me, where the pious (one) will lead you with his piety and the impious (one) with his impiety, so only listen to them and obey them in everything which conforms with the truth (Islam).
If they act rightly it is for your credit, and if they act wrongly it is counted for you and against them.

It has been narrated on the authority of Abu Huraira that the Prophet of Allah said:

An Imam is a shield for them. They fight behind him and they are protected by him (from tyrants and aggressors). If he enjoins fear of God, the Exalted and Glorious, and dispenses justice, there will be a (great) reward for him; and if he enjoins otherwise, it redounds on him. — Sahih Muslim 1841

Narrated Abu Huraira: The Prophet said, "The Israelis used to be ruled and guided by prophets: Whenever a prophet died, another would take over his place. There will be no prophet after me, but there will be Caliphs who will increase in number." The people asked, "O Allah's Messenger! What do you order us (to do)?" He said, "Obey the one who will be given the pledge of allegiance first. Fulfil their (i.e. the Caliphs') rights, for Allah will ask them about (any shortcoming) in ruling those Allah has put under their guardianship." — Sahih al-Bukhari 3455

Many Islamic texts, including several ahadith, state that the Mahdi will be elected caliph and rule over a caliphate. A number of Islamic figures have titled themselves both "caliph" and "al-Mahdi", including the first Abbasid caliph As-Saffah. Al-Habbab Ibn ul-Munthir said, when the Sahaba met in the wake of the death of Muhammad at the Saqifah hall of Bani Sa'ida:

Let there be one Amir from us and one Amir from you (meaning one from the Ansar and one from the Mohajireen).

Upon this Abu Bakr replied:

It is forbidden for Muslims to have two Amirs (rulers)...

Then he got up and addressed the Muslims. It has additionally been reported that Abu Bakr went on to say on the day of Al-Saqifa:

It is forbidden for Muslims to have two Amirs, for this would cause differences in their affairs and concepts; their unity would be divided and disputes would break out among them.
The Sunnah would then be abandoned, bida'a (innovations) would spread and Fitna would grow, and that is in no one's interest.

The Sahaba agreed to this and selected Abu Bakr as their first Khaleef. Habbab ibn Mundhir, who had suggested the idea of two Amirs, corrected himself and was the first to give Abu Bakr the Bay'ah. This indicates an Ijma as-Sahaba of all of the Sahaba. Ali ibn Abi Talib, who was attending the body of Muhammad at the time, also consented to this. Imam Ali, whom the Shia revere, said:

People must have an Amir ... where the believer works under his Imara (rule) and under which the unbeliever would also benefit, until his rule ended by the end of his life (ajal), the booty (fay'i) would be gathered, the enemy would be fought, the routes would be made safe, the strong one will return what he took from the weak, till the tyrant would be contained and not bother anyone.

Scholars such as Al-Mawardi, Ibn Hazm, Ahmad al-Qalqashandi, and Al-Sha'rani stated that the global Muslim community can have only one leader at any given time. Al-Nawawi and Abd al-Jabbar ibn Ahmad declared it impermissible to give oaths of loyalty to more than one leader. Al-Joziri said:

The Imams (scholars of the four schools of thought) - may Allah have mercy on them - agree that the Caliphate is an obligation, and that the Muslims must appoint a leader who would implement the injunctions of the religion, and give the oppressed justice against the oppressors. It is forbidden for Muslims to have two leaders in the world, whether in agreement or discord.

Shia scholars have expressed similar opinions. However, the Shia school of thought holds that the leader must be appointed not by the Islamic ummah but by God.
Al-Qurtubi said that the caliph is the "pillar upon which other pillars rest", and said of the Quranic verse "Indeed, man is made upon this earth a Caliph":

This Ayah is a source in the selection of an Imaam, and a Khaleef; he is listened to and he is obeyed, for the word is united through him, and the Ahkam (laws) of the Caliph are implemented through him, and there is no difference regarding the obligation of that between the Ummah ...

An-Nawawi said:

(The scholars) consented that it is an obligation upon the Muslims to select a Khalif.

Al-Ghazali, writing of the potential consequences of losing the caliphate, said:

The judges will be suspended, the Wilayaat (provinces) will be nullified, ... the decrees of those in authority will not be executed and all the people will be on the verge of Haraam.

Ibn Taymiyyah said:

It is obligatory to know that the office in charge of commanding over the people (i.e. the post of the Khaleefah) is one of the greatest obligations of the Deen. In fact, there is no establishment of the Deen except by it ... this is the opinion of the salaf, such as Al-Fuḍayl ibn 'Iyāḍ, Ahmad ibn Hanbal and others.

Government

In his book The Early Islamic Conquests (1981), Fred Donner argues that the standard Arabian practice during the early caliphates was for the prominent men of a kinship group, or tribe, to gather after a leader's death and elect a leader from among themselves, although there was no specified procedure for this shura, or consultative assembly. Candidates were usually from the same lineage as the deceased leader, but they were not necessarily his sons. Capable men who would lead well were preferred over an ineffectual direct heir, as there was no basis in the majority Sunni view for choosing the head of state or governor based on lineage alone. Since the Umayyads, however, all caliphates have been dynastic. Traditionally, the Sunni Muslim madhhabs all agreed that a caliph must be a descendant of the Quraysh.
Al-Baqillani has said that the leader of the Muslims simply should be from the majority. Following the death of Muhammad, a meeting took place at Saqifah. At that meeting, Abu Bakr was elected caliph by the Muslim community. Sunni Muslims developed the belief that the caliph is a temporal political ruler, appointed to rule within the bounds of Islamic law (Sharia). The job of adjudicating orthodoxy and Islamic law was left to mujtahids, legal specialists collectively called the Ulama. A number of Muslims call the first four caliphs the Rashidun, meaning the 'Rightly Guided', because they are believed to have followed the Qur'an and the sunnah of Muhammad. With the exception of Zaidis, Shia Muslims believe in the Imamate, a principle by which rulers are imams who are divinely chosen, infallible and sinless and must come from the Ahl al-Bayt regardless of majority opinion, shura or election. They claim that before his death, Muhammad had given multiple indications, in the hadith of the pond of Khumm in particular, that he considered Ali, his cousin and son-in-law, as his successor. For the Twelvers, Ali and his eleven descendants, the Twelve Imams, are believed to have been considered, even before their birth, as the only valid Islamic rulers appointed and decreed by God. Shia Muslims believe that all the Muslim caliphs following Muhammad's death were illegitimate due to their unjust rule and that Muslims have no obligation to follow them. The only guidance left behind, as ordained in the hadith of the two weighty things, was the Islamic holy book, the Quran, and Muhammad's family and offspring, who are believed to be infallible and therefore able to lead society and the Muslim community with complete justice and equity. The Prophet's own grandson, and third Shia imam, Hussain ibn Ali led an uprising against injustice and the oppressive rule of the Muslim caliph of the time at the Battle of Karbala.
Shia Muslims emphasise that the values of social justice, and of speaking out against oppression and tyranny, are not merely moral values but values essential to a person's religiosity. After these Twelve Imams, the potential caliphs, had passed, and in the absence of the possibility of a government headed by their imams, some Twelvers believe it became necessary to develop a system of Shi'i Islamic government based on the Guardianship of the Islamic Jurist, in which an Islamic jurist, or faqih, rules Muslims. However, this idea, developed by the marja' Ayatollah Ruhollah Khomeini and established in Iran, is not universally accepted among the Shia. Ismailis believe in the Imamate principle mentioned above, but in their view the imams need not be secular rulers as well. The Majlis al-Shura (literally 'consultative assembly') was a representation of the idea of consultative governance, an idea premised on verses of the Qur'an. The majlis is also the means to elect a new caliph. Al-Mawardi has written that members of the majlis should satisfy three conditions: they must be just, have enough knowledge to distinguish a good caliph from a bad one, and have sufficient wisdom and judgement to select the best caliph. Al-Mawardi also said that in emergencies when there is no caliphate and no majlis, the people themselves should create a majlis and select a list of candidates for caliph; then the majlis should select a caliph from the list of candidates. Some Islamist interpretations of the role of the Majlis al-Shura are the following: In an analysis of the shura chapter of the Qur'an, Islamist author Sayyid Qutb argues that Islam only requires the ruler to consult with some of the representatives of the ruled and govern within the context of the Sharia.
Taqiuddin al-Nabhani, the founder of a transnational political movement devoted to the revival of the caliphate, writes that although the Shura is an important part of "the ruling structure" of the Islamic caliphate, "(it is) not one of its pillars", meaning that its neglect would not make a caliph's rule un-Islamic such as to justify a rebellion. However, the Muslim Brotherhood, the largest Islamic movement in Egypt, has toned down these Islamist views by accepting in principle that, in the modern age, the Majlis al-Shura takes the form of democracy. Al-Mawardi said that if the rulers meet their Islamic responsibilities to the public, the people must obey their laws, but a caliph or ruler who becomes either unjust or severely ineffective must be impeached via the Majlis al-Shura. Al-Juwayni argued that Islam is the goal of the ummah, so any ruler who deviates from this goal must be impeached. Al-Ghazali believed that oppression by a caliph is sufficient grounds for impeachment. Rather than relying on impeachment alone, Ibn Hajar al-Asqalani stated that the people have an obligation to rebel if the caliph begins to act with no regard for Islamic law. Ibn Hajar al-Asqalani said that to ignore such a situation is haraam and that those who cannot revolt from inside the caliphate should launch a struggle from outside. Al-Asqalani used two ayahs from the Qur'an to justify this: And they (the sinners on qiyama) will say, "Our Lord! We obeyed our leaders and elite, but they led us astray from the ˹Right˺ Way. Our Lord! Give them double ˹our˺ punishment, and condemn them tremendously." — Surah Al-Ahzab 33:67–68 Islamic lawyers commented that when rulers refuse to step down after being impeached through the Majlis, becoming dictators through the support of a corrupt army, the majority, if in agreement, has the option to launch a revolution. Some noted that this option is to be exercised only after factoring in the potential cost of life.
The following hadith establishes the principle of rule of law in relation to nepotism and accountability. Narrated ‘Aisha: The people of Quraish worried about the lady from Bani Makhzum who had committed theft. They asked, "Who will intercede for her with Allah's Apostle?" Some said, "No one dare to do so except Usama bin Zaid, the beloved one to Allah's Apostle." When Usama spoke about that to Allah's Apostle, Allah's Apostle said: "Do you try to intercede for somebody in a case connected with Allah’s Prescribed Punishments?" Then he got up and delivered a sermon saying, "What destroyed the nations preceding you was that if a noble amongst them stole, they would forgive him, and if a poor person amongst them stole, they would inflict Allah's Legal punishment on him. By Allah, if Fatima, the daughter of Muhammad (my daughter) stole, I would cut off her hand." Various Islamic lawyers, however, place multiple conditions and stipulations on the execution of such a law, making it difficult to implement. For example, the poor cannot be penalised for stealing out of poverty, and during a time of drought in the Rashidun Caliphate, capital punishment was suspended until the effects of the drought passed. Islamic jurists later formulated the concept that all classes were subject to the law of the land, and that no person is above the law; officials and private citizens alike have a duty to obey the same law. Furthermore, a Qadi (Islamic judge) was not allowed to discriminate on the grounds of religion, race, colour, kinship or prejudice. In a number of cases, caliphs had to appear before judges as they prepared to render their verdict.
According to Noah Feldman, a law professor at Harvard University, the system of legal scholars and jurists responsible for the rule of law was replaced by the codification of Sharia by the Ottoman Empire in the early nineteenth century.

During the Muslim Agricultural Revolution, the caliphate understood that real incentives were needed to increase productivity and wealth and thus enhance tax revenues. A social transformation took place as a result of changing land ownership, giving individuals of any gender, ethnic or religious background the right to buy, sell, mortgage and inherit land for farming or any other purpose. Signatures were required on contracts for every major financial transaction concerning agriculture, industry, commerce and employment. Copies of the contract were usually kept by both parties involved. Early forms of proto-capitalism and free markets were present in the caliphate, since an early market economy and an early form of merchant capitalism developed between the 8th and 12th centuries, which some refer to as "Islamic capitalism". A vigorous monetary economy developed based on the circulation of a stable high-value currency (the dinar) and the integration of previously independent monetary areas. Business techniques and forms of business organisation employed during this time included early contracts, bills of exchange, long-distance international trade, early forms of partnership (mufawada) such as limited partnerships (mudaraba) and early forms of credit, debt, profit, loss, capital (al-mal), capital accumulation (nama al-mal), circulating capital, capital expenditure, revenue, cheques, promissory notes, trusts (waqf), startup companies, savings accounts, transactional accounts, pawning, loaning, exchange rates, bankers, money changers, ledgers, deposits, assignments, the double-entry bookkeeping system, and lawsuits. Organisational enterprises similar to corporations independent from the state also existed in the medieval Islamic world.
A number of these concepts were adopted and further advanced in medieval Europe from the thirteenth century onwards. Early Islamic law included collection of zakat (charity), one of the Five Pillars of Islam, since the time of the first Islamic State, established by Muhammad at Medina. The taxes (including zakat and jizya) collected in the treasury (Bayt al-mal) of an Islamic government were used to provide income for the needy, including the poor, elderly, orphans, widows and the disabled. During the caliphate of Abu Bakr, a number of the Arab tribes, who had accepted Islam at the hand of the Prophet Muhammad, rebelled and refused to continue to pay the Zakat, leading to the Ridda Wars. Caliph Umar added to the duties of the state an allowance, paid on behalf of every man, woman and child, starting at birth, creating the world's first state-run social welfare program. Maya Shatzmiller states that the demographic behaviour of medieval Islamic society varied in some significant respects from that of other agricultural societies. Nomadic groups within places like the deserts of Egypt and Morocco maintained high birth rates compared to rural and urban populations, though periods of extremely high nomadic birth rates seem to have occurred in occasional "surges" rather than on a continuous basis. Individuals living in large cities had much lower birth rates, possibly due to the use of birth control methods and political or economic instability. This led to population declines in some regions. While several studies have shown that Islamic scholars enjoyed a life expectancy of 59–75 years between the eleventh and thirteenth centuries, the overall life expectancy of men in the same societies was lower.
Factoring in infant mortality, Lawrence Conrad estimates the average lifespan in the early Islamic caliphate to be above 35 years for the general population, compared to around 40 years for the population of Classical Greece and 31 years for the population of thirteenth-century England. The early Islamic Empire also had the highest literacy rates among pre-modern societies, alongside the city of classical Athens in the fourth century BC, and later, China after the introduction of printing from the tenth century. One factor for the relatively high literacy rates in the early Islamic Empire was its parent-driven educational marketplace, as the state did not systematically subsidise educational services until the introduction of state funding under Nizam al-Mulk in the eleventh century. Another factor was the diffusion of paper from China, which led to an efflorescence of books and written culture in Islamic society; thus papermaking technology transformed Islamic society from an oral to scribal culture, comparable to the later shifts from scribal to typographic culture, and from typographic culture to the Internet. Other factors include the widespread use of paper books in Islamic society (more so than any other previously existing society), the study and memorisation of the Quran, flourishing commercial activity and the emergence of the Maktab and Madrasah educational institutions.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Mars#cite_note-138] | [TOKENS: 11899] |
Mars

Mars is the fourth planet from the Sun. It is also known as the "Red Planet" for its orange-red appearance. Mars is a desert-like rocky planet with a tenuous atmosphere that is primarily carbon dioxide (CO2). At the average surface level the atmospheric pressure is a few thousandths of Earth's, atmospheric temperature ranges from −153 to 20 °C (−243 to 68 °F), and cosmic radiation is high. Mars retains some water, in the ground as well as thinly in the atmosphere, forming cirrus clouds, fog, frost, larger polar regions of permafrost and ice caps (with seasonal CO2 snow), but no bodies of liquid surface water. Its surface gravity is roughly a third of Earth's, or double that of the Moon. Its diameter, 6,779 km (4,212 mi), is about half the Earth's, or twice the Moon's, and its surface area is about the size of all the dry land of Earth. Fine dust is prevalent across the surface and the atmosphere, picked up and spread even by the weak wind of the tenuous atmosphere, aided by the low Martian gravity. The terrain of Mars roughly follows a north–south divide, the Martian dichotomy: the northern hemisphere mainly consists of relatively flat, low-lying plains, and the southern hemisphere of cratered highlands. Geologically, the planet is fairly active, with marsquakes trembling underneath the ground, but it also hosts many enormous extinct volcanoes (the tallest is Olympus Mons, 21.9 km or 13.6 mi tall), as well as one of the largest canyons in the Solar System (Valles Marineris, 4,000 km or 2,500 mi long). Mars has two natural satellites that are small and irregular in shape: Phobos and Deimos. With a significant axial tilt of 25 degrees, Mars experiences seasons, like Earth (which has an axial tilt of 23.5 degrees). A Martian solar year is equal to 1.88 Earth years (687 Earth days); a Martian solar day (sol) is equal to 24.6 hours. Mars formed along with the other planets approximately 4.5 billion years ago.
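The year and day figures above imply how many sols a Martian year contains. A minimal arithmetic sketch; the 24.66-hour sol length is an assumed refinement of the rounded 24.6 hours quoted in the text:

```python
# Sols per Martian year, derived from the figures quoted above.
MARTIAN_YEAR_EARTH_DAYS = 687   # Earth days per Martian year
SOL_HOURS = 24.66               # hours per Martian solar day (assumed value)

sols_per_year = MARTIAN_YEAR_EARTH_DAYS * 24 / SOL_HOURS
print(round(sols_per_year, 1))  # ≈ 668.6 sols per Martian year
```

Because a sol is slightly longer than an Earth day, a Martian year holds fewer sols (about 668.6) than Earth days (687).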
During the Martian Noachian period (4.5 to 3.5 billion years ago), its surface was marked by meteor impacts, valley formation, erosion, the possible presence of water oceans and the loss of its magnetosphere. The Hesperian period (beginning 3.5 billion years ago and ending 3.3–2.9 billion years ago) was dominated by widespread volcanic activity and flooding that carved immense outflow channels. The Amazonian period, which continues to the present, remains the dominant influence on the planet's geological processes. Because of Mars's geological history, the possibility of past or present life on Mars remains an area of active scientific investigation, with some possible traces needing further examination. Being visible with the naked eye in Earth's sky as a red wandering star, Mars has been observed throughout history, acquiring diverse associations in different cultures. In 1963 the first flight to Mars took place with Mars 1, but communication was lost en route. The first successful flyby exploration of Mars was conducted in 1965 with Mariner 4. In 1971 Mariner 9 entered orbit around Mars, becoming the first spacecraft to orbit any body other than the Moon, Sun or Earth; following in the same year were the first uncontrolled impact (Mars 2) and the first successful landing (Mars 3) on Mars. Probes have been active on Mars continuously since 1997. At times, more than ten probes have simultaneously operated in orbit or on the surface, more than at any other planet beyond Earth. Mars is an often proposed target for future crewed exploration missions, though no such mission is currently planned.

Natural history

Scientists have theorized that during the Solar System's formation, Mars was created as the result of a random process of runaway accretion of material from the protoplanetary disk that orbited the Sun. Mars has many distinctive chemical features caused by its position in the Solar System.
Elements with comparatively low boiling points, such as chlorine, phosphorus, and sulfur, are much more common on Mars than on Earth; these elements were probably pushed outward by the young Sun's energetic solar wind. After the formation of the planets, the inner Solar System may have been subjected to the so-called Late Heavy Bombardment. About 60% of the surface of Mars shows a record of impacts from that era, whereas much of the remaining surface is probably underlain by immense impact basins caused by those events. However, more recent modeling has disputed the existence of the Late Heavy Bombardment. There is evidence of an enormous impact basin in the Northern Hemisphere of Mars, spanning 10,600 by 8,500 kilometres (6,600 by 5,300 mi), or roughly four times the size of the Moon's South Pole–Aitken basin, which would be the largest impact basin yet discovered if confirmed. It has been hypothesized that the basin was formed when Mars was struck by a Pluto-sized body about four billion years ago. The event, thought to be the cause of the Martian hemispheric dichotomy, created the smooth Borealis basin that covers 40% of the planet. A 2023 study shows evidence, based on the orbital inclination of Deimos (a small moon of Mars), that Mars may once have had a ring system 3.5 to 4 billion years ago. This ring system may have been formed from a moon, 20 times more massive than Phobos, orbiting Mars billions of years ago; Phobos would be a remnant of that ring. Epochs: The geological history of Mars can be split into many periods, but three primary periods are generally recognised: the Noachian, the Hesperian, and the Amazonian. Geological activity is still taking place on Mars. The Athabasca Valles is home to sheet-like lava flows created about 200 million years ago. Water flows in the grabens called the Cerberus Fossae occurred less than 20 million years ago, indicating equally recent volcanic intrusions. The Mars Reconnaissance Orbiter has captured images of avalanches.
Physical characteristics

Mars is approximately half the diameter of Earth or twice that of the Moon, with a surface area only slightly less than the total area of Earth's dry land. Mars is less dense than Earth, having about 15% of Earth's volume and 11% of Earth's mass, resulting in about 38% of Earth's surface gravity. Mars is the only presently known example of a desert planet, a rocky planet with a surface akin to that of Earth's deserts. The red-orange appearance of the Martian surface is caused by iron(III) oxide (nanophase Fe2O3) and the iron(III) oxide-hydroxide mineral goethite. It can look like butterscotch; other common surface colors include golden, brown, tan, and greenish, depending on the minerals present. Like Earth, Mars is differentiated into a dense metallic core overlaid by less dense rocky layers. The outermost layer is the crust, which is on average about 42–56 kilometres (26–35 mi) thick, with a minimum thickness of 6 kilometres (3.7 mi) in Isidis Planitia, and a maximum thickness of 117 kilometres (73 mi) in the southern Tharsis plateau. For comparison, Earth's crust averages 27.3 ± 4.8 km in thickness. The most abundant elements in the Martian crust are silicon, oxygen, iron, magnesium, aluminum, calcium, and potassium. Mars is confirmed to be seismically active; in 2019, it was reported that InSight had detected and recorded over 450 marsquakes and related events. Beneath the crust is a silicate mantle responsible for many of the tectonic and volcanic features on the planet's surface. The upper Martian mantle is a low-velocity zone, where the velocity of seismic waves is lower than in surrounding depth intervals. The mantle appears to be rigid down to a depth of about 250 km, giving Mars a very thick lithosphere compared to Earth. Below this the mantle gradually becomes more ductile, and the seismic wave velocity starts to grow again.
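The ~38% surface-gravity figure above follows directly from g ∝ M/R². A minimal sketch of the check; the 0.107 mass ratio and 0.532 radius ratio are assumed refinements of the text's "about 11% of Earth's mass" and "approximately half the diameter":

```python
# Surface gravity ratio from g = GM/R^2, using Mars/Earth ratios.
MASS_RATIO = 0.107    # Mars mass / Earth mass (text: about 11%; assumed value)
RADIUS_RATIO = 0.532  # Mars radius / Earth radius (text: about half; assumed value)

gravity_ratio = MASS_RATIO / RADIUS_RATIO**2
print(f"{gravity_ratio:.0%}")  # ≈ 38% of Earth's surface gravity
```

The smaller radius partly offsets the much smaller mass, which is why Mars's surface gravity is 38% of Earth's rather than 11%.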
The Martian mantle does not appear to have a thermally insulating layer analogous to Earth's lower mantle; instead, below 1050 km in depth, it becomes mineralogically similar to Earth's transition zone. At the bottom of the mantle lies a basal liquid silicate layer approximately 150–180 km thick. The Martian mantle appears to be highly heterogeneous, with dense fragments up to 4 km across, likely injected deep into the planet by colossal impacts ~4.5 billion years ago; high-frequency waves from eight marsquakes slowed as they passed these localized regions, and modeling indicates the heterogeneities are compositionally distinct debris preserved because Mars lacks plate tectonics and has a sluggishly convecting interior that prevents complete homogenization. Mars's iron and nickel core is at least partially molten, and may have a solid inner core. It is around half of Mars's radius, approximately 1650–1675 km, and is enriched in light elements such as sulfur, oxygen, carbon, and hydrogen. The temperature of the core is estimated to be 2000–2400 K, compared to 5400–6230 K for Earth's solid inner core. In 2025, based on data from the InSight lander, a group of researchers reported the detection of a solid inner core 613 kilometres (381 mi) ± 67 kilometres (42 mi) in radius. Mars is a terrestrial planet with a surface that consists of minerals containing silicon and oxygen, metals, and other elements that typically make up rock. The Martian surface is primarily composed of tholeiitic basalt, although parts are more silica-rich than typical basalt and may be similar to andesitic rocks on Earth, or silica glass. Regions of low albedo suggest concentrations of plagioclase feldspar, with northern low albedo regions displaying higher than normal concentrations of sheet silicates and high-silicon glass. Parts of the southern highlands include detectable amounts of high-calcium pyroxenes. Localized concentrations of hematite and olivine have been found.
Much of the surface is deeply covered by finely grained iron(III) oxide dust. The Phoenix lander returned data showing Martian soil to be slightly alkaline and containing elements such as magnesium, sodium, potassium and chlorine. These nutrients are found in soils on Earth, and are necessary for plant growth. Experiments performed by the lander showed that the Martian soil has a basic pH of 7.7, and contains 0.6% perchlorate by weight, a concentration that is toxic to humans. Streaks are common across Mars, and new ones appear frequently on steep slopes of craters, troughs, and valleys. The streaks are dark at first and get lighter with age. The streaks can start in a tiny area, then spread out for hundreds of metres. They have been seen to follow the edges of boulders and other obstacles in their path. The commonly accepted hypotheses include that they are dark underlying layers of soil revealed after avalanches of bright dust or dust devils. Several other explanations have been put forward, including those that involve water or even the growth of organisms. Environmental radiation levels on the surface average 0.64 millisieverts per day, significantly less than the radiation of 1.84 millisieverts per day, or 22 millirads per day, experienced during the flight to and from Mars. For comparison, the radiation levels in low Earth orbit, where Earth's space stations orbit, are around 0.5 millisieverts per day. Hellas Planitia has the lowest surface radiation at about 0.342 millisieverts per day, featuring lava tubes southwest of Hadriacus Mons with levels potentially as low as 0.064 millisieverts per day, comparable to radiation levels during flights on Earth. Although Mars has no evidence of a structured global magnetic field, observations show that parts of the planet's crust have been magnetized, suggesting that alternating polarity reversals of its dipole field have occurred in the past.
This paleomagnetism of magnetically susceptible minerals is similar to the alternating bands found on Earth's ocean floors. One hypothesis, published in 1999 and re-examined in October 2005 (with the help of the Mars Global Surveyor), is that these bands suggest plate tectonic activity on Mars four billion years ago, before the planetary dynamo ceased to function and the planet's magnetic field faded.

Geography and features

Although better remembered for mapping the Moon, Johann Heinrich von Mädler and Wilhelm Beer were the first areographers. They began by establishing that most of Mars's surface features were permanent and by more precisely determining the planet's rotation period. In 1840, Mädler combined ten years of observations and drew the first map of Mars. Features on Mars are named from a variety of sources. Albedo features are named for classical mythology. Craters larger than roughly 50 km are named for deceased scientists and writers and others who have contributed to the study of Mars. Smaller craters are named for towns and villages of the world with populations of less than 100,000. Large valleys are named for the word "Mars" or "star" in various languages; smaller valleys are named for rivers. Large albedo features retain many of the older names but are often updated to reflect new knowledge of the nature of the features. For example, Nix Olympica (the snows of Olympus) has become Olympus Mons (Mount Olympus). The surface of Mars as seen from Earth is divided into two kinds of areas, with differing albedo. The paler plains covered with dust and sand rich in reddish iron oxides were once thought of as Martian "continents" and given names like Arabia Terra (land of Arabia) or Amazonis Planitia (Amazonian plain). The dark features were thought to be seas, hence their names Mare Erythraeum, Mare Sirenum and Aurorae Sinus. The largest dark feature seen from Earth is Syrtis Major Planum. The permanent northern polar ice cap is named Planum Boreum.
The southern cap is called Planum Australe. Mars's equator is defined by its rotation, but the location of its Prime Meridian was specified, as was Earth's (at Greenwich), by the choice of an arbitrary point; Mädler and Beer selected a line for their first maps of Mars in 1830. After the spacecraft Mariner 9 provided extensive imagery of Mars in 1972, a small crater (later called Airy-0), located in the Sinus Meridiani ("Middle Bay" or "Meridian Bay"), was chosen by Merton E. Davies, Harold Masursky, and Gérard de Vaucouleurs for the definition of 0.0° longitude to coincide with the original selection. Because Mars has no oceans, and hence no "sea level", a zero-elevation surface had to be selected as a reference level; this is called the areoid of Mars, analogous to the terrestrial geoid. Zero altitude was defined by the height at which there is 610.5 Pa (6.105 mbar) of atmospheric pressure. This pressure corresponds to the triple point of water, and it is about 0.6% of the sea-level surface pressure on Earth (0.006 atm). For mapping purposes, the United States Geological Survey divides the surface of Mars into thirty cartographic quadrangles, each named for a classical albedo feature it contains. In April 2023, The New York Times reported an updated global map of Mars based on images from the Hope spacecraft. A related, but much more detailed, global Mars map was released by NASA on 16 April 2023. The vast upland region Tharsis contains several massive volcanoes, which include the shield volcano Olympus Mons. The edifice is over 600 km (370 mi) wide. Because the mountain is so large and has complex structure at its edges, assigning it a definite height is difficult. Its local relief, from the foot of the cliffs which form its northwest margin to its peak, is over 21 km (13 mi), a little over twice the height of Mauna Kea as measured from its base on the ocean floor.
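The 0.6% figure for the areoid reference pressure above can be verified directly from the numbers given; a simple arithmetic check:

```python
# The 610.5 Pa Martian zero-elevation datum as a fraction of Earth's
# standard sea-level pressure (101,325 Pa = 1 atm).
MARS_DATUM_PA = 610.5
EARTH_SEA_LEVEL_PA = 101_325

ratio = MARS_DATUM_PA / EARTH_SEA_LEVEL_PA
print(f"{ratio:.4f} atm ({ratio:.1%})")  # 0.0060 atm (0.6%)
```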
The total elevation change from the plains of Amazonis Planitia, over 1,000 km (620 mi) to the northwest, to the summit approaches 26 km (16 mi), roughly three times the height of Mount Everest, which in comparison stands at just over 8.8 kilometres (5.5 mi). Consequently, Olympus Mons is either the tallest or second-tallest mountain in the Solar System; the only known mountain which might be taller is the Rheasilvia peak on the asteroid Vesta, at 20–25 km (12–16 mi). The dichotomy of Martian topography is striking: northern plains flattened by lava flows contrast with the southern highlands, pitted and cratered by ancient impacts. It is possible that, four billion years ago, the Northern Hemisphere of Mars was struck by an object one-tenth to two-thirds the size of Earth's Moon. If this is the case, the Northern Hemisphere of Mars would be the site of an impact crater 10,600 by 8,500 kilometres (6,600 by 5,300 mi) in size, or roughly the area of Europe, Asia, and Australia combined, surpassing Utopia Planitia and the Moon's South Pole–Aitken basin as the largest impact crater in the Solar System. Mars is scarred by 43,000 impact craters with a diameter of 5 kilometres (3.1 mi) or greater. The largest exposed crater is Hellas, which is 2,300 kilometres (1,400 mi) wide and 7,000 metres (23,000 ft) deep, and is a light albedo feature clearly visible from Earth. There are other notable impact features, such as Argyre, which is around 1,800 kilometres (1,100 mi) in diameter, and Isidis, which is around 1,500 kilometres (930 mi) in diameter. Due to the smaller mass and size of Mars, the probability of an object colliding with the planet is about half that of Earth. Mars is located closer to the asteroid belt, so it has an increased chance of being struck by materials from that source. Mars is more likely to be struck by short-period comets, i.e., those that lie within the orbit of Jupiter. 
Martian craters can have a morphology that suggests the ground became wet after the meteor impact. The large canyon, Valles Marineris (Latin for 'Mariner Valleys', also known as Agathodaemon in the old canal maps), has a length of 4,000 kilometres (2,500 mi) and a depth of up to 7 kilometres (4.3 mi). The length of Valles Marineris is equivalent to the length of Europe and extends across one-fifth the circumference of Mars. By comparison, the Grand Canyon on Earth is only 446 kilometres (277 mi) long and nearly 2 kilometres (1.2 mi) deep. Valles Marineris was formed due to the swelling of the Tharsis area, which caused the crust in the area of Valles Marineris to collapse. In 2012, it was proposed that Valles Marineris is not just a graben, but a plate boundary where 150 kilometres (93 mi) of transverse motion has occurred, making Mars a planet with possibly a two-tectonic-plate arrangement. Images from the Thermal Emission Imaging System (THEMIS) aboard NASA's Mars Odyssey orbiter have revealed seven possible cave entrances on the flanks of the volcano Arsia Mons. The caves, named after loved ones of their discoverers, are collectively known as the "seven sisters". Cave entrances measure from 100 to 252 metres (328 to 827 ft) wide and they are estimated to be at least 73 to 96 metres (240 to 315 ft) deep. Because light does not reach the floor of most of the caves, they may extend much deeper than these lower estimates and widen below the surface. "Dena" is the only exception; its floor is visible and was measured to be 130 metres (430 ft) deep. The interiors of these caverns may be protected from micrometeoroids, UV radiation, solar flares and high-energy particles that bombard the planet's surface. Martian geysers (or CO2 jets) are putative sites of small gas and dust eruptions that occur in the south polar region of Mars during the spring thaw.
"Dark dune spots" and "spiders" – or araneiforms – are the two most visible types of features ascribed to these eruptions. Similarly sized dust will settle from the thinner Martian atmosphere sooner than it would on Earth. For example, the dust suspended by the 2001 global dust storms on Mars only remained in the Martian atmosphere for 0.6 years, while the dust from Mount Pinatubo took about two years to settle. However, under current Martian conditions, the mass movements involved are generally much smaller than on Earth. Even the 2001 global dust storms on Mars moved only the equivalent of a very thin dust layer – about 3 μm thick if deposited with uniform thickness between 58° north and south of the equator. Dust deposition at the two rover sites has proceeded at a rate of about the thickness of a grain every 100 sols.

Atmosphere

Mars lost its magnetosphere 4 billion years ago, possibly because of numerous asteroid strikes, so the solar wind interacts directly with the Martian ionosphere, lowering the atmospheric density by stripping away atoms from the outer layer. Both Mars Global Surveyor and Mars Express have detected ionized atmospheric particles trailing off into space behind Mars, and this atmospheric loss is being studied by the MAVEN orbiter. Compared to Earth, the atmosphere of Mars is quite rarefied. Atmospheric pressure on the surface today ranges from a low of 30 Pa (0.0044 psi) on Olympus Mons to over 1,155 Pa (0.1675 psi) in Hellas Planitia, with a mean pressure at the surface level of 600 Pa (0.087 psi). The highest atmospheric density on Mars is equal to that found 35 kilometres (22 mi) above Earth's surface. The resulting mean surface pressure is only 0.6% of Earth's 101.3 kPa (14.69 psi). The scale height of the atmosphere is about 10.8 kilometres (6.7 mi), which is higher than Earth's 6 kilometres (3.7 mi), because the surface gravity of Mars is only about 38% of Earth's.
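The scale-height comparison follows from the hydrostatic relation H = RT/(Mg). A minimal Python sketch, assuming a mean atmospheric temperature near 210 K and a CO2-dominated mean molar mass of about 43.3 g/mol (representative values, not taken from the text):

```python
# Isothermal pressure scale height: H = R*T / (M*g).
R = 8.314    # J/(mol*K), universal gas constant
T = 210.0    # K, assumed mean Martian atmospheric temperature
M = 0.0433   # kg/mol, assumed CO2-dominated mean molar mass
g = 3.71     # m/s^2, Martian surface gravity

H_km = R * T / (M * g) / 1000.0
print(f"Mars scale height ≈ {H_km:.1f} km")  # close to the ~10.8 km quoted
```

Plugging in terrestrial values instead (T ≈ 250 K, M ≈ 0.029 kg/mol, g ≈ 9.81 m/s²) gives roughly 7 km, illustrating how Mars's lower gravity stretches its atmosphere despite the colder temperatures.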
The atmosphere of Mars consists of about 96% carbon dioxide, 1.93% argon and 1.89% nitrogen along with traces of oxygen and water. The atmosphere is quite dusty, containing particulates about 1.5 μm in diameter which give the Martian sky a tawny color when seen from the surface. It may take on a pink hue due to iron oxide particles suspended in it. Despite repeated detections of methane on Mars, there is no scientific consensus as to its origin. One suggestion is that methane exists on Mars and that its concentration fluctuates seasonally. The methane could be produced by non-biological processes such as serpentinization involving water, carbon dioxide, and the mineral olivine, which is known to be common on Mars, or by Martian life. Mars's higher concentration of atmospheric CO2 and lower surface pressure, compared to Earth's, may explain why sound is attenuated more there, where natural sources are rare apart from the wind. Using acoustic recordings collected by the Perseverance rover, researchers concluded that the speed of sound there is approximately 240 m/s for frequencies below 240 Hz, and 250 m/s for those above. Auroras have been detected on Mars. Because Mars lacks a global magnetic field, the types and distribution of auroras there differ from those on Earth; rather than being mostly restricted to polar regions as is the case on Earth, a Martian aurora can encompass the planet. In September 2017, NASA reported radiation levels on the surface of the planet Mars were temporarily doubled, and were associated with an aurora 25 times brighter than any observed earlier, due to a massive, and unexpected, solar storm in the middle of the month. Mars has seasons that alternate between its northern and southern hemispheres, much as Earth's do.
Additionally, the orbit of Mars has, compared to Earth's, a large eccentricity and approaches perihelion when it is summer in its southern hemisphere and winter in its northern, and aphelion when it is winter in its southern hemisphere and summer in its northern. As a result, the seasons in its southern hemisphere are more extreme and the seasons in its northern are milder than would otherwise be the case. The summer temperatures in the south can be warmer than the equivalent summer temperatures in the north by up to 30 °C (54 °F). Martian surface temperatures vary from lows of about −110 °C (−166 °F) to highs of up to 35 °C (95 °F) in equatorial summer. The wide range in temperatures is due to the thin atmosphere which cannot store much solar heat, the low atmospheric pressure (about 1% that of the atmosphere of Earth), and the low thermal inertia of Martian soil. The planet is 1.52 times as far from the Sun as Earth, resulting in just 43% of the amount of sunlight. Mars has the largest dust storms in the Solar System, reaching speeds of over 160 km/h (100 mph). These can vary from a storm over a small area, to gigantic storms that cover the entire planet. They tend to occur when Mars is closest to the Sun, and have been shown to increase global temperature. The seasons also deposit coverings of dry ice on the polar ice caps.

Hydrology

Mars contains water in large amounts, but most of it is dust-covered water ice at the Martian polar ice caps. The volume of water ice in the south polar ice cap, if melted, would be enough to cover most of the surface of the planet to a depth of 11 metres (36 ft). Water in its liquid form cannot persist on the surface due to Mars's low atmospheric pressure, which is less than 1% that of Earth. Only at the lowest of elevations are the pressure and temperature high enough for liquid water to exist for short periods.
Although little water is present in the atmosphere, there is enough to produce clouds of water ice and occasional snow and frost, often mixed with carbon dioxide (dry ice) snow. Landforms visible on Mars strongly suggest that liquid water has existed on the planet's surface. Huge linear swathes of scoured ground, known as outflow channels, cut across the surface in about 25 places. These are thought to be a record of erosion caused by the catastrophic release of water from subsurface aquifers, though some of these structures have been hypothesized to result from the action of glaciers or lava. One of the larger examples, Ma'adim Vallis, is 700 kilometres (430 mi) long, much larger than the Grand Canyon, with a width of 20 kilometres (12 mi) and a depth of 2 kilometres (1.2 mi) in places. It is thought to have been carved by flowing water early in Mars's history. The youngest of these channels is thought to have formed only a few million years ago. Elsewhere, particularly on the oldest areas of the Martian surface, finer-scale, dendritic networks of valleys are spread across significant proportions of the landscape. Features of these valleys and their distribution strongly imply that they were carved by runoff resulting from precipitation in early Mars history. Subsurface water flow and groundwater sapping may play important subsidiary roles in some networks, but precipitation was probably the root cause of the incision in almost all cases. Along crater and canyon walls, there are thousands of features that appear similar to terrestrial gullies. The gullies tend to be in the highlands of the Southern Hemisphere and face the Equator; all are poleward of 30° latitude. A number of authors have suggested that their formation process involves liquid water, probably from melting ice, although others have argued for formation mechanisms involving carbon dioxide frost or the movement of dry dust.
No partially degraded gullies have formed by weathering and no superimposed impact craters have been observed, indicating that these are young features, possibly still active. Other geological features, such as deltas and alluvial fans preserved in craters, are further evidence for warmer, wetter conditions at an interval or intervals in earlier Mars history. Such conditions necessarily require the widespread presence of crater lakes across a large proportion of the surface, for which there is independent mineralogical, sedimentological and geomorphological evidence. Further evidence that liquid water once existed on the surface of Mars comes from the detection of specific minerals such as hematite and goethite, both of which sometimes form in the presence of water. The chemical signature of water vapor on Mars was first unequivocally demonstrated in 1963 by spectroscopy using an Earth-based telescope. In 2004, Opportunity detected the mineral jarosite. This forms only in the presence of acidic water, showing that water once existed on Mars. The Spirit rover found concentrated deposits of silica in 2007 that indicated wet conditions in the past, and in December 2011, the mineral gypsum, which also forms in the presence of water, was found on the surface by NASA's Mars rover Opportunity. It is estimated that the amount of water in the upper mantle of Mars, represented by hydroxyl ions contained within Martian minerals, is equal to or greater than that of Earth at 50–300 parts per million of water, which is enough to cover the entire planet to a depth of 200–1,000 metres (660–3,280 ft). On 18 March 2013, NASA reported evidence from instruments on the Curiosity rover of mineral hydration, likely hydrated calcium sulfate, in several rock samples including the broken fragments of "Tintina" rock and "Sutton Inlier" rock as well as in veins and nodules in other rocks like "Knorr" rock and "Wernicke" rock. 
Analysis using the rover's DAN instrument provided evidence of subsurface water, amounting to as much as 4% water content, down to a depth of 60 centimetres (24 in), during the rover's traverse from the Bradbury Landing site to the Yellowknife Bay area in the Glenelg terrain. In September 2015, NASA announced that they had found strong evidence of hydrated brine flows in recurring slope lineae, based on spectrometer readings of the darkened areas of slopes. These streaks flow downhill in Martian summer, when the temperature is above −23 °C, and freeze at lower temperatures. These observations supported earlier hypotheses, based on timing of formation and their rate of growth, that these dark streaks resulted from water flowing just below the surface. However, later work suggested that the lineae may be dry, granular flows instead, with at most a limited role for water in initiating the process. A definitive conclusion about the presence, extent, and role of liquid water on the Martian surface remains elusive. Researchers suspect much of the low northern plains of the planet were covered with an ocean hundreds of meters deep, though this theory remains controversial. In March 2015, scientists stated that such an ocean might have been the size of Earth's Arctic Ocean. This finding was derived from the ratio of protium to deuterium in the modern Martian atmosphere compared to that ratio on Earth. The amount of Martian deuterium (D/H = (9.3 ± 1.7) × 10−4) is five to seven times the amount on Earth (D/H = 1.56 × 10−4), suggesting that ancient Mars had significantly higher levels of water. Results from the Curiosity rover had previously found a high ratio of deuterium in Gale Crater, though not significantly high enough to suggest the former presence of an ocean. Other scientists caution that these results have not been confirmed, and point out that Martian climate models have not yet shown that the planet was warm enough in the past to support bodies of liquid water.
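The "five to seven times" figure is simply the ratio of the two quoted D/H values, with the range coming from the ±1.7 measurement uncertainty; a quick check in Python:

```python
# D/H ratios as quoted: Mars (9.3 ± 1.7) x 10^-4, Earth 1.56 x 10^-4.
dh_mars, dh_err = 9.3e-4, 1.7e-4
dh_earth = 1.56e-4

enrichment = dh_mars / dh_earth
low = (dh_mars - dh_err) / dh_earth
high = (dh_mars + dh_err) / dh_earth
print(f"enrichment ≈ {enrichment:.1f}x (range {low:.1f}x to {high:.1f}x)")
```

The central value comes out near six, and the uncertainty band spans roughly five to seven, matching the stated range.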
Near the northern polar cap is the 81.4 kilometres (50.6 mi) wide Korolev Crater, which the Mars Express orbiter found to be filled with approximately 2,200 cubic kilometres (530 cu mi) of water ice. In November 2016, NASA reported finding a large amount of underground ice in the Utopia Planitia region. The volume of water detected has been estimated to be equivalent to the volume of water in Lake Superior (about 12,100 cubic kilometres). During observations from 2018 through 2021, the ExoMars Trace Gas Orbiter spotted indications of water, probably subsurface ice, in the Valles Marineris canyon system.

Orbital motion

Mars's average distance from the Sun is roughly 230 million km (143 million mi), and its orbital period is 687 (Earth) days. The solar day (or sol) on Mars is only slightly longer than an Earth day: 24 hours, 39 minutes, and 35.244 seconds. A Martian year is equal to 1.8809 Earth years, or 1 year, 320 days, and 18.2 hours. The gravitational potential difference, and thus the delta-v needed to transfer between Mars and Earth, is the second-lowest for Earth. The axial tilt of Mars is 25.19° relative to its orbital plane, which is similar to the axial tilt of Earth. As a result, Mars has seasons like Earth, though on Mars they are nearly twice as long because its orbital period is that much longer. In the present day, the orientation of the north pole of Mars is close to the star Deneb. Mars has a relatively pronounced orbital eccentricity of about 0.09; of the seven other planets in the Solar System, only Mercury has a larger orbital eccentricity. It is known that in the past, Mars has had a much more circular orbit. At one point, 1.35 million Earth years ago, Mars had an eccentricity of roughly 0.002, much less than that of Earth today. Mars's cycle of eccentricity is 96,000 Earth years compared to Earth's cycle of 100,000 years. Mars has its closest approach to Earth (opposition) in a synodic period of 779.94 days.
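The 779.94-day synodic period can be recovered from the two sidereal orbital periods via the relation 1/S = 1/P_inner − 1/P_outer; a sketch (686.98 days is the standard sidereal year behind the rounded 687-day figure):

```python
# Synodic period of Earth and Mars: 1/S = 1/P_inner - 1/P_outer.
P_earth = 365.256  # days, Earth's sidereal year
P_mars = 686.980   # days, Mars's sidereal year (the rounded "687 days")

S = 1.0 / (1.0 / P_earth - 1.0 / P_mars)
print(f"synodic period ≈ {S:.2f} days")  # ≈ 779.94 days
```

Intuitively, Earth "laps" Mars at the difference of their angular rates, so oppositions recur a little over every two years.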
It should not be confused with Mars conjunction, when Earth and Mars are on opposite sides of the Solar System, forming a straight line crossing the Sun. The average time between the successive oppositions of Mars, its synodic period, is 780 days; but the number of days between successive oppositions can range from 764 to 812. The distance at close approach varies between about 54 and 103 million km (34 and 64 million mi) due to the planets' elliptical orbits, which causes comparable variation in angular size. At their furthest, Mars and Earth can be as far as 401 million km (249 million mi) apart. Mars comes into opposition from Earth every 2.1 years. The planets come into opposition near Mars's perihelion in 2003, 2018 and 2035, with the 2020 and 2033 events being particularly close to perihelic opposition. The mean apparent magnitude of Mars is +0.71 with a standard deviation of 1.05. Because the orbit of Mars is eccentric, the magnitude at opposition can range from about −3.0 to −1.4. The minimum brightness is magnitude +1.86 when the planet is near aphelion and in conjunction with the Sun. At its brightest, Mars (along with Jupiter) is second only to Venus in apparent brightness. Mars usually appears distinctly yellow, orange, or red. When farthest away from Earth, it is more than seven times farther away than when it is closest. Mars is usually close enough for particularly good viewing once or twice at 15-year or 17-year intervals. Optical ground-based telescopes are typically limited to resolving features about 300 kilometres (190 mi) across when Earth and Mars are closest because of Earth's atmosphere. As Mars approaches opposition, it begins a period of retrograde motion, which means it will appear to move backwards in a looping curve with respect to the background stars. This retrograde motion lasts for about 72 days, and Mars reaches its peak apparent brightness in the middle of this interval.
Moons

Mars has two relatively small (compared to Earth's) natural moons, Phobos (about 22 km (14 mi) in diameter) and Deimos (about 12 km (7.5 mi) in diameter), which orbit at 9,376 km (5,826 mi) and 23,460 km (14,580 mi) around the planet. The origin of both moons is unclear, although a popular theory states that they were asteroids captured into Martian orbit. Both satellites were discovered in 1877 by Asaph Hall and were named after the characters Phobos (the deity of panic and fear) and Deimos (the deity of terror and dread), twins from Greek mythology who accompanied their father Ares, god of war, into battle. Mars was the Roman equivalent to Ares. In modern Greek, the planet retains its ancient name Ares (Aris: Άρης). From the surface of Mars, the motions of Phobos and Deimos appear different from that of Earth's Moon. Phobos rises in the west, sets in the east, and rises again in just 11 hours. Deimos, being only just outside synchronous orbit – where the orbital period would match the planet's period of rotation – rises as expected in the east, but slowly. Because the orbit of Phobos is below synchronous altitude, tidal forces from Mars are gradually lowering its orbit. In about 50 million years, it could either crash into Mars's surface or break up into a ring structure around the planet. The origin of the two satellites is not well understood. Their low albedo and carbonaceous chondrite composition have been regarded as similar to asteroids, supporting a capture theory. The unstable orbit of Phobos would seem to point toward a relatively recent capture. But both have circular orbits near the equator, which is unusual for captured objects, and the required capture dynamics are complex. Accretion early in the history of Mars is plausible, but would not account for a composition resembling asteroids rather than Mars itself, if that is confirmed.
Mars may have yet-undiscovered moons, smaller than 50 to 100 metres (160 to 330 ft) in diameter, and a dust ring is predicted to exist between Phobos and Deimos. A third possibility for their origin as satellites of Mars is the involvement of a third body or a type of impact disruption. More recent lines of evidence for Phobos having a highly porous interior, and suggesting a composition containing mainly phyllosilicates and other minerals known from Mars, point toward an origin of Phobos from material ejected by an impact on Mars that reaccreted in Martian orbit, similar to the prevailing theory for the origin of Earth's satellite. Although the visible and near-infrared (VNIR) spectra of the moons of Mars resemble those of outer-belt asteroids, the thermal infrared spectra of Phobos are reported to be inconsistent with chondrites of any class. It is also possible that Phobos and Deimos were fragments of an older moon, formed by debris from a large impact on Mars, and then destroyed by a more recent impact upon the satellite. More recently, a study conducted by a team of researchers from multiple countries suggested that a lost moon, at least fifteen times the size of Phobos, may have existed in the past. Analysis of rocks that record tidal processes on the planet suggests that these tides may have been regulated by such a past moon.

Human observations and exploration

The history of observations of Mars is marked by oppositions of Mars, when the planet is closest to Earth and hence is most easily visible, which occur every couple of years. Even more notable are the perihelic oppositions of Mars, which are distinguished because Mars is close to perihelion, making it even closer to Earth. The ancient Sumerians named Mars Nergal, the god of war and plague. During Sumerian times, Nergal was a minor deity of little significance, but, during later times, his main cult center was the city of Nineveh.
In Mesopotamian texts, Mars is referred to as the "star of judgement of the fate of the dead". The existence of Mars as a wandering object in the night sky was also recorded by the ancient Egyptian astronomers and, by 1534 BCE, they were familiar with the retrograde motion of the planet. By the period of the Neo-Babylonian Empire, the Babylonian astronomers were making regular records of the positions of the planets and systematic observations of their behavior. For Mars, they knew that the planet made 37 synodic periods, or 42 circuits of the zodiac, every 79 years. They invented arithmetic methods for making minor corrections to the predicted positions of the planets. In Ancient Greece, the planet was known as Πυρόεις; more commonly, the Greek name for the planet now referred to as Mars was Ares. It was the Romans who named the planet Mars, for their god of war, often represented by the sword and shield of the planet's namesake. In the fourth century BCE, Aristotle noted that Mars disappeared behind the Moon during an occultation, indicating that the planet was farther away than the Moon. Ptolemy, a Greek living in Alexandria, attempted to address the problem of the orbital motion of Mars. Ptolemy's model and his collective work on astronomy were presented in the multi-volume collection later called the Almagest (from the Arabic for "greatest"), which became the authoritative treatise on Western astronomy for the next fourteen centuries. Literature from ancient China confirms that Mars was known to Chinese astronomers by no later than the fourth century BCE. In East Asian cultures, Mars is traditionally referred to as the "fire star" (火星) based on the Wuxing system. In 1609, Johannes Kepler published a 10-year study of the Martian orbit, using the diurnal parallax of Mars, measured by Tycho Brahe, to make a preliminary calculation of the relative distance to the planet.
From Brahe's observations of Mars, Kepler deduced that the planet orbited the Sun not in a circle, but in an ellipse. Moreover, Kepler showed that Mars sped up as it approached the Sun and slowed down as it moved farther away, in a manner that later physicists would explain as a consequence of the conservation of angular momentum. In 1610, the Italian astronomer Galileo Galilei made the first use of a telescope for astronomical observation, including of Mars. With the telescope, the diurnal parallax of Mars was again measured in an effort to determine the Sun-Earth distance; this was first performed by Giovanni Domenico Cassini in 1672. The early parallax measurements were hampered by the quality of the instruments. The only observed occultation of Mars by Venus was that of 13 October 1590, seen by Michael Maestlin at Heidelberg. By the 19th century, the resolution of telescopes reached a level sufficient for surface features to be identified. On 5 September 1877, a perihelic opposition of Mars occurred. The Italian astronomer Giovanni Schiaparelli used a 22-centimetre (8.7 in) telescope in Milan to help produce the first detailed map of Mars. These maps notably contained features he called canali, which, with the possible exception of the natural canyon Valles Marineris, were later shown to be an optical illusion. These canali were supposedly long, straight lines on the surface of Mars, to which he gave the names of famous rivers on Earth. His term, which means "channels" or "grooves", was popularly mistranslated in English as "canals". Influenced by the observations, the orientalist Percival Lowell founded an observatory which had 30- and 45-centimetre (12- and 18-in) telescopes. The observatory was used for the exploration of Mars during the last good opportunity in 1894, and the following less favorable oppositions. He published several books on Mars and life on the planet, which had a great influence on the public.
The canali were independently observed by other astronomers, like Henri Joseph Perrotin and Louis Thollon in Nice, using one of the largest telescopes of that time. The seasonal changes (consisting of the diminishing of the polar caps and the dark areas formed during Martian summers) in combination with the canals led to speculation about life on Mars, and it was a long-held belief that Mars contained vast seas and vegetation. As bigger telescopes were used, fewer long, straight canali were observed. During observations in 1909 by Antoniadi with an 84-centimetre (33 in) telescope, irregular patterns were observed, but no canali were seen. The first spacecraft from Earth to visit Mars was the Soviet Union's Mars 1, which passed the planet in 1963, although contact had been lost en route. NASA's Mariner 4 followed and became the first spacecraft to successfully transmit from Mars; launched on 28 November 1964, it made its closest approach to the planet on 15 July 1965. Mariner 4 detected the weak Martian radiation belt, measured at about 0.1% that of Earth, and captured the first images of another planet from deep space. Once spacecraft visited the planet during the 1960s and 1970s, many previous concepts of Mars were radically revised. After the results of the Viking life-detection experiments, the hypothesis of a dead planet was generally accepted. The data from Mariner 9 and Viking allowed better maps of Mars to be made. Between Viking 1's shutdown in 1982 and 1997, Mars was visited only by three unsuccessful probes: two that flew past without contact (Phobos 1, 1988; Mars Observer, 1993) and one (Phobos 2, 1989) that malfunctioned in orbit before reaching its destination, Phobos. In 1997, Mars Pathfinder became the first successful rover mission beyond the Moon and, together with Mars Global Surveyor (operated until late 2006), began an uninterrupted active robotic presence at Mars that has lasted to the present day.
It produced complete, extremely detailed maps of the Martian topography, magnetic field and surface minerals. Starting with these missions, a range of new, improved crewless spacecraft, including orbiters, landers, and rovers, have been sent to Mars, with successful missions by NASA (United States), JAXA (Japan), ESA, the United Kingdom, ISRO (India), Roscosmos (Russia), the United Arab Emirates, and CNSA (China) to study the planet's surface, climate, and geology, uncovering the history and dynamics of the Martian hydrosphere and possible traces of ancient life. As of 2023, Mars is host to ten functioning spacecraft. Eight are in orbit: 2001 Mars Odyssey, Mars Express, Mars Reconnaissance Orbiter, MAVEN, ExoMars Trace Gas Orbiter, the Hope orbiter, and the Tianwen-1 orbiter. Another two are on the surface: the Mars Science Laboratory Curiosity rover and the Perseverance rover. Collected maps are available online at websites including Google Mars. NASA provides two online tools: Mars Trek, which provides visualizations of the planet using data from 50 years of exploration, and Experience Curiosity, which simulates traveling on Mars in 3-D with Curiosity. As of February 2024, debris from these types of missions has reached over seven tons. Most of it consists of crashed and inactive spacecraft as well as discarded components. In April 2024, NASA selected several companies to begin studies on providing commercial services to further enable robotic science on Mars. Key areas include establishing telecommunications, payload delivery and surface imaging.

Habitability and habitation

During the late 19th century, it was widely accepted in the astronomical community that Mars had life-supporting qualities, including the presence of oxygen and water. However, in 1894 W. W.
Campbell at Lick Observatory observed the planet and found that "if water vapor or oxygen occur in the atmosphere of Mars it is in quantities too small to be detected by spectroscopes then available". That observation contradicted many of the measurements of the time and was not widely accepted. Campbell and V. M. Slipher repeated the study in 1909 using better instruments, but with the same results. It was not until the findings were confirmed by W. S. Adams in 1925 that the myth of the Earth-like habitability of Mars was finally broken. However, even in the 1960s, articles were published on Martian biology, putting aside explanations other than life for the seasonal changes on Mars. The current understanding of planetary habitability – the ability of a world to develop environmental conditions favorable to the emergence of life – favors planets that have liquid water on their surface. Most often this requires the orbit of a planet to lie within the habitable zone, which for the Sun is estimated to extend from within the orbit of Earth to about that of Mars. During perihelion, Mars dips inside this region, but Mars's thin (low-pressure) atmosphere prevents liquid water from existing over large regions for extended periods. The past flow of liquid water demonstrates the planet's potential for habitability. Recent evidence has suggested that any water on the Martian surface may have been too salty and acidic to support regular terrestrial life. The environmental conditions on Mars are a challenge to sustaining organic life: the planet has little heat transfer across its surface, it has poor insulation against bombardment by the solar wind due to the absence of a magnetosphere and has insufficient atmospheric pressure to retain water in a liquid form (water instead sublimes to a gaseous state). 
Mars is nearly, or perhaps totally, geologically dead; the end of volcanic activity has apparently stopped the recycling of chemicals and minerals between the surface and interior of the planet. Evidence suggests that the planet was once significantly more habitable than it is today, but whether living organisms ever existed there remains unknown. The Viking probes of the mid-1970s carried experiments designed to detect microorganisms in Martian soil at their respective landing sites and had positive results, including a temporary increase in CO2 production on exposure to water and nutrients. This sign of life was later disputed by scientists, resulting in a continuing debate, with NASA scientist Gilbert Levin asserting that Viking may have found life. A 2014 analysis of Martian meteorite EETA79001 found chlorate, perchlorate, and nitrate ions in sufficiently high concentrations to suggest that they are widespread on Mars. UV and X-ray radiation would turn chlorate and perchlorate ions into other, highly reactive oxychlorines, indicating that any organic molecules would have to be buried under the surface to survive. Small quantities of methane and formaldehyde detected by Mars orbiters are both claimed to be possible evidence for life, as these chemical compounds would quickly break down in the Martian atmosphere. Alternatively, these compounds may instead be replenished by volcanic or other geological means, such as serpentinization. Impact glass, formed by meteor impacts, which on Earth can preserve signs of life, has also been found on the surface of impact craters on Mars; this glass could have preserved signs of life, if life existed at the site. The Cheyava Falls rock discovered on Mars in June 2024 has been designated by NASA as a "potential biosignature" and was core sampled by the Perseverance rover for possible return to Earth and further examination.
Although highly intriguing, no definitive determination of a biological or abiotic origin for this rock can be made with the data currently available. Several plans for a human mission to Mars have been proposed, but none have come to fruition. The NASA Authorization Act of 2017 directed NASA to study the feasibility of a crewed Mars mission in the early 2030s; the resulting report concluded that this would be unfeasible. In addition, in 2021, China announced plans to send a crewed mission to Mars in 2033. Privately held companies such as SpaceX have also proposed plans to send humans to Mars, with the eventual goal of settling on the planet. As of 2024, SpaceX has proceeded with the development of the Starship launch vehicle with the goal of Mars colonization. In plans shared in April 2024, Elon Musk envisioned the beginning of a Mars colony within the following twenty years, enabled by the planned mass manufacturing of Starship and initially sustained by resupply from Earth and in situ resource utilization on Mars, until the colony reaches full self-sustainability. Any future human mission to Mars will likely take place within the optimal Mars launch window, which occurs every 26 months. The moon Phobos has been proposed as an anchor point for a space elevator. Besides national space agencies and space companies, groups such as the Mars Society and The Planetary Society advocate for human missions to Mars.

In culture

Mars is named after the Roman god of war (Greek Ares), but was also associated with the demi-god Heracles (Roman Hercules) by ancient Greek astronomers, as detailed by Aristotle. This association between Mars and war dates back at least to Babylonian astronomy, in which the planet was named for the god Nergal, deity of war and destruction. It persisted into modern times, as exemplified by Gustav Holst's orchestral suite The Planets, whose famous first movement labels Mars "The Bringer of War".
The planet's symbol, a circle with an arrow pointing out to the upper right, is also used as a symbol for the male gender. The symbol dates from at least the 11th century, though a possible predecessor has been found in the Greek Oxyrhynchus Papyri.

The idea that Mars was populated by intelligent Martians became widespread in the late 19th century. Schiaparelli's "canali" observations, combined with Percival Lowell's books on the subject, put forward the standard notion of a planet that was a drying, cooling, dying world with ancient civilizations constructing irrigation works. Many other observations and proclamations by notable personalities added to what has been termed "Mars Fever". In the present day, high-resolution mapping of the surface of Mars has revealed no artifacts of habitation, but pseudoscientific speculation about intelligent life on Mars continues. Reminiscent of the canali observations, these speculations are based on small-scale features perceived in spacecraft images, such as "pyramids" and the "Face on Mars". In his book Cosmos, planetary astronomer Carl Sagan wrote: "Mars has become a kind of mythic arena onto which we have projected our Earthly hopes and fears."

The depiction of Mars in fiction has been stimulated by its dramatic red color and by nineteenth-century scientific speculations that its surface conditions might support not just life but intelligent life. This gave rise to many science-fiction stories involving these concepts, such as H. G. Wells's The War of the Worlds, in which Martians seek to escape their dying planet by invading Earth; Ray Bradbury's The Martian Chronicles, in which human explorers accidentally destroy a Martian civilization; as well as Edgar Rice Burroughs's Barsoom series, C. S. Lewis's novel Out of the Silent Planet (1938), and a number of Robert A. Heinlein stories before the mid-sixties. Since then, depictions of Martians have also extended to animation.
A comic figure of an intelligent Martian, Marvin the Martian, appeared in Haredevil Hare (1948) as a character in the Looney Tunes animated cartoons of Warner Brothers, and has continued as part of popular culture to the present. After the Mariner and Viking spacecraft had returned pictures of Mars as a lifeless and canal-less world, these ideas about Mars were abandoned; for many science-fiction authors, the new discoveries initially seemed like a constraint, but eventually the post-Viking knowledge of Mars became itself a source of inspiration for works like Kim Stanley Robinson's Mars trilogy.