Black hole

A black hole is an astronomical body so compact that its gravity prevents anything, including light, from escaping. Albert Einstein's theory of general relativity predicts that a sufficiently compact mass will form a black hole. The boundary of no escape is called the event horizon. In general relativity, a black hole's event horizon seals an object's fate but produces no locally detectable change when crossed. General relativity also predicts that every black hole should have a central singularity, where the curvature of spacetime is infinite. In many ways, a black hole acts like an ideal black body, as it reflects no light. Quantum field theory in curved spacetime predicts that event horizons emit Hawking radiation, with the same spectrum as a black body of a temperature inversely proportional to its mass. This temperature is of the order of billionths of a kelvin for stellar black holes, making it essentially impossible to observe directly.

Objects whose gravitational fields are too strong for light to escape were first considered in the 18th century by John Michell and Pierre-Simon Laplace. In 1916, Karl Schwarzschild found the first modern solution of general relativity that would characterise a black hole; the Schwarzschild metric is named after him. David Finkelstein, in 1958, first interpreted Schwarzschild's model as a region of space from which nothing can escape. Black holes were long considered a mathematical curiosity; it was not until the 1960s that theoretical work showed they were a generic prediction of general relativity. The first known black hole was Cygnus X-1, identified by several researchers independently in 1971.

Black holes typically form when massive stars collapse at the end of their life cycle. After a black hole has formed, it can grow by absorbing mass from its surroundings. Supermassive black holes of millions of solar masses may form by absorbing other stars and merging with other black holes, or via direct collapse of gas clouds. There is consensus that supermassive black holes exist in the centres of most galaxies.

The presence of a black hole can be inferred through its interaction with other matter and with electromagnetic radiation such as visible light. Matter falling toward a black hole can form an accretion disk of infalling plasma, heated by friction and emitting light. In extreme cases this creates a quasar, one of the brightest classes of objects in the universe. Merging black holes can also be detected through the gravitational waves they emit. If other stars are orbiting a black hole, their orbits can be used to determine the black hole's mass and location, and such observations can be used to exclude possible alternatives such as neutron stars. In this way, astronomers have identified numerous stellar black hole candidates in binary systems and established that the radio source known as Sagittarius A*, at the core of the Milky Way galaxy, contains a supermassive black hole of about 4.3 million solar masses.

History

The idea of a body so massive that even light could not escape was first proposed in the late 18th century by English astronomer and clergyman John Michell and independently by French scientist Pierre-Simon Laplace. Both scholars proposed very large stars, in contrast to the modern concept of an extremely dense object.
In a short section of a letter published in 1784, Michell calculated that a star with the same density as the Sun but 500 times its radius would not let any emitted light escape: the surface escape velocity would exceed the speed of light. Michell correctly hypothesized that such supermassive but non-radiating bodies might be detectable through their gravitational effects on nearby visible bodies. In 1796, Laplace mentioned that a star could be invisible if it were sufficiently large while speculating on the origin of the Solar System in his book Exposition du Système du Monde. Franz Xaver von Zach asked Laplace for a mathematical analysis, which Laplace provided and published in a journal edited by von Zach.

In 1905, Albert Einstein showed that the laws of electromagnetism are invariant under a Lorentz transformation: they are identical for observers travelling at different velocities relative to each other. This discovery became known as the principle of special relativity. Although the laws of mechanics had already been shown to be invariant, gravity remained to be incorporated. In 1907, Einstein published a paper proposing his equivalence principle, the hypothesis that inertial mass and gravitational mass have a common cause. Using the principle, Einstein predicted the gravitational redshift and half of the lensing effect of gravity on light; the full prediction of gravitational lensing required the development of general relativity. By 1915, Einstein had refined these ideas into his general theory of relativity, which explained how matter affects spacetime, which in turn affects the motion of other matter. This formed the basis for black hole physics.

Only a few months after Einstein published the field equations describing general relativity, astrophysicist Karl Schwarzschild set out to apply them to stars. He assumed spherical symmetry with no spin and found a solution to Einstein's equations. A few months after Schwarzschild, Johannes Droste, a student of Hendrik Lorentz, independently gave the same solution. At a certain radius from the center of the mass, the Schwarzschild solution became singular, meaning that some of the terms in the Einstein equations became infinite. The nature of this radius, which later became known as the Schwarzschild radius, was not understood at the time.

Many physicists of the early 20th century were skeptical of the existence of black holes. In a 1926 popular science book, Arthur Eddington critiqued the idea of a star with mass compressed to its Schwarzschild radius as a flaw in the then-poorly-understood theory of general relativity. In 1939, Einstein himself used general relativity in an attempt to prove that black holes were impossible. His work relied on increasing pressure or increasing centrifugal force balancing the force of gravity so that the object would not collapse beyond its Schwarzschild radius; he missed the possibility that implosion would drive the system below this critical value.

By the 1920s, astronomers had classified a number of white dwarf stars as too cool and dense to be explained by the gradual cooling of ordinary stars.
In 1926, Ralph Fowler showed that quantum-mechanical degeneracy pressure is larger than thermal pressure at these densities. In 1931, Subrahmanyan Chandrasekhar calculated that a non-rotating body of electron-degenerate matter below a certain limiting mass is stable, and by 1934 he showed that this explained the catalog of white dwarf stars. When Chandrasekhar announced his results, Eddington pointed out that stars above this limit would radiate until they were sufficiently dense to prevent light from exiting, a conclusion he considered absurd. Eddington and, later, Lev Landau argued that some yet unknown mechanism would stop the collapse.

In the 1930s, Fritz Zwicky and Walter Baade studied stellar novae, focusing on exceptionally bright ones they called supernovae. Zwicky promoted the idea that supernovae produced stars with the density of atomic nuclei (neutron stars), but this idea was largely ignored. In 1939, based on Chandrasekhar's reasoning, J. Robert Oppenheimer and George Volkoff predicted that neutron stars below a certain mass limit, later called the Tolman–Oppenheimer–Volkoff limit, would be stable due to neutron degeneracy pressure. Above that limit, they reasoned that either their model would not apply or that gravitational contraction would not stop.

John Archibald Wheeler and two of his students resolved questions about the model behind the Tolman–Oppenheimer–Volkoff (TOV) limit. Harrison and Wheeler developed the equations of state relating density to pressure for cold matter all the way through electron degeneracy and neutron degeneracy. Masami Wakano and Wheeler then used the equations to compute the equilibrium curve for stars, relating mass to circumference. They found no additional features that would invalidate the TOV limit. This meant that the only thing that could prevent black holes from forming was a dynamic process ejecting sufficient mass from a star as it cooled.

The modern concept of black holes was formulated by Robert Oppenheimer and his student Hartland Snyder in 1939. In their paper, Oppenheimer and Snyder solved Einstein's equations of general relativity for an idealized imploding star, in a model later called the Oppenheimer–Snyder model, then described the results from far outside the star. The implosion starts as one might expect: the star material rapidly collapses inward. However, as the density of the star increases, gravitational time dilation increases and the collapse, viewed from afar, seems to slow down further and further until the star reaches its Schwarzschild radius, where it appears frozen in time.

In 1958, David Finkelstein identified the Schwarzschild surface as an event horizon, calling it "a perfect unidirectional membrane: causal influences can cross it in only one direction". In this sense, events that occur inside a black hole cannot affect events that occur outside it. Finkelstein created a new reference frame to include the point of view of infalling observers, which allowed events at the surface of an imploding star to be related to events far away. By 1962 the two points of view were reconciled, convincing many skeptics that implosion into a black hole made physical sense.

The era from the mid-1960s to the mid-1970s was the "golden age of black hole research", when general relativity and black holes became mainstream subjects of research. In this period, more general black hole solutions were found.
In 1963, Roy Kerr found the exact solution for a rotating black hole. Two years later, Ezra Newman found the axisymmetric solution for a black hole that is both rotating and electrically charged. In 1967, Werner Israel showed that the Schwarzschild solution was the only possible solution for a nonspinning, uncharged black hole, meaning that a Schwarzschild black hole is defined by its mass alone. Similar uniqueness results were later found for Reissner–Nordström and Kerr black holes, defined only by their mass and their charge or spin respectively. Together, these findings became known as the no-hair theorem, which states that a stationary black hole is completely described by the three parameters of the Kerr–Newman metric: mass, angular momentum, and electric charge.

At first, it was suspected that the strange mathematical singularities found in each of the black hole solutions appeared only because of the assumption of perfect spherical symmetry, and therefore would not appear in generic situations where black holes are not necessarily symmetric. This view was held in particular by Vladimir Belinski, Isaak Khalatnikov, and Evgeny Lifshitz, who tried to prove that no singularities appear in generic solutions, although they later reversed their positions. However, in 1965, Roger Penrose proved that general relativity without quantum mechanics requires that singularities appear in all black holes.

Astronomical observations also made great strides during this era. In 1967, Antony Hewish and Jocelyn Bell Burnell discovered pulsars, and by 1969 these were shown to be rapidly rotating neutron stars. Until that time, neutron stars, like black holes, were regarded as just theoretical curiosities, but the discovery of pulsars showed their physical relevance and spurred further interest in all types of compact objects that might be formed by gravitational collapse. Based on observations in Greenwich and Toronto in the early 1970s, Cygnus X-1, a galactic X-ray source discovered in 1964, became the first astronomical object commonly accepted to be a black hole.

Work by James Bardeen, Jacob Bekenstein, Brandon Carter, and Stephen Hawking in the early 1970s led to the formulation of black hole thermodynamics. These laws describe the behaviour of a black hole in close analogy to the laws of thermodynamics by relating mass to energy, area to entropy, and surface gravity to temperature. The analogy was completed when Hawking showed in 1974 that quantum field theory implies that black holes should radiate like a black body with a temperature proportional to the surface gravity of the black hole, predicting the effect now known as Hawking radiation.

While Cygnus X-1, a stellar-mass black hole, was generally accepted by the scientific community as a black hole by the end of 1973, it would be decades before a supermassive black hole would gain the same broad recognition. Physicists such as Donald Lynden-Bell and Martin Rees had suggested as early as the 1960s that the powerful quasars at the centers of galaxies were powered by accreting supermassive black holes, but little observational proof existed at the time. The Hubble Space Telescope, launched decades later, found that supermassive black holes were not only present in these active galactic nuclei but were ubiquitous: almost every galaxy had a supermassive black hole at its center, many of which were quiescent.
In 1999, David Merritt proposed the M–sigma relation, which relates the velocity dispersion of matter in a galaxy's central bulge to the mass of the supermassive black hole at its core. Subsequent studies confirmed this correlation. Around the same time, based on telescope observations of the velocities of stars at the center of the Milky Way galaxy, independent groups led by Andrea Ghez and Reinhard Genzel concluded that the compact radio source at the center of the galaxy, Sagittarius A*, was likely a supermassive black hole.

On 11 February 2016, the LIGO Scientific Collaboration and Virgo Collaboration announced the first direct detection of gravitational waves, designated GW150914, representing the first observation of a black hole merger. At the time of the merger, the black holes were approximately 1.4 billion light-years from Earth and had masses of 30 and 35 solar masses. In 2017, Rainer Weiss, Kip Thorne, and Barry Barish, who had spearheaded the project, were awarded the Nobel Prize in Physics for their work. Since the initial discovery in 2015, hundreds more gravitational waves have been observed by LIGO and another interferometer, Virgo.

On 10 April 2019, the first direct image of a black hole and its vicinity was published, following observations made by the Event Horizon Telescope (EHT) in 2017 of the supermassive black hole at Messier 87's galactic centre. In 2022, the Event Horizon Telescope collaboration released an image of the black hole at the center of the Milky Way galaxy, Sagittarius A*; the data had been collected in 2017. In 2020, the Nobel Prize in Physics was awarded for work on black holes: Andrea Ghez and Reinhard Genzel shared one-half for their discovery that Sagittarius A* is a supermassive black hole, and Penrose received the other half for his work showing that the mathematics of general relativity requires the formation of black holes. Cosmologists lamented that Hawking's extensive theoretical work on black holes could not be honoured, as he had died in 2018 and the prize is not awarded posthumously.

In December 1967, a student reportedly suggested the phrase "black hole" at a lecture by John Wheeler; Wheeler adopted the term for its brevity and "advertising value", and Wheeler's stature in the field ensured it quickly caught on, leading some to credit Wheeler with coining the phrase. However, the term was used by others around that time. Science writer Marcia Bartusiak traces the term "black hole" to physicist Robert H. Dicke, who in the early 1960s reportedly compared the phenomenon to the Black Hole of Calcutta, notorious as a prison where people entered but never left alive. The term was used in print by Life and Science News magazines in 1963, and by science journalist Ann Ewing in her article "'Black Holes' in Space", dated 18 January 1964, which was a report on a meeting of the American Association for the Advancement of Science held in Cleveland, Ohio.

Definition

A black hole is generally defined as a region of spacetime from which no information-carrying signals or objects can escape. However, confirming by this definition that nothing escapes would require observing for an infinite time from an infinite distance, so the definition cannot be used to identify a physical black hole in practice. Broadly, physicists do not have a precisely agreed-upon definition of a black hole. Among astrophysicists, a black hole is a compact object with a mass larger than about four solar masses.
A black hole may also be defined as a reservoir of information, or as a region where space is falling inwards faster than the speed of light.

Properties

The no-hair theorem postulates that, once it achieves a stable condition after formation, a black hole has only three independent physical properties: mass, electric charge, and angular momentum; the black hole is otherwise featureless. If the conjecture is true, any two black holes that share the same values for these properties, or parameters, are indistinguishable from one another. The degree to which the conjecture holds for real black holes is currently an unsolved problem.

The simplest static black holes have mass but neither electric charge nor angular momentum. According to Birkhoff's theorem, these Schwarzschild black holes are the only vacuum solution that is spherically symmetric. Solutions describing more general black holes also exist: non-rotating charged black holes are described by the Reissner–Nordström metric, while the Kerr metric describes an uncharged rotating black hole. The most general stationary black hole solution known is the Kerr–Newman metric, which describes a black hole with both charge and angular momentum. Contrary to the popular notion of a black hole "sucking in everything" in its surroundings, from far away the external gravitational field of a black hole is identical to that of any other body of the same mass.

While a black hole can theoretically have any positive mass, the charge and angular momentum are constrained by the mass. The total electric charge Q and the total angular momentum J are expected to satisfy \( \frac{Q^{2}}{4\pi \epsilon_{0}} + \frac{c^{2}J^{2}}{GM^{2}} \leq GM^{2} \) for a black hole of mass M. Black holes with the maximum possible charge or spin satisfying this inequality are called extremal black holes. Solutions of Einstein's equations that violate the inequality exist, but they do not possess an event horizon. These so-called naked singularities could be observed from the outside. Because such singularities would make the universe inherently unpredictable, many physicists believe they cannot exist. The weak cosmic censorship hypothesis, proposed by Roger Penrose, rules out the formation of such singularities when they would be created through the gravitational collapse of realistic matter. The hypothesis has not yet been proven, however, and some physicists believe that naked singularities could exist. It is also unknown whether black holes could even become extremal, since natural processes counteract increasing spin and charge as a black hole approaches extremality.

The total mass of a black hole can be estimated by analyzing the motion of objects near the black hole, such as stars or gas. All black holes spin, often rapidly: one stellar-mass black hole, GRS 1915+105, has been estimated to spin at over 1,000 revolutions per second. The Milky Way's central black hole, Sagittarius A*, rotates at about 90% of the maximum rate. The spin rate can be inferred from measurements of atomic spectral lines in the X-ray range. As gas near the black hole plunges inward, high-energy X-ray emission from electron-positron pairs illuminates the gas further out, appearing red-shifted due to relativistic effects.
Depending on the spin of the black hole, this plunge happens at different radii from the hole, with different degrees of redshift. Astronomers can use the gap between the X-ray emission of the outer disk and the redshifted emission from plunging material to determine the spin of the black hole. A newer way to estimate spin is based on the temperature of gases accreting onto the black hole; the method requires an independent measurement of the black hole mass and of the inclination angle of the accretion disk, followed by computer modeling. Gravitational waves from coalescing binary black holes can also provide the spins of both progenitor black holes and of the merged hole, but such events are rare.

A spinning black hole carries angular momentum; the supermassive black hole in the center of the Messier 87 (M87) galaxy appears to have an angular momentum very close to the maximum theoretical value. For an uncharged black hole that limit is \( J \leq \frac{GM^{2}}{c} \), allowing the definition of a dimensionless spin magnitude satisfying \( 0 \leq \frac{cJ}{GM^{2}} \leq 1 \).

Most black holes are believed to have an approximately neutral charge. For example, Michal Zajaček, Arman Tursunov, Andreas Eckart, and Silke Britzen found the electric charge of Sagittarius A* to be at least ten orders of magnitude below the theoretical maximum. A charged black hole repels other like charges just like any other charged object. If a black hole were to become charged, particles with the opposite sign of charge would be pulled in by the extra electromagnetic force, while particles with the same sign of charge would be repelled, neutralizing the black hole. This effect may be weaker if the black hole is also spinning. The presence of charge can reduce the diameter of the black hole by up to 38%. The charge Q of a nonspinning black hole is bounded by \( Q \leq {\sqrt{G}}\,M \), where G is the gravitational constant and M is the black hole's mass.
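As a worked illustration of these bounds, here is a minimal sketch in Python. It computes the dimensionless spin magnitude and checks the charge-spin inequality quoted above; the constants and the example black hole (a hypothetical 10-solar-mass hole at 90% of maximal spin) are illustrative assumptions, not values from this article.

```python
import math

# Approximate SI constants (assumptions for illustration)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
eps0 = 8.854e-12     # vacuum permittivity, F/m
M_sun = 1.989e30     # solar mass, kg

def dimensionless_spin(M, J):
    """Spin magnitude c J / (G M^2); lies in [0, 1] for a black hole."""
    return c * J / (G * M**2)

def is_subextremal(M, J, Q=0.0):
    """Check Q^2/(4 pi eps0) + c^2 J^2/(G M^2) <= G M^2, as quoted above."""
    lhs = Q**2 / (4 * math.pi * eps0) + c**2 * J**2 / (G * M**2)
    return lhs <= G * M**2

# Hypothetical 10-solar-mass hole spinning at 90% of the uncharged maximum:
M = 10 * M_sun
J = 0.9 * G * M**2 / c           # J_max = G M^2 / c
print(dimensionless_spin(M, J))  # 0.9
print(is_subextremal(M, J))      # True
```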
Classification

Black holes can have a wide range of masses. The minimum mass of a black hole formed by stellar gravitational collapse is governed by the maximum mass of a neutron star and is believed to be approximately two to four solar masses. However, theoretical primordial black holes, believed to have formed soon after the Big Bang, could be far smaller, with masses as little as 10⁻⁵ grams at formation. These very small black holes are sometimes called micro black holes.

Black holes formed by stellar collapse are called stellar black holes. Estimates of their maximum mass at formation vary, but generally range from 10 to 100 solar masses, with higher estimates for black holes formed from low-metallicity progenitor stars. The mass of a black hole formed via a supernova has a lower bound: if the progenitor star is too small, the collapse is stopped by the degeneracy pressure of the star's constituents, allowing the condensation of matter into an exotic denser state. Degeneracy pressure arises from the Pauli exclusion principle: identical particles resist being forced into the same quantum state. Smaller progenitor stars, with masses less than about 8 M☉, are held together by the degeneracy pressure of electrons and become white dwarfs. For more massive progenitor stars, electron degeneracy pressure is no longer strong enough to resist the force of gravity, and the star is held together by neutron degeneracy pressure, which can operate at much higher densities, forming a neutron star. If the star is still too massive, even neutron degeneracy pressure cannot resist the force of gravity, and the star collapses into a black hole. Stellar black holes can also gain mass via accretion of nearby matter, often from a companion object such as a star.

Black holes that are larger than stellar black holes but smaller than supermassive black holes are called intermediate-mass black holes, with masses of approximately 10² to 10⁵ solar masses. These black holes seem to be rarer than their stellar and supermassive counterparts, with relatively few candidates having been observed. Physicists have speculated that such black holes may form from collisions in globular and star clusters or at the centers of low-mass galaxies. They may also form as the result of mergers of smaller black holes, with several LIGO observations finding merged black holes within the 110–350 solar mass range.

The black holes with the largest masses are called supermassive black holes, with masses more than 10⁶ times that of the Sun. These black holes are believed to exist at the centers of almost every large galaxy, including the Milky Way. Some scientists have proposed a subcategory of even larger black holes, called ultramassive black holes, with masses greater than 10⁹–10¹⁰ solar masses. Theoretical models predict that the accretion disc that feeds a black hole will become unstable once the hole reaches 50–100 billion times the mass of the Sun, setting a rough upper limit to black hole mass.

Structure

While black holes are conceptually invisible sinks of all matter and light, in astronomical settings their enormous gravity alters the motion of surrounding objects and pulls nearby gas inwards at near-light speed, making the regions around black holes some of the brightest objects in the universe. Some black holes have relativistic jets: thin streams of plasma travelling away from the black hole at more than one-tenth of the speed of light. A small fraction of the matter falling towards the black hole gets accelerated away along the hole's rotation axis. These jets can extend as far as millions of parsecs from the black hole itself. Black holes of any mass can have jets; however, jets are typically observed around spinning black holes with strongly magnetized accretion disks. Relativistic jets were more common in the early universe, when galaxies and their corresponding supermassive black holes were rapidly gaining mass. All black holes with jets also have an accretion disk, but the jets are usually brighter than the disk. Quasars, typically found in other galaxies, are believed to be supermassive black holes with jets; microquasars are believed to be stellar-mass objects with jets, typically observed in the Milky Way. The mechanism by which jets form is not yet known, but several options have been proposed. One proposal is the Blandford–Znajek process, in which the dragging of magnetic field lines by a black hole's rotation could launch jets of matter into space. The Penrose process, which involves extraction of a black hole's rotational energy, has also been proposed as a potential mechanism of jet propulsion.
Due to conservation of angular momentum, gas falling into the gravitational well created by a massive object will typically form a disk-like structure around the object. As the disk's angular momentum is transferred outward by internal processes, its matter falls farther inward, converting gravitational energy into heat and releasing a large flux of X-rays. The temperature of these disks can range from thousands to millions of kelvins, and temperatures can differ throughout a single accretion disk. Accretion disks can also emit in other parts of the electromagnetic spectrum, depending on the disk's turbulence and magnetization and the black hole's mass and angular momentum.

Accretion disks can be classified as geometrically thin or geometrically thick. Geometrically thin disks are mostly confined to the black hole's equatorial plane and have a well-defined edge at the innermost stable circular orbit (ISCO), while geometrically thick disks are supported by internal pressure and temperature and can extend inside the ISCO. Disks with high rates of electron scattering and absorption, appearing bright and opaque, are called optically thick; optically thin disks are more translucent and produce fainter images when viewed from afar. Accretion disks of black holes accreting beyond the Eddington limit are often referred to as Polish doughnuts, due to their thick toroidal shape. Quasar accretion disks are usually expected to appear blue in color; the disk of a stellar black hole, on the other hand, would likely look orange, yellow, or red, with its inner regions being the brightest. Theoretical research suggests that the hotter a disk is, the bluer it should be, although this is not always supported by observations of real astronomical objects. Accretion disk colors may also be altered by the Doppler effect, with the part of the disk travelling towards an observer appearing bluer and brighter and the part travelling away appearing redder and dimmer.

In Newtonian gravity, test particles can stably orbit at arbitrary distances from a central object. In general relativity, however, there exists a smallest radius at which a massive particle can orbit stably. Any infinitesimal inward perturbation of this orbit will lead to the particle spiraling into the black hole, and any outward perturbation will, depending on the energy, cause the particle to spiral in, move to a stable orbit farther from the black hole, or escape to infinity. This orbit is called the innermost stable circular orbit, or ISCO. The location of the ISCO depends on the spin of the black hole and the spin of the particle itself. In the case of a Schwarzschild black hole (spin zero) and a particle without spin, the location of the ISCO is \( r_{\rm ISCO} = 3\,r_{\rm s} = \frac{6\,GM}{c^{2}} \), where \( r_{\rm s} \) is the Schwarzschild radius of the black hole, G is the gravitational constant, and c is the speed of light. The radius of this orbit changes slightly with particle spin. For charged black holes, the ISCO moves inwards. For spinning black holes, the ISCO moves inwards for particles orbiting in the same direction that the black hole is spinning (prograde) and outwards for particles orbiting in the opposite direction (retrograde). For example, the ISCO for a particle orbiting retrograde can be as far out as about \( 9\,r_{\rm s} \), while the ISCO for a particle orbiting prograde can be as close as the event horizon itself.
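A minimal numerical sketch of these radii in Python; the constants are approximate SI values, and the 4.3 million solar-mass figure for Sagittarius A* is the one quoted earlier in this article.

```python
# Sketch: Schwarzschild radius and ISCO of a non-spinning black hole,
# using r_ISCO = 3 r_s = 6 G M / c^2 as quoted above.
G = 6.674e-11       # m^3 kg^-1 s^-2
c = 2.998e8         # m/s
M_sun = 1.989e30    # kg

def schwarzschild_radius(M):
    """r_s = 2 G M / c^2, in metres."""
    return 2 * G * M / c**2

def isco_radius(M):
    """ISCO of a Schwarzschild black hole: 3 r_s, in metres."""
    return 3 * schwarzschild_radius(M)

M_sgrA = 4.3e6 * M_sun                     # Sagittarius A*, mass from this article
print(schwarzschild_radius(M_sgrA) / 1e3)  # ~1.27e7 km
print(isco_radius(M_sgrA) / 1e3)           # ~3.8e7 km
```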
The photon sphere is a spherical boundary on which photons moving tangentially to the sphere are bent completely around the black hole, possibly orbiting multiple times. Light rays with impact parameters less than the radius of the photon sphere enter the black hole. For Schwarzschild black holes, the photon sphere has a radius 1.5 times the Schwarzschild radius; for non-Schwarzschild black holes the radius is at least 1.5 times the radius of the event horizon. When viewed from a great distance, the photon sphere creates an observable black hole shadow. Since no light emerges from within the black hole, this shadow is the limit for possible observations. The shadow of colliding black holes should have characteristic warped shapes, allowing scientists to detect black holes that are about to merge. While light can still escape from the photon sphere, any light that crosses the photon sphere on an inbound trajectory will be captured by the black hole. Therefore, any light that reaches an outside observer from the photon sphere must have been emitted by objects between the photon sphere and the event horizon. Light emitted towards the photon sphere may also curve around the black hole and return to the emitter.

For a rotating, uncharged black hole, the radius of the photon sphere depends on the spin parameter and on whether the photon orbits prograde or retrograde. For a photon orbiting prograde, the photon sphere lies 1–3 Schwarzschild radii from the center of the black hole, while for a photon orbiting retrograde it lies 3–5 Schwarzschild radii from the center; the exact location depends on the magnitude of the black hole's rotation. For a charged, nonrotating black hole, there is only one photon sphere, whose radius decreases with increasing black hole charge. For non-extremal, charged, rotating black holes, there are always two photon spheres, with the exact radii depending on the parameters of the black hole.
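As a small numerical sketch of these scales: the 1.5 r_s photon-sphere radius is the Schwarzschild-case value quoted above, while the √27 GM/c² apparent shadow radius is a standard general-relativity result assumed here for illustration, not a figure from this article.

```python
# Sketch: photon sphere (1.5 r_s, as stated above) and apparent shadow
# radius of a Schwarzschild black hole. The sqrt(27) G M / c^2 shadow
# radius (critical impact parameter) is a standard GR result assumed here.
import math

G, c, M_sun = 6.674e-11, 2.998e8, 1.989e30

def r_s(M):
    return 2 * G * M / c**2

def photon_sphere_radius(M):
    return 1.5 * r_s(M)

def shadow_radius(M):
    return math.sqrt(27) * G * M / c**2

M = 4.3e6 * M_sun                        # Sagittarius A*, from this article
print(photon_sphere_radius(M) / r_s(M))  # 1.5
print(shadow_radius(M) / r_s(M))         # ~2.6: the shadow appears larger
                                         # than the horizon itself
```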
Near a rotating black hole, spacetime rotates like a vortex. The rotating spacetime drags any matter and light into rotation around the spinning black hole. This effect of general relativity, called frame dragging, gets stronger closer to the spinning mass. The region of spacetime in which it is impossible to stay still is called the ergosphere. The ergosphere of a black hole is a volume bounded by the black hole's event horizon and the ergosurface, which coincides with the event horizon at the poles but bulges out from it around the equator. Matter and radiation can escape from the ergosphere. Through the Penrose process, objects can emerge from the ergosphere with more energy than they entered with; the extra energy is taken from the rotational energy of the black hole, slowing its rotation. A variation of the Penrose process in the presence of strong magnetic fields, the Blandford–Znajek process, is considered a likely mechanism for the enormous luminosity and relativistic jets of quasars and other active galactic nuclei.

The observable region of spacetime around a black hole closest to its event horizon is called the plunging region. In this region it is no longer possible for free-falling matter to follow circular orbits or to stop its final descent into the black hole. Instead, it rapidly plunges toward the black hole at close to the speed of light, growing increasingly hot and producing a characteristic, detectable thermal emission. Light and radiation emitted from this region can still escape the black hole's gravitational pull.

For a nonspinning, uncharged black hole, the radius of the event horizon, or Schwarzschild radius, is proportional to the mass M through \( r_{\rm s} = \frac{2GM}{c^{2}} \approx 2.95\,\frac{M}{M_{\odot}}\ \mathrm{km} \), where M☉ is the mass of the Sun. For a black hole with nonzero spin or electric charge, the radius is smaller, until an extremal black hole has an event horizon close to \( r_{+} = \frac{GM}{c^{2}} \), half the radius of a nonspinning, uncharged black hole of the same mass. Since the volume within the Schwarzschild radius increases with the cube of the radius, the average density of a black hole inside its Schwarzschild radius is inversely proportional to the square of its mass: supermassive black holes are much less dense than stellar black holes. The average density of a 10⁸ M☉ black hole is comparable to that of water.
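A short sketch of this density scaling; treating the hole as a flat-space sphere of radius r_s is a back-of-the-envelope convention assumed here.

```python
# Sketch: average density inside the Schwarzschild radius, illustrating
# the 1/M^2 scaling described above. SI units, approximate constants.
import math

G, c, M_sun = 6.674e-11, 2.998e8, 1.989e30

def r_s(M):
    return 2 * G * M / c**2

def mean_density(M):
    """Mass divided by the Euclidean volume of a sphere of radius r_s."""
    return M / (4 / 3 * math.pi * r_s(M)**3)

print(mean_density(M_sun))        # ~1.8e19 kg/m^3
print(mean_density(1e8 * M_sun))  # ~1.8e3 kg/m^3, of the same order as
                                  # water's 1e3 kg/m^3, as stated above
```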
The defining feature of a black hole is the existence of an event horizon, a boundary in spacetime through which matter and light can pass only inward, towards the center of the black hole. Nothing, not even light, can escape from inside the event horizon. The event horizon is referred to as such because if an event occurs within the boundary, information from that event cannot reach or affect an outside observer, making it impossible to determine whether such an event occurred. For non-rotating black holes the geometry of the event horizon is precisely spherical, while for rotating black holes the event horizon is oblate.

To a distant observer, a clock near a black hole would appear to tick more slowly than one farther from the black hole. This effect, known as gravitational time dilation, would also cause an object falling into a black hole to appear to slow as it approached the event horizon, never quite reaching the horizon from the perspective of an outside observer. All processes on this object would appear to slow down, and any light emitted by the object would appear redder and dimmer, an effect known as gravitational redshift. An object falling from half a Schwarzschild radius above the event horizon would fade away until it could no longer be seen, disappearing from view within one hundredth of a second. It would also appear to flatten onto the black hole, joining all other material that had ever fallen into the hole. An observer falling into a black hole, on the other hand, would not notice any of these effects as they crossed the event horizon: their own clocks would appear to them to tick normally, and they would cross the horizon after a finite time without noting any singular behaviour. In general relativity, it is impossible to determine the location of the event horizon from local observations, due to Einstein's equivalence principle.

Black holes that are rotating and/or charged have an inner horizon, often called the Cauchy horizon, inside the black hole. The inner horizon is divided into two segments: an ingoing section and an outgoing section. At the ingoing section of the Cauchy horizon, radiation and matter falling into the black hole would build up at the horizon, causing the curvature of spacetime there to grow to infinity and an infalling observer to experience tidal forces. This phenomenon is often called mass inflation, since it is associated with the exponential growth of a parameter characterizing the black hole's internal mass, and the resulting buildup of tidal forces is called the mass-inflation singularity or Cauchy horizon singularity. Some physicists have argued that in realistic black holes, accretion and Hawking radiation would stop mass inflation from occurring. At the outgoing section of the inner horizon, infalling radiation would backscatter off the black hole's spacetime curvature and travel outward, building up at the outgoing Cauchy horizon. This would cause an infalling observer to experience a gravitational shock wave and tidal forces as the spacetime curvature at the horizon grew to infinity; this buildup of tidal forces is called the shock singularity. Both of these singularities are weak, meaning that an object crossing them would be deformed only a finite amount by tidal forces, even though the spacetime curvature is infinite at the singularity. This contrasts with a strong singularity, where an object hitting the singularity would be stretched and squeezed by an infinite amount. They are also null singularities, meaning that a photon could travel parallel to them without ever being intercepted.

Ignoring quantum effects, every black hole contains a singularity: points where the curvature of spacetime becomes infinite and geodesics terminate within a finite proper time. For a non-rotating black hole, this region takes the shape of a single point; for a rotating black hole it is smeared out to form a ring singularity that lies in the plane of rotation. In both cases the singular region has zero volume. All of the mass of the black hole ends up in the singularity; since the singularity has nonzero mass in an infinitely small space, it can be thought of as having infinite density. Observers falling into a Schwarzschild black hole (i.e., non-rotating and uncharged) cannot avoid being carried into the singularity once they cross the event horizon. As they fall further in, they are torn apart by the growing tidal forces, in a process sometimes referred to as spaghettification or the noodle effect. Eventually they reach the singularity and are crushed into an infinitely small point. However, any perturbations, such as those caused by matter or radiation falling in, would cause space to oscillate chaotically near the singularity; any matter falling in would experience intense tidal forces rapidly changing in direction, all while being compressed into an increasingly small volume.

Alternative formulations of general relativity, as well as the addition of some quantum effects, can lead to regular, or nonsingular, black holes. For example, the fuzzball model, based on string theory, holds that black holes are actually made up of quantum microstates and need have neither a singularity nor an event horizon. The theory of loop quantum gravity proposes that the curvature and density at the center of a black hole are large but not infinite.

Formation

Black holes are formed by the gravitational collapse of massive stars, either by direct collapse or during a supernova explosion in a process called fallback.
Black holes can also result from the merger of two neutron stars, or of a neutron star and a black hole. Other, more speculative mechanisms include primordial black holes created from density fluctuations in the early universe, the collapse of dark stars (hypothetical objects powered by the annihilation of dark matter), or the collapse of hypothetical self-interacting dark matter.

Gravitational collapse occurs when an object's internal pressure is insufficient to resist the object's own gravity. At the end of a star's life, it runs out of hydrogen to fuse and starts fusing progressively heavier elements, until it reaches iron. Since the fusion of elements heavier than iron would consume more energy than it releases, nuclear fusion then ceases. If the iron core of the star is too massive, the star can no longer support itself and undergoes gravitational collapse. While most of the energy released during gravitational collapse is emitted very quickly, an outside observer does not actually see the end of this process. Even though the collapse takes a finite amount of time in the reference frame of the infalling matter, a distant observer would see the infalling material slow and halt just above the event horizon, due to gravitational time dilation. Light from the collapsing material takes longer and longer to reach the observer, with the delay growing to infinity as the emitting material approaches the event horizon. Thus the external observer never sees the formation of the event horizon; instead, the collapsing material seems to become dimmer and increasingly red-shifted, eventually fading away.

Observations of quasars at redshift \( z \sim 7 \), less than a billion years after the Big Bang, have led to investigations of other ways to form black holes. The accretion process that builds supermassive black holes has a limiting rate of mass accumulation, and a billion years is not enough time for a stellar-mass seed to reach quasar masses by accretion alone. One suggestion is the direct collapse of the nearly pure hydrogen (low-metallicity) gas clouds characteristic of the young universe into a supermassive star, which then collapses into a black hole. It has been suggested that seed black holes with typical masses of ~10⁵ M☉ could have formed in this way and then grown to ~10⁹ M☉. However, the very large amount of gas required for direct collapse is typically unstable against fragmentation into multiple stars. Thus another approach suggests massive star formation followed by collisions that seed massive black holes, which ultimately merge to create a quasar. A neutron star in a common envelope with a regular star can also accrete sufficient material to collapse into a black hole, or two neutron stars can merge; these avenues for the formation of black holes are considered relatively rare.

In the current epoch of the universe, the conditions needed to form black holes are rare and mostly found only in stars. In the early universe, however, conditions may have allowed black hole formation via other means. Fluctuations of spacetime soon after the Big Bang may have formed regions that were denser than their surroundings. Initially, these regions would not have been compact enough to form a black hole, but eventually the curvature of spacetime in such a region could become large enough to cause it to collapse into a black hole. Different models for the early universe vary widely in their predictions of the scale of these fluctuations.
Various models predict the creation of primordial black holes ranging from a Planck mass (~2.2×10⁻⁸ kg) to hundreds of thousands of solar masses. Primordial black holes with masses less than 10¹⁵ g would have evaporated by now due to Hawking radiation. Despite being extremely dense, the early universe did not re-collapse into a black hole during the Big Bang, because it was expanding rapidly and lacked the gravitational differentials necessary for black hole formation; models for the gravitational collapse of objects of relatively constant size, such as stars, do not necessarily apply in the same way to rapidly expanding space.

In principle, black holes could be formed in high-energy particle collisions that achieve sufficient density, although no such events have been detected. These hypothetical micro black holes, which could form from the collision of cosmic rays with Earth's atmosphere or in particle accelerators like the Large Hadron Collider, would not be able to aggregate additional mass. Instead, they would evaporate in about 10⁻²⁵ seconds, posing no threat to the Earth.

Evolution

Black holes can also merge with other objects such as stars or even other black holes. This is thought to have been important especially in the early growth of supermassive black holes, which could have formed from the aggregation of many smaller objects. The process has also been proposed as the origin of some intermediate-mass black holes. Mergers of supermassive black holes may take a long time: as a binary of supermassive black holes approaches, it ejects most nearby stars, leaving little matter for the black holes to interact with gravitationally that would allow them to draw closer together. This phenomenon has been called the final parsec problem, as the distance at which it happens is usually around one parsec.

When a black hole accretes matter, the gas in the inner accretion disk orbits at very high speeds because of its proximity to the black hole. The resulting friction heats the inner disk to temperatures at which it emits vast amounts of electromagnetic radiation (mainly X-rays) detectable by telescopes. By the time the matter of the disk reaches the ISCO, between 5.7% and 42% of its mass will have been converted to energy, depending on the black hole's spin, and about 90% of this energy is released within about 20 black hole radii. In many cases, accretion disks are accompanied by relativistic jets emitted along the black hole's poles, which carry away much of the energy. The mechanism behind these jets is currently not well understood, in part due to insufficient data.

Many of the universe's most energetic phenomena have been attributed to the accretion of matter onto black holes. Active galactic nuclei and quasars are believed to be the accretion disks of supermassive black holes. X-ray binaries are generally accepted to be binary systems in which one of the two objects is a compact object accreting matter from its companion. Ultraluminous X-ray sources may be the accretion disks of intermediate-mass black holes. At a certain rate of accretion, the outward radiation pressure becomes as strong as the inward gravitational force, and the black hole should be unable to accrete any faster. This limit is called the Eddington limit. However, many black holes accrete beyond this rate, owing to their non-spherical geometry or instabilities in the accretion disk. Accretion beyond the limit is called super-Eddington accretion and may have been commonplace in the early universe.
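A rough numerical sketch of these accretion energetics in Python. The 5.7%-42% radiative-efficiency range is the one quoted above; the Eddington-luminosity formula for ionized hydrogen is a standard result assumed here for illustration, not a formula given in this article.

```python
# Sketch: accretion energetics. The efficiency range 5.7%-42% is quoted
# above; L_Edd = 4 pi G M m_p c / sigma_T (ionized hydrogen) is a
# standard formula assumed for illustration.
import math

G, c = 6.674e-11, 2.998e8
m_p = 1.673e-27         # proton mass, kg
sigma_T = 6.652e-29     # Thomson cross-section, m^2
M_sun = 1.989e30

def accretion_energy(m_accreted, efficiency):
    """Energy released, E = eta * m * c^2, with eta in [0.057, 0.42]."""
    return efficiency * m_accreted * c**2

def eddington_luminosity(M):
    return 4 * math.pi * G * M * m_p * c / sigma_T

print(accretion_energy(M_sun, 0.057))    # ~1.0e46 J from one solar mass
print(eddington_luminosity(10 * M_sun))  # ~1.3e32 W for a 10 M_sun hole
```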
Stars have been observed to be torn apart by tidal forces in the immediate vicinity of supermassive black holes in galaxy nuclei, in what are known as tidal disruption events (TDEs). Some of the material from the disrupted star forms an accretion disk around the black hole, which emits observable electromagnetic radiation.

The correlation between the masses of supermassive black holes in the centres of galaxies and the velocity dispersion and mass of stars in their host bulges suggests that the formation of galaxies and the formation of their central black holes are related. Black hole winds from rapid accretion, particularly when the galaxy itself is still accreting matter, can compress nearby gas, accelerating star formation. However, if the winds become too strong, the black hole may blow nearly all of the gas out of the galaxy, quenching star formation. Black hole jets may also energize nearby cavities of plasma and eject low-entropy gas from the galactic core, causing gas in galactic centers to be hotter than expected.

If Hawking's theory of black hole radiation is correct, then black holes are expected to shrink and evaporate over time as they lose mass by the emission of photons and other particles. The temperature of this thermal spectrum (the Hawking temperature) is proportional to the surface gravity of the black hole, which is inversely proportional to the mass; hence, large black holes emit less radiation than small black holes. A stellar black hole of 1 M☉ has a Hawking temperature of 62 nanokelvins, far less than the 2.7 K temperature of the cosmic microwave background radiation. Stellar-mass and larger black holes receive more mass from the cosmic microwave background than they emit through Hawking radiation, and thus grow instead of shrinking. To have a Hawking temperature above 2.7 K (and so be able to evaporate), a black hole would need a mass less than that of the Moon; such a black hole would have a diameter of less than a tenth of a millimetre.
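A quick check of these figures in Python. The formula T = ħc³/(8πGMk_B) is the standard Hawking-temperature expression behind the numbers quoted above; the lunar mass is an approximate value assumed for comparison.

```python
# Sketch: Hawking temperature T = hbar c^3 / (8 pi G M k_B), the standard
# expression consistent with the 62-nanokelvin figure quoted above.
import math

hbar = 1.0546e-34    # reduced Planck constant, J s
G, c = 6.674e-11, 2.998e8
k_B = 1.381e-23      # Boltzmann constant, J/K
M_sun = 1.989e30
M_moon = 7.35e22     # approximate lunar mass, kg (assumed for comparison)

def hawking_temperature(M):
    return hbar * c**3 / (8 * math.pi * G * M * k_B)

print(hawking_temperature(M_sun))   # ~6.2e-8 K, i.e. the 62 nK stated above
print(hawking_temperature(M_moon))  # ~1.7 K, still below the 2.7 K CMB,
                                    # so only sub-lunar masses can evaporate
```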
The Hawking radiation of an astrophysical black hole is predicted to be very weak and would thus be exceedingly difficult to detect from Earth. A possible exception is the burst of gamma rays emitted in the last stage of the evaporation of primordial black holes. Searches for such flashes have proven unsuccessful and provide stringent limits on the possible existence of low-mass primordial black holes, with modern research predicting that primordial black holes must make up less than a fraction 10⁻⁷ of the universe's total mass. NASA's Fermi Gamma-ray Space Telescope, launched in 2008, has searched for these flashes but has not yet found any.

The properties of a black hole are constrained and interrelated by the theories that predict them. When based on general relativity, these relationships are called the laws of black hole mechanics. For a black hole that is not still forming or accreting matter, the zeroth law of black hole mechanics states that the black hole's surface gravity is constant across the event horizon. The first law relates changes in the black hole's surface area, angular momentum, and charge to changes in its energy. The second law says the surface area of a black hole never decreases on its own. Finally, the third law says that the surface gravity of a black hole is never zero. These laws are mathematical analogs of the laws of thermodynamics. They are not equivalent, however, because, according to general relativity without quantum mechanics, a black hole can never emit radiation, and thus its temperature must always be zero. Quantum mechanics predicts that a black hole will continuously emit thermal Hawking radiation, and therefore must always have a nonzero temperature; it also predicts that all black holes have an entropy that scales with their surface area. When quantum mechanics is accounted for, the laws of black hole mechanics become equivalent to the classical laws of thermodynamics. However, these conclusions are derived without a complete theory of quantum gravity, although many candidate theories do predict that black holes have entropy and temperature. The true quantum nature of black hole thermodynamics thus continues to be debated.

Observational evidence

Millions of black holes of around 30 solar masses, derived from stellar collapse, are expected to exist in the Milky Way; even a dwarf galaxy like Draco should have hundreds. Only a few of these have been detected. By nature, black holes do not themselves emit any electromagnetic radiation other than the hypothetical Hawking radiation, so astrophysicists searching for black holes must generally rely on indirect observations. The defining characteristic of a black hole is its event horizon. The horizon itself cannot be imaged, so all other possible explanations for these indirect observations must be considered and eliminated before concluding that a black hole has been observed.

The Event Horizon Telescope (EHT) is a global system of radio telescopes capable of directly observing a black hole shadow. The angular resolution of a telescope depends on its aperture and the wavelength at which it observes. Because the angular diameters of Sagittarius A* and Messier 87* in the sky are very small, a single telescope would need to be about the size of the Earth to clearly distinguish their horizons at radio wavelengths. By combining data from several radio telescopes around the world, the Event Horizon Telescope creates an effective aperture with the diameter of the Earth. The EHT team used imaging algorithms to compute the most probable image from the data in its observations of Sagittarius A* and M87*.
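To illustrate the aperture argument numerically: a sketch assuming the diffraction limit θ ≈ 1.22 λ/D, the EHT's roughly 1.3 mm observing wavelength, and a ~50 microarcsecond shadow for Sagittarius A*; none of these numbers is given in this article.

```python
# Sketch: diffraction-limited resolution theta ~ 1.22 * lambda / D,
# showing why horizon-scale radio imaging needs an Earth-sized aperture.
# The 1.3 mm wavelength and ~50 microarcsecond shadow are assumptions.
wavelength = 1.3e-3          # metres (EHT observes near 230 GHz)
earth_diameter = 1.27e7      # metres
microarcsec = 4.848e-12      # radians per microarcsecond

resolution = 1.22 * wavelength / earth_diameter
print(resolution / microarcsec)     # ~26 microarcseconds

shadow = 50 * microarcsec           # rough angular size of Sgr A*'s shadow
required_aperture = 1.22 * wavelength / shadow
print(required_aperture / 1e3)      # ~6500 km: an Earth-scale aperture
```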
Gravitational-wave interferometry can be used to detect merging black holes and other compact objects. In this method, a laser beam is split and sent down two long arms; the beams reflect off mirrors at the ends of the arms and converge at the intersection, where they cancel each other out. When a gravitational wave passes, however, it warps spacetime, changing the lengths of the arms themselves. Since each laser beam then travels a slightly different distance, the beams no longer cancel, producing a recognizable signal. Analysis of the signal gives scientists information about what caused the gravitational waves. Because gravitational waves are very weak, gravitational-wave observatories such as LIGO must have arms several kilometers long and must carefully control for terrestrial noise in order to detect them. Since the first detection in 2015, multiple gravitational-wave signals from black holes have been detected and analyzed.

The proper motions of stars near the centre of the Milky Way provide strong observational evidence that these stars are orbiting a supermassive black hole. Since 1995, astronomers have tracked the motions of 90 stars orbiting an invisible object coincident with the radio source Sagittarius A*. In 1998, by fitting the motions of the stars to Keplerian orbits, astronomers were able to infer that a 2.6×10⁶ M☉ object must be contained within a radius of 0.02 light-years. Since then, one of the stars, called S2, has completed a full orbit. From the orbital data, astronomers refined the mass of Sagittarius A* to 4.3×10⁶ M☉, confined within a radius of less than 0.002 light-years. This upper-limit radius is still larger than the Schwarzschild radius for the estimated mass, so the observations alone do not prove that Sagittarius A* is a black hole. Nevertheless, they strongly suggest that the central object is a supermassive black hole, as there is no other plausible scenario for confining so much invisible mass into such a small volume. Additionally, there is some observational evidence that this object might possess an event horizon, a feature unique to black holes. The Event Horizon Telescope image of Sagittarius A*, released in 2022, provided further confirmation that it is indeed a black hole.
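The mass inference rests on Kepler's third law; here is a minimal sketch. S2's orbital parameters below (semi-major axis ~970 AU, period ~16 years) are approximate published values assumed for illustration, not figures from this article.

```python
# Sketch: enclosed mass from a stellar orbit via Kepler's third law,
# M = 4 pi^2 a^3 / (G P^2). In units of AU, years and solar masses this
# reduces to M = a^3 / P^2. S2's ~970 AU semi-major axis and ~16 yr
# period are approximate literature values, assumed for illustration.
def enclosed_mass_msun(a_au, period_yr):
    return a_au**3 / period_yr**2

print(enclosed_mass_msun(970, 16.0))  # ~3.6e6 solar masses, the same order
                                      # as the 4.3e6 M_sun figure above
```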
Astronomers use the term active galaxy to describe galaxies with unusual characteristics, such as atypical spectral line emission and very strong radio emission. Theoretical and observational studies have shown that the high levels of activity in the centres of these galaxies, regions called active galactic nuclei (AGN), may be explained by accretion onto supermassive black holes. These AGN consist of a central black hole that may be millions or billions of times more massive than the Sun, a disk of interstellar gas and dust called an accretion disk, and two jets perpendicular to the accretion disk. Although supermassive black holes are expected to be found in most AGN, only the nuclei of some galaxies have been studied carefully enough to both identify and measure the actual masses of their central supermassive black hole candidates. Some of the most notable galaxies with supermassive black hole candidates include the Andromeda Galaxy, Messier 32, Messier 87, the Sombrero Galaxy, and the Milky Way itself. Another way black holes can be detected is through observation of effects caused by their strong gravitational field. One such effect is gravitational lensing: the deformation of spacetime around a massive object causes light rays to be deflected, making objects behind it appear distorted. When the lensing object is a black hole, this effect can be strong enough to create multiple images of a star or other luminous source. However, the separation between the lensed images may be too small for contemporary telescopes to resolve; this regime is called microlensing. Instead of seeing two images of a lensed star, astronomers see the star brighten slightly as the black hole moves towards the line of sight between the star and Earth, then return to its normal luminosity as the black hole moves away. The turn of the millennium saw the first three candidate detections of black holes in this way, and in January 2022, astronomers reported the first confirmed detection of a microlensing event from an isolated black hole. This was also the first determination of an isolated black hole's mass: 7.1±1.3 M☉. Alternatives While there is a strong case for supermassive black holes, the model for stellar-mass black holes assumes an upper limit on the mass of a neutron star: objects observed to have more mass are assumed to be black holes. However, the properties of extremely dense matter are poorly understood. New exotic phases of matter could allow other kinds of massive compact objects. Quark stars would be made up of quark matter and supported by quark degeneracy pressure, a form of degeneracy pressure even stronger than neutron degeneracy pressure; this would halt gravitational collapse at a higher mass than for a neutron star. Still more exotic objects called electroweak stars would convert quarks in their cores into leptons, providing additional pressure to stop the star from collapsing. If, as some extensions of the Standard Model posit, quarks and leptons are made up of even smaller fundamental particles called preons, a very compact star could be supported by preon degeneracy pressure.
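The brightening-and-fading signature described here follows the standard point-lens light curve: the magnification depends only on the lens–source separation u in units of the Einstein radius, A(u) = (u² + 2)/(u√(u² + 4)). A minimal Python sketch with an illustrative impact parameter and timescale (not values from any real event):

import math

def magnification(u):
    # Point-source, point-lens magnification at separation u (Einstein radii)
    return (u * u + 2) / (u * math.sqrt(u * u + 4))

u0, t0, tE = 0.3, 0.0, 60.0   # impact parameter, peak time, Einstein timescale (days)

for t in range(-120, 121, 30):
    u = math.sqrt(u0**2 + ((t - t0) / tE)**2)
    print(f"t = {t:+4d} d   A = {magnification(u):.2f}")
# The output rises smoothly from ~1.06 to a peak of ~3.4 at t = 0 and falls
# back symmetrically: the slight, temporary brightening astronomers look for.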
While none of these hypothetical models can explain all of the observations of stellar black hole candidates, the Q star is the only alternative that could significantly exceed the mass limit for neutron stars and thus provide an alternative for supermassive black holes.: 12 A few theoretical objects have been conjectured to match observations of astronomical black hole candidates identically or near-identically, but functioning via a different mechanism. A dark energy star would convert infalling matter into vacuum energy; this vacuum energy would be much larger than the vacuum energy of the surrounding space, exerting outward pressure and preventing a singularity from forming. A black star would be collapsing gravitationally slowly enough that quantum effects would keep it just on the cusp of fully collapsing into a black hole. A gravastar would consist of a very thin shell and a dark-energy interior providing outward pressure to stop the collapse into a black hole or the formation of a singularity; it could even contain another gravastar inside, a configuration called a 'nestar'. Open questions According to the no-hair theorem, a black hole is defined by only three parameters: its mass, charge, and angular momentum. This seems to mean that all other information about the matter that went into forming the black hole is lost, as there is no way to determine anything about the black hole from outside other than those three parameters. When black holes were thought to persist forever, this information loss was not problematic, as the information could be thought of as existing inside the black hole. However, black holes slowly evaporate by emitting Hawking radiation. This radiation does not appear to carry any additional information about the matter that formed the black hole, meaning that this information is seemingly gone forever. This is called the black hole information paradox. Theoretical studies analyzing the paradox have led to both further paradoxes and new ideas about the intersection of quantum mechanics and general relativity. While there is no consensus on the resolution of the paradox, work on the problem is expected to be important for a theory of quantum gravity.: 126 Observations of faraway galaxies have found that ultraluminous quasars, powered by supermassive black holes, existed in the early universe as far back as redshift z ≥ 7. These black holes have been assumed to be the products of the gravitational collapse of large Population III stars. However, these stellar remnants were not massive enough to produce the quasars observed at early times without accreting beyond the Eddington limit, the theoretical maximum rate of black hole accretion. Physicists have suggested a variety of mechanisms by which these supermassive black holes may have formed. Smaller black holes may have undergone mergers to produce the observed supermassive black holes. They may also have been seeded by direct-collapse black holes, in which a large cloud of hot gas avoids the fragmentation that would lead to multiple stars, due to low angular momentum or heating from a nearby galaxy; given the right circumstances, a single supermassive star forms and collapses directly into a black hole without undergoing typical stellar evolution. Additionally, these supermassive black holes in the early universe may be high-mass primordial black holes, which could have accreted further matter in the centres of galaxies.
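The tension with the Eddington limit can be made concrete: a black hole accreting at that limit grows exponentially with an e-folding (Salpeter) time t ≈ εσ_T c / (4πGm_p), about 45 million years for an assumed radiative efficiency ε = 0.1 (a conventional value; the seed and final masses below are illustrative, and the small (1 − ε) correction is ignored). A Python sketch:

import math

G, c = 6.674e-11, 2.998e8
m_p, sigma_T = 1.673e-27, 6.652e-29   # proton mass (kg), Thomson cross-section (m^2)
year = 3.156e7                        # seconds

eps = 0.1                             # assumed radiative efficiency
t_fold = eps * sigma_T * c / (4 * math.pi * G * m_p)   # Salpeter e-folding time
print(f"e-folding time: {t_fold / year / 1e6:.0f} Myr")            # ~45 Myr

n_folds = math.log(1e9 / 100)         # 100 M_sun seed -> 10^9 M_sun quasar engine
print(f"growth time:    {n_folds * t_fold / year / 1e6:.0f} Myr")  # ~730 Myr

Since the universe at z ≈ 7 was itself only on the order of 750 million years old, Eddington-limited growth from stellar-mass seeds leaves essentially no slack, which is why massive seeds and super-Eddington episodes are both taken seriously.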
Finally, certain mechanisms could allow black holes to grow faster than the theoretical Eddington limit, such as dense gas in the accretion disk suppressing the outward radiation pressure that would otherwise limit accretion; however, the formation of bipolar jets prevents super-Eddington rates. In fiction Black holes have been portrayed in science fiction in a variety of ways. Even before the advent of the term itself, objects with characteristics of black holes appeared in stories such as the 1928 novel The Skylark of Space, with its "black Sun", and the "hole in space" in the 1935 short story Starship Invincible. As black holes grew in public recognition in the 1960s and 1970s, they began to be featured in films as well as novels, such as Disney's 1979 film The Black Hole. Black holes have also been used in works of the 21st century, such as Christopher Nolan's science fiction epic Interstellar. Authors and screenwriters have exploited the relativistic effects of black holes, particularly gravitational time dilation. For example, Interstellar features a planet orbiting a black hole with a time dilation factor of over 60,000:1, while the 1977 novel Gateway depicts a spaceship approaching but never crossing the event horizon of a black hole from the perspective of an outside observer, due to time dilation effects. Black holes have also been appropriated as wormholes or other means of faster-than-light travel, as in the 1974 novel The Forever War, where a network of black holes is used for interstellar travel. Additionally, black holes can feature as hazards to spacefarers and planets: a black hole threatens a deep-space outpost in the 1978 short story The Black Hole Passes, and a binary black hole dangerously alters the orbit of a planet in the 2018 Netflix reboot of Lost in Space.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Model_204] | [TOKENS: 533] |
Model 204 Model 204 (M204) is a database management system for IBM and compatible mainframe computers developed and commercialized by Computer Corporation of America. It was announced in 1965,: 66 and first deployed in 1972. It incorporates a programming language and an environment for application development. Implemented in assembly language for IBM System/360 and its successors, M204 can deal with very large databases and transaction loads of 1,000 transactions per second.: 4 Product description Model 204 relies on its own type of bitmap index, originally devised by Bill Mann, and combines the use of hash table, B-tree, and partitioned record list technologies to optimize the speed and efficiency of database access. It has been described as "one of the three major inverted-list [database systems] ... the other two being" ADABAS and ADR's Datacom/DB. Although M204 is a pre-SQL (and pre-relational) database product, it is possible to manually map the files of an M204 database to approximate SQL equivalents and provide some limited SQL functionality using Model 204 SQL Server. Users Model 204 is commonly used in government and military applications. It was used commercially in the UK by Marks & Spencer. It was also used at the Ventura County Property Tax system in California, the Harris County, Texas, Justice Information Management System, and in the New York City Department of Education's Automate The Schools system. An informal list of past and present Model 204 users, compiled in 2010, identified more than 140 organizations worldwide. Beginning in 1986, it was used by the US Navy Fleet Intelligence Center Europe and Atlantic (FICEURLANT). Model 204 has been a central part of Australian social security for decades. Services Australia has used it for its ISIS system, which pays over $110 billion in welfare payments to around 6 million Australians. A 1.5 billion Australian dollar project to replace ISIS was expected to be completed in 2022 but was delayed. The project was scrapped in July 2023 after AUD 191 million had been spent, since the replacement system (based on Pegasystems business process automation software) took minutes to complete tasks Model 204 could complete in seconds. In a 2014 media interview, then Treasurer of Australia Joe Hockey stated that the Australian social security system and the Pentagon were the only remaining active Model 204 customers in the world. Add-on products for the Model 204 database were formerly available from Sirius Software, Inc. Sirius, located in Cambridge, Massachusetts, USA, was acquired by Rocket Software in 2012.
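The inverted-list/bitmap-index idea underlying Model 204 can be illustrated with a small sketch (conceptual only; it reflects neither M204's actual on-disk structures nor its user language): each value of an indexed field maps to a bitmap over record numbers, and a conjunctive query becomes a single bitwise AND rather than a scan of the records.

from collections import defaultdict

class BitmapIndex:
    """Toy bitmap index: one integer-as-bitset per distinct field value."""
    def __init__(self):
        self.bitmaps = defaultdict(int)

    def add(self, record_no, value):
        self.bitmaps[value] |= 1 << record_no

    def find(self, value):
        return self.bitmaps[value]

def record_numbers(bits):
    n = 0
    while bits:
        if bits & 1:
            yield n
        bits >>= 1
        n += 1

state, status = BitmapIndex(), BitmapIndex()
for rec, (st, pay) in enumerate([("CA", "paid"), ("TX", "due"), ("CA", "due")]):
    state.add(rec, st)
    status.add(rec, pay)

# Records in CA with payment due: one AND over two precomputed bitmaps.
print(list(record_numbers(state.find("CA") & status.find("due"))))   # [2]

Production systems compress such bitmaps heavily; the point is only that record selection reduces to cheap bitwise arithmetic, which is what made inverted-list systems fast on the hardware of the era.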
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/MIVA_Script] | [TOKENS: 1566] |
MIVA Script Miva Script is a proprietary computer scripting language mainly used for internet applications such as e-commerce. As of 2015, it is developed, maintained and owned by Miva Merchant, Inc., based in San Diego, California. Many web hosting companies support Miva Script on their servers, but it is significantly less widespread than other popular web languages. History The origins of Miva Script go back to 1993, when David Haldy and Joseph Austin built the first version of what became HTMLScript; the language was publicly introduced under the HTMLScript name in 1995, and a company, HTMLScript Corporation, was formed the following year. The first version, called Logic Enhanced HTML (LEHTML), was written in the programming language Perl. Joseph Austin wrote a wrapper for it in the programming language C that let it start as a root process and then immediately downgrade itself to the ownership and permissions of the owner of the script file. This wrapper made it suitable for use with his hosting service, then called Volant Turnpike. Joseph Austin eventually sold Volant Turnpike to Dave Haldy. Because Perl allowed self-executing code, LEHTML did not have its own expression analyzer; it simply parsed expressions into Perl syntax and passed them to Perl. Volant Turnpike users liked using LEHTML, so Joseph Austin and Ron Ahern wrote an expression analyzer and re-implemented the LEHTML syntax in the C language. Joseph Austin called the result HTMLScript and registered the name with the United States Patent and Trademark Office (USPTO). Joseph Austin, Troy McCasland and Derek Finley were the founders of HTMLScript Corporation. HTMLScript had no concept of a "WHILE" or "FOR" loop because of the low server processing power of the time: even one runaway program could bring a whole server down, so Joseph Austin omitted loops to make it impossible for an HTMLScript server process to run away. He also implemented a macro facility in the first version of HTMLScript to allow self-executing code; the macros were powerful but eventually proved to have security issues. In 1997, Jon Burchmore extensively rewrote the language to make it more syntactically consistent, using a syntax that Austin had developed with the help of SoftQuad around the emerging XML standard; the new engine supported both the old HTMLScript syntax and the new syntax, named Miva Script. Burchmore also wrote the replacement for KoolKat, which then became Miva Merchant. On October 14, 1997, HTMLScript's name was changed to Miva Script and the company name was changed to Miva; soon afterward, the product followed suit as Miva Merchant. The name Miva comes from the Egyptian hieroglyphs for the word cat, a combination of two symbols: a milk basin, pronounced mee, followed by a quail, pronounced waa. Joseph Austin thought this was clever, as they had called KoolKat "an electronic (cat)alog". When Austin showed it to a German friend who could not pronounce the waa sound and instead kept pronouncing it va, the name became Miva. Austin registered the domain Miva.com and filed the trademark. In 1998, the firm was renamed Miva Corporation, and the first version of Miva Merchant came out.
In 2002, the Miva Script compiler was delivered, and the HTMLScript syntax and macros were dropped from the engine. Miva Corporation was sold in 2003 to a mid-cap public company called FindWhat, which subsequently bought the Miva name. In 2007, Russell Carroll and a group of investors bought the original Miva technologies and customer base from Miva and started Miva Merchant, Inc. Language features Miva Script is often described as 'XML-like', although this is something of a misnomer. It consists of tags, all starting with <Mv, which may be interspersed with HTML and XHTML. There are both paired and stand-alone (empty) tags. Before version 4.14, Miva Script was interpreted by the Miva Script engine, Empresa. Version 4.00 introduced a compiler, boosting performance significantly. One of the distinguishing features of Miva Script is native support for a variant of the dBase (DBF III) table format with a proprietary index format, as well as support for SQL. Many installations today run against a MySQL database. Variables are untyped and are not pre-declared. Miva Script uses the file extensions .mv, .mvc, and .mvt, the last being the common extension for runtime-compiled template source files. Implementations Empresa is the underlying engine for Miva Script. In versions before 4.0, Miva Merchant Empresa is a script interpreter available for web servers running *nix and Microsoft Windows operating systems. The most recent interpreter version is 3.9705. Interpreted Miva Script is still widely supported by many web hosts. Versions numbered 3.9x are a transitional form of the language, implementing some (but not all) of the new features found in version 4, such as arrays. Since 4.0, Miva Merchant Empresa has been a virtual machine for running compiled Miva Script, again available in versions for *nix and Microsoft Windows. The current 5.x versions added new language constructs, native SQL support, a new access methodology for dBase III tables called MIVA-SQL, and a new templating syntax that the Empresa virtual machine can compile on the fly. Version 5.08 and later support the GD Graphics Library. Miva Merchant Mia is a version of the Empresa engine designed to run on a Windows PC as a localhost server watching a specified port, usually 8000 or 8080. No other server software is needed unless the Post Office Protocol (POP) and SMTP functions are used. This provides a portable, stand-alone development environment. Miva Merchant Mia is updated with each Miva Merchant Empresa release. Like Empresa, pre-4 versions are interpreters, while post-4.0 versions work only with compiled script; there are a few minor differences between the two engines. The Miva Merchant Script Compiler was introduced in mid-2002, promising better performance and allowing application source code to be kept closed. Compilability required some changes to the language: support for the old HTMLScript syntax and for macros evaluated at runtime (often considered a security risk) was dropped. The compiler produces a platform-independent bytecode which runs on the Miva Merchant Empresa and Miva Merchant Mia virtual machines. In May 2005, MIVA Corporation made the Script Compiler available free of charge. In 2011, the built-in licensing code was removed, simplifying installation. In August 2007, Miva Merchant was separated from its parent company through a management buy-out.
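To make the 'XML-like' description concrete, the following toy Python tokenizer pulls the <Mv...> tags out of a fragment of Miva Script-style markup; the sample is loosely modelled on documented tags such as MvASSIGN and MvWHILE, but is illustrative rather than an authoritative account of the language's syntax:

import re

sample = """
<MvASSIGN NAME="l.count" VALUE="{3}">
<MvWHILE EXPR="{l.count GT 0}">
  <p>Counting down...</p>
  <MvASSIGN NAME="l.count" VALUE="{l.count - 1}">
</MvWHILE>
"""

# Miva Script statements all begin with <Mv; plain HTML passes through untouched.
tag = re.compile(r"</?Mv\w+[^>]*>", re.IGNORECASE)
for m in tag.finditer(sample):
    kind = "close" if m.group().startswith("</") else "open/standalone"
    print(f"{kind:16s} {m.group()}")

As the paragraph above notes, a real engine must also know which tags are paired (like the MvWHILE block) and which stand alone (like MvASSIGN), which is one reason the 'XML-like' label is only approximate.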
Miva Script 5.0 introduced a page-template compiler command, which is the basis for the Miva Merchant Storemorph™ page template system. This compiler-within-a-compiler offers a simplified subset of the full language, more suitable for end-user creation and editing of web page templates. Storemorph™ pages allow modular components created in Miva Script to be added to a template, extending its capabilities.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/PlayStation_(console)#cite_note-FOOTNOTEAnderson199754-116] | [TOKENS: 10728] |
PlayStation (console) The PlayStation[a] (codenamed PSX, abbreviated as PS, and retroactively PS1 or PS one) is a home video game console developed and marketed by Sony Computer Entertainment. It was released in Japan on 3 December 1994, followed by North America on 9 September 1995, Europe on 29 September 1995, and other regions thereafter. As a fifth-generation console, the PlayStation primarily competed with the Nintendo 64 and the Sega Saturn. Sony began developing the PlayStation after a failed venture with Nintendo to create a CD-ROM peripheral for the Super Nintendo Entertainment System in the early 1990s. The console was primarily designed by Ken Kutaragi and Sony Computer Entertainment in Japan, while additional development was outsourced to the United Kingdom. 3D polygon graphics were placed at the forefront of the console's design. PlayStation game production was designed to be streamlined and inclusive, enticing the support of many third-party developers. The console proved popular for its extensive game library, popular franchises, low retail price, and aggressive youth marketing which advertised it as the preferable console for adolescents and adults. Critically acclaimed games that defined the console include Gran Turismo, Crash Bandicoot, Spyro the Dragon, Tomb Raider, Resident Evil, Metal Gear Solid, Tekken 3, and Final Fantasy VII. Sony ceased production of the PlayStation on 23 March 2006, over eleven years after it had been released and in the same year that the PlayStation 3 debuted. More than 4,000 PlayStation games were released, with cumulative sales of 962 million units. The PlayStation signaled Sony's rise to power in the video game industry. It received acclaim and sold strongly; in less than a decade, it became the first computer entertainment platform to ship over 100 million units. Its use of compact discs heralded the game industry's transition from cartridges. The PlayStation's success led to a line of successors, beginning with the PlayStation 2 in 2000. In the same year, Sony released a smaller and cheaper model, the PS one. History The PlayStation was conceived by Ken Kutaragi, a Sony executive who managed a hardware engineering division and was later dubbed "the Father of the PlayStation". Kutaragi's interest in working with video games stemmed from seeing his daughter play games on Nintendo's Famicom. Kutaragi convinced Nintendo to use his SPC-700 sound processor in the Super Nintendo Entertainment System (SNES) through a demonstration of the processor's capabilities. His willingness to work with Nintendo derived from both his admiration of the Famicom and his conviction that video game consoles would become the main home entertainment systems. Although Kutaragi was nearly fired because he had worked with Nintendo without Sony's knowledge, president Norio Ohga recognised the potential in Kutaragi's chip and decided to keep him as a protégé. The inception of the PlayStation dates back to a 1988 joint venture between Nintendo and Sony. Nintendo had produced floppy disk technology to complement cartridges in the form of the Family Computer Disk System, and wanted to continue this complementary storage strategy for the SNES. Since Sony was already contracted to produce the SPC-700 sound processor for the SNES, Nintendo contracted Sony to develop a CD-ROM add-on, tentatively titled the "Play Station" or "SNES-CD".
The PlayStation name had already been trademarked by Yamaha, but Nobuyuki Idei liked it so much that he agreed to acquire it for an undisclosed sum rather than search for an alternative. Sony was keen to obtain a foothold in the rapidly expanding video game market. Having been the primary manufacturer of the MSX home computer format, Sony wanted to use its experience in consumer electronics to produce its own video game hardware. Although the initial agreement between Nintendo and Sony was about producing a CD-ROM drive add-on, Sony had also planned to develop a SNES-compatible Sony-branded console. This iteration was intended to be more of a home entertainment system, playing both SNES cartridges and a new CD format named the "Super Disc", which Sony would design. Under the agreement, Sony would retain sole international rights to every Super Disc game, giving it a large degree of control despite Nintendo's leading position in the video game market. Furthermore, Sony would also be the sole beneficiary of licensing related to music and film software, which it had been aggressively pursuing as a secondary application. The Play Station was to be announced at the 1991 Consumer Electronics Show (CES) in Las Vegas. However, Nintendo president Hiroshi Yamauchi was wary of Sony's increasing leverage at this point and deemed the original 1988 contract unacceptable upon realising it essentially handed Sony control over all games written on the SNES CD-ROM format. Although Nintendo was dominant in the video game market, Sony possessed a superior research and development department. Wanting to protect Nintendo's existing licensing structure, Yamauchi cancelled all plans for the joint Nintendo–Sony SNES CD attachment without telling Sony. He sent Nintendo of America president Minoru Arakawa (his son-in-law) and chairman Howard Lincoln to Amsterdam to form a more favourable contract with the Dutch conglomerate Philips, Sony's rival. This contract would give Nintendo total control over its licences on all Philips-produced machines. Kutaragi and Nobuyuki Idei, Sony's director of public relations at the time, learned of Nintendo's actions two days before the CES was due to begin. Kutaragi telephoned numerous contacts, including Philips, to no avail. On the first day of the CES, Sony announced its partnership with Nintendo and their new console, the Play Station. At 9 am the next day, in what has been called "the greatest ever betrayal" in the industry, Howard Lincoln stepped onto the stage and revealed that Nintendo was now allied with Philips and would abandon its work with Sony. Incensed by Nintendo's reversal, Ohga and Kutaragi decided that Sony would develop its own console. Nintendo's contract-breaking was met with consternation in the Japanese business community, as Nintendo had broken an "unwritten law" that native companies should not turn against each other in favour of foreign ones. Sony's American branch considered allying with Sega to produce a CD-ROM-based machine called the Sega Multimedia Entertainment System, but the Sega board of directors in Tokyo vetoed the idea when Sega of America CEO Tom Kalinske presented them with the proposal. Kalinske recalled them saying: "That's a stupid idea, Sony doesn't know how to make hardware. They don't know how to make software either. Why would we want to do this?" Sony halted its research, but decided to develop what it had begun with Nintendo and Sega into a console based on the SNES.
Despite the tumultuous events at the 1991 CES, negotiations between Nintendo and Sony were still ongoing. A deal was proposed: the Play Station would still have a port for SNES games, on the condition that it would still use Kutaragi's audio chip and that Nintendo would own the rights and receive the bulk of the profits. Roughly two hundred prototype machines were created, and some software entered development. Many within Sony were still opposed to involvement in the video game industry, with some resenting Kutaragi for jeopardising the company. Kutaragi remained adamant that Sony not retreat from the growing industry and that a deal with Nintendo would never work. Knowing that it had to take decisive action, Sony severed all ties with Nintendo on 4 May 1992. To determine the fate of the PlayStation project, Ohga chaired a meeting in June 1992, consisting of Kutaragi and several senior Sony board members. Kutaragi unveiled a proprietary CD-ROM-based system he had been secretly working on, which played games with immersive 3D graphics. Kutaragi was confident that his LSI chip could accommodate one million logic gates, which exceeded the capabilities of Sony's semiconductor division at the time. Although the proposal won Ohga's enthusiasm, a majority of those present at the meeting remained opposed, among them older Sony executives who saw Nintendo and Sega as "toy" manufacturers. The opponents felt the game industry was too culturally offbeat and asserted that Sony should remain a central player in the audiovisual industry, where companies were familiar with one another and could conduct "civili[s]ed" business negotiations. After Kutaragi reminded Ohga of the humiliation he had suffered from Nintendo, Ohga retained the project and became one of Kutaragi's staunchest supporters. Ohga shifted Kutaragi and nine of his team from Sony's main headquarters to Sony Music Entertainment Japan (SMEJ), a subsidiary of the main Sony group, so as to retain the project and maintain relationships with Philips for the MMCD development project. The involvement of SMEJ proved crucial to the PlayStation's early development, as the process of manufacturing games on CD-ROM was similar to that used for audio CDs, with which Sony's music division had considerable experience. While at SMEJ, Kutaragi worked with Epic/Sony Records founder Shigeo Maruyama and Akira Sato; both later became vice-presidents of the division that ran the PlayStation business. Sony Computer Entertainment (SCE) was jointly established by Sony and SMEJ to handle the company's ventures into the video game industry. On 27 October 1993, Sony publicly announced that it was entering the game console market with the PlayStation. According to Maruyama, there was uncertainty over whether the console should primarily focus on 2D, sprite-based graphics or 3D polygon graphics. After Sony witnessed the success of Sega's Virtua Fighter (1993) in Japanese arcades, the direction of the PlayStation became "instantly clear" and 3D polygon graphics became the console's primary focus. SCE president Teruhisa Tokunaka expressed gratitude for Sega's timely release of Virtua Fighter, as it proved "just at the right time" that making games with 3D imagery was possible. Maruyama claimed that Sony further wanted to emphasise the new console's ability to utilise Red Book audio from the CD-ROM format in its games, alongside high-quality visuals and gameplay.
Wishing to distance the project from the failed enterprise with Nintendo, Sony initially branded the PlayStation the "PlayStation X" (PSX). Sony formed its European and North American divisions, known as Sony Computer Entertainment Europe (SCEE) and Sony Computer Entertainment America (SCEA), in January and May 1995. The divisions planned to market the new console under the alternative branding "PSX" following negative feedback regarding "PlayStation" in focus group studies. Early advertising prior to the console's launch in North America referenced PSX, but the term was scrapped before launch. In contrast with Nintendo's consoles, the console was not marketed under the Sony name. According to Phil Harrison, much of Sony's upper management feared that the Sony brand would be tarnished if associated with the console, which they considered a "toy". Since Sony had no experience in game development, it had to rely on the support of third-party game developers. This was in contrast to Sega and Nintendo, which had versatile and well-equipped in-house software divisions for their arcade games and could easily port successful games to their home consoles. Recent consoles like the Atari Jaguar and 3DO had suffered low sales due to a lack of developer support, prompting Sony to redouble its efforts to gain the endorsement of arcade-savvy developers. A team from Epic Sony visited more than a hundred companies throughout Japan in May 1993 in hopes of attracting game creators with the PlayStation's technological appeal. Sony found that many disliked Nintendo's practices, such as favouring its own games over others. Through a series of negotiations, Sony acquired initial support from Namco, Konami, and Williams Entertainment, as well as 250 other development teams in Japan alone. Namco in particular was interested in developing for the PlayStation, since it rivalled Sega in the arcade market. Signing these companies secured influential games such as Ridge Racer (1993) and Mortal Kombat 3 (1995); Ridge Racer was one of the most popular arcade games at the time, and by December 1993 it had already been confirmed behind closed doors as the PlayStation's first game, despite Namco being a longstanding Nintendo developer. Namco's research managing director Shigeichi Nakamura met with Kutaragi in 1993 to discuss the preliminary PlayStation specifications, with Namco subsequently basing the Namco System 11 arcade board on PlayStation hardware and developing Tekken to compete with Virtua Fighter. The System 11 launched in arcades several months before the PlayStation's release, with the arcade release of Tekken in September 1994. Despite securing the support of various Japanese studios, Sony had no developers of its own while the PlayStation was in development. This changed in 1993 when Sony acquired the Liverpudlian company Psygnosis (later renamed SCE Liverpool) for US$48 million, securing its first in-house development team. The acquisition meant that Sony could have more launch games ready for the PlayStation's release in Europe and North America. Ian Hetherington, Psygnosis' co-founder, was disappointed after receiving early builds of the PlayStation and recalled that the console "was not fit for purpose" until his team got involved with it. Hetherington frequently clashed with Sony executives over broader ideas; at one point it was suggested that a television with a built-in PlayStation be produced.
In the months leading up to the PlayStation's launch, Psygnosis had around 500 full-time staff working on games and assisting with software development. The purchase of Psygnosis marked another turning point for the PlayStation, as it played a vital role in creating the console's development kits. While Sony had provided MIPS R4000-based Sony NEWS workstations for PlayStation development, Psygnosis employees disliked the thought of developing on these expensive workstations and asked Bristol-based SN Systems to create an alternative PC-based development system. Andy Beveridge and Martin Day, owners of SN Systems, had previously supplied development hardware for other consoles such as the Mega Drive, Atari ST, and the SNES. When Psygnosis arranged an audience for SN Systems with Sony's Japanese executives at the January 1994 CES in Las Vegas, Beveridge and Day presented their prototype of the condensed development kit, which could run on an ordinary personal computer with two extension boards. Impressed, Sony decided to abandon its plans for a workstation-based development system in favour of SN Systems', thus securing a cheaper and more efficient method for designing software. An order of over 600 systems followed, and SN Systems supplied Sony with additional software such as an assembler, a linker, and a debugger. SN Systems produced development kits for future PlayStation systems, including the PlayStation 2, and was bought by Sony in 2005. Sony strove to make game production as streamlined and inclusive as possible, in contrast to the relatively isolated approach of Sega and Nintendo. Phil Harrison, representative director of SCEE, believed that Sony's emphasis on developer assistance reduced the most time-consuming aspects of development. As well as providing programming libraries, SCE headquarters in London, California, and Tokyo housed technical support teams that could work closely with third-party developers if needed. Unlike Nintendo, Sony did not favour its own products over those of other developers; Peter Molyneux of Bullfrog Productions admired Sony's open-handed approach to software developers and lauded its decision to use PCs as a development platform, remarking that "[it was] like being released from jail in terms of the freedom you have". Another strategy that helped attract software developers was the PlayStation's use of the CD-ROM format instead of traditional cartridges. Nintendo cartridges were expensive to manufacture, and the company controlled all production, prioritising its own games, while inexpensive compact disc manufacturing occurred at dozens of locations around the world. The PlayStation's architecture and its interoperability with PCs were beneficial to many software developers. The use of the programming language C proved useful, as it safeguarded future compatibility of the machine should further hardware revisions be made. Despite the inherent flexibility, some developers found themselves restricted by the console's lack of RAM. While working on beta builds of the PlayStation, Molyneux observed that its MIPS processor was not "quite as bullish" as that of a fast PC, and said that it took his team two weeks to port their PC code to the PlayStation development kits and another fortnight to achieve a four-fold speed increase. An engineer from Ocean Software, one of Europe's largest game developers at the time, thought that allocating RAM was a challenging aspect given the 3.5 megabyte restriction.
Kutaragi said that while it would have been easy to double the amount of RAM for the PlayStation, the development team refrained from doing so to keep the retail cost down. Kutaragi saw the biggest challenge in developing the system as balancing the conflicting goals of high performance, low cost, and ease of programming, and felt he and his team were successful in this regard. The console's technical specifications were finalised in 1993 and its design during 1994. The PlayStation name and its final design were confirmed during a press conference on 10 May 1994, although the price and release dates had not yet been disclosed. Sony released the PlayStation in Japan on 3 December 1994, a week after the release of the Sega Saturn, at a price of ¥39,800. Sales in Japan began with "stunning" success, with long queues in shops. Ohga later recalled that he realised how important the PlayStation had become for Sony when friends and relatives begged for consoles for their children. The PlayStation sold 100,000 units on the first day and two million units within six months, although the Saturn outsold the PlayStation in the first few weeks due to the success of Virtua Fighter. By the end of 1994, 300,000 PlayStation units had been sold in Japan, compared to 500,000 Saturn units. A grey market emerged for PlayStations shipped from Japan to North America and Europe, with buyers of such consoles paying up to £700. As one retail account later put it: "When September 1995 arrived and Sony's Playstation roared out of the gate, things immediately felt different than [sic] they did with the Saturn launch earlier that year. Sega dropped the Saturn $100 to match the Playstation's $299 debut price, but sales weren't even close—Playstations flew out the door as fast as we could get them in stock." Before the release in North America, Sega and Sony presented their consoles at the first Electronic Entertainment Expo (E3) in Los Angeles on 11 May 1995. At their keynote presentation, Sega of America CEO Tom Kalinske revealed that the Saturn would be released immediately to select retailers at a price of $399. Next came Sony's turn: Olaf Olafsson, the head of SCEA, summoned Steve Race, the head of development, to the conference stage; Race said "$299" and left the stage to a round of applause. Attention to the Sony conference was further bolstered by the surprise appearance of Michael Jackson and the showcase of highly anticipated games, including Wipeout (1995), Ridge Racer and Tekken (1994). In addition, Sony announced that no games would be bundled with the console. Although the Saturn had been released early in the United States to gain an advantage over the PlayStation, the surprise launch upset many retailers who were not informed in time, harming sales. Some retailers such as KB Toys responded by dropping the Saturn entirely. The PlayStation went on sale in North America on 9 September 1995. It sold more units within two days than the Saturn had in five months, with almost all of the initial shipment of 100,000 units sold in advance and shops across the country running out of consoles and accessories. The well-received Ridge Racer, which some critics considered superior to Sega's arcade counterpart Daytona USA (1994), contributed to the PlayStation's early success, as did Battle Arena Toshinden (1995). There were over 100,000 pre-orders placed and 17 games available on the market by the time of the PlayStation's American launch, compared with the Saturn's six launch games.
The PlayStation was released in Europe on 29 September 1995 and in Australia on 15 November 1995. By November it had already outsold the Saturn by three to one in the United Kingdom, where Sony had allocated a £20 million marketing budget for the Christmas season compared to Sega's £4 million. Sony found early success in the United Kingdom by securing listings with independent shop owners as well as prominent High Street chains such as Comet and Argos. Within its first year, the PlayStation secured over 20% of the entire American video game market. From September to the end of 1995, sales in the United States amounted to 800,000 units, giving the PlayStation a commanding lead over the other fifth-generation consoles,[b] though the SNES and Mega Drive from the fourth generation still outsold it. Sony reported that the attach rate of sold games to consoles was four to one. To meet increasing demand, Sony chartered jumbo jets and ramped up production in Europe and North America. By early 1996, the PlayStation had grossed $2 billion (equivalent to $4.106 billion in 2025) from worldwide hardware and software sales. By late 1996, sales in Europe totalled 2.2 million units, including 700,000 in the UK. Approximately 400 PlayStation games were in development, compared to around 200 games being developed for the Saturn and 60 for the Nintendo 64. In India, the PlayStation was launched as a test market during 1999–2000 through Sony showrooms, selling 100 units. Sony finally launched the console (in its PS One model) countrywide on 24 January 2002 at a price of Rs 7,990, with 26 games available from the start. The PlayStation also did well in markets where it was never officially released. In Brazil, the registration of the trademark by a third company meant the console could not be released there, and the officially distributed Sega Saturn initially took over the market; but as the Sega console withdrew, PlayStation imports and large-scale piracy increased. In China, the most popular 32-bit console was the Sega Saturn, but after it left the market the PlayStation's user base grew to around 300,000 by January 2000, even though Sony China had no plans to release it. The PlayStation was backed by a successful marketing campaign, allowing Sony to gain an early foothold in Europe and North America. Initially, PlayStation demographics were skewed towards adults, but the audience broadened after the first price drop. While the Saturn was positioned towards 18- to 34-year-olds, the PlayStation was initially marketed exclusively towards teenagers. Executives from both Sony and Sega reasoned that because younger players typically looked up to older, more experienced players, advertising targeted at teens and adults would draw them in too. Additionally, Sony found that adults reacted best to advertising aimed at teenagers; Lee Clow surmised that people who started to grow into adulthood regressed and became "17 again" when they played video games. The console was marketed with advertising slogans stylised as "LIVE IN YUR WRLD. PLY IN URS" (Live in Your World. Play in Ours.) and "U R NOT E" (with a red E), in which geometric shapes stood in for missing letters. The four shapes were derived from the symbols for the four buttons on the controller. Clow thought that by invoking such provocative statements, gamers would respond to the contrary and say, "Bullshit.
Let me show you how ready I am." As the console's appeal grew, Sony's marketing efforts broadened from their earlier focus on mature players to specifically target younger children as well. Shortly after the PlayStation's release in Europe, Sony tasked marketing manager Geoff Glendenning with assessing the desires of a new target audience. Sceptical of Nintendo and Sega's reliance on television campaigns, Glendenning theorised that young adults transitioning from fourth-generation consoles would feel neglected by marketing directed at children and teenagers. Recognising the influence that early-1990s underground clubbing and rave culture had on young people, especially in the United Kingdom, Glendenning felt that the culture had become mainstream enough to help cultivate the PlayStation's emerging identity. Sony partnered with prominent nightclubs such as Ministry of Sound and with festival promoters to organise dedicated PlayStation areas where select games could be demonstrated. The Sheffield-based graphic design studio The Designers Republic was contracted by Sony to produce promotional materials aimed at a fashionable, club-going audience. Psygnosis' Wipeout in particular became associated with nightclub culture, as it was widely featured in venues. By 1997, there were 52 nightclubs in the United Kingdom with dedicated PlayStation rooms. Glendenning recalled that he had discreetly used at least £100,000 a year in slush-fund money on impromptu marketing. In 1996, Sony expanded its CD production facilities in the United States due to the high demand for PlayStation games, increasing monthly output from 4 million discs to 6.5 million discs. This was necessary because PlayStation sales were running at twice the rate of Saturn sales, and its lead dramatically increased when both consoles dropped in price to $199 that year. The PlayStation outsold the Saturn at a similar ratio in Europe during 1996. Sales figures for PlayStation hardware and software only increased following the launch of the Nintendo 64. Tokunaka speculated that the Nintendo 64 launch had actually helped PlayStation sales by raising public awareness of the gaming market through Nintendo's added marketing efforts. Despite this, the PlayStation took longer to achieve dominance in Japan. Tokunaka said that, even after the PlayStation and Saturn had been on the market for nearly two years, the competition between them was still "very close", and neither console had led in sales for any meaningful length of time. By 1998, Sega, spurred by its declining market share and significant financial losses, launched the Dreamcast as a last-ditch attempt to stay in the industry. Although its launch was successful, the technically superior 128-bit console was unable to subdue Sony's dominance in the industry. Sony still held 60% of the overall video game market share in North America at the end of 1999. Sega's initial confidence in its new console was undermined when Japanese sales were lower than expected, with disgruntled Japanese consumers reportedly returning their Dreamcasts in exchange for PlayStation software. On 2 March 1999, Sony officially revealed details of the PlayStation 2, which Kutaragi announced would feature a graphics processor designed to push more raw polygons than any console in history, rivalling most supercomputers.
The PlayStation continued to sell strongly at the turn of the new millennium: in July 2000, Sony released the PS one, a smaller, redesigned variant which went on to outsell all other consoles that year, including the PlayStation 2. In 2005, the PlayStation became the first console to ship 100 million units, a milestone the PlayStation 2 later reached even faster. The combined successes of both PlayStation consoles led Sega to retire the Dreamcast in 2001 and abandon the console business entirely. The PlayStation was eventually discontinued on 23 March 2006, over eleven years after its release and less than a year before the debut of the PlayStation 3. Hardware The main microprocessor is an R3000 CPU made by LSI Logic, operating at a clock rate of 33.8688 MHz and delivering about 30 MIPS. This 32-bit CPU relies heavily on the "cop2" 3D and matrix-math coprocessor on the same die to provide the speed necessary to render complex 3D graphics. The role of the separate GPU chip is to draw 2D polygons and apply shading and textures to them: the rasterisation stage of the graphics pipeline. Sony's custom 16-bit sound chip supports ADPCM sources with up to 24 sound channels, and offers a sampling rate of up to 44.1 kHz and music sequencing. The console features 2 MB of main RAM, with an additional 1 MB of video RAM. The PlayStation has a maximum colour depth of 16.7 million true colours, with 32 levels of transparency and unlimited colour look-up tables. The PlayStation can output composite, S-Video or RGB video signals through its AV Multi connector (with older models also having RCA connectors for composite), displaying resolutions from 256×224 to 640×480 pixels. Different games can use different resolutions. Earlier models also had proprietary parallel and serial ports that could be used to connect accessories or multiple consoles together; these were later removed due to lack of use. The PlayStation uses a proprietary video compression unit, the MDEC, which is integrated into the CPU and allows for the presentation of full-motion video at a higher quality than other consoles of its generation. Unusually for the time, the PlayStation lacks a dedicated 2D graphics processor; 2D elements are instead calculated as polygons by the Geometry Transfer Engine (GTE) so that they can be processed and displayed on screen by the GPU. The GPU can generate a total of 4,000 sprites and 180,000 texture-mapped polygons per second, in addition to 360,000 flat-shaded polygons per second. The PlayStation went through a number of variants during its production run. Externally, the most notable change was the gradual reduction in the number of external connectors on the rear of the unit. This started with the original Japanese launch units: the SCPH-1000, released on 3 December 1994, was the only model with an S-Video port, which was removed from the next model onward. Subsequent models saw a reduction in the number of parallel ports, with the final version retaining only one serial port. Sony marketed a development kit for amateur developers known as the Net Yaroze (meaning "Let's do it together" in Japanese). It was launched in June 1996 in Japan and, following public interest, was released the next year in other countries. The Net Yaroze allowed hobbyists to create their own games and upload them via an online forum run by Sony. The console was available only through an ordering service, and came with the documentation and software needed to program PlayStation games and applications in C.
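The throughput figures above translate into per-frame budgets, which is how such specifications were usually reasoned about at the time; the frame rates below are typical assumed values (NTSC games commonly targeted 30 or 60 fps), not data from this article:

textured_per_s = 180_000
flat_shaded_per_s = 360_000

for fps in (30, 60):
    print(f"{fps} fps: {textured_per_s // fps:5d} textured, "
          f"{flat_shaded_per_s // fps:6d} flat-shaded polygons per frame")
# 30 fps:  6000 textured,  12000 flat-shaded polygons per frame
# 60 fps:  3000 textured,   6000 flat-shaded polygons per frame

Budgets of a few thousand polygons per frame help explain the era's heavy reliance on the GTE coprocessor and on low-polygon, selectively shaded models.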
On 7 July 2000, Sony released the PS One (stylised as "PS one" or "PSone"), a smaller, redesigned version of the original PlayStation. It was the highest-selling console through the end of the year, outselling all other consoles, including the PlayStation 2. In 2002, Sony released a 5-inch (130 mm) LCD screen add-on for the PS One, referred to as the "Combo pack", which also included a car cigarette-lighter adaptor for an extra layer of portability. Production of the LCD "Combo pack" ceased in 2004, when the popularity of the PlayStation began to wane in markets outside Japan. A total of 28.15 million PS One units had been sold by the time it was discontinued in March 2006. Three iterations of the PlayStation's controller were released over the console's lifespan. The first, the PlayStation controller, was released alongside the PlayStation in December 1994. It features four individual directional buttons (as opposed to a conventional D-pad), a pair of shoulder buttons on each side, Start and Select buttons in the centre, and four face buttons consisting of simple geometric shapes: a green triangle, red circle, blue cross, and pink square. Rather than labelling its buttons with the traditional letters or numbers, the PlayStation controller established a set of symbols that would be incorporated heavily into the PlayStation brand. Teiyu Goto, the designer of the original PlayStation controller, said that the circle and cross represent "yes" and "no", respectively (though this mapping is reversed in Western versions); the triangle symbolises a point of view, and the square is equated to a sheet of paper, to be used to access menus. The European and North American models of the original PlayStation controller are roughly 10% larger than the Japanese variant, to account for the fact that the average person in those regions has larger hands than the average Japanese person. Sony's first analogue gamepad, the PlayStation Analog Joystick (often erroneously referred to as the "Sony Flightstick"), was first released in Japan in April 1996. Featuring two parallel joysticks, it uses potentiometer technology previously used on consoles such as the Vectrex: instead of relying on binary eight-way switches, the controller detects minute angular changes through the entire range of motion. It also features a thumb-operated digital hat switch on the right joystick, corresponding to the traditional D-pad and used when simple digital movements were necessary. The Analog Joystick sold poorly in Japan due to its high cost and cumbersome size. The increasing popularity of 3D games prompted Sony to add analogue sticks to its controller design to give users more freedom of movement in virtual 3D environments. The first official analogue controller, the Dual Analog Controller, was revealed to the public in a small glass booth at the 1996 PlayStation Expo in Japan, and released in April 1997 to coincide with the Japanese releases of the analogue-capable games Tobal 2 and Bushido Blade. In addition to the two analogue sticks, which could be clicked in to act as two new buttons, the Dual Analog Controller features an "Analog" button and LED beneath the Start and Select buttons that toggles analogue functionality on or off. The controller also features rumble support, though Sony decided that haptic feedback would be removed from all overseas iterations before the United States release.
A Sony spokesman stated that the feature was removed for "manufacturing reasons", although rumours circulated that Nintendo had attempted to legally block the release of the controller outside Japan due to similarities with the Nintendo 64 controller's Rumble Pak. However, a Nintendo spokesman denied that Nintendo had taken legal action. Next Generation's Chris Charla theorised that Sony dropped vibration feedback to keep the price of the controller down. In November 1997, Sony introduced the DualShock controller, whose name derives from its use of two (dual) vibration motors (shock). Unlike its predecessor, the DualShock has analogue sticks with textured rubber grips, longer handles, and slightly different shoulder buttons, and includes rumble feedback as standard on all versions. The DualShock later replaced its predecessors as the default controller. Sony released a series of peripherals to add extra layers of functionality to the PlayStation. Such peripherals include memory cards, the PlayStation Mouse, the PlayStation Link Cable, the Multiplayer Adapter (a four-player multitap), the Memory Drive (a disk drive for 3.5-inch floppy disks), the GunCon (a light gun), and the Glasstron (a monoscopic head-mounted display). Released exclusively in Japan, the PocketStation is a memory card peripheral which acts as a miniature personal digital assistant. The device features a monochrome liquid crystal display (LCD), infrared communication capability, a real-time clock, built-in flash memory, and sound capability. Sharing similarities with the Dreamcast's VMU peripheral, the PocketStation was typically distributed with certain PlayStation games, enhancing them with added features. The PocketStation proved popular in Japan, selling over five million units. Sony planned to release the peripheral outside Japan, but the release was cancelled despite receiving promotion in Europe and North America. In addition to playing games, most PlayStation models are equipped to play CD Audio, and the Asian model SCPH-5903 can also play Video CDs. Like most CD players, the PlayStation can play songs in a programmed order, shuffle the playback order of the disc, and repeat one song or the entire disc. Later PlayStation models include a music visualisation function called SoundScope. This function, as well as a memory card manager, is accessed by starting the console without inserting a game or with the CD tray open, which brings up a graphical user interface (GUI) for the PlayStation BIOS. The GUI for the PlayStation and PS One differs depending on the firmware version: the original PlayStation GUI had a dark blue background with rainbow graffiti used as buttons, while the early PAL PlayStation and PS One GUI had a grey blocked background with two icons in the middle. PlayStation emulation is versatile and can be run on numerous modern devices. Bleem! was a commercial emulator released for IBM-compatible PCs and the Dreamcast in 1999. It was notable for being aggressively marketed during the PlayStation's lifetime, and was the centre of multiple controversial lawsuits filed by Sony. Bleem! was programmed in assembly language, which allowed it to emulate PlayStation games with improved visual fidelity, enhanced resolutions, and filtered textures that were not possible on original hardware. Sony sued Bleem! two days after its release, citing copyright infringement and accusing the company of engaging in unfair competition and patent infringement by allowing use of PlayStation BIOSes on a Sega console. Bleem!
Sony was aware that using CDs for game distribution could leave games vulnerable to piracy, given the growing popularity of CD-R media and optical disc drives with burning capability. To preclude illegal copying, a proprietary process for PlayStation disc manufacturing was developed that, in conjunction with an augmented optical drive in the Tiger H/E assembly, prevented burned copies of games from booting on an unmodified console. Specifically, all genuine PlayStation discs were pressed with a small section of deliberately irregular data, which the PlayStation's optical pick-up was capable of detecting and decoding. Consoles would not boot game discs without a specific wobble frequency contained in the data of the disc's pregap sector (the same system was also used to encode discs' regional lockouts). This signal was within Red Book CD tolerances, so a PlayStation disc's actual content could still be read by a conventional disc drive; however, a conventional drive could not detect the wobble frequency, and duplicated discs therefore omitted it, since the laser pick-up system of any optical disc drive interprets this wobble as an oscillation of the disc surface and compensates for it in the reading process.
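As a rough illustration of the boot check just described, the following hypothetical Python sketch models the decision logic only; it is not Sony's firmware, and the region strings ("SCEI", "SCEA", "SCEE") are a commonly cited detail of the wobble encoding assumed here for illustration rather than taken from this article.

```python
# Illustrative sketch only: models the logic of the PlayStation's disc
# authentication as described above, not Sony's actual firmware.
# The region strings below are an assumed, commonly documented detail.
from typing import Optional

REGION_CODES = {"SCEI": "Japan/Asia", "SCEA": "North America", "SCEE": "Europe/PAL"}

def read_wobble_signature(disc_pregap: dict) -> Optional[str]:
    """Return the code modulated into the pregap wobble, if present.

    A pressed disc carries a deliberate wobble in its groove; a CD-R burner
    cannot reproduce it, and an ordinary drive's tracking servo cancels it
    out as if it were disc wobble, so burned copies lack it entirely."""
    return disc_pregap.get("wobble_code")  # None for a burned copy

def console_boots(disc_pregap: dict, console_region: str) -> bool:
    code = read_wobble_signature(disc_pregap)
    if code is None:
        return False  # burned copy: no wobble signature at all
    return REGION_CODES.get(code) == console_region  # regional lockout

# A pressed PAL disc boots on a PAL console; a bit-perfect burn does not.
pressed = {"wobble_code": "SCEE", "data": "..."}
burned = {"data": "..."}  # identical user data, wobble not reproduced
assert console_boots(pressed, "Europe/PAL")
assert not console_boots(burned, "Europe/PAL")
```

The point the sketch captures is that the authentication datum lives in the physical groove geometry rather than in the user data, which is why a bit-perfect CD-R copy of a disc's contents still fails the check.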
Early PlayStations, particularly early 1000 models, can exhibit skipping full-motion video or physical "ticking" noises from the unit. The problems stem from poorly placed vents leading to overheating in some environments, causing the plastic mouldings inside the console to warp slightly and create knock-on effects with the laser assembly. The solution is to sit the console on a surface which dissipates heat efficiently, in a well-ventilated area, or to raise the unit slightly from its resting surface. Sony representatives also recommended unplugging the PlayStation when it is not in use, as the system draws a small amount of power (and therefore generates heat) even when turned off. The first batch of PlayStations use a KSM-440AAM laser unit, whose case and movable parts are all built out of plastic. Over time, the plastic lens sled rail wears out, usually unevenly, due to friction. The placement of the laser unit close to the power supply accelerates wear, as the additional heat makes the plastic more vulnerable to friction. Eventually, one side of the lens sled becomes so worn that the laser can tilt, no longer pointing directly at the CD; after this, games will no longer load due to data read errors. Sony fixed the problem by making the sled out of die-cast metal and placing the laser unit further away from the power supply on later PlayStation models. Due to an engineering oversight, the PlayStation does not produce a proper signal on several older models of televisions, causing the display to flicker or bounce around the screen. Sony decided not to change the console design, since only a small percentage of PlayStation owners used such televisions, and instead gave consumers the option of sending their PlayStation unit to a Sony service centre to have an official modchip installed, allowing play on older televisions. Game library The PlayStation featured a diverse game library which grew to appeal to all types of players. Critically acclaimed PlayStation games included Final Fantasy VII (1997), Crash Bandicoot (1996), Spyro the Dragon (1998), and Metal Gear Solid (1998), all of which became established franchises. Final Fantasy VII is credited with allowing role-playing games to gain mass-market appeal outside Japan, and is considered one of the most influential and greatest video games ever made. The PlayStation's bestselling game is Gran Turismo (1997), which sold 10.85 million units. After the PlayStation's discontinuation in 2006, the cumulative software shipment was 962 million units. Following its 1994 launch in Japan, early games included Ridge Racer, Crime Crackers, King's Field, Motor Toon Grand Prix, Toh Shin Den (released in the West as Battle Arena Toshinden), and Kileak: The Blood. The first two games available at its later North American launch were Jumping Flash! (1995) and Ridge Racer, with Jumping Flash! heralded as a forerunner of 3D graphics in console gaming. Wipeout, Air Combat, Twisted Metal, Warhawk and Destruction Derby were among the popular first-year games, and the first to be reissued as part of Sony's Greatest Hits or Platinum range. At the time of the PlayStation's first Christmas season, Psygnosis had produced around 70% of its launch catalogue; their breakthrough racing game Wipeout was acclaimed for its techno soundtrack and helped raise awareness of Britain's underground music community. Eidos Interactive's action-adventure game Tomb Raider contributed substantially to the success of the console in 1996, with its main protagonist Lara Croft becoming an early gaming icon and garnering unprecedented media promotion. Licensed tie-in video games of popular films were also prevalent; Argonaut Games' 2001 adaptation of Harry Potter and the Philosopher's Stone went on to sell over eight million copies late in the console's lifespan. Third-party developers remained largely committed to the console's wide-ranging game catalogue even after the launch of the PlayStation 2; notable exclusives from this era include Harry Potter and the Philosopher's Stone, Fear Effect 2: Retro Helix, Syphon Filter 3, C-12: Final Resistance, Dance Dance Revolution Konamix and Digimon World 3.[c] Sony assisted with game reprints as late as 2008 with Metal Gear Solid: The Essential Collection, the last PlayStation game officially released and licensed by Sony. Initially, in the United States, PlayStation games were packaged in long cardboard boxes, similar to non-Japanese 3DO and Saturn games. Sony later switched to the jewel case format typically used for audio CDs and Japanese video games, as this format took up less retailer shelf space (which was at a premium due to the large number of PlayStation games being released), and focus testing showed that most consumers preferred this format. Reception The PlayStation was mostly well received upon release. Critics in the West generally welcomed the new console; the staff of Next Generation reviewed the PlayStation a few weeks after its North American launch, commenting that, while the CPU is "fairly average", the supplementary custom hardware, such as the GPU and sound processor, is stunningly powerful. They praised the PlayStation's focus on 3D, and complimented the comfort of its controller and the convenience of its memory cards. Giving the system 4½ out of 5 stars, they concluded, "To succeed in this extremely cut-throat market, you need a combination of great hardware, great games, and great marketing. Whether by skill, luck, or just deep pockets, Sony has scored three out of three in the first salvo of this war." Albert Kim from Entertainment Weekly praised the PlayStation as a technological marvel rivalling the offerings of Sega and Nintendo.
Famicom Tsūshin scored the console 19 out of 40 in May 1995, lower than the Saturn's 24 out of 40. In a 1997 year-end review, a team of five Electronic Gaming Monthly editors gave the PlayStation scores of 9.5, 8.5, 9.0, 9.0, and 9.5; for all five editors, this was the highest score they gave to any of the five consoles reviewed in the issue. They lauded the breadth and quality of the games library, saying it had vastly improved over previous years as developers mastered the system's capabilities and Sony revised its stance on 2D and role-playing games. They also complimented the low price point of the games compared to the Nintendo 64's, and noted that it was the only console on the market that could be relied upon to deliver a solid stream of games for the coming year, primarily because third-party developers almost unanimously favoured it over its competitors. Legacy SCE was an upstart in the video game industry in late 1994, as the video game market in the early 1990s was dominated by Nintendo and Sega. Nintendo had been the clear leader in the industry since the introduction of the Nintendo Entertainment System in 1985, and the Nintendo 64 was initially expected to maintain this position. The PlayStation's target audience included the generation which was the first to grow up with mainstream video games, along with 18- to 29-year-olds who were not the primary focus of Nintendo. By the late 1990s, Sony had become a highly regarded console brand due to the PlayStation, with a significant lead over second-place Nintendo, while Sega was relegated to a distant third. The PlayStation became the first "computer entertainment platform" to ship over 100 million units worldwide, with many critics attributing the console's success to third-party developers. It remains the sixth best-selling console of all time as of 2025, with a total of 102.49 million units sold. Around 7,900 individual games were published for the console during its 11-year lifespan, the second-most games ever produced for a console. Its success was a significant financial boon for Sony, with profits from the video game division contributing 23% of the company's total. Sony's next-generation PlayStation 2, which is backward compatible with the PlayStation's DualShock controller and games, was announced in 1999 and launched in 2000. The PlayStation's lead in installed base and developer support paved the way for the success of its successor, which overcame the earlier launch of Sega's Dreamcast and then fended off competition from Microsoft's newcomer Xbox and Nintendo's GameCube. The PlayStation 2's immense success and the failure of the Dreamcast were among the main factors that led to Sega abandoning the console market. To date, five PlayStation home consoles have been released, which have continued the same numbering scheme, as well as two portable systems. The PlayStation 3 also maintained backward compatibility with original PlayStation discs. Hundreds of PlayStation games have been digitally re-released on the PlayStation Portable, PlayStation 3, PlayStation Vita, PlayStation 4, and PlayStation 5. The PlayStation has often ranked among the best video game consoles. In 2018, Retro Gamer named it the third best console, crediting its sophisticated 3D capabilities as a key factor in its mass success, and lauding it as a "game-changer in every sense possible".
In 2009, IGN ranked the PlayStation the seventh best console in its list, noting that its appeal to older audiences was a crucial factor in propelling the video game industry, as was its role in transitioning the game industry to the CD-ROM format. Keith Stuart from The Guardian likewise named it the seventh best console in 2020, declaring that its success was so profound it "ruled the 1990s". In January 2025, Lorentio Brodesco announced the nsOne project, an attempt to reverse engineer the PlayStation's motherboard. Brodesco stated that "detailed documentation on the original motherboard was either incomplete or entirely unavailable". The project was successfully crowdfunded via Kickstarter. In June, Brodesco manufactured the first working motherboard, promising a fully routed version with multilayer routing, as well as documentation and design files, in the near future. The success of the PlayStation contributed to the demise of cartridge-based home consoles. While not the first system to use an optical disc format, it was the first highly successful one, and ended up going head-to-head with the cartridge-based Nintendo 64,[d] which the industry had expected to use CDs like the PlayStation. After the demise of the Sega Saturn, Nintendo was left as Sony's main competitor in Western markets. Nintendo chose not to use CDs for the Nintendo 64, likely out of concern for the proprietary cartridge format's ability to help enforce copy protection, given its substantial reliance on licensing and exclusive games for revenue. Besides their larger capacity, CD-ROMs could be produced in bulk at a much faster rate than ROM cartridges: a week, compared with two to three months. Further, the cost of production per unit was far lower, allowing Sony to offer games at about 40% lower cost to the user compared with ROM cartridges while still earning the same net revenue. In Japan, Sony published fewer copies of a wide variety of games for the PlayStation as a risk-limiting step, a model that had been used by Sony Music for CD audio discs. The production flexibility of CD-ROMs meant that Sony could produce larger volumes of popular games to get onto the market quickly, something that could not be done with cartridges due to their manufacturing lead time. The lower production costs of CD-ROMs also allowed publishers an additional source of profit: budget-priced reissues of games which had already recouped their development costs. Tokunaka remarked in 1996: Choosing CD-ROM is one of the most important decisions that we made. As I'm sure you understand, PlayStation could just as easily have worked with masked ROM [cartridges]. The 3D engine and everything—the whole PlayStation format—is independent of the media. But for various reasons (including the economies for the consumer, the ease of the manufacturing, inventory control for the trade, and also the software publishers) we deduced that CD-ROM would be the best media for PlayStation. The increasing complexity of developing games pushed cartridges to their storage limits and gradually discouraged some third-party developers. Part of the CD format's appeal to publishers was that discs could be produced at a significantly lower cost and offered more production flexibility to meet demand.
As a result, some third-party developers switched to the PlayStation, including Square and Enix, whose Final Fantasy VII and Dragon Quest VII respectively had been planned for the Nintendo 64 (the two companies later merged to form Square Enix). Other developers released fewer games for the Nintendo 64: Konami, for example, released only thirteen N64 games but over fifty on the PlayStation. Nintendo 64 game releases were less frequent than the PlayStation's, with many being developed by either Nintendo themselves or second parties such as Rare. The PlayStation Classic is a dedicated video game console made by Sony Interactive Entertainment that emulates PlayStation games. It was announced in September 2018 at the Tokyo Game Show, and released on 3 December 2018, the 24th anniversary of the release of the original console. As a dedicated console, the PlayStation Classic features 20 pre-installed games, which run on the open-source emulator PCSX. The console is bundled with two replica wired PlayStation controllers (those without analogue sticks), an HDMI cable, and a USB Type-A cable. Internally, the console uses a MediaTek MT8167a system on a chip with four Cortex-A35 central processing cores clocked at 1.5 GHz and a PowerVR GE8300 graphics processing unit. It includes 16 GB of eMMC flash storage and 1 GB of DDR3 SDRAM. The PlayStation Classic is 45% smaller than the original console. The PlayStation Classic received negative reviews from critics and was compared unfavourably to Nintendo's rival Nintendo Entertainment System Classic Edition and Super Nintendo Entertainment System Classic Edition. Criticism was directed at its meagre game library, user interface, emulation quality, use of PAL versions for certain games, use of the original controller, and high retail price, though the console's design received praise. The console sold poorly.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Black_hole#Direct_interferometry] | [TOKENS: 13839] |
In 1926, Ralph Fowler showed that quantum-mechanical degeneracy pressure was larger than thermal pressure at these densities.: 145 In 1931, Subrahmanyan Chandrasekhar calculated that a non-rotating body of electron-degenerate matter below a certain limiting mass is stable, and by 1934 he showed that this explained the catalog of white dwarf stars.: 151 When Chandrasekhar announced his results, Eddington pointed out that stars above this limit would radiate until they were sufficiently dense to prevent light from exiting, a conclusion he considered absurd. Eddington and, later, Lev Landau argued that some yet unknown mechanism would stop the collapse. In the 1930s, Fritz Zwicky and Walter Baade studied stellar novae, focusing on exceptionally bright ones they called supernovae. Zwicky promoted the idea that supernovae produced stars with the density of atomic nuclei—neutron stars—but this idea was largely ignored.: 171 In 1939, based on Chandrasekhar's reasoning, J. Robert Oppenheimer and George Volkoff predicted that neutron stars below a certain mass limit, later called the Tolman–Oppenheimer–Volkoff limit, would be stable due to neutron degeneracy pressure. Above that limit, they reasoned that either their model would not apply or that gravitational contraction would not stop.: 380 John Archibald Wheeler and two of his students resolved questions about the model behind the Tolman–Oppenheimer–Volkoff (TOV) limit. Harrison and Wheeler developed the equations of state relating density to pressure for cold matter all the way through electron degeneracy and neutron degeneracy. Masami Wakano and Wheeler then used the equations to compute the equilibrium curve for stars, relating mass to circumference. They found no additional features that would invalidate the TOV limit. This meant that the only thing that could prevent black holes from forming was a dynamic process ejecting sufficient mass from a star as it cooled.: 205 The modern concept of black holes was formulated by Robert Oppenheimer and his student Hartland Snyder in 1939.: 80 In the paper, Oppenheimer and Snyder solved Einstein's equations of general relativity for an idealized imploding star, in a model later called the Oppenheimer–Snyder model, then described the results from far outside the star. The implosion starts as one might expect: the star material rapidly collapses inward. However, as the density of the star increases, gravitational time dilation increases and the collapse, viewed from afar, seems to slow down further and further until the star reaches its Schwarzschild radius, where it appears frozen in time.: 217 In 1958, David Finkelstein identified the Schwarzschild surface as an event horizon, calling it "a perfect unidirectional membrane: causal influences can cross it in only one direction". In this sense, events that occur inside of the black hole cannot affect events that occur outside of the black hole. Finkelstein created a new reference frame to include the point of view of infalling observers.: 103 Finkelstein's new frame of reference allowed events at the surface of an imploding star to be related to events far away. By 1962 the two points of view were reconciled, convincing many skeptics that implosion into a black hole made physical sense.: 226 The era from the mid-1960s to the mid-1970s was the "golden age of black hole research", when general relativity and black holes became mainstream subjects of research.: 258 In this period, more general black hole solutions were found. 
In 1963, Roy Kerr found the exact solution for a rotating black hole. Two years later, Ezra Newman found the axisymmetric solution for a black hole that is both rotating and electrically charged. In 1967, Werner Israel found that the Schwarzschild solution was the only possible solution for a nonspinning, uncharged black hole, meaning that a Schwarzschild black hole would be defined by its mass alone. Similar identities were later found for Reissner–Nordström and Kerr black holes, defined only by their mass and their charge or spin respectively. Together, these findings became known as the no-hair theorem, which states that a stationary black hole is completely described by the three parameters of the Kerr–Newman metric: mass, angular momentum, and electric charge. At first, it was suspected that the strange mathematical singularities found in each of the black hole solutions only appeared due to the assumption that a black hole would be perfectly spherically symmetric, and therefore that the singularities would not appear in generic situations where black holes were not symmetric. This view was held in particular by Vladimir Belinski, Isaak Khalatnikov, and Evgeny Lifshitz, who tried to prove that no singularities appear in generic solutions, although they would later reverse their positions. However, in 1965, Roger Penrose proved that general relativity without quantum mechanics requires that singularities appear in all black holes. Astronomical observations also made great strides during this era. In 1967, Antony Hewish and Jocelyn Bell Burnell discovered pulsars, and by 1969 these were shown to be rapidly rotating neutron stars. Until that time, neutron stars, like black holes, were regarded as just theoretical curiosities, but the discovery of pulsars showed their physical relevance and spurred further interest in all types of compact objects that might be formed by gravitational collapse. Based on observations in Greenwich and Toronto in the early 1970s, Cygnus X-1, a galactic X-ray source discovered in 1964, became the first astronomical object commonly accepted to be a black hole. Work by James Bardeen, Jacob Bekenstein, Carter, and Hawking in the early 1970s led to the formulation of black hole thermodynamics. These laws describe the behaviour of a black hole in close analogy to the laws of thermodynamics by relating mass to energy, area to entropy, and surface gravity to temperature. The analogy was completed: 442 when Hawking, in 1974, showed that quantum field theory implies that black holes should radiate like a black body with a temperature proportional to the surface gravity of the black hole, predicting the effect now known as Hawking radiation. While Cygnus X-1, a stellar-mass black hole, was generally accepted by the scientific community as a black hole by the end of 1973, it would be decades before a supermassive black hole would gain the same broad recognition. Although, as early as the 1960s, physicists such as Donald Lynden-Bell and Martin Rees had suggested that powerful quasars in the centers of galaxies were powered by accreting supermassive black holes, little observational proof existed at the time. However, the Hubble Space Telescope, launched decades later, found that supermassive black holes were not only present in these active galactic nuclei but were ubiquitous: almost every galaxy had a supermassive black hole at its center, many of which were quiescent.
In 1999, David Merritt proposed the M–sigma relation, which relates the velocity dispersion of matter in a galaxy's central bulge to the mass of the supermassive black hole at its core. Subsequent studies confirmed this correlation. Around the same time, based on telescope observations of the velocities of stars at the center of the Milky Way galaxy, independent groups led by Andrea Ghez and Reinhard Genzel concluded that the compact radio source in the center of the galaxy, Sagittarius A*, was likely a supermassive black hole. On 11 February 2016, the LIGO Scientific Collaboration and Virgo Collaboration announced the first direct detection of gravitational waves, named GW150914, representing the first observation of a black hole merger. At the time of the merger, the black holes were approximately 1.4 billion light-years from Earth and had masses of 30 and 35 solar masses.: 6 In 2017, Rainer Weiss, Kip Thorne, and Barry Barish, who had spearheaded the project, were awarded the Nobel Prize in Physics for their work. Since the initial discovery in 2015, hundreds more gravitational-wave events have been observed by LIGO and another interferometer, Virgo. On 10 April 2019, the first direct image of a black hole and its vicinity was published, following observations made by the Event Horizon Telescope (EHT) in 2017 of the supermassive black hole in Messier 87's galactic centre. In 2022, the Event Horizon Telescope collaboration released an image of the black hole in the center of the Milky Way galaxy, Sagittarius A*; the data had been collected in 2017. In 2020, the Nobel Prize in Physics was awarded for work on black holes. Andrea Ghez and Reinhard Genzel shared one-half for their discovery that Sagittarius A* is a supermassive black hole. Penrose received the other half for his work showing that the mathematics of general relativity requires the formation of black holes. Cosmologists lamented that Hawking's extensive theoretical work on black holes would not be honored, since he had died in 2018. In December 1967, a student reportedly suggested the phrase "black hole" at a lecture by John Wheeler; Wheeler adopted the term for its brevity and "advertising value", and Wheeler's stature in the field ensured it quickly caught on, leading some to credit Wheeler with coining the phrase. However, the term was used by others around that time. Science writer Marcia Bartusiak traces the term to physicist Robert H. Dicke, who in the early 1960s reportedly compared the phenomenon to the Black Hole of Calcutta, notorious as a prison where people entered but never left alive. The term was used in print by Life and Science News magazines in 1963, and by science journalist Ann Ewing in her article "'Black Holes' in Space", dated 18 January 1964, a report on a meeting of the American Association for the Advancement of Science held in Cleveland, Ohio. Definition A black hole is generally defined as a region of spacetime from which no information-carrying signals or objects can escape. However, verifying an object as a black hole by this definition would require waiting an infinite time, at an infinite distance from the black hole, to confirm that nothing has escaped; the definition therefore cannot be used to identify a physical black hole. Broadly, physicists do not have a precisely agreed-upon definition of a black hole. Among astrophysicists, a black hole is a compact object with a mass larger than four solar masses.
A black hole may also be defined as a reservoir of information: 142 or a region where space is falling inwards faster than the speed of light. Properties The no-hair theorem postulates that, once it achieves a stable condition after formation, a black hole has only three independent physical properties: mass, electric charge, and angular momentum; the black hole is otherwise featureless. If the conjecture is true, any two black holes that share the same values for these properties, or parameters, are indistinguishable from one another. The degree to which the conjecture holds for real black holes is currently an unsolved problem. The simplest static black holes have mass but neither electric charge nor angular momentum. According to Birkhoff's theorem, these Schwarzschild black holes are the only vacuum solution that is spherically symmetric. Solutions describing more general black holes also exist: non-rotating charged black holes are described by the Reissner–Nordström metric, while the Kerr metric describes an uncharged rotating black hole. The most general stationary black hole solution known is the Kerr–Newman metric, which describes a black hole with both charge and angular momentum. Contrary to the popular notion of a black hole "sucking in everything" in its surroundings, from far away the external gravitational field of a black hole is identical to that of any other body of the same mass. While a black hole can theoretically have any positive mass, the charge and angular momentum are constrained by the mass. The total electric charge $Q$ and the total angular momentum $J$ are expected to satisfy the inequality $\frac{Q^{2}}{4\pi \epsilon _{0}} + \frac{c^{2}J^{2}}{GM^{2}} \leq GM^{2}$ for a black hole of mass $M$. Black holes with the maximum possible charge or spin satisfying this inequality are called extremal black holes. Solutions of Einstein's equations that violate the inequality exist, but they do not possess an event horizon. These are so-called naked singularities that can be observed from the outside. Because such singularities would make the universe inherently unpredictable, many physicists believe they cannot exist. The weak cosmic censorship hypothesis, proposed by Sir Roger Penrose, rules out the formation of such singularities when they are created through the gravitational collapse of realistic matter. However, this hypothesis has not yet been proven, and some physicists believe that naked singularities could exist. It is also unknown whether black holes could even become extremal, forming naked singularities, since natural processes counteract increasing spin and charge as a black hole nears extremality. The total mass of a black hole can be estimated by analyzing the motion of objects near the black hole, such as stars or gas. All black holes spin, often rapidly; one stellar black hole, GRS 1915+105, has been estimated to spin at over 1,000 revolutions per second. The Milky Way's central black hole Sagittarius A* rotates at about 90% of the maximum rate. The spin rate can be inferred from measurements of atomic spectral lines in the X-ray range. As gas near the black hole plunges inward, high-energy X-ray emission from electron–positron pairs illuminates the gas further out, which appears red-shifted due to relativistic effects.
Depending on the spin of the black hole, this plunge happens at different radii from the hole, with different degrees of redshift. Astronomers can use the gap between the X-ray emission of the outer disk and the redshifted emission from plunging material to determine the spin of the black hole. A newer way to estimate spin is based on the temperature of gases accreting onto the black hole; the method requires an independent measurement of the black hole's mass and of the accretion disk's inclination angle, followed by computer modeling. Gravitational waves from coalescing binary black holes can also provide the spins of both progenitor black holes and of the merged hole, but such events are rare. A spinning black hole has angular momentum. The supermassive black hole in the center of the Messier 87 (M87) galaxy appears to have an angular momentum very close to the maximum theoretical value. That uncharged limit is $J \leq \frac{GM^{2}}{c}$, allowing definition of a dimensionless spin magnitude such that $0 \leq \frac{cJ}{GM^{2}} \leq 1$. Most black holes are believed to have an approximately neutral charge. For example, Michal Zajaček, Arman Tursunov, Andreas Eckart, and Silke Britzen found the electric charge of Sagittarius A* to be at least ten orders of magnitude below the theoretical maximum. A charged black hole repels other like charges just as any other charged object does. If a black hole were to become charged, particles with the opposite sign of charge would be pulled in by the extra electromagnetic force, while particles with the same sign of charge would be repelled, neutralizing the black hole. This effect may be weaker if the black hole is also spinning. The presence of charge can reduce the diameter of the black hole by up to 38%. The charge $Q$ of a nonspinning black hole is bounded by $Q \leq \sqrt{G}\,M$, where $G$ is the gravitational constant and $M$ is the black hole's mass.
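As a quick numerical illustration of the two bounds just quoted, here is a minimal Python sketch in SI units; the 10-solar-mass example and the 90% spin figure are arbitrary sample values, not measurements.

```python
# Toy evaluation of the extremality inequality and dimensionless spin
# quoted above, in SI units. Sample numbers are illustrative only.
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
eps0 = 8.854e-12   # vacuum permittivity, F/m
M_SUN = 1.989e30   # solar mass, kg

def spin_parameter(M: float, J: float) -> float:
    """Dimensionless spin a* = cJ / (G M^2); 1 means extremal."""
    return c * J / (G * M**2)

def is_sub_extremal(M: float, J: float, Q: float = 0.0) -> bool:
    """Check Q^2/(4 pi eps0) + c^2 J^2 / (G M^2) <= G M^2."""
    lhs = Q**2 / (4 * math.pi * eps0) + (c**2 * J**2) / (G * M**2)
    return lhs <= G * M**2

M = 10 * M_SUN                           # a 10-solar-mass black hole
J_max = G * M**2 / c                     # maximal angular momentum for this mass
print(spin_parameter(M, 0.9 * J_max))    # -> 0.9, i.e. 90% of maximal spin
print(is_sub_extremal(M, 0.9 * J_max))   # -> True: has an event horizon
print(is_sub_extremal(M, 1.1 * J_max))   # -> False: would be a naked singularity
```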
Classification Black holes can have a wide range of masses. The minimum mass of a black hole formed by stellar gravitational collapse is governed by the maximum mass of a neutron star and is believed to be approximately two to four solar masses. However, theoretical primordial black holes, believed to have formed soon after the Big Bang, could be far smaller, with masses as little as 10⁻⁵ grams at formation. These very small black holes are sometimes called micro black holes. Black holes formed by stellar collapse are called stellar black holes. Estimates of their maximum mass at formation vary, but generally range from 10 to 100 solar masses, with higher estimates for black holes formed from low-metallicity stars. The mass of a black hole formed via a supernova has a lower bound: if the progenitor star is too small, the collapse may be stopped by the degeneracy pressure of the star's constituents, allowing the condensation of matter into an exotic denser state. Degeneracy pressure arises from the Pauli exclusion principle: identical particles resist being forced into the same quantum state. Smaller progenitor stars, with masses less than about 8 M☉, will be held together by the degeneracy pressure of electrons and will become white dwarfs. For more massive progenitor stars, electron degeneracy pressure is no longer strong enough to resist the force of gravity, and the star will be held together by neutron degeneracy pressure, which can occur at much higher densities, forming a neutron star. If the star is still too massive, even neutron degeneracy pressure will not be able to resist the force of gravity and the star will collapse into a black hole.: 5.8 Stellar black holes can also gain mass via accretion of nearby matter, often from a companion object such as a star. Black holes that are larger than stellar black holes but smaller than supermassive black holes are called intermediate-mass black holes, with masses of approximately 10² to 10⁵ solar masses. These black holes seem to be rarer than their stellar and supermassive counterparts, with relatively few candidates having been observed. Physicists have speculated that such black holes may form from collisions in globular and star clusters or at the centers of low-mass galaxies. They may also form as the result of mergers of smaller black holes, with several LIGO observations finding merged black holes within the 110–350 solar mass range. The black holes with the largest masses are called supermassive black holes, with masses more than 10⁶ times that of the Sun. These black holes are believed to exist at the centers of almost every large galaxy, including the Milky Way. Some scientists have proposed a subcategory of even larger black holes, called ultramassive black holes, with masses greater than 10⁹–10¹⁰ solar masses. Theoretical models predict that the accretion disc that feeds a black hole becomes unstable once the hole reaches 50–100 billion times the mass of the Sun, setting a rough upper limit to black hole mass. Structure While black holes are conceptually invisible sinks of all matter and light, in astronomical settings their enormous gravity alters the motion of surrounding objects and pulls nearby gas inwards at near-light speed, making the regions around black holes among the brightest objects in the universe. Some black holes have relativistic jets: thin streams of plasma travelling away from the black hole at more than one-tenth of the speed of light. A small fraction of the matter falling towards the black hole is accelerated away along the hole's rotation axis. These jets can extend as far as millions of parsecs from the black hole itself. Black holes of any mass can have jets; however, they are typically observed around spinning black holes with strongly magnetized accretion disks. Relativistic jets were more common in the early universe, when galaxies and their corresponding supermassive black holes were rapidly gaining mass. All black holes with jets also have an accretion disk, but the jets are usually brighter than the disk. Quasars, typically found in other galaxies, are believed to be supermassive black holes with jets; microquasars are believed to be stellar-mass objects with jets, typically observed in the Milky Way. The mechanism of jet formation is not yet known, but several options have been proposed. One proposed method of fuelling these jets is the Blandford–Znajek process, which suggests that the dragging of magnetic field lines by a black hole's rotation could launch jets of matter into space. The Penrose process, which involves extraction of a black hole's rotational energy, has also been proposed as a potential mechanism of jet propulsion.
Due to conservation of angular momentum, gas falling into the gravitational well created by a massive object will typically form a disk-like structure around the object.: 242 As the disk's angular momentum is transferred outward by internal processes, its matter falls farther inward, converting its gravitational energy into heat and releasing a large flux of X-rays. The temperature of these disks can range from thousands to millions of kelvin, and temperatures can differ throughout a single accretion disk. Accretion disks can also emit in other parts of the electromagnetic spectrum, depending on the disk's turbulence and magnetization and the black hole's mass and angular momentum. Accretion disks can be classified as geometrically thin or geometrically thick. Geometrically thin disks are mostly confined to the black hole's equatorial plane and have a well-defined edge at the innermost stable circular orbit (ISCO), while geometrically thick disks are supported by internal pressure and temperature and can extend inside the ISCO. Disks with high rates of electron scattering and absorption, which appear bright and opaque, are called optically thick; optically thin disks are more translucent and produce fainter images when viewed from afar. Accretion disks of black holes accreting beyond the Eddington limit are often referred to as Polish doughnuts, after their thick, toroidal shape. Quasar accretion disks are expected to usually appear blue in color. The disk of a stellar black hole, on the other hand, would likely look orange, yellow, or red, with its inner regions being the brightest. Theoretical research suggests that the hotter a disk is, the bluer it should be, although this is not always supported by observations of real astronomical objects. Accretion disk colors may also be altered by the Doppler effect, with the part of the disk travelling towards an observer appearing bluer and brighter, and the part travelling away appearing redder and dimmer. In Newtonian gravity, test particles can stably orbit at arbitrary distances from a central object. In general relativity, however, there exists a smallest possible radius at which a massive particle can orbit stably. Any infinitesimal inward perturbation to this orbit will lead to the particle spiraling into the black hole, and any outward perturbation will, depending on the energy, cause the particle to spiral in, move to a stable orbit further from the black hole, or escape to infinity. This orbit is called the innermost stable circular orbit, or ISCO. The location of the ISCO depends on the spin of the black hole and the spin of the particle itself. In the case of a Schwarzschild black hole (spin zero) and a particle without spin, the location of the ISCO is $r_{\mathrm{ISCO}} = 3\,r_{\text{s}} = \frac{6\,GM}{c^{2}}$, where $r_{\text{s}}$ is the Schwarzschild radius of the black hole, $G$ is the gravitational constant, and $c$ is the speed of light. The radius of this orbit changes slightly with particle spin. For charged black holes, the ISCO moves inwards. For spinning black holes, the ISCO moves inwards for particles orbiting in the same direction that the black hole is spinning (prograde) and outwards for particles orbiting in the opposite direction (retrograde). For example, the ISCO for a particle orbiting retrograde can be as far out as about $9\,r_{\text{s}}$, while the ISCO for a particle orbiting prograde can be as close as the event horizon itself.
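To put concrete scales on the ISCO formula above, here is a short Python sketch; the 10 M☉ example is arbitrary, while the ~4.3 million M☉ figure for Sagittarius A* is the value quoted earlier in this article.

```python
# Quick numerical check of the Schwarzschild ISCO formula above for a
# non-rotating hole and a spinless particle: r_ISCO = 3 r_s = 6GM/c^2.
G, c, M_SUN = 6.674e-11, 2.998e8, 1.989e30

def schwarzschild_radius(M: float) -> float:
    return 2 * G * M / c**2             # r_s = 2GM/c^2, in metres

def isco_radius(M: float) -> float:
    return 3 * schwarzschild_radius(M)  # r_ISCO = 6GM/c^2 for spin zero

print(isco_radius(10 * M_SUN) / 1e3)          # ~89 km for a 10 M_sun hole
print(isco_radius(4.3e6 * M_SUN) / 1.496e11)  # ~0.25 au for Sagittarius A*
```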
The photon sphere is a spherical boundary on which photons moving tangentially to the sphere are bent completely around the black hole, possibly orbiting multiple times. Light rays with impact parameters less than the radius of the photon sphere enter the black hole. For Schwarzschild black holes, the photon sphere has a radius 1.5 times the Schwarzschild radius; for non-Schwarzschild black holes, the radius is at least 1.5 times the radius of the event horizon. When viewed from a great distance, the photon sphere creates an observable black hole shadow. Since no light emerges from within the black hole, this shadow is the limit for possible observations.: 152 The shadows of colliding black holes should have characteristic warped shapes, allowing scientists to detect black holes that are about to merge. While light can still escape from the photon sphere, any light that crosses the photon sphere on an inbound trajectory will be captured by the black hole. Therefore, any light that reaches an outside observer from the photon sphere must have been emitted by objects between the photon sphere and the event horizon. Light emitted towards the photon sphere may also curve around the black hole and return to the emitter. For a rotating, uncharged black hole, the radius of the photon sphere depends on the spin parameter and on whether the photon orbits prograde or retrograde: a photon orbiting prograde will find the photon sphere 1–3 Schwarzschild radii from the center of the black hole, while a photon orbiting retrograde will find it between 3 and 5 Schwarzschild radii from the center, with the exact location depending on the magnitude of the black hole's rotation. For a charged, nonrotating black hole, there is only one photon sphere, whose radius decreases with increasing black hole charge. For non-extremal, charged, rotating black holes, there are always two photon spheres, with the exact radii depending on the parameters of the black hole. Near a rotating black hole, spacetime rotates like a vortex. The rotating spacetime will drag any matter and light into rotation around the spinning black hole. This effect of general relativity, called frame dragging, gets stronger closer to the spinning mass. The region of spacetime in which it is impossible to stay still is called the ergosphere. The ergosphere of a black hole is a volume bounded by the black hole's event horizon and the ergosurface, which coincides with the event horizon at the poles but bulges out from it around the equator. Matter and radiation can escape from the ergosphere. Through the Penrose process, objects can emerge from the ergosphere with more energy than they entered with. The extra energy is taken from the rotational energy of the black hole, slowing down its rotation.: 268 A variation of the Penrose process in the presence of strong magnetic fields, the Blandford–Znajek process, is considered a likely mechanism for the enormous luminosity and relativistic jets of quasars and other active galactic nuclei. The observable region of spacetime around a black hole closest to its event horizon is called the plunging region.
In this area it is no longer possible for free-falling matter to follow circular orbits or to stop its final descent into the black hole. Instead, it will rapidly plunge toward the black hole at close to the speed of light, growing increasingly hot and producing a characteristic, detectable thermal emission. However, light and radiation emitted from this region can still escape the black hole's gravitational pull. For a nonspinning, uncharged black hole, the radius of the event horizon, or Schwarzschild radius, is proportional to the mass $M$ through $r_{\mathrm{s}} = \frac{2GM}{c^{2}} \approx 2.95\,\frac{M}{M_{\odot}}~\mathrm{km}$, where $M_{\odot}$ is the mass of the Sun.: 124 For a black hole with nonzero spin or electric charge, the radius is smaller,[Note 1] until an extremal black hole has an event horizon radius approaching $r_{+} = \frac{GM}{c^{2}}$, half the radius of a nonspinning, uncharged black hole of the same mass. Since the volume within the Schwarzschild radius increases with the cube of the radius, the average density of a black hole inside its Schwarzschild radius is inversely proportional to the square of its mass: supermassive black holes are much less dense than stellar black holes. The average density of a 10⁸ M☉ black hole is comparable to that of water.
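A back-of-the-envelope check of that density claim follows, using the Schwarzschild radius formula above. The sketch treats the region inside the Schwarzschild radius as a Euclidean sphere, which is the usual heuristic behind such statements rather than a rigorous notion of volume.

```python
# Sketch of the density scaling above: treating the hole's size as r_s,
# mean density scales as 1/M^2, so a 10^8 solar-mass black hole comes
# out near the density of water (~1000 kg/m^3).
import math

G, c, M_SUN = 6.674e-11, 2.998e8, 1.989e30

def mean_density(M: float) -> float:
    """Mass divided by the Euclidean volume inside r_s = 2GM/c^2 (kg/m^3)."""
    r_s = 2 * G * M / c**2
    return M / ((4.0 / 3.0) * math.pi * r_s**3)

print(mean_density(M_SUN))        # ~1.8e19 kg/m^3: denser than an atomic nucleus
print(mean_density(1e8 * M_SUN))  # ~1.8e3 kg/m^3: roughly the density of water
```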
The defining feature of a black hole is the existence of an event horizon, a boundary in spacetime through which matter and light can pass only inward, towards the center of the black hole. Nothing, not even light, can escape from inside the event horizon. The event horizon is referred to as such because if an event occurs within the boundary, information from that event cannot reach or affect an outside observer, making it impossible to determine whether such an event occurred.: 179 For non-rotating black holes, the geometry of the event horizon is precisely spherical, while for rotating black holes the event horizon is oblate. To a distant observer, a clock near a black hole would appear to tick more slowly than one further from the black hole.: 217 This effect, known as gravitational time dilation, would also cause an object falling into a black hole to appear to slow as it approached the event horizon, never quite reaching the horizon from the perspective of an outside observer.: 218 All processes on this object would appear to slow down, and any light emitted by the object would appear redder and dimmer, an effect known as gravitational redshift. An object falling from half a Schwarzschild radius above the event horizon would fade away until it could no longer be seen, disappearing from view within one hundredth of a second. It would also appear to flatten onto the black hole, joining all other material that had ever fallen into the hole. On the other hand, an observer falling into a black hole would not notice any of these effects as they cross the event horizon: their own clock appears to them to tick normally, and they cross the event horizon after a finite time without noting any singular behaviour. In general relativity, it is impossible to determine the location of the event horizon from local observations, due to Einstein's equivalence principle.: 222 Black holes that are rotating and/or charged have an inner horizon, often called the Cauchy horizon, inside the black hole. The inner horizon is divided into two segments: an ingoing section and an outgoing section. At the ingoing section of the Cauchy horizon, radiation and matter that fall into the black hole build up at the horizon, causing the curvature of spacetime there to grow towards infinity, so that an observer falling in would experience tidal forces. This phenomenon is often called mass inflation, since it is associated with a parameter describing the black hole's internal mass growing exponentially, and the resulting buildup of tidal forces is called the mass-inflation singularity or Cauchy horizon singularity. Some physicists have argued that in realistic black holes, accretion and Hawking radiation would stop mass inflation from occurring. At the outgoing section of the inner horizon, infalling radiation would backscatter off the black hole's spacetime curvature and travel outward, building up at the outgoing Cauchy horizon. This would cause an infalling observer to experience a gravitational shock wave and tidal forces as the spacetime curvature at the horizon grew to infinity. This buildup of tidal forces is called the shock singularity. Both of these singularities are weak, meaning that an object crossing them would be deformed only a finite amount by tidal forces, even though the spacetime curvature is infinite at the singularity. This contrasts with a strong singularity, where an object reaching the singularity would be stretched and squeezed by an infinite amount. They are also null singularities, meaning that a photon could travel parallel to them without ever being intercepted. Ignoring quantum effects, every black hole has a singularity inside: points where the curvature of spacetime becomes infinite and geodesics terminate within a finite proper time.: 205 For a non-rotating black hole, this region takes the shape of a single point; for a rotating black hole it is smeared out to form a ring singularity lying in the plane of rotation.: 264 In both cases, the singular region has zero volume. All of the mass of the black hole ends up in the singularity.: 252 Since the singularity has nonzero mass in an infinitely small space, it can be thought of as having infinite density. Observers falling into a Schwarzschild black hole (i.e., non-rotating and uncharged) cannot avoid being carried into the singularity once they cross the event horizon. As they fall further in, they will be torn apart by the growing tidal forces in a process sometimes referred to as spaghettification or the noodle effect. Eventually, they will reach the singularity and be crushed to an infinitely small point.: 182 However, any perturbations, such as those caused by matter or radiation falling in, would cause space to oscillate chaotically near the singularity. Any matter falling in would experience intense tidal forces rapidly changing in direction, all while being compressed into an increasingly small volume. Alternative forms of general relativity, including those incorporating some quantum effects, can lead to regular, or nonsingular, black holes without singularities. For example, the fuzzball model, based on string theory, holds that black holes are actually made up of quantum microstates and need not have a singularity or an event horizon. The theory of loop quantum gravity proposes that the curvature and density at the center of a black hole are large but not infinite. Formation Black holes are formed by gravitational collapse of massive stars, either by direct collapse or during a supernova explosion in a process called fallback.
Black holes can also result from the merger of two neutron stars, or of a neutron star and a black hole. Other, more speculative mechanisms include primordial black holes created from density fluctuations in the early universe, the collapse of dark stars (hypothetical objects powered by the annihilation of dark matter), or the collapse of hypothetical self-interacting dark matter. Gravitational collapse occurs when an object's internal pressure is insufficient to resist the object's own gravity. At the end of a star's life, it will run out of hydrogen to fuse, and will start fusing more and more massive elements, until it gets to iron. Since the fusion of elements heavier than iron would require more energy than it would release, nuclear fusion ceases. If the iron core of the star is too massive, the star will no longer be able to support itself and will undergo gravitational collapse. While most of the energy released during gravitational collapse is emitted very quickly, an outside observer does not actually see the end of this process. Even though the collapse takes a finite amount of time from the reference frame of infalling matter, a distant observer would see the infalling material slow and halt just above the event horizon, due to gravitational time dilation. Light from the collapsing material takes longer and longer to reach the observer, with the delay growing to infinity as the emitting material reaches the event horizon. Thus the external observer never sees the formation of the event horizon; instead, the collapsing material seems to become dimmer and increasingly red-shifted, eventually fading away. Observations of quasars at redshift z ≈ 7, less than a billion years after the Big Bang, have led to investigations of other ways to form black holes. The accretion process that builds supermassive black holes has a limiting rate of mass accumulation, and a billion years is not enough time to reach quasar status (a rough e-folding estimate is sketched below). One suggestion is direct collapse of the nearly pure hydrogen (low-metallicity) gas clouds characteristic of the young universe, forming a supermassive star which collapses into a black hole. It has been suggested that seed black holes with typical masses of ~10⁵ M☉ could have formed in this way and could then grow to ~10⁹ M☉. However, the very large amount of gas required for direct collapse is typically unstable to fragmentation into multiple stars. Thus another approach suggests massive star formation followed by collisions that seed massive black holes which ultimately merge to create a quasar.: 85 A neutron star in a common envelope with a regular star can accrete sufficient material to collapse to a black hole, or two neutron stars can merge. These avenues for the formation of black holes are considered relatively rare. In the current epoch of the universe, the conditions needed to form black holes are rare and are mostly found only in stars. However, in the early universe, conditions may have allowed for black hole formation via other means. Fluctuations of spacetime soon after the Big Bang may have formed regions that were denser than their surroundings. Initially, these regions would not have been compact enough to form a black hole, but eventually the curvature of spacetime in the regions became large enough to cause them to collapse into a black hole. Different models for the early universe vary widely in their predictions of the scale of these fluctuations.
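The timing tension can be made concrete with a rough e-folding estimate. Assuming Eddington-limited growth with ~10% radiative efficiency, the mass grows as M(t) = M₀ exp(t/t_Sal) with a Salpeter time of roughly 45 million years; in the minimal Python sketch below, the seed and target masses are illustrative assumptions, not figures from the article:

    import math

    # Rough Eddington-limited growth estimate (illustrative assumptions).
    t_salpeter_myr = 45.0    # e-folding time for ~10% radiative efficiency
    seed_mass = 10.0         # stellar-mass seed, in solar masses (assumed)
    target_mass = 1e9        # quasar-class black hole, in solar masses (assumed)

    e_folds = math.log(target_mass / seed_mass)
    print(e_folds)                      # ~18.4 e-folds
    print(e_folds * t_salpeter_myr)     # ~830 Myr: most of the first billion years,
                                        # with no allowance for interrupted accretion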
Various models predict the creation of primordial black holes ranging from a Planck mass (~2.2×10⁻⁸ kg) to hundreds of thousands of solar masses. Primordial black holes with masses less than 10¹⁵ g would have evaporated by now due to Hawking radiation (see the sketch below). Despite the early universe being extremely dense, it did not re-collapse into a black hole during the Big Bang, since the universe was expanding rapidly and did not have the gravitational differential necessary for black hole formation. Models for the gravitational collapse of objects of relatively constant size, such as stars, do not necessarily apply in the same way to rapidly expanding space such as the Big Bang. In principle, black holes could be formed in high-energy particle collisions that achieve sufficient density, although no such events have been detected. These hypothetical micro black holes, which could form from the collision of cosmic rays with Earth's atmosphere or in particle accelerators like the Large Hadron Collider, would not be able to aggregate additional mass. Instead, they would evaporate in about 10⁻²⁵ seconds, posing no threat to the Earth. Evolution Black holes can also merge with other objects such as stars or even other black holes. This is thought to have been important, especially in the early growth of supermassive black holes, which could have formed from the aggregation of many smaller objects. The process has also been proposed as the origin of some intermediate-mass black holes. Mergers of supermassive black holes may take a long time: as the two supermassive black holes of a binary approach each other, most nearby stars are ejected, leaving little for the remaining black holes to gravitationally interact with in a way that would allow them to get closer to each other. This phenomenon has been called the final parsec problem, as the distance at which it happens is usually around one parsec. When a black hole accretes matter, the gas in the inner accretion disk orbits at very high speeds because of its proximity to the black hole. The resulting friction heats the inner disk to temperatures at which it emits vast amounts of electromagnetic radiation (mainly X-rays) detectable by telescopes. By the time the matter of the disk reaches the innermost stable circular orbit (ISCO), between 5.7% and 42% of its mass will have been converted to energy, depending on the black hole's spin. About 90% of this energy is released within about 20 black hole radii. In many cases, accretion disks are accompanied by relativistic jets that are emitted along the black hole's poles, which carry away much of the energy. The mechanism for the creation of these jets is currently not well understood, in part due to insufficient data. Many of the universe's most energetic phenomena have been attributed to the accretion of matter onto black holes. Active galactic nuclei and quasars are believed to be the accretion disks of supermassive black holes. X-ray binaries are generally accepted to be binary systems in which one of the two objects is a compact object accreting matter from its companion. Ultraluminous X-ray sources may be the accretion disks of intermediate-mass black holes. At a certain rate of accretion, the outward radiation pressure becomes as strong as the inward gravitational force, and the black hole should be unable to accrete any faster. This limit is called the Eddington limit. However, many black holes accrete beyond this rate due to their non-spherical geometry or instabilities in the accretion disk.
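As a rough check on the evaporation threshold, the photon-only Hawking lifetime estimate t ≈ 5120πG²M³/(ħc⁴) can be evaluated directly; emission of additional particle species shortens the lifetime, which is why the ~10¹⁵ g figure quoted above is of the right order rather than exact for this simple formula. A minimal Python sketch:

    import math

    # Photon-only Hawking evaporation-time estimate,
    # t = 5120 * pi * G^2 * M^3 / (hbar * c^4); note the steep M^3 scaling.
    G = 6.674e-11
    c = 2.998e8
    hbar = 1.055e-34
    year = 3.156e7  # seconds

    def evaporation_time_years(mass_kg):
        return 5120 * math.pi * G**2 * mass_kg**3 / (hbar * c**4) / year

    print(evaporation_time_years(2e11))   # ~2e10 yr: of order the age of the universe
    print(evaporation_time_years(2e30))   # ~2e67 yr for a solar-mass black hole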
Accretion beyond the limit is called super-Eddington accretion and may have been commonplace in the early universe. Stars have been observed to be torn apart by tidal forces in the immediate vicinity of supermassive black holes in galaxy nuclei, in what is known as a tidal disruption event (TDE). Some of the material from the disrupted star forms an accretion disk around the black hole, which emits observable electromagnetic radiation. The correlation between the masses of supermassive black holes in the centres of galaxies and the velocity dispersion and mass of stars in their host bulges suggests that the formation of galaxies and the formation of their central black holes are related. Black hole winds from rapid accretion, particularly when the galaxy itself is still accreting matter, can compress nearby gas, accelerating star formation. However, if the winds become too strong, the black hole may blow nearly all of the gas out of the galaxy, quenching star formation. Black hole jets may also energize nearby cavities of plasma and eject low-entropy gas from the galactic core, causing gas in galactic centers to be hotter than expected. If Hawking's theory of black hole radiation is correct, then black holes are expected to shrink and evaporate over time as they lose mass by the emission of photons and other particles. The temperature of this thermal spectrum (the Hawking temperature) is proportional to the surface gravity of the black hole, which is inversely proportional to the mass. Hence, large black holes emit less radiation than small black holes.: Ch. 9.6 A stellar black hole of 1 M☉ has a Hawking temperature of 62 nanokelvins. This is far less than the 2.7 K temperature of the cosmic microwave background radiation. Stellar-mass or larger black holes receive more mass from the cosmic microwave background than they emit through Hawking radiation and thus will grow instead of shrinking. To have a Hawking temperature larger than 2.7 K (and be able to evaporate), a black hole would need a mass less than that of the Moon (see the sketch following this paragraph). Such a black hole would have a diameter of less than a tenth of a millimetre. The Hawking radiation of an astrophysical black hole is predicted to be very weak and would thus be exceedingly difficult to detect from Earth. A possible exception is the burst of gamma rays emitted in the last stage of the evaporation of primordial black holes. Searches for such flashes have proven unsuccessful and provide stringent limits on the possible existence of low-mass primordial black holes, with modern research predicting that primordial black holes must make up less than a fraction 10⁻⁷ of the universe's total mass. NASA's Fermi Gamma-ray Space Telescope, launched in 2008, has searched for these flashes, but has not yet found any. The properties of a black hole are constrained and interrelated by the theories that predict them. When based on general relativity, these relationships are called the laws of black hole mechanics. For a black hole that is not still forming or accreting matter, the zeroth law of black hole mechanics states that the black hole's surface gravity is constant across the event horizon. The first law relates changes in the black hole's surface area, angular momentum, and charge to changes in its energy. The second law says that the surface area of a black hole never decreases on its own. Finally, the third law says that the surface gravity of a black hole is never zero. These laws are mathematical analogs of the laws of thermodynamics.
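The quoted temperature and mass figures follow from the standard Hawking temperature formula T = ħc³/(8πGMk_B). A minimal Python sketch checking both:

    import math

    # Hawking temperature T = hbar * c^3 / (8 pi G M k_B).
    G, c = 6.674e-11, 2.998e8
    hbar, k_B = 1.055e-34, 1.381e-23
    M_sun = 1.989e30

    def hawking_temperature(mass_kg):
        return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

    print(hawking_temperature(M_sun) * 1e9)   # ~62 nanokelvins for 1 solar mass

    # Mass whose Hawking temperature equals the 2.7 K microwave background:
    M_27 = hbar * c**3 / (8 * math.pi * G * k_B * 2.7)
    print(M_27)   # ~4.5e22 kg, below the Moon's ~7.3e22 kg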
They are not equivalent, however, because according to general relativity without quantum mechanics, a black hole can never emit radiation, and thus its temperature must always be zero.: 11 Quantum mechanics predicts that a black hole will continuously emit thermal Hawking radiation, and therefore must always have a nonzero temperature. It also predicts that all black holes have entropy that scales with their surface area. When quantum mechanics is accounted for, the laws of black hole mechanics become equivalent to the classical laws of thermodynamics. However, these conclusions are derived without a complete theory of quantum gravity, although many potential theories do predict black holes having entropy and temperature. Thus, the true quantum nature of black hole thermodynamics continues to be debated.: 29 Observational evidence Millions of black holes of around 30 solar masses derived from stellar collapse are expected to exist in the Milky Way. Even a dwarf galaxy like Draco should have hundreds. Only a few of these have been detected. By nature, black holes do not themselves emit any electromagnetic radiation other than the hypothetical Hawking radiation, so astrophysicists searching for black holes must generally rely on indirect observations. The defining characteristic of a black hole is its event horizon. The horizon itself cannot be imaged, so all other possible explanations for these indirect observations must be considered and eliminated before concluding that a black hole has been observed.: 11 The Event Horizon Telescope (EHT) is a global system of radio telescopes capable of directly observing a black hole's shadow. The angular resolution of a telescope depends on its aperture and the wavelength it observes. Because the angular diameters of Sagittarius A* and Messier 87* in the sky are very small, a single telescope would need to be about the size of the Earth to clearly distinguish their horizons at radio wavelengths (a diffraction-limit estimate is sketched below). By combining data from several different radio telescopes around the world, the Event Horizon Telescope creates an effective aperture with a diameter the size of the Earth. The EHT team used imaging algorithms to compute the most probable image from the data in its observations of Sagittarius A* and M87*. Gravitational-wave interferometry can be used to detect merging black holes and other compact objects. In this method, a laser beam is split and sent down two long tunnel arms. The laser beams reflect off mirrors in the tunnels and converge at the intersection of the arms, cancelling each other out. However, when a gravitational wave passes, it warps spacetime, changing the lengths of the arms themselves. Since each laser beam is then travelling a slightly different distance, the beams no longer cancel out and produce a recognizable signal. Analysis of the signal can give scientists information about what caused the gravitational waves. Since gravitational waves are very weak, gravitational-wave observatories such as LIGO must have arms several kilometers long and must carefully control for terrestrial noise to be able to detect them. Since the first detection, announced in 2016, multiple gravitational-wave signals from black holes have been detected and analyzed. The proper motions of stars near the centre of the Milky Way provide strong observational evidence that these stars are orbiting a supermassive black hole. Since 1995, astronomers have tracked the motions of 90 stars orbiting an invisible object coincident with the radio source Sagittarius A*.
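The aperture argument can be checked against the diffraction limit θ ≈ 1.22λ/D. In the minimal Python sketch below, the 1.3 mm observing wavelength and the ~50 microarcsecond apparent shadow size are standard published figures rather than numbers stated above:

    import math

    # Diffraction-limit sketch: an Earth-sized aperture at millimetre
    # wavelengths barely resolves a shadow tens of microarcseconds across.
    wavelength = 1.3e-3      # metres; the EHT observes near 1.3 mm (assumed)
    earth_diameter = 1.27e7  # metres

    theta_rad = 1.22 * wavelength / earth_diameter
    theta_uas = theta_rad * (180 / math.pi) * 3600 * 1e6
    print(theta_uas)   # ~26 microarcseconds, comparable to the ~50 microarcsecond
                       # apparent size of the Sagittarius A* shadow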
In 1998, by fitting the motions of the stars to Keplerian orbits, astronomers were able to infer that a 2.6×10⁶ M☉ object must be contained within a radius of 0.02 light-years. Since then, one of the stars—called S2—has completed a full orbit. From the orbital data, astronomers were able to refine the calculation of the mass of Sagittarius A* to 4.3×10⁶ M☉, contained within a radius of less than 0.002 light-years (the Keplerian estimate is sketched below). This upper-limit radius is still larger than the Schwarzschild radius for the estimated mass, so the combination does not prove Sagittarius A* is a black hole. Nevertheless, these observations strongly suggest that the central object is a supermassive black hole, as there are no other plausible scenarios for confining so much invisible mass into such a small volume. Additionally, there is some observational evidence that this object might possess an event horizon, a feature unique to black holes. The Event Horizon Telescope image of Sagittarius A*, released in 2022, provided further confirmation that it is indeed a black hole. X-ray binaries are binary systems that emit a majority of their radiation in the X-ray part of the electromagnetic spectrum. These X-ray emissions result when a compact object accretes matter from an ordinary star. The presence of an ordinary star in such a system provides an opportunity for studying the central object and determining whether it might be a black hole. By measuring the orbital period of the binary, the distance to the binary from Earth, and the mass of the companion star, scientists can estimate the mass of the compact object. The Tolman–Oppenheimer–Volkoff (TOV) limit dictates the largest mass a nonrotating neutron star can have, and is estimated to be about two solar masses. While a rotating neutron star can be slightly more massive, if the compact object is much more massive than the TOV limit, it cannot be a neutron star and is generally expected to be a black hole. The first strong candidate for a black hole, Cygnus X-1, was discovered in this way by Charles Thomas Bolton, Louise Webster, and Paul Murdin in 1972. Observations of the rotational broadening of the optical star, reported in 1986, led to a compact-object mass estimate of 16 solar masses, with 7 solar masses as the lower bound. In 2011, this estimate was updated to 14.1±1.0 M☉ for the black hole and 19.2±1.9 M☉ for the optical stellar companion. X-ray binaries can be categorized as either low-mass or high-mass; this classification is based on the mass of the companion star, not the compact object itself. In a class of X-ray binaries called soft X-ray transients, the companion star is of relatively low mass, allowing for more accurate estimates of the black hole mass. These systems actively emit X-rays for only several months once every 10–50 years. During the period of low X-ray emission, called quiescence, the accretion disk is extremely faint, allowing detailed observation of the companion star. Numerous black hole candidates have been measured by this method. Black holes are also sometimes found in binaries with other compact objects, such as white dwarfs, neutron stars, and other black holes. The centre of nearly every galaxy contains a supermassive black hole. The close observational correlation between the mass of this hole and the velocity dispersion of the host galaxy's bulge, known as the M–sigma relation, strongly suggests a connection between the formation of the black hole and that of the galaxy itself.
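The mass inference works through Kepler's third law, M = 4π²a³/(GT²). The Python sketch below uses approximate published orbital elements for S2 (semi-major axis ~1030 AU, period ~16 years), which are illustrative assumptions rather than values from the text:

    import math

    # Kepler's third law: enclosed mass from one orbiting star.
    G = 6.674e-11
    AU = 1.496e11      # metres
    year = 3.156e7     # seconds
    M_sun = 1.989e30   # kg

    a = 1030 * AU      # S2 semi-major axis (approximate, assumed)
    T = 16.0 * year    # S2 orbital period (approximate, assumed)

    mass = 4 * math.pi**2 * a**3 / (G * T**2)
    print(mass / M_sun / 1e6)   # ~4e6 solar masses, consistent with Sagittarius A*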
Astronomers use the term active galaxy to describe galaxies with unusual characteristics, such as unusual spectral line emission and very strong radio emission. Theoretical and observational studies have shown that the high levels of activity in the centers of these galaxies, regions called active galactic nuclei (AGN), may be explained by accretion onto supermassive black holes. These AGN consist of a central black hole that may be millions or billions of times more massive than the Sun, a disk of interstellar gas and dust called an accretion disk, and two jets perpendicular to the accretion disk. Although supermassive black holes are expected to be found in most AGN, only some galaxies' nuclei have been studied carefully enough to both identify and measure the actual masses of the central supermassive black hole candidates. Some of the most notable galaxies with supermassive black hole candidates include the Andromeda Galaxy, Messier 32, Messier 87, the Sombrero Galaxy, and the Milky Way itself. Another way black holes can be detected is through observation of effects caused by their strong gravitational field. One such effect is gravitational lensing: the deformation of spacetime around a massive object causes light rays to be deflected, making objects behind it appear distorted. When the lensing object is a black hole, this effect can be strong enough to create multiple images of a star or other luminous source. However, the distance between the lensed images may be too small for contemporary telescopes to resolve—this phenomenon is called microlensing. Instead of seeing two images of a lensed star, astronomers see the star brighten slightly as the black hole moves towards the line of sight between the star and Earth and then return to its normal luminosity as the black hole moves away (the relevant angular scale is estimated in the sketch below). The turn of the millennium saw the first three candidate detections of black holes in this way, and in January 2022, astronomers reported the first confirmed detection of a microlensing event from an isolated black hole. This was also the first determination of an isolated black hole's mass: 7.1±1.3 M☉. Alternatives While there is a strong case for supermassive black holes, the model for stellar-mass black holes assumes an upper limit for the mass of a neutron star: objects observed to have more mass are assumed to be black holes. However, the properties of extremely dense matter are poorly understood. New exotic phases of matter could allow other kinds of massive objects. Quark stars would be made up of quark matter and supported by quark degeneracy pressure, a form of degeneracy pressure even stronger than neutron degeneracy pressure. This would halt gravitational collapse at a higher mass than for a neutron star. Still more exotic stars called electroweak stars would convert quarks in their cores into leptons, providing additional pressure to stop the star from collapsing. If, as some extensions of the Standard Model posit, quarks and leptons are made up of even smaller fundamental particles called preons, a very compact star could be supported by preon degeneracy pressure.
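The angular scale involved follows from the Einstein radius, θ_E = √(4GM/c² · d_LS/(d_L·d_S)). In the minimal Python sketch below, the 7.1 M☉ lens mass is from the text, while the lens and source distances are assumed, illustrative values:

    import math

    # Einstein-radius estimate for a stellar-mass microlens.
    G, c = 6.674e-11, 2.998e8
    M_sun, kpc = 1.989e30, 3.086e19

    M = 7.1 * M_sun            # lens mass from the reported detection
    d_L = 1.6 * kpc            # lens distance (assumed)
    d_S = 8.0 * kpc            # source distance (assumed)
    d_LS = d_S - d_L

    theta_E = math.sqrt(4 * G * M / c**2 * d_LS / (d_L * d_S))
    print(theta_E * 206265 * 1e3)   # ~5 milliarcseconds: image separations this
                                    # small are unresolved, so only the temporary
                                    # brightening of the source is observed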
While none of these hypothetical models can explain all of the observations of stellar black hole candidates, a Q star is the only alternative that could significantly exceed the mass limit for neutron stars and thus provide an alternative for supermassive black holes.: 12 A few theoretical objects have been conjectured to match observations of astronomical black hole candidates identically or near-identically, but to function via a different mechanism. A dark energy star would convert infalling matter into vacuum energy; this vacuum energy would be much larger than the vacuum energy of outside space, exerting outward pressure and preventing a singularity from forming. A black star would be gravitationally collapsing slowly enough that quantum effects would keep it just on the cusp of fully collapsing into a black hole. A gravastar would consist of a very thin shell and a dark-energy interior providing outward pressure to stop the collapse into a black hole or the formation of a singularity; it could even have another gravastar inside, called a 'nestar'. Open questions According to the no-hair theorem, a black hole is defined by only three parameters: its mass, charge, and angular momentum. This seems to mean that all other information about the matter that went into forming the black hole is lost, as there is no way to determine anything about the black hole from outside other than those three parameters. When black holes were thought to persist forever, this information loss was not problematic, as the information could be thought of as existing inside the black hole. However, black holes slowly evaporate by emitting Hawking radiation. This radiation does not appear to carry any additional information about the matter that formed the black hole, meaning that this information is seemingly gone forever. This is called the black hole information paradox. Theoretical studies analyzing the paradox have led to both further paradoxes and new ideas about the intersection of quantum mechanics and general relativity. While there is no consensus on the resolution of the paradox, work on the problem is expected to be important for a theory of quantum gravity.: 126 Observations of faraway galaxies have found that ultraluminous quasars, powered by supermassive black holes, existed in the early universe as far back as redshift z ≥ 7. These black holes have been assumed to be the products of the gravitational collapse of large Population III stars. However, such stellar remnants were not massive enough to produce the quasars observed at early times without accreting beyond the Eddington limit, the theoretical maximum rate of black hole accretion. Physicists have suggested a variety of different mechanisms by which these supermassive black holes may have formed. It has been proposed that smaller black holes may have undergone mergers to produce the observed supermassive black holes. It is also possible that they were seeded by direct-collapse black holes, in which a large cloud of hot gas avoids the fragmentation that would lead to multiple stars, due to low angular momentum or heating from a nearby galaxy. Given the right circumstances, a single supermassive star forms and collapses directly into a black hole without undergoing typical stellar evolution. Additionally, these supermassive black holes in the early universe may be high-mass primordial black holes, which could have accreted further matter in the centers of galaxies.
Finally, certain mechanisms allow black holes to grow faster than the theoretical Eddington limit, such as dense gas in the accretion disk limiting the outward radiation pressure that would otherwise cap accretion. However, the formation of bipolar jets can prevent super-Eddington rates. In fiction Black holes have been portrayed in science fiction in a variety of ways. Even before the advent of the term itself, objects with characteristics of black holes appeared in stories such as the 1928 novel The Skylark of Space with its "black Sun" and the "hole in space" in the 1935 short story Starship Invincible. As black holes grew in public recognition in the 1960s and 1970s, they began to be featured in films as well as novels, such as Disney's The Black Hole. Black holes have also been used in works of the 21st century, such as Christopher Nolan's science fiction epic Interstellar. Authors and screenwriters have exploited the relativistic effects of black holes, particularly gravitational time dilation. For example, Interstellar features a black hole planet with a time dilation factor of over 60,000:1, while the 1977 novel Gateway depicts a spaceship that, from the perspective of an outside observer, approaches but never crosses the event horizon of a black hole due to time dilation effects. Black holes have also been appropriated as wormholes or other methods of faster-than-light travel, such as in the 1974 novel The Forever War, where a network of black holes is used for interstellar travel. Additionally, black holes can feature as hazards to spacefarers and planets: a black hole threatens a deep-space outpost in the 1978 short story The Black Hole Passes, and a binary black hole dangerously alters the orbit of a planet in the 2018 Netflix reboot of Lost in Space.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Modelica] | [TOKENS: 1829] |
Contents Modelica Modelica is an object-oriented, declarative, multi-domain modeling language for component-oriented modeling of complex systems, e.g., systems containing mechanical, electrical, electronic, hydraulic, thermal, control, electric power or process-oriented subcomponents. The free Modelica language is developed by the non-profit Modelica Association. The Modelica Association also develops the free Modelica Standard Library, which contains about 1400 generic model components and 1200 functions in various domains, as of version 4.0.0. Characteristics While Modelica resembles object-oriented programming languages, such as C++ or Java, it differs in two important respects. First, Modelica is a modeling language rather than a conventional programming language. Modelica classes are not compiled in the usual sense, but are translated into objects which are then exercised by a simulation engine. The simulation engine is not specified by the language, although certain required capabilities are outlined. Second, although classes may contain algorithmic components similar to statements or blocks in programming languages, their primary content is a set of equations. In contrast to a typical assignment statement, such as x := 2 + y, where the left-hand side is assigned a value calculated from the expression on the right-hand side, an equation may have expressions on both its right- and left-hand sides, for example x + y = 3 * z. Equations do not describe assignment but equality. In Modelica terms, equations have no pre-defined causality. The simulation engine may (and usually must) manipulate the equations symbolically to determine their order of execution and which components in the equation are inputs and which are outputs. History The Modelica design effort was initiated in September 1996 by Hilding Elmqvist. The goal was to develop an object-oriented language for modeling of technical systems in order to reuse and exchange dynamic system models in a standardized format. Modelica 1.0 is based on the PhD thesis of Hilding Elmqvist and on experience with the modeling languages Allan, Dymola, NMF, ObjectMath, Omola, SIDOPS+, and Smile. Hilding Elmqvist is the key architect of Modelica, but many other people have contributed as well (see appendix E in the Modelica specification). In September 1997, version 1.0 of the Modelica specification was released; it was the basis for a prototype implementation within the commercial Dymola software system. In 2000, the non-profit Modelica Association was formed to manage the continually evolving Modelica language and the development of the free Modelica Standard Library. In the same year, the use of Modelica in industrial applications started. Implementations Commercial front-ends for Modelica include AMESim from the French company Imagine SA (now part of Siemens Digital Industries Software), Dymola from the Swedish company Dynasim AB (now part of Dassault Systèmes), Wolfram SystemModeler (formerly MathModelica) from the Swedish company Wolfram MathCore AB (now part of Wolfram Research), SimulationX from the German company ESI ITI GmbH, MapleSim from the Canadian company Maplesoft, JModelica.org (open source, discontinued) and Modelon Impact from the Swedish company Modelon AB, and CATIA Systems from Dassault Systèmes (CATIA is one of the major CAD systems).
OpenModelica is an open-source Modelica-based modeling and simulation environment intended for industrial and academic usage. Its long-term development is supported by a non-profit organization, the Open Source Modelica Consortium (OSMC). The goal of the OpenModelica effort is to create a comprehensive open-source Modelica modeling, compilation and simulation environment based on free software distributed in binary and source code form for research, teaching, and industrial usage. The free simulation environment Scicos uses a subset of Modelica for component modeling. Support for a larger part of the Modelica language is currently under development. Nevertheless, there is still some incompatibility and diverging interpretation among the different tools concerning the Modelica language. Examples A very simple example is a first order system, ẋ = −c·x with x(0) = 10; a model of this kind is given in the first sketch below. A second example calculates the second derivative of a trigonometric function using OMShell. Interesting things to note about these examples are the 'parameter' qualifier, which indicates that a given variable is time-invariant, and the 'der' operator, which represents (symbolically) the time derivative of a variable. Also worth noting are the documentation strings that can be associated with declarations and equations. The main application area of Modelica is the modeling of physical systems. The most basic structuring concepts can be illustrated with simple examples from the electrical domain. Modelica has the four built-in types Real, Integer, Boolean and String. Typically, user-defined types are derived from these to associate a physical quantity, unit, nominal values, and other attributes (see the sketches below). The interaction of a component with other components is defined by physical ports, called connectors; an electrical pin, for example, is defined as a connector with a potential and a current (second sketch below). When drawing connection lines between ports, the meaning is that corresponding connector variables without the "flow" prefix are identical (here: "v") and that corresponding connector variables with the "flow" prefix (here: "i") are defined by a zero-sum equation (the sum of all corresponding "flow" variables is zero). The motivation is to automatically fulfill the relevant balance equations at the infinitesimally small connection point. A basic model component is defined by a model and contains equations that describe the relationship between the connector variables in a declarative form (i.e., without specifying the calculation order); a capacitor model of this kind is given in the third sketch below. The goal is that a connected set of model components leads to a set of differential, algebraic and discrete equations in which the number of unknowns and the number of equations is identical. In Modelica, this is achieved by requiring so-called balanced models. The full rules for defining balanced models are rather complex and can be found in section 4.7 of the language specification. However, for most cases a simple rule suffices, which counts variables and equations the same way as most simulation tools do. Note that standard input connectors (such as RealInput or IntegerInput) do not contribute to the count of variables, since no new variables are defined inside them. The reason for this rule can be understood by thinking of the capacitor defined above. Each of its pins contains a flow variable, i.e., a current. When the model is checked, its pins are connected to nothing.
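The code fragments referenced in this section did not survive extraction. The listing below is a minimal reconstruction consistent with the surrounding prose and with standard Modelica practice; identifier names and numeric values are assumptions, not the article's verbatim examples:

    // First sketch: a first order system, der(x) = -c*x with x(0) = 10.
    model FirstOrder
      parameter Real c = 1 "Time constant";
      Real x(start = 10, fixed = true) "State variable";
    equation
      der(x) = -c*x "A first order differential equation";
    end FirstOrder;

    // User-defined types deriving from Real to carry quantity and unit:
    type Voltage = Real(quantity = "ElectricPotential", unit = "V");
    type Current = Real(quantity = "ElectricCurrent", unit = "A");
    type Capacitance = Real(quantity = "Capacitance", unit = "F", min = 0);

    // Second sketch: an electrical pin as a connector; "v" is equalized
    // across a connection, while the flow variable "i" sums to zero.
    connector Pin "Electrical pin"
      Voltage v "Potential at the pin";
      flow Current i "Current flowing into the component";
    end Pin;

    // Third sketch: a basic component written declaratively as equations,
    // with no prescribed calculation order.
    model Capacitor
      parameter Capacitance C = 1e-6 "Capacitance";
      Voltage u "Voltage drop between pin_p and pin_n";
      Pin pin_p, pin_n;
    equation
      0 = pin_p.i + pin_n.i;
      u = pin_p.v - pin_n.v;
      C*der(u) = pin_p.i;
    end Capacitor;

Note how der(x) appears inside an equation section with no assignment direction, and how the flow prefix marks exactly the zero-sum variables discussed above.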
This corresponds to setting an equation pin.i = 0 for each pin. That is why an equation must be added for each flow variable. Obviously the example can be extended to other cases in which other kinds of flow variables are involved (e.g., forces, torques, etc.). When our capacitor is connected to another (balanced) model through one of its pins, a connection equation is generated that substitutes for the two i = 0 equations of the pins being connected. Since the connection equation corresponds to two scalar equations, the connection operation leaves the larger model (constituted by our Capacitor and the model it is connected to) balanced. The Capacitor model above is balanced, and checking it with OpenModelica confirms that its numbers of unknowns and equations match. Another example, containing both input connectors and physical connectors, is the SignalVoltage component from the Modelica Standard Library; it is likewise balanced, as a check with OpenModelica confirms. A hierarchical model is built up from basic models by instantiating basic models, providing suitable values for the model parameters, and connecting model connectors. A typical example is an electrical circuit (a sketch of such a hierarchical model follows this paragraph). Via the language element annotation(...), definitions can be added to a model that have no influence on a simulation. Annotations are used to define graphical layout, documentation and version information. A basic set of graphical annotations is standardized to ensure that the graphical appearance and layout of models is the same across different Modelica tools. Applications Modelica is designed to be domain neutral and, as a result, is used in a wide variety of applications, such as fluid systems (for example, steam power generation, hydraulics, etc.), automotive applications (especially powertrain) and mechanical systems (for example, multi-body systems, mechatronics, etc.). In the automotive sector, many of the major automotive OEMs use Modelica. These include Ford, General Motors, Toyota, BMW, and Daimler. Modelica is also increasingly used for the simulation of thermo-fluid and energy systems. The characteristics of Modelica (acausal, object-oriented, domain neutral) make it well suited to system-level simulation, a domain where Modelica is now well established.
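A minimal sketch of such a hierarchical circuit model, assuming Resistor, VoltageSource, and Ground components defined analogously to the Capacitor above (all names and parameter values are illustrative, not the article's circuit):

    // Hierarchical model: component instances plus connect equations.
    // Resistor, VoltageSource, and Ground are assumed to be defined with
    // the same Pin connector as the Capacitor sketch above.
    model SimpleCircuit
      Resistor R1(R = 10);
      Capacitor C1(C = 0.01);
      VoltageSource V0(V = 220);
      Ground ground;
    equation
      connect(V0.pin_p, R1.pin_p);
      connect(R1.pin_n, C1.pin_p);
      connect(C1.pin_n, V0.pin_n);
      connect(V0.pin_n, ground.pin);
    end SimpleCircuit;

Each connect(...) clause expands into the equality and zero-sum equations described above, which is how the balanced-model bookkeeping is preserved as components are composed.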
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/United_States#cite_note-82] | [TOKENS: 17273] |
Contents United States The United States of America (USA), also known as the United States (U.S.) or America, is a country primarily located in North America. It is a federal republic of 50 states and a federal capital district, Washington, D.C. The 48 contiguous states border Canada to the north and Mexico to the south, with the semi-exclave of Alaska in the northwest and the archipelago of Hawaii in the Pacific Ocean. The United States also asserts sovereignty over five major island territories and various uninhabited islands in Oceania and the Caribbean.[j] It is a megadiverse country, with the world's third-largest land area[c] and third-largest population, exceeding 341 million.[k] Paleo-Indians first migrated from North Asia to North America at least 15,000 years ago, and formed various civilizations. Spanish colonization established Spanish Florida in 1513, the first European colony in what is now the continental United States. British colonization followed with the 1607 settlement of Virginia, the first of the Thirteen Colonies. Enslavement of Africans was practiced in all colonies by 1770 and supplied most of the labor for the Southern Colonies' plantation economy. Clashes with the British Crown began as a civil protest over the illegality of taxation without representation in Parliament and the denial of other English rights. They evolved into the American Revolution, which led to the Declaration of Independence and a society based on universal rights. Victory in the 1775–1783 Revolutionary War brought international recognition of U.S. sovereignty and fueled westward expansion, further dispossessing native inhabitants. As more states were admitted, a North–South division over slavery led the Confederate States of America to declare secession and fight the Union in the 1861–1865 American Civil War. With the United States' victory and reunification, slavery was abolished nationally. By the late 19th century, the U.S. economy outpaced the French, German and British economies combined. As of 1900, the country had established itself as a great power, a status solidified after its involvement in World War I. Following Japan's attack on Pearl Harbor in 1941, the U.S. entered World War II. Its aftermath left the U.S. and the Soviet Union as rival superpowers, competing for ideological dominance and international influence during the Cold War. The Soviet Union's collapse in 1991 ended the Cold War, leaving the U.S. as the world's sole superpower. The U.S. federal government is a representative democracy with a president and a constitution that grants separation of powers under three branches: legislative, executive, and judicial. The United States Congress is a bicameral national legislature composed of the House of Representatives (a lower house based on population) and the Senate (an upper house based on equal representation for each state). Federalism grants substantial autonomy to the 50 states. In addition, 574 Native American tribes have sovereignty rights, and there are 326 Native American reservations. Since the 1850s, the Democratic and Republican parties have dominated American politics. American ideals and values are based on a democratic tradition inspired by the American Enlightenment movement. A developed country, the U.S. ranks high in economic competitiveness, innovation, and higher education. Accounting for over a quarter of nominal global GDP, its economy has been the world's largest since about 1890. 
It is the wealthiest country, with the highest disposable household income per capita among OECD members, though its wealth inequality is highly pronounced. Shaped by centuries of immigration, the culture of the U.S. is diverse and globally influential. Making up more than a third of global military spending, the country has one of the strongest armed forces and is a designated nuclear state. A member of numerous international organizations, the U.S. plays a major role in global political, cultural, economic, and military affairs. Etymology Documented use of the phrase "United States of America" dates back to January 2, 1776. On that day, Stephen Moylan, a Continental Army aide to General George Washington, wrote a letter to Joseph Reed, Washington's aide-de-camp, seeking to go "with full and ample powers from the United States of America to Spain" to seek assistance in the Revolutionary War effort. The first known public usage is an anonymous essay published in the Williamsburg newspaper The Virginia Gazette on April 6, 1776. Sometime on or after June 11, 1776, Thomas Jefferson wrote "United States of America" in a rough draft of the Declaration of Independence, which was adopted by the Second Continental Congress on July 4, 1776. The term "United States" and its initialism "U.S.", used as nouns or as adjectives in English, are common short names for the country. The initialism "USA", a noun, is also common. "United States" and "U.S." are the established terms throughout the U.S. federal government, with prescribed rules.[l] "The States" is an established colloquial shortening of the name, used particularly from abroad; "stateside" is the corresponding adjective or adverb. "America" is the feminine form of the first word of Americus Vesputius, the Latinized name of Italian explorer Amerigo Vespucci (1454–1512);[m] it was first used as a place name by the German cartographers Martin Waldseemüller and Matthias Ringmann in 1507.[n] Vespucci first proposed that the West Indies discovered by Christopher Columbus in 1492 were part of a previously unknown landmass and not among the Indies at the eastern limit of Asia. In English, the term "America" usually does not refer to topics unrelated to the United States, despite the usage of "the Americas" to describe the totality of the continents of North and South America. History The first inhabitants of North America migrated from Siberia approximately 15,000 years ago, either across the Bering land bridge or along the now-submerged Ice Age coastline. Small isolated groups of hunter-gatherers are said to have migrated alongside herds of large herbivores far into Alaska, with ice-free corridors developing along the Pacific coast and valleys of North America in c. 16,500 – c. 13,500 BCE (c. 18,500 – c. 15,500 BP). The Clovis culture, which appeared around 11,000 BCE, is believed to be the first widespread culture in the Americas. Over time, Indigenous North American cultures grew increasingly sophisticated, and some, such as the Mississippian culture, developed agriculture, architecture, and complex societies. In the post-archaic period, the Mississippian cultures were located in the midwestern, eastern, and southern regions, and the Algonquian in the Great Lakes region and along the Eastern Seaboard, while the Hohokam culture and Ancestral Puebloans inhabited the Southwest. Native population estimates of what is now the United States before the arrival of European colonizers range from around 500,000 to nearly 10 million. 
Christopher Columbus began exploring the Caribbean for Spain in 1492, leading to Spanish-speaking settlements and missions from what are now Puerto Rico and Florida to New Mexico and California. The first Spanish colony in the present-day continental United States was Spanish Florida, chartered in 1513. After several settlements there failed due to starvation and disease, Spain's first permanent town, Saint Augustine, was founded in 1565. France established its own settlements in French Florida in 1562, but they were either abandoned (Charlesfort, 1578) or destroyed by Spanish raids (Fort Caroline, 1565). Permanent French settlements were founded much later along the Great Lakes (Fort Detroit, 1701), the Mississippi River (Saint Louis, 1764) and especially the Gulf of Mexico (New Orleans, 1718). Early European colonies also included the thriving Dutch colony of New Netherland (settled 1626, present-day New York) and the small Swedish colony of New Sweden (settled 1638 in what became Delaware). British colonization of the East Coast began with the Virginia Colony (1607) and the Plymouth Colony (Massachusetts, 1620). The Mayflower Compact in Massachusetts and the Fundamental Orders of Connecticut established precedents for local representative self-governance and constitutionalism that would develop throughout the American colonies. While European settlers in what is now the United States experienced conflicts with Native Americans, they also engaged in trade, exchanging European tools for food and animal pelts.[o] Relations ranged from close cooperation to warfare and massacres. The colonial authorities often pursued policies that forced Native Americans to adopt European lifestyles, including conversion to Christianity. Along the eastern seaboard, settlers trafficked Africans through the Atlantic slave trade, largely to provide manual labor on plantations. The original Thirteen Colonies[p] that would later found the United States were administered as possessions of the British Empire by Crown-appointed governors, though local governments held elections open to most white male property owners. The colonial population grew rapidly from Maine to Georgia, eclipsing Native American populations; by the 1770s, the natural increase of the population was such that only a small minority of Americans had been born overseas. The colonies' distance from Britain facilitated the entrenchment of self-governance, and the First Great Awakening, a series of Christian revivals, fueled colonial interest in guaranteed religious liberty. Following its victory in the French and Indian War, Britain began to assert greater control over local affairs in the Thirteen Colonies, resulting in growing political resistance. One of the primary grievances of the colonists was the denial of their rights as Englishmen, particularly the right to representation in the British government that taxed them. To demonstrate their dissatisfaction and resolve, the First Continental Congress met in 1774 and passed the Continental Association, a colonial boycott of British goods enforced by local "committees of safety" that proved effective. The British attempt to then disarm the colonists resulted in the 1775 Battles of Lexington and Concord, igniting the American Revolutionary War. At the Second Continental Congress, the colonies appointed George Washington commander-in-chief of the Continental Army, and created a committee that named Thomas Jefferson to draft the Declaration of Independence.
Two days after the Second Continental Congress passed the Lee Resolution to create an independent, sovereign nation, the Declaration was adopted on July 4, 1776. The political values of the American Revolution evolved from an armed rebellion demanding reform within an empire to a revolution that created a new social and governing system founded on the defense of liberty and the protection of inalienable natural rights; sovereignty of the people; republicanism over monarchy, aristocracy, and other hereditary political power; civic virtue; and an intolerance of political corruption. The Founding Fathers of the United States, who included Washington, Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, James Madison, Thomas Paine, and many others, were inspired by Classical, Renaissance, and Enlightenment philosophies and ideas. Though in practical effect since its drafting in 1777, the Articles of Confederation were ratified in 1781 and formally established a decentralized government that operated until 1789. After the British surrender at the siege of Yorktown in 1781, American sovereignty was internationally recognized by the Treaty of Paris (1783), through which the U.S. gained territory stretching west to the Mississippi River, north to present-day Canada, and south to Spanish Florida. The Northwest Ordinance (1787) established the precedent by which the country's territory would expand with the admission of new states, rather than the expansion of existing states. The U.S. Constitution was drafted at the 1787 Constitutional Convention to overcome the limitations of the Articles. It went into effect in 1789, creating a federal republic governed by three separate branches that together formed a system of checks and balances. George Washington was elected the country's first president under the Constitution, and the Bill of Rights was adopted in 1791 to allay skeptics' concerns about the power of the more centralized government. Washington's resignation as commander-in-chief after the Revolutionary War and his later refusal to run for a third term as the country's first president established a precedent for the supremacy of civil authority in the United States and the peaceful transfer of power. In the late 18th century, American settlers began to expand westward in larger numbers, many with a sense of manifest destiny. The Louisiana Purchase of 1803 from France nearly doubled the territory of the United States. Lingering issues with Britain remained, leading to the War of 1812, which was fought to a draw. Spain ceded Florida and its Gulf Coast territory in 1819. The Missouri Compromise of 1820, which admitted Missouri as a slave state and Maine as a free state, attempted to balance the desire of northern states to prevent the expansion of slavery into new territories with that of southern states to extend it there. Primarily, the compromise prohibited slavery in all other lands of the Louisiana Purchase north of the 36°30′ parallel. As Americans expanded further into territory inhabited by Native Americans, the federal government implemented policies of Indian removal or assimilation. The most significant such legislation was the Indian Removal Act of 1830, a key policy of President Andrew Jackson. It resulted in the Trail of Tears (1830–1850), in which an estimated 60,000 Native Americans living east of the Mississippi River were forcibly removed and displaced to lands far to the west, causing 13,200 to 16,700 deaths along the forced march.
Settler expansion as well as this influx of Indigenous peoples from the East resulted in the American Indian Wars west of the Mississippi. During the colonial period, slavery became legal in all the Thirteen Colonies, but by 1770 it provided the main labor force in the large-scale, agriculture-dependent economies of the Southern Colonies from Maryland to Georgia. The practice began to be significantly questioned during the American Revolution, and, spurred by an active abolitionist movement that had reemerged in the 1830s, states in the North enacted laws to prohibit slavery within their boundaries. At the same time, support for slavery had strengthened in Southern states, with the widespread use of inventions such as the cotton gin (1793) having made slavery immensely profitable for Southern elites. The United States annexed the Republic of Texas in 1845, and the 1846 Oregon Treaty led to U.S. control of the present-day American Northwest. A dispute with Mexico over Texas led to the Mexican–American War (1846–1848). After the victory of the U.S., Mexico recognized U.S. sovereignty over Texas, New Mexico, and California in the 1848 Mexican Cession; the cession's lands also included the future states of Nevada, Colorado, and Utah. The California gold rush of 1848–1849 spurred a huge migration of white settlers to the Pacific coast, leading to even more confrontations with Native populations. One of the most violent episodes, the California genocide of thousands of Native inhabitants, lasted into the mid-1870s. Additional western territories and states were created. Throughout the 1850s, the sectional conflict regarding slavery was further inflamed by national legislation in the U.S. Congress and decisions of the Supreme Court. In Congress, the Fugitive Slave Act of 1850 mandated that slaves taking refuge in non-slave states be forcibly returned to their owners in the South, while the Kansas–Nebraska Act of 1854 effectively gutted the anti-slavery requirements of the Missouri Compromise. In its Dred Scott decision of 1857, the Supreme Court ruled against a slave brought into non-slave territory, simultaneously declaring the entire Missouri Compromise to be unconstitutional. These and other events exacerbated tensions between North and South that would culminate in the American Civil War (1861–1865). Beginning with South Carolina, 11 slave-state governments voted to secede from the United States in 1861, joining to create the Confederate States of America. All other state governments remained loyal to the Union.[q] War broke out in April 1861 after the Confederacy bombarded Fort Sumter. Following the Emancipation Proclamation on January 1, 1863, many freed slaves joined the Union army. The war began to turn in the Union's favor following the 1863 Siege of Vicksburg and Battle of Gettysburg, and the Confederates surrendered in 1865 after the Union's victory in the Battle of Appomattox Court House. Efforts toward reconstruction in the secessionist South had begun as early as 1862, but it was only after President Lincoln's assassination that the three Reconstruction Amendments to the Constitution were ratified to protect civil rights. The amendments codified nationally the abolition of slavery and involuntary servitude except as punishment for crimes, promised equal protection under the law for all persons, and prohibited discrimination on the basis of race or previous enslavement. As a result, African Americans took an active political role in ex-Confederate states in the decade following the Civil War.
The former Confederate states were readmitted to the Union, beginning with Tennessee in 1866 and ending with Georgia in 1870. National infrastructure, including transcontinental telegraph and railroads, spurred growth in the American frontier. This growth was accelerated by the Homestead Acts, through which nearly 10 percent of the total land area of the United States was given away free to some 1.6 million homesteaders. From 1865 through 1917, an unprecedented stream of immigrants arrived in the United States, including 24.4 million from Europe. Most came through the Port of New York, and New York City and other large cities on the East Coast became home to large Jewish, Irish, and Italian populations. Many Northern Europeans as well as significant numbers of Germans and other Central Europeans moved to the Midwest. At the same time, about one million French Canadians migrated from Quebec to New England. During the Great Migration, millions of African Americans left the rural South for urban areas in the North. Alaska was purchased from Russia in 1867. The Compromise of 1877 is generally considered the end of the Reconstruction era, as it resolved the electoral crisis following the 1876 presidential election and led President Rutherford B. Hayes to reduce the role of federal troops in the South. Immediately, the Redeemers began evicting the Carpetbaggers and quickly regained local control of Southern politics in the name of white supremacy. African Americans endured a period of heightened, overt racism following Reconstruction, a time often considered the nadir of American race relations. A series of Supreme Court decisions, including Plessy v. Ferguson, emptied the Fourteenth and Fifteenth Amendments of their force, allowing Jim Crow laws in the South to remain unchecked, sundown towns in the Midwest, and segregation in communities across the country, which would be reinforced in part by the policy of redlining later adopted by the federal Home Owners' Loan Corporation. An explosion of technological advancement, accompanied by the exploitation of cheap immigrant labor, led to rapid economic expansion during the Gilded Age of the late 19th century. It continued into the early 20th century, by which time the United States already outpaced the economies of Britain, France, and Germany combined. This fostered the amassing of power by a few prominent industrialists, largely through their formation of trusts and monopolies to prevent competition. Tycoons led the nation's expansion in the railroad, petroleum, and steel industries. The United States emerged as a pioneer of the automotive industry. These changes resulted in significant increases in economic inequality, slum conditions, and social unrest, creating the environment for labor unions and socialist movements to begin to flourish. This period eventually ended with the advent of the Progressive Era, which was characterized by significant economic and social reforms. Pro-American elements in Hawaii overthrew the Hawaiian monarchy; the islands were annexed in 1898. That same year, Puerto Rico, the Philippines, and Guam were ceded to the U.S. by Spain after the latter's defeat in the Spanish–American War. (The Philippines was granted full independence from the U.S. on July 4, 1946, following World War II. Puerto Rico and Guam have remained U.S. territories.) American Samoa was acquired by the United States in 1900 after the Second Samoan Civil War. The U.S. Virgin Islands were purchased from Denmark in 1917.
The United States entered World War I alongside the Allies in 1917, helping to turn the tide against the Central Powers. In 1920, a constitutional amendment granted nationwide women's suffrage. During the 1920s and 1930s, radio for mass communication and early television transformed communications nationwide. The Wall Street Crash of 1929 triggered the Great Depression, to which President Franklin D. Roosevelt responded with the New Deal plan of "reform, recovery and relief", a series of unprecedented and sweeping recovery programs and employment relief projects combined with financial reforms and regulations. Initially neutral during World War II, the U.S. began supplying war materiel to the Allies in March 1941 and entered the war in December after Japan's attack on Pearl Harbor. Agreeing to a "Europe first" policy, the U.S. concentrated its wartime efforts on Japan's allies Italy and Germany until their final defeat in May 1945. The U.S. developed the first nuclear weapons and used them against the Japanese cities of Hiroshima and Nagasaki in August 1945, ending the war. The United States was one of the "Four Policemen" who met to plan the post-war world, alongside the United Kingdom, the Soviet Union, and China. The U.S. emerged relatively unscathed from the war, with even greater economic power and international political influence. The end of World War II in 1945 left the U.S. and the Soviet Union as superpowers, each with its own political, military, and economic sphere of influence. Geopolitical tensions between the two superpowers soon led to the Cold War. The U.S. implemented a policy of containment intended to limit the Soviet Union's sphere of influence; engaged in regime change against governments perceived to be aligned with the Soviets; and prevailed in the Space Race, which culminated with the first crewed Moon landing in 1969. Domestically, the U.S. experienced economic growth, urbanization, and population growth following World War II. The civil rights movement emerged, with Martin Luther King Jr. becoming a prominent leader in the early 1960s. The Great Society plan of President Lyndon B. Johnson's administration resulted in groundbreaking and broad-reaching laws, policies and a constitutional amendment to counteract some of the worst effects of lingering institutional racism. The counterculture movement in the U.S. brought significant social changes, including the liberalization of attitudes toward recreational drug use and sexuality. It also encouraged open defiance of the military draft (leading to the end of conscription in 1973) and wide opposition to U.S. intervention in Vietnam, from which the U.S. completely withdrew in 1975. A societal shift in the roles of women was significantly responsible for the large increase in female paid labor participation starting in the 1970s, and by 1985 the majority of American women aged 16 and older were employed. The fall of communism and the dissolution of the Soviet Union from 1989 to 1991 marked the end of the Cold War and left the United States as the world's sole superpower. This cemented the United States' global influence, reinforcing the concept of the "American Century" as the U.S. dominated international political, cultural, economic, and military affairs. The 1990s saw the longest recorded economic expansion in American history, a dramatic decline in U.S. crime rates, and advances in technology.
Throughout this decade, technological innovations such as the World Wide Web, the evolution of the Pentium microprocessor in accordance with Moore's law, rechargeable lithium-ion batteries, the first gene therapy trial, and cloning either emerged in the U.S. or were improved upon there. The Human Genome Project was formally launched in 1990, while Nasdaq became the first stock market in the United States to trade online in 1998. In the Gulf War of 1991, an American-led international coalition of states expelled an Iraqi invasion force that had occupied neighboring Kuwait. The September 11 attacks on the United States in 2001 by the pan-Islamist militant organization al-Qaeda led to the war on terror and subsequent military interventions in Afghanistan and in Iraq. The U.S. housing bubble culminated in 2007 with the Great Recession, the largest economic contraction since the Great Depression. In the 2010s and early 2020s, the United States has experienced increased political polarization and democratic backsliding. The country's polarization was violently reflected in the January 2021 Capitol attack, when a mob of insurrectionists entered the U.S. Capitol and sought to prevent the peaceful transfer of power in an attempted self-coup d'état. Geography The United States is the world's third-largest country by total area behind Russia and Canada.[c] The 48 contiguous states and the District of Columbia have a combined area of 3,119,885 square miles (8,080,470 km2). In 2021, the United States had 8% of the Earth's permanent meadows and pastures and 10% of its cropland. Starting in the east, the coastal plain of the Atlantic seaboard gives way to inland forests and rolling hills in the Piedmont plateau region. The Appalachian Mountains and the Adirondack Massif separate the East Coast from the Great Lakes and the grasslands of the Midwest. The Mississippi River System, the world's fourth-longest river system, runs predominantly north–south through the center of the country. The flat and fertile prairie of the Great Plains stretches to the west, interrupted by a highland region in the southeast. The Rocky Mountains, west of the Great Plains, extend north to south across the country, peaking at over 14,000 feet (4,300 m) in Colorado. The supervolcano underlying Yellowstone National Park in the Rocky Mountains, the Yellowstone Caldera, is the continent's largest volcanic feature. Farther west are the rocky Great Basin and the Chihuahuan, Sonoran, and Mojave deserts. In the northwest corner of Arizona, carved by the Colorado River, is the Grand Canyon, a steep-sided canyon and popular tourist destination known for its overwhelming visual size and intricate, colorful landscape. The Cascade and Sierra Nevada mountain ranges run close to the Pacific coast. The lowest and highest points in the contiguous United States are in the State of California, about 84 miles (135 km) apart. At an elevation of 20,310 feet (6,190.5 m), Alaska's Denali (also called Mount McKinley) is the highest peak in the country and on the continent. Active volcanoes in the U.S. are common throughout Alaska's Alexander and Aleutian Islands. Located entirely outside North America, the archipelago of Hawaii consists of volcanic islands, physiographically and ethnologically part of the Polynesian subregion of Oceania. In addition to its total land area, the United States has one of the world's largest marine exclusive economic zones spanning approximately 4.5 million square miles (11.7 million km2) of ocean. 
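The mile-to-kilometre figures in the geography paragraph above are simple unit conversions, reproducible from the standard factors 1 sq mi ≈ 2.589988 km² and 1 ft = 0.3048 m. The following short Python sketch is an editorial cross-check of the quoted values, not part of the source article; small differences reflect rounding in the source.

```python
# Editorial cross-check (not from the source) of the unit conversions quoted above.
SQ_MI_TO_KM2 = 2.589988  # square miles -> square kilometres
FT_TO_M = 0.3048         # feet -> metres

contiguous_sq_mi = 3_119_885  # 48 contiguous states plus D.C.
eez_sq_mi = 4.5e6             # marine exclusive economic zone
denali_ft = 20_310            # Denali's elevation

print(f"Contiguous area: {contiguous_sq_mi * SQ_MI_TO_KM2:,.0f} km^2 (quoted: 8,080,470)")
print(f"EEZ: {eez_sq_mi * SQ_MI_TO_KM2 / 1e6:.1f} million km^2 (quoted: 11.7)")
print(f"Denali: {denali_ft * FT_TO_M:,.1f} m (quoted: 6,190.5)")
```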
With its large size and geographic variety, the United States includes most climate types. East of the 100th meridian, the climate ranges from humid continental in the north to humid subtropical in the south. The western Great Plains are semi-arid. Many mountainous areas of the American West have an alpine climate. The climate is arid in the Southwest, Mediterranean in coastal California, and oceanic in coastal Oregon, Washington, and southern Alaska. Most of Alaska is subarctic or polar. Hawaii, the southern tip of Florida, and U.S. territories in the Caribbean and Pacific are tropical. The United States receives more high-impact extreme weather incidents than any other country. States bordering the Gulf of Mexico are prone to hurricanes, and most of the world's tornadoes occur in the country, mainly in Tornado Alley. Due to climate change in the country, extreme weather has become more frequent in the U.S. in the 21st century, with three times the number of reported heat waves compared to the 1960s. Since the 1990s, droughts in the American Southwest have become more persistent and more severe. The regions most attractive to settlement are often among the most vulnerable to such extreme weather. The U.S. is one of 17 megadiverse countries containing large numbers of endemic species: about 17,000 species of vascular plants occur in the contiguous United States and Alaska, and over 1,800 species of flowering plants are found in Hawaii, few of which occur on the mainland. The United States is home to 428 mammal species, 784 birds, 311 reptiles, 295 amphibians, and around 91,000 insect species. There are 63 national parks, and hundreds of other federally managed monuments, forests, and wilderness areas, administered by the National Park Service and other agencies. About 28% of the country's land is publicly owned and federally managed, primarily in the Western States. Most of this land is protected, though some is leased for commercial use, and less than one percent is used for military purposes. Environmental issues in the United States include debates on non-renewable resources and nuclear energy, air and water pollution, biodiversity, logging and deforestation, and climate change. The U.S. Environmental Protection Agency (EPA) is the federal agency charged with addressing most environment-related issues. The idea of wilderness has shaped the management of public lands since the passage of the Wilderness Act in 1964. The Endangered Species Act of 1973 provides a way to protect threatened and endangered species and their habitats; the United States Fish and Wildlife Service implements and enforces it. In 2024, the U.S. ranked 35th among 180 countries in the Environmental Performance Index. Government and politics The United States is a federal republic of 50 states and a federal capital district, Washington, D.C. The U.S. asserts sovereignty over five unincorporated territories and several uninhabited island possessions. It is the world's oldest surviving federation, and its presidential system of federal government has been adopted, in whole or in part, by many newly independent states worldwide following their decolonization. The Constitution of the United States serves as the country's supreme legal document. Most scholars describe the United States as a liberal democracy.[r] Composed of three branches, all headquartered in Washington, D.C., the federal government is the national government of the United States. The U.S. 
Constitution establishes a separation of powers intended to provide a system of checks and balances to prevent any of the three branches from becoming supreme. The three-branch system is known as the presidential system, in contrast to the parliamentary system, where the executive is part of the legislative body. Many countries around the world adopted this aspect of the 1789 Constitution of the United States, especially in the postcolonial Americas. In the U.S. federal system, sovereign powers are shared between three levels of government specified in the Constitution: the federal government, the states, and Indian tribes. The U.S. also asserts sovereignty over five permanently inhabited territories: American Samoa, Guam, the Northern Mariana Islands, Puerto Rico, and the U.S. Virgin Islands. Residents of the 50 states are governed by their elected state government, under state constitutions compatible with the national constitution, and by elected local governments that are administrative divisions of a state. States are subdivided into counties or county equivalents, and (except for Hawaii) further divided into municipalities, each administered by elected representatives. The District of Columbia is a federal district containing the U.S. capital, Washington, D.C. The federal district is an administrative division of the federal government. Indian country is made up of 574 federally recognized tribes and 326 Indian reservations. They hold a government-to-government relationship with the U.S. federal government in Washington and are legally defined as domestic dependent nations with inherent tribal sovereignty rights. In addition to the five major territories, the U.S. also asserts sovereignty over the United States Minor Outlying Islands in the Pacific Ocean and the Caribbean. The seven undisputed islands without permanent populations are Baker Island, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Atoll, and Palmyra Atoll. U.S. sovereignty over the unpopulated Bajo Nuevo Bank, Navassa Island, Serranilla Bank, and Wake Island is disputed. The Constitution is silent on political parties. However, they developed independently in the 18th century with the Federalist and Anti-Federalist factions. Since then, the United States has operated as a de facto two-party system, though the parties have changed over time. Since the mid-19th century, the two main national parties have been the Democratic Party and the Republican Party. The former is perceived as relatively liberal in its political platform, while the latter is perceived as relatively conservative in its platform. The United States has an established structure of foreign relations, with the world's second-largest diplomatic corps as of 2024. It is a permanent member of the United Nations Security Council and home to the United Nations headquarters. The United States is a member of the G7, G20, and OECD intergovernmental organizations. Almost all countries have embassies and many have consulates (official representatives) in the country. Likewise, nearly all countries maintain formal diplomatic relations with the United States, the exceptions being Iran, North Korea, and Bhutan. Though Taiwan does not have formal diplomatic relations with the U.S., it maintains close unofficial relations. The United States regularly supplies Taiwan with military equipment to deter potential Chinese aggression. 
Its geopolitical attention also turned to the Indo-Pacific when the United States joined the Quadrilateral Security Dialogue with Australia, India, and Japan. The United States has a "Special Relationship" with the United Kingdom and strong ties with Canada, Australia, New Zealand, the Philippines, Japan, South Korea, Israel, and several European Union countries such as France, Italy, Germany, Spain, and Poland. The U.S. works closely with its NATO allies on military and national security issues, and with countries in the Americas through the Organization of American States and the United States–Mexico–Canada Agreement (USMCA). The U.S. exercises full international defense authority and responsibility for Micronesia, the Marshall Islands, and Palau through the Compact of Free Association. It has increasingly conducted strategic cooperation with India, while its ties with China have steadily deteriorated. Beginning in 2014, the U.S. became a key ally of Ukraine. After Donald Trump was elected U.S. president in 2024, he sought to negotiate an end to the Russo-Ukrainian War. He paused all military aid to Ukraine in March 2025, although the aid resumed later. Trump also paused U.S. intelligence sharing with the country, but this too was eventually restored. The president is the commander-in-chief of the United States Armed Forces and appoints its leaders, the secretary of defense and the Joint Chiefs of Staff. The Department of Defense, headquartered at the Pentagon near Washington, D.C., administers five of the six service branches: the U.S. Army, Marine Corps, Navy, Air Force, and Space Force. The Coast Guard is administered by the Department of Homeland Security in peacetime and can be transferred to the Department of the Navy in wartime. The total strength of the military is about 1.3 million active-duty personnel, with roughly 800,000 more in the reserve components. The United States spent $997 billion on its military in 2024, by far the largest amount of any country, making up 37% of global military spending and accounting for 3.4% of the country's GDP. The U.S. possesses 42% of the world's nuclear weapons—the second-largest stockpile after that of Russia. The U.S. military is widely regarded as the most powerful and advanced in the world. The United States has the third-largest combined armed forces in the world, behind the Chinese People's Liberation Army and the Indian Armed Forces. The U.S. military operates about 800 bases and facilities abroad and maintains deployments of more than 100 active-duty personnel in 25 foreign countries. The United States has engaged in over 400 military interventions since its founding in 1776, with over half of these occurring between 1950 and 2019 and 25% occurring in the post-Cold War era. State defense forces (SDFs) are military units that operate under the sole authority of a state government. SDFs are authorized by state and federal law but are under the command of the state's governor. By contrast, the 54 U.S. National Guard organizations[t] fall under the dual control of state or territorial governments and the federal government; their units can also become federalized entities, but SDFs cannot be federalized. The National Guard personnel of a state or territory can be federalized by the president under the National Defense Act Amendments of 1933; this legislation created the National Guard of the United States and provides for the integration of Army National Guard and Air National Guard units and personnel into the U.S. Army and (since 1947) the U.S. Air Force. 
The total number of National Guard members is about 430,000, while the estimated combined strength of SDFs is less than 10,000. There are about 18,000 police agencies in the United States, ranging from the local to the national level. Law in the United States is mainly enforced by local police departments and sheriff's departments in their municipal or county jurisdictions. The state police departments have authority in their respective state, and federal agencies such as the Federal Bureau of Investigation (FBI) and the U.S. Marshals Service have national jurisdiction and specialized duties, such as protecting civil rights and national security, enforcing U.S. federal courts' rulings and federal laws, and investigating interstate criminal activity. State courts conduct almost all civil and criminal trials, while federal courts adjudicate the much smaller number of civil and criminal cases that relate to federal law. There is no unified "criminal justice system" in the United States. The American prison system is largely heterogeneous, with thousands of relatively independent systems operating across federal, state, local, and tribal levels. In 2025, "these systems hold nearly 2 million people in 1,566 state prisons, 98 federal prisons, 3,116 local jails, 1,277 juvenile correctional facilities, 133 immigration detention facilities, and 80 Indian country jails, as well as in military prisons, civil commitment centers, state psychiatric hospitals, and prisons in the U.S. territories." Despite disparate systems of confinement, four main institutions dominate: federal prisons, state prisons, local jails, and juvenile correctional facilities. Federal prisons are run by the Federal Bureau of Prisons and hold pretrial detainees as well as people who have been convicted of federal crimes. State prisons, run by the department of corrections of each state, hold people sentenced and serving prison time (usually longer than one year) for felony offenses. Local jails are county or municipal facilities that incarcerate defendants prior to trial; they also hold those serving short sentences (typically under a year). Juvenile correctional facilities are operated by local or state governments and serve as longer-term placements for any minor adjudicated as delinquent and ordered by a judge to be confined. In January 2023, the United States had the sixth-highest per capita incarceration rate in the world—531 people per 100,000 inhabitants—and the largest prison and jail population in the world, with more than 1.9 million people incarcerated. An analysis of the World Health Organization Mortality Database from 2010 showed U.S. homicide rates "were 7 times higher than in other high-income countries, driven by a gun homicide rate that was 25 times higher". Economy The U.S. has a highly developed mixed economy that has been the world's largest nominally since about 1890. Its 2024 gross domestic product (GDP)[e] of more than $29 trillion constituted over 25% of nominal global economic output, or 15% at purchasing power parity (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks first in the world by nominal GDP, second when adjusted for purchasing power parity (PPP), and ninth by PPP-adjusted GDP per capita. In February 2024, the total U.S. federal government debt was $34.4 trillion. Of the world's 500 largest companies by revenue, 138 were headquartered in the U.S. in 2025, the highest number of any country. The U.S. 
dollar is the currency most used in international transactions and the world's foremost reserve currency, backed by the country's dominant economy, its military, the petrodollar system, its large U.S. Treasuries market, and the linked eurodollar market. Several countries use it as their official currency, and in others it is the de facto currency. The U.S. has free trade agreements with several countries and is a party to the USMCA. Although the United States has reached a post-industrial level of economic development and is often described as having a service economy, it remains a major industrial power; in 2024, the U.S. manufacturing sector was the world's second-largest by value output after China's. New York City is the world's principal financial center, and its metropolitan area is the world's largest metropolitan economy. The New York Stock Exchange and Nasdaq, both located in New York City, are the world's two largest stock exchanges by market capitalization and trade volume. The United States is at the forefront of technological advancement and innovation in many economic fields, especially in artificial intelligence; electronics and computers; pharmaceuticals; and medical, aerospace, and military equipment. The country's economy is fueled by abundant natural resources, a well-developed infrastructure, and high productivity. The largest trading partners of the United States are the European Union, Mexico, Canada, China, Japan, South Korea, the United Kingdom, Vietnam, India, and Taiwan. The United States is the world's largest importer and second-largest exporter.[u] It is by far the world's largest exporter of services. Americans have the highest average household and employee income among OECD member states, and the fourth-highest median household income in 2023, up from sixth-highest in 2013. With personal consumption expenditures of over $18.5 trillion in 2023, the U.S. has a heavily consumer-driven economy and is the world's largest consumer market. The U.S. ranked first in the number of dollar billionaires and millionaires in 2023, with 735 billionaires and nearly 22 million millionaires. Wealth in the United States is highly concentrated; in 2011, the richest 10% of the adult population owned 72% of the country's household wealth, while the bottom 50% owned just 2%. U.S. wealth inequality has increased substantially since the late 1980s, and income inequality in the U.S. reached a record high in 2019. In 2024, the country had some of the highest wealth and income inequality levels among OECD countries. Since the 1970s, there has been a decoupling of U.S. wage gains from worker productivity. In 2016, the top fifth of earners took home more than half of all income, giving the U.S. one of the widest income distributions among OECD countries. There were about 771,480 homeless persons in the U.S. in 2024. In 2022, 6.4 million children experienced food insecurity. Feeding America estimates that around one in five, or approximately 13 million, children experience hunger in the U.S. and do not know where or when they will get their next meal. Also in 2022, about 37.9 million people, or 11.5% of the U.S. population, were living in poverty. The United States has a smaller welfare state and redistributes less income through government action than most other high-income countries. It is the only advanced economy that does not guarantee its workers paid vacation nationally and one of a few countries in the world without federal paid family leave as a legal right. 
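To make the 2011 wealth-concentration figures above concrete: if the richest 10% of adults held 72% of household wealth while the bottom 50% held 2%, the average adult in the top decile held (72/10) divided by (2/50), or 180 times, the wealth of the average adult in the bottom half. A minimal sketch of that arithmetic (an editorial illustration, not from the source):

```python
# Average-wealth ratio implied by the 2011 concentration figures quoted above.
top_share, top_pop = 0.72, 0.10        # richest 10% of adults held 72% of wealth
bottom_share, bottom_pop = 0.02, 0.50  # bottom 50% of adults held 2% of wealth

avg_top = top_share / top_pop            # top-decile wealth per adult, in multiples of the mean
avg_bottom = bottom_share / bottom_pop   # bottom-half wealth per adult, in multiples of the mean
print(f"Top-decile adult held {avg_top / avg_bottom:.0f}x the wealth of a bottom-half adult")
# -> 180x
```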
The United States has a higher percentage of low-income workers than almost any other developed country, largely because of a weak collective bargaining system and lack of government support for at-risk workers. The United States has been a leader in technological innovation since the late 19th century and in scientific research since the mid-20th century. Methods for producing interchangeable parts and the establishment of a machine tool industry enabled the large-scale manufacturing of U.S. consumer products in the late 19th century. By the early 20th century, factory electrification, the introduction of the assembly line, and other labor-saving techniques created the system of mass production. In the 21st century, the United States continues to be one of the world's foremost scientific powers, though China has emerged as a major competitor in many fields. The U.S. has the highest research and development expenditures of any country and ranks ninth in R&D spending as a percentage of GDP. In 2022, the United States was (after China) the country with the second-highest number of published scientific papers. In 2021, the U.S. ranked second (also after China) by the number of patent applications, and third by trademark and industrial design applications (after China and Germany), according to World Intellectual Property Indicators. In 2025, the United States ranked third (after Switzerland and Sweden) in the Global Innovation Index. The United States is considered to be a world leader in the development of artificial intelligence technology. In 2023, the United States was ranked the second most technologically advanced country in the world (after South Korea) by Global Finance magazine. The United States has maintained a space program since the late 1950s, beginning with the establishment of the National Aeronautics and Space Administration (NASA) in 1958. NASA's Apollo program (1961–1972) achieved the first crewed Moon landing with the 1969 Apollo 11 mission; it remains one of the agency's most significant milestones. Other major endeavors by NASA include the Space Shuttle program (1981–2011), the Voyager program (1972–present), the Hubble and James Webb space telescopes (launched in 1990 and 2021, respectively), and the multi-mission Mars Exploration Program (Spirit and Opportunity, Curiosity, and Perseverance). NASA is one of five agencies collaborating on the International Space Station (ISS); U.S. contributions to the ISS include several modules, including Destiny (2001), Harmony (2007), and Tranquility (2010), as well as ongoing logistical and operational support. The United States' private sector dominates the global commercial spaceflight industry. Prominent American spaceflight contractors include Blue Origin, Boeing, Lockheed Martin, Northrop Grumman, and SpaceX. NASA programs such as the Commercial Crew Program, Commercial Resupply Services, Commercial Lunar Payload Services, and NextSTEP have facilitated growing private-sector involvement in American spaceflight. In 2023, the United States received approximately 84% of its energy from fossil fuels; its largest source of energy was petroleum (38%), followed by natural gas (36%), renewable sources (9%), coal (9%), and nuclear power (9%). In 2022, the United States constituted about 4% of the world's population but consumed around 16% of the world's energy. The U.S. ranks as the second-highest emitter of greenhouse gases, behind China. The U.S. is the world's largest producer of nuclear power, generating around 30% of the world's nuclear electricity. 
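Two of the energy figures above can be checked against each other: the fossil-fuel share is approximately the sum of the petroleum, natural gas, and coal shares (the quoted shares are rounded), and the population and consumption shares imply per-capita energy use of about four times the world average. A quick sketch of that arithmetic (editorial, not from the source):

```python
# Cross-checks of the 2022-2023 U.S. energy figures quoted above.
mix = {"petroleum": 38, "natural gas": 36, "renewables": 9, "coal": 9, "nuclear": 9}  # % of supply
fossil = mix["petroleum"] + mix["natural gas"] + mix["coal"]
print(f"Fossil-fuel share from the mix: {fossil}% (quoted: ~84%; shares are rounded)")

pop_share, energy_share = 4, 16  # % of world population vs. % of world energy consumption
print(f"Per-capita consumption: {energy_share / pop_share:.0f}x the world average")
```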
The U.S. also has the highest number of nuclear power reactors of any country and, as of 2024, plans to triple its nuclear power capacity by 2050. The United States' 4 million miles (6.4 million kilometers) of road network, owned almost entirely by state and local governments, is the longest in the world. The extensive Interstate Highway System that connects all major U.S. cities is funded mostly by the federal government but maintained by state departments of transportation. The system is further extended by state highways and some private toll roads. In 2022, the U.S. was among the top ten countries in vehicle ownership per capita, with 850 vehicles per 1,000 people. A 2022 study found that 76% of U.S. commuters drive alone, while about 11% use some form of public transportation. Public transportation in the United States is well developed in the largest urban areas, notably New York City, Washington, D.C., Boston, Philadelphia, Chicago, and San Francisco; otherwise, coverage is generally less extensive than in most other developed countries. The U.S. also has many relatively car-dependent localities. Long-distance intercity travel is provided primarily by airlines, but travel by rail is more common along the Northeast Corridor, the only high-speed rail corridor in the U.S. that meets international standards. Amtrak, the country's government-sponsored national passenger rail company, has a relatively sparse network compared to those of Western European countries. Service is concentrated in the Northeast, California, the Midwest, the Pacific Northwest, and Virginia and the Southeast. The United States has an extensive air transportation network. U.S. civilian airlines are all privately owned. The three largest airlines in the world, by total number of passengers carried, are U.S.-based; American Airlines became the global leader after its 2013 merger with US Airways. Of the 50 busiest airports in the world, 16 are in the United States, as well as five of the top 10. The world's busiest airport by passenger volume is Hartsfield–Jackson Atlanta International in Atlanta, Georgia. In 2022, most of the 19,969 U.S. airports were owned and operated by local government authorities; there are also some private airports. Some 5,193 are designated as "public use", including for general aviation. The Transportation Security Administration (TSA) has provided security at most major airports since 2001. The country's rail transport network, the longest in the world at 182,412.3 mi (293,564.2 km), handles mostly freight, in contrast to the more passenger-centered rail networks of Europe. Because they are often privately owned operations, U.S. railroads lag behind those of the rest of the world in terms of electrification. The country's inland waterways are the world's fifth-longest, totaling 25,482 mi (41,009 km). They are used extensively for freight, recreation, and a small amount of passenger traffic. Of the world's 50 busiest container ports, four are located in the United States, with the busiest in the country being the Port of Los Angeles. Demographics The U.S. Census Bureau reported 331,449,281 residents on April 1, 2020,[v] making the United States the third-most-populous country in the world, after India and China. The Census Bureau's official 2025 population estimate was 341,784,857, an increase of 3.1% since the 2020 census. According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S. 
population had a net gain of one person every 16 seconds, or about 5,400 people per day. In 2023, 51% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 34% had never been married. In 2023, the total fertility rate for the U.S. stood at 1.6 children per woman, and, at 23%, it had the world's highest rate of children living in single-parent households in 2019. Most Americans live in the suburbs of major metropolitan areas. The United States has a diverse population; 37 ancestry groups have more than one million members. White Americans with ancestry from Europe, the Middle East, or North Africa form the largest racial and ethnic group, at 57.8% of the United States population. Hispanic and Latino Americans form the second-largest group, at 18.7% of the United States population. African Americans constitute the country's third-largest ancestry group, at 12.1% of the total U.S. population. Asian Americans are the country's fourth-largest group, composing 5.9% of the United States population. The country's 3.7 million Native Americans account for about 1%, and some 574 native tribes are recognized by the federal government. In 2024, the median age of the United States population was 39.1 years. While many languages and dialects are spoken in the United States, English is by far the most commonly spoken and written. English has long been the country's de facto official language, and in 2025, Executive Order 14224 declared it official; however, the U.S. still has no de jure official language, as Congress has never passed a law designating English as official for all three federal branches. Some laws, such as U.S. naturalization requirements, nonetheless standardize English. Twenty-eight states and the United States Virgin Islands have laws that designate English as the sole official language; 19 states and the District of Columbia have no official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English: Hawaii (Hawaiian), Alaska (twenty Native languages),[w] South Dakota (Sioux), American Samoa (Samoan), Puerto Rico (Spanish), Guam (Chamorro), and the Northern Mariana Islands (Carolinian and Chamorro). In total, 169 Native American languages are spoken in the United States. In Puerto Rico, Spanish is more widely spoken than English. According to the American Community Survey (2020), some 245.4 million people in the U.S. age five and older spoke only English at home. About 41.2 million spoke Spanish at home, making it the second most commonly used language. Other languages spoken at home by one million people or more include Chinese (3.40 million), Tagalog (1.71 million), Vietnamese (1.52 million), Arabic (1.39 million), French (1.18 million), Korean (1.07 million), and Russian (1.04 million). German, spoken by 1 million people at home in 2010, fell to 857,000 total speakers in 2020. America's immigrant population is by far the world's largest in absolute terms. In 2022, there were 87.7 million immigrants and U.S.-born children of immigrants in the United States, accounting for nearly 27% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 5% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants. 
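The percentage breakdown of the 2017 foreign-born population quoted above can be recovered from the absolute counts, and the Census Bureau's population-clock figure is likewise simple arithmetic (86,400 seconds per day divided by 16 seconds per person is 5,400). A short editorial sketch, not part of the source:

```python
# Recomputing the 2017 foreign-born status shares from the absolute counts above.
counts = {"naturalized citizens": 20.7, "lawful permanent residents": 12.3,
          "temporary lawful residents": 2.2, "unauthorized immigrants": 10.5}  # millions
total = sum(counts.values())  # about 45.7 million foreign-born residents
for status, millions in counts.items():
    print(f"{status}: {millions / total:.0%}")  # -> 45%, 27%, 5%, 23%

# Population-clock arithmetic: one net new resident every 16 seconds.
print(f"Net gain: {86_400 / 16:,.0f} people per day")  # -> 5,400
```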
In 2019, the top countries of origin for immigrants were Mexico (24% of immigrants), India (6%), China (5%), the Philippines (4.5%), and El Salvador (3%). In fiscal year 2022, over one million immigrants (most of whom entered through family reunification) were granted legal residence. The undocumented immigrant population in the U.S. reached a record high of 14 million in 2023. The First Amendment guarantees the free exercise of religion in the country and forbids Congress from passing laws respecting its establishment. Religious practice is widespread and among the most diverse in the world. The country has the world's largest Christian population, which includes the fourth-largest population of Catholics. Other notable faiths include Judaism, Buddhism, Hinduism, Islam, New Age, and Native American religions. Religious practice varies significantly by region. "Ceremonial deism" is common in American culture. The overwhelming majority of Americans believe in a higher power or spiritual force, engage in spiritual practices such as prayer, and consider themselves religious or spiritual. In the Southern United States' "Bible Belt", evangelical Protestantism plays a significant role culturally; New England and the Western United States tend to be more secular. Mormonism, a Restorationist movement founded in the U.S. in 1830, is the predominant religion in Utah and a major religion in Idaho. About 82% of Americans live in metropolitan areas, particularly in suburbs; about half of those reside in cities with populations over 50,000. In 2022, 333 incorporated municipalities had populations over 100,000, nine cities had more than one million residents, and four cities—New York City, Los Angeles, Chicago, and Houston—had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West. According to the Centers for Disease Control and Prevention (CDC), average U.S. life expectancy at birth reached 79.0 years in 2024, its highest recorded level and an increase of 0.6 years over 2023. The CDC attributed the improvement to a significant fall in the number of fatal drug overdoses in the country, noting that "heart disease continues to be the leading cause of death in the United States, followed by cancer and unintentional injuries." In 2024, life expectancy at birth for American men rose to 76.5 years (up 0.7 years from 2023), while life expectancy for women was 81.4 years (up 0.3 years). Starting in 1998, life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been widening ever since. The Commonwealth Fund reported in 2020 that the U.S. had the highest suicide rate among high-income countries. Approximately one-third of the U.S. adult population is obese and another third is overweight. The U.S. healthcare system far outspends that of any other country, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes when compared to peer countries, for reasons that are debated. The United States is the only developed country without a system of universal healthcare, and a significant proportion of its population does not carry health insurance. Government-funded healthcare coverage for the poor (Medicaid) and for those age 65 and older (Medicare) is available to Americans who meet the programs' income or age qualifications. 
In 2010, President Barack Obama signed the Patient Protection and Affordable Care Act into law.[x] Abortion in the United States is not federally protected and is illegal or restricted in 17 states. American primary and secondary education, known in the U.S. as K–12 ("kindergarten through 12th grade"), is decentralized. School systems are operated by state, territorial, and sometimes municipal governments and regulated by the U.S. Department of Education. In general, children are required to attend school or an approved homeschool from the age of five or six (kindergarten or first grade) until they are 18 years old. This often brings students through the 12th grade, the final year of a U.S. high school, but some states and territories allow them to leave school earlier, at age 16 or 17. The U.S. spends more on education per student than any other country, an average of $18,614 per public elementary and secondary school student in the 2020–2021 school year. Among Americans age 25 and older, 92.2% graduated from high school, 62.7% attended some college, 37.7% earned a bachelor's degree, and 14.2% earned a graduate degree. The U.S. literacy rate is near-universal. The U.S. has produced the most Nobel Prize winners of any country, with 411 laureates winning 413 awards in total. U.S. tertiary or higher education has earned a global reputation. Many of the world's top universities, as listed by various ranking organizations, are in the United States, including 19 of the top 25. American higher education is dominated by state university systems, although the country's many private universities and colleges enroll about 20% of all American students. Local community colleges generally offer open admissions, lower tuition, and coursework leading to a two-year associate degree or a non-degree certificate. In public expenditure on higher education, the U.S. spends more per student than the OECD average, and Americans spend more than any other nation in combined public and private spending. Colleges and universities directly funded by the federal government, namely the U.S. service academies, the Naval Postgraduate School, and military staff colleges, do not charge tuition and are limited to military personnel and government employees. Despite some student loan forgiveness programs, student loan debt increased by 102% between 2010 and 2020 and exceeded $1.7 trillion in 2022. Culture and society The United States is home to a wide variety of ethnic groups, traditions, and customs. The country has been described as having the values of individualism and personal autonomy, as well as a strong work ethic and competitiveness. Voluntary altruism towards others also plays a major role; according to a 2016 study by the Charities Aid Foundation, Americans donated 1.44% of total GDP to charity—the highest rate in the world by a large margin. Americans have traditionally been characterized by a unifying political belief in an "American Creed" emphasizing consent of the governed, liberty, equality under the law, democracy, social equality, property rights, and a preference for limited government. The U.S. has acquired significant hard and soft power through its diplomatic influence, economic power, military alliances, and cultural exports such as American movies, music, video games, sports, and food. The influence that the United States exerts on other countries through soft power is referred to as Americanization. 
Nearly all present Americans or their ancestors came from Europe, Africa, or Asia (the "Old World") within the past five centuries. Mainstream American culture is a Western culture largely derived from the traditions of European immigrants with influences from many other sources, such as traditions brought by slaves from Africa. More recent immigration from Asia and especially Latin America has added to a cultural mix that has been described as both a homogenizing melting pot and a heterogeneous salad bowl, with immigrants contributing to, and often assimilating into, mainstream American culture. Under the First Amendment to the Constitution, the United States is considered to have the strongest protections of free speech of any country. Flag desecration, hate speech, blasphemy, and lese-majesty are all forms of protected expression. A 2016 Pew Research Center poll found that Americans were the most supportive of free expression of any polity measured. Additionally, they are the "most supportive of freedom of the press and the right to use the Internet without government censorship". The U.S. is a socially progressive country with permissive attitudes surrounding human sexuality. LGBTQ rights in the United States are among the most advanced by global standards. The American Dream, or the perception that Americans enjoy high levels of social mobility, plays a key role in attracting immigrants. Whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a classless society, scholars identify significant differences between the country's social classes, affecting socialization, language, and values. Americans tend to greatly value socioeconomic achievement, but being ordinary or average is promoted by some as a noble condition as well. The National Foundation on the Arts and the Humanities is an agency of the United States federal government that was established in 1965 with the purpose to "develop and promote a broadly conceived national policy of support for the humanities and the arts in the United States, and for institutions which preserve the cultural heritage of the United States." It is composed of four sub-agencies, including the National Endowment for the Arts and the National Endowment for the Humanities. Colonial American authors were influenced by John Locke and other Enlightenment philosophers. The American Revolutionary Period (1765–1783) is notable for the political writings of Benjamin Franklin, Alexander Hamilton, Thomas Paine, and Thomas Jefferson. Shortly before and after the Revolutionary War, the newspaper rose to prominence, filling a demand for anti-British national literature. An early novel, often considered the first American novel, is William Hill Brown's The Power of Sympathy, published in 1789. Writer and critic John Neal in the early- to mid-19th century helped advance America toward a unique literature and culture by criticizing predecessors such as Washington Irving for imitating their British counterparts, and by influencing writers such as Edgar Allan Poe, who took American poetry and short fiction in new directions. Ralph Waldo Emerson and Margaret Fuller pioneered the influential Transcendentalism movement; Henry David Thoreau, author of Walden, was influenced by this movement. The conflict surrounding abolitionism inspired writers like Harriet Beecher Stowe and authors of slave narratives such as Frederick Douglass. Nathaniel Hawthorne's The Scarlet Letter (1850) explored the dark side of American history, as did Herman Melville's Moby-Dick (1851). 
Major American poets of the 19th century American Renaissance include Walt Whitman, Melville, and Emily Dickinson. Mark Twain was the first major American writer to be born in the West. Henry James achieved international recognition with novels like The Portrait of a Lady (1881). As literacy rates rose, periodicals published more stories centered around industrial workers, women, and the rural poor. Naturalism, regionalism, and realism were the major literary movements of the period. While modernism generally took on an international character, modernist authors working within the United States more often rooted their work in specific regions, peoples, and cultures. Following the Great Migration to northern cities, African-American and black West Indian authors of the Harlem Renaissance developed an independent tradition of literature that rebuked a history of inequality and celebrated black culture. An important cultural export during the Jazz Age, these writings were a key influence on Négritude, a philosophy emerging in the 1930s among francophone writers of the African diaspora. In the 1950s, an ideal of homogeneity led many authors to attempt to write the Great American Novel, while the Beat Generation rejected this conformity, using styles that elevated the impact of the spoken word over mechanics to describe drug use, sexuality, and the failings of society. Contemporary literature is more pluralistic than in previous eras, with the closest thing to a unifying feature being a trend toward self-conscious experiments with language. Twelve American laureates have won the Nobel Prize in Literature. Media in the United States is broadly uncensored, with the First Amendment providing significant protections, as reiterated in New York Times Co. v. United States. The four major broadcasters in the U.S. are the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), American Broadcasting Company (ABC), and Fox Broadcasting Company (Fox). The four major broadcast television networks are all commercial entities. The U.S. cable television system offers hundreds of channels catering to a variety of niches. In 2021, about 83% of Americans over age 12 listened to broadcast radio, while about 40% listened to podcasts. In the prior year, there were 15,460 licensed full-power radio stations in the U.S. according to the Federal Communications Commission (FCC). Much of the public radio broadcasting is supplied by National Public Radio (NPR), incorporated in February 1970 under the Public Broadcasting Act of 1967. U.S. newspapers with a global reach and reputation include The Wall Street Journal, The New York Times, The Washington Post, and USA Today. About 800 publications are produced in Spanish. With few exceptions, newspapers are privately owned, either by large chains such as Gannett or McClatchy, which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in an increasingly rare situation, by individuals or families. Major cities often have alternative newspapers to complement the mainstream daily papers, such as The Village Voice in New York City and LA Weekly in Los Angeles. The five most-visited websites in the world are Google, YouTube, Facebook, Instagram, and ChatGPT—all of them American-owned. Other popular platforms used include X (formerly Twitter) and Amazon. In 2025, the U.S. was the world's second-largest video game market by revenue (after China). In 2015, the U.S. 
video game industry consisted of 2,457 companies that supported around 220,000 jobs and generated $30.4 billion in revenue. There are 444 game publishers, developers, and hardware companies in California alone. According to the Game Developers Conference (GDC), the U.S. is the top location for video game development, with 58% of the world's game developers based there in 2025. The United States is well known for its theater. Mainstream theater in the United States derives from the old European theatrical tradition and has been heavily influenced by British theater. By the middle of the 19th century, America had created new, distinct dramatic forms in the Tom Shows, the showboat theater, and the minstrel show. The central hub of the American theater scene is the Theater District in Manhattan, with its divisions of Broadway, off-Broadway, and off-off-Broadway. Many movie and television celebrities have gotten their big break working in New York productions. Outside New York City, many cities have professional regional or resident theater companies that produce their own seasons. The biggest-budget theatrical productions are musicals. U.S. theater also has an active community theater culture. The Tony Awards recognize excellence in live Broadway theater and are presented at an annual ceremony in Manhattan. The awards are given for Broadway productions and performances, and one is also given for regional theater. Several discretionary non-competitive awards are given as well, including a Special Tony Award, the Tony Honors for Excellence in Theatre, and the Isabelle Stevenson Award. Folk art in colonial America grew out of artisanal craftsmanship in communities that allowed commonly trained people to individually express themselves. It was distinct from Europe's tradition of high art, which was less accessible and generally less relevant to early American settlers. Cultural movements in art and craftsmanship in colonial America generally lagged behind those of Western Europe. For example, the prevailing medieval style of woodworking and primitive sculpture became integral to early American folk art, despite the emergence of Renaissance styles in England in the late 16th and early 17th centuries. The new English styles would have arrived early enough to make a considerable impact on American folk art, but American styles and forms had already been firmly adopted. Not only did styles change slowly in early America, but there was a tendency for rural artisans there to continue their traditional forms longer than their urban counterparts did—and far longer than those in Western Europe. The Hudson River School was a mid-19th-century movement in the visual arts tradition of European naturalism. The 1913 Armory Show in New York City, an exhibition of European modernist art, shocked the public and transformed the U.S. art scene. American Realism and American Regionalism sought to reflect and give America new ways of looking at itself. Georgia O'Keeffe, Marsden Hartley, and others experimented with new and individualistic styles, which would become known as American modernism. Major artistic movements such as the abstract expressionism of Jackson Pollock and Willem de Kooning and the pop art of Andy Warhol and Roy Lichtenstein developed largely in the United States. Major photographers include Alfred Stieglitz, Edward Steichen, Dorothea Lange, Edward Weston, James Van Der Zee, Ansel Adams, and Gordon Parks. 
The tide of modernism and then postmodernism has brought global fame to American architects, including Frank Lloyd Wright, Philip Johnson, and Frank Gehry. The Metropolitan Museum of Art in Manhattan is the largest art museum in the United States and the fourth-largest in the world. American folk music encompasses numerous music genres, variously known as traditional music, traditional folk music, contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and sometimes trace back to such origins as the British Isles, mainland Europe, or Africa. The rhythmic and lyrical styles of African-American music in particular have influenced American music. Banjos were brought to America through the slave trade. Minstrel shows incorporating the instrument into their acts led to its increased popularity and widespread production in the 19th century. The electric guitar, invented in the 1930s and mass-produced by the 1940s, had an enormous influence on popular music, in particular due to the development of rock and roll. The synthesizer, turntablism, and electronic music were also largely developed in the U.S. Elements from folk idioms such as the blues and old-time music were adopted and transformed into popular genres with global audiences. Jazz grew from blues and ragtime in the early 20th century, developing from the innovations and recordings of composers such as W.C. Handy and Jelly Roll Morton; Louis Armstrong and Duke Ellington increased its popularity. Country music developed in the 1920s, bluegrass and rhythm and blues in the 1940s, and rock and roll in the 1950s. In the 1960s, Bob Dylan emerged from the folk revival to become one of the country's most celebrated songwriters. The musical forms of punk and hip hop both originated in the United States in the 1970s. The United States has the world's largest music market, with a total retail value of $15.9 billion in 2022. Most of the world's major record companies are based in the U.S.; they are represented by the Recording Industry Association of America (RIAA). Mid-20th-century American pop stars, such as Frank Sinatra and Elvis Presley, became global celebrities and best-selling music artists, as have artists of the late 20th century, such as Michael Jackson, Madonna, Whitney Houston, and Mariah Carey, and of the early 21st century, such as Eminem, Britney Spears, Lady Gaga, Katy Perry, Taylor Swift, and Beyoncé. The United States has the world's largest apparel market by revenue. Apart from professional business attire, American fashion is eclectic and predominantly informal. Americans' diverse cultural roots are reflected in their clothing; however, sneakers, jeans, T-shirts, and baseball caps are emblematic of American styles. New York, with its Fashion Week, is considered to be one of the "Big Four" global fashion capitals, along with Paris, Milan, and London. One study found that general proximity to Manhattan's Garment District has been synonymous with American fashion since the industry's emergence in the early 20th century. A number of well-known designer labels, among them Tommy Hilfiger, Ralph Lauren, Tom Ford, and Calvin Klein, are headquartered in Manhattan. Labels cater to niche markets, such as preteens. New York Fashion Week is one of the most influential fashion shows in the world, and is held twice each year in Manhattan; the annual Met Gala, also in Manhattan, has been called the fashion world's "biggest night". The U.S. 
film industry has a worldwide influence and following. Hollywood, a district in central Los Angeles, the nation's second-most populous city, is metonymous for the American filmmaking industry. The major film studios of the United States are the primary source of the most commercially successful and most ticket-selling films in the world. Largely centered in the New York City region from its beginnings in the late 19th century through the first decades of the 20th century, the U.S. film industry has since been primarily based in and around Hollywood. Nonetheless, American film companies have been subject to the forces of globalization in the 21st century, and an increasing number of films are made elsewhere. The Academy Awards, popularly known as "the Oscars", have been held annually by the Academy of Motion Picture Arts and Sciences since 1929, and the Golden Globe Awards have been held annually since January 1944. The industry peaked in what is commonly referred to as the "Golden Age of Hollywood", from the early sound period until the early 1960s, with screen actors such as John Wayne and Marilyn Monroe becoming iconic figures. In the 1970s, "New Hollywood", or the "Hollywood Renaissance", was defined by grittier films influenced by French and Italian realist pictures of the post-war period. The 21st century has been marked by the rise of American streaming platforms, which came to rival traditional cinema. Early settlers were introduced by Native Americans to foods such as turkey, sweet potatoes, corn, squash, and maple syrup. Among the most enduring and pervasive examples are variations of the native dish called succotash. Early settlers and later immigrants combined these with foods they were familiar with, such as wheat flour, beef, and milk, to create a distinctive American cuisine. New World crops, especially pumpkin, corn, potatoes, and turkey as the main course, are part of a shared national menu on Thanksgiving, when many Americans prepare or purchase traditional dishes to celebrate the occasion. Characteristic American dishes such as apple pie, fried chicken, doughnuts, french fries, macaroni and cheese, ice cream, hamburgers, hot dogs, and American pizza derive from the recipes of various immigrant groups. Mexican dishes such as burritos and tacos preexisted the United States in areas later annexed from Mexico, and adaptations of Chinese cuisine as well as pasta dishes freely adapted from Italian sources are all widely consumed. American chefs have had a significant impact on society both domestically and internationally. In 1946, the Culinary Institute of America was founded by Katharine Angell and Frances Roth. It became the United States' most prestigious culinary school, where many of the most talented American chefs have studied prior to successful careers. The United States restaurant industry was projected at $899 billion in sales for 2020 and employed more than 15 million people, representing 10% of the nation's workforce directly. It is the country's second-largest private employer and the third-largest employer overall. The United States is home to over 220 Michelin star-rated restaurants, 70 of which are in New York City. Wine has been produced in what is now the United States since the 1500s, with the first widespread production beginning in what is now New Mexico in 1628. In the modern U.S., wine production is undertaken in all fifty states, with California producing 84 percent of all U.S. wine. 
With more than 1,100,000 acres (4,500 km2) under vine, the United States is the fourth-largest wine-producing country in the world, after Italy, Spain, and France. The classic American diner, a casual restaurant type originally intended for the working class, emerged during the 19th century from converted railroad dining cars made stationary. The diner soon evolved into purpose-built structures whose number expanded greatly in the 20th century. The American fast-food industry developed alongside the nation's car culture. American restaurants developed the drive-in format in the 1920s, which they began to replace with the drive-through format by the 1940s. American fast-food restaurant chains, such as McDonald's, Burger King, Chick-fil-A, Kentucky Fried Chicken, Dunkin' Donuts, and many others, have numerous outlets around the world. The most popular spectator sports in the U.S. are American football, basketball, baseball, soccer, and ice hockey. Their premier leagues are, respectively, the National Football League, the National Basketball Association, Major League Baseball, Major League Soccer, and the National Hockey League. All these leagues enjoy wide-ranging domestic media coverage and, except for MLS, all are considered the preeminent leagues in their respective sports in the world. While most major U.S. sports such as baseball and American football have evolved out of European practices, basketball, volleyball, skateboarding, and snowboarding are American inventions, many of which have become popular worldwide. Lacrosse and surfing arose from Native American and Native Hawaiian activities that predate European contact. The market for professional sports in the United States was approximately $69 billion in July 2013, roughly 50% larger than that of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States. Although American football does not have a substantial following in other nations, the NFL has the highest average attendance (67,254) of any professional sports league in the world. In 2024, the NFL generated over $23 billion in revenue, making it the most valuable professional sports league in the United States and the world. Baseball has been regarded as the U.S. "national sport" since the late 19th century. The most-watched individual sports in the U.S. are golf and auto racing, particularly NASCAR and IndyCar. At the collegiate level, earnings for member institutions exceed $1 billion annually, and college football and basketball attract large audiences, as the NCAA March Madness tournament and the College Football Playoff are some of the most watched national sporting events. In the U.S., intercollegiate sports serve as the main feeder system for professional and Olympic sports, with significant exceptions such as Minor League Baseball. This differs greatly from practices in nearly all other countries, where publicly and privately funded sports organizations serve this function. Eight Olympic Games have taken place in the United States. The 1904 Summer Olympics in St. Louis, Missouri, were the first-ever Olympic Games held outside Europe. The Olympic Games will be held in the U.S. for a ninth time when Los Angeles hosts the 2028 Summer Olympics. U.S. athletes have won a total of 2,968 medals (1,179 gold) at the Olympic Games, the most of any country. 
In other international competition, the United States is the home of a number of prestigious events, including the America's Cup, the World Baseball Classic, the U.S. Open, and the Masters Tournament. The U.S. men's national soccer team has qualified for eleven World Cups, while the women's national team has won the FIFA Women's World Cup and the Olympic soccer tournament four and five times, respectively. The 1999 FIFA Women's World Cup was hosted by the United States; its final match was attended by 90,185 spectators, at the time a world record crowd for a women's sporting event. The United States hosted the 1994 FIFA World Cup and will co-host, along with Canada and Mexico, the 2026 FIFA World Cup.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Breakthrough_Initiatives] | [TOKENS: 1242] |
Breakthrough Initiatives Breakthrough Initiatives is a science-based program founded in 2015 and funded by Julia and Yuri Milner, also of Breakthrough Prize, to search for extraterrestrial intelligence over a span of at least 10 years. The program is divided into multiple projects. Breakthrough Listen is an effort to search over 1,000,000 stars for artificial radio or laser signals. A parallel project called Breakthrough Message is an effort to create a message "representative of humanity and planet Earth". The project Breakthrough Starshot, co-founded with Mark Zuckerberg, aims to send a swarm of probes to the nearest star at about 20% the speed of light. The project Breakthrough Watch aims to identify and characterize Earth-sized, rocky planets around Alpha Centauri and other stars within 20 light years of Earth. Breakthrough plans to send a mission to Saturn's moon Enceladus, in search of life in its warm ocean, and in 2018 signed a partnership agreement with NASA for the project. History The Breakthrough Initiatives were announced to the public on 20 July 2015, at London's Royal Society by physicist Stephen Hawking. Russian tycoon Yuri Milner created the Initiatives to search for intelligent extraterrestrial life in the Universe and consider a plan for possibly transmitting messages out into space. The announcement included an open letter co-signed by multiple scientists, including Hawking, expressing support for an intensified search for alien radio communications. During the public launch, Hawking said: "In an infinite Universe, there must be other life. There is no bigger question. It is time to commit to finding the answer." The US$100 million cash infusion is projected to step up the pace of SETI research well beyond the early 2000s rate, nearly doubling the annual rate NASA spent on SETI research in approximately 1973–1993. Projects Breakthrough Listen is a program to search for intelligent extraterrestrial communications in the Universe. With $100 million in funding and thousands of hours of dedicated telescope time on state-of-the-art facilities, it is the most comprehensive search for alien communications to date. The project began in January 2016, and is expected to continue for 10 years. The project uses radio wave observations from the Green Bank Observatory and the Parkes Observatory, and visible light observations from the Automated Planet Finder. Targets for the project include one million nearby stars and the centers of 100 galaxies. All data generated from the project are available to the public, and SETI@Home is used for some of the data analysis. The first results were published in April 2017, with further updates expected every 6 months. The Breakthrough Message program studies the ethics of sending messages into deep space. It has also launched an open competition with a US$1 million prize pool to design a digital message that could be transmitted from Earth to an extraterrestrial civilization. The message should be "representative of humanity and planet Earth". The program pledges "not to transmit any message until there has been a global debate at high levels of science and politics on the risks and rewards of contacting advanced civilizations".
Breakthrough Starshot, announced 12 April 2016, is a US$100 million program to develop a proof-of-concept light sail spacecraft fleet capable of making the journey to Alpha Centauri at 20% the speed of light (60,000 km/s or 215 million km/h), taking about 20 years to get there and about 4 years to notify Earth of a successful arrival. The interstellar journey may include a flyby of Proxima Centauri b, an Earth-sized exoplanet that is in the habitable zone of its host star in the Alpha Centauri system. From a distance of 1 astronomical unit (150 million kilometers or 93 million miles), the four cameras on each of the spacecraft could potentially capture an image of high enough quality to resolve surface features. The fleet would number 1,000 spacecraft; each one, named StarChip, would be a centimeter-sized craft weighing several grams. They would be propelled by several ground-based lasers of up to 100 gigawatts. Each tiny spacecraft would transmit data back to Earth using a compact on-board laser communications system. Pete Worden is the head of this project. The conceptual principles to enable this interstellar travel project were described in "A Roadmap to Interstellar Flight", by Philip Lubin of UC Santa Barbara. METI president Douglas Vakoch summarized the significance of the project, saying that "by sending hundreds or thousands of space probes the size of postage stamps, Breakthrough Starshot gets around the hazards of spaceflight that could easily end a mission relying on a single spacecraft. Only one nanocraft needs to make its way to Alpha Centauri and send back a signal for the mission to be successful. When that happens, Starshot will make history." In July 2017, scientists announced that precursors to StarChip, named Sprites, were successfully launched and flown. Breakthrough Watch is a multimillion-dollar astronomical program to develop Earth- and space-based technologies that can find Earth-like planets in our cosmic neighborhood – and try to establish whether they host life. The project aims to identify and characterize Earth-sized, rocky planets around Alpha Centauri and other stars within 20 light years of Earth, in search of oxygen and other "biosignatures". Breakthrough Enceladus is an astrobiology space probe mission concept to explore the possibility of life on Saturn's moon Enceladus. In September 2018, NASA signed a collaboration agreement with Breakthrough to jointly create the mission concept. This mission would be the first privately funded deep space mission. It would study the content of the plumes ejected from Enceladus's warm ocean through its southern ice crust. Enceladus's ice crust is thought to be around two to five kilometers thick, and a probe could use an ice-penetrating radar to constrain its structure.
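The Starshot timing figures follow from simple kinematics. Alpha Centauri lies about 4.37 light-years away (a standard figure, not stated in this article), so a probe cruising at 0.2c needs roughly 22 years, and a confirmation signal returns at light speed in about 4.4 years, matching the "about 20" and "about 4" above once rounding is allowed. A minimal sketch of the arithmetic in Python:

```python
# Back-of-the-envelope check of the Breakthrough Starshot timeline.
# Assumptions (not from the article): Alpha Centauri is ~4.37 light-years away
# and the probe spends essentially the whole trip at its 0.2c cruise speed.

C_KM_S = 299_792.458           # speed of light, km/s

distance_ly = 4.37             # Earth -> Alpha Centauri, light-years
cruise_fraction = 0.20         # 20% of the speed of light

cruise_speed_km_s = cruise_fraction * C_KM_S   # ~59,958 km/s ("60,000 km/s")
cruise_speed_km_h = cruise_speed_km_s * 3600   # ~2.16e8 km/h ("215 million km/h")

travel_years = distance_ly / cruise_fraction   # time dilation is tiny at 0.2c
signal_years = distance_ly                     # radio reply returns at c

print(f"cruise speed: {cruise_speed_km_s:,.0f} km/s = {cruise_speed_km_h:,.0f} km/h")
print(f"travel time:  {travel_years:.1f} years")   # ~21.9 years ("about 20")
print(f"signal back:  {signal_years:.1f} years")   # ~4.4 years ("about 4")
```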
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Astrolinguistics] | [TOKENS: 928] |
Astrolinguistics Astrolinguistics is a field of linguistics connected with the search for extraterrestrial intelligence (SETI). Early Soviet experiments Arguably the first attempt to construct a language for interplanetary communication was the AO language, created by the anarchist philosopher Wolf Gordin (brother of Abba Gordin) in his books Grammar of the Language of the Mankind AO (1920) and Grammar of the Language AO (1924). AO was presented as a language for interplanetary communication at the First International Exhibition of Interplanetary Machines and Mechanisms (dedicated to the 10th anniversary of the Russian Revolution and the 70th anniversary of the birth of Tsiolkovsky) in Moscow in 1927. The declared goal of Gordin was to construct a language which would be non-"fetishizing", non-"sociomorphic", non-gender-based and non-classist. The design of the language was inspired by Russian Futurist poetry, the Gordin brothers' pan-anarchist philosophy, and Tsiolkovsky's early remarks on possible cosmic messaging (which were in accord with Hans Freudenthal's later insights). However, Sergei N. Kuznetsov notes that "Gordin nowhere defines his language as intended for space use," and that "in none of his works does he deal with problems of space communication, only mentioning 'Interplanetary Communication' in passing among other technical areas." Freudenthal's LINCOS An integral part of the SETI project in general is research in the field of the construction of messages for extraterrestrial intelligence, possibly to be transmitted into space from Earth. As far as such messages are based on linguistic principles, the research can be considered to belong to astrolinguistics. The first proposal in this field was put forward by the mathematician Hans Freudenthal at the University of Utrecht in the Netherlands, in 1960 – around the time of the first SETI effort at Green Bank in the US. Freudenthal conceived a complete Lingua Cosmica. His book LINCOS: Design of a Language for Cosmic Intercourse seems at first sight non-linguistic, because mathematical concepts are the core of the language. The concepts are, however, introduced in conversations between persons (Homo sapiens), de facto by linguistic means, as the innovative examples presented in the book attest. The book set a landmark in astrolinguistics, as witnessed years later by Bruno Bassi's review. Bassi noted: "LINCOS is there. In spite of its somewhat ephemeral 'cosmic intercourse' purpose it remains a fascinating linguistic and educational construction, deserving existence as another Toy of Man's Designing". Freudenthal eventually lost interest in developing the language further because of the difficulty of applying LINCOS "for [anything] other than mathematical contents due to the potential different sociological aspects of alien receivers". Ollongren's LINCOS The term astrolinguistics was coined as such in scientific research, also with a view towards message construction for ETI, in 2013 in the monograph Astrolinguistics: Design of a Linguistic System for Interstellar Communication Based on Logic, written by the astronomer and computer scientist Alexander Ollongren of the University of Leiden (the Netherlands). This book presents a new Lingua Cosmica totally different from Freudenthal's design. It describes the way the logic of situations in human societies can be formulated in the lingua, also named LINCOS.
This astrolinguistic system, also designed for use in interstellar communication, is based on modern constructive logic – which assures that all expressions are verifiable. At a deeper, more fundamental level, however, astrolinguistics is concerned with the question whether linguistic universalia can be identified which are potentially useful in communication across interstellar distances between intelligent species. In the view of the new LINCOS these might be certain logic descriptions of specific situations and relations (possibly in an Aristotelian sense). Kadri Tinn's review of Ollongren's book (at Astronomy for Humans) recognised that aspect. She wrote: "Astrolinguistics is the study of interstellar languages and possibility of communication using an artificially created language that is self-contained and wouldn't include some of the aspects of natural languages. … new Lingua Cosmica is a language system based on applied logic, the understanding of which might be expected from a civilization that has developed technology advanced enough to receive radio emissions."
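As a toy illustration of the bootstrapping style Freudenthal's book is known for, where mathematical concepts are introduced through many exhibited examples rather than definitions, the sketch below renders numerals as unary pulse trains and "teaches" a relation symbol by listing true instances. The concrete notation is invented for illustration and is not Freudenthal's actual LINCOS syntax:

```python
# Toy sketch of a Freudenthal-style bootstrap: natural numbers appear as unary
# pulse trains, and an unknown relation symbol is "taught" purely by exhibiting
# many true examples. Notation is invented here, NOT actual LINCOS syntax.

def pulses(n: int) -> str:
    """Render a natural number as a unary train of pulses."""
    return " ".join(["."] * n)

def statement(a: int, rel: str, b: int) -> str:
    """Emit one true arithmetic statement, e.g. '. . > .' for 2 > 1."""
    return f"{pulses(a)} {rel} {pulses(b)}"

# A receiver seeing enough true instances should be able to induce
# what the symbol '>' means.
for a in range(1, 5):
    for b in range(1, a):
        print(statement(a, ">", b))
```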
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Macchietta] | [TOKENS: 478] |
Macchietta Macchietta (English: "little spot"; pl.: macchiette or macchiettas) is a form of comedy act which was common in Italian theatre between the late 1800s and the second half of the 1900s. Style The macchietta consisted of comic musical monologues caricaturing stock characters. It was generally committed to the observation of reality, and it sketched characters featuring particular defects or manias, which were further deformed and exaggerated for comical and satirical effect. Every monologue had some music serving as a backdrop for the whole performance, and the acting was interspersed with brief couplets sung by the comedian. History Macchiette were performed in café-chantants, revues and avanspettacolo, and less frequently as part of more elaborate comedy plays. After a golden age between the late 1800s and early 1900s, the genre apparently went out of fashion around 1920, before being resurrected in an amended and updated form in the 1930s, mostly thanks to the duo formed by Gigi Pisano and Giuseppe Cioffi, who created a series of popular macchiette such as "Ciccio Formaggio", "Mazza Pezza e Pizzo" and "Datemi Elisabetta", which were successfully performed by the most popular comedians of the time. Starting from the 1950s, the genre declined and gradually disappeared together with avanspettacolo. Actors Nicola Maldacea was among the first actors to adopt the genre, and he is regarded as the figure who most helped to canonize the macchietta, if not as its inventor. He regarded himself as the person who coined the term. Some of his macchiette had notable poets such as Trilussa, Salvatore Di Giacomo and Libero Bovio as often uncredited authors. Other well-known artists specialized in this form of entertainment were the Milan-based actor and playwright Edoardo Ferravilla, the Neapolitans Raffaele Viviani and Berardo Cantalamessa (creator of the classic macchietta "La risata"), and the Roman Ettore Petrolini.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Non-player_character#cite_ref-3] | [TOKENS: 1785] |
Non-player character A non-player character (NPC) is a character in a game that is not controlled by a player. The term originated in traditional tabletop role-playing games, where it applies to characters controlled by the gamemaster, or referee, rather than by another player. In video games, this usually means a computer-controlled character that has a predetermined set of behaviors that potentially will impact gameplay, but will not necessarily be the product of true artificial intelligence. Role-playing games In traditional tabletop role-playing games (RPG) such as Dungeons & Dragons, an NPC is a character portrayed by the gamemaster (GM). While the player characters (PCs) form the narrative's protagonists, non-player characters can be thought of as the "supporting cast" or "extras" of a roleplaying narrative. Non-player characters populate the fictional world of the game, and can fill any role not occupied by a player character. Non-player characters might be allies, bystanders, or competitors to the PCs. NPCs can also be traders who trade currency for things such as equipment or gear. NPCs thus vary in their level of detail. Some may be only a brief description ("You see a man in a corner of the tavern"), while others may have complete game statistics and backstories. There is some debate about how much work a gamemaster should put into an important NPC's statistics; some players prefer to have every NPC completely defined with stats, skills, and gear, while others define only what is immediately necessary and fill in the rest as the game proceeds. There is also some debate regarding the importance of fully defined NPCs in any given role-playing game, but there is consensus that the more "real" the NPCs feel, the more fun players will have interacting with them in character. In some games and in some circumstances, a player who is without a player character can temporarily take control of an NPC. Reasons for this vary, but often arise from the player not maintaining a PC within the group and playing the NPC for a session, or from the player's PC being unable to act for some time (for example, because the PC is injured or in another location). Although these characters are still designed and normally controlled by the gamemaster, when players are allowed to temporarily control these non-player characters, it gives them another perspective on the plot of the game. Some systems, such as Nobilis, encourage this in their rules. Many game systems have rules for characters sustaining positive allies in the form of NPC followers, hired hands, or other dependents subordinate to the PC (player character). Characters may sometimes help in the design, recruitment, or development of NPCs. In the Champions game (and related games using the Hero System), a character may have a DNPC, or "dependent non-player character". This is a character controlled by the GM, but for which the player character is responsible in some way, and who may be put in harm's way by the PC's choices. Video games The term "non-player character" is also used in video games to describe entities not under the direct control of a player. The term carries a connotation that the character is not hostile towards players; hostile characters are referred to as enemies, mobs, or creeps. NPC behavior in computer games is usually scripted and automatic, triggered by certain actions or dialogue with the player characters.
In certain multiplayer games (the Neverwinter Nights and Vampire: The Masquerade series, for example), a player who acts as the GM can "possess" both player and non-player characters, controlling their actions to further the storyline. More complex games, such as the aforementioned Neverwinter Nights, allow the player to customize the NPCs' behavior by modifying their default scripts or creating entirely new ones. In some online games, such as massively multiplayer online role-playing games, NPCs may be entirely unscripted, and are essentially regular character avatars controlled by employees of the game company. These "non-players" are often distinguished from player characters by avatar appearance or other visual designation, and often serve as in-game support for new players. In other cases, these "live" NPCs are virtual actors, playing regular characters that drive a continuing storyline (as in Myst Online: Uru Live). In earlier RPGs, NPCs only had monologues. This is typically represented by a dialogue box, floating text, cutscene, or other means of displaying the NPCs' speech or reaction to the player. NPC speeches of this kind are often designed to give an instant impression of the character of the speaker, providing character vignettes, but they may also advance the story or illuminate the world around the PC. Similar to this is the most common form of storytelling, non-branching dialogue, in which the means of displaying NPC speech are the same as above, but the player character or avatar responds to or initiates speech with NPCs. In addition to the purposes listed above, this enables the development of the player character. More advanced RPGs feature interactive dialogue, or branching dialogue (dialogue trees). Examples are the games produced by Black Isle Studios and White Wolf, Inc.; every one of their games features multiple-choice dialogue. When talking to an NPC, the player is presented with a list of dialogue options and may choose between them. Each choice may result in a different response from the NPC. These choices may affect the course of the game, as well as the conversation. At the least, they provide a reference point to the player of their character's relationship with the game world. Ultima is an example of a game series that has advanced from non-branching (Ultima III: Exodus and earlier) to branching dialogue (from Ultima IV: Quest of the Avatar and on). Other role-playing games with branching dialogues include Cosmic Soldier, Megami Tensei, Fire Emblem, Metal Max, Langrisser, SaGa, Ogre Battle, Chrono, Star Ocean, Sakura Wars, Mass Effect, Dragon Age, Radiant Historia, and several Dragon Quest and Final Fantasy games. Certain video game genres revolve almost entirely around interactions with non-player characters, including visual novels such as Ace Attorney and dating sims such as Tokimeki Memorial, usually featuring complex branching dialogues and often presenting the player's possible responses word-for-word as the player character would say them. Games revolving around relationship-building, including visual novels, dating sims such as Tokimeki Memorial, and some role-playing games such as Persona, often give choices that have a different number of associated "mood points" that influence a player character's relationship and future conversations with a non-player character.
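The branching-dialogue and "mood point" mechanics described above reduce to a walk over a small graph: each node holds an NPC line and a set of player choices, and each choice carries an affinity delta and points to the next node. A minimal sketch in Python; the node ids, lines, and point values are invented for illustration, not taken from any shipped game:

```python
# Minimal branching-dialogue sketch: each node holds an NPC line plus the
# player choices leading onward; each choice can adjust "mood points" that
# later conversations might consult. All names and values are illustrative.

from dataclasses import dataclass, field

@dataclass
class Choice:
    text: str          # what the player says
    next_node: str     # id of the node this choice leads to
    mood_delta: int = 0

@dataclass
class Node:
    npc_line: str
    choices: list[Choice] = field(default_factory=list)  # empty = dialogue ends

DIALOGUE = {
    "greet": Node("Welcome, traveler. Need supplies?", [
        Choice("Show me your wares.", "shop", mood_delta=+1),
        Choice("Out of my way.", "end", mood_delta=-2),
    ]),
    "shop": Node("Finest gear in town, friend.", [
        Choice("Thanks, that's all.", "end", mood_delta=+1),
    ]),
    "end": Node("Safe travels."),
}

def run(dialogue, start="greet", script=None):
    """Walk the tree; `script` is an optional list of pre-picked choice numbers."""
    mood, node_id, script = 0, start, list(script or [])
    while True:
        node = dialogue[node_id]
        print(f"NPC: {node.npc_line}")
        if not node.choices:
            return mood
        for i, c in enumerate(node.choices, 1):
            print(f"  {i}. {c.text}")
        pick = node.choices[(script.pop(0) if script else int(input("> "))) - 1]
        mood += pick.mood_delta
        node_id = pick.next_node

print("final mood:", run(DIALOGUE, script=[1, 1]))  # polite path: mood +2
```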
These games often feature a day-night cycle with a time scheduling system that provides context and relevance to character interactions, allowing players to choose when and if to interact with certain characters, which in turn influences their responses during later conversations. In 2023, Replica Studios unveiled its AI-developed NPCs for Unreal Engine 5, in cooperation with OpenAI, which enable players to have an interactive conversation with non-playable characters. "NPC streaming"—livestreaming while mimicking the behaviors of an NPC—became popular on TikTok in 2023 and was largely popularized by livestreamer Pinkydoll. Other usage From around 2018, the term NPC became an insult, primarily online, to suggest that a person is unable to form thoughts or opinions of their own. This is sometimes illustrated with a grey-faced, expressionless version of the Wojak meme. Monetization NPC streaming is a type of livestream that allows users to participate in and shape the content they are viewing in real time. It has become widely popular as influencers and users of social media platforms such as TikTok utilize livestreams to act as non-player characters. "Viewers in NPC live streams take on the role of puppeteers, influencing the creator's next move." This phenomenon has been on the rise as viewers are actively involved in what they are watching, by purchasing digital "gifts" and sending them directly to the streamer. In return, the streamer will briefly mimic a character or act. The phenomenon became a trend in July 2023, as influencers began to profit from this new internet persona. Pinkydoll, a TikTok influencer, gained 400,000 followers the same month that she started NPC streaming, while her livestreams began to earn her as much as $7,000 in a day. NPC streaming gives creators a new avenue to earn money online. Despite this, some creators have quit because of the stigma that comes with the format. For example, Malik Ambersley, a pioneer of the NPC trend, has been robbed, accosted by police, and gotten into fights due to the controversial nature of his act.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Social_network#cite_note-28] | [TOKENS: 5247] |
Social network A social network is a social structure consisting of a set of social actors (such as individuals or organizations), networks of dyadic ties, and other social interactions between actors. The social network perspective provides a set of methods for analyzing the structure of whole social entities along with a variety of theories explaining the patterns observed in these structures. The study of these structures uses social network analysis to identify local and global patterns, locate influential entities, and examine dynamics of networks. For instance, social network analysis has been used in studying the spread of misinformation on social media platforms or analyzing the influence of key figures in social networks. Social networks and the analysis of them is an inherently interdisciplinary academic field which emerged from social psychology, sociology, statistics, and graph theory. Georg Simmel authored early structural theories in sociology emphasizing the dynamics of triads and "web of group affiliations". Jacob Moreno is credited with developing the first sociograms in the 1930s to study interpersonal relationships. These approaches were mathematically formalized in the 1950s and theories and methods of social networks became pervasive in the social and behavioral sciences by the 1980s. Social network analysis is now one of the major paradigms in contemporary sociology, and is also employed in a number of other social and formal sciences. Together with other complex networks, it forms part of the nascent field of network science. Overview The social network is a theoretical construct useful in the social sciences to study relationships between individuals, groups, organizations, or even entire societies (social units, see differentiation). The term is used to describe a social structure determined by such interactions. The ties through which any given social unit connects represent the convergence of the various social contacts of that unit. This theoretical approach is, necessarily, relational. An axiom of the social network approach to understanding social interaction is that social phenomena should be primarily conceived and investigated through the properties of relations between and within units, instead of the properties of these units themselves. Thus, one common criticism of social network theory is that individual agency is often ignored although this may not be the case in practice (see agent-based modeling). Precisely because many different types of relations, singular or in combination, form these network configurations, network analytics are useful to a broad range of research enterprises. In social science, these fields of study include, but are not limited to anthropology, biology, communication studies, economics, geography, information science, organizational studies, social psychology, sociology, and sociolinguistics. History In the late 1890s, both Émile Durkheim and Ferdinand Tönnies foreshadowed the idea of social networks in their theories and research of social groups. Tönnies argued that social groups can exist as personal and direct social ties that either link individuals who share values and belief (Gemeinschaft, German, commonly translated as "community") or impersonal, formal, and instrumental social links (Gesellschaft, German, commonly translated as "society").
Durkheim gave a non-individualistic explanation of social facts, arguing that social phenomena arise when interacting individuals constitute a reality that can no longer be accounted for in terms of the properties of individual actors. Georg Simmel, writing at the turn of the twentieth century, pointed to the nature of networks and the effect of network size on interaction and examined the likelihood of interaction in loosely knit networks rather than groups. Major developments in the field can be seen in the 1930s by several groups in psychology, anthropology, and mathematics working independently. In psychology, in the 1930s, Jacob L. Moreno began systematic recording and analysis of social interaction in small groups, especially classrooms and work groups (see sociometry). In anthropology, the foundation for social network theory is the theoretical and ethnographic work of Bronislaw Malinowski, Alfred Radcliffe-Brown, and Claude Lévi-Strauss. A group of social anthropologists associated with Max Gluckman and the Manchester School, including John A. Barnes, J. Clyde Mitchell and Elizabeth Bott Spillius, often are credited with performing some of the first fieldwork from which network analyses were performed, investigating community networks in southern Africa, India and the United Kingdom. Concomitantly, British anthropologist S. F. Nadel codified a theory of social structure that was influential in later network analysis. In sociology, the early (1930s) work of Talcott Parsons set the stage for taking a relational approach to understanding social structure. Later, drawing upon Parsons' theory, the work of sociologist Peter Blau provided a strong impetus for analyzing the relational ties of social units with his work on social exchange theory. By the 1970s, a growing number of scholars worked to combine the different tracks and traditions. One group consisted of sociologist Harrison White and his students at the Harvard University Department of Social Relations. Also independently active in the Harvard Social Relations department at the time were Charles Tilly, who focused on networks in political and community sociology and social movements, and Stanley Milgram, who developed the "six degrees of separation" thesis. Mark Granovetter and Barry Wellman are among the former students of White who elaborated and championed the analysis of social networks. Beginning in the late 1990s, social network analysis was advanced by sociologists, political scientists, and physicists such as Duncan J. Watts, Albert-László Barabási, Peter Bearman, Nicholas A. Christakis, James H. Fowler, and others, who developed and applied new models and methods to emerging data available about online social networks, as well as "digital traces" regarding face-to-face networks. Levels of analysis In general, social networks are self-organizing, emergent, and complex, such that a globally coherent pattern appears from the local interaction of the elements that make up the system. These patterns become more apparent as network size increases. However, a global network analysis of, for example, all interpersonal relationships in the world is not feasible and is likely to contain so much information as to be uninformative. Practical limitations of computing power, ethics and participant recruitment and payment also limit the scope of a social network analysis.
The nuances of a local system may be lost in a large network analysis, hence the quality of information may be more important than its scale for understanding network properties. Thus, social networks are analyzed at the scale relevant to the researcher's theoretical question. Although levels of analysis are not necessarily mutually exclusive, there are three general levels into which networks may fall: micro-level, meso-level, and macro-level.
At the micro-level, social network research typically begins with an individual, snowballing as social relationships are traced, or may begin with a small group of individuals in a particular social context.
Dyadic level: A dyad is a social relationship between two individuals. Network research on dyads may concentrate on the structure of the relationship (e.g. multiplexity, strength), social equality, and tendencies toward reciprocity/mutuality.
Triadic level: Add one individual to a dyad, and you have a triad. Research at this level may concentrate on factors such as balance and transitivity, as well as social equality and tendencies toward reciprocity/mutuality. In the balance theory of Fritz Heider the triad is the key to social dynamics. The discord in a rivalrous love triangle is an example of an unbalanced triad, likely to change to a balanced triad by a change in one of the relations. The dynamics of social friendships in society has been modeled by balancing triads. The study is carried forward with the theory of signed graphs.
Actor level: The smallest unit of analysis in a social network is an individual in their social setting, i.e., an "actor" or "ego". Ego network analysis focuses on network characteristics such as size, relationship strength, density, centrality, prestige, and roles such as isolates, liaisons, and bridges. Such analyses are most commonly used in the fields of psychology or social psychology, ethnographic kinship analysis, or other genealogical studies of relationships between individuals.
Subset level: Subset levels of network research problems begin at the micro-level, but may cross over into the meso-level of analysis. Subset level research may focus on distance and reachability, cliques, cohesive subgroups, or other group actions or behavior.
In general, meso-level theories begin with a population size that falls between the micro- and macro-levels. However, meso-level may also refer to analyses that are specifically designed to reveal connections between micro- and macro-levels. Meso-level networks are low density and may exhibit causal processes distinct from interpersonal micro-level networks.
Organizations: Formal organizations are social groups that distribute tasks for a collective goal. Network research on organizations may focus on either intra-organizational or inter-organizational ties in terms of formal or informal relationships. Intra-organizational networks themselves often contain multiple levels of analysis, especially in larger organizations with multiple branches, franchises or semi-autonomous departments. In these cases, research is often conducted at a work group level and organization level, focusing on the interplay between the two structures. Experiments with networked groups online have documented ways to optimize group-level coordination through diverse interventions, including the addition of autonomous agents to the groups.
Randomly distributed networks: Exponential random graph models of social networks became state-of-the-art methods of social network analysis in the 1980s.
This framework has the capacity to represent social-structural effects commonly observed in many human social networks, including general degree-based structural effects as well as reciprocity and transitivity, and, at the node level, homophily and attribute-based activity and popularity effects, as derived from explicit hypotheses about dependencies among network ties. Parameters are given in terms of the prevalence of small subgraph configurations in the network and can be interpreted as describing the combinations of local social processes from which a given network emerges. These probability models for networks on a given set of actors allow generalization beyond the restrictive dyadic independence assumption of micro-networks, allowing models to be built from theoretical structural foundations of social behavior.
Scale-free networks: A scale-free network is a network whose degree distribution follows a power law, at least asymptotically. In network theory a scale-free ideal network is a random network with a degree distribution that unravels the size distribution of social groups. Specific characteristics of scale-free networks vary with the theories and analytical tools used to create them; in general, however, scale-free networks have some common characteristics. One notable characteristic of a scale-free network is the relative commonness of vertices with a degree that greatly exceeds the average. The highest-degree nodes are often called "hubs", and may serve specific purposes in their networks, although this depends greatly on the social context. Another general characteristic of scale-free networks is the clustering coefficient distribution, which decreases as the node degree increases. This distribution also follows a power law. The Barabási model of network evolution is an example of a scale-free network.
Rather than tracing interpersonal interactions, macro-level analyses generally trace the outcomes of interactions, such as economic or other resource transfer interactions over a large population.
Large-scale networks: Large-scale network is a term somewhat synonymous with "macro-level". It is primarily used in social and behavioral sciences, and in economics. Originally, the term was used extensively in the computer sciences (see large-scale network mapping).
Complex networks: Most larger social networks display features of social complexity, which involves substantial non-trivial features of network topology, with patterns of complex connections between elements that are neither purely regular nor purely random (see complexity science, dynamical system and chaos theory), as do biological and technological networks. Such complex network features include a heavy tail in the degree distribution, a high clustering coefficient, assortativity or disassortativity among vertices, community structure (see stochastic block model), and hierarchical structure. In the case of agency-directed networks these features also include reciprocity, triad significance profile (TSP, see network motif), and other features. In contrast, many of the mathematical models of networks that have been studied in the past, such as lattices and random graphs, do not show these features.
Theoretical links Various theoretical frameworks have been imported for the use of social network analysis. The most prominent of these are Graph theory, Balance theory, Social comparison theory, and more recently, the Social identity approach.
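As a concrete illustration of the balance-theoretic framework just mentioned, the sketch below applies Heider's rule — a signed triad is balanced when the product of its three edge signs is positive ("the friend of my friend is my friend") — to a tiny invented set of ties:

```python
# Sketch of Heider's balance rule on signed triads. A triad is balanced when
# the product of its three edge signs is positive. The graph is invented.

from itertools import combinations

# Signed ties: +1 friendship, -1 rivalry (undirected, illustrative data).
signs = {
    ("ann", "bob"): +1,
    ("bob", "cat"): +1,
    ("ann", "cat"): -1,   # the unbalanced "love triangle" discord
    ("bob", "dan"): -1,
    ("ann", "dan"): -1,   # balanced: a shared rival
}

def sign(u, v):
    return signs.get((u, v)) or signs.get((v, u))

people = {p for pair in signs for p in pair}
for triad in combinations(sorted(people), 3):
    edges = [sign(u, v) for u, v in combinations(triad, 2)]
    if None in edges:
        continue  # not a complete triad; nothing to classify
    state = "balanced" if edges[0] * edges[1] * edges[2] > 0 else "unbalanced"
    print(triad, state)
```

An unbalanced triad, such as ann–bob–cat above, is exactly the configuration the text says tends to resolve by a change in one of its relations.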
Few complete theories have been produced from social network analysis. Two that have are structural role theory and heterophily theory. The basis of Heterophily Theory was the finding in one study that more numerous weak ties can be important in seeking information and innovation, as cliques have a tendency to have more homogeneous opinions as well as share many common traits. This homophilic tendency was the reason for the members of the cliques to be attracted together in the first place. However, being similar, each member of the clique would also know more or less what the other members knew. To find new information or insights, members of the clique will have to look beyond the clique to its other friends and acquaintances. This is what Granovetter called "the strength of weak ties". Structural holes In the context of networks, social capital exists where people have an advantage because of their location in a network. Contacts in a network provide information, opportunities and perspectives that can be beneficial to the central player in the network. Most social structures tend to be characterized by dense clusters of strong connections. Information within these clusters tends to be rather homogeneous and redundant. Non-redundant information is most often obtained through contacts in different clusters. When two separate clusters possess non-redundant information, there is said to be a structural hole between them. Thus, a network that bridges structural holes will provide network benefits that are in some degree additive, rather than overlapping. An ideal network structure has a vine and cluster structure, providing access to many different clusters and structural holes. Networks rich in structural holes are a form of social capital in that they offer information benefits. The main player in a network that bridges structural holes is able to access information from diverse sources and clusters. For example, in business networks, this is beneficial to an individual's career because he is more likely to hear of job openings and opportunities if his network spans a wide range of contacts in different industries/sectors. This concept is similar to Mark Granovetter's theory of weak ties, which rests on the basis that having a broad range of contacts is most effective for job attainment. Structural holes have been widely applied in social network analysis, resulting in applications in a wide range of practical scenarios as well as machine learning-based social prediction. Research clusters Research has used network analysis to examine networks created when artists are exhibited together in museum exhibitions. Such networks have been shown to affect an artist's recognition in history and historical narratives, even when controlling for individual accomplishments of the artist. Other work examines how network grouping of artists can affect an individual artist's auction performance. An artist's status has been shown to increase when associated with higher status networks, though this association has diminishing returns over an artist's career. In J.A. Barnes' day, a "community" referred to a specific geographic location and studies of community ties had to do with who talked, associated, traded, and attended church with whom. Today, however, there are extended "online" communities developed through telecommunications devices and social network services. Such devices and services require extensive and ongoing maintenance and analysis, often using network science methods.
Community development studies, today, also make extensive use of such methods. Complex networks require methods specific to modelling and interpreting social complexity and complex adaptive systems, including techniques of dynamic network analysis. Mechanisms such as dual-phase evolution explain how temporal changes in connectivity contribute to the formation of structure in social networks. The study of social networks is being used to examine the nature of interdependencies between actors and the ways in which these are related to outcomes of conflict and cooperation. Areas of study include cooperative behavior among participants in collective actions such as protests; promotion of peaceful behavior, social norms, and public goods within communities through networks of informal governance; the role of social networks in both intrastate conflict and interstate conflict; and social networking among politicians, constituents, and bureaucrats. In criminology and urban sociology, much attention has been paid to the social networks among criminal actors. For example, murders can be seen as a series of exchanges between gangs. Murders can be seen to diffuse outwards from a single source, because weaker gangs cannot afford to kill members of stronger gangs in retaliation, but must commit other violent acts to maintain their reputation for strength. Diffusion of ideas and innovations studies focus on the spread and use of ideas from one actor to another, or from one culture to another. This line of research seeks to explain why some become "early adopters" of ideas and innovations, and links social network structure with facilitating or impeding the spread of an innovation. A case in point is the social diffusion of linguistic innovation such as neologisms. Experiments and large-scale field trials (e.g., by Nicholas Christakis and collaborators) have shown that cascades of desirable behaviors can be induced in social groups, in settings as diverse as Honduras villages, Indian slums, or in the lab. Still other experiments have documented the experimental induction of social contagion of voting behavior, emotions, risk perception, and commercial products. In demography, the study of social networks has led to new sampling methods for estimating and reaching populations that are hard to enumerate (for example, homeless people or intravenous drug users). For example, respondent driven sampling is a network-based sampling technique that relies on respondents to a survey recommending further respondents. The field of sociology focuses almost entirely on networks of outcomes of social interactions. More narrowly, economic sociology considers behavioral interactions of individuals and groups through social capital and social "markets". Sociologists, such as Mark Granovetter, have developed core principles about the interactions of social structure, information, ability to punish or reward, and trust that frequently recur in their analyses of political, economic and other institutions. Granovetter examines how social structures and social networks can affect economic outcomes like hiring, price, productivity and innovation and describes sociologists' contributions to analyzing the impact of social structure and networks on the economy.
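The diffusion and cascade studies described above are often explored with simple contagion simulations. Below is a sketch of the independent-cascade model, one standard formalization (not necessarily the one used in any study cited here); the networkx library and the small-world substrate are tooling assumptions of this sketch:

```python
# Sketch of an independent-cascade diffusion model: seed nodes adopt an
# innovation, and each new adopter gets one chance to convert each neighbor
# with probability p. One standard formalization, offered for illustration.

import random
import networkx as nx  # assumed dependency; any graph library would do

def independent_cascade(G, seeds, p=0.1, seed=0):
    """Spread from `seeds`; each new adopter gets one try per neighbor."""
    rng = random.Random(seed)
    adopted, frontier = set(seeds), set(seeds)
    while frontier:
        nxt = set()
        for node in frontier:
            for nbr in G.neighbors(node):
                if nbr not in adopted and rng.random() < p:
                    adopted.add(nbr)
                    nxt.add(nbr)
        frontier = nxt
    return adopted

G = nx.watts_strogatz_graph(n=1_000, k=6, p=0.05, seed=1)  # small-world substrate
spread = independent_cascade(G, seeds={0, 1, 2}, p=0.15)
print(f"{len(spread)} of {G.number_of_nodes()} nodes eventually adopted")
```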
Analysis of social networks is increasingly incorporated into health care analytics, not only in epidemiological studies but also in models of patient communication and education, disease prevention, mental health diagnosis and treatment, and in the study of health care organizations and systems. Human ecology is an interdisciplinary and transdisciplinary study of the relationship between humans and their natural, social, and built environments. The scientific philosophy of human ecology has a diffuse history with connections to geography, sociology, psychology, anthropology, zoology, and natural ecology. In the study of literary systems, network analysis has been applied by Anheier, Gerhards and Romo, De Nooy, Senekal, and Lotker, to study various aspects of how literature functions. The basic premise is that polysystem theory, which has been around since the writings of Even-Zohar, can be integrated with network theory and the relationships between different actors in the literary network, e.g. writers, critics, publishers, literary histories, etc., can be mapped using visualization from SNA. Network research also covers studies of formal or informal organizational relationships, organizational communication, economics, economic sociology, and other resource transfers. Social networks have also been used to examine how organizations interact with each other, characterizing the many informal connections that link executives together, as well as associations and connections between individual employees at different organizations. Many organizational social network studies focus on teams. Within team network studies, research assesses, for example, the predictors and outcomes of centrality and power, density and centralization of team instrumental and expressive ties, and the role of between-team networks. Intra-organizational networks have been found to affect organizational commitment, organizational identification, and interpersonal citizenship behaviour. Social capital is a form of economic and cultural capital in which social networks are central, transactions are marked by reciprocity, trust, and cooperation, and market agents produce goods and services not mainly for themselves, but for a common good. Social capital is split into three dimensions: the structural, the relational and the cognitive dimension. The structural dimension describes how partners interact with each other and which specific partners meet in a social network. Also, the structural dimension of social capital indicates the level of ties among organizations. This dimension is highly connected to the relational dimension which refers to trustworthiness, norms, expectations and identifications of the bonds between partners. The relational dimension explains the nature of these ties which is mainly illustrated by the level of trust accorded to the network of organizations. The cognitive dimension analyses the extent to which organizations share common goals and objectives as a result of their ties and interactions. Social capital is a sociological concept about the value of social relations and the role of cooperation and confidence to achieve positive outcomes. The term refers to the value one can get from their social ties. For example, newly arrived immigrants can make use of their social ties to established migrants to acquire jobs they may otherwise have trouble getting (e.g., because of unfamiliarity with the local language). A positive relationship exists between social capital and the intensity of social network use.
In a dynamic framework, higher activity in a network feeds into higher social capital which itself encourages more activity. This particular research cluster focuses on brand-image and promotional strategy effectiveness, taking into account the impact of customer participation on sales and brand-image. This is gauged through techniques such as sentiment analysis which rely on mathematical areas of study such as data mining and analytics. This area of research produces vast numbers of commercial applications as the main goal of any study is to understand consumer behaviour and drive sales. In many organizations, members tend to focus their activities inside their own groups, which stifles creativity and restricts opportunities. A player whose network bridges structural holes has an advantage in detecting and developing rewarding opportunities. Such a player can mobilize social capital by acting as a "broker" of information between two clusters that otherwise would not have been in contact, thus providing access to new ideas, opinions and opportunities. British philosopher and political economist John Stuart Mill writes, "it is hardly possible to overrate the value of placing human beings in contact with persons dissimilar to themselves.... Such communication [is] one of the primary sources of progress." Thus, a player with a network rich in structural holes can add value to an organization through new ideas and opportunities. This, in turn, helps an individual's career development and advancement. A social capital broker also reaps control benefits of being the facilitator of information flow between contacts. Full communication with exploratory mindsets and information exchange generated by dynamically alternating positions in a social network promotes creative and deep thinking. In the case of the consulting firm Eden McCallum, the founders were able to advance their careers by bridging their connections with former big three consulting firm consultants and mid-size industry firms. By bridging structural holes and mobilizing social capital, players can advance their careers by executing new opportunities between contacts. There has been research that both substantiates and refutes the benefits of information brokerage. A study of high-tech Chinese firms by Zhixing Xiao found that the control benefits of structural holes are "dissonant to the dominant firm-wide spirit of cooperation and the information benefits cannot materialize due to the communal sharing values" of such organizations. However, this study only analyzed Chinese firms, which tend to have strong communal sharing values. Information and control benefits of structural holes are still valuable in firms that are not quite as inclusive and cooperative on the firm-wide level. In 2004, Ronald Burt studied 673 managers who ran the supply chain for one of America's largest electronics companies. He found that managers who often discussed issues with other groups were better paid, received more positive job evaluations and were more likely to be promoted. Thus, bridging structural holes can be beneficial to an organization, and in turn, to an individual's career. Computer networks combined with social networking software produce a new medium for social interaction. A relationship over a computerized social networking service can be characterized by content, direction, and strength. The content of a relation refers to the resource that is exchanged.
In a computer-mediated communication context, social pairs exchange different kinds of information, including sending a data file or a computer program as well as providing emotional support or arranging a meeting. With the rise of electronic commerce, information exchanged may also correspond to exchanges of money, goods, or services in the "real" world. Social network analysis methods have become essential to examining these types of computer-mediated communication. In addition, the sheer size and the volatile nature of social media have given rise to new network metrics. A key concern with networks extracted from social media is the lack of robustness of network metrics given missing data. Following the pattern of homophily, ties between people are most likely to form between nodes that are most similar to each other; likewise, under neighbourhood segregation, individuals are most likely to inhabit the same regional areas as other individuals who are like them. Therefore, social networks can be used as a tool to measure the degree of segregation or homophily within a population. Social networks can be used both to simulate the process of homophily and to measure the level of exposure of different groups to each other within a current social network of individuals in a certain area; the sketch below illustrates one way such measurements can be computed.
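The following sketch assumes the Python library networkx; the tiny friendship graph, its "group" labels, and the node names are hypothetical illustrations rather than data from any study discussed in this section. It computes Burt-style network constraint (low constraint marks a broker spanning structural holes) alongside two simple homophily measures.

    import networkx as nx

    # A hypothetical friendship network: two tight clusters plus one
    # node, "g", whose ties bridge them.
    G = nx.Graph()
    G.add_edges_from([
        ("a", "b"), ("b", "c"), ("a", "c"),   # cluster in group 1
        ("d", "e"), ("e", "f"), ("d", "f"),   # cluster in group 2
        ("g", "a"), ("g", "d"),               # g bridges the clusters
    ])
    groups = {"a": 1, "b": 1, "c": 1, "d": 2, "e": 2, "f": 2, "g": 1}
    nx.set_node_attributes(G, groups, "group")

    # Burt's constraint: lower values mean a node's contacts are not
    # redundant, i.e. the node spans structural holes as a "broker".
    constraint = nx.constraint(G)
    print(min(constraint, key=constraint.get))        # -> g

    # Homophily as attribute assortativity: +1 means ties form only
    # within groups, 0 is random mixing, negative is cross-group mixing.
    print(nx.attribute_assortativity_coefficient(G, "group"))

    # A cruder segregation measure: the share of within-group ties.
    within = sum(1 for u, v in G.edges if groups[u] == groups[v])
    print(within / G.number_of_edges())               # 7 of 8 ties

On this toy graph, the bridging node g has the lowest constraint and seven of the eight ties stay inside a group, matching the broker advantage and the homophily pattern described above.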
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Birthday#cite_ref-33] | [TOKENS: 4101] |
Contents Birthday A birthday is the anniversary of the birth of a person or the figurative birth of an institution. Birthdays of people are celebrated in numerous cultures, often with birthday gifts, birthday cards, a birthday party, or a rite of passage. Many religions celebrate the birth of their founders or religious figures with special holidays (e.g. Christmas, Mawlid, Buddha's Birthday, Krishna Janmashtami, and Gurpurb). There is a distinction between birthday and birthdate (also known as date of birth): the former, except for February 29, occurs each year (e.g. January 15), while the latter is the complete date when a person was born (e.g. January 15, 2001). Coming of age In most legal systems, one becomes a legal adult on the birthday when they reach the age of majority, and reaching age-specific milestones (usually between 12 and 21) confers particular rights and responsibilities. At certain ages, one may become eligible to leave full-time education, become subject to military conscription or to enlist in the military, to consent to sexual intercourse, to marry with parental consent, to marry without parental consent, to vote, to run for elected office, to legally purchase (or consume) alcohol and tobacco products, to purchase lottery tickets, or to obtain a driver's licence. The age of majority is when minors cease to legally be considered children and assume control over their persons, actions, and decisions, thereby terminating the legal control and responsibilities of their parents or guardians over and for them. Most countries set the age of majority at 18, though it varies by jurisdiction. Many cultures celebrate a coming-of-age birthday when a person reaches a particular year of life. Some cultures celebrate landmark birthdays in early life or old age. In many cultures and jurisdictions, if a person's real birthday is unknown (for example, if they are an orphan), their birthday may be adopted or assigned to a specific day of the year, such as January 1. Racehorses are reckoned to become one year old in the year following their birth on January 1 in the Northern Hemisphere and August 1 in the Southern Hemisphere. Birthday parties In certain parts of the world, an individual's birthday is celebrated by a party featuring a specially made cake. Presents appropriate to their age are bestowed on the individual by the guests. Other birthday activities may include entertainment (sometimes by a hired professional, e.g., a clown, magician, or musician) and a special toast or speech by the birthday celebrant. The last stanza of Patty Hill's and Mildred Hill's famous song "Good Morning to All" (unofficially titled "Happy Birthday to You") is typically sung by the guests at some point in the proceedings. In some countries, a piñata takes the place of a cake. The birthday cake may be decorated with lettering and the person's age, or studded with the same number of lit candles as the age of the individual. The celebrated individual may make a silent wish and attempt to blow out the candles in one breath; if successful, superstition holds that the wish will be granted. In many cultures, the wish must be kept secret or it will not "come true". Birthdays as holidays Historically significant people's birthdays, such as national heroes or founders, are often commemorated by an official holiday marking the anniversary of their birth.
Some notables, particularly monarchs, have an official birthday on a fixed day of the year, which may not necessarily match the day of their birth, but on which celebrations are held. In Mahayana Buddhism, many monasteries celebrate the anniversary of Buddha's birth, usually in a highly formal, ritualized manner. They treat the statue of Buddha as if it were Buddha himself, alive, bathing and "feeding" him. Jesus Christ's traditional birthday is celebrated as Christmas Eve or Christmas Day around the world, on December 24 or 25, respectively. Because some Eastern churches use the Julian calendar, their December 25 falls on January 7 in the Gregorian calendar. These dates are traditional and have no connection with Jesus's actual birthday, which is not recorded in the Gospels. Similarly, the birthdays of the Virgin Mary and John the Baptist are liturgically celebrated on September 8 and June 24, especially in the Roman Catholic and Eastern Orthodox traditions (although for those Eastern Orthodox churches using the Julian calendar the corresponding Gregorian dates are September 21 and July 7, respectively). As with Christmas, the dates of these celebrations are traditional and probably have no connection with the actual birthdays of these individuals. Catholic saints are remembered by a liturgical feast on the anniversary of their "birth" into heaven, that is, their day of death. In Hinduism, Ganesh Chaturthi is a festival celebrating the birth of the elephant-headed deity Ganesha in extensive community celebrations and at home. Figurines of Ganesha are made for the holiday and are widely sold. Sikhs celebrate the anniversary of the birth of Guru Nanak and other Sikh gurus, which is known as Gurpurb. Mawlid is the anniversary of the birth of Muhammad and is celebrated on the 12th or 17th day of Rabi' al-awwal by adherents of Sunni and Shia Islam, respectively. These are the two most commonly accepted dates of birth of Muhammad. However, there is much controversy regarding the permissibility of celebrating Mawlid, as some Muslims judge the custom an unacceptable practice according to Islamic tradition. In Iran, Mother's Day is celebrated on the birthday of Fatima al-Zahra, the daughter of Muhammad. Banners reading Ya Fatima ("O Fatima") are displayed on government buildings, private buildings, public streets, and car windows. Religious views In Judaism, rabbis are divided about celebrating this custom, although the majority of the faithful accept it. In the Torah, the only mention of a birthday is the celebration of Pharaoh's birthday in Egypt (Genesis 40:20). Although the birthday of Jesus of Nazareth is celebrated as a Christian holiday on December 25, historically the celebrating of an individual person's birthday has been subject to theological debate. Early Christians, notes The World Book Encyclopedia, "considered the celebration of anyone's birth to be a pagan custom." Origen, in his commentary "On Levites," wrote that Christians should not only refrain from celebrating their birthdays but should look at them with disgust, as a pagan custom. A saint's day was typically celebrated on the anniversary of their martyrdom or death, considered the occasion of or preparation for their entrance into Heaven or the New Jerusalem.
Ordinary folk in the Middle Ages celebrated their saint's day (the saint they were named after), but nobility celebrated the anniversary of their birth. The "Squire's Tale", one of Chaucer's Canterbury Tales, opens as King Cambuskan proclaims a feast to celebrate his birthday. In the Modern era, the Catholic Church, the Eastern Orthodox Church, and Protestantism, i.e. the three main branches of Christianity, as well as almost all Christian religious denominations, consider celebrating birthdays acceptable or at most a choice of the individual. An exception is Jehovah's Witnesses, who do not celebrate them for various reasons: in their interpretation this feast has pagan origins, was not celebrated by early Christians, is negatively expounded in the Holy Scriptures, and has customs linked to superstition and magic. In some historically Roman Catholic and Eastern Orthodox countries,[a] it is common to have a 'name day', otherwise known as a 'Saint's day'. It is celebrated in much the same way as a birthday, but it is held on the official day of a saint with the same Christian name as the birthday person; the difference is that one may look up a person's name day in a calendar, or easily remember common name days (for example, John or Mary). However, in pious traditions, the two were often made to concur by giving a newborn the name of a saint celebrated on its day of confirmation, or more seldom on its birthday. Some are given the name of the religious feast of their christening day or birthday, for example, Noel or Pascal (French for Christmas and "of Easter"); as another example, Togliatti was given Palmiro as his first name because he was born on Palm Sunday. The birthday does not reflect Islamic tradition, and because of this, the majority of Muslims refrain from celebrating it. Others do not object, as long as it is not accompanied by behavior contrary to Islamic tradition. A good portion of Muslims (and Arab Christians) who have emigrated to the United States and Europe celebrate birthdays as customary, especially for children, while others abstain. Hindus celebrate the birth anniversary every year on the day that falls within the lunar or solar month (Sun Signs Nirayana System – Sourava Mana Masa) of birth and has the same asterism (star/nakshatra) as the date of birth. A year of age is reckoned whenever the Janma Nakshatra of the same month passes. Hindus regard death to be more auspicious than birth, since the person is liberated from the bondages of material society. Also, traditionally, rituals and prayers for the departed are observed on the 5th and 11th days, with many relatives gathering. Historical and cultural perspectives According to Herodotus (5th century BC), of all the days in the year, the one which the Persians celebrate most is their birthday. It was customary to have the board furnished on that day with an ampler supply than common: the richer people eat an ox, horse, camel, or donkey (Greek: ὄνον) baked whole, while the poorer classes use the smaller kinds of cattle instead. On his birthday, the king anointed his head and presented gifts to the Persians. According to the law of the Royal Supper, on that day "no one should be refused a request". The rule for drinking was "No restrictions". In ancient Rome, a birthday (dies natalis) was originally an act of religious cultivation (cultus).
A dies natalis was celebrated annually for a temple on the day of its founding, and the term is still used sometimes for the anniversary of an institution such as a university. The temple founding day might become the "birthday" of the deity housed there. March 1, for example, was celebrated as the birthday of the god Mars. Each human likewise had a natal divinity, the guardian spirit called the Genius, or sometimes the Juno for a woman, who was owed religious devotion on the day of birth, usually in the household shrine (lararium). The decoration of a lararium often shows the Genius in the role of the person carrying out the rites. A person marked their birthday with ritual acts that might include lighting an altar, saying prayers, making vows (vota), anointing and wreathing a statue of the Genius, or sacrificing to a patron deity. Incense, cakes, and wine were common offerings. Celebrating someone else's birthday was a way to show affection, friendship, or respect. In exile, the poet Ovid, though alone, celebrated not only his own birthday rite but that of his far distant wife. Birthday parties affirmed social as well as sacred ties. One of the Vindolanda tablets is an invitation to a birthday party from the wife of one Roman officer to the wife of another. Books were a popular birthday gift, sometimes handcrafted as a luxury edition or composed especially for the person honored. Birthday poems are a minor but distinctive genre of Latin literature. The banquets, libations, and offerings or gifts that were a regular part of most Roman religious observances thus became part of birthday celebrations for individuals. A highly esteemed person would continue to be celebrated on their birthday after death, in addition to the several holidays on the Roman calendar for commemorating the dead collectively. Birthday commemoration was considered so important that money was often bequeathed to a social organization to fund an annual banquet in the deceased's honor. The observance of a patron's birthday or the honoring of a political figure's Genius was one of the religious foundations for imperial cult or so-called "emperor worship." The Chinese word for "year(s) old" (t 歲, s 岁, suì) is entirely different from the usual word for "year(s)" (年, nián), reflecting the former importance of Chinese astrology and the belief that one's fate was bound to the stars imagined to be in opposition to the planet Jupiter at the time of one's birth. The importance of this duodecennial orbital cycle only survives in popular culture as the 12 animals of the Chinese zodiac, which change each Chinese New Year and may be used as a theme for some gifts or decorations. Because of the importance attached to the influence of these stars in ancient China and throughout the Sinosphere, East Asian age reckoning previously began with one at birth and then added years at each Chinese New Year, so that it formed a record of the suì one had lived through rather than of the exact amount of time from one's birth. This method—which can differ by as much as two years of age from other systems—is increasingly uncommon and is not used for official purposes in the PRC or on Taiwan, although the word suì is still used for describing age. Traditionally, Chinese birthdays—when celebrated—were reckoned using the lunisolar calendar, which varies from the Gregorian calendar by as much as a month forward or backward depending on the year. 
Celebrating the lunisolar birthday remains common on Taiwan while growing increasingly uncommon on the mainland. Birthday traditions reflected the culture's deep-seated focus on longevity and wordplay. From the homophony in some dialects between 酒 ("rice wine") and 久 (meaning "long" in the sense of time passing), osmanthus and other rice wines are traditional gifts for birthdays in China. Longevity noodles are another traditional food consumed on the day, although western-style birthday cakes are increasingly common among urban Chinese. Hongbaos—red envelopes stuffed with money, now especially the red 100 RMB notes—are the usual gift from relatives and close family friends for most children. Gifts for adults on their birthdays are much less common, although the birthday for each decade is a larger occasion that might prompt a large dinner and celebration. The Japanese reckoned their birthdays by the Chinese system until the Meiji Reforms. Celebrations remained uncommon or muted until after the American occupation that followed World War II. Children's birthday parties are the most important, typically celebrated with a cake, candles, and singing. Adults often just celebrate with their partner. In North Korea, the Day of the Sun, Kim Il Sung's birthday, is the most important public holiday of the country, and Kim Jong Il's birthday is celebrated as the Day of the Shining Star. North Koreans are not permitted to celebrate birthdays on July 8 and December 17 because these were the dates of the deaths of Kim Il Sung and Kim Jong Il, respectively. More than 100,000 North Koreans celebrate displaced birthdays on July 9 and December 18 instead. A person born on July 8 before 1994 may change their birthday, with official recognition. South Korea was one of the last countries to use a form of East Asian age reckoning for many official purposes. Prior to June 2023, three systems were used together—"Korean ages" that start with 1 at birth and increase every January 1st with the Gregorian New Year, "year ages" that start with 0 at birth and otherwise increase the same way, and "actual ages" that start with 0 at birth and increase each birthday (the three rules are contrasted in the sketch at the end of this section). First birthdays were heavily celebrated, despite usually having little to do with the child's age. In June 2023, all Korean ages were set back at least one year, and official ages henceforth are reckoned only by birthdays. In Ghana, children wake up on their birthday to a special treat called oto, which is a patty made from mashed sweet potato and eggs fried in palm oil. Later they have a birthday party where they usually eat stew and rice and a dish known as kelewele, which is fried plantain chunks. Distribution through the year Birthdays are fairly evenly distributed throughout the year, with some seasonal effects. In the United States, there tend to be more births in September and October. This may be because there is a holiday season nine months before (the human gestation period is about nine months), or because the longest nights of the year also occur in the Northern Hemisphere nine months before. However, the holidays affect birth rates more than the winter: New Zealand, a Southern Hemisphere country, has the same September and October peak with no corresponding peak in March and April. The least common birthdays tend to fall around public holidays, such as Christmas, New Year's Day, and fixed-date holidays such as Independence Day in the US, which falls on July 4.
Between 1973 and 1999, September 16 was the most common birthday in the United States, and December 25 was the least common birthday (other than February 29, because of leap years). In 2011, October 5 and 6 were reported as the most frequently occurring birthdays. New Zealand's most common birthday is September 29, and the least common birthday is December 25. The ten most common birthdays there all fall within a thirteen-day period, between September 22 and October 4. The ten least common birthdays (other than February 29) are December 24–27, January 1–2, February 6, March 22, April 1, and April 25. This is based on all live births registered in New Zealand between 1980 and 2017. Positive and negative associations with culturally significant dates may also influence birth rates. One study shows a 5.3% decrease in spontaneous births and a 16.9% decrease in Caesarean births on Halloween, compared to dates occurring within one week before and one week after the October holiday. In contrast, on Valentine's Day, there is a 3.6% increase in spontaneous births and a 12.1% increase in Caesarean births. In Sweden, 9.3% of the population is born in March and 7.3% in November, when a uniform distribution would give 8.3%. In the Gregorian calendar (a common solar calendar), February in a leap year has 29 days instead of the usual 28, so the year lasts 366 days instead of the usual 365. A person born on February 29 may be called a "leapling" or a "leaper". In common years, they usually celebrate their birthdays on February 28. In some situations, March 1 is used as the birthday in a non-leap year, since it is the day following February 28. Technically, a leapling will have fewer birthday anniversaries than their age in years; this phenomenon is exploited when a person claims to be only a quarter of their actual age, by counting their leap-year birthday anniversaries only (the sketch below makes the arithmetic concrete). In Gilbert and Sullivan's 1879 comic opera The Pirates of Penzance, Frederic the pirate apprentice discovers that he is bound to serve the pirates until his 21st birthday rather than until his 21st year. For legal purposes, birthdays depend on how local laws count time intervals. An individual's Beddian birthday, named in tribute to firefighter Bobby Beddia, occurs during the year in which their age matches the last two digits of the year they were born. Some studies show people are more likely to die on their birthdays, with explanations including excessive drinking, suicide, cardiovascular events due to high stress or happiness, efforts to postpone death for major social events, and death certificate paperwork errors.
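The age-reckoning and leap-year rules described in this section reduce to simple calendar arithmetic. The following minimal Python sketch, using only the standard library, contrasts them; the function names and the example dates are illustrative assumptions, not drawn from any cited source.

    from datetime import date
    import calendar

    def korean_age(birth: date, today: date) -> int:
        # Traditional "Korean age": 1 at birth, +1 every January 1.
        return today.year - birth.year + 1

    def year_age(birth: date, today: date) -> int:
        # "Year age": 0 at birth, +1 every January 1.
        return today.year - birth.year

    def actual_age(birth: date, today: date) -> int:
        # "Actual age": 0 at birth, +1 on each birthday.
        not_yet = (today.month, today.day) < (birth.month, birth.day)
        return today.year - birth.year - (1 if not_yet else 0)

    def leapling_anniversaries(birth: date, today: date) -> int:
        # Number of true February 29 anniversaries a leapling has seen.
        return sum(
            1 for y in range(birth.year + 1, today.year + 1)
            if calendar.isleap(y)
            and (y, 2, 29) <= (today.year, today.month, today.day)
        )

    def beddian_year(birth_year: int) -> int:
        # The year in which age matches the last two digits of the birth year.
        return birth_year + birth_year % 100

    leapling = date(2000, 2, 29)
    today = date(2025, 3, 1)
    print(korean_age(leapling, today))              # 26 under the pre-2023 system
    print(year_age(leapling, today))                # 25
    print(actual_age(leapling, today))              # 25 years old ...
    print(leapling_anniversaries(leapling, today))  # ... but only 6 true anniversaries
    print(beddian_year(1953))                       # 2006: a person born in 1953 turns 53

Note that actual_age treats March 1 as the point where a leapling completes another year in common years, mirroring the March 1 convention mentioned above; jurisdictions that use February 28 would adjust the comparison accordingly.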
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Modula-2] | [TOKENS: 2042] |
Contents Modula-2 Modula-2 is a structured, procedural programming language developed between 1977 and 1985/8 by Niklaus Wirth at ETH Zurich. It was created as the language for the operating system and application software of the Lilith personal workstation. It was later used for programming outside the context of the Lilith. Wirth viewed Modula-2 as a successor to his earlier programming languages Pascal and Modula. Its main concepts are the module, designed to support separate compilation and data abstraction, and direct language support for multiprogramming. The language design was influenced by the Mesa language and the Xerox Alto, both from Xerox PARC, which Wirth saw during his 1976 sabbatical year there. The computer magazine Byte devoted the August 1984 issue to the language and its surrounding environment. Wirth created the Oberon series of languages as the successor to Modula-2, while others (particularly at Digital Equipment Corporation and Acorn Computers, later Olivetti) developed Modula-2 into Modula-2+ and later Modula-3. Description Modula-2 is a general-purpose procedural language suitable for both systems programming and applications programming. The syntax is based on Wirth's earlier language, Pascal, with some elements and syntactic ambiguities removed; the module concept, designed to support separate compilation and data abstraction, and direct language support for multiprogramming were added. The language allows the use of one-pass compilers; such a compiler by Gutknecht and Wirth was about four times faster than earlier multi-pass compilers. A "Hello world" program appears among the sketches following this passage. A Modula-2 module may be used to encapsulate a set of related subprograms and data structures, and restrict their visibility from other parts of the program. Modula-2 programs are composed of modules, each of which is made up of two parts: a definition module, the interface portion, which contains only those parts of the subsystem that are exported (visible to other modules), and an implementation module, which contains the working code that is internal to the module (both parts are sketched below). The language has strict scope control. Except for standard identifiers, no object from the outside is visible inside a module unless explicitly imported; no internal module object is visible from the outside unless explicitly exported. Suppose module M1 exports objects a, b, c, and P by enumerating their identifiers in an explicit export list, as sketched below. Then the objects a, b, c, and P from module M1 are known outside module M1 as M1.a, M1.b, M1.c, and M1.P. They are exported in a qualified manner to the outside (assuming module M1 is global): the exporting module's name, M1, is used as a qualifier followed by the object's name. Suppose module M2 imports module M1 with an IMPORT declaration, as sketched below. Then the objects exported by module M1 to the outside of its enclosing program can be used inside module M2. They are referenced in a qualified manner: M1.a, M1.b, M1.c, and M1.P. Qualified export avoids name clashes: for example, if another module M3 exports an object called P, the two objects can still be distinguished, since M1.P differs from M3.P. It does not matter that both objects are called P inside their exporting modules M1 and M3. An alternative method exists. Suppose module M4 imports the identifiers directly, as sketched below. Objects exported by module M1 can then be used inside module M4 by mere references to the exported identifiers in an unqualified manner: a, b, c, and P. This method of import is usable if there are no name clashes.
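A minimal "Hello world" might look like the following sketch, assuming the ISO standard library module STextIO (PIM implementations typically import WriteString from InOut instead):

    MODULE Hello;
    FROM STextIO IMPORT WriteString;
    BEGIN
      WriteString ("Hello world!");
    END Hello.

The two-part module structure described above can be sketched as a definition/implementation pair; the module name Buffer and its one-slot contents are hypothetical fillers chosen only to show the shape:

    DEFINITION MODULE Buffer;
      PROCEDURE Put (ch: CHAR);
      PROCEDURE Get (VAR ch: CHAR);
    END Buffer.

    IMPLEMENTATION MODULE Buffer;
      VAR store: CHAR;                 (* hidden from importing modules *)
      PROCEDURE Put (ch: CHAR);
      BEGIN
        store := ch
      END Put;
      PROCEDURE Get (VAR ch: CHAR);
      BEGIN
        ch := store
      END Get;
    END Buffer.

The M1, M2, and M4 examples can likewise be sketched in the PIM style as local modules with explicit export lists (in a real system M1 would more likely be a separately compiled module with its own definition module); the INTEGER variables, the body of P, and the assignments are hypothetical, and only the EXPORT and IMPORT declarations are the point:

    MODULE M1;
      EXPORT QUALIFIED a, b, c, P;
      VAR a, b, c: INTEGER;
      PROCEDURE P;
      BEGIN
        a := a + b + c                 (* hypothetical body *)
      END P;
    END M1;

    MODULE M2;                         (* imports M1 as a whole *)
      IMPORT M1;
    BEGIN
      M1.a := 1; M1.b := 2; M1.c := 3;
      M1.P                             (* qualified references only *)
    END M2;

    MODULE M4;                         (* imports the identifiers directly *)
      FROM M1 IMPORT a, b, c, P;
    BEGIN
      a := 1; b := 2; c := 3;
      P                                (* unqualified references allowed *)
    END M4;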
Unqualified import allows variables and other objects to be used outside their exporting module in the same unqualified manner as inside the exporting module. The export and import rules not only safeguard objects against unwanted access, but also allow a cross-reference of the definition of every identifier in a program to be created. This property helps with the maintenance of large programs containing many modules. The language provides for single-processor concurrency (monitors, coroutines, and explicit transfer of control; a minimal coroutine sketch appears at the end of this section) and for hardware access (absolute addresses, bit manipulation, and interrupts). It uses a nominal type system. Dialects There are two major dialects of Modula-2. The first is PIM, named for the book Programming in Modula-2 by Niklaus Wirth. There were three major editions of PIM: the second, third (corrected), and fourth. Each describes slight variants of the language. The second major dialect is ISO, named for the standardization effort by the International Organization for Standardization. The dialects differ from one another in a number of details. Supersets There are several supersets of Modula-2 with language extensions for specific application domains. Derivatives There are several derivative languages that resemble Modula-2 very closely but are new languages in their own right; most have different purposes and strengths and weaknesses of their own. Many other current programming languages have adopted features of Modula-2. Language elements PIM [2,3,4] defines 40 reserved words, and PIM [3,4] defines 29 built-in identifiers. Embedded system use Modula-2 is used to program many embedded systems. Cambridge Modula-2 by Cambridge Microprocessor Systems is based on a subset of PIM4 with language extensions for embedded development. The compiler runs on MS-DOS and generates code for Motorola 68000 series (M68k) based embedded microcontrollers running the MINOS operating system. Mod51 by Mandeno Granville Electronics is based on ISO Modula-2 with language extensions for embedded development following IEC 1131, an industry standard for programmable logic controllers (PLC) closely related to Modula-2. The Mod51 compiler generates standalone code for 80C51-based microcontrollers. Delco Electronics, then a subsidiary of GM Hughes Electronics, developed a version of Modula-2 for embedded control systems starting in 1985. Delco named it Modula-GM. It was the first high-level programming language used to replace machine code for embedded systems in Delco's engine control units (ECUs). This was significant because Delco was producing over 28,000 ECUs per day for GM in 1988, making it then the world's largest producer of ECUs. The first experimental use of Modula-GM in an embedded controller was in the 1985 Antilock Braking System Controller, which was based on the Motorola 68xxx microprocessor, and in the 1993 Gen-4 ECU used by the Champ Car World Series Championship Auto Racing Teams (CART) and Indy Racing League (IRL) teams. The first production use of Modula-GM was in GM trucks, starting with the 1990 model year vehicle control module (VCM) used to manage GM Powertrain's Vortec engines. Modula-GM was also used on all ECUs for GM's 90° Buick V6 engine family 3800 Series II used in the 1997–2005 model year Buick Park Avenue. The Modula-GM compilers and associated software management tools were sourced by Delco from Intermetrics.
Modula-2 was selected as the basis for Delco's high-level language because of its many strengths over other alternative language choices in 1986. After Delco Electronics was spun off from GM (with other component divisions) to form Delphi Automotive Systems in 1995, global sourcing required that a non-proprietary high-level software language be used. ECU embedded software now developed at Delphi is compiled with commercial C compilers. The satellites of the Russian radionavigation-satellite service framework GLONASS, similar to the United States Global Positioning System (GPS), are programmed in Modula-2. Compilers Turbo Modula-2 was a compiler and an integrated development environment for MS-DOS developed, but not published, by Borland. Jensen and Partners, which included Borland cofounder Niels Jensen, bought the unreleased codebase and turned it into TopSpeed Modula-2. It was eventually sold to Clarion, now SoftVelocity, which then offered the Modula-2 compiler as part of its Clarion product line. A Zilog Z80 CP/M version of Turbo Modula-2 was briefly marketed by Echelon under license from Borland. A companion release for the Hitachi HD64180 was sold by Micromint as a development tool for their SB-180 single-board computer. IBM had a Modula-2 compiler for internal use which ran on both OS/2 and AIX, and had first-class support in IBM's E2 editor. IBM Modula-2 was used for parts of the OS/400 Vertical Licensed Internal Code (effectively the kernel of OS/400). This code was mostly replaced with C++ when OS/400 was ported to the IBM RS64 processor family, although some remains in modern releases of the operating system. A Motorola 68000 backend also existed, which may have been used in embedded systems products. Operating systems Modula-2 is used to program some operating systems (OSs). The Modula-2 module structure and support are used directly in two related OSs. The OS named Medos-2, for the Lilith workstation, was developed at ETH Zurich by Svend Erik Knudsen with advice from Wirth. It is a single-user, object-oriented operating system built from Modula-2 modules. The OS named Excelsior, for the Kronos workstation, was developed by the Academy of Sciences of the Soviet Union, Siberian branch, Novosibirsk Computing Center, Modular Asynchronous Developable Systems (MARS) project, Kronos Research Group (KRG). It is a single-user system based on Modula-2 modules.
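The coroutine facilities mentioned earlier can be illustrated with a minimal sketch in the PIM-2 style, where the SYSTEM module supplies PROCESS, NEWPROCESS, and TRANSFER (later PIM editions use ADDRESS for coroutine variables, and ISO Modula-2 moves the facility into a COROUTINES library module). The workspace size and the InOut output calls are assumptions for illustration.

    MODULE CoDemo;
    FROM SYSTEM IMPORT PROCESS, NEWPROCESS, TRANSFER, ADR, SIZE;
    FROM InOut IMPORT WriteString, WriteLn;

    VAR main, worker: PROCESS;
        wsp: ARRAY [0 .. 4095] OF CHAR;    (* coroutine workspace *)

    PROCEDURE Steps;
    BEGIN
      WriteString ("first step"); WriteLn;
      TRANSFER (worker, main);             (* yield control back *)
      WriteString ("second step"); WriteLn;
      TRANSFER (worker, main)
    END Steps;

    BEGIN
      NEWPROCESS (Steps, ADR (wsp), SIZE (wsp), worker);
      TRANSFER (main, worker);             (* run until the first yield *)
      TRANSFER (main, worker)              (* resume after the yield *)
    END CoDemo.

Control passes only where a TRANSFER is written, with no hidden scheduler, which is what makes the construct suitable for the single-processor and embedded settings described in this article.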
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Founding_Fathers_of_the_United_States] | [TOKENS: 9463] |
Contents Founding Fathers of the United States The Founding Fathers of the United States, referred to as the Founding Fathers or the Founders by Americans, were a group of late-eighteenth-century American revolutionary leaders who united the Thirteen Colonies, oversaw the War of Independence from Great Britain, established the United States of America, and crafted a framework of government for the new nation. The Founding Fathers include those who wrote and signed the United States Declaration of Independence, the Articles of Confederation, and the Constitution of the United States, certain military personnel who fought in the American Revolutionary War, and others who greatly assisted in the nation's formation. The single person most identified as Father of the United States is George Washington, commanding general in the American Revolution and the nation's first president. In 1973, the historian Richard B. Morris identified seven figures as key founders, based on what he called the "triple tests" of leadership, longevity, and statesmanship: John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, Thomas Jefferson, James Madison, and Washington. Most of the Founding Fathers had ancestry traceable back to England, though many had family roots extending across the other regions of the British Isles: Scotland, Wales, and Ireland. Additionally, some traced their lineage back to the early Dutch settlers of New York (New Netherland) during the colonial era, while others were descendants of French Huguenots who settled in the British Thirteen Colonies, escaping religious persecution in France. Many of them were wealthy merchants, lawyers, landowners, and slaveowners. Historical founders The historian Richard B. Morris's selection of seven key founders was widely accepted through the 20th century. John Adams, Thomas Jefferson, and Benjamin Franklin were members of the Committee of Five that was charged by the Second Continental Congress with drafting the Declaration of Independence. Franklin, Adams, and John Jay negotiated the 1783 Treaty of Paris, which established American independence and brought an end to the American Revolutionary War. The constitutions drafted by Jay and Adams for their respective states of New York (1777) and Massachusetts (1780) proved influential in the language used in developing the U.S. Constitution. The Federalist Papers, which advocated the ratification of the Constitution, were written by Alexander Hamilton, James Madison, and Jay. George Washington was commander-in-chief of the Continental Army and later president of the Constitutional Convention. Each of these men held additional important roles in the early government of the United States. Washington, Adams, Jefferson, and Madison served as the first four presidents; Adams and Jefferson were the nation's first two vice presidents; Jay was the nation's first chief justice; Hamilton was the first secretary of the treasury; Jefferson was the first secretary of state; and Franklin was America's most senior diplomat from the start of the Revolutionary War through its conclusion with the signing of the Treaty of Paris in 1783. The list of Founding Fathers is often expanded to include the signers of the Declaration of Independence and individuals who later approved the U.S. Constitution. Some scholars regard all delegates to the Constitutional Convention as Founding Fathers whether they approved the Constitution or not.
In addition, some historians include signers of the Articles of Confederation, which was adopted in 1781 as the nation's first constitution. Historians have come to recognize others as founders, such as Revolutionary War military leaders as well as participants in developments leading up to the war, including prominent writers, orators, and other men and women who contributed to the cause. Since the 19th century, views of the Founding Fathers have shifted from the concept of them as demigods who created the modern nation-state to assessments that take into account their inability to address issues such as slavery and the debt owed after the American Revolutionary War. Scholars emphasize that the Founding Fathers' accomplishments and shortcomings should be viewed within the context of their time. Origin of phrase The phrase "Founding Fathers" was first introduced to Americans by US senator Warren G. Harding in his keynote speech at the Republican National Convention of 1916. Harding later repeated the phrase at his March 4, 1921, presidential inauguration, becoming the first president to use the term "Founding Fathers". Subsequent speakers adopted the term. The term "fathers" had long been used for the founders. In 1811, responding to praise for his generation, John Adams wrote to a younger Josiah Quincy III, "I ought not to object to your Reverence for your Fathers as you call them ... but to tell you a very great secret ... I have no reason to believe We were better than you are." He also wrote, "Don't call me, ... Father ... [or] Founder ... These titles belong to no man, but to the American people in general." In Thomas Jefferson's second presidential inaugural address in 1805, he referred to those who first came to the New World as "forefathers". At his 1825 inauguration, John Quincy Adams called the U.S. Constitution "the work of our forefathers" and expressed his gratitude to "founders of the Union". John Adams and Thomas Jefferson both died on the same day, July 4, 1826. President J. Quincy Adams paid tribute to them as "Fathers" and "Founders of the Republic". These terms were used in the U.S. throughout the 19th century, from the inaugurations of Martin Van Buren and James Polk in 1837 and 1845, to Abraham Lincoln's Cooper Union speech in 1860 and his Gettysburg Address in 1863, and up to William McKinley's first inauguration in 1897. At a 1902 celebration of Washington's Birthday in Brooklyn, James M. Beck, a constitutional lawyer and later a U.S. congressman, delivered an address, "Founders of the Republic", in which he connected the concepts of founders and fathers, saying: "It is well for us to remember certain human aspects of the founders of the republic. Let me first refer to the fact that these fathers of the republic were for the most part young men." List of Founding Fathers The National Archives has identified three founding documents as the "Charters of Freedom": the Declaration of Independence, the United States Constitution, and the Bill of Rights. According to the archives, these documents "have secured the rights of the American people for nearly two and a half centuries and are considered instrumental to the founding and philosophy of the United States." In addition, as the nation's first constitution, the Articles of Confederation and Perpetual Union is also a founding document. As a result, signers of three key documents are generally considered to be Founding Fathers of the United States: the Declaration of Independence (DI), the Articles of Confederation (AC), and the U.S. Constitution (USC).
Some of these signers signed more than one document. The 55 delegates who attended the Constitutional Convention are referred to as framers. Of these, 16 did not sign the document: three refused, while the remainder left early, either in protest of the proceedings or for personal reasons. Nevertheless, some sources regard all framers as founders, including those who did not sign. Randolph, Mason, and Gerry were the only three present at the Constitution's adoption who refused to sign. In addition to the signers and framers of the founding documents and one of the seven notable leaders previously mentioned—John Jay—others are regarded as founders based on their contributions to the creation and early development of the new nation. Historians have also come to recognize the roles women played in the nation's early development, honoring several of them with the term "Founding Mothers", and other men and women are likewise recognized for the notable contributions they made during the founding era. The colonies unite (1765–1774) In the mid-1760s, Parliament began levying taxes on the colonies to finance Britain's debts from the French and Indian War, a decade-long conflict that ended in 1763. Opposition to the Stamp Act and the Townshend Acts united the colonies in a common cause. While the Stamp Act was withdrawn, taxes on tea remained under the Townshend Acts and took on a new form in 1773 with Parliament's adoption of the Tea Act. The new tea tax, along with stricter customs enforcement, was not well received across the colonies, particularly in Massachusetts. On December 16, 1773, 150 colonists disguised as Mohawk Indians boarded ships in Boston and dumped 342 chests of tea into the city's harbor, a protest that came to be known as the Boston Tea Party. Orchestrated by Samuel Adams and the Boston Committee of Correspondence, the protest was viewed as treasonous by British authorities. In response, Parliament passed the Coercive or Intolerable Acts, a series of punitive laws that closed Boston's port and placed the colony under direct control of the British government. These measures stirred unrest throughout the colonies, which felt Parliament had overreached its authority and was posing a threat to the self-rule that had existed in the Americas since the 1600s. Intent on responding to the acts, twelve of the Thirteen Colonies agreed to send delegates to meet in Philadelphia as the First Continental Congress, with Georgia declining because it needed British military support in its conflict with native tribes. The concept of an American union had been entertained long before 1774, but always embraced the idea that it would be subject to the authority of the British Empire. By 1774, however, letters published in colonial newspapers, mostly by anonymous writers, began asserting the need for a "Congress" to represent all Americans, one that would have equal status with British authority. Continental Congress (1774–1775) The Continental Congress was convened to deal with a series of pressing issues the colonies were facing with Britain. Its delegates were men considered to be the most intelligent and thoughtful among the colonists. In the wake of the Intolerable Acts, imposed by an unyielding British king and Parliament, the colonies were forced to choose between total submission to arbitrary parliamentary authority and unified armed resistance.
The new Congress functioned as the directing body in declaring a great war and was sanctioned only by reason of the guidance it provided during the armed struggle. Its authority remained ill-defined, and few of its delegates realized that events would soon lead them to decide policies that ultimately established a "new power among the nations". In the process, the Congress performed many experiments in government before an adequate constitution evolved. The First Continental Congress convened at Philadelphia's Carpenters' Hall on September 5, 1774. The Congress, which had no legal authority to raise taxes or call on colonial militias, consisted of 56 delegates, including George Washington of Virginia; John Adams and Samuel Adams of Massachusetts; John Jay of New York; John Dickinson of Pennsylvania; and Roger Sherman of Connecticut. Peyton Randolph of Virginia was unanimously elected its first president. The Congress came close to disbanding in its first few days over the issue of representation, with smaller colonies desiring equality with the larger ones. Patrick Henry, from the largest colony, Virginia, disagreed but stressed the greater importance of uniting the colonies: "The distinctions between Virginians, Pennsylvanians, New Yorkers, and New Englanders are no more. I am not a Virginian, but an American!" The delegates then began with a discussion of the Suffolk Resolves, which had just been approved at a town meeting in Milton, Massachusetts. Joseph Warren, chairman of the Resolves drafting committee, had dispatched Paul Revere to deliver signed copies to the Congress in Philadelphia. The Resolves called for the ouster of British officials, a trade embargo on British goods, and the formation of a militia throughout the colonies. Despite the radical nature of the Resolves, on September 17 the Congress passed them in their entirety in exchange for assurances that Massachusetts' colonists would do nothing to provoke war. The delegates then approved a series of measures, including a Petition to the King in an appeal for peace and a Declaration and Resolves which introduced the ideas of natural law and natural rights, foreshadowing some of the principles found in the Declaration of Independence and Bill of Rights. The declaration asserted the rights of colonists and outlined Parliament's abuses of power. Proposed by Richard Henry Lee, it also included a trade boycott known as the Continental Association. The Association, a crucial step toward unification, empowered committees of correspondence throughout the colonies to enforce the boycott. The Declaration and its boycott directly challenged Parliament's right to govern in the Americas, bolstering the view of King George III and his administration under Lord North that the colonies were in a state of rebellion. Lord Dartmouth, the secretary of state for the colonies, who had been sympathetic to the Americans, condemned the newly established Congress for what he considered its illegal formation and actions. In tandem with the Intolerable Acts, British Army commander-in-chief Lieutenant General Thomas Gage was installed as governor of Massachusetts. In January 1775, Gage's superior, Lord Dartmouth, ordered the general to arrest those responsible for the Tea Party and to seize the munitions that had been stockpiled by militia forces outside of Boston. The letter took several months to reach Gage, who acted immediately by sending out 700 army regulars.
During their march to Lexington and Concord on the morning of April 19, 1775, the British troops encountered militia forces, who had been warned the night before by Paul Revere and another messenger on horseback, William Dawes. It is not known who fired the first shot, but with it the Revolutionary War began. On May 10, 1775, less than three weeks after the Battles of Lexington and Concord, the Second Continental Congress convened in the Pennsylvania State House. The gathering essentially reconstituted the First Congress, with many of the same delegates in attendance. Among the new arrivals were Benjamin Franklin of Pennsylvania, John Hancock of Massachusetts, and, in June, Thomas Jefferson of Virginia. Hancock was elected president two weeks into the session when Peyton Randolph was recalled to Virginia to preside over the House of Burgesses as speaker, and Jefferson was named to replace Randolph in the Virginia delegation. After adopting the rules of debate from the previous year and reinforcing its emphasis on secrecy, the Congress turned to its foremost concern, the defense of the colonies. The provincial assembly in Massachusetts, which had declared the colony's governorship vacant, reached out to the Congress for direction on two matters: whether the assembly could assume the powers of civil government and whether the Congress would take over the army being formed in Boston. In answer to the first question, on June 9 the colony's leaders were directed to choose a council to govern within the spirit of the colony's charter. As for the second, Congress spent several days discussing plans for guiding the forces of all thirteen colonies. Finally, on June 14 Congress approved provisioning the New England militias, agreed to send ten companies of riflemen from other colonies as reinforcements, and appointed a committee to draft rules for governing the military, thus establishing the Continental Army. The next day, Samuel and John Adams nominated Washington as commander-in-chief, a motion that was unanimously approved. Two days later, on June 17, the militias clashed with British forces at Bunker Hill, a victory for Britain but a costly one. The Congress's actions came despite the divide between conservatives, who still hoped for reconciliation with England, and, at the other end of the spectrum, those who favored independence. To satisfy the former, Congress adopted the Olive Branch Petition on July 5, an appeal for peace to King George III written by John Dickinson. Then, the following day, it approved the Declaration of the Causes and Necessity of Taking Up Arms, a resolution justifying military action. The declaration, intended for Washington to read to the troops upon his arrival in Massachusetts, was drafted by Jefferson but edited by Dickinson, who thought its language too strong. When the Olive Branch Petition arrived in London in September, the king refused to look at it. By then, he had already issued a proclamation declaring the American colonies in rebellion. Declaration of Independence (1776) Under the auspices of the Second Continental Congress and its Committee of Five, Thomas Jefferson drafted the Declaration of Independence. It was presented to the Congress by the Committee on June 28, and after much debate and editing of the document, on July 2, 1776, Congress passed the Lee Resolution, which declared the United Colonies independent from Great Britain. Two days later, on July 4, the Declaration of Independence was adopted.
The name "United States of America", which first appeared in the Declaration, was formally approved by the Congress on September 9, 1776. In an effort to get this important document promptly into the public realm John Hancock, president of the Second Continental Congress, commissioned John Dunlap, editor and printer of the Pennsylvania Packet, to print 200 broadside copies of the Declaration, which came to be known as the Dunlap broadsides. Printing commenced the day after the Declaration was adopted. They were distributed throughout the 13 colonies/states with copies sent to General Washington and his troops at New York with a directive that it be read aloud. Copies were also sent to Britain and other points in Europe. Fighting for independence While the colonists were fighting the British to gain independence their newly formed government, with its Articles of Confederation, were put to the test, revealing the shortcomings and weaknesses of America's first Constitution. During this time Washington became convinced that a strong federal government was urgently needed, as the individual states were not meeting the organizational and supply demands of the war on their own individual accord. Key precipitating events included the Boston Tea Party in 1773, Paul Revere's Ride in 1775, and the Battles of Lexington and Concord in 1775. George Washington's crossing of the Delaware River was a major American victory over Hessian forces at the Battle of Trenton and greatly boosted American morale. The Battle of Saratoga and the Siege of Yorktown, which primarily ended the fighting between American and British, were also pivotal events during the war. The 1783 Treaty of Paris marked the official end of the war. After the war, Washington was instrumental in organizing the effort to create a "national militia" made up of individual state units, and under the direction of the federal government. He also endorsed the creation of a military academy to train artillery officers and engineers. Not wanting to leave the country disarmed and vulnerable so soon after the war, Washington favored a peacetime army of 2,600 men. He also favored the creation of a navy that could repel any European intruders. He approached Henry Knox, who accompanied Washington during most of his campaigns, with the prospect of becoming the future Secretary of War. After Washington's final victory at the surrender at Yorktown on October 19, 1781, more than a year passed before official negotiations for peace commenced. The Treaty of Paris was drafted in November 1782, and negotiations began in April 1783. The completed treaty was signed on September 3. Benjamin Franklin, John Adams, John Jay and Henry Laurens represented the United States, while David Hartley, a member of Parliament, and Richard Oswald, a prominent and influential Scottish businessman, represented Great Britain. Franklin, who had a long-established rapport with the French and was almost entirely responsible for securing an alliance with them a few months after the start of the war, was greeted with high honors from the French council, while the others received due accommodations but were generally considered to be amateur negotiators. Communications between Britain and France were largely effected through Franklin and Lord Shelburne who was on good terms with Franklin. 
Franklin, Adams, and Jay understood the concerns of the French at this uncertain juncture and, using that to their advantage, in the final sessions of negotiations convinced both the French and the British that American independence was in their best interests. Constitutional Convention Under the Articles of Confederation, the Congress of the Confederation had no power to collect taxes, regulate commerce, pay the national debt, conduct diplomatic relations, or effectively manage the western territories. Key leaders – George Washington, Thomas Jefferson, Alexander Hamilton, James Madison, and others – began fearing for the young nation's fate. As the Articles' weaknesses became more and more apparent, the idea of creating a strong central government gained support, leading to the call for a convention to amend the Articles. The Constitutional Convention met in the Pennsylvania State House from May 14 through September 17, 1787. The 55 delegates in attendance represented a cross-section of 18th-century American leadership. The vast majority were well educated and prosperous, and all were prominent in their respective states, with over 70 percent (40 delegates) serving in the Congress when the convention was proposed. Many delegates were late to arrive, and after eleven days' delay, a quorum was finally present on May 25 to elect Washington, the nation's most trusted figure, as convention president. Four days later, on May 29, the convention adopted a rule of secrecy, a controversial decision but a common practice that allowed delegates to speak freely. Immediately following the secrecy vote, Virginia governor Edmund Randolph introduced the Virginia Plan, fifteen resolutions written by Madison and his colleagues proposing a government of three branches: a single executive, a bicameral (two-house) legislature, and a judiciary. The lower house was to be elected by the people, with seats apportioned by state population. The upper house would be chosen by the lower house from delegates nominated by state legislatures. The executive, who would have veto power over legislation, would be elected by the Congress, which could overrule state laws. While the plan exceeded the convention's objective of merely amending the Articles, most delegates were willing to abandon their original mandate in favor of crafting a new form of government. Discussions of the Virginia resolutions continued into mid-June, when William Paterson of New Jersey presented an alternative proposal. The New Jersey Plan retained most of the Articles' provisions, including a one-house legislature and equal power for the states. One of the plan's innovations was a "plural" executive branch, but its primary concession was to allow the national government to regulate trade and commerce. Meeting as a committee of the whole, the delegates discussed the two proposals, beginning with the question of whether there should be a single or three-fold executive and then whether to grant the executive veto power. After agreeing on a single executive who could veto legislation, the delegates turned to an even more contentious issue, legislative representation. Larger states favored proportional representation based on population, while smaller states wanted each state to have the same number of legislators. By mid-July, the debates between the large-state and small-state factions had reached an impasse. With the convention on the verge of collapse, Roger Sherman of Connecticut introduced what became known as the Connecticut (or Great) Compromise.
Sherman's proposal called for a House of Representatives elected proportionally and a Senate in which all states would have the same number of seats. On July 16, the compromise was approved by the narrowest of margins, 5 states to 4. The proceedings left most delegates with reservations. Several went home early in protest, believing the convention was overstepping its authority. Others were concerned about the lack of a Bill of Rights safeguarding individual liberties. Even Madison, the Constitution's chief architect, was dissatisfied, particularly over equal representation in the Senate and the failure to grant Congress the power to veto state legislation. Misgivings aside, a final draft was approved overwhelmingly on September 17, with 11 states in favor and New York unable to vote since it had only one delegate remaining, Hamilton. Rhode Island, which was in a dispute over the state's paper currency, had refused to send anyone to the convention. Of the 42 delegates present, only three refused to sign: Randolph and George Mason, both of Virginia, and Elbridge Gerry of Massachusetts. The U.S. Constitution faced one more hurdle: approval by ratifying conventions in at least nine of the 13 states. Within three days of the signing, the draft was submitted to the Congress of the Confederation, which forwarded the document to the states for ratification. In November, Pennsylvania's legislature convened the first of the conventions. Before it could vote, Delaware became the first state to ratify, approving the Constitution on December 7 by a 30–0 margin. Pennsylvania followed suit five days later, splitting its vote 46–23. Despite unanimous votes in New Jersey and Georgia, several key states appeared to be leaning against ratification because of the omission of a Bill of Rights, particularly Virginia, where the opposition was led by Mason and Patrick Henry, who had refused to participate in the convention, claiming he "smelt a rat". Rather than risk everything, the Federalists relented, promising that if the Constitution was adopted, amendments would be added to secure people's rights. Over the next year, the string of ratifications continued. Finally, on June 21, 1788, New Hampshire became the ninth state to ratify, making the Constitution the law of the land. Virginia followed suit four days later, and New York did the same in late July. After North Carolina's assent in November, another year and a half would pass before the 13th state would weigh in. Facing trade sanctions and the possibility of being forced out of the union, Rhode Island approved the Constitution on May 29, 1790, by a begrudging 34–32 vote. The Constitution officially took effect on March 4, 1789, when the House and Senate met for their first sessions. On April 30, Washington was sworn in as the nation's first president. Ten amendments, known collectively as the United States Bill of Rights, were ratified on December 15, 1791. Because the delegates were sworn to secrecy, Madison's notes on the convention were not published until after his death in 1836. The Constitution, as drafted, was sharply criticized by the Anti-Federalists, a group that contended the document failed to safeguard individual liberties from the federal government. Leading Anti-Federalists included Patrick Henry and Richard Henry Lee, both from Virginia, and Samuel Adams of Massachusetts.
Delegates at the Constitutional Convention who shared their views were Virginians George Mason and Edmund Randolph and Massachusetts representative Elbridge Gerry, the three delegates who refused to sign the final document. Henry, who derived his hatred of a central governing authority from his Scottish ancestry, did all in his power to defeat the Constitution, opposing Madison every step of the way. These criticisms led to the amendments proposed under the Bill of Rights. Madison, the bill's principal author, was originally opposed to the amendments but was influenced by the 1776 Virginia Declaration of Rights, primarily written by Mason, and the Declaration of Independence, by Thomas Jefferson. Jefferson, while in France, shared Henry's and Mason's fears about a strong central government, especially the president's power, but because of his friendship with Madison and the pending Bill of Rights, he quieted his concerns. Alexander Hamilton, however, was opposed to a Bill of Rights, believing the amendments not only unnecessary but dangerous: Why declare things shall not be done, which there is no power to do ... that the liberty of the press shall not be restrained, when no power is given by which restrictions may be imposed? Madison had no way of knowing the debate between Virginia's two legislative houses would delay the adoption of the amendments for more than two years. The final draft, referred to the states by the federal Congress on September 25, 1789, was not ratified by Virginia's Senate until December 15, 1791. The Bill of Rights drew its authority from the consent of the people and held that "The enumeration in the Constitution, of certain rights, shall not be construed to deny or disparage others retained by the people" (Article 11) and that "The powers not delegated to the United States by the Constitution, nor prohibited by it to the States, are reserved to the States respectively, or to the people" (Article 12). Madison came to be recognized as the founding era's foremost proponent of religious liberty, free speech, and freedom of the press. Ascending to the presidency The first five U.S. presidents are regarded as Founding Fathers for their active participation in the American Revolution: Washington, John Adams, Jefferson, Madison, and Monroe. Each of them served as a delegate to the Continental Congress. Demographics and other characteristics The Founding Fathers represented the upper echelon of political leadership in the British colonies during the latter half of the 18th century. All were leaders in their communities and respective colonies who were willing to assume responsibility for public affairs. Of the signers of the Declaration of Independence, Articles of Confederation, and U.S. Constitution, nearly all were native-born and of British heritage, including Scots, Irish, and Welsh. Nearly half were lawyers, while the remainder were primarily businessmen and planter-farmers. The average age of the founders was 43. Benjamin Franklin, born in 1706, was the oldest, while only a few were born after 1750 and thus were in their 20s. The following sections discuss these and other demographic topics in greater detail. For the most part, the information is confined to signers and delegates associated with the Declaration of Independence, Articles of Confederation, and Constitution. All of the Founding Fathers had extensive political experience at the national and state levels. 
As just one example, the signers of the Declaration of Independence and Articles of Confederation were members of the Second Continental Congress, while four-fifths of the delegates at the Constitutional Convention had served in the Congress either during or prior to the convention. The remaining fifth were recognized as leaders in the state assemblies that appointed them. More than a third of the Founding Fathers attended or graduated from colleges in the American colonies, while additional founders attended college abroad, primarily in England and Scotland. All other founders either were home-schooled, received tutoring, completed apprenticeships, or were self-educated. Founders graduated from six of the nine colleges established in the Americas during the Colonial Era. A few founders, such as Alexander Hamilton and James Monroe, attended college (Columbia and William & Mary, respectively) but did not graduate. The other three colonial colleges, all founded in the 1760s, were Brown University (College of Rhode Island), Dartmouth College, and Rutgers University (Queen's College). All of the founders were white, and two-thirds (36 out of 55) were natives of the American Colonies, while nineteen were born in other parts of the British Empire. While the Founding Fathers were engaged in a broad range of occupations, most had careers in three professions: about half the founders were lawyers, a sixth were planters/farmers, another sixth were merchants/businessmen, and the others were spread across miscellaneous professions. Religion Of the 55 delegates to the Constitutional Convention in 1787, 28 were Anglicans (Church of England or Episcopalian), 21 were other Protestants, and two were Catholics, Daniel Carroll and Thomas Fitzsimons (Charles Carroll was also Catholic but was not a Constitution signatory). Among the non-Anglican Protestant delegates to the Constitutional Convention, eight were Presbyterians, seven were Congregationalists, two were Lutherans, two were Dutch Reformed, and two were Methodists. A few prominent Founding Fathers were anti-clerical, notably Jefferson. Many founders deliberately avoided public discussion of their faith. Historian David L. Holmes uses evidence gleaned from letters, government documents, and second-hand accounts to identify their religious beliefs. Founders on currency and postage Four U.S. founders are depicted on American currency—Benjamin Franklin, Alexander Hamilton, Thomas Jefferson, and George Washington; Washington and Jefferson each appear on three different denominations. Additionally, the reverse of the Jefferson two-dollar bill features John Trumbull's 1818 depiction of the signing of the Declaration of Independence. Political and cultural impact According to David Sehat, in modern politics: Everyone cites the Founders. Constitutional originalists consult the Founders' papers to decide original meaning. Proponents of a living and evolving Constitution turn to the Founders as the font of ideas that have grown over time. Conservatives view the Founders as architects of a free enterprise system that built American greatness. The more liberal-leaning, following their sixties parents, claim the Founders as egalitarians, suspicious of concentrations of wealth. Independents look to the Founders to break the logjam of partisan brinksmanship. 
Across the political spectrum, Americans ground their views in a supposed set of ideas that emerged in the eighteenth century. But, in fact, the Founders disagreed with each other ... they had vast and profound differences. They argued over federal intervention in the economy and about foreign policy. They fought bitterly over how much authority rested with the executive branch, about the relationship and prerogatives of federal and state government. The Constitution provided a nearly limitless theater of argument. The founding era was, in reality, one of the most partisan periods of American history. Independence Day (colloquially called the Fourth of July) is a United States national holiday celebrated yearly on July 4 to commemorate the signing of the Declaration of Independence and the founding of the nation. Washington's Birthday is also observed as a federal holiday, and Jefferson's Birthday, on April 13, honors the U.S. founder and president. The Founding Fathers were portrayed in the Tony Award–winning 1969 musical 1776, which depicted the debates over and eventual adoption of the Declaration of Independence. The stage production was adapted into the 1972 film of the same name. The 1989 film A More Perfect Union, which was filmed on location in Independence Hall, depicts the events of the Constitutional Convention. The writing and passing of the founding documents are depicted in the 1997 documentary miniseries Liberty!, and the passage of the Declaration of Independence is portrayed in the second episode of the 2008 miniseries John Adams and the third episode of the 2015 miniseries Sons of Liberty. The Founders also feature in the 1986 miniseries George Washington II: The Forging of a Nation, the 2002–2003 animated television series Liberty's Kids, the 2020 miniseries Washington, and in many other films and television portrayals. Several Founding Fathers, including Hamilton, Washington, Jefferson, and Madison, were reimagined in Hamilton, a 2015 musical inspired by Ron Chernow's 2004 biography Alexander Hamilton, with music, lyrics, and book by Lin-Manuel Miranda. The musical won eleven Tony Awards and a Pulitzer Prize for Drama. Several major professional sports teams in the Northeastern United States are named for themes based on the founders. Religious freedom Religious persecution had existed for centuries around the world, and it existed in colonial America. Founders such as Thomas Jefferson, James Madison, Patrick Henry, and George Mason first established a measure of religious freedom in Virginia in 1776 with the Virginia Declaration of Rights, which became a model for religious liberty for the nation. Prior to this, Baptists, Presbyterians, and Lutherans had for a decade petitioned against the Church of England's efforts to suppress religious liberties in Virginia. Jefferson left the Continental Congress to return to Virginia to join the fight for religious freedom, which proved difficult since many members of the Virginia legislature belonged to the established Church of England. While Jefferson was not completely successful, he managed to secure the repeal of various laws that punished those with different religious beliefs. Jefferson was the architect of the separation of church and state; he opposed the use of public funds to support any established religion and believed it was unwise to link civil rights to religious doctrine. 
The United States Constitution, ratified in 1788, states in Article VI that "no religious Test shall ever be required as a Qualification to any Office or public Trust under the United States". Freedom of religion and freedom of speech were further affirmed as the nation's law in the Bill of Rights. The 14th Amendment of 1868 provided all Americans with "equal protection of the laws" and thus applied the First Amendment restriction against limiting the free exercise of religion to the states. Washington, a local leader of the Church of England, was also a strong proponent of religious freedom. He assured Baptists who worried that the Constitution might not protect their religious liberties that "... certainly, I would never have placed my signature to it." Jews also viewed Washington as a champion of freedom and sought his assurances that they would enjoy complete religious freedom. Washington responded by declaring that America's revolution in religion stood as an example for the rest of the world. Slavery The Founding Fathers were not unified on the issue of slavery and continued to accommodate it within the new nation. Some were morally opposed to it, and some attempted to end it in several of the colonies, but nationally, slavery remained protected. In her study of Jefferson, a slaveholder who owned 600 slaves, Annette Gordon-Reed notes the irony: "Others of the founders held slaves, but no other founder drafted the charter for American freedom". Like Jefferson, Washington and many other Founding Fathers were slaveowners; 41 of the 56 signers of the Declaration owned slaves. Some were conflicted by the institution, seeing it as immoral and politically divisive; Washington freed his slaves in his will. Jay and Hamilton led the successful fight to outlaw the international slave trade in New York, with efforts beginning in 1777. Thomas Jefferson included an anti-slavery clause in his original draft of the Declaration of Independence: He [King George] has waged cruel war against human nature itself, violating its most sacred rights of life and liberty in the persons of a distant people who never offended him, captivating & carrying them into slavery in another hemisphere or to incur miserable death in their transportation thither. This piratical warfare, the opprobrium of infidel powers, is the warfare of the Christian King of Great Britain. Determined to keep open a market where Men should be bought & sold, he has prostituted his negative for suppressing every legislative attempt to prohibit or restrain this execrable commerce. And that this assemblage of horrors might want no fact of distinguished die, he is now exciting those very people to rise in arms among us, and to purchase that liberty of which he has deprived them, by murdering the people on whom he has obtruded them: thus paying off former crimes committed against the Liberties of one people, with crimes which he urges them to commit against the lives of another. Founders such as Samuel Adams and John Adams were against slavery. Benjamin Rush wrote a pamphlet in 1773 that criticized the slave trade and slavery. Rush argued scientifically that Africans are not intellectually or morally inferior, and that any apparent evidence to the contrary is only the "perverted expression" of slavery, which "is so foreign to the human mind, that the moral faculties, as well as those of the understanding are debased, and rendered torpid by it." The Continental Association contained a clause that banned any Patriot involvement in slave trading. 
Franklin, though a key founder of the Pennsylvania Abolition Society, owned slaves whom he manumitted (released). While serving in the Rhode Island Assembly in 1769, Stephen Hopkins introduced one of the earliest anti-slavery laws in the colonies. When Jefferson entered public life as a member of the House of Burgesses, he began his career as a social reformer with an effort to secure legislation permitting the emancipation of slaves. Jay founded the New York Manumission Society in 1785, for which Hamilton became an officer. They and other members of the Society founded the African Free School in New York to educate the children of free blacks and slaves. As governor of New York, Jay helped secure and signed into law a gradual abolition act in 1799, which fully ended forced labor in the state as of 1827. He freed his own slaves in 1798. Hamilton opposed slavery, as his experiences had left him familiar with it and its effects on slaves and slaveholders, though he did negotiate slave transactions for his wife's family, the Schuylers. Evidence suggests Hamilton may have owned a house slave, and after the Jay Treaty was signed, Hamilton advocated that American slaves freed by the British during the war be forcibly returned to their enslavers. Henry Laurens ran the largest slave-trading house in North America. In the 1750s alone, his firm, Austin and Laurens, handled sales of more than 8,000 Africans. Slaves and slavery are mentioned indirectly in the 1787 Constitution. For example, Article 1, Section 2, Clause 3 prescribes that "three-fifths of all other Persons" are to be counted for the apportionment of seats in the House of Representatives and direct taxes. Additionally, in Article 4, Section 2, Clause 3, slaves are referred to as persons "held to Service or Labour". The Founding Fathers made some efforts to contain slavery. Many Northern states adopted legislation to end or significantly reduce slavery during and after the revolution. In 1782, Virginia passed a manumission law that allowed owners to free their slaves by will or deed. As a result, thousands of slaves were manumitted in Virginia. In the Ordinance of 1784, Jefferson proposed to ban slavery in all the western territories; the proposal failed in Congress by one vote. Partially following Jefferson's plan, Congress did ban slavery in the Northwest Ordinance of 1787, for lands north of the Ohio River. The international slave trade was banned in all states except South Carolina by 1800. In 1807, President Jefferson called for and signed into law a federally enforced ban on the international slave trade throughout the U.S. and its territories, making it a federal crime to import or export a slave. However, the domestic slave trade remained legal, allowing slavery to expand or diffuse into the Louisiana Territory. According to Jeffrey K. Tulis and Nicole Mellow: The Founding, Reconstruction (often called "the second founding"), and the New Deal are typically heralded as the most significant turning points in the country's history, with many observers seeing each of these as political triumphs through which the United States has come to more closely realize its liberal ideals of liberty and equality. Scholars such as Eric Foner have expanded the theme. Black abolitionists played a key role by stressing that freed blacks needed equal rights after slavery was abolished. 
Biographer David Blight states that Frederick Douglass "played a pivotal role in America's Second Founding out of the apocalypse of the Civil War, and he very much wished to see himself as a founder and a defender of the Second American Republic." Constitutional provision for racial equality for free blacks was enacted by a Republican Congress led by Thaddeus Stevens, Charles Sumner, and Lyman Trumbull. The "second founding" comprised the 13th, 14th, and 15th amendments to the Constitution. All citizens now had federal rights that could be enforced in federal court. In a deep reaction after 1876, freedmen lost many of these rights and were reduced to second-class citizenship in the era of lynching and Jim Crow laws. Finally, in the 1950s, the U.S. Supreme Court started to restore those rights. Under the leadership of Martin Luther King and James Bevel, the Civil Rights movement made the nation aware of the crisis, and under President Lyndon Johnson major civil rights legislation was passed in 1964–65 and 1968. Scholarly analysis Thousands of historians have written about the American Revolution era and the founding of the United States government. While most scholarly works maintain overall objectivity, historian Arthur H. Shaffer notes that many of the early works about the American Revolution express a national bias or, conversely, a bias against the national cause. Shaffer maintains that this bias lends a direct insight into the minds of the founders and their adversaries. He notes that any bias is the product of a national interest and prevailing political mood, and as such cannot be dismissed as having no historic value for the modern historian. Conversely, various modern accounts of history contain anachronisms, present-day ideals and perceptions projected onto the past, which can distort the historical account in an effort to placate a modern audience. Several of the earliest histories of the founding of the United States and its founders were written by Jeremy Belknap, author of the three-volume The History of New-Hampshire, the first volume of which was published in 1784. Articles and books by 20th- and 21st-century historians, combined with the digitization of primary sources such as handwritten letters, continue to contribute to an encyclopedic body of knowledge about the Founding Fathers. According to American historian Joseph Ellis, the concept of the Founding Fathers of the U.S. emerged in the 1820s as the last survivors died out. Ellis says the founders, or "the fathers", comprised an aggregate of semi-sacred figures whose particular accomplishments and singular achievements were decidedly less important than their sheer presence as a powerful but faceless symbol of past greatness. For the generation of national leaders coming of age in the 1820s and 1830s, such as Andrew Jackson, Henry Clay, Daniel Webster, and John C. Calhoun, the founders represented a heroic but anonymous abstraction whose long shadow fell across all followers and whose legendary accomplishments defied comparison. We can win no laurels in a war for independence. Earlier and worthier hands have gathered them all. Nor are there places for us ... [as] the founders of states. Our fathers have filled them. But there remains to us a great duty of defence and preservation. 
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_note-Goldstein_27-9-18-135] | [TOKENS: 10515] |
Elon Musk Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026, Forbes estimates his net worth to be around US$852 billion. Born into a wealthy family in Pretoria, South Africa, Musk emigrated in 1989 to Canada; he holds Canadian citizenship because his mother was born there. He received bachelor's degrees in 1997 from the University of Pennsylvania before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002. Musk also became an American citizen in 2002. In 2002, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research but later left; growing discontent with the organization's direction and its leadership during the AI boom of the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, a Tesla pay package worth $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Musk was the largest donor in the 2024 U.S. presidential election, where he supported Donald Trump. After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for spreading COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published between 2025 and 2026 and became a topic of worldwide debate. Early life Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa. Musk therefore holds both South African and Canadian citizenship from birth. 
His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer, who partly owned a rental lodge at Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator, and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa. Although both Elon and Errol had previously stated that Errol was a part owner of a Zambian emerald mine, in 2023 Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party and has said that his children shared their father's dislike of apartheid. After his parents divorced in 1979, Elon, aged around 9, chose to live with his father because Errol had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies", where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten severely, leading to him being hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?" Elon was an enthusiastic reader of books and has attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself how to program from the VIC-20 user manual. At age twelve, Elon sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, where he graduated. He was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification. Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, as well as to ease his path to immigration to the United States. While waiting for the application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill. In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997 – a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School. 
He reportedly hosted large, ticketed house parties to help pay for tuition and wrote a business plan for an electronic book-scanning service similar to Google Books. In 1994, Musk held two internships in Silicon Valley: one at Pinnacle Research Institute, a startup investigating electrolytic supercapacitors for energy storage, and another at the Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided instead to join the Internet boom of the 1990s, applying for a job at Netscape, to which he reportedly never received a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa transitioned to an H-1B. According to numerous former business associates and shareholders, however, Musk said he was on a student visa at the time. Business career In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors. They housed the venture in a small rented office in Palo Alto. Replying to Rolling Stone, Musk denied the notion that they started their company with funds borrowed from Errol Musk, but in a tweet he acknowledged that his father had contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590,000,000 in 2025), and Musk received $22 million (equivalent to $43,000,000 in 2025) for his 7-percent share. In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and, in its initial months of operation, over 200,000 customers joined the service. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign. Due to resulting technological issues and the lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000. Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2,700,000,000 in 2025) in stock, of which Musk—the largest shareholder with 11.72% of shares—received $175.8 million (equivalent to $320,000,000 in 2025). 
In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value. In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from the Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune (equivalent to $180,000,000 in 2025), Musk founded SpaceX in May 2002 and became the company's CEO and chief engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, SpaceX was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2,400,000,000 in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, in 2015 SpaceX successfully landed the first stage of a Falcon 9 on a land platform. Later landings were achieved on autonomous spaceport drone ships, ocean-based recovery platforms. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million (equivalent to $865,000,000 in 2025) contract to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan. In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large batch of operational satellites was deployed in May 2019. As of May 2025, over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 to be $10 billion (equivalent to $12,000,000,000 in 2025). During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440,000,000 in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response. Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement. 
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11,000,000 in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm. Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass-produced all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several strong-selling electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In November 2018, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over him tweeting that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in summer 2020, and it entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1,200,000,000,000 in 2025), the sixth company in U.S. history to do so. Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second largest provider of solar power systems in the United States. In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2,700,000,000 in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla's directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor. In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence (AI) by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions such as spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials. 
Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trials—which have caused the deaths of some monkeys—have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink. In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018. It used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021. Local officials have approved further expansions of the tunnel system. In early 2017, Musk expressed interest in buying Twitter and had questioned the platform's commitment to freedom of speech. By 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder. Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter. By the end of April, Musk had successfully concluded his bid for approximately $44 billion. This included approximately $12.5 billion in loans and $21 billion in equity financing. After attempting to back out of the deal, Musk completed the purchase on October 27, 2022. Immediately after the acquisition, Musk fired several top Twitter executives, including CEO Parag Agrawal, and became CEO himself. Under Musk, Twitter instituted a monthly subscription for a "blue check" and laid off a significant portion of the company's staff. Musk reduced content moderation, and hate speech on the platform increased after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of the Hunter Biden laptop controversy in the lead-up to the 2020 presidential election. Musk promised to step down as CEO after a Twitter poll, and five months later he did so, transitioning to the roles of executive chairman and chief technology officer (CTO). Despite Musk stepping down as CEO, X continues to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence critics, such as Twitch streamer Asmongold, who criticized him during one of his streams, by removing their accounts' blue checkmarks, which hinders visibility and is considered a form of shadow banning, or by suspending their accounts without justification. 
Other activities In August 2013, Musk announced plans for a version of a vactrain and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances. In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company and initially gave $50 million. In 2018, Musk left the OpenAI board. Since then, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings like OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI. Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets and the consequent fossil fuel usage have received criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter and temporarily banned the accounts of journalists who posted stories regarding the incident, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept. In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content but framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example, the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian, and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies. Politics Musk is an outlier among business leaders, most of whom avoid partisan political advocacy. Musk was a registered independent voter when he lived in California. Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 Texas's 34th congressional district special election. In 2024, he started supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income; he also endorsed Kanye West's 2020 presidential campaign. 
In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign, and hosted DeSantis's campaign announcement on a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips. In October 2025, former vice president Kamala Harris commented that it was a mistake on the Democrats' part not to invite Musk to a White House electric vehicle event organized in August 2021, which featured executives from General Motors, Ford, and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space." Fortune remarked that the non-invitation was a nod to the United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation and suggested that the snub affected Musk's perspective. Fortune noted that, at the time, Musk said, "Yeah, seems odd that Tesla wasn't invited." A month later, he described the Biden White House as "not the friendliest administration." Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen, and other capitalists actually flourished under Biden, but that the tech leaders chose Trump for their common ground on cultural issues. By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying, "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a campaign rally and promoted conspiracy theories and falsehoods about Democrats, election fraud, and immigration in support of Trump. Musk was the largest individual donor of the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers. In 2023, Musk said he shunned the World Economic Forum because it was boring; the organization commented that it had not invited him since 2015. He has, however, participated in Dialog, an event dubbed "Tech Bilderberg" and organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain, and the United Kingdom, particularly due to his position in the U.S. government as well as his ownership of X. An NBC News analysis found he had boosted far-right political movements seeking to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023. 
During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or fascist Roman salute. He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided on partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries. The concept of DOGE emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, which Musk accepted. In November and December 2024, Musk suggested that the organization could help to cut the U.S. federal budget, consolidate the number of federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role was never clearly defined. In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE, and a federal judge has ruled that Musk acted as the de facto leader of DOGE. Musk's role in the second Trump administration attracted public backlash, particularly in response to DOGE. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He prioritized secrecy within the organization and accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by that time, most of them children. By November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when his 130-day term as a special government employee expired, with a White House official confirming that Musk's offboarding was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025. 
After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit. A feud began between Musk and Trump, most notably when Musk alleged in a June 5, 2025 post on X (formerly Twitter) that Trump "is in the Epstein files", referring to the sex offender Jeffrey Epstein, and that "that is the real reason they have not been made public". Trump responded on Truth Social, stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away, and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk and added that Musk would face "very serious consequences" if he funded Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have received criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates up until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth taxes, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration, and he regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and he identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially the colonization of Mars, which he has repeatedly promoted so that humanity can become an interplanetary species and lower the risk of human extinction. Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. While describing himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdown restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and was described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024. 
Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was in turn accused of spreading misinformation and amplifying the far-right. He has also voiced his support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S. Securities and Exchange Commission (SEC) for a tweet stating that funding had been secured for potentially taking Tesla private. The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were fined $20 million each, and Musk was forced to step down as Tesla chairman for three years, though he was able to remain CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the details of the previous agreement, including a list of topics about which Musk needed preclearance. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla's stock price ("too high imo") violated the agreement. Records released under the Freedom of Information Act (FOIA) showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting regarding "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded. McCormick called the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger's syndrome at a TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ... 
but then that turned out to be wrong — [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated he uses doctor-prescribed ketamine for occasional depression and that he doses "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine and other drugs. Musk at first refused to comment on the alleged drug use, later responding that he had not tested positive for drugs, and that if drugs somehow improved his productivity, "I would definitely take them!". Investigations by The New York Times reported Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concern among close associates troubled by his public behavior as he became more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he has said have a "restoring effect" that helps his "mental calibration". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings. He has justified the boosting by claiming that all top accounts do it, so he has to as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its creator Ubisoft for "woke" content. Musk posted to X that "DEI kills art" and singled out the inclusion of the historical figure Yasuke in the Assassin's Creed game as offensive; he also called the game "terrible". Ubisoft responded by saying that Musk's comments were "just feeding hatred" and that it was focused on producing a game, not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest that the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000. In 2002, their first child, Nevada Musk, died of sudden infant death syndrome at the age of 10 weeks. After Nevada's death, the couple used in vitro fertilization (IVF) to continue growing their family; they had twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and have shared custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year. 
After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters that are not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the ongoing surrogate pregnancy, Musk confirmed in September 2021 reports that the couple were "semi-separated"; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Musk has taken X Æ A-Xii to multiple official events in Washington, D.C. during Trump's second term in office. In July 2022, The Wall Street Journal reported that Musk allegedly had an affair in 2021 with Nicole Shanahan, the wife of Google co-founder Sergey Brin, leading to their divorce the following year. Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported that Musk had bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born in 2024 via surrogacy, and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, which media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St. Clair had filed for sole custody of her five-month-old son and for Musk to be recognised as the child's father. On March 31, 2025, Musk wrote that, while he was unsure if he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] Later reporting from The Wall Street Journal indicated that $1 million of these payments to St. Clair was structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island. In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place, as Epstein later cancelled the plans.[k] On Christmas Day 2012, Musk emailed Epstein asking "Do you have any parties planned? 
I’ve been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I’m looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable, to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein sent an email asking Musk if he had any plans to come to New York for the opening of the United Nations General Assembly, where many "interesting people" would be coming to his house, to which Musk responded that "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein responded by stating "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 if he introduced Epstein to Mark Zuckerberg, Musk responded: "I don’t recall introducing Epstein to anyone, as I don’t know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred. Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l] Wealth Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. Musk was first listed on the Forbes Billionaires List in 2012; by November 2020, around 75% of his wealth was derived from Tesla stock, although he described himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021; $400 billion in December 2024; $500 billion in October 2025; $600 billion in mid-December 2025; $700 billion later that month; and $800 billion in February 2026. In November 2025, a Tesla pay package for Musk worth potentially $1 trillion was approved, which he is to receive over 10 years if he meets specific goals. Public image Although his ventures have been highly influential within their separate industries since the 2000s, Musk only became a public figure in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions while often making controversial statements, in contrast to other billionaires, who prefer reclusiveness to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance described people's opinions of Musk as polarized due to his "part philosopher, part troll" persona on Twitter. He has drawn denunciation for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to matters such as the British survivors of grooming gangs. 
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president", or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021. Musk was selected as Time's "Person of the Year" for 2021. Then-Time editor-in-chief Edward Felsenthal wrote that "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too." |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Extraterrestrial_life#External_links] | [TOKENS: 11349] |
Contents Extraterrestrial life Extraterrestrial life, or alien life (colloquially aliens), is life that originates from another world rather than on Earth. No extraterrestrial life has yet been scientifically or conclusively detected. Such life might range from simple forms such as prokaryotes to intelligent beings, possibly bringing forth civilizations that might be far more, or far less, advanced than humans. The Drake equation speculates about the existence of sapient life elsewhere in the universe. The science of extraterrestrial life is known as astrobiology. Speculation about inhabited worlds beyond Earth dates back to antiquity. Early Christian writers, including Augustine, discussed ideas from thinkers like Democritus and Epicurus about countless worlds in the vast universe. Pre-modern writers typically assumed extraterrestrial "worlds" were inhabited by living beings. William Vorilong, in the 15th century, acknowledged the possibility that Jesus could have visited extraterrestrial worlds to redeem their inhabitants.: 26 In 1440, Nicholas of Cusa suggested Earth is a "brilliant star"; he theorized that all celestial bodies, even the Sun, could host life. Descartes wrote that there were no means to prove the stars were not inhabited by "intelligent creatures", but their existence was a matter of speculation.: 67 In comparison to the life-abundant Earth, the vast majority of intrasolar and extrasolar planets and moons have harsh surface conditions and disparate atmospheric chemistry, or lack an atmosphere. However, there are many extreme and chemically harsh ecosystems on Earth that do support forms of life, and such environments are often hypothesized to be where life on Earth originated. Examples include life surrounding hydrothermal vents, acidic hot springs, and volcanic lakes, as well as halophiles and the deep biosphere. Since the mid-20th century, researchers have searched for extraterrestrial life and intelligence. Solar system studies focus on Venus, Mars, Europa, and Titan, while exoplanet discoveries now total 6,022 confirmed planets in 4,490 systems as of October 2025. Depending on the category of search, methods range from analysis of telescope and specimen data to radio instruments used to detect and transmit interstellar communications. Interstellar travel remains largely hypothetical, with only the Voyager 1 and Voyager 2 probes confirmed to have entered the interstellar medium. The concept of extraterrestrial life, especially intelligent life, has greatly influenced culture and fiction. A key debate centers on contacting extraterrestrial intelligence: some advocate active attempts, while others warn it could be risky, given human history of exploiting other societies. Context Initially, after the Big Bang, the universe was too hot to allow life. It is estimated that the temperature of the universe was around 10 billion kelvins at the one-second mark. Roughly 15 million years later, it cooled to temperate levels, though the elements of organic life did not yet exist. The only freely available elements at that point were hydrogen and helium. Carbon and oxygen (and later, water) would not appear until roughly 50 million years later, created through stellar fusion. At that point, the difficulty for life to appear was not the temperature, but the scarcity of free heavy elements. Planetary systems emerged, and the first organic compounds may have formed in the protoplanetary disk of dust grains that would eventually create rocky planets like Earth. 
Although Earth was in a molten state after its birth and may have burned any organics that fell on it, it would have been more receptive once it cooled down. Once the right conditions on Earth were met, life started by a chemical process known as abiogenesis. Alternatively, life may have formed less frequently, then spread—by meteoroids, for example—between habitable planets in a process called panspermia. During most of its stellar evolution, a star combines hydrogen nuclei into helium nuclei by stellar fusion; the helium produced is slightly lighter than the hydrogen consumed, and the mass difference is released as energy. The process continues until the star uses all of its available fuel, with the speed of consumption being related to the size of the star. During its last stages, a star starts combining helium nuclei to form carbon nuclei. Larger stars can further fuse carbon and oxygen into heavier elements such as neon, silicon, and sulfur, and so on up to iron. Ultimately, the star blows much of its content back into the interstellar medium, where it joins clouds that will eventually become new generations of stars and planets. Many of those materials are the raw components of life on Earth. As this process takes place throughout the universe, these materials are ubiquitous in the cosmos and not a rarity of the Solar System. Earth is a planet in the Solar System, a planetary system formed by a star at the center, the Sun, and the objects that orbit it: other planets, moons, asteroids, and comets. The Sun is part of the Milky Way, a galaxy. The Milky Way is part of the Local Group, a galaxy group that is in turn part of the Laniakea Supercluster. The universe is composed of all similar structures in existence. The immense distances between celestial objects are a difficulty for studying extraterrestrial life. So far, humans have only set foot on the Moon and sent robotic probes to other planets and moons in the Solar System. Although probes can withstand conditions that may be lethal to humans, the distances cause long travel times: the New Horizons probe took nine years after launch to reach Pluto. No probe has ever reached extrasolar planetary systems. Voyager 2 left the Solar System at a speed of about 50,000 kilometers per hour; if it headed towards the Alpha Centauri system, the closest one to Earth at 4.4 light years, it would reach it in roughly 100,000 years (a back-of-the-envelope check of this figure follows this passage). Under current technology, such systems can only be studied by telescopes, which have limitations. It is estimated that dark matter accounts for more combined mass than stars and gas clouds, but as it plays no role in the evolution of stars and planets, it is usually not taken into account by astrobiology. There is an area around a star, the circumstellar habitable zone or "Goldilocks zone", wherein water may be at the right temperature to exist in liquid form at a planetary surface. This area is neither too close to the star, where water would become steam, nor too far away, where water would be frozen as ice. However, although useful as an approximation, planetary habitability is complex and defined by several factors. Being in the habitable zone is not enough for a planet to be habitable, or even to actually have such liquid water. Venus is located in the solar system's habitable zone, but does not have liquid water because of the conditions of its atmosphere. Jovian planets or gas giants are not considered habitable even if they orbit close enough to their stars as hot Jupiters, due to crushing atmospheric pressures. 
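The Voyager 2 travel-time figure quoted above is simple to verify. Below is a minimal arithmetic sketch in Python, assuming the standard conversion of one light year ≈ 9.461×10^12 km; the speed and distance come from the text, and the variable names are ours.

```python
# Back-of-the-envelope check of the Voyager 2 figure quoted above:
# at about 50,000 km/h, how long to cover 4.4 light years?

LIGHT_YEAR_KM = 9.461e12           # kilometres per light year (assumed conversion)
speed_km_per_h = 50_000            # Voyager 2's speed, from the text
distance_km = 4.4 * LIGHT_YEAR_KM  # distance to Alpha Centauri, from the text

hours = distance_km / speed_km_per_h
years = hours / (24 * 365.25)
print(f"{years:,.0f} years")       # ~95,000 years, i.e. roughly the 100,000 quoted
```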
The actual distances for the habitable zones vary according to the type of star, and even the activity of each specific star influences the local habitability. The type of star also defines the time the habitable zone will exist, as its presence and limits will change along with the star's stellar evolution. The Big Bang occurred 13.8 billion years ago, the Solar System was formed 4.6 billion years ago, and the first hominids appeared 6 million years ago. Life on other planets may have started, evolved, given birth to extraterrestrial intelligences, and perhaps even faced a planetary extinction event millions or billions of years ago. When considered from a cosmic perspective, the brief existence of Earth's species may suggest that extraterrestrial life would be equally fleeting on such a scale. During a period of about 7 million years, from about 10 to 17 million years after the Big Bang, the background temperature was between 373 and 273 K (100 and 0 °C; 212 and 32 °F), allowing the possibility of liquid water if any planets existed. Avi Loeb (2014) speculated that primitive life might in principle have appeared during this window, which he called "the Habitable Epoch of the Early Universe" (the sketch following this passage reproduces this window). Life on Earth is quite ubiquitous across the planet and has adapted over time to almost all the available environments in it; extremophiles and the deep biosphere thrive in even the most hostile ones. As a result, it is inferred that life on other celestial bodies may be equally adaptive. However, the origin of life is unrelated to its ease of adaptation and may have stricter requirements. A celestial body may not have any life on it, even if it were habitable. Likelihood of existence Life in the cosmos beyond Earth has not been observed. The hypothesis of ubiquitous extraterrestrial life relies on three main ideas. The first is that the size of the universe allows for plenty of planets with a habitability similar to Earth's, and that the age of the universe gives enough time for a long process analogous to the history of Earth to happen there. The second is that the substances that make life, such as carbon and water, are ubiquitous in the universe. The third is that the physical laws are universal, which means that the forces that would facilitate or prevent the existence of life would be the same ones as on Earth. According to this argument, made by scientists such as Carl Sagan and Stephen Hawking, it would be improbable for life not to exist somewhere other than Earth. This argument is embodied in the Copernican principle, which states that Earth does not occupy a unique position in the Universe, and the mediocrity principle, which states that there is nothing special about life on Earth. Other authors consider instead that life in the cosmos, or at least multicellular life, may actually be rare. The Rare Earth hypothesis maintains that life on Earth is possible because of a series of factors that range from the location in the galaxy and the configuration of the Solar System to local characteristics of the planet, and that it is unlikely that another planet simultaneously meets all such requirements. The proponents of this hypothesis consider that very little evidence suggests the existence of extraterrestrial life and that, at this point, it is just a desired result and not a reasonable scientific explanation for any gathered data. 
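The "Habitable Epoch" window quoted above can be roughly reproduced from two standard relations: the cosmic background temperature scales as T = T0(1 + z), and in the matter-dominated era cosmic time scales approximately as t ∝ (1 + z)^(-3/2). The sketch below is only an approximation under assumed present-day values (T0 ≈ 2.725 K, Hubble time ≈ 14.4 Gyr, Ωm ≈ 0.31), none of which are given in the text.

```python
import math

T0 = 2.725              # present background temperature in kelvins (assumed)
HUBBLE_TIME_GYR = 14.4  # 1/H0 in gigayears, for H0 ~ 67.7 km/s/Mpc (assumed)
OMEGA_M = 0.31          # matter density parameter (assumed)

def age_at_temperature(temp_k):
    """Approximate cosmic age in Myr when the background had this temperature,
    using the matter-dominated relation t = (2/3)(1/H0)/sqrt(Om) * (1+z)**-1.5."""
    one_plus_z = temp_k / T0
    t_gyr = (2 / 3) * HUBBLE_TIME_GYR / math.sqrt(OMEGA_M) * one_plus_z ** -1.5
    return t_gyr * 1000  # convert Gyr to Myr

for temp in (373, 273):  # boiling and freezing points of water, in kelvins
    print(f"{temp} K at ~{age_at_temperature(temp):.0f} Myr after the Big Bang")
# Prints roughly 11 Myr and 17 Myr, close to the quoted 10-17 Myr window.
```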
In 1961, astronomer and astrophysicist Frank Drake devised the Drake equation as a way to stimulate scientific dialogue at a meeting on the search for extraterrestrial intelligence (SETI). The Drake equation is a probabilistic argument used to estimate the number of active, communicative extraterrestrial civilizations in the Milky Way galaxy. The Drake equation is N = R* ⋅ fp ⋅ ne ⋅ fl ⋅ fi ⋅ fc ⋅ L.: xix Here N is the number of such civilizations; R* is the average rate of star formation in the galaxy; fp is the fraction of stars with planets; ne is the average number of planets per such star that could support life; fl is the fraction of those planets on which life actually develops; fi is the fraction of those that develop intelligent life; fc is the fraction of civilizations that release detectable signals; and L is the length of time over which such signals are released. Drake's proposed estimates are as follows, though the numbers on the right side of the equation are agreed to be speculative and open to substitution: 10,000 = 5 ⋅ 0.5 ⋅ 2 ⋅ 1 ⋅ 0.2 ⋅ 1 ⋅ 10,000 [better source needed] (a worked evaluation of these values appears at the end of this passage). The Drake equation has proved controversial since, although it is written as a mathematical equation, none of its values were known at the time. Although some values may eventually be measured, others are based on social sciences and are not knowable by their very nature. This prevents one from drawing noteworthy conclusions from the equation. Based on observations from the Hubble Space Telescope, there are nearly 2 trillion galaxies in the observable universe. It is estimated that at least ten percent of all Sun-like stars have a system of planets. By one estimate, there are 6.25×10^18 stars with planets orbiting them in the observable universe. Even if it is assumed that only one out of a billion of these stars has planets supporting life, there would be some 6.25 billion life-supporting planetary systems in the observable universe. A 2013 study based on results from the Kepler spacecraft estimated that the Milky Way contains at least as many planets as it does stars, resulting in 100–400 billion exoplanets. The nebular hypothesis, which explains the formation of the Solar System and other planetary systems, suggests that planetary systems can have several configurations, and not all of them may have rocky planets within the habitable zone. The apparent contradiction between high estimates of the probability of the existence of extraterrestrial civilisations and the lack of evidence for such civilisations is known as the Fermi paradox. Dennis W. Sciama claimed that life's existence in the universe depends on various fundamental constants. Zhi-Wei Wang and Samuel L. Braunstein suggest that a random universe capable of supporting life is likely to be just barely able to do so, offering a potential explanation of the Fermi paradox. Biochemical basis If extraterrestrial life exists, it could range from simple microorganisms and multicellular organisms similar to animals or plants, to complex alien intelligences akin to humans. When scientists talk about extraterrestrial life, they consider all those types. Although it is possible that extraterrestrial life may have other configurations, scientists use the hierarchy of lifeforms from Earth for simplicity, as it is the only one known to exist. The first basic requirement for life is an environment with non-equilibrium thermodynamics, which means that the thermodynamic equilibrium must be broken by a source of energy. The traditional sources of energy in the cosmos are the stars, as for life on Earth, which depends on the energy of the Sun. However, there are other alternative energy sources, such as volcanoes, plate tectonics, and hydrothermal vents. There are ecosystems on Earth in deep areas of the ocean that do not receive sunlight and take energy from black smokers instead. Magnetic fields and radioactivity have also been proposed as sources of energy, although they would be less efficient ones. 
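The following is the worked evaluation promised above: a minimal sketch multiplying out the Drake product with the illustrative 1961 values quoted in the text. The function and parameter names are ours, and the inputs are, as the text stresses, speculative placeholders rather than measured quantities.

```python
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """Return N, the Drake estimate of detectable civilizations in the galaxy."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

N = drake(
    R_star=5,   # average rate of star formation (stars per year)
    f_p=0.5,    # fraction of stars with planetary systems
    n_e=2,      # planets per such system that could support life
    f_l=1,      # fraction of those on which life actually appears
    f_i=0.2,    # fraction of those that develop intelligence
    f_c=1,      # fraction of civilizations releasing detectable signals
    L=10_000,   # years such signals remain detectable
)
print(N)  # 10000.0, the figure quoted in the text
```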
Life on Earth requires water in a liquid state as a solvent in which biochemical reactions take place. It is highly unlikely that an abiogenesis process can start within a gaseous or solid medium: atoms there move either too fast or too slow, making it difficult for specific ones to meet and start chemical reactions. A liquid medium also allows the transport of nutrients and substances required for metabolism. Sufficient quantities of carbon and other elements, along with water, might enable the formation of living organisms on terrestrial planets with a chemical make-up and temperature range similar to that of Earth. Life based on ammonia rather than water has been suggested as an alternative, though this solvent appears less suitable than water. It is also conceivable that there are forms of life whose solvent is a liquid hydrocarbon, such as methane, ethane or propane. Another unknown aspect of potential extraterrestrial life would be the chemical elements that would compose it. Life on Earth is largely composed of carbon, but there could be other hypothetical types of biochemistry. A replacement for carbon would need to be able to create complex molecules, store information required for evolution, and be freely available in the medium. To create DNA, RNA, or a close analog, such an element should be able to bind its atoms with many others, creating complex and stable molecules. It should be able to create at least three covalent bonds: two for making long strings and at least a third to add new links and allow for diverse information. Only nine elements meet this requirement: boron, nitrogen, phosphorus, arsenic, antimony (three bonds), carbon, silicon, germanium and tin (four bonds). As for abundance, carbon, nitrogen, and silicon are the most abundant of these in the universe, far more so than the others. In Earth's crust, the most abundant of those elements is silicon; in the hydrosphere, it is carbon; and in the atmosphere, carbon and nitrogen. Silicon, however, has disadvantages compared with carbon. The molecules formed with silicon atoms are less stable and more vulnerable to acids, oxygen, and light. An ecosystem of silicon-based lifeforms would require very low temperatures, high atmospheric pressure, an atmosphere devoid of oxygen, and a solvent other than water. The low temperatures required would add an extra problem: the difficulty of kickstarting a process of abiogenesis to create life in the first place. Norman Horowitz, head of the Jet Propulsion Laboratory bioscience section for the Mariner and Viking missions from 1965 to 1976, considered that the great versatility of the carbon atom makes it the element most likely to provide solutions, even exotic solutions, to the problems of survival of life on other planets. However, he also considered that the conditions found on Mars were incompatible with carbon-based life. Even if extraterrestrial life is based on carbon and uses water as a solvent, like Earth life, it may still have a radically different biochemistry. Life is generally considered to be a product of natural selection. It has been proposed that to undergo natural selection a living entity must have the capacity to replicate itself, the capacity to avoid damage/decay, and the capacity to acquire and process resources in support of the first two capacities. Life on Earth may have started with an RNA world and later evolved to its current form, where some of the RNA tasks were transferred to DNA and proteins. 
Extraterrestrial life may still be stuck using RNA, or may have evolved into other configurations. It is unclear if our biochemistry is the most efficient one that could be generated, or which elements would follow a similar pattern. However, it is likely that, even if cells had a different composition to those from Earth, they would still have a cell membrane. Life on Earth jumped from prokaryotes to eukaryotes and from unicellular organisms to multicellular organisms through evolution. So far no alternative process to achieve such a result has been conceived, even hypothetically. Evolution requires life to be divided into individual organisms, and no alternative organisation has been satisfactorily proposed either. At the basic level, membranes define the limit of a cell, between it and its environment, while remaining partially open to exchange energy and resources with it. The evolution from simple cells to eukaryotes, and from them to multicellular lifeforms, is not guaranteed. The Cambrian explosion took place thousands of millions of years after the origin of life, and its causes are not yet fully known. On the other hand, the jump to multicellularity took place several times, which suggests that it could be a case of convergent evolution, and so likely to take place on other planets as well. Palaeontologist Simon Conway Morris considers that convergent evolution would lead to kingdoms similar to our plants and animals, and that many features are likely to develop in alien animals as well, such as bilateral symmetry, limbs, digestive systems and heads with sensory organs. Scientists from the University of Oxford analysed the question from the perspective of evolutionary theory and wrote in a study in the International Journal of Astrobiology that aliens may be similar to humans. The planetary context would also have an influence: a planet with higher gravity would have smaller animals, and other types of stars can lead to non-green photosynthesizers. The amount of energy available would also affect biodiversity, as an ecosystem sustained by black smokers or hydrothermal vents would have less energy available than one sustained by a star's light and heat, and so its lifeforms would not grow beyond a certain complexity. There is also research in assessing the capacity of life for developing intelligence. It has been suggested that this capacity arises with the number of potential niches a planet contains, and that the complexity of life itself is reflected in the information density of planetary environments, which in turn can be computed from its niches. The conditions on the other planets of the Solar System, and presumably on most worlds beyond it, are very harsh and seem too extreme to harbor any life. These environments can combine intense UV radiation with extreme temperatures, lack of water, and other factors that do not seem to favor the creation or maintenance of extraterrestrial life. However, there is substantial evidence that some of the earliest and most basic forms of life on Earth originated in extreme environments that would seem unlikely to have harbored life. Fossil evidence, along with theories backed by years of research and study, has marked environments like hydrothermal vents and acidic hot springs as some of the first places where life could have originated on Earth. 
These environments can be considered extreme when compared to the typical ecosystems that the majority of life on Earth now inhabits; hydrothermal vents, for instance, are scorching hot because magma escaping from Earth's mantle meets the much colder oceanic water. Even today, diverse populations of bacteria inhabit the areas surrounding these hydrothermal vents, suggesting that some form of life can be supported even in the harshest of environments, such as those on the other planets of the Solar System. What makes these harsh environments plausible sites for the origin of life on Earth, as well as for the possible creation of life on other planets, is that the relevant chemical reactions form spontaneously there. For example, the hydrothermal vents found on the ocean floor are known to support many chemosynthetic processes which allow organisms to utilize energy through reduced chemical compounds that fix carbon. These reactions allow organisms to live in relatively low-oxygen environments while maintaining enough energy to support themselves. The early Earth environment was reducing, and these carbon-fixing compounds were therefore necessary for the survival and possible origin of life on Earth. From the little information that scientists have gathered regarding the atmospheres of other planets in the Milky Way galaxy and beyond, those atmospheres are most likely reducing or very low in oxygen, especially when compared with Earth's atmosphere. If the necessary elements and ions were present on these planets, the same carbon-fixing, reduced chemical compounds occurring around hydrothermal vents could also occur on these planets' surfaces and possibly result in the origin of extraterrestrial life. Planetary habitability in the Solar System The Solar System has a wide variety of planets, dwarf planets, and moons, and each one is studied for its potential to host life. Each one has its own specific conditions that may benefit or harm life. So far, the only lifeforms found are those from Earth. No extraterrestrial intelligence exists, or has ever existed, elsewhere in the Solar System. Astrobiologist Mary Voytek points out that it would be unlikely to find large ecosystems, as they would have already been detected by now. Apart from Earth, the inner Solar System is likely devoid of life. However, Venus is still of interest to astrobiologists, as it is a terrestrial planet that was likely similar to Earth in its early stages and developed in a different way. Venus has a runaway greenhouse effect, the hottest surface in the Solar System, clouds of sulfuric acid, and a thick, high-pressure carbon-dioxide atmosphere, and all its surface liquid water has been lost. Comparing the two planets helps to understand the precise differences that lead to beneficial or harmful conditions for life. Despite the conditions against life on Venus, there are suspicions that microbial life-forms may still survive in its high-altitude clouds. Mars is a cold and almost airless desert, inhospitable to life. However, recent studies revealed that water on Mars used to be quite abundant, forming rivers, lakes, and perhaps even oceans. Mars may have been habitable back then, and life on Mars may have been possible. But when the planetary core ceased to generate a magnetic field, the solar wind removed the atmosphere and the planet became vulnerable to solar radiation. Ancient life-forms may still have left fossilised remains, and microbes may still survive deep underground. 
As mentioned, the gas giants and ice giants are unlikely to contain life. The most distant solar system bodies, found in the Kuiper Belt and outwards, are locked in permanent deep-freeze, but cannot be ruled out completely. Although the giant planets themselves are highly unlikely to have life, there is much hope of finding it on moons orbiting these planets. Europa, in the Jovian system, has a subsurface ocean below a thick layer of ice. Ganymede and Callisto also have subsurface oceans, but life is less likely in them because the water is sandwiched between layers of solid ice. Europa's ocean, by contrast, would be in contact with the rocky interior, which helps the chemical reactions. It may be difficult, though, to dig deep enough to study those oceans. Enceladus, a tiny moon of Saturn with another subsurface ocean, may not need to be dug into, as it releases water to space in eruption columns. The space probe Cassini flew inside one of these, but could not make a full study because NASA did not expect this phenomenon and had not equipped the probe to study ocean water. Still, Cassini detected complex organic molecules, salts, evidence of hydrothermal activity, hydrogen, and methane. Titan is the only celestial body in the Solar System besides Earth that has liquid bodies on the surface. It has rivers, lakes, and rain of hydrocarbons such as methane and ethane, and even a cycle similar to Earth's water cycle. This special context encourages speculation about lifeforms with a different biochemistry, but the cold temperatures would make such chemistry take place at a very slow pace. Water is rock-solid on the surface, but Titan does have a subsurface water ocean like several other moons. However, it lies at such a depth that it would be very difficult to access for study. Scientific search The science that searches for and studies life in the universe, both on Earth and elsewhere, is called astrobiology. Through the study of Earth's life, the only known form of life, astrobiology seeks to understand how life starts and evolves and the requirements for its continued existence. This helps to determine what to look for when searching for life on other celestial bodies. This is a complex area of study, and uses the combined perspectives of several scientific disciplines, such as astronomy, biology, chemistry, geology, oceanography, and atmospheric sciences. The scientific search for extraterrestrial life is being carried out both directly and indirectly. As of September 2017, 3,667 exoplanets in 2,747 systems had been identified, and other planets and moons in the Solar System hold the potential for hosting primitive life such as microorganisms. As of 8 February 2021, an updated status of studies considering the possible detection of lifeforms on Venus (via phosphine) and Mars (via methane) had been reported. Scientists search for biosignatures within the Solar System by studying planetary surfaces and examining meteorites. Some claim to have identified evidence that microbial life has existed on Mars. In 1996, a controversial report stated that structures resembling nanobacteria were discovered in a meteorite, ALH84001, formed of rock ejected from Mars. Although all the unusual properties of the meteorite were eventually explained as the result of inorganic processes, the controversy over its discovery laid the groundwork for the development of astrobiology. 
An experiment on the two Viking Mars landers reported gas emissions from heated Martian soil samples that some scientists argue are consistent with the presence of living microorganisms. Lack of corroborating evidence from other experiments on the same samples suggests that a non-biological reaction is a more likely hypothesis. In February 2005, NASA scientists reported they may have found some evidence of extraterrestrial life on Mars. The two scientists, Carol Stoker and Larry Lemke of NASA's Ames Research Center, based their claim on methane signatures found in Mars's atmosphere resembling the methane production of some forms of primitive life on Earth, as well as on their own study of primitive life near the Rio Tinto river in Spain. NASA officials soon distanced NASA from the scientists' claims, and Stoker herself backed off from her initial assertions. In November 2011, NASA launched the Mars Science Laboratory mission, which landed the Curiosity rover on Mars. The rover is designed to assess past and present habitability on Mars using a variety of scientific instruments; it landed at Gale Crater in August 2012. A group of scientists at Cornell University started a catalog of microorganisms, recording the way each one reacts to sunlight. The goal is to help with the search for similar organisms on exoplanets, as the starlight reflected by planets rich in such organisms would have a specific spectrum, unlike that of starlight reflected from lifeless planets. If Earth were studied from afar with this system, it would reveal a shade of green, a result of the abundance of photosynthesizing plants. In August 2011, NASA studied meteorites found in Antarctica, finding adenine, guanine, hypoxanthine, and xanthine. Adenine and guanine are components of DNA, and the others are used in other biological processes. The studies ruled out terrestrial contamination of the meteorites, as those components would not be freely available in the form in which they were found in the samples. This discovery suggests that several organic molecules that serve as building blocks of life may be generated within asteroids and comets. In October 2011, scientists reported that cosmic dust contains complex organic compounds ("amorphous organic solids with a mixed aromatic-aliphatic structure") that could be created naturally, and rapidly, by stars. It is still unclear if those compounds played a role in the creation of life on Earth, but Sun Kwok, of the University of Hong Kong, thinks so. "If this is the case, life on Earth may have had an easier time getting started as these organics can serve as basic ingredients for life." In August 2012, and in a world first, astronomers at Copenhagen University reported the detection of a specific sugar molecule, glycolaldehyde, in a distant star system. The molecule was found around the protostellar binary IRAS 16293-2422, which is located 400 light years from Earth. Glycolaldehyde is needed to form ribonucleic acid, or RNA, which is similar in function to DNA. This finding suggests that complex organic molecules may form in stellar systems prior to the formation of planets, eventually arriving on young planets early in their formation. In December 2023, astronomers reported the first discovery, in the plumes of Enceladus, a moon of Saturn, of hydrogen cyanide, a chemical possibly essential for life as we know it, as well as other organic molecules, some of which are yet to be better identified and understood. 
According to the researchers, "these [newly discovered] compounds could potentially support extant microbial communities or drive complex organic synthesis leading to the origin of life." Although most searches are focused on the biology of extraterrestrial life, an extraterrestrial intelligence capable of developing a civilization may be detectable by other means as well. Technology may generate technosignatures, effects on the native planet that would not arise from natural causes. There are three main types of technosignatures considered: interstellar communications, effects on the atmosphere, and planetary-sized structures such as Dyson spheres. Organizations such as the SETI Institute search the cosmos for potential forms of communication. They started with radio waves, and now search for laser pulses as well. The challenge for this search is that there are natural sources of such signals as well, such as gamma-ray bursts and supernovae, and the difference between a natural signal and an artificial one would lie in its specific patterns. Astronomers intend to use artificial intelligence for this, as it can manage large amounts of data and is devoid of biases and preconceptions. Besides, even if there is an advanced extraterrestrial civilization, there is no guarantee that it is transmitting radio communications in the direction of Earth. The length of time required for a signal to travel across space means that a potential answer may arrive decades or centuries after the initial message. The atmosphere of Earth is rich in nitrogen dioxide as a result of air pollution, which can be detectable. The natural abundance of carbon, which is also relatively reactive, makes it likely to be a basic component of the development of a potential extraterrestrial technological civilization, as it is on Earth. Fossil fuels may likely be generated and used on such worlds as well. An abundance of chlorofluorocarbons in an atmosphere can also be a clear technosignature, considering their role in ozone depletion. Light pollution may be another technosignature, as multiple lights on the night side of a rocky planet can be a sign of advanced technological development. However, modern telescopes are not strong enough to study exoplanets with the level of detail required to perceive it. The Kardashev scale proposes that a civilization may eventually start consuming energy directly from its local star. This would require giant structures built next to it, called Dyson spheres. Those speculative structures would cause an excess of infrared radiation that telescopes may notice. Excess infrared radiation is typical of young stars, surrounded by dusty protoplanetary disks that will eventually form planets; an older star such as the Sun would have no natural reason to show it. The presence of heavy elements in a star's light-spectrum is another potential technosignature; such elements would (in theory) be found if the star were being used as an incinerator/repository for nuclear waste products. Some astronomers search for extrasolar planets that may be conducive to life, narrowing the search to terrestrial planets within the habitable zones of their stars. Since 1992, several thousand exoplanets have been discovered (6,128 planets in 4,584 planetary systems, including 1,017 multiple planetary systems, as of 30 October 2025). The extrasolar planets so far discovered range in size from terrestrial planets similar to Earth to gas giants larger than Jupiter. 
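The passage that follows quotes occurrence-rate estimates for habitable planets in the Milky Way. As a rough illustration of how such figures multiply out, here is a minimal sketch in which the fraction of stars counted as "Sun-like" is our own assumption, chosen only to show how the quoted order of magnitude arises; the star count and the 1-in-5 occurrence rate come from the text.

```python
stars_in_milky_way = 200e9    # star count assumed in the text below
sunlike_fraction = 0.25       # assumed share of Sun-like (G- and K-type) stars
hz_earth_rate = 1 / 5         # "about 1 in 5 Sun-like stars", from the text

candidates = stars_in_milky_way * sunlike_fraction * hz_earth_rate
print(f"{candidates:.1e}")    # 1.0e+10, the order of the quoted 11 billion
```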
The number of observed exoplanets is expected to increase greatly in the coming years.[better source needed] The Kepler space telescope has also detected a few thousand candidate planets, of which about 11% may be false positives. There is at least one planet on average per star. About 1 in 5 Sun-like stars[a] have an "Earth-sized"[b] planet in the habitable zone,[c] with the nearest expected to be within 12 light-years of Earth. Assuming 200 billion stars in the Milky Way,[d] that would be 11 billion potentially habitable Earth-sized planets in the Milky Way, rising to 40 billion if red dwarfs are included. The rogue planets in the Milky Way possibly number in the trillions. The nearest known exoplanet is Proxima Centauri b, located 4.2 light-years (1.3 pc) from Earth in the southern constellation of Centaurus. As of March 2014, the least massive exoplanet known was PSR B1257+12 A, which is about twice the mass of the Moon. The most massive planet listed on the NASA Exoplanet Archive is DENIS-P J082303.1−491201 b, about 29 times the mass of Jupiter, although according to most definitions of a planet, it is too massive to be a planet and may be a brown dwarf instead. Almost all of the planets detected so far are within the Milky Way, but there have also been a few possible detections of extragalactic planets. The study of planetary habitability also considers a wide range of other factors in determining the suitability of a planet for hosting life. One sign that a planet probably already contains life is the presence of an atmosphere with significant amounts of oxygen, since that gas is highly reactive and generally would not last long without constant replenishment. This replenishment occurs on Earth through photosynthetic organisms. One way to analyse the atmosphere of an exoplanet is through spectrography when it transits its star, though this might only be feasible with dim stars like white dwarfs. History and cultural impact The modern concept of extraterrestrial life is based on assumptions that were not commonplace during the early days of astronomy. The first explanations for the celestial objects seen in the night sky were based on mythology. Scholars from Ancient Greece were the first to consider that the universe is inherently understandable and rejected explanations based on supernatural incomprehensible forces, such as the myth of the Sun being pulled across the sky in the chariot of Apollo. They had not yet developed the scientific method and based their ideas on pure thought and speculation, but they developed precursor ideas to it, such as that explanations had to be discarded if they contradicted observable facts. The discussions of those Greek scholars established many of the pillars that would eventually lead to the idea of extraterrestrial life, such as Earth being round and not flat. The cosmos was first structured in a geocentric model, which held that the Sun and all other celestial bodies revolve around Earth. However, the Greeks did not consider those bodies to be worlds. In Greek understanding, the world was composed of both Earth and the celestial objects with noticeable movements. Anaximander thought that the cosmos was made from apeiron, a substance that created the world, and that the world would eventually return to the cosmos. 
Eventually two groups emerged: the atomists, who thought that matter both on Earth and in the cosmos was equally made of small atoms of the classical elements (earth, water, fire and air), and the Aristotelians, who thought that those elements were exclusive to Earth and that the cosmos was made of a fifth one, the aether. Atomist Epicurus thought that the processes that created the world, its animals and plants should have created other worlds elsewhere, along with their own animals and plants. Aristotle thought instead that all the earth element naturally fell towards the center of the universe, which would make it impossible for other planets to exist elsewhere. Under that reasoning, Earth was not only at the center, it was also the only planet in the universe. Cosmic pluralism, the plurality of worlds, or simply pluralism, describes the philosophical belief in numerous "worlds" in addition to Earth, which might harbor extraterrestrial life. The earliest recorded assertion of extraterrestrial human life is found in ancient scriptures of Jainism. There are multiple "worlds" mentioned in Jain scriptures that support human life. These include, among others, Bharat Kshetra, Mahavideh Kshetra, Airavat Kshetra, and Hari kshetra. Medieval Muslim writers like Fakhr al-Din al-Razi and Muhammad al-Baqir supported cosmic pluralism on the basis of the Qur'an. Chaucer's poem The House of Fame engaged in medieval thought experiments that postulated the plurality of worlds. However, those ideas about other worlds were different from the current knowledge about the structure of the universe, and did not postulate the existence of planetary systems other than the Solar System. When those authors spoke of other worlds, they meant places located at the center of their own systems, with their own stellar vaults and cosmos surrounding them. The Greek ideas and the disputes between atomists and Aristotelians outlived ancient Greece itself. The Great Library of Alexandria compiled information about them, part of which was translated by Islamic scholars and thus survived the end of the Library. Baghdad combined the knowledge of the Greeks, the Indians, the Chinese and its own scholars, and the knowledge expanded through the Byzantine Empire. From there it eventually returned to Europe by the time of the Middle Ages. However, as the Greek atomist doctrine held that the world was created by random movements of atoms, with no need for a creator deity, it became associated with atheism, and the dispute intertwined with religious ones. Still, the Church did not react to those topics in a homogeneous way, and there were stricter and more permissive views within the Church itself. The first known mention of the term 'panspermia' was in the writings of the 5th-century BC Greek philosopher Anaxagoras. He proposed the idea that life exists everywhere. By the late Middle Ages there were many known inaccuracies in the geocentric model, but it was kept in use because naked-eye observations provided limited data. Nicolaus Copernicus started the Copernican Revolution by proposing that the planets revolve around the Sun rather than Earth. His proposal had little acceptance at first because, as he kept the assumption that orbits were perfect circles, his model led to as many inaccuracies as the geocentric one. Tycho Brahe improved the available data with naked-eye observatories, which worked with highly complex sextants and quadrants. 
Tycho could not make sense of his observations, but Johannes Kepler did: orbits were not perfect circles, but ellipses. This knowledge benefited the Copernican model, which now worked almost perfectly. The invention of the telescope a short time later, and its refinement by Galileo Galilei, removed the final doubts, and the paradigm shift was completed. Under this new understanding, the notion of extraterrestrial life became feasible: if Earth is but one planet orbiting one star, there may be planets similar to Earth elsewhere. The astronomical study of distant bodies also proved that physical laws are the same elsewhere in the universe as on Earth, with nothing making the planet truly special. The new ideas were met with resistance from the Catholic church. Galileo was tried for advocating the heliocentric model, which was considered heretical, and forced to recant it. The best-known early-modern proponent of ideas of extraterrestrial life was the Italian philosopher Giordano Bruno, who argued in the 16th century for an infinite universe in which every star is surrounded by its own planetary system. Bruno wrote that other worlds "have no less virtue nor a nature different to that of our earth" and, like Earth, "contain animals and inhabitants". Bruno's belief in the plurality of worlds was one of the charges leveled against him by the Venetian Holy Inquisition, which tried and executed him. The heliocentric model was further strengthened by Sir Isaac Newton's theory of gravity, which provided the mathematics that explains the motions of all things in the universe, including planetary orbits. By this point, the geocentric model was definitively discarded. By this time, the use of the scientific method had become standard, and new discoveries were expected to provide evidence and rigorous mathematical explanations. Science also took a deeper interest in the mechanics of natural phenomena, trying to explain not just the way nature works but also the reasons it works that way. There was very little actual discussion about extraterrestrial life before this point, as the Aristotelian ideas remained influential while geocentrism was still accepted. When geocentrism was finally proved wrong, it not only meant that Earth was not the center of the universe, but also that the lights seen in the sky were not just lights, but physical objects. The notion that life might exist on them as well soon became an ongoing topic of discussion, although one with no practical ways to investigate. The possibility of extraterrestrials remained a widespread speculation as scientific discovery accelerated. William Herschel, the discoverer of Uranus, was one of many 18th–19th-century astronomers who believed that the Solar System is populated by alien life. Other scholars of the period who championed "cosmic pluralism" included Immanuel Kant and Benjamin Franklin. At the height of the Enlightenment, even the Sun and Moon were considered candidates for extraterrestrial inhabitants. Speculation about life on Mars increased in the late 19th century, following telescopic observation of apparent Martian canals – which soon, however, turned out to be optical illusions. Despite this, in 1895, American astronomer Percival Lowell published his book Mars, followed by Mars and its Canals in 1906, proposing that the canals were the work of a long-gone civilisation. Spectroscopic analysis of Mars's atmosphere began in earnest in 1894, when U.S. 
astronomer William Wallace Campbell showed that neither water nor oxygen was present in the Martian atmosphere. By 1909, better telescopes and the best perihelic opposition of Mars since 1877 conclusively put an end to the canal hypothesis. As a consequence of the belief in spontaneous generation, there was little thought about the conditions of each celestial body: it was simply assumed that life would thrive anywhere. Spontaneous generation was disproved by Louis Pasteur in the 19th century. Popular belief in thriving alien civilisations elsewhere in the Solar System nevertheless remained strong until Mariner 4 and Mariner 9 returned close-up images of Mars, which definitively debunked the idea of the existence of Martians and lowered expectations of finding alien life in general. The end of the belief in spontaneous generation forced investigation into the origin of life. Although abiogenesis is the more accepted theory, a number of authors reclaimed the term "panspermia" and proposed that life was brought to Earth from elsewhere, among them Jöns Jacob Berzelius (1834), Kelvin (1871), Hermann von Helmholtz (1879) and, somewhat later, Svante Arrhenius (1903).

The science fiction genre, although not yet so named, developed during the late 19th century. The expansion of the genre of extraterrestrials in fiction influenced popular perception of the real-life topic, making people eager to jump to conclusions about the discovery of aliens. Science marched at a slower pace: some discoveries fueled expectations, while others dashed excessive hopes. For example, with the advent of telescopes, most structures seen on the Moon or Mars were immediately attributed to Selenites or Martians, and later, more powerful instruments revealed that all such "discoveries" were natural features. A famous case is the Cydonia region of Mars, first imaged by the Viking 1 orbiter: the low-resolution photos showed a rock formation that resembled a human face, but later spacecraft took photos in higher detail that showed there was nothing special about the site.

The search for and study of extraterrestrial life became a science of its own, astrobiology. Also known as exobiology, this discipline is studied by NASA, ESA, INAF, and others. Astrobiology studies life from Earth as well, but with a cosmic perspective. For example, abiogenesis is of interest to astrobiology not because of the origin of life on Earth, but for the chances of a similar process taking place on other celestial bodies. Many aspects of life, from its definition to its chemistry, are analyzed as either likely to be similar in all forms of life across the cosmos or native only to Earth. Astrobiology, however, remains constrained by the current lack of extraterrestrial life-forms to study: all life on Earth comes from the same ancestor, and it is hard to infer general characteristics from a group with a single example to analyse. The 20th century came with great technological advances, speculation about future hypothetical technologies, and an increased basic knowledge of science among the general population thanks to science popularization through the mass media. The public interest in extraterrestrial life and the lack of discoveries by mainstream science led to the emergence of pseudosciences that provided affirmative, if questionable, answers to the existence of aliens.
Ufology claims that many unidentified flying objects (UFOs) are spaceships of alien species, and the ancient astronauts hypothesis claims that aliens visited Earth in antiquity and prehistoric times but that people of the era failed to recognize them. Most UFOs or UFO sightings can be readily explained as sightings of Earth-based aircraft (including top-secret aircraft), known astronomical objects or weather phenomena, or as hoaxes. Looking beyond the pseudosciences, Lewis White Beck strove to elevate the level of public discourse on the topic of extraterrestrial life by tracing the evolution of philosophical thought over the centuries from ancient times into the modern era. His review of the contributions made by Lucretius, Plutarch, Aristotle, Copernicus, Immanuel Kant, John Wilkins, Charles Darwin and Karl Marx demonstrated that even in modern times, humanity could be profoundly influenced in its search for extraterrestrial life by subtle and comforting archetypal ideas largely derived from firmly held religious, philosophical and existential belief systems. On a positive note, however, Beck further argued that even if the search for extraterrestrial life proves unsuccessful, the endeavor itself could have beneficial consequences by assisting humanity in its attempt to actualize superior ways of living here on Earth.

By the 21st century, it was accepted that multicellular life in the Solar System can exist only on Earth, but interest in extraterrestrial life increased regardless, as a result of advances in several sciences. Knowledge of planetary habitability allows the likelihood of finding life on each specific celestial body to be considered in scientific terms, as it is known which features are beneficial and which are harmful to life. Astronomy and telescopes have also improved to the point that exoplanets can be confirmed and even studied, increasing the number of places to search. Life may still exist elsewhere in the Solar System in unicellular form, and advances in spacecraft make it possible to send robots to study samples in situ, with tools of growing complexity and reliability. Although no extraterrestrial life has been found and life may yet prove to be unique to Earth, there are scientific reasons to suspect that it can exist elsewhere, and technological advances that may detect it if it does. Many scientists are optimistic about the chances of finding alien life. In the words of SETI's Frank Drake, "All we know for sure is that the sky is not littered with powerful microwave transmitters". Drake noted that it is entirely possible that advanced technology results in communication being carried out in some way other than conventional radio transmission. At the same time, the data returned by space probes, and giant strides in detection methods, have allowed science to begin delineating habitability criteria on other worlds, and to confirm that at least other planets are plentiful, though aliens remain a question mark. The Wow! signal, detected in 1977 by a SETI project, remains a subject of speculative debate. On the other hand, other scientists are pessimistic. Jacques Monod wrote that "Man knows at last that he is alone in the indifferent immensity of the universe, whence he has emerged by chance".
In 2000, geologist and paleontologist Peter Ward and astrobiologist Donald Brownlee published a book entitled Rare Earth: Why Complex Life is Uncommon in the Universe. In it, they discussed the Rare Earth hypothesis, which claims that Earth-like life is rare in the universe, whereas microbial life is common. Ward and Brownlee are open to the idea of evolution on other planets that is not based on essential Earth-like characteristics such as DNA and carbon. As for the possible risks, theoretical physicist Stephen Hawking warned in 2010 that humans should not try to contact alien life forms, as aliens might pillage Earth for resources. "If aliens visit us, the outcome would be much as when Columbus landed in America, which didn't turn out well for the Native Americans", he said. Jared Diamond had earlier expressed similar concerns. On 20 July 2015, Hawking and Russian billionaire Yuri Milner, along with the SETI Institute, announced a well-funded effort, called the Breakthrough Initiatives, to expand the search for extraterrestrial life. The group contracted the services of the 100-meter Robert C. Byrd Green Bank Telescope in West Virginia in the United States and the 64-meter Parkes Telescope in New South Wales, Australia. On 13 February 2015, scientists (including Geoffrey Marcy, Seth Shostak, Frank Drake and David Brin) at a convention of the American Association for the Advancement of Science discussed Active SETI and whether transmitting a message to possible intelligent extraterrestrials in the cosmos was a good idea; one result was a statement, signed by many, that a "worldwide scientific, political and humanitarian discussion must occur before any message is sent".

Government responses The 1967 Outer Space Treaty and the 1979 Moon Agreement define rules of planetary protection against potentially hazardous extraterrestrial life, and COSPAR also provides guidelines for planetary protection. In 1977, a committee of the United Nations Office for Outer Space Affairs spent a year discussing strategies for interacting with extraterrestrial life or intelligence, but the discussion ended without any conclusions. As of 2010, the UN lacks response mechanisms for the case of an extraterrestrial contact. One of the NASA divisions is the Office of Safety and Mission Assurance (OSMA), also known as the Planetary Protection Office; part of its mission is to "rigorously preclude backward contamination of Earth by extraterrestrial life." In 2016, the Chinese government released a white paper detailing its space program. According to the document, one of the research objectives of the program is the search for extraterrestrial life; it is also one of the objectives of the Chinese Five-hundred-meter Aperture Spherical Telescope (FAST) program. In 2020, Dmitry Rogozin, the head of the Russian space agency, said the search for extraterrestrial life is one of the main goals of deep space research, and acknowledged the possibility of the existence of primitive life on other planets of the Solar System. The French space agency has an office for the study of "non-identified aerospatial phenomena" and maintains a publicly accessible database of such phenomena, with over 1600 detailed entries. According to the head of the office, the vast majority of entries have a mundane explanation, but for about 25% of entries an extraterrestrial origin can neither be confirmed nor denied.
In 2020, the chairman of the Israel Space Agency, Isaac Ben-Israel, stated that the probability of detecting life in outer space is "quite large". However, he disagrees with his former colleague Haim Eshed, who stated that there are contacts between an advanced alien civilisation and some of Earth's governments.

In fiction Although the idea of extraterrestrial peoples became feasible once astronomy developed enough to understand the nature of planets, they were not initially thought of as being any different from humans: with no scientific explanation for the origin of mankind and its relation to other species, there was no reason to expect them to be any other way. This changed with the 1859 book On the Origin of Species by Charles Darwin, which proposed the theory of evolution. With the new notion that evolution on other planets might take other directions, science fiction authors created bizarre aliens, clearly distinct from humans. A usual way to do so was to add body features from other animals, such as insects or octopuses. Costuming and special-effects feasibility, alongside budget considerations, forced films and TV series to tone down the fantasy, but these limitations have lessened since the 1990s with the advent of computer-generated imagery (CGI), and later on as CGI became more effective and less expensive. Real-life events sometimes captivate people's imagination and influence works of fiction. For example, during the Barney and Betty Hill incident, the first recorded claim of an alien abduction, the couple reported that they were abducted and experimented on by aliens with oversized heads, big eyes, pale grey skin, and small noses – a description that became the grey alien archetype once it was used in works of fiction.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/State_of_Palestine] | [TOKENS: 16524] |
Contents Palestine Palestine, officially the State of Palestine, is a country in West Asia. It encompasses the Israeli-occupied West Bank, including East Jerusalem, and the Gaza Strip, collectively known as the Palestinian territories, or occupied Palestinian territory. The territories share the vast majority of their borders with Israel, with the West Bank bordering Jordan to the east and the Gaza Strip bordering Egypt to the southwest. It has a total land area of 6,020 square kilometres (2,320 sq mi), while its population exceeds five million. Its proclaimed capital is Jerusalem, while Ramallah serves as its de facto administrative center. Gaza was its largest city prior to evacuations in 2023. Situated at a continental crossroads, the Palestine region was ruled by various empires and experienced various demographic changes from antiquity to the modern era. It was a thoroughfare for the armies of the Nile and Mesopotamia and for merchants from North Africa, China and India, and it holds religious significance for Judaism, Christianity and Islam.

The ongoing Israeli–Palestinian conflict dates back to the rise of the Zionist movement, supported by the United Kingdom during World War I. The war saw Britain occupy Palestine, taking it from the Ottoman Empire, and set up Mandatory Palestine under the auspices of the League of Nations. Increased Jewish immigration led to intercommunal conflict between Jews and Palestinian Arabs, which escalated into a civil war in 1947 after a proposed partitioning by the United Nations was rejected by the Palestinians and other Arab nations. The 1948 Palestine war saw the forcible displacement of a majority of the Arab population, and consequently the establishment of Israel; these events are referred to by Palestinians as the Nakba ('catastrophe'). In the Six-Day War in 1967, Israel occupied the West Bank and the Gaza Strip, which had been held by Jordan and Egypt respectively. The Palestine Liberation Organization (PLO) declared independence in 1988. In 1993, the PLO signed the Oslo Accords with Israel, creating limited PLO governance in the West Bank and Gaza Strip through the Palestinian Authority (PA). Israel withdrew from Gaza in its unilateral disengagement in 2005, but the territory is still considered to be under military occupation and has been blockaded by Israel. In 2007, internal divisions between political factions led to a takeover of Gaza by Hamas. Since then, the West Bank has been governed in part by the Fatah-led PA, while the Gaza Strip has remained under the control of Hamas. Israel has constructed large settlements in the occupied West Bank and East Jerusalem since 1967, which currently house more than 670,000 Israeli settlers and are illegal under international law. In 2023, Hamas launched the October 7 attacks against Israel, citing Israel's blockade of Gaza, Israeli occupation, and violence against Palestinians. In response, Israel launched a military campaign in Gaza, which has caused large-scale loss of life, mass population displacement, a humanitarian crisis, and an ongoing famine in the Gaza Strip. Israel has committed genocide against the Palestinian people during its ongoing invasion and bombing of the Gaza Strip. Palestine is a permanent non-member observer state at the United Nations (UN) and is recognized as a sovereign state by 157 of the UN's 193 member states. The questions of Palestine's borders, the legal and diplomatic status of Jerusalem, and the right of return of Palestinian refugees remain unresolved.
Some of the other challenges facing Palestine include restrictions on movement, ineffective government, Israeli settlements and settler violence, and an overall poor security situation. Despite these challenges, the country maintains an emerging economy and sees frequent tourism. Arabic is the official language of the country. While the majority of Palestinians practice Islam, Christianity also has a presence. Palestine is a member of several international organizations, including the Arab League, the Organization of Islamic Cooperation and UNESCO, and a delegation of its parliamentarians sits in the Parliamentary Assembly of the Council of Europe.

Etymology The term "Palestine" (in Latin, Palæstina) comes via ancient Greek from a Semitic toponym for the general area dating back to the late second millennium BCE, a reflex of which is also to be found in the Biblical ethnonym Philistines. The term "Palestine" has been used to refer to the area at the southeast corner of the Mediterranean Sea beside Syria. In the 5th century BCE, Herodotus, in his work The Histories, used the term to describe a "district of Syria, called Palaistínē" (Ancient Greek: Συρίη ἡ Παλαιστίνη καλεομένη), in which Phoenicians interacted with other maritime peoples. Currently, the terms "Palestine", "State of Palestine", and "occupied Palestinian territory (oPt or OPT)" are used interchangeably depending on context. Specifically, the term "occupied Palestinian territory" refers as a whole to the geographical area of the Palestinian territory occupied by Israel since 1967. Palestine can, depending on context, be referred to as a country or a state, and its authorities can generally be identified as the Government of Palestine.

History The region of Palestine is part of the Levant, a land bridge between Africa and Eurasia that has traditionally served as the "crossroads of Western Asia, the Eastern Mediterranean, and Northeast Africa". Lying to the west of the Jordan Rift Valley, Palestine is, in tectonic terms, located in the "northwest of the Arabian Plate". A crossroads for religion, culture, commerce, and politics, Palestine was among the earliest regions to see human habitation, agricultural communities and civilization. In the Bronze Age, the Canaanites established city-states influenced by surrounding civilizations, among them Egypt, which ruled the area in the Late Bronze Age. During the Iron Age, the kingdoms of Israel and Judah emerged in the interior, while Philistine and Phoenician kingdoms ruled the coast. The Assyrians conquered the region in the 8th century BCE, then the Babylonians c. 601 BCE, followed by the Persian Achaemenid Empire, which conquered the Babylonian Empire in 539 BCE. Alexander the Great conquered the Persian Empire in the late 330s BCE, intensifying Hellenizing influences. Though Palestinian elites, in particular urban notable families who worked within the Ottoman bureaucracy, generally retained their loyalty to the Ottomans, they also played a significant role in the rise of Arab nationalism and the pan-Arab movements that arose in response to both the emergence of the Young Turks movement and the subsequent weakening of Ottoman power in World War I. The onset of the Zionist movement, which sought to establish a Jewish homeland in Palestine, also exercised a strong influence on Palestinian national consciousness.
Abdul Hamid II, the last sultan of the Ottoman Empire to exert effective control over a fracturing state, opposed the Zionist movement's efforts in Palestine. The end of Ottoman rule in Palestine coincided with the conclusion of World War I: in 1917, Jerusalem was captured by British forces led by General Allenby, and the defeat of the Ottoman Empire resulted in the dismantling of its rule. The failure of Emir Faisal to establish a Greater Syria in the face of French and British colonial claims to the area also shaped Palestinian elites' efforts to secure local autonomy. In the aftermath of the war, Palestine came under British control in 1920, and in 1922 the League of Nations formally approved the British Mandate for Palestine, entrusting Britain with the administration of the region. By 1920, tensions had escalated between the Jewish and Arab communities, resulting in violent clashes and riots across Palestine. Throughout the 1920s, Palestine experienced growing resistance from both Jewish and Arab nationalist movements, which manifested in sporadic violence and protests against British policies. In 1929, violent riots erupted in Palestine over disputes about Jewish immigration and access to the Western Wall in Jerusalem. The 1930s witnessed the outbreak of the Arab Revolt of 1936–1939, as Arab nationalists demanded an end to Jewish immigration and the establishment of an independent Arab state. In response, the British deployed military forces and implemented stringent security measures in an effort to quell the uprising. Arab nationalist groups, led by the Arab Higher Committee, called for an end to Jewish immigration and land sales to Jews. The issuance of the 1939 White Paper by the British government aimed to address the escalating tensions between Arabs and Jews in Palestine: this policy document imposed restrictions on Jewish immigration and land purchases, with the intention of limiting the establishment of a Jewish state. Met with strong opposition from the Zionist movement, the White Paper was perceived as a betrayal of the Balfour Declaration and of Zionist aspirations for a Jewish homeland. In response, the Zionist community in Palestine organized a strike in 1939, rallying against the restrictions on Jewish immigration and land acquisition. This anti-White Paper strike involved demonstrations, civil disobedience, and a shutdown of businesses, and was supported by various Zionist organizations, including the Jewish Agency and the Histadrut (General Federation of Jewish Labor). In the late 1930s and 1940s, several Zionist militant groups, including the Irgun, Haganah, and Lehi, carried out acts of violence against British military and civilian targets in their pursuit of an independent Jewish state. Menachem Begin and Yitzhak Shamir, who later became Prime Ministers of Israel, were behind these terrorist attacks. In 1946, a bombing orchestrated by the Irgun at the King David Hotel in Jerusalem resulted in the deaths of 91 people, including British officials, civilians, and hotel staff.
The Exodus 1947 incident unfolded when a ship carrying Jewish Holocaust survivors seeking refuge in Palestine was intercepted by the British navy, leading to clashes and the eventual deportation of the refugees back to Europe. During World War II, Palestine had served as a strategically significant location for British military operations against Axis forces in North Africa, while the Grand Mufti of Jerusalem, Haj Amin al-Husseini, collaborated with Nazi Germany from exile. In 1947, the United Nations proposed a partition plan for Palestine, suggesting separate Jewish and Arab states, but it was rejected by the Palestinians and by neighbouring Arab nations, leading to the outbreak of a civil war in Palestine, the first phase of the broader 1948 Palestine war. During the war, Israel gained additional territories that had been designated as part of the Arab state under the UN plan. On May 14, 1948, on the eve of the final British withdrawal, the Jewish Agency for Israel, headed by David Ben-Gurion, declared the establishment of the State of Israel, and the neighbouring Arab states of Transjordan and Egypt, together with the other members of the Arab League of the time, entered the war in Palestine, beginning the 1948 Arab–Israeli war. By the end of the war, Egypt occupied the Gaza Strip, and Transjordan occupied and then annexed the West Bank. Egypt initially supported the creation of an All-Palestine Government but disbanded it in 1959. Transjordan never recognized it and instead decided to incorporate the West Bank into its own territory to form Jordan. The annexation was ratified in 1950 but was rejected by the international community. In 1964, when the West Bank was controlled by Jordan, the Palestine Liberation Organization was established there with the goal of confronting Israel. The Palestinian National Charter of the PLO defines the boundaries of Palestine as the whole remaining territory of the mandate, including Israel. The Six-Day War in 1967, in which Israel fought Egypt, Jordan, and Syria, ended with Israel occupying the West Bank and the Gaza Strip, among other territories. Following the Six-Day War, the PLO moved to Jordan, but relocated to Lebanon in 1971. The October 1974 Arab League summit designated the PLO as the "sole legitimate representative of the Palestinian people" and reaffirmed the Palestinian people's right to establish an independent state as a matter of urgency. In November 1974, the PLO was recognized as competent on all matters concerning the question of Palestine by the UN General Assembly, which granted it observer status as a "non-state entity" at the UN. Through the Camp David Accords of 1978, Egypt signaled an end to any claim of its own over the Gaza Strip. In July 1988, Jordan ceded its claims to the West Bank – with the exception of guardianship over Haram al-Sharif – to the PLO. After Israel captured and occupied the West Bank from Jordan and the Gaza Strip from Egypt, it began to establish Israeli settlements there. Administration of the Arab population of these territories was performed by the Israeli Civil Administration of the Coordinator of Government Activities in the Territories and by local municipal councils that had existed since before the Israeli takeover. In 1980, Israel decided to freeze elections for these councils and to establish instead Village Leagues, whose officials were under Israeli influence.
Later this model became ineffective for both Israel and the Palestinians, and the Village Leagues began to break up, the last being the Hebron League, dissolved in February 1988. The First Intifada broke out in 1987, characterized by widespread protests, strikes, and acts of civil disobedience by Palestinians in the Gaza Strip and the West Bank against the Israeli occupation. In November 1988, the PLO legislature, while in exile, declared the establishment of the "State of Palestine". Within the following month, it was recognized by many states, including Egypt and Jordan. In the Palestinian Declaration of Independence, the State of Palestine is described as being established on the "Palestinian territory", without explicitly specifying further. After the 1988 Declaration of Independence, the UN General Assembly officially acknowledged the proclamation and decided to use the designation "Palestine" instead of "Palestine Liberation Organization" in the UN. In spite of this decision, the PLO did not participate at the UN in its capacity as the State of Palestine's government. Violent clashes between Palestinian protesters and Israeli forces intensified throughout 1989, resulting in a significant loss of life and escalating tensions in the occupied territories. 1990 witnessed the imposition of strict measures by the Israeli government, including curfews and closures, in an attempt to suppress the Intifada and maintain control over the occupied territories. The 1990–1991 Gulf War brought increased attention to the conflict, leading to heightened diplomatic efforts to find a peaceful resolution. Saddam Hussein was a supporter of the Palestinian cause and won Arafat's support during the war. Following the invasion of Kuwait, Saddam surprised the international community by presenting a peace offer to Israel, offering to withdraw Iraqi forces from Kuwait in exchange for Israeli withdrawal from the West Bank, the Gaza Strip, East Jerusalem and the Golan Heights. After the peace offer was rejected, Saddam ordered the firing of Scud missiles into Israeli territory, a move that was supported by many Palestinians. The war also led to the expulsion of Palestinians from Kuwait and Saudi Arabia, as their leadership had supported Iraq.

In 1993, the Oslo Accords were signed between Israel and the Palestine Liberation Organization (PLO), leading to the establishment of the Palestinian Authority (PA) and a potential path to peace. Yasser Arafat became president of the newly formed Palestinian Authority in 1994, marking a significant step towards self-governance. Israel acknowledged the PLO negotiating team as "representing the Palestinian people", in return for the PLO recognizing Israel's right to exist in peace, accepting UN Security Council resolutions 242 and 338, and rejecting "violence and terrorism". As a result, in 1994 the PLO established the Palestinian National Authority (PNA or PA), a territorial administration that exercises some governmental functions in parts of the West Bank and the Gaza Strip. As envisioned in the Oslo Accords, Israel allowed the PLO to establish interim administrative institutions in the Palestinian territories, which came in the form of the PNA. The PNA was given civilian control in Area B and civilian and security control in Area A, and remained without involvement in Area C. The peace process met with opposition from both Palestinians and Israelis.
Islamist militant organizations such as Hamas and Islamic Jihad opposed the accords and responded by conducting attacks on civilians across Israel. In 1994, Baruch Goldstein, an Israeli extremist, shot 29 people to death in Hebron in what became known as the Cave of the Patriarchs massacre. These events led to an increase in Palestinian opposition to the peace process. In 1995, Israeli Prime Minister Yitzhak Rabin was assassinated by the extremist Yigal Amir, causing political instability in the region. The first-ever Palestinian general elections took place in 1996, resulting in Arafat's election as president and the formation of a Palestinian Legislative Council. Initiating the implementation of the Oslo Accords, Israel began redeploying its forces from select Palestinian cities in the West Bank in 1997. That year, the Israeli government led by Benjamin Netanyahu and the Palestinian government signed the Hebron Protocol, which outlined the redeployment of Israeli forces from parts of Hebron in the West Bank, granting the Palestinian Authority greater control over the city. Negotiations between Israel and the Palestinian Authority continued in 1998, albeit with slow progress and contentious debates on Jerusalem, settlements, and refugees. Israel and the Palestinian government signed the Wye River Memorandum in 1998, aiming to advance the implementation of the Oslo Accords; the agreement included provisions for Israeli withdrawals and security cooperation. The Oslo years brought prosperity to the government-controlled areas, despite some economic issues. The Palestinian Authority built the country's second airport in Gaza, after Jerusalem International Airport; the inaugural ceremony was attended by Bill Clinton and Nelson Mandela. In 1999, Ehud Barak assumed the position of Israeli Prime Minister, renewing efforts to reach a final status agreement with the Palestinians. The Camp David Summit of 2000, a peace summit between Yasser Arafat and Ehud Barak mediated by Bill Clinton, was intended to produce a final agreement officially ending the conflict; however, it concluded without a comprehensive agreement, failing to resolve the Palestinian refugee issue, the status of Jerusalem, and Israeli security concerns. Both sides blamed each other for the summit's failure, which became one of the main triggers for the uprising that followed. In September 2000, the then opposition leader from the Likud Party, Ariel Sharon, made a provocative visit to the Temple Mount and delivered a controversial speech, which angered Palestinian Jerusalemites. The tensions escalated into riots, and bloody clashes took place around Jerusalem. The escalating violence resulted in the closure of Jerusalem Airport, which has not operated since. Further riots between Jews and Arabs took place across Israel in October 2000. In the same month, two Israeli soldiers were lynched and killed in Ramallah. Between November and December, clashes between Palestinians and Israelis increased further. In 2001 the Taba summit was held between Israel and Palestine, but it failed, and Ariel Sharon became prime minister in the 2001 elections. By 2001, attacks by Palestinian militant groups on Israel had increased. Gaza Airport was destroyed in an airstrike by the Israeli army in 2001, with Israel claiming this was retaliation for previous attacks by Hamas.
In January 2002, the IDF's Shayetet 13 naval commandos captured the Karine A, a freighter carrying weapons from Iran towards Israel. UNSC Resolution 1397 was passed, reaffirming a two-state solution and laying the groundwork for a road map for peace. Another attack by Hamas, the Passover massacre, left 30 people dead in Netanya. A peace summit was organized by the Arab League in Beirut, which was endorsed by Arafat and largely ignored by Israel. In 2002, Israel launched Operation Defensive Shield in response to the Passover massacre. Heavy fighting between the IDF and Palestinian fighters took place in Jenin. The Church of the Nativity was besieged by the IDF for one week until successful negotiations resulted in the withdrawal of the Israeli troops from the church. Between 2003 and 2004, members of the Qawasameh clan in Hebron were either killed or blew themselves up in suicide bombings. Ariel Sharon ordered the construction of barriers between Palestinian-controlled areas and Israeli settlements in the West Bank to prevent future attacks. Saddam Hussein provided financial support from Iraq to Palestinian militants during the intifada period, from 2000 until his overthrow in 2003. A peace proposal was made in 2003, which was supported by Arafat and rejected by Sharon. In 2004, Hamas's leader and co-founder, Ahmed Yassin, was assassinated by the Israeli army in Gaza. Yasser Arafat was confined to his headquarters in Ramallah, and on 11 November 2004 he died in Paris. In the first week of 2005, Mahmoud Abbas was elected as the president of the State of Palestine. In 2005, Israel completely withdrew from the Gaza Strip, dismantling its settlements there, and the situation began to de-escalate. In 2006, Hamas won the Palestinian legislative elections, which led to a political standoff with Fatah. Armed clashes took place across both the West Bank and the Gaza Strip, escalating into a civil war that culminated in bloody fighting in the Gaza Strip, as a result of which Hamas gained control over the entire territory of Gaza. Hundreds of people, including militants and civilians, were killed in the civil war. Since then, Hamas has gained more independence in its military practices. Since 2007, Israel has maintained a partial blockade of Gaza. Another peace summit was organized by the Arab League in 2007, with the same offer that had been presented at the 2002 summit; however, the peace process could not progress. The PNA had gained full control of the Gaza Strip with the exception of its borders, airspace, and territorial waters. The division between the West Bank and Gaza complicated efforts to achieve Palestinian unity and negotiate a comprehensive peace agreement with Israel. Multiple rounds of reconciliation talks were held, but no lasting agreement was reached. The division also hindered the establishment of a unified Palestinian state and led to different governance structures and policies in the two territories. Following the intra-Palestinian conflict that began in 2006, Hamas took over control of the Gaza Strip in 2007 (it already had a majority in the PLC), and Fatah took control of the West Bank; since then, the Gaza Strip has been governed by Hamas, and the West Bank by the Fatah-led Palestinian Authority. International efforts to revive the peace process continued, and the United States, under different administrations, made various attempts to broker negotiations between Israel and the Palestinians.
Significant obstacles, such as settlement expansion, the status of Jerusalem, borders, and the right of return for Palestinian refugees, remained unresolved. In recent years, diplomatic initiatives have emerged, including the normalization agreements between Israel and several Arab states, known as the Abraham Accords. These agreements, while not directly addressing the Israeli–Palestinian conflict, have reshaped regional dynamics and raised questions about the future of Palestinian aspirations for statehood. The status quo remains challenging for Palestinians, with ongoing issues of occupation, settlement expansion, restricted movement, and economic hardship. The October 7 attacks in Israel in 2023 were followed by the Gaza war. The war has caused widespread destruction, a humanitarian crisis, and an ongoing famine in the Gaza Strip, and most of the population has been forcibly displaced. Since the start of the war, over 60,000 Palestinians in Gaza have been killed, almost half of them women and children, and more than 148,000 injured. Israel has committed genocide against the Palestinian people during its ongoing invasion and bombing of the Gaza Strip. A study in The Lancet estimated 64,260 deaths in Gaza from traumatic injuries by June 2024, while noting a potentially larger death toll when "indirect" deaths are included; as of May 2025, a comparable figure for traumatic-injury deaths would be 93,000. The war has also spilled over into the West Bank.

Geography Areas claimed by the country, known as the Palestinian territories, lie in the Southern Levant of the Middle East region. Palestine is part of the Fertile Crescent, along with Israel, Jordan, Lebanon, Iraq and Syria. The Gaza Strip borders the Mediterranean Sea to the west, Egypt to the south, and Israel to the north and east. The West Bank is bordered by Jordan to the east, and Israel to the north, south, and west. Palestine shares maritime borders with Israel, Egypt and Cyprus. The two enclaves constituting the area claimed by the State of Palestine thus have no geographical border with one another, being separated by Israel. Together, these areas would constitute the world's 163rd largest country by land area. The West Bank is a mountainous region, divided into three parts: Mount Nablus (Jabal Nablus), the Hebron Hills, and the Jerusalem Mountains (Jibal al-Quds). The Samarian Hills and Judean Hills are mountain ranges in the West Bank, with Mount Nabi Yunis, at a height of 1,030 metres (3,380 ft) in Hebron Governorate, as their highest peak. Until the 19th century, Hebron was the highest city in the Middle East. Jerusalem is located on a plateau in the central highlands and is surrounded by valleys. The territory includes fertile valleys, such as the Jezreel Valley and the Jordan River Valley. Palestine is home to the world's largest olive tree, located in Jerusalem, and around 45% of Palestine's land is dedicated to growing olive trees. Palestine features significant lakes and rivers that play a vital role in its geography and ecosystems. The Jordan River flows southward, forming part of Palestine's eastern border and passing through the Sea of Galilee before reaching the Dead Sea; according to Christian tradition, it is the site of the baptism of Jesus. The Dead Sea, bordering the country's east, is the lowest point on Earth, and Jericho, located nearby, is the lowest city in the world. Villages and suburban areas around Jerusalem are home to ancient water bodies.
There are several river valleys (wadis) across the country. These waterways provide essential resources for agriculture and recreation while supporting various ecosystems. Three terrestrial ecoregions are found in the area: Eastern Mediterranean conifer–sclerophyllous–broadleaf forests, Arabian Desert, and Mesopotamian shrub desert. Palestine has a number of environmental issues; those facing the Gaza Strip include desertification, salination of fresh water, sewage treatment, water-borne diseases, soil degradation, and depletion and contamination of underground water resources. In the West Bank, many of the same issues apply; although fresh water is much more plentiful there, access is restricted by the ongoing dispute. Temperatures in Palestine vary widely. The climate in the West Bank is mostly Mediterranean, slightly cooler in elevated areas than along the coastal plain to the west. In the east, the West Bank includes much of the Judean Desert, including the western shoreline of the Dead Sea, characterized by a dry, hot climate. Gaza has a hot semi-arid climate (Köppen: BSh) with mild winters and dry, hot summers. Spring arrives around March–April. The hottest months are July and August, with an average high of 33 °C (91 °F); the coldest month is January, with temperatures usually around 7 °C (45 °F). Rain is scarce and generally falls between November and March; annual precipitation is approximately 116 mm (4.57 inches). Palestine does not have officially recognized national parks or protected areas. However, there are areas within the West Bank considered to have ecological and cultural significance that are being managed with conservation efforts; these areas are often referred to as nature reserves or protected zones. Located near Jericho in the West Bank, Wadi Qelt is a desert valley with unique flora and fauna. The reserve is known for its rugged landscapes, natural springs, and historical sites such as the St. George Monastery, and efforts have been made to protect the biodiversity and natural beauty of the area. The Judaean Desert is known for its Judaean camels. Qalqilya Zoo, in Qalqilya Governorate, is the only zoo currently active in the country; Gaza Zoo was closed due to poor conditions. The Israeli government has established various national parks in Area C, a practice considered illegal under international law.

Government and politics Palestine operates a semi-presidential system of government. The country consists of the institutions associated with the Palestine Liberation Organization (PLO): the President of the State of Palestine, who is appointed by the Palestinian Central Council; the Palestinian National Council; and the Executive Committee of the Palestine Liberation Organization, which performs the functions of a government in exile, maintaining an extensive foreign-relations network. The PLO is a coalition of several political parties. These institutions should be distinguished from the President of the Palestinian National Authority, the Palestinian Legislative Council, and the PNA Cabinet, all of which are instead associated with the Palestinian National Authority (PNA). Palestine's founding document is the Palestinian Declaration of Independence, which should be distinguished from the unrelated PLO Palestinian National Covenant and the PNA Palestine Basic Law.
The Palestinian government is divided between two geographic entities: the West Bank, over which the Fatah-governed Palestinian Authority has partial control, and the Gaza Strip, which is under the control of the militant group Hamas. Fatah is a secular party that was founded by Yasser Arafat and enjoys relatively good relations with the Western powers. Hamas, on the other hand, is a militant group based on Palestinian nationalist and Islamic ideology, inspired by the Muslim Brotherhood; it has tense relations with the United States but receives support from Iran. The Popular Front for the Liberation of Palestine, founded by George Habash, is another prominent secular party. Mahmoud Abbas has been the president of the country since 2005. Mohammad Shtayyeh served as prime minister of Palestine until his resignation in 2024, when Mohammad Mustafa was appointed as his successor. Yahya Sinwar was the leader of the Hamas government in the Gaza Strip until his death in October 2024. According to Freedom House, the PNA governs Palestine in an authoritarian manner, including by repressing activists and journalists critical of the government. Jerusalem, including Haram ash-Sharif, is claimed by Palestine as its capital, despite being under Israeli occupation. The temporary administrative center is in Ramallah, 10 km from Jerusalem, where the Muqata'a compound hosts state ministries and representative offices. In 2000, a government building was built in the Jerusalem suburb of Abu Dis to house the office of Yasser Arafat and the Palestinian parliament. Since the Second Intifada, the condition of the town has made the site unsuitable to operate as a capital, either temporarily or permanently. Nevertheless, the Palestinians have maintained a presence in the city, and some countries have their consulates in Jerusalem delegated to Palestine. The State of Palestine is divided into sixteen administrative divisions. The governorates in the West Bank are grouped into three areas per the Oslo II Accord. Area A forms 18% of the West Bank by area and is administered by the Palestinian government. Area B forms 22% of the West Bank and is under Palestinian civil control and joint Israeli–Palestinian security control. Area C, except East Jerusalem, forms 60% of the West Bank and is administered by the Israeli Civil Administration; the Palestinian government provides education and medical services to the 150,000 Palestinians there, an arrangement agreed upon in the Oslo II Accord by the Israeli and Palestinian leadership. More than 99% of Area C is off-limits to Palestinians due to security concerns, and it remains a point of ongoing negotiation. There are about 330,000 Israelis living in settlements in Area C; although Area C is under martial law, Israelis living there are entitled to full civic rights. (Map caption: Palestinian enclaves currently under Palestinian administration, Areas A and B, not including the Gaza Strip, which is under Hamas rule.) East Jerusalem, comprising the small pre-1967 Jordanian eastern-sector Jerusalem municipality together with a significant area of the pre-1967 West Bank demarcated by Israel in 1967, is administered as part of the Jerusalem District of Israel. It is claimed by Palestine as part of the Jerusalem Governorate.
It was effectively annexed by Israel in 1967, by application of Israeli law, jurisdiction and administration under a 1948 law amended for the purpose. This purported annexation was constitutionally reaffirmed (by implication) in the 1980 Basic Law: Jerusalem, but it is not recognized by any other country. In 2010, of the 456,000 people in East Jerusalem, roughly 60% were Palestinians and 40% were Israelis. However, since the late 2000s, Israel's West Bank security barrier has effectively re-annexed tens of thousands of Palestinians bearing Israeli ID cards to the West Bank, leaving East Jerusalem within the barrier with a small Israeli majority (60%). Under the Oslo Accords, Jerusalem was to be addressed in future negotiations. According to Israel, the Oslo Accords prohibit the Palestinian Authority from operating in Jerusalem; however, certain parts of Jerusalem – neighborhoods located outside the historic Old City but within East Jerusalem – were allotted to the Palestinian Authority.

Foreign relations are maintained in the framework of the Ministry of Foreign Affairs. The Palestine Liberation Organization (PLO) represents the State of Palestine and maintains embassies in countries that recognize it. It also participates in international organizations as a member, associate, or observer. In some cases, due to conflicting sources, it is difficult to determine whether the participation is on behalf of the State of Palestine, the PLO as a non-state entity, or the Palestinian National Authority (PNA). The Vatican shifted recognition to the State of Palestine in May 2015, following the 2012 UN vote, a change that aligned with the Holy See's evolving position. Currently, 157 UN member states recognize the State of Palestine; though some do not recognize it, they acknowledge the PLO as the representative of the Palestinian people. The PLO's Executive Committee acts as the government, empowered by the PNC. Palestine is a full member of the Arab League, the Organization of Islamic Cooperation and the Union for the Mediterranean. Sweden took a significant step in 2014 by recognizing the State of Palestine and upgrading the status of the Palestinian representative office to a full embassy, becoming the first EU member state outside the former communist bloc to officially recognize the State of Palestine. Members of the Arab League and of the Organization of Islamic Cooperation have strongly supported the country's position in its conflict with Israel. Iran has been a strong ally of Palestine since the Islamic Revolution and has provided military support to Palestinian fedayeen and militant groups, including Hamas, through its Axis of Resistance, a military coalition of governments and rebel groups from Iraq, Syria, Lebanon and Yemen of which Hamas is also part. Even before the emergence of the Iran-backed Islamic Resistance in Iraq, Iraq was a strong supporter of Palestine under the Ba'athist government of Saddam Hussein. Turkey is a supporter of Hamas, and Qatar has been a key financial supporter and has hosted Hamas leaders.
In 1988, as part of the request to admit it to UNESCO, an explanatory note was prepared that listed 92 states that had recognized the State of Palestine, including both Arab and non-Arab states such as India.: 19 India, historically a strong supporter of the Palestinian cause, especially before the 1990s, has gradually shifted to a more balanced foreign policy since 1991–92, establishing full diplomatic relations and growing ties with Israel while still maintaining some diplomatic support for Palestine. Muammar Gaddafi of Libya was a supporter of Palestinian independence and was sought as a mediator in the Arab–Israeli conflict; in 2000 he presented a one-state peace proposal titled Isratin. Relations with the United Arab Emirates deteriorated when it signed a normalization agreement with Israel. During the Sri Lankan Civil War, the PLO provided training for Tamil rebels fighting the Sri Lankan government. The Republic of Ireland, Venezuela and South Africa are political allies of Palestine and have strongly advocated for the establishment of an independent Palestine. As a result of the ongoing war, support for the country has increased: since Israel's invasion of Gaza, many countries supportive of the Palestinians have officially recognized the country, including Armenia, Spain, Norway, the Bahamas, Jamaica, Barbados, and Trinidad and Tobago.

The Palestine Liberation Organization (PLO) declared the establishment of the State of Palestine on 15 November 1988. There is a wide range of views on the legal status of the State of Palestine, both among international states and legal scholars. The existence of a state of Palestine is recognized by the states that have established bilateral diplomatic relations with it. In January 2015, the International Criminal Court affirmed Palestine's "State" status after its UN observer recognition, a move condemned by Israeli leaders as a form of "diplomatic terrorism". In December 2015, the UN General Assembly passed a resolution demanding Palestinian sovereignty over natural resources in the occupied territories; it called on Israel to cease exploitation and damage while granting Palestinians the right to seek restitution. In 1988, the State of Palestine's declaration of independence was acknowledged by the General Assembly with Resolution 43/177. In 2012, the United Nations General Assembly passed Resolution 67/19, granting Palestine "non-member observer state" status, effectively recognizing it as a sovereign state. In August 2015, Palestine's representatives at the United Nations presented a draft resolution that would allow the non-member observer states Palestine and the Holy See to raise their flags at the United Nations headquarters. Initially, the Palestinians presented their initiative as a joint effort with the Holy See, which the Holy See denied. In a letter to the Secretary-General and the President of the General Assembly, Israel's Ambassador to the UN, Ron Prosor, called the step "another cynical misuse of the UN ... in order to score political points". After the vote, which passed by 119 votes to 8 with 45 countries abstaining, US Ambassador Samantha Power said that "raising the Palestinian flag will not bring Israelis and Palestinians any closer together". US Department of State spokesman Mark Toner called it a "counterproductive" attempt to pursue statehood claims outside of a negotiated settlement.
At the ceremony itself, UN Secretary-General Ban Ki-moon said the occasion was a "day of pride for the Palestinian people around the world, a day of hope", and declared: "Now is the time to restore confidence by both Israelis and Palestinians for a peaceful settlement and, at last, the realization of two states for two peoples." The State of Palestine has been recognized by 157 of the 193 UN members and since 2012 has had the status of a non-member observer state in the United Nations. This limited status is largely due to the fact that the United States, a permanent member of the UN Security Council with veto power, has consistently used its veto, or threatened to do so, to block Palestine's full UN membership. On 29 November 2012, in a 138–9 vote (with 41 abstentions and 5 absences), the United Nations General Assembly passed Resolution 67/19, upgrading Palestine from an "observer entity" to a "non-member observer state" within the United Nations system, a move described as recognition of the PLO's sovereignty. Palestine's UN status is equivalent to that of the Holy See. The UN has permitted Palestine to title its representative office to the UN as "The Permanent Observer Mission of the State of Palestine to the United Nations", and Palestine has instructed its diplomats to officially represent "The State of Palestine" – no longer the Palestinian National Authority. On 17 December 2012, UN Chief of Protocol Yeocheol Yoon declared that "the designation of 'State of Palestine' shall be used by the Secretariat in all official United Nations documents", thus recognizing the title 'State of Palestine' as the state's official name for all UN purposes. On 21 December 2012, a UN memorandum discussed the appropriate terminology to be used following GA 67/19. It noted that there was no legal impediment to using the designation "Palestine" to refer to the geographical area of the Palestinian territory, and that there was also no bar to the continued use of the term "Occupied Palestinian Territory including East Jerusalem" or such other terminology as might customarily be used by the Assembly. As of 23 September 2025, 157 (81.3%) of the 193 member states of the United Nations have recognized the State of Palestine. Many of the countries that do not recognize the State of Palestine nevertheless recognize the PLO as the "representative of the Palestinian people". The PLO's Executive Committee is empowered by the Palestinian National Council to perform the functions of government of the State of Palestine. On 2 April 2024, Riyad Mansour, the Palestinian ambassador to the UN, requested that the Security Council consider a renewed application for membership. As of April 2024, seven UNSC members recognized Palestine, but the US indicated that it opposed the request; in addition, US law stipulates that US funding for the UN would be cut off in the event of full recognition without an Israeli–Palestinian agreement. On 18 April, the US vetoed a widely supported UN resolution that would have admitted Palestine as a full UN member. A May 2024 UNGA resolution, which took effect with the General Assembly's 2024 session, recognized the Palestinian right to become a full member state, granted Palestinians the right to submit proposals and amendments, and permitted Palestine to take a seat among the member states in the assembly.
The Palestinian Security Services consist of the armed forces and intelligence agencies established under the Oslo Accords. Their function is to maintain internal security and enforce the law in the PA-controlled areas; they do not operate as the independent armed forces of a sovereign country. Before the Oslo Accords, the PLO led an armed rebellion against Israel through a coalition of militant groups that included its own military branch, the Palestine Liberation Army. Since the 1993–1995 agreements, it has been inactive and operates only in Syria. Palestinian fedayeen are Palestinian militants and guerrilla fighters, considered "freedom fighters" by Palestinians and "terrorists" by Israelis. Hamas considers itself an independent force and, along with other militant organizations such as Islamic Jihad (Al-Quds Brigades), is more powerful and influential than the Palestinian Security Forces. It is a guerrilla army supported by Iran, Qatar and Turkey. According to the CIA World Factbook, the Qassam Brigades have 20,000 to 25,000 members, although this number is disputed. Israel's 2005 withdrawal from Gaza provided Hamas with the opportunity to develop its military wing. Iran and Hezbollah have smuggled weapons to Hamas overland through the Sinai Peninsula via Sudan and Libya, as well as by sea. Intensive military training and accumulated weapons have allowed Hamas to gradually organize regional units as large as brigades containing 2,500–3,500 fighters each. Since 2020, joint exercises conducted with other militant groups in Gaza, such as the Palestinian Islamic Jihad (PIJ), have habituated units to operating in a coordinated fashion, supported Hamas command and control, and facilitated cooperation between Hamas and smaller factions. Such efforts began in earnest in 2007, upon Hamas's seizure of power in the Gaza Strip. Iran has since supplied materiel and know-how for Hamas to build a sizable rocket arsenal, with more than 10,000 rockets and mortar shells fired in the current conflict. With Iran's help, Hamas has developed robust domestic rocket production that uses pipes, electrical wiring, and other everyday materials for improvised production. The State of Palestine has a number of security forces, including a Civil Police Force, National Security Forces and Intelligence Services, with the function of maintaining security and protecting Palestinian citizens and the Palestinian state; all of these forces are part of the Palestinian Security Services, which are primarily responsible for maintaining internal security, law enforcement, and counterterrorism operations in areas under Palestinian Authority control. The Palestine Liberation Army (PLA) is the standing army of the Palestine Liberation Organization (PLO). It was established during the early years of the Palestinian national movement but has largely been inactive since the Oslo Accords; intended as a conventional military force, its role has shifted to a more symbolic and political one.

Economy Palestine is classified as a middle-income, developing country by the IMF. In 2023, the country's GDP was $40 billion, with per-capita GDP around $4,500. Due to its disputed status, economic conditions have been adversely affected. Carbon dioxide emissions were 0.6 metric tons per capita in 2010. In 2011, Palestine's poverty rate was 25.8%.
According to a World Bank report, Palestinian economic growth was expected to soften in 2023. Palestine's economy relies heavily upon international aid, remittances from overseas Palestinians and local industries. The State of Palestine's overall gross domestic product (GDP) declined by 35% in the first quarter of 2024 due to the ongoing war in Gaza, the Palestinian Central Bureau of Statistics (PCBS) reports. The decline was starkly uneven: 25% in the West Bank against 86% in the Gaza Strip amid the ongoing war. The manufacturing sector decreased by 29% in the West Bank and 95% in Gaza, while the construction sector decreased by 42% in the West Bank and essentially collapsed in Gaza, with a 99% decrease. After Israel occupied the West Bank and Gaza Strip in 1967, Palestinian agriculture suffered significant setbacks. The sector's contribution to GDP declined, and the agricultural labor force decreased. The cultivated areas in the West Bank have continuously declined since 1967. Palestinian farmers face obstacles in marketing and distributing their products, and Israeli restrictions on water usage have severely affected Palestinian agriculture. Over 85% of Palestinian water from the West Bank aquifers is used by Israel, and Palestinians are denied access to water resources from the Jordan and Yarmouk rivers. In Gaza, the coastal aquifer is suffering from saltwater intrusion. Israeli restrictions have limited the irrigation of Palestinian land: only 6% of West Bank land cultivated by Palestinians is irrigated, while Israeli settlers irrigate around 70% of their land. The Gulf War in 1991 had severe repercussions on Palestinian agriculture, as the majority of exports had previously been sent to the countries of the Arab Gulf. Palestinian exports to the Gulf States declined by 14% as a result of the war, causing a significant economic impact. Water supply and sanitation in the Palestinian territories are characterized by severe water shortage and are highly influenced by the Israeli occupation. The water resources of Palestine are largely controlled by Israel; the division of groundwater is subject to provisions in the Oslo II Accord, agreed upon by both Israeli and Palestinian leadership. Israel provides the Palestinian territories with water from its own supply and from desalination, supplying 52 million cubic metres (MCM) in 2012. Generally, water quality is considerably worse in the Gaza Strip than in the West Bank. About a third to half of the delivered water in the Palestinian territories is lost in the distribution network. The lasting blockade of the Gaza Strip and the Gaza War have caused severe damage to the infrastructure in the Gaza Strip. Concerning wastewater, the existing treatment plants do not have the capacity to treat all of the produced wastewater, causing severe water pollution. The development of the sector depends heavily on external financing. Manufacturing sectors in Palestine include textiles, food processing, pharmaceuticals, construction materials, furniture, plastic products, stone, and electronics. Notable products include clothing, olive oil, dairy, furniture, ceramics, and construction materials. Before the Second Intifada, Palestine had a strong industrial base in Jerusalem and Gaza.
Barriers erected in the West Bank have made movement of goods difficult; the blockade of the Gaza Strip has severely affected the territory's economic conditions. As of 2023, according to the Ministry of Economy, the manufacturing sector was expected to grow by 2.5% and create 79,000 jobs over the following six years. Palestine mainly exports articles of stone (limestone, marble – 13.3%), furniture (11.7%), plastics (10.2%) and iron and steel (9.1%). Most of these products are exported to Jordan, the United States, Israel and Egypt. Hebron is the most industrially advanced city in the region and serves as an export hub for Palestinian products; more than 40% of the national economy is produced there. The most advanced printing press in the Middle East is in Hebron, and many quarries are in the surrounding region. Silicon reserves are found in the Gaza territory. Jerusalem stone, extracted in the West Bank, has been used for constructing many structures in Jerusalem. Hebron is widely known for its glass production, and Nablus is noted for its Nablus soap. Some of the companies operating in the Palestinian territories include Siniora Foods, Sinokrot Industries, Schneider Electric, PepsiCo and Coca-Cola. Israeli–Palestinian economic peace efforts have resulted in several initiatives, such as the Valley of Peace initiative and Breaking the Impasse, which promote industrial projects between Israel, Palestine and other Arab countries with the goal of fostering peace and ending conflict. These include joint industrial parks opened in Palestine. The Palestinian Authority has built industrial cities in Gaza, Bethlehem, Jericho, Jenin and Hebron, some in joint cooperation with European countries. Palestine does not currently produce its own oil or gas, although UN reports state that "sizeable reserves of oil and gas" lie in the Palestinian territories. Due to its state of conflict, most of the energy and fuel in Palestine is imported from Israel and from neighboring countries such as Egypt, Jordan and Saudi Arabia. In 2012, the electricity available in the West Bank and Gaza was 5,370 GWh (3,700 in the West Bank and 1,670 in Gaza), while annual per capita consumption of electricity (after deducting transmission losses) was 950 kWh. The Gaza Power Plant is the only power plant in the Gaza Strip; it is owned by the Gaza Power Generating Company (GPGC), a subsidiary of the Palestine Electric Company PLC (PEC). The Jerusalem District Electricity Company, a subsidiary of PEC, provides electricity to Palestinian residents of Jerusalem. Government officials have increasingly focused on solar energy to reduce dependency on Israel for energy. The Palestine Investment Fund has launched "Noor Palestine", a project which aims to provide power in Palestine. Qudra Energy, a joint venture between the Bank of Palestine and NAPCO, has established solar power plants across Jammala, Nablus, Birzeit and Ramallah. In 2019, under the Noor Palestine campaign, the first solar power plant and solar park were inaugurated in Jenin, and two more solar parks have been planned for Jericho and Tubas. A new solar power plant is under construction at the Abu Dis campus of Al-Quds University to serve Palestinian Jerusalemites. Palestine holds massive potential reserves of oil and gas. Over 3 billion barrels (480,000,000 m3) of oil are estimated to exist off the coast and beneath occupied Palestinian lands. The Levant Basin holds around 1.7 billion barrels (270,000,000 m3) of oil, with another 1.5 billion barrels (240,000,000 m3) beneath the occupied West Bank area.
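These reserve figures pair barrel totals with cubic-metre equivalents. As a quick worked check (an illustration added here for clarity, not from the source, assuming the standard conversion of 1 barrel ≈ 0.159 m³):

\[
1.7 \times 10^{9}\ \text{bbl} \times 0.159\ \tfrac{\text{m}^3}{\text{bbl}} \approx 2.7 \times 10^{8}\ \text{m}^3 = 270{,}000{,}000\ \text{m}^3
\]

The same conversion reproduces the other parenthetical volumes in this passage and the next, e.g. 1.5 billion bbl ≈ 240,000,000 m³ and 3 billion bbl ≈ 480,000,000 m³.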
Around 2 billion barrels (320,000,000 m3) of oil reserves are believed to exist offshore of the Gaza Strip. According to a report by UNCTAD, around 1.25 billion barrels (about 199,000,000 m3) of oil reserves lie in the occupied Palestinian territory of the West Bank, probably in the Meged oil field. According to the Palestinian Authority, 80% of this oil field falls under lands owned by Palestinians. Masadder, a subsidiary of the Palestine Investment Fund, is developing the oilfield in the West Bank. The Block-1 field, which spans an area of 432 square kilometres (167 sq mi) from northwest Ramallah to Qalqilya, has significant potential for recoverable hydrocarbon resources. It is estimated to have a P90 (that is, a 90% probability that at least this volume is recoverable) of 0.03 billion barrels (4,800,000 m3) of recoverable oil and 6,000,000,000 cubic feet (170,000,000 m3) of recoverable gas. The estimated cost for the development of the field is $390 million, and development will be carried out under a production sharing agreement with the Government of Palestine. Currently, an initial pre-exploration work program is underway to prepare an exploration plan for approval, which will precede the full-fledged development of the field. Natural gas in Palestine is mostly found in the Gaza Strip. Gaza Marine is a natural gas field located in the Mediterranean Sea, around 32 kilometres (20 mi) off the territory's coast. It holds gas reserves ranging from 28 billion cubic metres (990 billion cubic feet) to 32 billion cubic metres (1.1 trillion cubic feet), estimates that far exceed the energy needs of the Palestinian territories. The gas field was discovered by the British Gas Group in 1999, and upon its discovery it was lauded by Yasser Arafat as a "gift from God". A regional cooperation agreement between the Palestinian Authority, Israel and Egypt was signed to develop the field, and Hamas also gave its approval to the Palestinian Authority. However, the project has been delayed by the ongoing war in Gaza. Two airports of Palestine – Jerusalem International Airport and Gaza International Airport – were destroyed by Israel in the early years of the Second Intifada, and since then no airport has been operational in the country. Palestinians instead travel through airports in Israel – Ben Gurion Airport and Ramon Airport – and through Queen Alia International Airport in Amman, the capital of Jordan. Many proposals have been made by both the government and private entities to build airports in the country; the most recent, made in 2021 jointly by the Palestinian and Israeli governments, was to redevelop Qalandia Airport as a binational airport for both Israelis and Palestinians. The Gaza Strip, where the Port of Gaza is located, is the only coastal region of Palestine; the port has been under naval siege by Israel since the territory's blockade began. During the Oslo years, the Palestinian government collaborated with the Netherlands and France to build an international seaport, but the project was abandoned. In 2021, then prime minister of Israel Naftali Bennett launched a development project for Gaza which would include a seaport. Tourism in the country refers to tourism in East Jerusalem, the West Bank and the Gaza Strip. In 2010, 4.6 million people visited the Palestinian territories, compared to 2.6 million in 2009. Of that number, 2.2 million were foreign tourists, while 2.7 million were domestic. Most tourists come for only a few hours or as part of a day trip itinerary.
In the last quarter of 2012, over 150,000 guests stayed in West Bank hotels; 40% were European and 9% were from the United States and Canada. The Lonely Planet travel guide writes that "the West Bank is not the easiest place in which to travel but the effort is richly rewarded." Sacred sites such as the Western Wall, the Church of the Holy Sepulchre, and the Al-Aqsa Mosque draw countless pilgrims and visitors each year. In 2013, Palestinian Authority tourism minister Rula Ma'ay'a stated that her government aims to encourage international visits to Palestine, but the occupation is the main factor preventing the tourism sector from becoming a major income source for Palestinians. There are no visa conditions imposed on foreign nationals other than those imposed by the visa policy of Israel. Access to Jerusalem, the West Bank, and Gaza is completely controlled by the government of Israel, and entry to the occupied Palestinian territories requires only a valid international passport. Tourism is mostly centered around Jerusalem and Bethlehem, while Jericho is a popular tourist spot for local Palestinians. Palestine is known as the "Silicon Valley of NGOs". The high-tech industry in Palestine has experienced steady growth since 2008. In 2020, the Palestinian Central Bureau of Statistics (PCBS) and the Ministry of Telecom and Information Technology said there were 4.2 million cellular mobile subscribers in Palestine, compared to 2.6 million at the end of 2010. The number of ADSL subscribers in Palestine increased to about 363 thousand by the end of 2019, from about 119 thousand at the end of 2010. In 2020, 97% of Palestinian households had at least one cellular mobile line. At least one smartphone is owned by 86% of households (91% in the West Bank and 78% in the Gaza Strip). About 80% of Palestinian households have access to the internet in their homes, and about a third have a computer. In June 2020, the World Bank approved a US$15 million grant for the Technology for Youth and Jobs (TechStart) Project, aiming to help the Palestinian IT sector upgrade the capabilities of firms and create more high-quality jobs. Kanthan Shankar, World Bank Country Director for West Bank and Gaza, said "The IT sector has the potential to make a strong contribution to economic growth. It can offer opportunities to Palestinian youth, who constitute 30% of the population and suffer from acute unemployment." The Palestine Monetary Authority has issued guidelines for the operation and provision of electronic payment services, including e-wallets and prepaid cards. The Protocol on Economic Relations, also known as the Paris Protocol, was signed between the PLO and Israel; it prohibits the Palestinian Authority from having its own currency, while paving the way for the government to collect taxes. Prior to 1994, the occupied Palestinian territories had limited banking options, with Palestinians avoiding Israeli banks; the result was an under-banked region and a cash-based economy. Currently, there are 14 banks operating in Palestine, including Palestinian, Jordanian, and Egyptian banks, compared to 21 in 2000; the number has decreased over time due to mergers and acquisitions. Deposits in Palestinian banks have seen significant growth, increasing from US$1.2 billion in 2007 to US$6.9 billion in 2018, a 475% increase. The banking sector has shown impressive annual growth rates in deposits and loan portfolios, surpassing global averages.
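The growth figures here and in the following passage follow the standard percent-increase formula; as a quick worked verification (an illustration added for clarity, not part of the source):

\[
\frac{V_{2018} - V_{2007}}{V_{2007}} \times 100\% = \frac{6.9 - 1.2}{1.2} \times 100\% = 475\%
\]

The 492 percent loan-growth figure and the 737 percent figure for Palestinian registered banks cited below follow from the same formula applied to the corresponding totals (8.4 vs. 1.42 and 5.02 vs. 0.60, in billions of US dollars).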
The combined loan facilities provided by all banks on 31 December 2018 amounted to US$8.4 billion, a growth of 492 percent compared to US$1.42 billion in 2007. Palestinian registered banks accounted for US$0.60 billion, or 42 percent, of total loans in 2007. In 2018, the loans extended by Palestinian registered banks reached US$5.02 billion, representing 61 percent of total loans, a 737 percent increase between 2007 and 2018. Currently, Palestinian registered banks hold 57 percent of customer deposits and provide 61 percent of the loans, compared to 26 percent of deposits and 42 percent of loans in 2007. According to a report by the World Bank, the economic impact of Israel's closure policy has been profound, directly contributing to a significant decline in economic activity, widespread unemployment, and a rise in poverty since the onset of the Second Intifada in September 2000. The Israeli restrictions imposed on Area C alone result in an estimated annual loss of approximately $3.4 billion, which accounts for nearly half of the current Palestinian GDP. These restrictions have severely hindered economic growth and development in the region. In the aftermath of the 2014 Gaza War, in which many structures were damaged or destroyed, the flow of construction and raw materials into Gaza has been severely limited, and regular exports from the region have been completely halted, exacerbating the economic challenges faced by the population. One of the burdensome measures imposed by Israel is the "back-to-back" system enforced at crossing points within the Palestinian territories; this policy forces shippers to unload and reload their goods from one truck to another, resulting in significant transportation costs and longer transit times for both finished products and raw materials. Under the 1995 Oslo II Accord, it was agreed that governance of Area C would be transferred to the Palestinian Authority within 18 months, except for matters to be determined in the final status agreement; however, Israel has failed to fulfill its obligations under the Oslo agreement. The European Commission has highlighted the detrimental impact of the Israeli West Bank barrier, estimating that it has led to an annual economic impoverishment of Palestinians of 2–3% of GDP. Furthermore, the escalating number of internal and external closures continues to have a devastating effect on any prospects for economic recovery in the region. In 2015, the economic impact of Israel's illegal use of Palestinian natural resources was conservatively estimated at US$1.83 billion, equivalent to 22% of Palestine's GDP that year. According to a 2015 World Bank report, the manufacturing sector's share of GDP decreased from 19% to 10% between the signing of the Oslo Accords and 2011. The same report, which adopted conservative estimates, suggests that access to Area C in specific sectors like Dead Sea minerals, telecommunications, mining, tourism, and construction could contribute at least 22% to Palestinian GDP. The report notes that Israel and Jordan together generate around $4.2 billion annually from the sale of these products, representing 6% of the global potash supply and 73% of global bromine output.
Overall, if Palestinians had unrestricted access to their own land in Area C, the potential economic benefits for Palestine could increase by 35% of GDP, amounting to at least $3.4 billion annually. Similarly, water restrictions incurred a cost of US$1.903 billion, equivalent to 23.4% of GDP, while Israel's ongoing blockade of the Gaza Strip resulted in a cost of US$1.908 billion, representing 23.5% of GDP in 2010. Demographics According to the Palestinian Central Bureau of Statistics (PCBS), as of 26 May 2021, the State of Palestine's mid-year 2021 population was 5,227,193. Ola Awad, the president of the PCBS, estimated a population of 5.3 million at the end of 2021. Within an area of 6,020 square kilometres (2,320 sq mi), the population density is about 827 people per square kilometre; for comparison, the average population density of the world was 25 people per square kilometre in 2017. Half of the Palestinian population live in the diaspora or are refugees. The successive wars arising from the conflict with Israel have resulted in the widespread displacement of Palestinians, known as the Nakba (1948) and the Naksa (1967). In the 1948 war, around 700,000 Palestinians were expelled; most sought refuge in neighboring Arab countries like Jordan, Iraq, Lebanon and Egypt, while others live as expatriates in Saudi Arabia, Qatar, Oman and Kuwait. A large number of Palestinians are in the United States, the United Kingdom and the rest of Europe. Palestinians are natively Arab and speak Arabic. Bedouin communities of Palestinian nationality comprise a minority in the West Bank, particularly around the Hebron Hills and rural Jerusalem. As of 2013, approximately 40,000 Bedouins reside in the West Bank and 5,000 Bedouins live in the Gaza Strip; Jahalin and Ta'amireh are two major Bedouin tribes in the country. A number of non-Arab ethnic groups also live in the country, with their members holding Palestinian citizenship as well; these include groups of Kurds, Nawar, Assyrians, Romani, Druze, Africans, Dom, Russians, Turks and Armenians. Most of the non-Arab Palestinian communities reside around Jerusalem. About 5,000 Assyrians live in Palestine, mostly in the holy cities of Jerusalem and Bethlehem. An estimated 200 to 450 Black Africans, known as Afro-Palestinians, live in Jerusalem. A small community of Kurds lives in Hebron. The Nawar are a small Dom and Romani community living in Jerusalem who trace their origins to India. The Russian diaspora is also found in Palestine, particularly in the Russian Compound of Jerusalem and in Hebron; most are Christians of the Russian Orthodox Church. In 2022, an estimated 5,000–6,000 Armenians lived across Israel and Palestine, of whom around 1,000 lived in Jerusalem (in the Armenian Quarter) and the rest in Bethlehem. Since 1987, an estimated 400,000 to 500,000 Turks have lived in Palestine. Due to the 1947–1949 civil war, many Turkish families fled the region and settled in Jordan, Syria and Lebanon, and according to a 2022 news article by Al-Monitor, many families of Turkish origin in Gaza have been migrating to Turkey due to the "deteriorating economic conditions in the besieged enclave". The country's minorities are also subject to the occupation and restrictions imposed by Israel. Arabic is the official language of the State of Palestine; the Palestinian Arabic dialect is commonly spoken by the local population.
Hebrew and English are also widely spoken. Around 16% of the population consists of Israeli settlers, whose primary language is typically Hebrew, and many Palestinians use Hebrew as a second or third language. The country has been known for its religious significance as the site of many holy places, with religion playing an important role in shaping its society and culture. It is traditionally part of the Holy Land, which is considered sacred to the Abrahamic religions and other faiths as well. The Basic Law states that Islam is the official religion but also grants freedom of religion, calling for respect for other faiths. Religious minorities are represented in the legislature of the Palestinian National Authority. About 98% of Palestinians are Muslim, the vast majority followers of the Sunni branch of Islam with a small Ahmadiyya minority; some 15% are nondenominational Muslims. Palestinian Christians represent a significant minority of 1%, followed by much smaller religious communities, including Baha'is and Samaritans. The largest concentrations of Christians are in Bethlehem, Beit Sahour, and Beit Jala in the West Bank, as well as in the Gaza Strip. Denominationally, most Palestinian Christians belong to Eastern Orthodox or Oriental Orthodox churches, including the Greek Orthodox Church, Armenian Apostolic Church, and Syriac Orthodox Church, and there are also significant groups of Roman Catholics, Greek Catholics (Melkites), and Protestants. With a population of 350 people, Samaritans are highly concentrated around Mount Gerizim; due to the similarities between Samaritanism and Judaism, Samaritans are often referred to as "the Jews of Palestine". The PLO considers as Palestinians those Jews who lived peacefully in the region before the rise of Zionism, and certain individuals, especially anti-Zionists such as Ilan Halevi and Uri Davis, consider themselves Palestinian Jews. Around 600,000 Israeli settlers, mostly Jews, live in the Israeli settlements across the West Bank, which are illegal under international law. The Jericho synagogue is the only synagogue maintained by the Palestinian Authority. The literacy rate of Palestine was 96.3% according to a 2014 report by the United Nations Development Programme, which is high by international standards. There is a gender gap in the population aged over 15, with 5.9% of women considered illiterate compared to 1.6% of men; illiteracy among women has fallen from 20.3% in 1997 to less than 6% in 2014. Within the State of Palestine, the Gaza Strip has the highest literacy rate. According to a Columbia University press blog, Palestinians are the most educated refugee population. The education system in Palestine encompasses both the West Bank and the Gaza Strip and is administered by the Ministry of Education and Higher Education. Basic education in Palestine includes primary school (grades 1–4) and preparatory school (grades 5–10), while secondary education consists of general secondary education (grades 11–12) and vocational education. The curriculum includes subjects such as Arabic, English, mathematics, science, social studies, and physical education; Islamic and Christian religious studies are also part of the curriculum, according to the education ministry. The West Bank and the Gaza Strip together have 14 universities, 18 university colleges, 20 community colleges, and 3,000 schools.
An-Najah National University in Nablus is the largest university in the country, followed by Al-Quds University in Jerusalem and Birzeit University in Birzeit near Ramallah. Al-Quds University achieved a 5-star rating in quality standards and was termed the "most socially responsible university in the Arab world". In 2018, Birzeit University was ranked among the top 2.7% of universities worldwide in the 2019 edition of the World University Rankings. Ali H. Nayfeh, a Palestinian scientist from Tulkarem, is regarded as the most influential scholar and scientist in the area of applied nonlinear dynamics in mechanics and engineering. According to the Palestinian Ministry of Health (MOH), as of 2017 there were 743 primary health care centers in Palestine (583 in the West Bank and 160 in Gaza) and 81 hospitals (51 in the West Bank, including East Jerusalem, and 30 in Gaza). The largest hospital in the West Bank is in Nablus, while Al-Shifa Hospital is the largest in the Gaza Strip. Operating under the auspices of the World Health Organization (WHO), the Health Cluster for the occupied Palestinian territory (oPt) was established in 2009 and represents a partnership of over 70 local and international nongovernmental organizations and UN agencies, providing a framework for health actors involved in the humanitarian response for the oPt. The Cluster is co-chaired by the MOH to ensure alignment with national policies and plans. The report of the WHO Director-General of 1 May 2019 describes health sector conditions in the oPt, identifying strategic priorities and current obstacles to their achievement, pursuant to the country cooperation strategy for WHO and the Occupied Palestinian Territory 2017–2020. Culture Palestinians are ethnically and linguistically considered part of the Arab world. The culture of Palestine is expressed through religion, the arts, literature, sports, architecture, and cinema, and has been recognized by UNESCO. The Palestine Festival of Literature (PalFest) brings together Palestinian and international writers, musicians, and artists for a celebration of literature and culture, while the annual Palestine Cinema Days festival showcases Palestinian films and filmmakers. The culture of Palestine is an amalgamation of indigenous traditions, Arab customs, and the heritage of various empires that have ruled the region. The land of Palestine has witnessed the presence of ancient civilizations such as the Canaanites, Philistines, and Israelites, each contributing to its cultural fabric. The Arab conquest in the 7th century brought the influence of Islam, which has been a cornerstone of Palestinian identity ever since; Islamic traditions, including language, art, and architecture, have infused the culture with distinct features. Palestinian cultural expression often serves as a form of resistance against occupation and oppression: street art, such as the work of Banksy in Bethlehem, and the annual Palestinian music and arts festival Al-Mahatta are examples of this cultural resistance. The Old City of Jerusalem, with its religious sites like the Western Wall, the Al-Aqsa Mosque, and the Church of the Holy Sepulchre, holds immense cultural and historical significance. Other notable cultural sites include the ancient city of Jericho, the archaeological site of Sebastia, and the town of Bethlehem. A large number of cultural centers are found throughout the country, in almost all major cities.
In 2009, Jerusalem was named Arab Capital of Culture, and Bethlehem was named Arab Capital of Culture for 2020. Palestinian cuisine was ranked among the 100 best cuisines in the world by TasteAtlas. Palestinian architecture encompasses a rich heritage that reflects the cultural and historical diversity of the region. Throughout its history, Palestinian architecture has been influenced by various civilizations, including Islamic, Byzantine, Crusader, and Ottoman. Traditional Palestinian architecture is characterized by its use of local materials such as stone and by traditional construction techniques; the architectural style varies across regions, with notable features including arched doorways, domes, and intricate geometric patterns. Islamic architecture has left a profound impact on Palestinian buildings: mosques, mausoleums, and madrasas showcase exquisite craftsmanship, with notable examples including the Al-Aqsa Mosque in Jerusalem and the Great Mosque of Nablus. Rawabi is home to the largest Roman-style amphitheatre in the Middle East and the Arab world. Palestine is home to several Byzantine and Crusader architectural marvels: the Church of the Holy Sepulchre in Jerusalem, which dates back to the 4th century, is a significant pilgrimage site, while the Crusader fortress of Krak des Chevaliers, in Syria, is another remarkable example from the wider region. During the Ottoman period, numerous mosques, palaces, and public buildings were constructed throughout Palestine, and the iconic Dome of the Rock in Jerusalem underwent restoration and renovation, showcasing a blend of Islamic and Byzantine architectural elements. Rasem Badran and Mohamed Hadid are prominent Palestinian architects. In recent years, modern architecture has emerged in Palestine, blending traditional elements with contemporary designs. The Palestinian Museum in Birzeit, designed by Heneghan Peng Architects, exemplifies this fusion, incorporating local motifs and sustainable building practices. The International Convention Center in Bethlehem is a prominent structure showcasing contemporary Palestinian architecture, and another notable building is the Palestinian National Theatre in Jerusalem. Elements of modern architecture can be found in shopping malls, luxury hotels, technology parks and high-rise towers; the Palestine Trade Tower in Ramallah is the tallest building in Palestine. Traditional Palestinian music is deeply rooted in the region's history and culture. It features instruments such as the oud (a stringed instrument), the qanun (a type of zither), and various percussion instruments, and traditional folk songs often depict themes of love, longing, and daily life. Artists like Mohammed Assaf, winner of the Arab Idol competition, have gained international recognition for their renditions of traditional Palestinian songs. Dabke is a popular Palestinian dance form accompanied by music; its lively, rhythmic accompaniment is characterized by the mijwiz (a double-piped reed instrument), the tablah (a drum), and the handclapping of dancers. Dabke songs are often performed at weddings, celebrations, and cultural events, fostering a sense of community and shared identity. Palestinian pop music has gained popularity in recent years, blending modern elements with traditional influences; artists like Mohammed Assaf, Amal Murkus, and Rim Banna have contributed to the contemporary pop scene with their unique styles and powerful voices.
Their songs address both personal and political themes, resonating with Palestinians and audiences worldwide. Palestinian hip-hop has emerged as a powerful medium for expressing the realities and struggles faced by Palestinians. Artists such as DAM, Shadia Mansour, and Tamer Nafar have gained international recognition for their socially conscious lyrics, addressing topics such as occupation, identity, and resistance; Palestinian hip-hop serves as a form of cultural resistance, amplifying the voices of Palestinian youth. Rim Banna was a Palestinian singer known for her ethereal vocals and her dedication to preserving Palestinian folk music. Reem Kelani, a Palestinian musician based in the United Kingdom, is renowned for her powerful voice and her reinterpretations of traditional Palestinian songs. Dalal Abu Amneh is a popular Palestinian singer and poet. There are a number of newspapers, news agencies, and satellite television stations in Palestine. Its news agencies include Ma'an News Agency, Wafa, and Palestine News Network, while Al-Aqsa TV, Al-Quds TV, and Sanabel TV are its main satellite broadcasters. Palestinian cinema production is centered in Jerusalem, with prominent local scenes in Ramallah, Bethlehem, and Nablus. Makram Khoury, Mohammad Bakri, Hiam Abbass, and Amal Murkus emerged as popular faces in Palestinian cinema during the 1970s and 1980s; Areen Omari, Valantina Abu Oqsa, Saleh Bakri, Tawfeek Barhom, and Ashraf Barhom became popular in the mid-1990s, while Leem Lubany and Clara Khoury have gained acclaim since 2000. Popular Palestinian films include Wedding in Galilee (1987), Chronicle of a Disappearance (1996), Divine Intervention (2002), Paradise Now (2005), The Time That Remains (2009), and Omar (2013). Documentary filmmaking has played a significant role in capturing and documenting the Palestinian experience; films like 5 Broken Cameras by Emad Burnat and Guy Davidi have received critical acclaim. Palestinian filmmakers often face unique challenges due to the political situation in the region, with many films made under the rules and strains of occupation. The Palestinian Film Festival, held annually in various cities around the world, showcases Palestinian cinema and provides a platform for Palestinian filmmakers to share their stories. Palestine has been participating in the Olympic Games since 1996, with athletes competing in various sports, including athletics, swimming, judo, and taekwondo; the country is a member of the International Olympic Committee. Association football (soccer) is the most popular sport in Palestine, with the Palestine national football team representing the state in international football under FIFA, while basketball, handball, and volleyball are also popular; the Palestinian Basketball Federation and Palestinian Handball Federation oversee those sports' development and organization. The Palestine Cup is the premier domestic football competition; it features teams from the West Bank and Gaza Strip, and the winner represents Palestine in the AFC Cup. Faisal Al-Husseini International Stadium, located in Al-Ram near Jerusalem, is the largest stadium in Palestine and serves as the home ground for the national football team. Other notable stadiums include Dora International Stadium in Hebron, Palestine Stadium in Gaza and Nablus Football Stadium in Nablus.
Mohammed Hamada is the first weightlifter from Palestine to win gold, doing so at the 2022 International Weightlifting Federation Junior World Championships in Greece.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/The_Walt_Disney_Company] | [TOKENS: 16801] |
Contents The Walt Disney Company The Walt Disney Company, commonly known simply as Disney, is an American multinational mass media and entertainment conglomerate headquartered at the Walt Disney Studios complex in Burbank, California. Founded on October 16, 1923, as an animation studio by brothers Walt Disney and Roy Oliver Disney as Disney Brothers Cartoon Studio, Disney operated under the names Walt Disney Studio and Walt Disney Productions before adopting its current name in 1986. In 1928, Disney established itself as a leader in the animation industry with the short film Steamboat Willie. The film used synchronized sound to become the first post-produced sound cartoon, and popularized Mickey Mouse, who became Disney's mascot and corporate icon. After becoming a success by the early 1940s, Disney diversified into live-action films, television, and theme parks in the 1950s. However, following Walt Disney's death in 1966, the company's profits, especially in the animation sector, began to decline. In 1984, Disney's shareholders voted Michael Eisner as CEO, who led a reversal of the company's decline through a combination of international theme park expansion and the highly successful Disney Renaissance period of animation from 1989 to 1999. In 2005, under the new CEO Bob Iger, the company continued to expand into a major entertainment conglomerate with the acquisitions of Pixar in 2006, Marvel Entertainment in 2009, Lucasfilm in 2012, and 21st Century Fox in 2019. In 2020, Bob Chapek became the head of Disney after Iger's retirement; however, Chapek was ousted in 2022 and Iger was reinstated as CEO. Disney operates the largest television and film studio in Hollywood. Walt Disney Studios includes Walt Disney Pictures, Walt Disney Animation Studios, Pixar, Marvel Studios, Lucasfilm, 20th Century Studios, 20th Century Animation, and Searchlight Pictures. Disney's other main business units include divisions operating the ABC television network; cable television networks such as Disney Channel, ESPN, Freeform, FX, and National Geographic; publishing, merchandising, music, and theater divisions; direct-to-consumer streaming services such as Disney+, ESPN+, Hulu, and Hotstar; and Disney Experiences, which includes several theme parks, resort hotels, and cruise lines around the world. Disney is one of the biggest and best-known companies in the world. Often regarded as one of the most influential entertainment brands in history, Disney is credited with revolutionizing the animation industry. In 2023, it was ranked 87th on the Forbes Global 2000 and 48th on the Fortune 500 list of the biggest companies in the United States by revenue. Since its founding, the company has won 135 Academy Awards, 26 of which were awarded to Walt. The company has produced films featured on many lists of the greatest films of all time, and it has been one of the key players in the development of the theme park industry. The company has been public since 1940; it trades on the New York Stock Exchange (NYSE) and has been a component of the Dow Jones Industrial Average since 1991. In August 2020, about two-thirds of the stock was owned by large financial institutions. The company celebrated its 100th anniversary on October 16, 2023. History In 1921, American animators Walt Disney and Ub Iwerks founded Laugh-O-Gram Studio in Kansas City, Missouri. Iwerks and Disney went on to create short films at the studio.
The final one, in 1923, was entitled Alice's Wonderland and depicted child actress Virginia Davis interacting with animated characters. While Laugh-O-Gram's shorts were popular in Kansas City, the studio went bankrupt in 1923 and Disney moved to Los Angeles to join his brother Roy O. Disney, who was recovering from tuberculosis. Shortly after Walt's move, New York film distributor Margaret J. Winkler purchased Alice's Wonderland, which began to gain popularity. Disney signed a contract with Winkler for $1,500 to create six Alice Comedies, with an option for two more six-episode series. Walt and Roy Disney founded Disney Brothers Cartoon Studio on October 16, 1923, to produce the films. In January 1926, the Disneys moved into a new studio on Hyperion Avenue and the studio's name was changed to Walt Disney Studio. After producing Alice films over the next four years, Winkler handed the role of distributing the studio's shorts to her husband, Charles Mintz. In 1927, Mintz asked for a new series, and Disney created his first series of fully animated shorts, starring a character named Oswald the Lucky Rabbit. The series was produced by Winkler Pictures and distributed by Universal Pictures, and the Walt Disney Studio completed 26 Oswald shorts. In 1928, Disney and Mintz entered into a contract dispute, with Disney asking for a larger fee while Mintz sought to reduce the price. Disney discovered that Universal Pictures owned the intellectual property rights to Oswald, and Mintz threatened to produce the shorts without him if he did not accept the reduction in payment. Disney declined, and Mintz signed four of Walt Disney Studio's primary animators to start his own studio; Iwerks was the only top animator to remain with the Disney brothers. Disney and Iwerks replaced Oswald with a mouse character originally named Mortimer Mouse, before Disney's wife urged him to change the name to Mickey Mouse. In May 1928, Mickey Mouse debuted in test screenings of the shorts Plane Crazy and The Gallopin' Gaucho. Later that year, the studio produced Steamboat Willie, its first sound film and the third short in the Mickey Mouse series; made with synchronized sound, it became the first post-produced sound cartoon. The sound was created using Powers' Cinephone system, which drew on Lee de Forest's Phonofilm system. Pat Powers' company distributed Steamboat Willie, which was an immediate hit, and in 1929 the company successfully re-released the two earlier films with synchronized sound. After the release of Steamboat Willie at the Colony Theater in New York, Mickey Mouse became an immensely popular character, and the studio made several cartoons featuring Mickey and other characters. In August 1929, the company began making the Silly Symphony series with Columbia Pictures as the distributor, because the Disney brothers felt they were not receiving their share of profits from Powers. Powers responded by signing Iwerks, who left to start his own studio. Carl W. Stalling, who played an important role in starting the series and composed the music for its early films, also left the company after Iwerks' departure. In September, theater manager Harry Woodin requested permission to start a Mickey Mouse Club at his theater, the Fox Dome, to boost attendance. Disney agreed, but David E. Dow started the first such club at the Elsinore Theatre before Woodin could start his; on December 21, the first meeting at the Elsinore Theatre was attended by around 1,200 children.
On July 24, 1929, Joseph Conley, president of King Features Syndicate, wrote to the Disney studio and asked the company to produce a Mickey Mouse comic strip; production started in November and samples were sent to King Features. On December 16, 1930, the Walt Disney Studios partnership was reorganized as a corporation under the name Walt Disney Productions, Limited, with a merchandising division named Walt Disney Enterprises and subsidiaries called Disney Film Recording Company, Limited and Liled Realty and Investment Company; the latter managed real estate holdings. Walt Disney and his wife held 60% (6,000 shares) of the company, and Roy Disney owned 40%. The Mickey Mouse comic strip debuted on January 13, 1930, in the New York Daily Mirror, and by 1931 the strip was published in 60 newspapers in the US and in 20 other countries. Disney realized that merchandise based on its characters would generate additional revenue: when a man in New York offered $300 for a license to put Mickey Mouse on writing tablets he was manufacturing, Disney accepted, and Mickey Mouse became the first licensed character. In 1933, Disney asked Kay Kamen, the owner of a Kansas City advertising firm, to run Disney's merchandising; Kamen agreed and transformed the business. Within a year, Kamen had 40 licenses for Mickey Mouse, and within two years he had made $35 million worth of sales. In 1934, Disney said he made more money from the merchandising of Mickey Mouse than from the character's films. The Waterbury Clock Company created a Mickey Mouse watch, which became so popular it saved the company from bankruptcy during the Great Depression; during a promotional event at Macy's, 11,000 Mickey Mouse watches sold in one day, and within two years two-and-a-half million watches had been sold. As Mickey Mouse became a heroic character rather than a mischievous one, Disney needed another character that could produce gags. Disney invited radio performer Clarence Nash to the animation studio, wanting Nash to play Donald Duck, a talking duck that would be the studio's new gag character. Donald Duck made his first appearance in 1934 in The Wise Little Hen. Though he did not become popular as quickly as Mickey had, Donald Duck had a featured role in Donald and Pluto (1936) and was given his own series. After a disagreement with Columbia Pictures about the Silly Symphony cartoons, Disney signed a distribution contract with United Artists, which distributed them from 1932 to 1937. In 1932, Disney signed an exclusive contract with Technicolor to produce cartoons in color until the end of 1935, beginning with the Silly Symphony short Flowers and Trees (1932); the film was the first full-color cartoon and won the Academy Award for Best Cartoon. In 1933, The Three Little Pigs, another popular Silly Symphony short, was released and also won the Academy Award for Best Cartoon. The song from the film, "Who's Afraid of the Big Bad Wolf?", composed by Frank Churchill—who wrote other Silly Symphonies songs—became popular, remained so throughout the 1930s, and became one of the best-known Disney songs. Other Silly Symphonies films won the Best Cartoon award from 1931 to 1939, except in 1938, when another Disney film, Ferdinand the Bull, won it. In 1934, Walt Disney announced a feature-length animated film, Snow White and the Seven Dwarfs; it would be the first cel-animated feature and the first animated feature produced in the US.
Its novelty made it a risky venture: Roy tried to persuade Walt not to produce it, arguing it would bankrupt the studio, and while it was widely anticipated by the public, some critics referred to it as "Disney's Folly". Walt directed the animators to take a realistic approach, creating scenes as though they were live action. While making the film, the company created the multiplane camera, consisting of pieces of glass upon which drawings were placed at different distances to create an illusion of depth in the backgrounds. After United Artists attempted to obtain future television rights to the Disney shorts, Walt signed a distribution contract with RKO Radio Pictures on March 2, 1936. Walt Disney Productions spent ten times Snow White's original budget of $150,000; its production eventually cost the company $1.5 million. Snow White took three years to make, premiering on December 12, 1937. It was an immediate critical and commercial success, becoming the highest-grossing film up to that point with a gross of $8 million (equivalent to about $179 million in 2025 dollars); with re-releases, it has grossed a total of $998,440,000 in the US adjusted for inflation. Using the profits from Snow White, Disney financed the construction of a new 51-acre studio complex in Burbank, which the company fully moved into in 1940 and where it is still headquartered. In April 1940, Disney Productions had its initial public offering, with the common stock remaining with Disney and his family; Disney did not want to go public, but the company needed the money. Shortly before Snow White's release, work began on the company's next features, Pinocchio and Bambi. Pinocchio was released in February 1940 while Bambi was postponed. Despite Pinocchio's critical acclaim (it won the Academy Awards for Best Song and Best Score and was lauded for groundbreaking achievements in animation), the film performed poorly at the box office, as World War II cut off much of the international market. The company's third feature, Fantasia (1940), introduced groundbreaking advancements in cinema technology, chiefly Fantasound, an early surround-sound system that made it the first commercial film shown in stereo. However, Fantasia similarly performed poorly at the box office. In 1941, the company experienced a major setback when 300 of its 800 animators, led by top animator Art Babbitt, went on strike for five weeks seeking unionization and higher pay. Walt Disney publicly accused the strikers of being party to a communist conspiracy and fired many of them, including some of the studio's best. Roy unsuccessfully attempted to persuade the company's main distributors to invest in the studio, which could no longer afford to offset production costs with employee layoffs. The anthology film The Reluctant Dragon (1941) earned $100,000 less than its production cost, contributing to the studio's financial woes. While negotiations to end the strike were underway, Walt and studio animators embarked on a 12-week goodwill visit to South America, funded by the Office of the Coordinator of Inter-American Affairs; during the trip, the animators began plotting films, taking inspiration from the local environments and music. As a result of the strike, federal mediators compelled the studio to recognize the Screen Cartoonist's Guild, and several animators left, leaving the studio with 694 employees.
To recover from its financial losses, Disney rushed the studio's fourth animated feature, Dumbo (1941), into production on a reduced budget; it performed well at the box office, infusing the studio with much-needed cash. After US entry into World War II, many of the company's animators were drafted into the army, and 500 United States Army soldiers occupied the studio for eight months to protect a nearby Lockheed aircraft plant. While they were there, the soldiers fixed equipment in the large soundstages and converted storage sheds into ammunition depots. The United States Navy asked Disney to produce propaganda films to build support for the war, and with the studio badly in need of income, Disney agreed, signing a contract for 20 war-related shorts for $90,000. Most of the company's employees worked on the project, which spawned films such as Victory Through Air Power, and others that included some of the company's characters. In August 1942, Disney released its fifth feature film, Bambi, after five years in development; it performed poorly at the box office. Later, as products of the South American trip, Disney released the features Saludos Amigos (1942) and The Three Caballeros (1944), a new strategy of releasing package films, collections of short cartoons grouped to make feature films. Both performed poorly. Disney released more package films through the rest of the decade, including Make Mine Music (1946), Fun and Fancy Free (1947), Melody Time (1948), and The Adventures of Ichabod and Mr. Toad (1949), to try to recover from its financial losses. Disney also began producing less-expensive live-action films mixed with animation, beginning with Song of the South (1946), which would become one of Disney's most controversial films. As a result of its financial problems, Disney began re-releasing its feature films in 1944. In 1948, it premiered the nature documentary series True-Life Adventures, which ran until 1960 and won eight Academy Awards. In 1949, the Walt Disney Music Company was founded to bolster merchandising profits. In the 1950s, Disney returned to producing full-length animated feature films, beginning with Cinderella (1950), its first such feature in eight years. A critical and commercial success, Cinderella saved Disney after the financial pitfalls of the wartime era; it was the company's most financially successful film since Snow White, making $8 million in its first year. Walt began to reduce his involvement with animation, focusing his attention on the studio's increasingly diverse portfolio of projects, including live-action films (of which Treasure Island was the studio's first), television, and amusement parks. In 1950, the company made its first foray into television when NBC aired "One Hour in Wonderland", a Coca-Cola-sponsored promotional program for Disney's next animated film, Alice in Wonderland (1951). Alice was financially unsuccessful, falling $1 million short of recouping its production budget. In February 1953, Disney's next animated film, Peter Pan, was released to financial success; it was the last Disney film distributed by RKO, after which Disney ended its contract and created its own distribution company, Buena Vista Distribution. According to Walt, he first had the idea of building an amusement park during a visit to Griffith Park with his daughters: he said he watched them ride a carousel and thought there "should be ... some kind of amusement enterprise built where the parents and the children could have fun together".
Initially planning the construction of an eight-acre (3.2 ha) Mickey Mouse Park near the Burbank studio, Walt changed the planned amusement park's name to Disneylandia, then to Disneyland. A new company, WED Enterprises (now Walt Disney Imagineering), was formed in 1952 to design and construct the park. Drawing inspiration from amusement parks in the US and Europe, Walt approached the design of Disneyland with an emphasis on thematic storytelling and cleanliness, innovative approaches for amusement parks of the time. The plan to build the park in Burbank was abandoned when Walt realized eight acres would not be enough to accomplish his vision. Disney acquired 160 acres (65 ha) of orange groves in Anaheim, southeast of Los Angeles in neighboring Orange County, at $6,200 per acre to build the park, and construction began in July 1954. Walt formed a new company, Retlaw, to handle his personal business, primarily the Carolwood Pacific Railroad. To finance the construction of Disneyland, Walt sold his home at Smoke Tree Ranch in Palm Springs, and the company promoted the park with a television series of the same name aired on ABC. The Disneyland television series, the first in a long-running line of successful anthology television programs for the company, was a success, garnering over 50% of viewers in its time slot along with praise from critics. In August, Walt formed another company, Disneyland, Inc., to finance the park, whose construction costs totaled $17 million. In October, with the success of Disneyland, ABC allowed Disney to produce The Mickey Mouse Club, a variety show for children; the show included a daily Disney cartoon, a children's newsreel, and a talent show. It was presented by a host, and by talented children and adults called "Mouseketeers" and "Mooseketeers", respectively. After the first season, over ten million children and five million adults watched it daily, and two million pairs of the Mickey Mouse ears worn by the cast were sold. In December 1954, the five-part miniseries Davy Crockett, starring Fess Parker, premiered as part of Disneyland. According to writer Neal Gabler, "[It] became an overnight national sensation", selling 10 million Crockett coonskin caps. The show's theme song, "The Ballad of Davy Crockett", became part of American pop culture, selling 10 million records; the Los Angeles Times called it "the greatest merchandising fad the world had ever seen". In June 1955, Disney's 15th animated film, Lady and the Tramp, was released and performed better at the box office than any other Disney film since Snow White. Disneyland opened on July 17, 1955; it was a major media event, broadcast live on ABC with actors Art Linkletter, Bob Cummings, and Ronald Reagan hosting, and it garnered over 90 million viewers, becoming the most-watched live broadcast to that date. While the park's opening day was disastrous (restaurants ran out of food, the Mark Twain Riverboat began to sink, other rides malfunctioned, and the drinking fountains were not working in the 100 °F (38 °C) heat), the park became a success, with 161,657 visitors in its first week and 20,000 visitors a day in its first month. After its first year, 3.6 million people had visited, and after its second year four million more guests had come, making it more popular than the Grand Canyon and Yellowstone National Park. That year, the company earned a gross total of $24.5 million, compared to $11 million the previous year. Disney continued to delegate much of the animation work to the studio's top animators, known as the Nine Old Men.
The company produced an average of five films per year throughout the 1950s and 1960s. Animated features of this period included Sleeping Beauty (1959), One Hundred and One Dalmatians (1961), and The Sword in the Stone (1963). Sleeping Beauty was a financial loss for the company and, at $6 million, had the highest production cost of any Disney film up to that point. One Hundred and One Dalmatians introduced an animation technique using the xerography process to electrostatically transfer the drawings to animation cels, resulting in a transformed art style for the studio's animated films. In 1956, the Sherman Brothers, Robert and Richard, were asked to produce a theme song for the television series Zorro. The company hired them as exclusive staff songwriters, an arrangement that lasted ten years; they wrote many songs for Disney's films and theme parks, several of which were commercial hits. In the late 1950s, Disney ventured into comedy with the live-action films The Shaggy Dog (1959), which, at over $9 million, became Disney's highest-grossing film in the US and Canada, and The Absent-Minded Professor (1961), both starring Fred MacMurray. Disney also made live-action films based on children's books, including Pollyanna (1960) and Swiss Family Robinson (1960). Child actress Hayley Mills starred in Pollyanna, for which she won an Academy Juvenile Award; Mills starred in five other Disney films, including a dual role as the twins in The Parent Trap (1961). Another child actor, Kevin Corcoran, was prominent in many Disney live-action films, first appearing in a serial for The Mickey Mouse Club, where he played a boy named Moochie. He worked alongside Mills in Pollyanna and starred in features such as Old Yeller (1957), Toby Tyler (1960), and Swiss Family Robinson. In 1964, the live action/animation musical film Mary Poppins was released to major commercial success and rapturous critical acclaim, becoming the year's highest-grossing film and winning five Academy Awards, including Best Actress for Julie Andrews as Poppins and Best Song for the Sherman Brothers' "Chim Chim Cher-ee"; the brothers also won Best Score for the film. Throughout the 1960s, Dean Jones, whom The Guardian called "the figure who most represented Walt Disney Productions in the 1960s", starred in ten Disney films, including That Darn Cat! (1965), The Ugly Dachshund (1966), and The Love Bug (1968). Disney's last child actor of the 1960s was Kurt Russell, who had signed a ten-year contract; he featured in films such as The Computer Wore Tennis Shoes (1969), The Horse in the Gray Flannel Suit (1968) alongside Dean Jones, The Barefoot Executive (1971), and The Strongest Man in the World (1975). In late 1959, Walt had an idea to build another park in Palm Beach, Florida, called the City of Tomorrow, a city that would be full of technological improvements. In 1964, the company chose land southwest of Orlando, Florida to build the park and acquired 27,000 acres (10,927 ha). On November 15, 1965, Walt, along with Roy and Florida's governor Haydon Burns, announced plans for a park called Disney World, which included Magic Kingdom—a larger version of Disneyland—and the City of Tomorrow at the park's center. By 1967, the company had expanded Disneyland, adding more rides in 1966 and 1967 at a cost of $20 million.
The new rides included Walt Disney's Enchanted Tiki Room, the first attraction to use Audio-Animatronics; Walt Disney's Carousel of Progress, which debuted at the 1964 New York World's Fair before moving to Disneyland in 1967; and Dumbo the Flying Elephant. On November 20, 1964, Walt sold most of WED Enterprises to Walt Disney Productions for $3.8 million after being persuaded by Roy, who thought Walt having his own company would cause legal problems. When Disney started looking for a sponsor for the Florida project, Walt renamed the City of Tomorrow the Experimental Prototype Community of Tomorrow (EPCOT). Walt, who had been a heavy smoker since World War I, fell gravely ill and died of lung cancer on December 15, 1966, aged 65, at St. Joseph Hospital, across the street from the studio. In 1967, the last two films Walt had worked on were released: the animated film The Jungle Book, which remained Disney's most successful film for the next two decades, and the live-action musical The Happiest Millionaire.

After Walt's death, the company largely abandoned animation but made several live-action films. Its animation staff declined from 500 to 125 employees, with the company hiring only 21 people from 1970 to 1977. Disney's first post-Walt animated film, The Aristocats, was released in 1970; according to Dave Kehr of the Chicago Tribune, "the absence of his [Walt's] hand is evident". The following year, the anti-fascist musical Bedknobs and Broomsticks was released and won the Oscar for Best Special Visual Effects. At the time of Walt's death, Roy had been ready to retire but wanted to keep Walt's legacy alive; he became the first CEO and chairman of the company. In May 1967, Roy had legislation passed by Florida's legislature to grant Disney World its own quasi-government agency in an area called the Reedy Creek Improvement District. Roy changed Disney World's name to Walt Disney World to memorialize Walt. EPCOT drifted from the City of Tomorrow concept; it was shelved and eventually transformed into another theme park. After 18 months of construction at a cost of around $400 million, Walt Disney World's first park, the Magic Kingdom, along with Disney's Contemporary Resort and Disney's Polynesian Resort, opened on October 1, 1971, with 10,400 visitors. A parade with over 1,000 band members, 4,000 Disney entertainers, and a choir from the US Army marched down Main Street. The park's icon was Cinderella Castle. On Thanksgiving Day, cars traveling to the Magic Kingdom caused traffic jams along interstate roads. On December 21, 1971, Roy died of a cerebral hemorrhage at St. Joseph Hospital. Donn Tatum, a senior executive and former president of Disney, became the first CEO and chairman from outside the Disney family. Card Walker, who had been with the company since 1938, became its president. By June 30, 1973, Disney had over 23,000 employees and a gross revenue of $257,751,000 over a nine-month period, compared with $220,026,000 the year before. In November, Disney released the animated film Robin Hood (1973), which became Disney's biggest international-grossing movie at $18 million. Throughout the 1970s, Disney released live-action films such as The Computer Wore Tennis Shoes' sequel Now You See Him, Now You Don't; The Love Bug sequels Herbie Rides Again (1974) and Herbie Goes to Monte Carlo (1977); Escape to Witch Mountain (1975); and Freaky Friday (1976). In 1976, Card Walker became CEO of the company, with Tatum remaining chairman until 1980, when Walker replaced him.
In 1977, Roy E. Disney, Roy O. Disney's son and the only Disney working for the company, resigned as an executive because of disagreements with company decisions. The same year, Disney released the successful animated film The Rescuers, which grossed $48 million. The live-action/animated musical Pete's Dragon was released in 1977, grossing $16 million in the US and Canada, a disappointment to the company. In 1979, Disney released The Black Hole, its first PG-rated film and, at $26 million, its most expensive to that point, demonstrating that Disney could use special effects. It grossed $35 million, a disappointment to the company, which had expected a hit like Star Wars (1977). The Black Hole was a response to other science-fiction films of the era. In September, 12 animators, over 15% of the department, resigned. Led by Don Bluth, they left because of conflicts with the training program and the atmosphere, and started their own company, Don Bluth Productions. In 1981, Disney released Dumbo on VHS, followed by Alice in Wonderland the following year, leading Disney to eventually release all its films on home media. On July 24, Walt Disney's World on Ice, a two-year tour of ice shows featuring Disney characters, premiered at the Brendan Byrne Meadowlands Arena after Disney licensed its characters to Feld Entertainment. The same month, Disney's animated film The Fox and the Hound was released and became the highest-grossing animated film to that point at $40 million. It was the first film that did not involve Walt and the last major work done by Disney's Nine Old Men, who were replaced with younger animators.

As profits started to decline, on October 1, 1982, Epcot, then known as EPCOT Center, opened as the second theme park at Walt Disney World, with around 10,000 people in attendance during the opening. The park cost over $900 million to construct and consisted of Future World and World Showcase, the latter representing Mexico, China, Germany, Italy, the United States, Japan, France, the UK, and Canada; Morocco and Norway were added in 1984 and 1988, respectively. The company's animation business continued to decline, and 69% of the company's profits came from its theme parks; in 1982, there were 12 million visitors to Walt Disney World, a figure that declined by 5% the following June. On July 9, 1982, Disney released Tron, one of the first films to extensively use computer-generated imagery (CGI). It was a big influence on later CGI movies, though it received mixed reviews. In 1982, the company lost $27 million. On April 15, 1983, Disney's first park outside the US, Tokyo Disneyland, opened in Urayasu. Construction, which cost around $1.4 billion, had started in 1979 after Disney and the Oriental Land Company agreed to build a park together. Within its first ten years, the park had over 140 million visitors. After an investment of $100 million, on April 18, Disney started a pay-to-watch cable television channel called Disney Channel, a 16-hours-a-day service showing Disney films, twelve programs, and two magazine shows for adults. Although it was expected to do well, the company lost $48 million after its first year, with around 916,000 subscribers. In 1983, Walt's son-in-law Ron W. Miller, who had been president since 1978, became CEO, and Raymond Watson became chairman. Miller wanted the studio to produce more content for mature audiences, and in 1984 Disney founded the film distribution label Touchstone Pictures to produce films geared toward adults and teenagers.
Splash (1984) was the first film released under the label and a much-needed success, grossing over $6 million in its first week. Disney's first R-rated film, Down and Out in Beverly Hills (1986), was another hit, grossing $62 million. The following year, Disney's first PG-13-rated film, Adventures in Babysitting, was released. In 1984, Saul Steinberg attempted a buyout of the company, holding 11% of its stock. He offered to buy 49% for $1.3 billion or the entire company for $2.75 billion. Disney, which had less than $10 million, rejected Steinberg's offer and offered to buy all of his stock for $326 million. Steinberg agreed, and Disney paid it all with part of a $1.3 billion bank loan, putting the company $866 million in debt. In 1984, shareholders Roy E. Disney, Sid Bass, Lillian and Diane Disney, and Irwin L. Jacobs, who together owned about 36% of the shares, forced out CEO Miller, replaced him with Michael Eisner, a former president of Paramount Pictures, and appointed Frank Wells as president. Eisner's first act was to make Disney a major film studio, which at the time it was not considered to be. Eisner appointed Jeffrey Katzenberg as studio chairman and Roy E. Disney as head of animation. Eisner wanted to produce an animated film every 18 months rather than every four years, as the company had been doing. To help the film division, the company started making Saturday-morning cartoons to create new Disney characters for merchandising and produced films through Touchstone. Under Eisner, Disney became more involved with television, creating Touchstone Television and producing the television sitcom The Golden Girls, which was a hit. The company spent $15 million promoting its theme parks, raising visitor numbers by 10%. In 1985, Disney released The Black Cauldron, then the most expensive animated movie at $40 million, its first animated film to feature computer-generated imagery, and its first PG-rated animated film because of its adult themes. The film was a box-office failure, leading the company to move the animation department from the studio in Burbank to a warehouse in Glendale, California. The film-financing partnership Silver Screen Partners II, organized in 1985, financed films for Disney with $193 million. In January 1987, Silver Screen Partners III began financing movies for Disney with $300 million raised by E.F. Hutton, the largest amount raised for a film-financing limited partnership to that point. Silver Screen IV was also set up to finance Disney's studios. In 1986, the company changed its name from Walt Disney Productions to The Walt Disney Company, stating that the old name referred only to its film business. With Disney's animation business declining, the animation department needed its next movie, The Great Mouse Detective (1986), to succeed; it grossed $25 million at the box office, a much-needed financial success. To generate more revenue from merchandising, the company opened its first retail store, the Disney Store, in Glendale in 1987. Because of its success, the company opened two more in California, and by 1990 it had 215 throughout the United States. In 1989, the stores garnered $411 million in revenue and made a profit of $187 million. In 1987, the company signed an agreement with the Government of France to build a resort named Euro Disneyland near Paris; it would consist of two theme parks, Disneyland Park and Walt Disney Studios Park, a golf course, and six hotels.
In 1988, Disney's 27th animated film, Oliver & Company, was released the same day as former Disney animator Don Bluth's The Land Before Time. Oliver & Company out-competed The Land Before Time, becoming the first animated film to gross over $100 million in its initial release and the highest-grossing animated film in its initial run to that point. Disney became the box-office-leading Hollywood studio for the first time, with films such as Who Framed Roger Rabbit (1988), Three Men and a Baby (1987), and Good Morning, Vietnam (1987). The company's gross revenue went from $165 million in 1983 to $876 million in 1987, and operating income went from −$33 million in 1983 to +$130 million in 1987. The studio's net income rose by 66%, along with a 26% growth in revenue. The Los Angeles Times called Disney's recovery "a real rarity in the corporate world". On May 1, 1989, Disney opened Disney-MGM Studios, its third amusement park at Walt Disney World, which later became Disney's Hollywood Studios. The new park demonstrated the movie-making process to visitors until 2008, when it was redesigned to make guests feel they were inside movies. Following the opening of Disney-MGM Studios, Disney opened the water park Typhoon Lagoon in June 1989; in 2022 it had 1.9 million visitors and was the most popular water park in the world. Also in 1989, Disney signed an agreement in principle to acquire The Jim Henson Company from its founder. The deal included Henson's programming library and Muppet characters (excluding the Muppets created for Sesame Street), as well as Henson's personal creative services. Henson, however, died in May 1990 before the deal was completed, and the companies terminated merger negotiations.

On November 17, 1989, Disney released The Little Mermaid, the start of the Disney Renaissance, a period in which the company released hugely successful and critically acclaimed animated films. The Little Mermaid became the animated film with the highest gross from its initial run, garnering $233 million at the box office, and won two Academy Awards: Best Original Score and Best Original Song for "Under the Sea". During the Disney Renaissance, composer Alan Menken and lyricist Howard Ashman wrote several Disney songs, until Ashman died in 1991. Together they wrote six songs nominated for Academy Awards, two of which won: "Under the Sea" and "Beauty and the Beast". To produce music geared for the mainstream, including music for movie soundtracks, Disney founded the recording label Hollywood Records on January 1, 1990. In September 1990, Disney arranged for financing of up to $200 million by a unit of Nomura Securities for Interscope films made for Disney. On October 23, Disney formed Touchwood Pacific Partners, which replaced the Silver Screen Partnership series as the company's movie studios' primary source of funding. Disney's first animated sequel, The Rescuers Down Under, was released on November 16, 1990. It was created using the Computer Animation Production System (CAPS), digital software developed by Disney and Pixar, which had begun as the computer division of Lucasfilm, and it became the first feature film to be entirely created digitally. Although the film struggled at the box office, grossing $47 million, it received positive reviews. In 1991, Disney and Pixar agreed to a deal to make three films together, the first one being Toy Story.
Dow Jones & Company, wanting to replace three companies in its industrial average, chose to add Disney in May 1991, stating Disney "reflects the importance of entertainment and leisure activities in the economy". Disney's next animated film, Beauty and the Beast, was released on November 13, 1991, and grossed nearly $430 million. It was the first animated film to win a Golden Globe for Best Picture, and it received six Academy Award nominations, becoming the first animated film nominated for Best Picture; it won Best Original Score and Best Original Song. The film was critically acclaimed, with some critics considering it the best Disney film. To coincide with the 1992 release of The Mighty Ducks, Disney founded the National Hockey League team the Mighty Ducks of Anaheim. Disney's next animated feature, Aladdin, was released on November 11, 1992, and grossed $504 million, becoming the highest-grossing animated film to that point and the first animated film to gross a half-billion dollars. It won two Academy Awards, Best Song for "A Whole New World" and Best Score, and "A Whole New World" became the first and only Disney song to win the Grammy for Song of the Year. For $60 million, Disney broadened its range of mature-audience films by acquiring the independent film distributor Miramax Films in 1993. The same year, in a venture with The Nature Conservancy, Disney purchased 8,500 acres (3,439 ha) of Everglades headwaters in Florida to protect native animal and plant species, establishing the Disney Wilderness Preserve.

On April 3, 1994, Frank Wells died in a helicopter crash; he, Eisner, and Katzenberg had helped the company's market value grow from $2 billion to $22 billion since taking office in 1984. On June 15 the same year, The Lion King was released and was a massive success, becoming the second-highest-grossing film of all time behind Jurassic Park and the highest-grossing animated film of all time, with a gross total of $969 million. It was critically praised and garnered two Academy Awards, Best Score and Best Song for "Can You Feel the Love Tonight". Soon after its release, Katzenberg left the company after Eisner refused to promote him to president. After leaving, he co-founded the film studio DreamWorks SKG. Wells was eventually replaced by Michael Ovitz, a friend of Eisner's, on August 13, 1995. In 1994, Disney wanted to buy one of the major US television networks, ABC, NBC, or CBS, which would give the company guaranteed distribution for its programming. Eisner planned to buy NBC, but the deal was canceled because General Electric wanted to keep a majority stake. In 1994, Disney's annual revenue reached $10 billion, 48% coming from film, 34% from theme parks, and 18% from merchandising. Disney's total net income was up 25% from the previous year at $1.1 billion. Grossing over $346 million, Pocahontas was released on June 16, 1995, garnering the Academy Awards for Best Musical or Comedy Score and Best Song for "Colors of the Wind". Pixar's and Disney's first co-release, Toy Story, the first fully computer-generated feature film, was released on November 19, 1995, to critical acclaim and a final gross of $361 million. The film won a Special Achievement Academy Award and was the first animated film to be nominated for Best Original Screenplay. In 1995, Disney announced the $19 billion acquisition of the television network Capital Cities/ABC Inc., then the second-largest corporate takeover in US history.
Through the deal, Disney obtained the broadcast network ABC, an 80% majority stake in the sports networks ESPN and ESPN 2, 50% of Lifetime Television, a majority stake in DIC Entertainment, and a 38% minority stake in A&E Television Networks. Following the deal, the company started Radio Disney, a youth-focused radio network on the ABC Radio Network, on November 18, 1996. The Walt Disney Company launched its official website, disney.com, on February 22, 1996, mainly to promote its theme parks and merchandise. On June 19, the company's next animated film, The Hunchback of Notre Dame, was released, grossing $325 million at the box office. Because Ovitz's management style clashed with Eisner's, Ovitz was fired as the company's president in 1996. Disney lost a $10.4 million lawsuit in September 1997 to Marsu B.V. over Disney's failure to produce, as contracted, 13 half-hour Marsupilami cartoon shows; Disney had instead felt other internal "hot properties" deserved the company's attention. Disney, which since 1996 had owned a 25% stake in the Major League Baseball team the California Angels, bought out the team in 1998 for $110 million, renamed it the Anaheim Angels, and renovated the stadium for $100 million. Hercules (1997) was released on June 13 and underperformed compared with earlier films, grossing $252 million. On February 24, Disney and Pixar signed a ten-year contract to make five films, with Disney as distributor. They would share costs, profits, and logo credits, calling the films Disney-Pixar productions. During the Disney Renaissance, the film division Touchstone also saw success with films such as Pretty Woman (1990), which has the highest number of ticket sales in the US for a romantic comedy and grossed $432 million; Sister Act (1992), one of the most financially successful comedies of the early 1990s, grossing $231 million; the action film Con Air (1997), which grossed $224 million; and Armageddon, the highest-grossing film of 1998 at $553 million.

At Disney World, the company opened Disney's Animal Kingdom, the largest theme park in the world at 580 acres (230 ha), on Earth Day, April 22, 1998. It had six animal-themed lands, over 2,000 animals, and the Tree of Life at its center. Disney's next animated film, Mulan, and the Disney-Pixar film A Bug's Life were released on June 5 and November 20, 1998, respectively, both receiving positive reviews. Mulan became the year's sixth-highest-grossing film at $304 million, and A Bug's Life was the year's fifth-highest at $363 million. On June 18, in a transaction valued at $770 million, Disney bought a 43% stake in the Internet search engine Infoseek, paying $70 million and also gaining the Infoseek-acquired Starwave. After starting the web portal Go.com in a joint venture with Infoseek in January 1999, Disney acquired the rest of Infoseek later that year. After unsuccessful negotiations with the cruise lines Carnival and Royal Caribbean International, Disney announced in 1994 that it would start its own cruise-line operation in 1998. The first two ships of the Disney Cruise Line, named Disney Magic and Disney Wonder, were built by Fincantieri in Italy. To accompany the cruises, Disney bought Gorda Cay as the line's private island, spending $25 million remodeling it and renaming it Castaway Cay. On July 30, 1998, Disney Magic set sail on the line's first voyage. Marking the end of the Disney Renaissance, Tarzan (1999) was released on June 12, garnering $448 million at the box office and critical acclaim; it claimed the Academy Award for Best Original Song for Phil Collins' "You'll Be in My Heart".
The Disney-Pixar film Toy Story 2 was released on November 13, garnering praise and $511 million at the box office. To replace Ovitz, Eisner named ABC network chief Bob Iger as Disney's president and chief operating officer in January 2000. In November, Disney sold DIC Entertainment back to Andy Heyward. Disney had another huge success with Pixar with the release of Monsters, Inc. in 2001. The same year, Disney bought the children's cable network Fox Family Worldwide for $3 billion plus the assumption of $2.3 billion in debt. The deal included a 76% stake in Fox Kids Europe, the Latin American channel Fox Kids, more than 6,500 episodes from Saban Entertainment's programming library, and Fox Family Channel. It also bought the rights to Winnie the Pooh. In 2001, Disney's operations had a net loss of $158 million after a decline in viewership of the ABC television network, as well as decreased tourism due to the September 11 attacks. Disney's earnings in fiscal 2001 were $120 million, compared with the previous year's $920 million. To help reduce costs, Disney announced it would lay off 4,000 employees and close 300–400 Disney stores. After the team won the World Series in 2002, Disney sold the Anaheim Angels for $180 million in 2003. In 2003, Disney became the first studio to garner $3 billion in a year at the box office. The same year, Roy E. Disney announced his retirement over how the company was being run, calling on Eisner to retire; the same week, board member Stanley Gold retired for the same reasons. Gold and Disney formed the "Save Disney" campaign. In 2004, at the company's annual meeting, 43% of shareholders voted against Eisner, and the board removed him as chairman. On March 4, board member George J. Mitchell was named as his replacement. In April, Disney purchased the Muppets franchise from the Jim Henson Company for $75 million, founding Muppets Holding Company, LLC. Following the success of the Disney-Pixar films Finding Nemo (2003), which became the second-highest-grossing animated film of all time at $936 million, and The Incredibles (2004), Pixar looked for a new distributor once its deal with Disney ended in 2004. Disney sold the loss-making Disney Stores chain of 313 stores to Children's Place on October 20. Disney also sold the NHL team the Mighty Ducks in 2005. Roy E. Disney rejoined the company as a consultant with the title "Director Emeritus".

In March 2005, Bob Iger, the company's president, was named to succeed Eisner as CEO; Eisner retired in September, and Iger officially became head of the company on October 1. Disney's eleventh theme park, Hong Kong Disneyland, opened on September 12, costing the company $3.5 billion to construct. On January 24, 2006, Disney announced it would acquire Pixar from Steve Jobs for $7.4 billion, and Iger appointed Pixar chief creative officer (CCO) John Lasseter and president Edwin Catmull as the heads of Walt Disney Animation Studios. A week later, Disney traded ABC Sports commentator Al Michaels to NBCUniversal in exchange for the rights to Oswald the Lucky Rabbit and 26 cartoons featuring the character. On February 6, the company announced it would merge its ABC Radio networks and 22 stations with Citadel Broadcasting in a $2.7 billion deal, through which Disney acquired 52% of the television broadcasting company Citadel Communications. The Disney Channel movie High School Musical aired, and its soundtrack was certified triple platinum, a first for a Disney Channel film.
Disney's 2006 live-action film Pirates of the Caribbean: Dead Man's Chest was Disney's biggest hit to that date and the third-highest-grossing film ever, making $1 billion at the box office. On June 28, the company announced it was replacing George Mitchell as chairman with board member and former Procter & Gamble CEO John E. Pepper Jr. The sequel High School Musical 2 was released in 2007 on Disney Channel and broke several cable ratings records. In April 2007, the Muppets Holding Company was moved from Disney Consumer Products to the Walt Disney Studios division and renamed the Muppets Studio to relaunch the division. Pirates of the Caribbean: At World's End became the highest-grossing film of 2007 at $960 million. The Disney-Pixar films Ratatouille (2007) and WALL-E (2008) were tremendous successes, with WALL-E winning the Oscar for Best Animated Feature. After acquiring most of Jetix Europe through the acquisition of Fox Family Worldwide, Disney bought the remainder of the company in 2008 for $318 million. Iger introduced D23 in 2009 as Disney's official fan club. In February, Disney announced a deal with DreamWorks Pictures to distribute 30 of its films over the next five years through Touchstone Pictures, with Disney getting 10% of the gross. The 2009 film Up garnered Disney $735 million at the box office and won Best Animated Feature at the Academy Awards. Later that year, Disney launched a television channel named Disney XD, aimed at older children. Disney bought Marvel Entertainment and its assets for $4 billion in August, adding Marvel's comic-book characters to its merchandising line-up. In September, Disney partnered with News Corporation and NBCUniversal in a deal through which Disney obtained a 27% equity stake in the streaming service Hulu, and Disney added ABC Family and Disney Channel content to the service. On December 16, Roy E. Disney died of stomach cancer; he was the last member of the Disney family to work for Disney.

In March 2010, Haim Saban reacquired the Power Rangers franchise from Disney, including its 700-episode library, for around $100 million. Shortly after, Disney sold Miramax Films to an investment group headed by Ronald Tutor for $660 million. During that time, Disney released the live-action Alice in Wonderland and the Disney-Pixar film Toy Story 3, both of which grossed a little over $1 billion, making them the sixth and seventh films to do so; Toy Story 3 became the first animated film to make over $1 billion and the highest-grossing animated film. That year, Disney became the first studio to release two billion-dollar films in one calendar year. In 2010, the company announced that ImageMovers Digital, which it had started in partnership with ImageMovers in 2007, would close by 2011. The following year, Disney released its last traditionally animated film, Winnie the Pooh, to theaters. Pirates of the Caribbean: On Stranger Tides garnered a little over $1 billion, making it the eighth film to do so, Disney's highest-grossing film internationally, and the third-highest ever. In January 2011, Disney Interactive Studios was downsized and 200 employees were laid off. In April, Disney began constructing its new theme park, Shanghai Disney Resort, at a cost of $4.4 billion. In August, Iger stated that after the success of the Pixar and Marvel purchases, he and the Walt Disney Company were planning to "buy either new characters or businesses that are capable of creating great characters and great stories".
On October 30, 2012, Disney announced it would buy Lucasfilm from George Lucas for $4.05 billion. Through the deal, Disney gained access to franchises such as Star Wars, for which Disney said it would make a new film every two to three years, with the first to be released in 2015. The deal also gave Disney the Indiana Jones franchise, the visual-effects studio Industrial Light & Magic, and the video game developer LucasArts. In February 2012, Disney had completed its acquisition of UTV Software Communications, expanding its market into India and the rest of Asia. By March, Iger had become Disney's chairman. The Marvel film The Avengers became the third-highest-grossing film of all time with an initial-release gross of $1.3 billion. The Marvel film Iron Man 3, released in 2013, made over $1.2 billion at the box office. The same year, Disney's animated film Frozen was released and became the highest-grossing animated film of all time at $1.2 billion. Merchandising for the film became so popular that it made the company $1 billion within a year, and a global shortage of the film's merchandise occurred. In March 2013, Iger announced Disney had no 2D animated films in development; a month later, the hand-drawn animation division was closed and several veteran animators were laid off. On March 24, 2014, Disney acquired Maker Studios, an active multi-channel network on YouTube, for $950 million. In June 2015, the company stated its consumer products and interactive divisions would merge into a new subsidiary called Disney Consumer Products and Interactive Media. In August, Marvel Studios was placed under the Walt Disney Studios division. The company's 2015 releases included the successful animated film Inside Out, which grossed over $800 million, and the Marvel film Avengers: Age of Ultron, which grossed over $1.4 billion. Star Wars: The Force Awakens was released and grossed over $2 billion, making it the third-highest-grossing film of all time.

On April 4, 2016, Disney announced that COO Thomas O. Staggs, who was thought to be next in line after Iger, would leave in May, ending his 26-year career with Disney. Shanghai Disneyland opened on June 16, 2016, as the company's sixth theme-park resort. In a move toward starting a streaming service, Disney bought 33% of the stock of the Major League Baseball technology company BAMTech for $1 billion in August. In 2016, four Disney film releases made over $1 billion: the animated film Zootopia, the Marvel film Captain America: Civil War, the Pixar film Finding Dory, and Rogue One: A Star Wars Story, making Disney the first studio to surpass $3 billion at the domestic box office. Disney made an attempt to buy the social media platform Twitter to market its content and merchandise but canceled the deal; Iger stated this was because he thought Disney would be taking on responsibilities it did not need and that the deal did not "feel Disney" to him. On March 23, 2017, Disney announced Iger had agreed to a one-year extension as CEO to July 2019, and to remain as a consultant for three years afterward. On August 8, 2017, Disney announced it would end its distribution deal with Netflix, with the intent of launching its own streaming platform by 2019. During that time, Disney paid $1.5 billion to raise its stake in BAMTech to 75%. Disney planned to start an ESPN streaming service with about "10,000 live regional, national, and international games and events a year" by 2018.
In November, CCO John Lasseter said he would take a six-month leave of absence because of "missteps", reported to be sexual misconduct allegations. The same month, Disney and 21st Century Fox began negotiating a deal in which Disney would acquire most of Fox's assets. Beginning in March 2018, a reorganization of the company led to the creation of the business segments Disney Parks, Experiences and Products and Direct-to-Consumer & International. Parks, Experiences and Products was primarily a merger of Parks & Resorts and Consumer Products & Interactive Media, while Direct-to-Consumer & International took over Disney International and the global sales, distribution, and streaming units from Disney-ABC TV Group and Studios Entertainment, plus Disney Digital Network. Iger described it as "strategically positioning our businesses", while according to The New York Times, the reorganization was done in expectation of the 21st Century Fox purchase. In 2017, two of Disney's films had revenues of over $1 billion: the live-action Beauty and the Beast and Star Wars: The Last Jedi. Disney launched the subscription sports streaming service ESPN+ on April 12. In June 2018, Lasseter's departure by the end of the year was announced; he would stay as a consultant until then. To replace him, Disney promoted Jennifer Lee, co-director of Frozen and co-writer of Wreck-It Ralph (2012), as head of Walt Disney Animation Studios, and Pete Docter, who had been with Pixar since 1990 and directed Up, Monsters, Inc., and Inside Out, as head of Pixar. Comcast offered to buy 21st Century Fox for $65 billion, topping Disney's $51 billion bid, but withdrew its offer after Disney countered with a $71 billion bid. Disney obtained antitrust approval from the United States Department of Justice to acquire Fox. Disney again made $7 billion at the box office with three films that made over $1 billion: the Marvel films Black Panther and Avengers: Infinity War, the latter taking in over $2 billion and becoming the fifth-highest-grossing film ever, and the Pixar film Incredibles 2.

On March 20, 2019, Disney acquired 21st Century Fox's assets from Rupert Murdoch for $71 billion, the biggest acquisition in Disney's history. After the purchase, The New York Times described Disney as "an entertainment colossus the size of which the world has never seen". Through the acquisition, Disney gained 20th Century Fox; 20th Century Fox Television; Fox Searchlight Pictures; National Geographic Partners; Fox Networks Group; the Indian television broadcaster Star India; the streaming service Hotstar; and a 30% stake in Hulu, bringing its ownership of Hulu to 60%. Fox Corporation and its assets were excluded from the deal because of antitrust laws. In 2019, Disney became the first film studio to have seven films gross $1 billion: Marvel's Captain Marvel, the live-action Aladdin, Pixar's Toy Story 4, the CGI remake of The Lion King, Frozen II, Star Wars: The Rise of Skywalker, and Avengers: Endgame, the highest-grossing film of all time up to that point at $2.8 billion. On November 12, Disney launched the Disney+ streaming service in the US, Canada, and the Netherlands; at launch the service had 500 movies and 7,500 episodes of television shows from Disney, Pixar, Marvel, Star Wars, National Geographic, and other brands. Within the first day, the streaming platform had over 10 million subscriptions; by 2022 it had over 135 million and was available in over 190 countries. At the beginning of 2020, Disney removed the Fox name from its assets, rebranding them as 20th Century Studios and Searchlight Pictures.
Bob Chapek, who had been with the company for 18 years and was chairman of Disney Parks, Experiences and Products, became CEO after Iger stepped down on February 25, 2020. Iger said he would stay on as executive chairman until December 31, 2021, to help with the company's creative strategy. In April, Iger resumed operational duties as executive chairman to help the company during the COVID-19 pandemic, and Chapek was appointed to the board of directors. During the pandemic, Disney temporarily closed all its theme parks, delayed the release of several movies, and stopped all cruises. Due to the closures, Disney announced it would furlough 100,000 employees while still providing healthcare benefits, and urged US employees to apply for government benefits, saving the company $500 million a month. Iger gave up his $47 million salary, and Chapek took a 50% salary reduction. In the company's second fiscal quarter of 2020, Disney reported a $1.4 billion hit to profits, with earnings falling 91% to $475 million from the previous year's $5.4 billion. By August, two-thirds of the company was owned by large financial institutions. In September, the company dismissed 28,000 employees, 67% of whom were part-time, from its Parks, Experiences and Products division. The division's chairman, Josh D'Amaro, wrote: "We initially hoped that this situation would be short-lived, and that we would recover quickly and return to normal. Seven months later, we find that has not been the case." Disney lost $4.7 billion in its fiscal third quarter of 2020. In November, Disney laid off another 4,000 employees, raising the total to 32,000. The following month, Disney named Alan Bergman chairman of its Disney Studios Content division to oversee its film studios. Due to the COVID-19 recession, Touchstone Television ceased operations in December; Disney announced in March 2021 that it would launch a new division, 20th Television Animation, to focus on mature audiences; and Disney closed Blue Sky Studios in April 2021. Later that month, Disney and Sony agreed to a multi-year licensing deal giving Disney access to Sony's films from 2022 to 2026 to televise or stream on Disney+ once Sony's deal with Netflix ended. Although it performed poorly at the box office because of COVID-19, Disney's animated film Encanto (2021) was one of the biggest hits of the pandemic, with its song "We Don't Talk About Bruno" topping the US Billboard Hot 100 chart.

Iger left the company when his term as executive chairman ended on December 31, 2021. The company named Susan Arnold, an operating executive at the Carlyle Group and a longtime Disney board member, as its first female chairperson. On March 10, 2022, Disney ceased operations in Russia because of Russia's invasion of Ukraine; it was the first major Hollywood studio to halt the release of a major picture due to the invasion, and other movie studios followed. In March 2022, around 60 employees protested the company's silence on the Florida Parental Rights in Education Act, dubbed the "Don't Say Gay" bill, which prohibits non-age-appropriate classroom instruction on sexual orientation and gender identity in Florida's public-school districts. The protest was dubbed the "Disney Do Better Walkout"; employees protested near a Disney Studios lot, and other employees voiced their concerns through social media.
Employees called on Disney to stop campaign contributions to Florida politicians who supported the bill, to help protect employees from it, and to stop construction at Walt Disney World in Florida. Chapek responded by stating the company had made a mistake by staying silent, saying: "We pledge our ongoing support of the LGBTQ+ community". Amid Disney's response to the bill, the Florida Legislature passed a bill to dissolve Disney's quasi-governmental Reedy Creek district. On June 28, Disney's board members unanimously agreed to give Chapek a three-year contract extension. In August, Disney Streaming exceeded Netflix in total subscriptions, with 221 million subscribers compared with Netflix's 220 million. On November 20, 2022, Iger accepted the position of Disney's CEO after Chapek was dismissed following poor earnings performance and decisions unpopular with other executives. The board announced Iger would serve for two years with a mandate to develop a strategy for renewed growth and to help identify a successor. In November 2022, a group of YouTube TV subscribers in four states filed a class-action antitrust lawsuit against Disney, alleging that Disney's control of both ESPN and Hulu allowed the company to "inflate prices marketwise by raising the prices of its own products" and, by requiring streaming services including YouTube TV and Sling TV to include ESPN in base packages, forced subscribers to pay more for subscriptions than they would in a competitive market.

In January 2023, Disney announced that Mark Parker would replace Arnold as the company's chairperson. In February 2023, Disney announced it would cut $5.5 billion in costs, including eliminating 7,000 jobs, about 3% of its workforce. Disney reorganized into three divisions: Entertainment, ESPN, and Parks, Experiences and Products. In April 2023, Disney implemented the second and largest wave of job cuts, affecting Disney Entertainment, ESPN, and the Parks, Experiences and Products division, as part of the $5.5 billion cost-cutting plan. In 2023, Disney began its "100 Years of Wonder" campaign in celebration of the centennial of the company's founding. This included a new animated centennial logo intro for the Walt Disney Pictures division, a touring exhibition, events at the parks, and a commemorative commercial that aired during Super Bowl LVII. In October 2023, Disney announced its entry into sports betting through a partnership with Penn Entertainment, launching the ESPN Bet app despite internal debates and concerns over brand image. The move marked a significant pivot from Iger's earlier stance against gambling, driven by the potential to attract younger audiences and to secure a financial future for ESPN amid declining traditional television viewership and increasing online sports gambling revenue. In November 2023, Disney shortened the name Disney Parks, Experiences and Products to Disney Experiences. In February 2024, Debra O'Connell, a longtime Disney executive, was appointed president of a new news division that includes ABC News and local stations. O'Connell is responsible for ABC News's signature properties, including Good Morning America and World News Tonight, and serves as an intermediary between Dana Walden, co-chair of Disney Entertainment, and Kim Godwin, the ABC News president. The same month, Disney and Reliance Industries announced the merger of their Indian TV and streaming media assets.
In July 2024, Ryan Mitchell Kramer, a Californian man, hacked and leaked over a terabyte of the company's Slack messages while pretending to be part of a hacktivist group called "Nullbulge". Kramer gained access to the company's accounts by using a Trojan to steal an employee's work and personal login credentials. Kramer claimed the motive for the breach was the group's dislike of art generated by artificial intelligence, though it was later discovered that Kramer had tried to extort the employee. Members of Generation Z were absent from the D23 fan event held in August 2024 in Anaheim, which was dominated by millennials representing all 50 US states and 36 countries. Disney chief brand officer Asad Ayaz pushed back against the idea that this was a symptom of a broader trend: "Our fandoms and our fans and different generations show up in different ways". Theme park experts noted that the true test of the enduring power of the Disney brand will be whether Generation Z takes Generation Alpha to Disney theme parks. In October 2024, Disney announced James P. Gorman would replace Mark Parker as chairman in January 2025. It also announced that a successor to CEO Bob Iger would be named in early 2026. On May 7, 2025, Disney announced its seventh resort, Disneyland Abu Dhabi, planned for Yas Island. Similar to Tokyo Disney Resort, it will not be owned or managed by Disney but by Miral Group.

In September 2025, the American Broadcasting Company (ABC) indefinitely suspended production of the late-night talk show Jimmy Kimmel Live!, sparking widespread backlash from political leaders and commentators, entertainers, entertainment industry unions, constitutional scholars, and the public, as well as boycotts against ABC and The Walt Disney Company. After a suspension lasting from September 17, Disney announced on September 22 that Kimmel's suspension was lifted, and the show resumed broadcasting on September 23. Prior to the broadcast, President Donald Trump wrote on his Truth Social platform: "I think we're going to test ABC out on this. Let's see how we do." Lawyer Roberta Kaplan, representing the shareholders American Federation of Teachers and Reporters Without Borders, stated that major media companies "should not succumb to unconstitutional threats or blackmail" and that there was a "credible basis to suspect" the board had breached its fiduciary duty by prioritizing "improper political and affiliate considerations." On December 11, 2025, Disney agreed to acquire an equity stake in OpenAI for $1 billion. Simultaneously, the two companies signed a three-year agreement to allow users to generate videos using OpenAI's Sora video platform and licensed Disney characters. On February 3, 2026, the company finalized a $1.5 billion equity investment in Epic Games to develop a persistent universe operating within Fortnite and the Unreal Engine. The partnership integrates Disney's intellectual property, including Marvel, Star Wars, and Pixar, into a digital ecosystem connected to Fortnite where users can engage with content and purchase digital goods. Led by Josh D'Amaro, the project serves as the company's primary expansion into the gaming sector, using interactive environments and virtual simulations to reach a broader demographic of younger consumers.
On February 3, 2026, The Walt Disney Company's board of directors announced the unanimous election of Josh D'Amaro as the company's next chief executive officer (CEO), succeeding Bob Iger effective March 18, 2026. D'Amaro, a 28-year veteran of the company and former chairman of Disney Experiences, will also join the board of directors following the company's annual meeting. Simultaneously, Dana Walden was appointed to the newly created role of president and chief creative officer (CCO), reporting directly to D'Amaro. Iger is set to remain with the company as a senior advisor and board member until his official retirement on December 31, 2026.

Company units
The Walt Disney Company operates three primary business segments: Entertainment, ESPN, and Disney Experiences.

Leadership

Awards and nominations
As of 2022, the Walt Disney Company has won 135 Academy Awards, 32 of which were awarded to Walt. The company has won 16 Academy Awards for Best Animated Short Film, 16 for Best Original Song, 15 for Best Animated Feature, 11 for Best Original Score, five for Best Documentary Feature, and five for Best Visual Effects, as well as several other competitive and special awards. Disney has also won 29 Golden Globe Awards, 51 British Academy of Film and Television Arts (BAFTA) awards, and 36 Grammy Awards as of 2022.[d]

Legacy
The Walt Disney Company is one of the world's largest entertainment companies and is considered a pioneer of the animation industry, having produced 790 features, 122 of which are animated films. Many of its films are considered among the greatest of all time, including Pinocchio, Toy Story, Bambi, Ratatouille, Snow White and the Seven Dwarfs, and Mary Poppins. Disney has also created some of the most influential and memorable characters of all time, such as Mickey Mouse, Woody, Captain America (MCU), Jack Sparrow, Iron Man (MCU), and Elsa. Disney has been recognized for revolutionizing the animation industry; according to Den of Geek, the risk of making the first animated feature, Snow White and the Seven Dwarfs, "changed cinema". The company, mainly through Walt, has introduced new technologies and more-advanced techniques for animation, as well as adding personalities to characters. Some of Disney's technological innovations for animation include the invention of the multiplane camera, xerography, CAPS, Deep Canvas, and RenderMan. Many songs from the company's films have become extremely popular, and several have peaked at number one on Billboard's Hot 100. Some songs from the Silly Symphony series became immensely popular across the US. Disney was ranked number 48 in the 2023 Fortune 500 list of the largest United States corporations by total revenue and fourth in Fortune's 2022 "World's Most Admired Companies". According to Smithsonian Magazine, there are "few symbols of pure Americana more potent than the Disney theme parks", which are "well-established cultural icons", with the company name and Mickey Mouse being "household names". Disney is one of the biggest competitors in the theme park industry with 12 parks, all of which were among the top 25 most-visited parks in 2018. That year, Disney theme parks worldwide had over 157 million visitors, making Disney the most-visited theme-park company in the world, doubling the attendance of the second-most-visited company. Of the 157 million visitors, the Magic Kingdom had 20.8 million, making it the most-visited theme park in the world. When Disney first entered the theme park industry, CNN stated: "It changed an already legendary company. And it changed the entire theme park industry."
According to The Orange County Register, Walt Disney World has "changed entertainment by showing how a theme park could help make a company into a lifestyle brand".

Criticism and controversies
The Walt Disney Company has been criticized for making purportedly sexist and racist content in the past, for putting LGBT+ elements in its films, and for not having enough LGBT+ representation. There have been controversies over alleged plagiarism, poor pay and working conditions, and poor treatment of animals. Disney has also been criticized for filming in the autonomous region of Xinjiang, where human rights abuses are taking place. Several of Disney's films have been considered racist; one of the company's most controversial films, Song of the South, was criticized for portraying racial stereotypes, and for that reason was never released on home video in the United States or on Disney+. Other characters that have been called racist are Sunflower, a black centaurette who serves a white centaurette in Fantasia; the Siamese cats in Lady and the Tramp, considered exaggerated caricatures of Asians; the stereotyped Native Americans in Peter Pan; and the crows in Dumbo, depicted as African Americans who use jive talk, their leader named Jim Crow, believed to be a reference to racial segregation laws in the United States. To help avoid controversy, Disney added a disclaimer, shown before such films start on Disney+, noting that they contain racist stereotypes.

Disney has also been accused a number of times of plagiarizing existing works in its films. Most notably, The Lion King has many similarities in its characters and events to Kimba the White Lion, an animated series by animator Osamu Tezuka. Atlantis: The Lost Empire also has many similarities to the anime series Nadia: The Secret of Blue Water, considered so prevalent that the latter show's creator, Gainax, planned to sue Disney but was stopped by the series' network, NHK. Kelly Wilson, creator of the short The Snowman (2014), filed two lawsuits against Disney for copyright infringement in Disney's animated film Frozen, the second after the first was rescinded. Disney later settled the lawsuit with Wilson, allowing the company to create a sequel to Frozen. Screenwriter Gary L. Goldman sued Disney over its film Zootopia, claiming he had earlier pitched an identical, same-titled story to the company; a judge dismissed the lawsuit, stating there was not enough evidence to prove plagiarism. Disney itself is very protective of the characters created by Walt Disney, and requires every licensee to put the "© Disney" mark on their products or packaging. On December 10, 2025, Disney sent Google a cease-and-desist letter accusing Google of copyright infringement, claiming that Google's AI models had generated characters resembling those from Disney franchises such as Frozen, Deadpool, and Star Wars.

Disney has been criticized both for putting LGBT+ elements into its films and for having insufficient LGBT+ representation in its media. In the live-action film Beauty and the Beast, director Bill Condon announced LeFou would be depicted as a gay character, prompting Kuwait, Malaysia, and a theater in Alabama to ban the film, and Russia to give it a stricter rating.
In Russia and several Middle Eastern countries, the Pixar movie Onward was banned for featuring Disney's first openly lesbian character, Officer Specter; meanwhile, others said Disney needed more representation of LGBT+ persons in its media. Because of a scene featuring two women kissing, Pixar's Lightyear was banned in 13 predominantly Muslim countries. In a leaked video of a Disney meeting, participants talked about pushing LGBT+ themes in the company's media, angering some people, who said the company was "trying to sexualize children", while others applauded its actions. Some Disney Princess films have been considered sexist toward women: Snow White is said to be too worried about her appearance, Cinderella is deemed to have no talents, and Aurora is said to be weak because she is always waiting to be rescued. In some of the princess films, men have more dialogue, and there are more speaking male characters than female. Disney's more recent films are considered less sexist than its earlier ones.

Disney has been accused of having poor working conditions. A protest by 2,000 workers at Disneyland in 2022 accused the company of poor pay, at an average of $13 an hour, with some workers saying they had been evicted from their homes. In 2010, at a factory in China where Disney products were being made, workers experienced working hours three times longer than those prescribed by law, and one of the workers died by suicide. In 1990, Disney paid $95,000 to avoid legal action over 16 animal-cruelty charges for beating vultures to death, shooting at birds, and starving some birds at Discovery Island; the company had taken these actions because the birds were attacking other animals and taking their food. When Animal Kingdom first opened, there were concerns about the animals because several of them died. Animal rights groups protested, but the United States Department of Agriculture found no violations of animal-welfare regulations.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/History_of_the_Jews_in_Poland] | [TOKENS: 25618] |
History of the Jews in Poland
The history of the Jews in Poland dates back at least 1,000 years. For centuries, Poland was home to the largest and most significant Jewish community in the world. Poland was a principal center of Jewish culture because of the long period of statutory religious tolerance and social autonomy, which ended after the Partitions of Poland in the 18th century. During the German occupation of Poland between 1939 and 1945, in World War II, Nazi Germany and its collaborators of various nationalities carried out the nearly complete genocidal destruction of the Polish Jewish community: the Holocaust. Since the fall of communism in Poland, there has been a renewed interest in Jewish culture, featuring an annual Jewish Culture Festival in Kraków, new study programs at Polish secondary schools and universities, and the opening of Warsaw's Museum of the History of Polish Jews.

From the founding of the Kingdom of Poland in 1025 until the early years of the Polish–Lithuanian Commonwealth, created in 1569, Poland was the most tolerant country in Europe. Poland became a shelter for Jews persecuted and expelled from various European countries and the home of the world's largest Jewish community of the time. According to some sources, about three-quarters of the world's Jews lived in Poland by the middle of the 16th century. With the weakening of the Commonwealth and growing religious strife (due to the Protestant Reformation and the Catholic Counter-Reformation), Poland's traditional tolerance began to wane from the 17th century. After the Partitions of Poland in 1795 and the destruction of Poland as a sovereign state, Polish Jews became subject to the laws of the partitioning powers, including the increasingly antisemitic Russian Empire, as well as Austria-Hungary and the Kingdom of Prussia (later part of the German Empire). When Poland regained independence in the aftermath of World War I, it was still the center of the European Jewish world, with one of the world's largest Jewish communities, numbering over 3 million. Antisemitism was a growing problem throughout Europe in those years, from both the political establishment and the general population. Throughout the interwar period, Poland supported Jewish emigration from Poland and the creation of a Jewish state in Palestine. The Polish state also supported Jewish paramilitary groups such as the Haganah, Betar, and Irgun, providing them with weapons and training.

In 1939, at the start of World War II, Poland was partitioned between Nazi Germany and the Soviet Union (see Molotov–Ribbentrop Pact). One-fifth of the Polish population perished during World War II; the 3,000,000 Polish Jews murdered in the Holocaust, who constituted 90% of Polish Jewry, made up half of all Poles killed during the war. While the Holocaust occurred largely in German-occupied Poland, it was orchestrated and perpetrated by the Nazis. Polish attitudes to the Holocaust varied widely, from actively risking death to save Jewish lives and passively refusing to inform on Jews, to indifference, blackmail, and, in extreme cases, premeditated murder, such as in the Jedwabne pogrom. Collaboration by non-Jewish Polish citizens in the Holocaust was sporadic, but incidents of hostility against Jews are well documented and have been a subject of renewed scholarly interest during the 21st century.
In the post-war period, many of the approximately 200,000 Jewish survivors registered at the Central Committee of Polish Jews or CKŻP (of whom 136,000 arrived from the Soviet Union) left the Polish People's Republic for the nascent State of Israel or the Americas. Their departure was hastened by the destruction of Jewish institutions, post-war anti-Jewish violence, and the hostility of the Communist Party to both religion and private enterprise, but also by the fact that in 1946–1947 Poland was the only Eastern Bloc country to allow free Jewish aliyah to Israel, without visas or exit permits. Most of the remaining Jews left Poland in late 1968 as the result of the "anti-Zionist" campaign. After the fall of the Communist regime in 1989, the situation of Polish Jews became normalized, and those who were Polish citizens before World War II were allowed to renew their Polish citizenship. According to the 2021 Polish census, there were 17,156 Jews living in Poland. Early history to Golden Age: 966–1572 The first Jews to visit Polish territory were traders, while permanent settlement began during the Crusades. Travelling along trade routes leading east to Kiev and Bukhara, Jewish merchants, known as Radhanites, crossed Silesia. One of them, Ibrahim ibn Yaqub (fl. 961–62), a diplomat and merchant from the town of Tortosa in al-Andalus sent by the Umayyad Caliph of Córdoba, Al-Hakam II, was the first chronicler to mention the Polish state ruled by Prince Mieszko I. In the summer of 965 or 966, Jacob made a trade and diplomatic journey from his native Tortosa in Muslim Spain to the Holy Roman Empire and then to the Slavic countries. The first actual mention of Jews in Polish chronicles occurs in the 11th century, when it appears that Jews were living in Gniezno, at that time the capital of the Polish kingdom of the Piast dynasty. Among the first Jews to arrive in Poland in 1097 or 1098 were those banished from Prague. The first permanent Jewish community, in the city of Przemyśl, is mentioned in 1085 by the Jewish scholar Jehuda ha-Kohen. As elsewhere in Central and Eastern Europe, the principal activity of Jews in medieval Poland was commerce and trade, including the export and import of goods such as cloth, linen, furs, hides, wax, metal objects, and slaves. The first extensive Jewish migration from Western Europe to Poland occurred at the time of the First Crusade in 1098. Under Bolesław III (1102–1139), Jews, encouraged by the tolerant regime of this ruler, settled throughout Poland, including over the border in Lithuanian territory as far as Kiev. Bolesław III recognized the utility of Jews in the development of the commercial interests of his country. Jews came to form the backbone of the Polish economy. Mieszko III employed Jews in his mint as engravers and technical supervisors, and the coins minted during that period even bear Hebraic markings. Jews worked on commission for the mints of other contemporary Polish princes, including Casimir the Just, Bolesław I the Tall and Władysław III Spindleshanks. Jews enjoyed undisturbed peace and prosperity in the many principalities into which the country was then divided; they formed the middle class in a country where the general population consisted of landlords (developing into szlachta, the unique Polish nobility) and peasants, and they were instrumental in promoting the commercial interests of the land.
Another factor encouraging Jewish emigration to Poland was the Magdeburg rights (or Magdeburg Law), a charter given to Jews, among others, that specifically outlined the rights and privileges Jews had in Poland: for example, they could maintain communal autonomy and live according to their own laws. This made Poland very attractive to Jewish communities considering a move. The first mention of Jewish settlers in Płock dates from 1237, in Kalisz from 1287, and of a Żydowska (Jewish) street in Kraków in 1304. The tolerant situation was gradually altered by the Roman Catholic Church on the one hand, and by the neighboring German states on the other. There were, however, among the reigning princes some determined protectors of the Jewish inhabitants, who considered their presence most desirable for the economic development of the country. Prominent among such rulers was Bolesław the Pious of Kalisz, Prince of Great Poland. With the consent of the class representatives and higher officials, in 1264 he issued a General Charter of Jewish Liberties (commonly called the Statute of Kalisz), which granted all Jews the freedom to worship, trade, and travel. Similar privileges were granted to the Silesian Jews by the local princes: Henryk IV Probus of Wrocław in 1273–90, Henryk III of Głogów in 1274 and 1299, Henryk V the Fat of Legnica in 1290–95, and Bolko III the Generous of Legnica and Wrocław in 1295. Article 31 of the Statute of Kalisz sought to restrain the Catholic Church from disseminating blood libels against the Jews, stating: "Accusing Jews of drinking Christian blood is expressly prohibited. If despite this a Jew should be accused of murdering a Christian child, such charge must be sustained by testimony of three Christians and three Jews." During the next hundred years, the Church pushed for the persecution of Jews while the rulers of Poland usually protected them. The Councils of Wrocław (1267), Buda (1279), and Łęczyca (1285) each segregated Jews, ordered them to wear a special emblem, banned them from holding offices where Christians would be subordinated to them, and forbade them from building more than one prayer house in each town. However, those church decrees required the cooperation of the Polish princes for enforcement, which was generally not forthcoming, due to the profits which the Jews' economic activity yielded to the princes. In 1334, King Casimir III the Great (1310–1370) amplified and expanded Bolesław's old charter with the Wiślicki Statute. Under his reign, streams of Jewish immigrants headed east to Poland, and Jewish settlements are first mentioned as existing in Lwów (1356), Sandomierz (1367), and Kazimierz near Kraków (1386). Casimir, who according to legend had a Jewish lover named Esterka from Opoczno, was especially friendly to the Jews and was nicknamed by his contemporaries "King of the serfs and Jews"; his reign is regarded as an era of great prosperity for Polish Jewry. Under penalty of death, he prohibited the kidnapping of Jewish children for the purpose of enforced Christian baptism, and he inflicted heavy punishment for the desecration of Jewish cemeteries. Nevertheless, while the Jews of Poland enjoyed tranquility for the greater part of Casimir's reign, toward its close they were subjected to persecution on account of the Black Death. In 1348, the first blood libel accusation against Jews in Poland was recorded, and in 1367 the first pogrom took place in Poznań.
Compared with the pitiless destruction of their co-religionists in Western Europe, however, Polish Jews did not fare badly, and Jewish refugees from Germany fled to the more hospitable cities of Poland. As a result of the marriage of Władysław II Jagiełło to Jadwiga, daughter of Louis I of Hungary, Lithuania was united with the kingdom of Poland. In 1388–1389, broad privileges were extended to Lithuanian Jews, including freedom of religion and commerce on equal terms with the Christians. Under the rule of Władysław II, Polish Jews increased in number and attained prosperity. However, religious persecution gradually increased, as the dogmatic clergy, pressured by the Council of Constance, pushed for less official tolerance. In 1349, pogroms took place in many towns in Silesia. There were accusations of blood libel by priests, and new riots against the Jews in Poznań in 1399. Accusations of blood libel by another fanatic priest led to riots in Kraków in 1407, although the royal guard hastened to the rescue. Hysteria caused by the Black Death led to additional 14th-century outbreaks of violence against the Jews in Kalisz, Kraków and Bochnia. Traders and artisans jealous of Jewish prosperity, and fearing their rivalry, supported the harassment. In 1423, the Statute of Warka forbade Jews the granting of loans against letters of credit or mortgage and limited their operations exclusively to loans made on the security of movable property. In the 14th and 15th centuries, rich Jewish merchants and moneylenders leased the royal mint, the salt mines, and the collecting of customs and tolls. The most famous of them were Jordan and his son Lewko of Kraków in the 14th century, and Jakub Slomkowicz of Lutsk, Wolczko of Drohobycz, Natko of Lviv, Samson of Zhydachiv, Josko of Hrubieszów and Szania of Belz in the 15th century. For example, Wolczko of Drohobycz, King Ladislaus Jagiełło's broker, was the owner of several villages in the Ruthenian voivodeship and the soltys (administrator) of the village of Werbiz. Jews from Grodno were in this period also owners of villages, manors, meadows, fish ponds and mills. However, until the end of the 15th century, agriculture as a source of income played only a minor role among Jewish families. More important were crafts serving the needs of both their fellow Jews and the Christian population (fur making, tanning, tailoring). In 1454, anti-Jewish riots flared up in Bohemia's ethnically German Wrocław and other Silesian cities, inspired by a Franciscan friar, John of Capistrano, who accused Jews of profaning the Christian religion. As a result, Jews were banished from Lower Silesia. Zbigniew Oleśnicki then invited John to conduct a similar campaign in Kraków and several other cities, to lesser effect. The decline in the status of the Jews was briefly checked by Casimir IV Jagiellon (1447–1492), but soon the Polish nobility forced him to issue the Statutes of Nieszawa, which, among other things, abolished the ancient privileges of the Jews "as contrary to divine right and the law of the land." Nevertheless, the king continued to offer his protection to the Jews. Two years later, Casimir issued another document announcing that he could not deprive the Jews of his benevolence on the basis of "the principle of tolerance which in conformity with God's laws obliged him to protect them". The policy of the government toward the Jews of Poland oscillated under Casimir's sons and successors, John I Albert (1492–1501) and Alexander Jagiellon (1501–1506).
In 1495, Jews were ordered out of the center of Kraków and allowed to settle in the "Jewish town" of Kazimierz. In the same year, Alexander, then Grand Duke of Lithuania, followed the 1492 example of the Spanish rulers and banished the Jews from Lithuania. For several years they took shelter in Poland, until 1503, when, after becoming King of Poland, he allowed them to return to Lithuania. The next year he issued a proclamation in which he stated that a policy of tolerance befitted "kings and rulers". Poland became more tolerant just as the Jews were being expelled from Spain in 1492, as well as from Austria, Hungary and Germany, thus stimulating Jewish immigration to Poland. Poland became recognized as a haven for Jewish exiles from Western Europe, and a cultural and spiritual center of the Jewish people. In 1503, King Alexander Jagiellon appointed Jacob Pollak the first Chief Rabbi of Poland. Pollak founded a yeshiva (a school for the study of rabbinic literature) in Kraków. The most prosperous period for Polish Jews began following a new influx of Jews after the accession of Sigismund I in 1506. Sigismund protected the Jews in his realm, which encouraged many Jews to emigrate to Poland, where they founded a community at Kraków. Shalom Shachna (one of Jacob Pollak's pupils) is counted among the pioneers of Talmudic scholarship in Poland. In 1515, Shachna established a yeshiva in Lublin, a city which then had the third-largest Jewish community in Poland. Shachna's yeshiva produced several prominent rabbis, including Moses Isserles and Solomon Luria, who succeeded Shachna as rosh yeshiva of Lublin. The first books to be published in the Hebrew and Yiddish languages in Poland were produced by the Halicz brothers, who established the first Jewish printing press in the Polish–Lithuanian Commonwealth in Kraków in 1534. That year, they published Sha'are Dura, written in Hebrew by Isaac ben Meir Halevi; they also published Mirkevet ha-Mishne, the first book in the world to be printed in Yiddish. During this period, Jewish mysticism and Kabbalah became popular, and eminent Polish-Jewish scholars such as Mordecai Yoffe and Joel Sirkis devoted themselves to their study. From the 14th century until 1538, Jews functioned widely as the arendators (lease-holders) of royal prerogatives in Poland such as the mint, salt mines, customs, and tax farming. Pressured by the nobility, in 1538 the Sejm prohibited the Jews from participating in this highly lucrative activity. Although this so-called "great arenda" became one of the protected privileges of the Polish nobility, Jews continued to function as the arendators of landed estates leased from the nobility. Sigismund II Augustus followed his father's tolerant policy, granting limited autonomy to the Jews and laying the foundation for the qahal, or autonomous Jewish community. Poland-Lithuania became the main center for Ashkenazi Jewry, and its yeshivot achieved fame during this period. Poland welcomed Jewish immigrants from Italy, as well as Sephardic Jews, Romaniote Jews, Mizrahi Jews, and Persian Jews. Jewish religious life thrived in many Polish communities. By the middle of the 16th century, about 75% of all Jews lived in Poland. By 1551, Polish Jews had been given permission to choose their own Chief Rabbi. Within the Jewish community, the Chief Rabbinate held power over law and finance, appointing judges and other officials.
The Polish government permitted the Chief Rabbinate to grow in power, using the office to collect taxes from the Jewish community; 30% of the money raised by the Rabbinate served Jewish causes, while the remainder went to the Crown in exchange for protection. The Polish–Lithuanian Commonwealth: 1569–1795 After the death of Sigismund II Augustus in 1572, the Polish nobility gathered in Warsaw and signed the Warsaw Confederation, a document in which representatives of all major religions pledged mutual support and tolerance. The central autonomous body that, along with the Chief Rabbinate, regulated Jewish life in Poland from about 1580 until 1764 was known as the Council of Four Lands. According to Gershon Hundert, the following eight or nine decades of relative prosperity and security experienced by Polish Jews witnessed the appearance of "a virtual galaxy of sparkling intellectual figures." Yeshivot were established in prominent Jewish communities such as Brześć, Lublin, Lwów, Ostróg, and Poznań. By 1600, the Jewish printing house in Kraków had printed some 200 works, mostly in Hebrew, but sometimes also in Yiddish. The growth of Talmudic scholarship in Poland was coincident with the greater prosperity of the Polish Jews, and because of their communal autonomy, educational development proceeded mostly along Talmudic lines. Exceptions are recorded, however, where Jewish youth sought secular instruction in the European universities. The rabbis became not merely expounders of Jewish law, but also spiritual advisers, teachers, judges, and legislators. Polish Jewry found its views of life shaped by these principles, whose influence was felt in the home, in school, and in the synagogue. The culture and intellectual output of the Jewish community in Poland had a profound impact on Judaism as a whole. Some Jewish historians have recounted that the name Poland is rendered in Hebrew as Polania or Polin, and that these names were interpreted as good omens: Polania can be broken down into three Hebrew words, po ("here"), lan ("dwells"), ya ("God"), and Polin into two, po ("here") and lin ("[you should] dwell"). The message was that Poland was meant to be a good place for the Jews. According to Rabbi David HaLevi Segal, Poland was a place where "most of the time the gentiles do no harm; on the contrary they do right by Israel" (Divre David, 1689). It was during this period that a rueful pasquinade claiming that Poland was a "paradise for the Jews" gave birth to a proverb, which evolved into the saying that Poland was "heaven for the nobility, purgatory for townsfolk, hell for peasants, and paradise for the Jews". Despite the Warsaw Confederation, there was a significant increase in antisemitism at this time, partly due to the Counter-Reformation and the growing influence of the Jesuits. By the 1590s, there were anti-Jewish outbreaks in Poznań, Lublin, Kraków, Vilnius and Kiev. In Lwów alone there were attacks against Jews in 1572, 1592, 1613, 1618, and every year from 1638, with Jesuit students being responsible for many of them. At the same time, laws were introduced to restrict Jews from living in the royal towns and cities of the Commonwealth, which increased their migration to the eastern parts of the country, where they were invited by the magnates to their private towns. By the end of the 18th century, two-thirds of the royal towns and cities in the Commonwealth had pressed the king to grant them the privilege of excluding Jews.
After the Union of Brest in 1595–1596, the Orthodox church was outlawed in the Polish-Lithuanian Commonwealth. This, along with the mass migration of Jews to Ruthenia and their negative perception by the local population, resulted in significant religious, social and political tensions in Ruthenia. These tensions intensified the many uprisings by the Cossacks, which had begun in 1591. The largest of these was the Khmelnytsky Uprising (1648–1657), in which Cossacks massacred tens of thousands of Jews, Catholics and Uniates in the eastern and southern areas of the Polish Crown. Although the uprising was directed primarily against the wealthy nobility and landlords, the Cossacks perceived the Jews as allies of the former. The precise number of dead is not known, but the decrease of the Jewish population during this period is estimated at 100,000 to 200,000, a figure which also includes emigration, deaths from diseases, and jasyr (captivity in the Ottoman Empire). The combined effects of the Cossack uprisings, along with internal conflicts in Poland and concurrent invasions of the Commonwealth by the Tsardom of Russia, the Swedish Empire, the Crimean Tatars, and the Ottoman Empire, ended the Polish Golden Age and caused a decline of Polish power during the period known as "the Deluge". The destruction, pillage and methodical plunder during the Siege of Kraków (1657) were so enormous that parts of the city never recovered; this was later followed by Stefan Czarniecki's massacres of the Ruthenian and Jewish population. Many Jews, along with the townsfolk of Kalisz, Kraków, Poznań, Piotrków and Lublin, also fell victim to recurring epidemics. After these disturbances ended, Jews returned and rebuilt their destroyed homes. Although the Jewish population of Poland had decreased, it was still more numerous than that of the Jewish colonies in Western Europe, and Poland continued to be a spiritual center of Judaism. Through 1698, the Polish kings generally remained supportive of the Jews. The decade from the Khmelnytsky Uprising until after the Deluge (1648–1658) left a deep and lasting impression not only on the social life of the Polish–Lithuanian Jews, but on their spiritual life as well. The intellectual output of the Jews of Poland was reduced, and study of the Talmud became overly formalized and accessible to only a limited number of students. Some rabbis busied themselves with quibbles concerning religious laws, while others wrote commentaries on the Talmud in which hair-splitting arguments of no practical importance were discussed. Certain Ashkenazic movements began to appear at this time. At the same time, many miracle-workers made their appearance among the Jews of Poland, culminating in a series of Messianic movements, such as Sabbatianism and Frankism. Into this time of Jewish mysticism and overly formal Rabbinism came the teachings of Israel ben Eliezer, known as the Baal Shem Tov, or BeShT (1698–1760), which had a profound effect on the Jews of Eastern Europe and Poland in particular. His disciples taught and encouraged the new fervent brand of Judaism based on Kabbalah known as Hasidic Judaism. The rise of Hasidic Judaism within Poland's borders and beyond had a great influence on the rise of Haredi Judaism all over the world, with a continuous influence through its many Hasidic dynasties, including those of Chabad, Aleksander, Bobov, Ger, and Nadvorna. In 1742, most of Silesia was annexed by Prussia as a result of the Silesian Wars.
Further disorder and anarchy ensued in Poland in 1764, after the accession to the throne of its last king, Stanislaus II Augustus Poniatowski. Eight years later, triggered by the Confederation of Bar against Russian influence and the pro-Russian king, the outlying provinces of Poland were overrun from all sides by different military forces and divided for the first time by the three neighboring empires, Russia, Austria, and Prussia. The Commonwealth lost 30% of its land during the annexations of 1772, and even more of its people. Jews were most numerous in the territories that fell under the military control of Austria and Russia. By 1764, there were about 750,000 Jews in the Polish–Lithuanian Commonwealth; the worldwide Jewish population at that time was estimated at 1.2 million. In 1768, the Koliivshchyna, a rebellion in Right-bank Ukraine west of the Dnieper in Volhynia, led to ferocious murders of Polish noblemen, Catholic priests and thousands of Jews by haydamaks. Four years later, in 1772, the military partitions of Poland began between Russia, Prussia, and Austria. The Permanent Council, established at the insistence of the Russian government (1773–1788), served as the highest administrative tribunal and occupied itself with the elaboration of a plan that would make practicable the reorganization of Poland on a more rational basis. The progressive elements in Polish society recognized the urgency of popular education as the first step toward reform. The Commission of National Education, the first ministry of education in the world, was established in 1773 and founded numerous new schools while remodelling older ones. Chancellor Andrzej Zamoyski and other members of the commission demanded that religious toleration and the inviolability of their persons and property should be guaranteed to Jews. However, they also insisted that Jews living in the cities should be separated from Christians, that Jews having no clear occupation should be banished from the kingdom, and that Jews should not be allowed to possess land. On the other hand, some szlachta and intellectuals proposed a national system of government characterized by civil and political equality of the Jews. This was the only example in modern Europe before the French Revolution of such tolerance and broadmindedness in dealing with the Jewish question. But all these reforms came too late: a Russian army soon invaded Poland, and soon afterwards a Prussian one followed. The second partition of Poland was made on 17 July 1793. Jews, in a Jewish regiment led by Berek Joselewicz, took part in the Kościuszko Uprising the following year, when the Poles tried again to achieve independence but were brutally put down. Following the revolt, the third and final partition of Poland took place in 1795. The territories which included the great bulk of the Jewish population were transferred to Russia, and their inhabitants thus became subjects of that empire, although in the first half of the 19th century some semblance of a vastly smaller Polish state was preserved, especially in the form of Congress Poland (1815–1831). Under foreign rule, many Jews inhabiting formerly Polish lands were indifferent to Polish aspirations for independence. However, most Polonized Jews supported the revolutionary activities of Polish patriots and participated in national uprisings. Polish Jews took part in the November Insurrection of 1830–1831, the January Insurrection of 1863, as well as in the revolutionary movement of 1905.
Many Polish Jews were enlisted in the Polish Legions, which fought for Polish independence, achieved in 1918 when the occupying forces disintegrated following World War I. Jews of Poland within the Russian Empire (1795–1918) Official Russian policy would eventually prove to be substantially harsher toward the Jews than policy under independent Polish rule. The lands that had once been Poland were to remain the home of many Jews, as, in 1791, Catherine II, the Tsarina of Russia, instituted the Pale of Settlement, restricting Jews to the western parts of the empire, which would eventually include much of Poland, although it excluded some areas in which Jews had previously lived. By the late 19th century, over four million Jews would live in the Pale. Tsarist policy towards the Jews of Poland alternated between harsh rules and inducements meant to break resistance to large-scale conversion. In 1804, Alexander I of Russia issued a "Statute Concerning Jews", meant to accelerate the process of assimilation of the Empire's new Jewish population. The Polish Jews were allowed to establish schools with Russian, German or Polish curricula. However, they were also restricted from leasing property, teaching in Yiddish, and entering Russia, and they were banned from the brewing industry. The harshest measures, designed to compel Jews to merge into society at large, called for their expulsion from small villages, forcing them to move into towns. Once the resettlement began, thousands of Jews lost their only source of income and turned to the qahal for support, and their living conditions in the Pale began to worsen dramatically. During the reign of Tsar Nicholas I, known by the Jews as "Haman the Second", hundreds of new anti-Jewish measures were enacted. The 1827 decree by Nicholas, while lifting the traditional double taxation on Jews in lieu of army service, made Jews subject to general military recruitment laws that required Jewish communities to provide 7 recruits per 1,000 "souls" every 4 years. Unlike the general population, which had to provide recruits between the ages of 18 and 35, Jews had to provide recruits between the ages of 12 and 25, at the qahal's discretion. Thus between 1827 and 1857, over 30,000 children were placed in the so-called Cantonist schools, where they were pressured to convert. "Many children were smuggled to Poland, where the conscription of Jews did not take effect until 1844." The Pale of Settlement (Russian: Черта́ осе́длости, chertá osédlosti, Yiddish: תּחום-המושבֿ, tkhum-ha-moyshəv, Hebrew: תְּחוּם הַמּוֹשָב, tḥùm ha-mosháv) was the term given to a region of Imperial Russia in which permanent residency by Jews was allowed and beyond which Jewish permanent residency was generally prohibited. It extended from the eastern pale, or demarcation line, to the western Russian border with the Kingdom of Prussia (later the German Empire) and with Austria-Hungary. The archaic English term pale is derived from the Latin word palus, a stake, extended to mean the area enclosed by a fence or boundary. With its large Catholic and Jewish populations, the Pale was acquired by the Russian Empire (which had a Russian Orthodox majority) in a series of military conquests and diplomatic maneuvers between 1791 and 1835, and lasted until the fall of the Russian Empire in 1917.
It comprised about 20% of the territory of European Russia and mostly corresponded to the historical borders of the former Polish–Lithuanian Commonwealth; it covered much of present-day Lithuania, Belarus, Poland, Moldova, Ukraine, and parts of western Russia. From 1791 to 1835, and until 1917, the boundaries of the Pale were repeatedly reconfigured, so that certain areas, such as the Caucasus, were variously open or closed to Jewish residency. At times, Jews were forbidden to live in agricultural communities or in certain cities, such as Kiev, Sevastopol and Yalta, and were excluded from residency in a number of cities within the Pale. Settlers from outside the Pale were forced to move to small towns, thus fostering the rise of the shtetls. Although the Jews were accorded slightly more rights with the Emancipation reform of 1861 by Alexander II, they were still restricted to the Pale of Settlement and subject to restrictions on ownership and profession. The status quo was shattered by the assassination of Alexander II in 1881, an act falsely blamed upon the Jews. The assassination prompted a large-scale wave of anti-Jewish riots, called pogroms (Russian: погро́м), throughout 1881–1884. In the 1881 outbreak, pogroms were primarily limited to Russia, although in a riot in Warsaw two Jews were killed, 24 others were wounded, women were raped, and over two million rubles' worth of property was destroyed. The new tsar, Alexander III, blamed the Jews for the riots and issued a series of harsh restrictions on Jewish movements. Pogroms continued until 1884, with at least tacit government approval. They proved a turning point in the history of the Jews in partitioned Poland and throughout the world. In 1884, 36 Jewish Zionist delegates met in Katowice, forming the Hovevei Zion movement. The pogroms prompted a great wave of Jewish emigration to the United States. An even bloodier wave of pogroms broke out from 1903 to 1906, at least some of them believed to have been organized by the Tsarist Russian secret police, the Okhrana. They included the Białystok pogrom of 1906 in the Grodno Governorate of Russian Poland, in which at least 75 Jews were murdered by marauding soldiers and many more Jews were wounded. According to Jewish survivors, ethnic Poles did not participate in the pogrom and instead sheltered Jewish families. The Jewish Enlightenment, Haskalah, began to take hold in Poland during the 19th century, stressing secular ideas and values. Champions of Haskalah, the Maskilim, pushed for assimilation and integration into Russian culture. At the same time, there was another school of Jewish thought that emphasized traditional study and a Jewish response to the ethical problems of antisemitism and persecution, one form of which was the Musar movement. Polish Jews were generally less influenced by Haskalah, focusing instead on a strong continuation of their religious lives based on Halakha ("rabbinic law"), following primarily Orthodox Judaism and Hasidic Judaism, and later in the 19th century also adapting to the new Religious Zionism of the Mizrachi movement. By the late 19th century, Haskalah and the debates it caused created a growing number of political movements within the Jewish community itself, covering a wide range of views and vying for votes in local and regional elections. Zionism became very popular with the advent of the Poale Zion socialist party as well as the religious Polish Mizrahi, and the increasingly popular General Zionists.
Jews also took up socialism, forming the Bund labor union, which supported assimilation and the rights of labor. The Folkspartei (People's Party) advocated, for its part, cultural autonomy and resistance to assimilation. In 1912, Agudat Israel, a religious party, came into existence. Many Jews took part in the Polish insurrections, particularly against Russia (since the Tsars discriminated heavily against the Jews). The Kościuszko Insurrection (1794), November Insurrection (1830–31), January Insurrection (1863) and revolutionary movement of 1905 all saw significant Jewish involvement in the cause of Polish independence. During the Second Polish Republic period, there were several prominent Jewish politicians in the Polish Sejm, such as Apolinary Hartglas and Yitzhak Gruenbaum. Many Jewish political parties were active, representing a wide ideological spectrum, from the Zionists to the socialists to the anti-Zionists. One of the largest of these parties was the Bund, which was strongest in Warsaw and Łódź. In addition to the socialists, Zionist parties were also popular, in particular the Marxist Poale Zion and the orthodox religious Polish Mizrahi. The General Zionist party became the most prominent Jewish party in the interwar period and, in the 1919 elections to the first Polish Sejm since the partitions, gained 50% of the Jewish vote. In 1914, the German Zionist Max Bodenheimer founded the short-lived German Committee for Freeing of Russian Jews, with the goal of establishing a buffer state (Pufferstaat) within the Jewish Pale of Settlement, composed of the former Polish provinces annexed by Russia, as a de facto protectorate of the German Empire that would free Jews in the region from Russian oppression. The plan, known as the League of East European States, soon proved unpopular with both German officials and Bodenheimer's colleagues, and was abandoned by the following year. Interbellum (1918–1939) While most Polish Jews were neutral toward the idea of a Polish state, many played a significant role in the fight for Poland's independence during World War I. Around 650 Jews, more than all other minorities combined, joined the Legiony Polskie formed by Józef Piłsudski. Prominent Jews (including Herman Feldstein, Henryk Eile, Samuel Herschthal, Zygmunt Leser, Henryk Orlean, and Wiktor Chajes) were among the members of the KTSSN, which formed the nucleus of the interim government of re-emerging sovereign Poland. Other Jews were opposed to the formation of a Polish state. One example was the Committee for the East, founded by German Jewish activists, which promoted the idea of Jews in the east becoming the "spearhead of German expansionism", serving as "Germany's reliable vassals" against other ethnic groups in the region and as a "living wall" against the aims of Polish separatists. In the aftermath of the Great War, localized conflicts engulfed Eastern Europe between 1917 and 1919. Many attacks were launched against Jews during the Russian Civil War, the Polish–Ukrainian War, and the Polish–Soviet War, which ended with the Treaty of Riga. Just after the end of World War I, the West became alarmed by reports of alleged massive pogroms against Jews in Poland. Pressure for government action reached the point where U.S. President Woodrow Wilson sent an official commission to investigate the matter. The commission, led by Henry Morgenthau, Sr., concluded in its Morgenthau Report that the allegations of pogroms were exaggerated.
It identified eight incidents in the years 1918–1919 out of 37 mostly unsubstantiated claims for damages, and estimated the number of victims at 280. Four of these incidents were attributed to the actions of deserters and undisciplined individual soldiers; none was blamed on official government policy. Among the incidents, during the battle for Pińsk, the commander of a Polish infantry regiment accused a group of Jewish men of plotting against the Poles and ordered the execution of thirty-five Jewish men and youths. The Morgenthau Report found the charge to be "devoid of foundation", even though the men's meeting was illegal to the extent of being treasonable. In the Lwów (Lviv) pogrom, which occurred in 1918 during the Polish–Ukrainian War, a day after the Poles captured Lviv from the Sich Riflemen, the report concluded that 64 Jews had been killed (other accounts put the number at 72). In Warsaw, soldiers of the Blue Army assaulted Jews in the streets, but were punished by military authorities. Many other events in Poland were later found to have been exaggerated, especially by contemporary newspapers such as The New York Times, although serious abuses against the Jews, including pogroms, continued elsewhere, especially in Ukraine. The historians Anna Cichopek-Gajraj and Glenn Dynner state that 130 pogroms against Jews occurred on Polish territories from 1918 to 1921, resulting in as many as 300 deaths, with many attacks conceived as reprisals against supposed Jewish economic power and supposed Jewish "Judeo-Bolshevism". The atrocities committed by the young Polish army and its allies in 1920 during their Kiev operation against the Bolsheviks had a profound impact on the foreign perception of the re-emerging Polish state. Concerns over the fate of Poland's Jews led the Western powers to pressure Polish Prime Minister Paderewski to sign the Minority Protection Treaty (the Little Treaty of Versailles), protecting the rights of minorities in the new Poland, including Jews and Germans. This in turn resulted in Poland's March Constitution of 1921 granting Jews the same legal rights as other citizens and guaranteeing them religious tolerance and freedom of religious holidays. The number of Jews immigrating to Poland from Ukraine and Soviet Russia during the interwar period grew rapidly. The Jewish population in the area of the former Congress Poland increased sevenfold between 1816 and 1921, from around 213,000 to roughly 1,500,000. According to the Polish national census of 1921, there were 2,845,364 Jews living in the Second Polish Republic; according to the national census of 1931, there were 3,113,933. By late 1938, that number had grown to approximately 3,310,000. The average rate of permanent settlement was about 30,000 per annum; at the same time, every year around 100,000 Jews were passing through Poland in unofficial emigration overseas. Between the end of the Polish–Soviet War and late 1938, the Jewish population of the Republic grew by over 464,000. According to the 1931 census, one city had over 350,000 Jewish inhabitants (Warsaw), one had over 200,000 (Łódź), one had around 100,000 (Lwów), and two had over 50,000 each (Kraków and Wilno). In total, these five cities had 766,272 Jews, almost 25% of the total Jewish population of Poland, and nearly 44% of Poland's Jews lived in cities and towns larger than 25,000 inhabitants.
The newly independent Second Polish Republic had a large and vibrant Jewish minority; by the time World War II began, Poland had the largest concentration of Jews in Europe, although many Polish Jews had a culture and ethnic identity separate from those of Catholic Poles. Some authors have stated that only about 10% of Polish Jews during the interwar period could be considered "assimilated", while more than 80% could be readily recognized as Jews. According to the 1931 national census, there were 3,130,581 Polish Jews as measured by the declaration of their religion. Estimating the population increase and the emigration from Poland between 1931 and 1939, there were probably 3,474,000 Jews in Poland as of 1 September 1939 (approximately 10% of the total population), primarily centered in larger and smaller cities: 77% lived in cities and 23% in villages. They made up about 50%, and in some cases even 70%, of the population of smaller towns, especially in Eastern Poland. Prior to World War II, the Jewish population of Łódź numbered about 233,000, roughly one-third of the city's population. The city of Lwów (now in Ukraine) had the third-largest Jewish population in Poland, numbering 110,000 in 1939 (42%). Wilno (now in Lithuania) had a Jewish community of nearly 100,000, about 45% of the city's total. In 1938, Kraków's Jewish population numbered over 60,000, or about 25% of the city's total population. In 1939, there were 375,000 Jews in Warsaw, or one-third of the city's population; only New York City had more Jewish residents than Warsaw. Jewish youth and religious groups, diverse political parties and Zionist organizations, newspapers and theatre flourished. Jews owned land and real estate, and participated in retail, manufacturing, and the export industry. Their religious beliefs spanned the range from Orthodox Hasidic Judaism to Liberal Judaism. The Polish language, rather than Yiddish, was increasingly used by young Warsaw Jews, who had no problem identifying themselves fully as Jews, Varsovians and Poles. Jews such as Bruno Schulz were entering the mainstream of Polish society, though many thought of themselves as a separate nationality within Poland. Most children were enrolled in Jewish religious schools, which tended to limit their ability to speak Polish. As a result, according to the 1931 census, 79% of the Jews declared Yiddish as their first language, only 12% listed Polish, and the remaining 9% listed Hebrew. In contrast, the overwhelming majority of German-born Jews of this period spoke German as their first language. During the school year 1937–1938, there were 226 elementary schools, twelve high schools, and fourteen vocational schools with either Yiddish or Hebrew as the instructional language. Jewish political parties, including the socialist General Jewish Labour Bund (the Bund), parties of the Zionist right and left wings, and religious conservative movements, were represented in the Sejm (the Polish Parliament) as well as in the regional councils. The Jewish cultural scene was particularly vibrant in pre–World War II Poland, with numerous Jewish publications and more than one hundred periodicals. Yiddish authors, most notably Isaac Bashevis Singer, went on to achieve international acclaim as classic Jewish writers; Singer won the 1978 Nobel Prize in Literature.
His brother Israel Joshua Singer was also a writer. Other Jewish authors of the period, such as Bruno Schulz, Julian Tuwim, Marian Hemar, Emanuel Schlechter and Bolesław Leśmian, as well as Konrad Tom and Jerzy Jurandot, were less well known internationally, but made important contributions to Polish literature. Some Polish writers had Jewish roots, e.g. Jan Brzechwa (a favorite poet of Polish children). Singer Jan Kiepura, born of a Jewish mother and a Polish father, was one of the most popular artists of that era, and pre-war songs of Jewish composers, including Henryk Wars, Jerzy Petersburski, Artur Gold, Henryk Gold, Zygmunt Białostocki, Szymon Kataszek and Jakub Kagan, are still widely known in Poland today. Painters also became known for their depictions of Jewish life; among them were Maurycy Gottlieb, Artur Markowicz, and Maurycy Trębacz, with younger artists like Chaim Goldberg coming up in the ranks. Many Jews were film producers and directors, e.g. Michał Waszyński (The Dybbuk) and Aleksander Ford (Children Must Laugh). Scientist Leopold Infeld, mathematicians Stanisław Ulam and Alfred Tarski, and professor Adam Ulam contributed to the world of science. Other Polish Jews who gained international recognition include Moses Schorr, Ludwik Zamenhof (the creator of Esperanto), Georges Charpak, Samuel Eilenberg, Emanuel Ringelblum, and Artur Rubinstein, to name a few from the long list. The term "genocide" was coined by Rafał Lemkin (1900–1959), a Polish–Jewish legal scholar, and Leonid Hurwicz was awarded the 2007 Nobel Prize in Economics. The YIVO Scientific Institute (Yidisher Visnshaftlekher Institut) was based in Wilno before transferring to New York during the war. Warsaw was home to important centers of Judaic scholarship, such as the Main Judaic Library and the Institute of Judaic Studies, along with numerous Talmudic schools (jeszybots), religious centers and synagogues, many of which were of high architectural quality. Yiddish theatre also flourished; Poland had fifteen Yiddish theatres and theatrical groups. Warsaw was home to the most important Yiddish theater troupe of the time, the Vilna Troupe, which staged the first performance of The Dybbuk in 1920 at the Elyseum Theatre. Some future Israeli leaders studied at the University of Warsaw, including Menachem Begin and Yitzhak Shamir. There were also several Jewish sports clubs, some of which, such as Hasmonea Lwów and Jutrzenka Kraków, won promotion to the Polish First Football League. A Polish–Jewish footballer, Józef Klotz, scored the first-ever goal for the Poland national football team, and another athlete, Alojzy Ehrlich, won several medals in table-tennis tournaments. Many of these clubs belonged to the Maccabi World Union. In contrast to the prevailing trends in Europe at the time, in interwar Poland an increasing percentage of Jews were pushed to live a life separate from the non-Jewish majority. The antisemitic rejection of Jews, whether for religious or racial reasons, caused estrangement and growing tensions between Jews and Poles. It is significant in this regard that in 1921, 74.2% of Polish Jews spoke Yiddish or Hebrew as their native language; by 1931, the number had risen to 87%. Besides the persistent effects of the Great Depression, the strengthening of antisemitism in Polish society was also a consequence of the influence of Nazi Germany.
Following the German-Polish non-aggression pact of 1934, the antisemitic tropes of Nazi propaganda had become more common in Polish politics, where they were echoed by the National Democratic movement. One of its founders and its chief ideologue, Roman Dmowski, was obsessed with an international conspiracy of freemasons and Jews, and in his works linked Marxism with Judaism. The position of the Catholic Church had also become increasingly hostile to the Jews, who in the 1920s and 1930s were increasingly seen as agents of evil, that is, of Bolshevism. Economic instability was mirrored by anti-Jewish sentiment in the press; discrimination, exclusion, and violence at the universities; and the appearance of "anti-Jewish squads" associated with some of the right-wing political parties. These developments contributed to greater support among the Jewish community for Zionist and socialist ideas. In 1925, Polish Zionist members of the Sejm capitalized on governmental support for Zionism by negotiating an agreement with the government known as the Ugoda. The Ugoda was an agreement between the Polish prime minister Władysław Grabski and Zionist leaders of Et Liwnot, including Leon Reich. The agreement granted certain cultural and religious rights to Jews in exchange for Jewish support for Polish nationalist interests; however, the Galician Zionists had little to show for their compromise, because the Polish government later refused to honor many aspects of the agreement. During the 1930s, Revisionist Zionists viewed the Polish government as an ally and promoted cooperation between Polish Zionists and Polish nationalists, despite the antisemitism of the Polish government. Matters improved for a time under the rule of Józef Piłsudski (1926–1935). Piłsudski countered Endecja's Polonization with a "state assimilation" policy: citizens were judged by their loyalty to the state, not by their nationality. The years 1926–1935 were favourably viewed by many Polish Jews, whose situation improved especially under the cabinet of Piłsudski's appointee Kazimierz Bartel. However, a combination of various factors, including the Great Depression, meant that the situation of Jewish Poles was never very satisfactory, and it deteriorated again after Piłsudski's death in May 1935, which many Jews regarded as a tragedy. Jewish industries were negatively affected by the development of mass production and the advent of department stores offering ready-made products. The traditional sources of livelihood for the estimated 300,000 Jewish family-run businesses in the country began to vanish, contributing to a growing trend toward isolationism and internal self-sufficiency. The difficult situation in the private sector led to enrolment growth in higher education. In 1923, Jewish students constituted 62.9% of all students of stomatology, 34% of medical sciences, 29.2% of philosophy, 24.9% of chemistry and 22.1% of law (26% by 1929) at all Polish universities. It is speculated that such disproportionate numbers were the probable cause of a backlash. The interwar Polish government provided military training to the Zionist Betar paramilitary movement, whose members admired the Polish nationalist camp and imitated some of its aspects.
Uniformed members of Betar marched and performed at Polish public ceremonies alongside Polish scouts and the military, with their weapons training provided by Polish institutions and Polish military officers; Menachem Begin, one of its leaders, called for its members to defend Poland in case of war, and the organisation raised both Polish and Zionist flags. With the influence of the Endecja (National Democracy) party growing, antisemitism gathered new momentum in Poland and was most felt in smaller towns and in spheres in which Jews came into direct contact with Poles, such as in Polish schools or on the sports field. Further academic harassment, such as the introduction of ghetto benches, which forced Jewish students to sit in sections of the lecture halls reserved exclusively for them, anti-Jewish riots, and semi-official or unofficial quotas (numerus clausus) introduced in 1937 in some universities, halved the number of Jews in Polish universities between independence (1918) and the late 1930s. The restrictions were so sweeping that, while Jews made up 20.4% of the student body in 1928, by 1937 their share was down to only 7.5%, even though Jews constituted 9.75% of the country's population according to the 1931 census. While the average per capita income of Polish Jews in 1929 was 40% above the national average (which was very low compared to England or Germany), they were a very heterogeneous community, some poor, some wealthy. Many Jews worked as shoemakers and tailors, as well as in the liberal professions: doctors (56% of all doctors in Poland), teachers (43%), journalists (22%) and lawyers (33%). In 1929, about a third of artisans and home workers and a majority of shopkeepers were Jewish. Although many Jews were educated, they were almost completely excluded from government jobs; as a result, the proportion of unemployed Jewish salary earners in 1929 was approximately four times as great as the proportion of unemployed non-Jewish salary earners, a situation compounded by the fact that almost no Jews were on government support. In 1937, the Catholic trade unions of Polish doctors and lawyers restricted their new members to Christian Poles; in a similar manner, the Jewish trade unions had excluded non-Jewish professionals from their ranks after 1918. The bulk of Jewish workers were organized in the Jewish trade unions under the influence of the Jewish socialists, who split in 1923 to join the Communist Party of Poland and the Second International. Anti-Jewish sentiment in Poland reached its zenith in the years leading up to the Second World War. Between 1935 and 1937, seventy-nine Jews were killed and 500 injured in anti-Jewish incidents. National policy was such that the Jews, who largely worked at home and in small shops, were excluded from welfare benefits. In the provincial capital of Łuck, Jews constituted 48.5% of the diverse multiethnic population of 35,550 Poles, Ukrainians, Belarusians and others; Łuck had the largest Jewish community in the voivodeship. In the capital of Brześć in 1936, Jews constituted 41.3% of the general population, and some 80.3% of private enterprises were owned by Jews. Jews, who made up 32% of Radom's inhabitants, also enjoyed considerable prominence there, with 90% of small businesses in the city owned and operated by Jews, including tinsmiths, locksmiths, jewellers, tailors, hat makers, hairdressers, carpenters, house painters and wallpaper installers, shoemakers, as well as most of the artisan bakers and clock repairers.
In Lubartów, Jews made up 53.6% of the town's population and accounted for most of its economy. In the town of Luboml, 3,807 of the 4,169 inhabitants were Jews, who formed the core of its social and political life. A national boycott of Jewish businesses, and advocacy for their confiscation, was promoted by the National Democracy party, and Prime Minister Felicjan Sławoj-Składkowski declared an "economic war against Jews" while introducing the term "Christian shop". As a result, the boycott of Jewish businesses grew intensively. A national movement to prevent Jews from carrying out kosher slaughter of animals, with animal rights as the stated motivation, was also organized. Violence was also frequently aimed at Jewish stores, and many of them were looted. At the same time, persistent economic boycotts and harassment, including property-destroying riots, combined with the effects of the Great Depression, which had been very severe in agricultural countries like Poland, reduced the standard of living of Poles and Polish Jews alike, to the extent that by the end of the 1930s a substantial portion of Polish Jews lived in grinding poverty. As a result, on the eve of the Second World War, the Jewish community in Poland was large and vibrant internally, yet (with the exception of a few professionals) also substantially poorer and less integrated than the Jews of most of Western Europe. The main strain of antisemitism in Poland during this time was motivated by Catholic religious beliefs and centuries-old myths such as the blood libel. This religious-based antisemitism was sometimes joined with an ultra-nationalistic stereotype of Jews as disloyal to the Polish nation. On the eve of World War II, many ordinary Polish Christians believed that there were far too many Jews in the country, and the Polish government became increasingly concerned with the "Jewish question". According to the British Embassy in Warsaw, in 1936 emigration was the only solution to the Jewish question that found wide support in all Polish political parties. The Polish government condemned wanton violence against the Jewish minority, fearing international repercussions, but shared the view that the Jewish minority hindered Poland's development; in January 1937, Foreign Minister Józef Beck declared that Poland could house 500,000 Jews and hoped that over the next 30 years 80,000–100,000 Jews a year would leave Poland. As the Polish government sought to lower the number of Jews in Poland through mass emigration, it maintained close and good contacts with Ze'ev Jabotinsky, the founder of Revisionist Zionism, and pursued a policy of supporting the creation of a Jewish state in Palestine. The Polish government hoped Palestine would provide an outlet for its Jewish population, and it lobbied for the creation of a Jewish state in the League of Nations and other international venues, proposing increased emigration quotas and opposing the Partition Plan of Palestine on behalf of Zionist activists. Jabotinsky's "Evacuation Plan" envisioned the settlement in Palestine of 1.5 million East European Jews within 10 years, including 750,000 Polish Jews, so he and Beck shared a common goal. Ultimately this proved impossible and illusory, as the plan lacked both general Jewish and international support. In 1937, Polish Minister of Foreign Affairs Józef Beck declared in the League of Nations his support for the creation of a Jewish state and for an international conference to enable Jewish emigration.
The common goal of the Polish state and the Zionist movement, increased Jewish emigration to Palestine, resulted in overt and covert cooperation between them. Poland helped by organizing passports and facilitating illegal immigration, and supplied the Haganah with weapons. Poland also provided extensive support to the Irgun (the military branch of the Revisionist Zionist movement) in the form of military training and weapons. According to Irgun activists, the Polish state supplied the organisation with 25,000 rifles and additional materiel and weapons, and by the summer of 1939 the Irgun's Warsaw warehouses held 5,000 rifles and 1,000 machine guns. The training and support provided by Poland would have allowed the organisation to mobilise 30,000–40,000 men. In 1938, the Polish government revoked the Polish citizenship of tens of thousands of Polish Jews who had lived outside the country for an extended period of time, fearing that many Polish Jews living in Germany and Austria would want to return en masse to Poland to escape anti-Jewish measures there. Their property was claimed by the Polish state. By the time of the German invasion in 1939, antisemitism was escalating, and hostility towards Jews was a mainstay of the right-wing political forces of the post-Piłsudski regime and of the Catholic Church. Discrimination and violence against Jews had rendered the Polish Jewish population increasingly destitute. Despite the impending threat to the Polish Republic from Nazi Germany, there was little visible effort at reconciliation with Poland's Jewish population. In July 1939, the pro-government Gazeta Polska wrote, "The fact that our relations with the Reich are worsening does not in the least deactivate our program in the Jewish question—there is not and cannot be any common ground between our internal Jewish problem and Poland's relations with the Hitlerite Reich." Escalating hostility towards Polish Jews and an official Polish government desire to remove Jews from Poland continued until the German invasion. World War II and the destruction of Polish Jewry (1939–45) The number of Jews in Poland on 1 September 1939 amounted to about 3,474,000. One hundred thirty thousand soldiers of Jewish descent, including Boruch Steinberg, Chief Rabbi of the Polish Military, served in the Polish Army at the outbreak of the Second World War, making them among the first to launch armed resistance against Nazi Germany. During the September Campaign, some 20,000 Jewish civilians and 32,216 Jewish soldiers were killed, while 61,000 Jewish soldiers were taken prisoner by the Germans; the majority did not survive. The soldiers and non-commissioned officers who were released ultimately found themselves in the Nazi ghettos and labor camps and suffered the same fate as other Jewish civilians in the ensuing Holocaust in Poland. In 1939, Jews constituted 30% of Warsaw's population. With the coming of the war, Jewish and Polish citizens of Warsaw put their differences aside and jointly defended the city. Polish Jews later served in almost all Polish formations throughout World War II; many were killed or wounded, and many were decorated for their combat skills and exceptional service. Jews fought with the Polish Armed Forces in the West, in the Soviet-formed Polish People's Army, as well as in several underground organizations and as part of Polish partisan units or Jewish partisan formations. The Soviet Union signed the Molotov–Ribbentrop Pact with Nazi Germany on 23 August 1939; the pact contained a secret protocol providing for the partition of Poland.
The German army attacked Poland on 1 September 1939. The Soviet Union followed suit by invading eastern Poland on 17 September 1939. The days between the retreat of the Polish army and the entry of the Red Army, September 18–21, witnessed a pogrom in Grodno, in which 25 Jews were killed (the Soviets later put some of the pogromists on trial). Within weeks, 61.2% of Polish Jews found themselves under German occupation, while 38.8% were trapped in the Polish areas annexed by the Soviet Union. Jews under German occupation were immediately maltreated, beaten, publicly executed, and even burnt alive in synagogues. As a result, 350,000 Polish Jews fled from the German-occupied area to the Soviet area. Upon annexing the region, the Soviet government recognized as Soviet citizens those Jews (and other non-Poles) who were permanent residents of the area, while offering refugees the choice of either taking on Soviet citizenship or returning to their former homes. The Soviet annexation was accompanied by widespread arrests of government officials, police, military personnel, border guards, teachers, priests, judges, etc., followed by the NKVD prisoner massacres and the mass deportation of 320,000 Polish nationals to the Soviet interior and the Gulag slave labor camps, where, as a result of the inhuman conditions, about half of them died before the end of the war. Jewish refugees under the Soviet occupation had little knowledge of what was going on under the Germans, since the Soviet media did not report on events in the territories occupied by their Nazi ally. Many people from western Poland registered for repatriation back to the German zone, including wealthier Jews as well as some political and social activists from the interwar period. Synagogues and churches were not yet closed but were heavily taxed. The Soviet ruble, worth far less, was immediately set at parity with the much stronger Polish zloty, and by the end of 1939 the zloty was abolished. Most economic activity became subject to central planning and NKVD restrictions. Since the Jewish communities tended to rely more on commerce and small-scale businesses, the confiscations of property affected them to a greater degree than the general populace. Soviet rule resulted in the near collapse of the local economy, characterized by insufficient wages and a general shortage of goods and materials. The Jews, like other inhabitants of the region, saw a fall in their living standards. Under Soviet policy, ethnic Poles were dismissed from and denied access to positions in the civil service. Former senior officials and notable members of the Polish community were arrested and exiled together with their families. At the same time the Soviet authorities encouraged young Jewish communists to fill the newly emptied government and civil service jobs. While most eastern Poles rallied around anti-Soviet sentiment, a portion of the Jewish population, along with ethnic Belarusian and Ukrainian activists, welcomed the invading Soviet forces as their protectors. The general feeling among Polish Jews in the first weeks of the war was a sense of temporary relief at having escaped the Nazi occupation. The Polish poet Aleksander Wat has stated that Jews were more inclined to cooperate with the Soviets. 
Following Jan Karski's report written in 1940, historian Norman Davies claimed that among the informers and collaborators, the percentage of Jews was striking; likewise, General Władysław Sikorski estimated that 30% of them identified with the communists while engaging in provocations, preparing lists of Polish "class enemies". Other historians have indicated that the level of Jewish collaboration could well have been less than suggested. Historian Martin Dean has written that "few local Jews obtained positions of power under Soviet rule." The issue of Jewish collaboration with the Soviet occupation remains controversial. Some scholars note that while not pro-Communist, many Jews saw the Soviets as the lesser threat compared to the German Nazis. They stress that stories of Jews welcoming the Soviets on the streets, vividly remembered by many Poles from the eastern part of the country, are impressionistic and not reliable indicators of the level of Jewish support for the Soviets. Additionally, it has been noted that some ethnic Poles were as prominent as Jews in filling civil and police positions in the occupation administration, and that Jews, both civilians and members of the Polish military, suffered equally at the hands of the Soviet occupiers. Whatever initial enthusiasm for the Soviet occupation Jews might have felt soon dissipated once they felt the impact of the occupiers' suppression of Jewish societal modes of life. The tensions between ethnic Poles and Jews resulting from this period have, according to some historians, taken a toll on relations between Poles and Jews throughout the war, creating to this day an impasse in Polish–Jewish rapprochement. A number of younger Jews, often through the pro-Marxist Bund or some Zionist groups, were sympathetic to Communism and Soviet Russia, both of which had been enemies of the Polish Second Republic. As a result of these factors, they found it easy after 1939 to participate in the Soviet occupation administration in Eastern Poland, and briefly occupied prominent positions in industry, schools, local government, police and other Soviet-installed institutions. The concept of "Judeo-communism" was reinforced during the period of the Soviet occupation (see Żydokomuna). There were also Jews who assisted Poles during the Soviet occupation. Among the thousands of Polish officers killed by the Soviet NKVD in the Katyń massacre were 500–600 Jews. From 1939 to 1941 between 100,000 and 300,000 Polish Jews were deported from Soviet-occupied Polish territory into the Soviet Union. Some of them, especially Polish Communists (e.g. Jakub Berman), moved voluntarily; however, most were forcibly deported or imprisoned in the Gulag. Small numbers of Polish Jews (about 6,000) were able to leave the Soviet Union in 1942 with the Władysław Anders army, among them the future Prime Minister of Israel, Menachem Begin. During the Polish II Corps' stay in the British Mandate of Palestine, 67% (2,972) of its Jewish soldiers deserted to settle in Palestine, many joining the Irgun. General Anders decided not to prosecute the deserters and emphasized that the Jewish soldiers who remained in the force fought bravely. The cemetery of the Polish soldiers who died during the Battle of Monte Cassino includes headstones bearing the Star of David. A number of Jewish soldiers also died in the liberation of Bologna. Poland's Jewish community suffered the most in the Holocaust. 
Some six million Polish citizens perished in the war – half of them (three million Polish Jews, all but some 300,000 of the Jewish population) killed at the German extermination camps at Auschwitz, Treblinka, Majdanek, Belzec, Sobibór, and Chełmno or starved to death in the ghettos. Poland was where the German program of extermination of the Jews, the "Final Solution", was implemented, since this was where most of Europe's Jews (excluding the Soviet Union's) lived. In 1939 several hundred synagogues were blown up or burned by the Germans, who sometimes forced the Jews to do it themselves. In many cases, the Germans turned the synagogues into factories, places of entertainment, swimming pools, or prisons. By war's end, almost all the synagogues in Poland had been destroyed. Rabbis were forced to dance and sing in public with their beards shorn off. Some rabbis were set on fire or hanged. The Germans ordered that all Jews be registered, and the word "Jude" was stamped in their identity cards. Numerous restrictions and prohibitions targeting Jews were introduced and brutally enforced. For example, Jews were forbidden to walk on the sidewalks, use public transport, or enter places of leisure, sports arenas, theaters, museums and libraries. On the street, Jews had to lift their hats to passing Germans. By the end of 1941 all Jews in German-occupied Poland, except children, had to wear an identifying badge with a blue Star of David. Rabbis were humiliated in "spectacles organised by the German soldiers and police" who used their rifle butts "to make these men dance in their praying shawls." The Germans, disappointed that Poles refused to collaborate, made little attempt to set up a collaborationist government in Poland; nevertheless, German tabloids printed in Polish routinely ran antisemitic articles urging local people to adopt an attitude of indifference towards the Jews. Following Operation Barbarossa, many Jews in what was then Eastern Poland fell victim to the Nazi death squads called Einsatzgruppen, which massacred Jews, especially in 1941. Some of these German-inspired massacres were carried out with the help, or the active participation, of Poles themselves: for example, the Jedwabne pogrom, in which between 300 (the Institute of National Remembrance's final findings) and 1,600 Jews (Jan T. Gross) were tortured and beaten to death by members of the local population. The full extent of Polish participation in the massacres of the Polish Jewish community remains a controversial subject, in part due to Jewish leaders' refusal to allow the remains of the Jewish victims to be exhumed and their cause of death to be properly established. The Polish Institute of National Remembrance identified twenty-two other towns that had pogroms similar to Jedwabne. The reasons for these massacres are still debated, but they included antisemitism, resentment over alleged cooperation with the Soviet invaders in the Polish–Soviet War and during the 1939 invasion of the Kresy regions, greed for the possessions of the Jews, and coercion by the Nazis to participate in such massacres. Some Jewish historians have written of the negative attitudes of some Poles towards persecuted Jews during the Holocaust. While members of the Catholic clergy risked their lives to assist Jews, their efforts were sometimes made in the face of antisemitic attitudes from the church hierarchy. 
Anti-Jewish attitudes also existed in the London-based Polish Government in Exile, although on 18 December 1942 the President in exile, Władysław Raczkiewicz, wrote a dramatic letter to Pope Pius XII, begging him for a public defense of both murdered Poles and Jews. In spite of the introduction of the death penalty, which extended to the entire families of rescuers, the number of Polish Righteous Among the Nations testifies to the fact that Poles were willing to take risks in order to save Jews. Holocaust survivors' views of Polish behavior during the war span a wide range, depending on their personal experiences. Some are very negative, based on the view of Christian Poles as passive witnesses who failed to act and aid the Jews as they were being persecuted or liquidated by the Nazis. Poles, who were also victims of Nazi crimes, were often afraid for their own and their families' lives, and this fear prevented many of them from giving aid and assistance, even if some of them felt sympathy for the Jews. Emanuel Ringelblum, a Polish–Jewish historian of the Warsaw Ghetto, wrote critically of the indifferent and sometimes joyful responses in Warsaw to the destruction of Polish Jews in the Ghetto. However, Gunnar S. Paulsson stated that Polish citizens of Warsaw managed to support and hide the same percentage of Jews as did the citizens of cities in Western European countries. Paulsson's research shows that, at least as far as Warsaw is concerned, the number of Poles aiding Jews far outnumbered those who sold out their Jewish neighbors to the Nazis. During the Nazi occupation of Warsaw, 70,000–90,000 Polish gentiles aided Jews, while 3,000–4,000 were szmalcowniks, blackmailers who collaborated with the Nazis in persecuting the Jews. The German Nazis established six extermination camps throughout occupied Poland by 1942. All of these – at Chełmno (Kulmhof), Bełżec, Sobibór, Treblinka, Majdanek and Auschwitz (Oświęcim) – were located near the rail network so that the victims could be easily transported. The system of camps was expanded over the course of the German occupation of Poland and their purposes were diversified; some served as transit camps, some as forced labor camps, and the majority as death camps. While in the death camps the victims were usually killed shortly after arrival, in the other camps able-bodied Jews were worked and beaten to death. The operation of the concentration camps depended on Kapos, collaborator-prisoners. Some of them were Jewish themselves, and their prosecution after the war created an ethical dilemma. Between October 1939 and July 1942 a system of ghettos was imposed for the confinement of Jews. The Warsaw Ghetto was the largest in all of World War II, with 380,000 people crammed into an area of 1.3 sq mi (3.4 km2). The Łódź Ghetto was the second largest, holding about 160,000 prisoners. Other large Jewish ghettos in leading Polish cities included the Białystok Ghetto, the Częstochowa Ghetto, the Kielce Ghetto, the Kraków Ghetto, the Lublin Ghetto, the Lwów Ghetto in present-day Lviv, the Stanisławów Ghetto, also in present-day Ukraine, the Brześć Ghetto in present-day Belarus, and the Radom Ghetto, among others. Ghettos were also established in hundreds of smaller settlements and villages around the country. Overcrowding, dirt, lice, lethal epidemics such as typhus, and hunger all resulted in countless deaths. During the occupation of Poland, the Germans used various laws to separate ethnic Poles from Jews. 
In the ghettos, the population was separated by putting the Poles on the "Aryan side" and the Polish Jews on the "Jewish side". Any Pole found giving any help to a Jewish Pole was subject to the death penalty. Another German law forbade Poles from buying from Jewish shops; those who did were subject to execution. Many Jews tried to escape from the ghettos in the hope of finding a place to hide outside them, or of joining the partisan units. When this proved difficult, escapees often returned to the ghetto on their own. Those who were caught were murdered by the Germans, who left their bodies in plain view as a warning to others. Despite these terror tactics, attempts to escape from the ghettos continued until their liquidation. Since Nazi terror reigned throughout the Aryan districts, the chances of remaining successfully hidden depended on a fluent knowledge of the language and on having close ties with the community. Many Poles were not willing to hide Jews who had escaped the ghettos or were otherwise in hiding, out of fear for their own lives and those of their families. While the German policy towards Jews was ruthless and criminal, their policy towards Christian Poles who helped Jews was very much the same. The Germans would often murder non-Jewish Poles for small misdemeanors. Execution for help rendered to Jews, even the most basic kinds, was automatic. In any apartment block or area where Jews were found to be harboured, everybody in the house would be immediately shot by the Germans. For this, thousands of non-Jewish Poles were executed. Hiding in a Christian society to which the Jews were only partially assimilated was a daunting task. They needed to quickly acquire not only a new identity but a new body of knowledge. Many Jews spoke Polish with a distinct Yiddish or Hebrew accent and used different body language, gestures, and facial expressions. People with physical characteristics such as dark curly hair and brown eyes were particularly vulnerable. Some individuals blackmailed Jews and the non-Jewish Poles hiding them, taking advantage of their desperation by collecting money or, worse, turning them over to the Germans for a reward. The Gestapo provided a standard reward to those who informed on Jews hidden on the "Aryan" side, consisting of cash, liquor, sugar, and cigarettes. Jews were robbed and handed over to the Germans by "szmalcowniks" (the 'shmalts' people: from shmalts or szmalec, Yiddish and Polish for 'grease'). In extreme cases, Jews informed on other Jews in order to alleviate their hunger with the reward. The extortionists were condemned by the Polish Underground State. The fight against informers was organized by the Armia Krajowa (the Underground State's military arm), with death sentences meted out on a scale unknown in the occupied countries of Western Europe. To discourage Poles from giving shelter to Jews, the Germans often searched houses and introduced ruthless penalties. Poland was the only occupied country during World War II where the Nazis formally imposed the death penalty on anybody found sheltering and helping Jews. The penalty applied not only to the person who did the helping but also extended to his or her family, neighbors, and sometimes entire villages. In this way the Germans applied the principle of collective responsibility, whose purpose was to encourage neighbors to inform on each other in order to avoid punishment. 
The nature of these policies was widely known and visibly publicized by the Nazis, who sought to terrorize the Polish population. Food rations for the Poles were small (669 kcal per day in 1941) compared to other occupied nations throughout Europe, and black market prices of necessary goods were high, factors which made it difficult to hide people and almost impossible to hide entire families, especially in the cities. Despite these draconian measures imposed by the Nazis, Poland has the highest number of Righteous Among the Nations awards at the Yad Vashem Museum (6,339). The Polish Government in Exile was the first (in November 1942) to reveal the existence of Nazi-run concentration camps and the systematic extermination of the Jews by the Nazis, through its courier Jan Karski and through the activities of Witold Pilecki, a member of Armia Krajowa who was the only person to volunteer for imprisonment in Auschwitz and who organized a resistance movement inside the camp itself. One of the Jewish members of the National Council of the Polish government in exile, Szmul Zygielbojm, committed suicide to protest the indifference of the Allied governments in the face of the Holocaust in Poland. The Polish government in exile was also the only government to set up an organization (Żegota) specifically aimed at helping the Jews in Poland. The Warsaw Ghetto and its 1943 Uprising represent what is likely the best-known episode of the wartime history of the Polish Jews. The ghetto was established by the German Governor-General Hans Frank on 16 October 1940. Initially, almost 140,000 Jews were moved into the ghetto from all parts of Warsaw. At the same time, approximately 110,000 Poles were forcibly evicted from the area. The Germans selected Adam Czerniaków to take charge of the Jewish Council, called the Judenrat, made up of 24 Jewish men who were ordered to organize Jewish labor battalions as well as the Jewish Ghetto Police, which would be responsible for maintaining order within the Ghetto walls. A number of Jewish policemen were corrupt and immoral. Soon the Nazis demanded even more from the Judenrat, and the demands grew much crueler. Death was the punishment for the slightest indication of noncompliance, and when the Judenrat refused to collaborate, its members were executed and replaced by a new group. Adam Czerniaków, the head of the Warsaw Judenrat, committed suicide when he was forced to collect daily lists of Jews to be deported to the Treblinka extermination camp at the onset of Grossaktion Warsaw. The population of the ghetto reached 380,000 people by the end of 1940, about 30% of the population of Warsaw; however, the Ghetto covered only about 2.4% of the city's area. The Germans closed off the Ghetto from the outside world, completing a wall around it by 16 November 1940. During the next year and a half, Jews from smaller cities and villages were brought into the Warsaw Ghetto, while diseases (especially typhus) and starvation kept the number of inhabitants roughly constant. Average food rations in 1941 for Jews in Warsaw were limited to 253 kcal, and to 669 kcal for Poles, as opposed to 2,613 kcal for Germans. On 22 July 1942, the mass deportation of the Warsaw Ghetto inhabitants began. During the next fifty-two days (until 12 September 1942) about 300,000 people were transported by freight train to the Treblinka extermination camp. 
The Jewish Ghetto Police were ordered to escort the ghetto inhabitants to the Umschlagplatz train station. They were spared from the deportations until September 1942 in return for their cooperation, but afterwards shared the fate of their families and relatives. On 18 January 1943, a group of ghetto militants led by the right-leaning ŻZW, including some members of the left-leaning ŻOB, rose up in the first of the Warsaw Ghetto uprisings. Both organizations used arms to resist German attempts at additional deportations to Auschwitz and Treblinka. The final destruction of the Warsaw Ghetto came four months later, after the crushing of one of the most heroic and tragic battles of the war, the 1943 Warsaw Ghetto Uprising. "When we invaded the Ghetto for the first time," wrote SS commander Jürgen Stroop, "the Jews and the Polish bandits succeeded in repelling the participating units, including tanks and armored cars, by a well-prepared concentration of fire. (...) The main Jewish battle group, mixed with Polish bandits, had already retired during the first and second day to the so-called Muranowski Square. There, it was reinforced by a considerable number of Polish bandits. Its plan was to hold the Ghetto by every means in order to prevent us from invading it." (Stroop Report, 1943). The Uprising was led by the ŻOB (Jewish Combat Organization) and the ŻZW. The ŻZW (Jewish Military Union) was better supplied with arms. The ŻOB had more than 750 fighters but lacked weapons; they had only 9 rifles, 59 pistols, and several grenades. A network of bunkers and fortifications was developed. The Jewish fighters also received support from the Polish underground (Armia Krajowa). The German forces, which included 2,842 Nazi soldiers and 7,000 security personnel, were not capable of crushing the Jewish resistance in open street combat, and after several days they switched strategy, setting fire to the buildings in which the Jewish fighters hid. The commander of the ŻOB, Mordechai Anielewicz, died fighting on 8 May 1943 at the organization's command centre on 18 Mila Street. It took the Germans twenty-seven days to put down the uprising, after some very heavy fighting. The German general Jürgen Stroop stated in his report that his troops had killed 6,065 Jewish fighters during the battle. After the uprising was over, Heinrich Himmler had the Great Synagogue on Tłomackie Square (outside the ghetto) destroyed as a celebration of German victory and a symbol that the Jewish Ghetto in Warsaw was no more. A group of fighters escaped from the ghetto through the sewers and reached the Łomianki forest. About 50 ghetto fighters were saved by the Polish "People's Guard" and later formed their own partisan group, named after Anielewicz. Even after the end of the uprising, several hundred Jews continued living in the ruined ghetto. Many of them survived thanks to the contacts they managed to establish with Poles outside the ghetto. The Uprising inspired Jews throughout Poland. Many Jewish leaders who survived the liquidation continued underground work outside the ghetto. They hid other Jews, forged necessary documents, and were active in the Polish underground in other parts of Warsaw and the surrounding area. The Warsaw Ghetto Uprising was followed by other ghetto uprisings in many smaller towns and cities across German-occupied Poland. 
Many Jews were found alive in the ruins of the former Warsaw Ghetto during the general Warsaw Uprising of 1944, when the Poles themselves rose up against the Germans. Some of the survivors of the 1943 Warsaw Ghetto Uprising, still held in camps at or near Warsaw, were freed during the 1944 Warsaw Uprising, led by the Polish resistance movement Armia Krajowa, and immediately joined Polish fighters. Only a few of them survived. The Polish commander of one Jewish unit, Wacław Micuta, described them as some of the best fighters, always at the front line. It is estimated that over 2,000 Polish Jews, some as well known as Marek Edelman or Icchak Cukierman, and several dozen Greek, Hungarian, and even German Jews freed by Armia Krajowa from the Gęsiówka concentration camp in Warsaw, men and women, took part in combat against the Nazis during the 1944 Warsaw Uprising. Some 166,000 people lost their lives in the 1944 Warsaw Uprising, including perhaps as many as 17,000 Polish Jews who had either fought with the AK or had been discovered in hiding (see: Krzysztof Kamil Baczyński and Stanisław Aronson). Warsaw was razed to the ground by the Germans, and more than 150,000 Poles were sent to labor or concentration camps. On 17 January 1945, the Soviet Army entered a destroyed and nearly uninhabited Warsaw. Some 300 Jews were found hiding in the ruins in the Polish part of the city (see: Władysław Szpilman). The fate of the Warsaw Ghetto was similar to that of the other ghettos in which Jews were concentrated. With Nazi Germany's decision to begin the Final Solution, the destruction of the Jews of Europe, Aktion Reinhard began in 1942, with the opening of the extermination camps of Bełżec, Sobibór, and Treblinka, followed by Auschwitz-Birkenau, where people were killed in gas chambers and mass executions (the death wall). Many died from starvation, disease, torture, or pseudo-medical experiments. The mass deportation of Jews from the ghettos to these camps, as happened at the Warsaw Ghetto, soon followed, and more than 1.7 million Jews had been killed at the Aktion Reinhard camps by October 1943. In August 1941, the Germans ordered the establishment of a ghetto in Białystok. About 50,000 Jews from the city and the surrounding region were confined in a small area of Białystok. The ghetto had two sections, divided by the Biała River. Most Jews in the Białystok ghetto worked in forced-labor projects, primarily in large textile factories located within the ghetto boundaries. The Germans also sometimes used Jews in forced-labor projects outside the ghetto. In February 1943, approximately 10,000 Białystok Jews were deported to the Treblinka extermination camp. During the deportations, hundreds of Jews, mainly those deemed too weak or sick to travel, were killed. In August 1943, the Germans mounted an operation to destroy the Białystok ghetto. German forces and local police auxiliaries surrounded the ghetto and began to round up Jews systematically for deportation to the Treblinka extermination camp. Approximately 7,600 Jews were held in a central transit camp in the city before deportation to Treblinka. Those deemed fit to work were sent to the Majdanek camp. At Majdanek, after another screening for ability to work, they were transported to the Poniatowa, Bliżyn, or Auschwitz camps. Those deemed too weak to work were murdered at Majdanek. More than 1,000 Jewish children were sent first to the Theresienstadt ghetto in Bohemia, and then to Auschwitz-Birkenau, where they were killed. 
On 15 August 1943, the Białystok Ghetto Uprising began, and several hundred Polish Jews and members of the Anti-Fascist Military Organisation (Polish: Antyfaszystowska Organizacja Bojowa) started an armed struggle against the German troops who were carrying out the planned liquidation of the ghetto and the deportation of its inhabitants to the Treblinka extermination camp. The guerrillas were armed with only one machine gun, several dozen pistols, Molotov cocktails, and bottles filled with acid. The fighting in isolated pockets of resistance lasted for several days, but the defence was broken almost instantly. As with the earlier Warsaw Ghetto Uprising of April 1943, the Białystok uprising had no chance of military success, but it was the second-largest ghetto uprising after the Warsaw Ghetto Uprising. Several dozen guerrillas managed to break through to the forests surrounding Białystok, where they joined the partisan units of Armia Krajowa and other organisations and survived the war. Communist rule: 1945–1989 Estimates of the number of Polish Jews before the war vary from slightly under 3 million to almost 3.5 million (the last nationwide census was conducted in 1931). The number of Polish Jews who survived the Holocaust is difficult to ascertain. The majority of Polish Jewish survivors were individuals who found refuge in territories of the Soviet Union that were not overrun by the Germans and were thus safe from the Holocaust. It is estimated that between 250,000 and 800,000 Polish Jews survived the war, of whom between 50,000 and 100,000 survived in occupied Poland, the remainder having survived abroad (mostly in the Soviet Union). Following the Soviet annexation of over half of Poland at the onset of World War II, all Polish nationals, including Jews, were declared by Moscow to have become Soviet nationals regardless of birth. Also, all Polish Jews who perished in the Holocaust east of the Curzon Line were counted among the Soviet war dead. For decades to come, the Soviet authorities refused to accept the fact that thousands of Jews who remained in the USSR opted consciously and unambiguously for Polish nationality. At the end of 1944, the number of Polish Jews in the Soviet and Soviet-controlled territories was estimated at 250,000–300,000 people. Jews who had escaped to eastern Poland from areas occupied by Germany in 1939 numbered around 198,000. Over 150,000 of them were repatriated or expelled back to the new communist Poland, along with the Jewish men conscripted into the Red Army from the Kresy in 1940–1941, whose families had been murdered in the Holocaust. Some of the soldiers married women with Soviet citizenship; others agreed to paper marriages. Those who survived the Holocaust in Poland itself included Jews who were saved by Poles (mostly families with children) and those who joined the Polish or Soviet resistance movements. Some 20,000–40,000 Jews were repatriated from Germany and other countries. At its postwar peak, up to 240,000 returning Jews may have resided in Poland, mostly in Warsaw, Łódź, Kraków, Wrocław, and Lower Silesia, e.g., Dzierżoniów (where there was a significant Jewish community initially consisting of local concentration camp survivors), Legnica, and Bielawa. Following World War II, Poland became a satellite state of the Soviet Union, with its eastern regions annexed to the Union and its western borders expanded to include formerly German territories east of the Oder and Neisse rivers. 
This forced millions to relocate (see also Territorial changes of Poland immediately after World War II). Jewish survivors returning to their homes in Poland found it practically impossible to reconstruct their pre-war lives. Due to the border shifts, some Polish Jews found that their homes were now in the Soviet Union; in other cases, the returning survivors were German Jews whose homes were now under Polish jurisdiction. Jewish communities and Jewish life as it had existed were gone, and Jews who had somehow survived the Holocaust often discovered that their homes had been looted or destroyed. Some returning Jews were met with antisemitic bias in Polish employment and education administrations. Post-war labor certificates contained markings distinguishing Jews from non-Jews. The Jewish community in Szczecin filed a lengthy report of complaints regarding job discrimination. Although Jewish schools were created in the few towns with a relatively large Jewish population, many Jewish children were enrolled in Polish state schools. Some state schools, as in the town of Otwock, forbade Jewish children to enroll. In the state schools that did allow Jewish children, there were numerous accounts of beatings and persecution targeting these children. A series of violent anti-Jewish incidents took place in Poland immediately following the end of World War II in Europe, amid a period of violence and anarchy across the country caused by lawlessness and anti-communist resistance against the Soviet-backed communist takeover of Poland. The exact number of Jewish victims is a subject of debate: 327 cases are documented, and estimates by different writers range from 400 to 2,000. Jews constituted between 2 and 3% of the total number of victims of postwar violence in the country, including the Polish Jews who had managed to escape the Holocaust in the territories of Poland annexed by the Soviet Union and who returned after the border changes imposed by the Allies at the Yalta Conference. The incidents ranged from individual attacks to pogroms. The best-known case is the Kielce pogrom of 4 July 1946, in which thirty-seven Jews and two Poles were murdered. Following the investigation, the local police commander was found guilty of inaction. Nine alleged participants in the pogrom were sentenced to death; three were given lengthy prison sentences. The debate in Poland continues about the involvement of regular troops in the killings and possible Soviet influences. In a number of other instances, returning Jews met with threats, violence, and murder from their Polish neighbors, occasionally in a deliberate and organized manner. Members of the community frequently knew of these murders but turned a blind eye or held no sympathy for the victims. Jewish communities responded by reporting the violence to the Ministry of Public Administration, but were granted little assistance. As many as 1,500 Jewish heirs were murdered when attempting to reclaim property. Several causes led to the anti-Jewish violence of 1944–1947. One cause was traditional Christian antisemitism; the pogroms in Kraków (11 August 1945) and Kielce followed accusations of ritual murder. Another was gentile Polish hostility to the Communist takeover. 
Even though very few Jews lived in postwar Poland, many Poles believed they dominated the Communist authorities, a belief expressed in the term Żydokomuna (Judeo-Communism), a popular anti-Jewish stereotype. Yet another reason for Polish violence towards Jews stemmed from the fear that survivors would recover their property. After the war ended, Poland's Communist government enacted a broad program of nationalization and land reform, taking over large numbers of properties, both Polish- and Jewish-owned. As part of the reform, the Polish People's Republic enacted legislation on "abandoned property", placing severe limitations on inheritance that were not present in prewar inheritance law, for example limiting restitution to the original owners or their immediate heirs. According to Dariusz Stola, the 1945 and 1946 laws governing restitution were enacted with the restriction of Jewish restitution claims as one of their main goals. The 1946 law carried a deadline of 31 December 1947 (later extended to 31 December 1948), after which unclaimed property devolved to the Polish state; many survivors residing in the USSR or in displaced-persons camps were repatriated only after the deadline had passed. All other properties that had been confiscated by the Nazi regime were deemed "abandoned"; however, as Yechiel Weizman notes, the fact that most of Poland's Jewry had died, in conjunction with the fact that only Jewish property was officially confiscated by the Nazis, suggests that "abandoned property" was equivalent to "Jewish property". According to Łukasz Krzyżanowski, the state actively sought to gain control over a large number of "abandoned" properties. According to Krzyżanowski, this declaration of "abandoned" property can be seen as the last stage of the expropriation process that began during the German wartime occupation; by approving the status quo shaped by the German occupation authorities, the Polish authorities became "the beneficiary of the murder of millions of its Jewish citizens, who were deprived of all their property before death". A 1945 memorandum by the Joint states that "the new economic tendency of the Polish government... is against, or at least makes difficulties in, getting back the Jewish property robbed by the German authorities." Later laws, while more generous, remained mainly on paper, with an "uneven" implementation. Many of the properties previously owned by Jews were taken over by others during the war. Attempting to reclaim an occupied property often put the claimant at risk of physical harm and even death. Many who proceeded with the process were granted only possession, not ownership, of their properties; and completing the restitution process, given that most properties were already occupied, required additional, lengthy proceedings. The majority of Jewish claimants could not afford the restitution process without financial help, due to the filing costs, legal fees, and inheritance tax. While it is hard to determine the total number of successful reclamations, Michael Meng estimates that it was extremely small. In general, restitution was easier for larger organizations or well-connected individuals, and the process was also abused by criminal gangs. "Movable" property such as housewares, whether given by Jews for safekeeping or taken during the war, was rarely returned willingly; often the only recourse for a returnee seeking its recovery was the courts. Most such property was probably never returned. 
According to Jan Gross, "there was no social norm mandating the return of Jewish property, no detectable social pressure defining such behavior as the right thing to do, no informal social control mechanism imposing censure for doing otherwise." Facing violence and a difficult and expensive legal process, many returnees eventually decided to leave the country rather than attempt reclamation. Following the fall of the Soviet Union, a law was passed that allowed the Catholic Church to reclaim its properties, which it did with great success. According to Stephen Denburg, "unlike the restitution of Church property, the idea of returning property to former Jewish owners has been met with a decided lack of enthusiasm from both the general Polish population as well as the government". Decades later, reclaiming pre-war property would lead to a number of controversies, and the matter was still debated by media and scholars as of the late 2010s. Dariusz Stola notes that the issues of property in Poland are incredibly complex, and need to take into consideration the unprecedented losses of both the Jewish and Polish populations and the massive destruction caused by Nazi Germany, as well as the expansion of the Soviet Union and communism into Polish territories after the war, which dictated the property laws for the next 50 years. Poland remains "the only EU country and the only former Eastern European communist state not to have enacted [a restitution] law," but rather "a patchwork of laws and court decisions promulgated from 1945–present." As stated by Dariusz Stola, director of the POLIN Museum, "the question of restitution is in many ways connected to the question of Polish–Jewish relations, their history and remembrance, but particularly to the attitude of the Poles to the Holocaust." For a variety of reasons, the vast majority of returning Jewish survivors left Poland soon after the war ended. Many left for the West because they did not want to live under a Communist regime. Some left because of the persecution they faced in postwar Poland, or because they did not want to live where their family members had been murdered, and instead arranged to live with relatives or friends in Western democracies. Others wanted to go to the British Mandate of Palestine, soon to become the new state of Israel, especially after General Marian Spychalski signed a decree allowing Jews to leave Poland without visas or exit permits. In 1946–1947 Poland was the only Eastern Bloc country to allow free Jewish aliyah, without visas or exit permits. Britain demanded that Poland halt the exodus, but its pressure was largely unsuccessful. Between 1945 and 1948, 100,000–120,000 Jews left Poland. Their departure was largely organized by Zionist activists, including Adolf Berman and Icchak Cukierman, under the umbrella of the semi-clandestine Berihah ("Flight") organization. Berihah was also responsible for the organized Aliyah emigration of Jews from Romania, Hungary, Czechoslovakia, Yugoslavia, and Poland, totaling 250,000 survivors. In 1947, a military training camp for young Jewish volunteers to the Haganah was established in Bolków, Poland. The camp trained 7,000 soldiers, who then traveled to Palestine to fight for Israel. The boot camp existed until the end of 1948. A second wave of Jewish emigration (50,000) took place during the liberalization of the Communist regime between 1957 and 1959. 
After the 1967 Six-Day War, in which the Soviet Union supported the Arab side, the Polish communist party adopted an anti-Jewish course of action, which in the years 1968–1969 provoked the last mass migration of Jews from Poland. The Bund took part in the post-war elections of 1947 on a common ticket with the (non-communist) Polish Socialist Party (PPS) and gained its first and only parliamentary seat in its Polish history, plus several seats in municipal councils. Under pressure from the Soviet-installed communist authorities, the Bund's leaders "voluntarily" disbanded the party in 1948–1949, against the opposition of many activists. Stalinist Poland was effectively governed by the Soviet NKVD, which was against the renewal of Jewish religious and cultural life. In the years 1948–49, all remaining Jewish schools were nationalized by the communists, and Yiddish was replaced with Polish as the language of instruction. For those Polish Jews who remained, the rebuilding of Jewish life in Poland was carried out between October 1944 and 1950 by the Central Committee of Polish Jews (Centralny Komitet Żydów Polskich, CKŻP), which provided legal, educational, social care, cultural, and propaganda services. A countrywide Jewish Religious Community, led by Dawid Kahane, who served as chief rabbi of the Polish Armed Forces, functioned between 1945 and 1948, until it was absorbed by the CKŻP. Eleven independent Jewish political parties, of which eight were legal, existed until their dissolution in 1949–50. Hospitals and schools were opened in Poland by the American Jewish Joint Distribution Committee and ORT to serve Jewish communities. Some Jewish cultural institutions were established, including the Yiddish State Theater, founded in 1950 and directed by Ida Kamińska; the Jewish Historical Institute, an academic institution specializing in research on the history and culture of the Jews in Poland; and the Yiddish newspaper Folks-Shtime ("People's Voice"). Some Polish Communists of Jewish descent actively participated in the establishment of the communist regime in the People's Republic of Poland between 1944 and 1956. Hand-picked by Joseph Stalin, prominent Jews held posts in the Politburo of the Polish United Workers' Party, including Jakub Berman, head of the state security apparatus Urząd Bezpieczeństwa (UB), and Hilary Minc, responsible for establishing a Communist-style economy. Together with the hardliner Bolesław Bierut, Berman and Minc formed a triumvirate of the Stalinist leaders in postwar Poland. After 1956, during the process of de-Stalinisation in the People's Republic under Władysław Gomułka, some Jewish officials of Urząd Bezpieczeństwa, including Roman Romkowski, Jacek Różański, and Anatol Fejgin, were prosecuted and sentenced to prison terms for "power abuses", including the torture of Polish anti-fascists, Witold Pilecki among others. Yet another Jewish official, Józef Światło, after escaping to the West in 1953, exposed through Radio Free Europe the interrogation methods used by the UB, which led to its restructuring in 1954. Solomon Morel, a member of the Ministry of Public Security of Poland and commandant of the Stalinist-era Zgoda labour camp, fled Poland for Israel in 1992 to escape prosecution. 
Helena Wolińska-Brus, a former Stalinist prosecutor who emigrated to England in the late 1960s, fought extradition to Poland on charges related to the execution of the Second World War resistance hero Emil Fieldorf. Wolińska-Brus died in London in 2008. In 1967, following the Six-Day War between Israel and the Arab states, Poland's Communist government, following the Soviet lead, broke off diplomatic relations with Israel and launched an antisemitic campaign under the guise of "anti-Zionism". However, the campaign did not resonate well with the Polish public, as most Poles saw similarities between Israel's fight for survival and Poland's past struggles for independence. Many Poles also felt pride in the success of the Israeli military, which was dominated by Polish Jews. The slogan "our Jews beat the Soviet Arabs" (Nasi Żydzi pobili sowieckich Arabów) became popular in Poland. The vast majority of the 40,000 Jews in Poland by the late 1960s were completely assimilated into the broader society. However, this did not prevent them from becoming victims of a campaign, centrally organized by the Polish Communist Party with Soviet backing, which equated Jewish origins with "Zionism" and disloyalty to a Socialist Poland. In March 1968, student-led demonstrations in Warsaw (see Polish 1968 political crisis) gave Gomułka's government an excuse to try to channel public anti-government sentiment into another avenue. Thus his security chief, Mieczysław Moczar, used the situation as a pretext to launch an antisemitic press campaign (although the expression "Zionist" was officially used). The state-sponsored "anti-Zionist" campaign resulted in the removal of Jews from the Polish United Workers' Party and from teaching positions in schools and universities. In 1967–1971, under economic, political, and secret police pressure, over 14,000 Polish Jews chose to leave Poland and relinquish their Polish citizenship. Officially, it was said that they chose to go to Israel. However, only about 4,000 actually went there; most settled throughout Europe and in the United States. The leaders of the Communist party tried to stifle the ongoing protests and unrest by scapegoating the Jews. At the same time there was an ongoing power struggle within the party itself, and the antisemitic campaign was used by one faction against another. The so-called "Partisan" faction blamed the Jews who had held office during the Stalinist period for the excesses that had occurred, but the result was that most of the remaining Polish Jews, regardless of their background or political affiliation, were targeted by the communist authorities. The March 1968 events had several outcomes. The campaign damaged Poland's reputation abroad, particularly in the U.S. Many Polish intellectuals, however, were disgusted at the promotion of official antisemitism and opposed the campaign. Some of the people who emigrated to the West at this time founded organizations that encouraged anti-Communist opposition inside Poland. The first attempts to improve Polish–Israeli relations began in the mid-1970s. Poland was the first of the Eastern Bloc countries to restore diplomatic relations with Israel after they had been broken off following the Six-Day War. In 1986 partial diplomatic relations with Israel were restored, and full relations were restored in 1990, as soon as communism fell. During the late 1970s some Jewish activists were engaged in anti-Communist opposition groups. 
Most prominent among them, Adam Michnik (founder of Gazeta Wyborcza) was one of the founders of the Workers' Defence Committee (KOR). By the time of the fall of Communism in Poland in 1989, only 5,000–10,000 Jews remained in the country, many of them preferring to conceal their Jewish origin. Since 1989 With the fall of communism in Poland, Jewish cultural, social, and religious life has been undergoing a revival. Many historical issues, especially those related to World War II and the 1944–89 period, suppressed by Communist censorship, have been re-evaluated and publicly discussed (such as the Jedwabne pogrom, the Koniuchy massacre, the Kielce pogrom, the Auschwitz cross, and Polish-Jewish wartime relations in general). Jewish religious life has been revived with the help of the Ronald Lauder Foundation and the Taube Foundation for Jewish Life & Culture. Two rabbis serve the Polish Jewish community, and there are several Jewish schools and associated summer camps, as well as several periodicals and book series sponsored by the above foundations. Jewish studies programs are offered at major universities, such as Warsaw University and the Jagiellonian University. The Union of Jewish Religious Communities in Poland was founded in 1993. Its purpose is the promotion and organization of Jewish religious and cultural activities in Polish communities. Cities with synagogues include Warsaw, Kraków, Zamość, Tykocin, Rzeszów, Kielce, and Góra Kalwaria, although not many of the synagogues are still active in their original religious role. The Stara Synagoga ("Old Synagogue") in Kraków, which hosts a Jewish museum, was built in the early 15th century and is the oldest synagogue in Poland. Before the war, the Chachmei Lublin Yeshiva was Europe's largest. In 2007 it was renovated, dedicated, and reopened thanks to the efforts and endowments of Polish Jewry. Warsaw has an active synagogue, Beit Warszawa, affiliated with the Liberal-Progressive stream of Judaism. There are also several Jewish publications, although most of them are in Polish. These include Midrasz, Dos Jidische Wort (which is bilingual), the youth journal Jidele, and "Sztendlach" for young children. Active institutions include the Jewish Historical Institute, the E.R. Kaminska State Yiddish Theater in Warsaw, and the Jewish Cultural Center. The Judaica Foundation in Kraków has sponsored a wide range of cultural and educational programs on Jewish themes for a predominantly Polish audience. With funds from the city of Warsaw and the Polish government ($26 million in total), a Museum of the History of Polish Jews is being built in Warsaw. The building was designed by the Finnish architect Rainer Mahlamäki. The former extermination camps of Auschwitz-Birkenau, Majdanek and Treblinka are open to visitors. At Auschwitz the Oświęcim State Museum currently houses exhibitions on Nazi crimes, with a special section (Block Number 27) specifically focused on Jewish victims and martyrs. At Treblinka there is a monument built out of many shards of broken stone, as well as a mausoleum dedicated to those who perished there. A small mound of human ashes commemorates the 350,000 victims of the Majdanek camp who were killed there by the Nazis. 
The Jewish Cemetery in Łódź is one of the largest Jewish burial grounds in Europe, and preserved historic sites include those located in Góra Kalwaria and Leżajsk (the ohel of Elimelech of Lizhensk). The Great Synagogue in Oświęcim was excavated after testimony by a Holocaust survivor suggested that many Jewish relics and ritual objects had been buried there just before the Nazis took over the town. Candelabras, chandeliers, a menorah, and a ner tamid were found and can now be seen at the Auschwitz Jewish Center. The Warsaw Ghetto Memorial was unveiled on 19 April 1948, the fifth anniversary of the outbreak of the Warsaw Ghetto Uprising. Designed by Nathan Rapoport, it was constructed out of bronze and granite that the Nazis had intended for a monument honoring German victory over Poland. The Memorial is located where the Warsaw Ghetto used to be, at the site of a command bunker of the Jewish Combat Organization. A memorial to the victims of the Kielce pogrom of 1946, in which a mob murdered more than 40 Jews who had returned to the city after the Holocaust, was unveiled in 2006. The funds for the memorial came from the city itself and from the U.S. Commission for the Preservation of America's Heritage Abroad. Polish authors and scholars have published many works about the history of Jews in Poland. Notable among them are the Polish Academy of Sciences's Holocaust studies journal Zagłada Żydów. Studia i Materiały, as well as other publications from the Institute of National Remembrance. Recent scholarship has primarily focused on three topics: post-war antisemitism; emigration and the creation of the State of Israel; and the restitution of property. There have been a number of Holocaust remembrance activities in Poland in recent years. The United States Department of State documents that: In September 2000, dignitaries from Poland, Israel, the United States, and other countries (including Prince Hassan of Jordan) gathered in the city of Oświęcim (Auschwitz) to commemorate the opening of the refurbished Chevra Lomdei Mishnayot synagogue and the Auschwitz Jewish Center. The synagogue, the sole synagogue in Oświęcim to survive World War II, and an adjacent Jewish cultural and educational center provide visitors a place to pray and to learn about the active pre-World War II Jewish community that existed in Oświęcim. The synagogue was the first communal property in the country to be returned to the Jewish community under the 1997 law allowing for restitution of Jewish communal property. The March of the Living is an annual event, held each April since 1988, commemorating the victims of the Holocaust. It proceeds from Auschwitz to Birkenau and is attended by many people from Israel, Poland, and other countries. The marchers honor Holocaust Remembrance Day as well as Israel Independence Day. An annual festival of Jewish culture, one of the biggest festivals of Jewish culture in the world, takes place in Kraków. In 2006, Poland's Jewish population was estimated at approximately 20,000, most living in Warsaw, Wrocław, Kraków, and Bielsko-Biała, though there are no census figures that would give an exact number. According to the Polish Moses Schorr Centre and other Polish sources, however, this may be an undercount of the actual number of Jews living in Poland, since many are not religious. There are also people with Jewish roots who do not possess adequate documentation to confirm it, due to various historical and family complications. 
Poland is currently easing the way for Jews who left during the Communist-organized mass expulsion of 1968 to regain their citizenship. Some 15,000 Polish Jews were deprived of their citizenship in the 1968 Polish political crisis. On 17 June 2009 the future Museum of the History of Polish Jews in Warsaw launched a bilingual Polish-English website called "The Virtual Shtetl", providing information about Jewish life in Poland. In 2013, the POLIN Museum of the History of Polish Jews opened; it is one of the world's largest Jewish museums. As of 2019 another museum, the Warsaw Ghetto Museum, is under construction and is intended to open in 2023. Numbers of Jews in Poland since 1920 Most sources other than YIVO give a larger number of Jews living in contemporary Poland. In the 2011 Polish census, 7,508 Polish citizens declared their nationality as "Jewish", a big increase from just 1,133 in the previous census of 2002. In the 2021 Polish census, a total of 17,156 people declared their ethnicity as Jewish, once again a big increase over the previous census. The voivodeships with the largest numbers of Jews are Masovian, Lesser Poland, and Silesian. There are likely more people of Jewish ancestry living in Poland who do not actively identify as Jewish. According to the Moses Schorr Centre, there are 100,000 Jews living in Poland who do not actively practice Judaism and do not list "Jewish" as their nationality. The Jewish Renewal in Poland organization estimates that there are 200,000 "potential Jews" in Poland. The American Jewish Joint Distribution Committee and the Jewish Agency for Israel estimate that there are between 25,000 and 100,000 Jews living in Poland, a figure similar to that estimated by Jonathan Ornstein, head of the Jewish Community Center in Kraków (between 20,000 and 100,000). |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Birthday#cite_ref-36] | [TOKENS: 4101] |
Contents Birthday A birthday is the anniversary of the birth of a person or the figurative birth of an institution. Birthdays of people are celebrated in numerous cultures, often with birthday gifts, birthday cards, a birthday party, or a rite of passage. Many religions celebrate the birth of their founders or religious figures with special holidays (e.g. Christmas, Mawlid, Buddha's Birthday, Krishna Janmashtami, and Gurpurb). There is a distinction between birthday and birthdate (also known as date of birth): the former, except for February 29, occurs each year (e.g. January 15), while the latter is the complete date when a person was born (e.g. January 15, 2001). Coming of age In most legal systems, one becomes a legal adult on the birthday at which one reaches the age of majority (usually between 12 and 21), and reaching age-specific milestones confers particular rights and responsibilities. At certain ages, one may become eligible to leave full-time education, become subject to military conscription or to enlist in the military, to consent to sexual intercourse, to marry with parental consent, to marry without parental consent, to vote, to run for elected office, to legally purchase (or consume) alcohol and tobacco products, to purchase lottery tickets, or to obtain a driver's licence. The age of majority is the age at which minors cease to be legally considered children and assume control over their persons, actions, and decisions, thereby terminating the legal control and responsibilities of their parents or guardians over and for them. Most countries set the age of majority at 18, though it varies by jurisdiction. Many cultures celebrate a coming-of-age birthday when a person reaches a particular year of life. Some cultures celebrate landmark birthdays in early life or old age. In many cultures and jurisdictions, if a person's real birthday is unknown (for example, if they are an orphan), their birthday may be adopted or assigned to a specific day of the year, such as January 1. Racehorses are reckoned to turn one year old on January 1 of the year following their birth in the Northern Hemisphere, and on August 1 in the Southern Hemisphere. Birthday parties In certain parts of the world, an individual's birthday is celebrated by a party featuring a specially made cake. Guests bestow presents appropriate to the individual's age. Other birthday activities may include entertainment (sometimes by a hired professional such as a clown, magician, or musician) and a special toast or speech by the birthday celebrant. The last stanza of Patty Hill's and Mildred Hill's famous song "Good Morning to You" (unofficially titled "Happy Birthday to You") is typically sung by the guests at some point in the proceedings. In some countries, a piñata takes the place of a cake. The birthday cake may be decorated with lettering and the person's age, or studded with the same number of lit candles as the age of the individual. The celebrated individual may make a silent wish and attempt to blow out the candles in one breath; if successful, superstition holds that the wish will be granted. In many cultures, the wish must be kept secret or it will not "come true". Birthdays as holidays Historically significant people's birthdays, such as those of national heroes or founders, are often commemorated by an official holiday marking the anniversary of their birth.
Some notables, particularly monarchs, have an official birthday on a fixed day of the year, which may not match the actual day of their birth but on which celebrations are held. In Mahayana Buddhism, many monasteries celebrate the anniversary of Buddha's birth, usually in a highly formal, ritualized manner: they treat the Buddha's statue as if it were the living Buddha himself, bathing and "feeding" him. Jesus Christ's traditional birthday is celebrated as Christmas Eve or Christmas Day around the world, on December 24 or 25, respectively. Because some Eastern churches use the Julian calendar, their December 25 falls on January 7 in the Gregorian calendar. These dates are traditional and have no connection with Jesus's actual birthday, which is not recorded in the Gospels. Similarly, the birthdays of the Virgin Mary and John the Baptist are liturgically celebrated on September 8 and June 24, especially in the Roman Catholic and Eastern Orthodox traditions (although for those Eastern Orthodox churches using the Julian calendar the corresponding Gregorian dates are September 21 and July 7 respectively). As with Christmas, the dates of these celebrations are traditional and probably have no connection with the actual birthdays of these individuals. Catholic saints are remembered by a liturgical feast on the anniversary of their "birth" into heaven, that is, their day of death. In Hinduism, Ganesh Chaturthi is a festival celebrating the birth of the elephant-headed deity Ganesha in extensive community celebrations and at home. Figurines of Ganesha are made for the holiday and are widely sold. Sikhs celebrate the anniversary of the birth of Guru Nanak and other Sikh gurus, known as Gurpurb. Mawlid is the anniversary of the birth of Muhammad and is celebrated on the 12th or 17th day of Rabi' al-awwal by adherents of Sunni and Shia Islam respectively; these are the two most commonly accepted dates of Muhammad's birth. However, there is much controversy regarding the permissibility of celebrating Mawlid, as some Muslims judge the custom an unacceptable practice according to Islamic tradition. In Iran, Mother's Day is celebrated on the birthday of Fatima al-Zahra, the daughter of Muhammad. Banners reading Ya Fatima ("O Fatima") are displayed on government buildings, private buildings, public streets and car windows. Religious views In Judaism, rabbis are divided over the custom of celebrating birthdays, although the majority of the faithful accept it. In the Torah, the only mention of a birthday is the celebration of Pharaoh's birthday in Egypt (Genesis 40:20). Although the birthday of Jesus of Nazareth is celebrated as a Christian holiday on December 25, the celebration of an individual person's birthday has historically been subject to theological debate. Early Christians, notes The World Book Encyclopedia, "considered the celebration of anyone's birth to be a pagan custom." Origen, in his commentary "On Levites," wrote that Christians should not only refrain from celebrating their birthdays but should regard them with disgust as a pagan custom. A saint's day was typically celebrated on the anniversary of their martyrdom or death, considered the occasion of or preparation for their entrance into Heaven or the New Jerusalem.
Ordinary folk in the Middle Ages celebrated their saint's day (the day of the saint they were named after), but the nobility celebrated the anniversary of their birth. The "Squire's Tale", one of Chaucer's Canterbury Tales, opens as King Cambuskan proclaims a feast to celebrate his birthday. In the modern era, the Catholic Church, the Eastern Orthodox Church and Protestantism, i.e. the three main branches of Christianity, as well as almost all Christian denominations, consider celebrating birthdays acceptable or at most a choice of the individual. An exception is Jehovah's Witnesses, who do not celebrate them for various reasons: in their interpretation the feast has pagan origins, was not celebrated by early Christians, is negatively portrayed in the Holy Scriptures, and has customs linked to superstition and magic. In some historically Roman Catholic and Eastern Orthodox countries, it is common to have a 'name day', otherwise known as a 'Saint's day'. It is celebrated in much the same way as a birthday, but it is held on the official day of a saint with the same Christian name as the birthday person; the difference is that one may look up a person's name day in a calendar, or easily remember common name days (for example, John or Mary). In pious traditions, the two were often made to coincide by giving a newborn the name of a saint celebrated on its day of confirmation, or more rarely on its birthday. Some are given the name of the religious feast of their christening day or birthday, for example Noel or Pascal (French for Christmas and "of Easter"); as another example, Togliatti was given Palmiro as his first name because he was born on Palm Sunday. Birthday celebration does not reflect Islamic tradition, and because of this the majority of Muslims refrain from it. Others do not object, as long as it is not accompanied by behavior contrary to Islamic tradition. A good portion of Muslims (and Arab Christians) who have emigrated to the United States and Europe celebrate birthdays as customary, especially for children, while others abstain. Hindus celebrate the birth anniversary each year on the day that falls in the same lunar or solar month as the birth (Sun Signs Nirayana System – Sourava Mana Masa) and carries the same asterism (star/nakshatra) as the date of birth; a year of age is reckoned whenever the Janma Nakshatra of that month passes. Hindus regard death as more auspicious than birth, since the person is liberated from the bondages of material society. Also, traditionally, rituals and prayers for the departed are observed on the 5th and 11th days, with many relatives gathering. Historical and cultural perspectives According to Herodotus (5th century BC), of all the days in the year, the one which the Persians celebrate most is their birthday. It was customary to furnish the table on that day with an ampler supply than usual: the richer people served an ox, horse, camel, or donkey (Greek: ὄνον) baked whole, while the poorer classes used instead the smaller kinds of cattle. On his birthday, the king anointed his head and presented gifts to the Persians. According to the law of the Royal Supper, on that day "no one should be refused a request", and the rule for drinking was "no restrictions". In ancient Rome, a birthday (dies natalis) was originally an act of religious cultivation (cultus).
A dies natalis was celebrated annually for a temple on the day of its founding, and the term is still used sometimes for the anniversary of an institution such as a university. The temple founding day might become the "birthday" of the deity housed there. March 1, for example, was celebrated as the birthday of the god Mars. Each human likewise had a natal divinity, the guardian spirit called the Genius, or sometimes the Juno for a woman, who was owed religious devotion on the day of birth, usually in the household shrine (lararium). The decoration of a lararium often shows the Genius in the role of the person carrying out the rites. A person marked their birthday with ritual acts that might include lighting an altar, saying prayers, making vows (vota), anointing and wreathing a statue of the Genius, or sacrificing to a patron deity. Incense, cakes, and wine were common offerings. Celebrating someone else's birthday was a way to show affection, friendship, or respect. In exile, the poet Ovid, though alone, celebrated not only his own birthday rite but that of his far distant wife. Birthday parties affirmed social as well as sacred ties. One of the Vindolanda tablets is an invitation to a birthday party from the wife of one Roman officer to the wife of another. Books were a popular birthday gift, sometimes handcrafted as a luxury edition or composed especially for the person honored. Birthday poems are a minor but distinctive genre of Latin literature. The banquets, libations, and offerings or gifts that were a regular part of most Roman religious observances thus became part of birthday celebrations for individuals. A highly esteemed person would continue to be celebrated on their birthday after death, in addition to the several holidays on the Roman calendar for commemorating the dead collectively. Birthday commemoration was considered so important that money was often bequeathed to a social organization to fund an annual banquet in the deceased's honor. The observance of a patron's birthday or the honoring of a political figure's Genius was one of the religious foundations for imperial cult or so-called "emperor worship." The Chinese word for "year(s) old" (t 歲, s 岁, suì) is entirely different from the usual word for "year(s)" (年, nián), reflecting the former importance of Chinese astrology and the belief that one's fate was bound to the stars imagined to be in opposition to the planet Jupiter at the time of one's birth. The importance of this duodecennial orbital cycle only survives in popular culture as the 12 animals of the Chinese zodiac, which change each Chinese New Year and may be used as a theme for some gifts or decorations. Because of the importance attached to the influence of these stars in ancient China and throughout the Sinosphere, East Asian age reckoning previously began with one at birth and then added years at each Chinese New Year, so that it formed a record of the suì one had lived through rather than of the exact amount of time from one's birth. This method—which can differ by as much as two years of age from other systems—is increasingly uncommon and is not used for official purposes in the PRC or on Taiwan, although the word suì is still used for describing age. Traditionally, Chinese birthdays—when celebrated—were reckoned using the lunisolar calendar, which varies from the Gregorian calendar by as much as a month forward or backward depending on the year. 
Celebrating the lunisolar birthday remains common on Taiwan while growing increasingly uncommon on the mainland. Birthday traditions reflect the culture's deep-seated focus on longevity and wordplay. From the homophony in some dialects between 酒 ("rice wine") and 久 (meaning "long" in the sense of time passing), osmanthus and other rice wines are traditional birthday gifts in China. Longevity noodles are another traditional food consumed on the day, although western-style birthday cakes are increasingly common among urban Chinese. Hongbaos—red envelopes stuffed with money, now especially the red 100 RMB notes—are the usual gift from relatives and close family friends for most children. Gifts for adults on their birthdays are much less common, although the birthday of each decade is a larger occasion that might prompt a large dinner and celebration. The Japanese reckoned their birthdays by the Chinese system until the Meiji Reforms, and celebrations remained uncommon or muted until after the American occupation that followed World War II. Children's birthday parties are the most important, typically celebrated with a cake, candles, and singing; adults often celebrate just with their partner. In North Korea, the Day of the Sun, Kim Il Sung's birthday, is the most important public holiday of the country, and Kim Jong Il's birthday is celebrated as the Day of the Shining Star. North Koreans are not permitted to celebrate birthdays on July 8 and December 17, the dates of the deaths of Kim Il Sung and Kim Jong Il respectively; more than 100,000 North Koreans celebrate displaced birthdays on July 9 and December 18 instead. A person born on July 8 before 1994 may change their birthday, with official recognition. South Korea was one of the last countries to use a form of East Asian age reckoning for many official purposes. Prior to June 2023, three systems were used together—"Korean ages" that start with 1 at birth and increase every January 1st with the Gregorian New Year, "year ages" that start with 0 at birth and otherwise increase the same way, and "actual ages" that start with 0 at birth and increase on each birthday (a minimal sketch of these three systems appears after this paragraph). First birthdays were nonetheless heavily celebrated, even though they usually had little to do with the child's reckoned age. In June 2023, all Korean ages were set back at least one year, and official ages are henceforth reckoned only by birthdays. In Ghana, children wake up on their birthday to a special treat called oto, a patty made from mashed sweet potato and eggs fried in palm oil. Later they have a birthday party where they usually eat stew and rice and a dish known as kelewele, fried plantain chunks. Distribution through the year Birthdays are fairly evenly distributed throughout the year, with some seasonal effects. In the United States, there tend to be more births in September and October. This may be because there is a holiday season nine months before (the human gestation period is about nine months), or because the longest nights of the year also occur nine months before in the Northern Hemisphere. However, the holidays affect birth rates more than the winter nights do: New Zealand, a Southern Hemisphere country, has the same September and October peak with no corresponding peak in March and April. The least common birthdays tend to fall around public holidays, such as Christmas, New Year's Day and fixed-date holidays such as Independence Day in the US, which falls on July 4.
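To make the three pre-2023 Korean age systems above concrete, here is a minimal C sketch. It is an illustration under simplifying assumptions (valid Gregorian dates; the function names are invented for this example), not any official computation:

#include <stdio.h>

/* "Korean age": 1 at birth, +1 every January 1. */
int korean_age(int birth_year, int now_year) {
    return now_year - birth_year + 1;
}

/* "Year age": 0 at birth, +1 every January 1. */
int year_age(int birth_year, int now_year) {
    return now_year - birth_year;
}

/* "Actual age": 0 at birth, +1 on each birthday. */
int actual_age(int by, int bm, int bd, int ny, int nm, int nd) {
    int age = ny - by;
    if (nm < bm || (nm == bm && nd < bd))  /* birthday not yet reached this year */
        age--;
    return age;
}

int main(void) {
    /* Someone born 31 December 1990, observed on 1 January 2023: */
    printf("Korean age: %d\n", korean_age(1990, 2023));           /* 34 */
    printf("Year age:   %d\n", year_age(1990, 2023));             /* 33 */
    printf("Actual age: %d\n", actual_age(1990,12,31, 2023,1,1)); /* 32 */
    return 0;
}

Run on this example birth date, the sketch prints 34, 33 and 32, showing how the systems can disagree by as much as two years, as the article notes.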
Between 1973 and 1999, September 16 was the most common birthday in the United States, and December 25 the least common (other than February 29, because of leap years). In 2011, October 5 and 6 were reported as the most frequently occurring birthdays. New Zealand's most common birthday is September 29, and the least common is December 25. The ten most common birthdays all fall within a thirteen-day period, between September 22 and October 4. The ten least common birthdays (other than February 29) are December 24–27, January 1–2, February 6, March 22, April 1, and April 25. This is based on all live births registered in New Zealand between 1980 and 2017. Positive and negative associations with culturally significant dates may influence birth rates. One study found a 5.3% decrease in spontaneous births and a 16.9% decrease in Caesarean births on Halloween, compared with dates occurring within one week before and one week after the October holiday. In contrast, on Valentine's Day there is a 3.6% increase in spontaneous births and a 12.1% increase in Caesarean births. In Sweden, 9.3% of the population is born in March and 7.3% in November, where a uniform distribution would give 8.3%. In the Gregorian calendar (a common solar calendar), February in a leap year has 29 days instead of the usual 28, so the year lasts 366 days instead of the usual 365. A person born on February 29 may be called a "leapling" or a "leaper". In common years, they usually celebrate their birthdays on February 28; in some situations, March 1 is used as the birthday in a non-leap year, since it is the day following February 28. Technically, a leapling will have fewer birthday anniversaries than their age in years. This phenomenon is exploited when a person claims to be only a quarter of their actual age, by counting their leap-year birthday anniversaries only, as the sketch after this paragraph illustrates. In Gilbert and Sullivan's 1879 comic opera The Pirates of Penzance, Frederic the pirate apprentice discovers that he is bound to serve the pirates until his 21st birthday rather than until his 21st year. For legal purposes, a person's birthday depends on how local laws count time intervals. An individual's Beddian birthday, named in tribute to firefighter Bobby Beddia, occurs during the year in which their age matches the last two digits of the year they were born. Some studies show people are more likely to die on their birthdays, with explanations including excessive drinking, suicide, cardiovascular events due to high stress or happiness, efforts to postpone death for major social events, and death certificate paperwork errors.
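As a worked example of the leapling arithmetic, this short C sketch (illustrative only; the helper names are invented) counts true February 29 anniversaries using the Gregorian leap-year rule:

#include <stdio.h>

/* Gregorian rule: divisible by 4, except century years not divisible by 400. */
int is_leap(int year) {
    return (year % 4 == 0 && year % 100 != 0) || (year % 400 == 0);
}

/* Number of February 29 anniversaries a person born on Feb 29 of birth_year
   has had by the end of current_year. */
int leap_birthdays(int birth_year, int current_year) {
    int count = 0;
    for (int y = birth_year + 1; y <= current_year; y++)
        if (is_leap(y)) count++;
    return count;
}

int main(void) {
    /* A leapling born 29 February 2000 has had only 6 true birthday
       anniversaries by 2024 (2004, 2008, 2012, 2016, 2020, 2024),
       while being 24 years old -- roughly a quarter of their age. */
    printf("%d\n", leap_birthdays(2000, 2024));  /* prints 6 */
    return 0;
}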
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Expression_(computer_science)] | [TOKENS: 696] |
Contents Expression (computer science) In computer science, an expression is a syntactic entity in a programming language that may be evaluated to determine its value of a specific semantic type. It is a combination of one or more constants, variables, functions, and operators that the programming language interprets (according to its particular rules of precedence and of association) and computes to produce ("to return", in a stateful environment) another value. In simple settings, the resulting value is usually one of various primitive types, such as string, boolean, or numerical (such as integer, floating-point, or complex). Expressions are often contrasted with statements—syntactic entities that have no value (an instruction). Definition As in mathematics, an expression is used to denote a value to be evaluated, of a specific value type supported by a programming language. In some cases an expression cannot be fully evaluated; the value is then undefined, even though the calculation was carried out and finished.: 26 The process of evaluating an expression to a well-defined value of its type is called evaluation; it can occur in different contexts, such as definition and initialization. Examples 2 + 3 is both an arithmetic and a programming expression, which evaluates to 5. A variable is an expression because it denotes a value in memory, so y + 6 is also an expression; this shows how the sub-parts of a piece of code are themselves expressions. An example of a relational expression is 4 ≠ 4, which evaluates to false. In C (and many other programming languages), "=" is considered an operator, as in mathematics; more specifically, a binary operator. Void as a result type In C and most C-derived languages, a call to a function with a void return type is a valid expression, of type void. Values of type void cannot be used, so the value of such an expression is always thrown away. Side effects and elimination In many programming languages, a function, and hence an expression containing a function, may have side effects. An expression with side effects does not normally have the property of referential transparency. In many languages (e.g. C++), an expression may be ended with a semicolon (;) to turn it into an expression statement. This asks the implementation to evaluate the expression for its side effects only and to disregard its result (e.g. x+1;), which is useful only when the expression induces side effects (e.g. y=x+1; or func1(func2());). The formal notion of a side effect is a change to the abstract state of the running program. Another class of side effects are changes to the concrete state of the computational system, such as loading data into cache memories. Languages that are often described as "side effect–free" will generally still have concrete side effects that can be exploited, for example, in side-channel attacks. Furthermore, the elapsed time in evaluating an expression (even one with no other apparent side effects) is sometimes essential to the correct operation of a system, as behaviour in time is easily visible from outside the evaluation environment by other parts of the system with which it interacts, and might even be regarded as the primary effect, such as when performing benchmark testing.
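The examples above can be gathered into a minimal, compilable C sketch (an illustration written for this article, not code from any particular source); note that C spells inequality as != and treats assignment itself as a value-yielding expression:

#include <stdio.h>

void greet(void) {    /* a call to this function is an expression of type void */
    printf("hello\n");
}

int main(void) {
    int x = 2 + 3;    /* "2 + 3" is an expression evaluating to 5 */
    int y;
    y = x + 6;        /* "x + 6" is an expression; so is the whole assignment,
                         since "=" is a binary operator whose result is the
                         value assigned (here 11) */
    int b = (4 != 4); /* a relational expression, evaluating to 0, i.e. false */
    greet();          /* void expression: its (absent) value cannot be used */
    x + 1;            /* a legal expression statement with no side effect; its
                         result is discarded, so compilers typically warn */
    printf("x=%d y=%d b=%d\n", x, y, b);
    return 0;
}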
Whether an expression with no abstract side effects can legally be eliminated from the execution path by the processing environment in which it is evaluated depends on the particular programming language specification.
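As a hedged C illustration of this point (assuming a conforming, optimizing C compiler; the variable names are invented): the first statement below produces no abstract side effect and may be eliminated, whereas C defines accesses to volatile objects as side effects, so the volatile read is expected to be preserved:

#include <stdio.h>

volatile int sensor;  /* accesses to a volatile object count as side effects */

int main(void) {
    int a = 1, b = 2;
    a + b;     /* no abstract side effect: an implementation may legally
                  drop this evaluation from the execution path */
    sensor;    /* also an expression statement, but the volatile read it
                  performs is observable and should actually happen */
    printf("%d\n", a + b);  /* output is a side effect that keeps this call */
    return 0;
}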
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/File:Mars_surface_map.png] | [TOKENS: 140] |
File:Mars surface map.png Summary: own work, designed and rendered with Blender using NASA data from https://astrogeology.usgs.gov/search/map/mars_viking_global_color_mosaic_925m Licensing: CC0 (Creative Commons Zero, Public Domain Dedication), http://creativecommons.org/publicdomain/zero/1.0/deed.en
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Harlequinade] | [TOKENS: 2909] |
Contents Harlequinade Harlequinade is an English comic theatrical genre, defined by the Oxford English Dictionary as "that part of a pantomime in which the harlequin and clown play the principal parts". It developed in England between the 17th and mid-19th centuries. It was originally a slapstick adaptation or variant of the commedia dell'arte, which originated in Italy and reached its apogee there in the 16th and 17th centuries. The story of the harlequinade revolves around a comic incident in the lives of its five main characters: Harlequin, who loves Columbine; Columbine's greedy and foolish father Pantaloon (evolved from the character Pantalone), who tries to separate the lovers in league with the mischievous Clown; and the servant Pierrot; the plot usually involves chaotic chase scenes with a bumbling policeman. Originally a mime (silent) act with music and stylised dance, the harlequinade later employed some dialogue, but it remained primarily a visual spectacle. Early in its development, it achieved great popularity as the comic closing part of a longer evening of entertainment, following a more serious presentation with operatic and balletic elements. An often elaborate magical transformation scene, presided over by a fairy, connected the unrelated stories, changing the first part of the pantomime, and its characters, into the harlequinade. In the late 18th and 19th centuries, the harlequinade became the larger part of the entertainment, and the transformation scene was presented with increasingly spectacular stage effects. The harlequinade lost popularity towards the end of the 19th century and disappeared altogether in the 1930s, although Christmas pantomimes continue to be presented in Britain without the harlequinade. History During the 16th century, commedia dell'arte spread from Italy throughout Europe, and by the 17th century adaptations of its characters were familiar in English plays. In English versions, harlequinades differed in two important respects from the commedia original. First, instead of being a rogue, Harlequin became the central figure and romantic lead. Secondly, the characters did not speak; this was because of the large number of French performers who played in London following the suppression of unlicensed theatres in Paris. Although this constraint was only temporary, English harlequinades remained primarily visual, though some dialogue was later admitted. By the early years of the 18th century, "Italian night scenes" presented versions of commedia traditions in familiar London settings. From these, the standard English harlequinade developed, depicting the eloping lovers Harlequin and Columbine, pursued by the girl's foolish father, Pantaloon, and his comic servants. The basic plot remained essentially the same for more than 150 years. In the first two decades of the century, two rival London theatres, Lincoln's Inn Fields Theatre and the Theatre Royal, Drury Lane, presented productions that began seriously with classical stories containing elements of opera and ballet and ended with a comic "night scene". In 1716 John Weaver, the dancing master at Drury Lane, presented "The Loves of Mars and Venus – a new Entertainment in Dancing after the manner of the Antient Pantomimes". At Lincoln's Inn Fields, John Rich presented and performed as Harlequin in similar productions.
The theatre historian David Mayer explains the use of the "batte" or slapstick and the "transformation scene": Rich gave his Harlequin the power to create stage magic in league with offstage craftsmen who operated trick scenery. Armed with a magic sword or bat (actually a slapstick), Rich's Harlequin treated his weapon as a wand, striking the scenery to sustain the illusion of changing the setting from one locale to another. Objects, too, were transformed by Harlequin's magic bat. Rich's productions were a hit, and other producers, such as David Garrick, began producing their own pantomimes. For the rest of the century this pattern persisted in London theatres. When producers ran short of plots from Greek or Roman mythology, they turned to British folk stories, popular literature and, by 1800, nursery tales. But whatever the story shown in the first part of the entertainment, the harlequinade remained essentially the same. At the end of the first part, stage illusions were employed in a spectacular transformation scene, initiated by a fairy, turning the pantomime characters into Harlequin, Columbine and their fellows. In the early 19th century, the popular comic performer Joseph Grimaldi turned the role of Clown from "a rustic booby into the star of metropolitan pantomime". Two developments in 1800, both involving Grimaldi, greatly changed the pantomime characters: for the pantomime Peter Wilkins: or Harlequin in the Flying World, new costume designs were introduced, and Clown traded in his tatty servant's costume for a flamboyant, colourful one. In Harlequin Amulet; or, The Magick of Mona, later the same year, Harlequin was modified, becoming an increasingly stylised romantic character and leaving the mischief and chaos to Grimaldi's Clown. Clown now appeared in a range of roles, from rival suitor to household cook or nurse. Grimaldi's popularity changed the balance of the evening's entertainment, with the first, relatively serious, section soon dwindling to what Mayer calls "little more than a pretext for determining the characters who were to be transformed into those of the harlequinade." In the 19th century, theatrical presentations typically ran for four hours or more, with the pantomime and harlequinade concluding the evening after a long drama. The pantomimes had double titles describing the two unconnected stories, such as "Little Miss Muffet and Little Boy Blue, or Harlequin and Old Daddy Long-Legs". In an elaborate scene initiated by Harlequin's "slapstick", a Fairy Queen or Fairy Godmother transformed the pantomime characters into the characters of the harlequinade, who then performed the harlequinade. Throughout the 19th century, as stage machinery and technology improved, the transformation of the set became more and more spectacular. Once the transformation was complete, Clown would announce: "Here we are again!". The setting was usually a street scene containing several stage traps, trick doors and windows. Clown would jump through windows and reappear through trap doors. He would steal sausages, chickens and other props, which he would stuff into his pockets, later dividing these unfairly with an accomplice. He would grease the doorstep of a butcher's shop with butter to outwit his pursuers. Usually there was not much spoken dialogue, but much business with a "red hot poker". Harlequin would use his magic wand or staff to turn a dog into sausages and a bed into a horse trough, to the surprise of the sleeping victim.
Clown would dive into a clock face, which would show no sign of entry afterwards. The harlequinade lost popularity by the 1880s, when music hall, Victorian burlesque, comic opera and other comic entertainments dominated the British comedy stage. In pantomime, the love scenes between Harlequin and Columbine dwindled into brief displays of dancing and acrobatics, the fairy-tale opening was restored to its original pre-eminence, and by the end of the 19th century the harlequinade had become merely a brief epilogue to the pantomime. It lingered for a few decades longer but finally disappeared by the middle of the 20th century; the last harlequinade was played at the Lyceum Theatre in 1939. Characters The harlequinade characters consisted of the following five kinds of clowns, in addition to more minor characters such as a policeman: Harlequin is the comedian and romantic male lead. He is a servant and the love interest of Columbine. His everlasting high spirits and cleverness save him from the difficult situations into which his amoral behaviour leads him during the course of the harlequinade. In some versions of the original commedia dell'arte, Harlequin is able to perform magic feats. He never holds a grudge or seeks revenge. John Rich brought the British pantomime and harlequinade to great popularity in the early 18th century and became the most famous early Harlequin in England. He developed the character of Harlequin into a mischievous magician who was easily able to evade Pantaloon and his servants to woo Columbine. Harlequin used his magic batte or "slapstick" to transform the scene from the pantomime into the harlequinade and to magically change the settings to various locations during the chase scene. In 1800, at Drury Lane, in Harlequin Amulet; or, The Magick of Mona, Harlequin was modified to become "romantic and mercurial, instead of mischievous". During the 19th century, Harlequin became an increasingly stylised character who performed certain dance poses. Later in the century, Fred Payne and Harry Payne, known as the Payne Brothers, were the most famous Harlequin and Clown, respectively, of their day. Columbine (Colombina in Italian) is a lovely woman who has caught the eye of Harlequin. In the original commedia dell'arte she was variously portrayed as Pantaloon's daughter or servant; in the English harlequinade she is always Pantaloon's daughter or ward. Her role usually centres on her romantic interest in Harlequin, and her costume often includes the cap and apron of a serving girl, though (unlike the other players) not a mask. Originally a foil for Harlequin's slyness and adroit nature, Clown was a buffoon or bumpkin fool who resembled less a jester than a comical idiot. He was a lower-class character, the servant of Pantaloon, dressed in tattered servants' garb. Despite his acrobatic antics, Clown invariably slowed Pantaloon in his pursuit of the lovers. However, two developments in 1800, both involving Joseph Grimaldi, greatly changed the pantomime characters. Grimaldi starred as Clown in Charles Dibdin's 1800 pantomime Peter Wilkins: or Harlequin in the Flying World at Sadler's Wells Theatre. For this elaborate production, Dibdin introduced new costume designs. Clown's costume was "garishly colourful ... patterned with large diamonds and circles, and fringed with tassels and ruffs", instead of the tatty servant's outfit that had been used for a century. The production was a hit, and the new costume design was copied by others in London.
Later the same year, at the Theatre Royal, Drury Lane, in Harlequin Amulet; or, The Magick of Mona, Harlequin was modified, becoming "romantic and mercurial, instead of mischievous", which left Grimaldi's Clown as the "undisputed agent" of chaos. Clown became more important, embodying anarchic fun, and was no longer simply a servant of Pantaloon. Grimaldi built the character up into the central figure of the harlequinade. He developed jokes, catch-phrases and songs that were used by subsequent Clowns for decades after his retirement in 1828, and Clowns were generically called "Joey" for four generations after him. Clown became central to the transformation scene, crying "Here we are again!" and so opening the harlequinade. He then became the villain of the piece, playing elaborate, cartoonish practical jokes on policemen, soldiers, tradesmen and passers-by, tripping people with butter slides and crushing babies, with the assistance of his elderly accomplice, Pantaloon. The American George Fox, popularly known as G. L. Fox, became interested in pantomime and made Clown a popular character in the Humpty Dumpty story, with which he toured North America during the mid-19th century. In commedia dell'arte, Pantaloon (Pantalone in Italian) was a devious, greedy merchant of Venice, readily taken in by the various tricks and schemes of Harlequin. Pantaloon's costume usually included a red tight-fitting vest and breeches, slippers, a skullcap, an oversized hooked nose, and a grubby grey goatee. Pantaloon was familiar enough to London audiences for Shakespeare to refer to him at the turn of the 17th century as the exemplar of an elderly man, "the lean and slippered Pantaloon". In the English harlequinade, Pantaloon emerged as the greedy, elderly father of Columbine who tries to keep the lovers separated but is no match for Harlequin's cleverness; the antics of his servant Clown, moreover, slowed him in his pursuit of the lovers. Later, Pantaloon became Clown's assistant. Pierrot (Pedrolino) was a comic servant character, often Pantaloon's servant, his face whitened with flour. During the 17th century, the character was increasingly portrayed as stupid and awkward, a country bumpkin with oversized clothes. During the 19th century, the Pierrot character became less comic and more sentimental and romantic, as his hopeless adoration of Columbine was emphasised. Also in the 19th century, Pierrot troupes arose, with all the performers in whiteface and baggy white costumes. Adaptations Although the original commedia dell'arte characters inspired many stage works, novels and short stories, fewer works drew on the characters of the English tradition. They include Harlequin and Mother Goose, or The Golden Egg (1806) by Thomas John Dibdin and Harlequin and the Fairy's Dilemma (1904) by W. S. Gilbert.
======================================== |