[SOURCE: https://www.wired.com/about/press/] | [TOKENS: 423] |
Press Center | About | WIRED IS WHERE tomorrow is realized. It is the essential source of information and ideas that make sense of a world in constant transformation. The WIRED conversation illuminates how technology is changing every aspect of our lives—from culture to business, science to design. The breakthroughs and innovations that we uncover lead to new ways of thinking, new connections, and new industries. WIRED reaches more than 30 million people each month through WIRED.com, our digital edition, the magazine, social media, and live events. For all press inquiries, please contact: Cydney Gasthalter, Communications Manager, 1 World Trade Center, New York, NY 10007, cydney_gasthalter@condenast.com, press@WIRED.com |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Black_hole#cite_note-306] | [TOKENS: 13839] |
Black hole A black hole is an astronomical body so compact that its gravity prevents anything, including light, from escaping. Albert Einstein's theory of general relativity predicts that a sufficiently compact mass will form a black hole. The boundary of no escape is called the event horizon. In general relativity, a black hole's event horizon seals an object's fate but produces no locally detectable change when crossed. General relativity also predicts that every black hole should have a central singularity, where the curvature of spacetime is infinite. In many ways, a black hole acts like an ideal black body, as it reflects no light. Quantum field theory in curved spacetime predicts that event horizons emit Hawking radiation, with the same spectrum as a black body of a temperature inversely proportional to its mass. This temperature is of the order of billionths of a kelvin for stellar black holes, making it essentially impossible to observe directly. Objects whose gravitational fields are too strong for light to escape were first considered in the 18th century by John Michell and Pierre-Simon Laplace. In 1916, Karl Schwarzschild found the first modern solution of general relativity that would characterise a black hole. Due to his influential research, the Schwarzschild metric is named after him. David Finkelstein, in 1958, first interpreted Schwarzschild's model as a region of space from which nothing can escape. Black holes were long considered a mathematical curiosity; it was not until the 1960s that theoretical work showed they were a generic prediction of general relativity. The first black hole known was Cygnus X-1, identified by several researchers independently in 1971. Black holes typically form when massive stars collapse at the end of their life cycle. After a black hole has formed, it can grow by absorbing mass from its surroundings. 
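The inverse relationship between Hawking temperature and mass noted above can be made concrete with a short calculation. A sketch in Python (not from the source; standard CODATA values in SI units):

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J·s
c = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # gravitational constant, m³/(kg·s²)
k_B = 1.380649e-23       # Boltzmann constant, J/K
M_sun = 1.98892e30       # solar mass, kg

def hawking_temperature(mass_kg: float) -> float:
    """Black-body temperature of Hawking radiation, T = ħc³/(8πGMk_B), in kelvin."""
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

# A one-solar-mass black hole radiates at roughly 6e-8 K -- tens of
# billionths of a kelvin, far colder than the 2.7 K cosmic background.
print(f"{hawking_temperature(M_sun):.2e} K")
```

Doubling the mass halves the temperature, which is why stellar and supermassive black holes are effectively unobservable through their Hawking radiation.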
Supermassive black holes of millions of solar masses may form by absorbing other stars and merging with other black holes, or via direct collapse of gas clouds. There is consensus that supermassive black holes exist in the centres of most galaxies. The presence of a black hole can be inferred through its interaction with other matter and with electromagnetic radiation such as visible light. Matter falling toward a black hole can form an accretion disk of infalling plasma, heated by friction and emitting light. In extreme cases, this creates a quasar, some of the brightest objects in the universe. Merging black holes can also be detected by observation of the gravitational waves they emit. If other stars are orbiting a black hole, their orbits can be used to determine the black hole's mass and location. Such observations can be used to exclude possible alternatives such as neutron stars. In this way, astronomers have identified numerous stellar black hole candidates in binary systems and established that the radio source known as Sagittarius A*, at the core of the Milky Way galaxy, contains a supermassive black hole of about 4.3 million solar masses. History The idea of a body so massive that even light could not escape was first proposed in the late 18th century by English astronomer and clergyman John Michell and independently by French scientist Pierre-Simon Laplace. Both scholars proposed very large stars in contrast to the modern concept of an extremely dense object. Michell's idea, in a short part of a letter published in 1784, calculated that a star with the same density but 500 times the radius of the sun would not let any emitted light escape; the surface escape velocity would exceed the speed of light.: 122 Michell correctly hypothesized that such supermassive but non-radiating bodies might be detectable through their gravitational effects on nearby visible bodies. 
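Michell's 1784 argument can be reproduced in a few lines: the Newtonian escape velocity is v = √(2GM/R), and at fixed density the mass grows as the cube of the radius. A sketch (not from the source; modern SI values):

```python
import math

G = 6.67430e-11      # gravitational constant, m³/(kg·s²)
c = 2.99792458e8     # speed of light, m/s
M_sun = 1.98892e30   # solar mass, kg
R_sun = 6.957e8      # solar radius, m

def escape_velocity(mass_kg: float, radius_m: float) -> float:
    """Newtonian surface escape velocity, m/s."""
    return math.sqrt(2 * G * mass_kg / radius_m)

# Same density as the Sun but 500x the radius: mass scales with volume (500**3).
scale = 500
v = escape_velocity(scale**3 * M_sun, scale * R_sun)
print(v > c)  # the surface escape velocity just exceeds the speed of light
```

At fixed density the escape velocity grows linearly with radius (v ∝ R), so 500 solar radii multiplies the Sun's ~618 km/s escape velocity by 500, pushing it past c.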
In 1796, Laplace mentioned that a star could be invisible if it were sufficiently large while speculating on the origin of the Solar System in his book Exposition du Système du Monde. Franz Xaver von Zach asked Laplace for a mathematical analysis, which Laplace provided and published in a journal edited by von Zach. In 1905, Albert Einstein showed that the laws of electromagnetism would be invariant under a Lorentz transformation: they would be identical for observers travelling at different velocities relative to each other. This discovery became known as the principle of special relativity. Although the laws of mechanics had already been shown to be invariant, gravity remained yet to be included.: 19 In 1907, Einstein published a paper proposing his equivalence principle, the hypothesis that inertial mass and gravitational mass have a common cause. Using the principle, Einstein predicted the redshift and half of the lensing effect of gravity on light; the full prediction of gravitational lensing required development of general relativity.: 19 By 1915, Einstein refined these ideas into his general theory of relativity, which explained how matter affects spacetime, which in turn affects the motion of other matter. This formed the basis for black hole physics. Only a few months after Einstein published the field equations describing general relativity, astrophysicist Karl Schwarzschild set out to apply the idea to stars. He assumed spherical symmetry with no spin and found a solution to Einstein's equations.: 124 A few months after Schwarzschild, Johannes Droste, a student of Hendrik Lorentz, independently gave the same solution. At a certain radius from the center of the mass, the Schwarzschild solution became singular, meaning that some of the terms in the Einstein equations became infinite. The nature of this radius, which later became known as the Schwarzschild radius, was not understood at the time. 
Many physicists of the early 20th century were skeptical of the existence of black holes. In a 1926 popular science book, Arthur Eddington critiqued the idea of a star with mass compressed to its Schwarzschild radius as a flaw in the then-poorly-understood theory of general relativity.: 134 In 1939, Einstein himself used his theory of general relativity in an attempt to prove that black holes were impossible. His work relied on increasing pressure or increasing centrifugal force balancing the force of gravity so that the object would not collapse beyond its Schwarzschild radius. He missed the possibility that implosion would drive the system below this critical value.: 135 By the 1920s, astronomers had classified a number of white dwarf stars as too cool and dense to be explained by the gradual cooling of ordinary stars. In 1926, Ralph Fowler showed that quantum-mechanical degeneracy pressure was larger than thermal pressure at these densities.: 145 In 1931, Subrahmanyan Chandrasekhar calculated that a non-rotating body of electron-degenerate matter below a certain limiting mass is stable, and by 1934 he showed that this explained the catalog of white dwarf stars.: 151 When Chandrasekhar announced his results, Eddington pointed out that stars above this limit would radiate until they were sufficiently dense to prevent light from exiting, a conclusion he considered absurd. Eddington and, later, Lev Landau argued that some yet unknown mechanism would stop the collapse. In the 1930s, Fritz Zwicky and Walter Baade studied stellar novae, focusing on exceptionally bright ones they called supernovae. Zwicky promoted the idea that supernovae produced stars with the density of atomic nuclei—neutron stars—but this idea was largely ignored.: 171 In 1939, based on Chandrasekhar's reasoning, J. 
Robert Oppenheimer and George Volkoff predicted that neutron stars below a certain mass limit, later called the Tolman–Oppenheimer–Volkoff limit, would be stable due to neutron degeneracy pressure. Above that limit, they reasoned that either their model would not apply or that gravitational contraction would not stop.: 380 John Archibald Wheeler and two of his students resolved questions about the model behind the Tolman–Oppenheimer–Volkoff (TOV) limit. Harrison and Wheeler developed the equations of state relating density to pressure for cold matter all the way through electron degeneracy and neutron degeneracy. Masami Wakano and Wheeler then used the equations to compute the equilibrium curve for stars, relating mass to circumference. They found no additional features that would invalidate the TOV limit. This meant that the only thing that could prevent black holes from forming was a dynamic process ejecting sufficient mass from a star as it cooled.: 205 The modern concept of black holes was formulated by Robert Oppenheimer and his student Hartland Snyder in 1939.: 80 In the paper, Oppenheimer and Snyder solved Einstein's equations of general relativity for an idealized imploding star, in a model later called the Oppenheimer–Snyder model, then described the results from far outside the star. The implosion starts as one might expect: the star material rapidly collapses inward. However, as the density of the star increases, gravitational time dilation increases and the collapse, viewed from afar, seems to slow down further and further until the star reaches its Schwarzschild radius, where it appears frozen in time.: 217 In 1958, David Finkelstein identified the Schwarzschild surface as an event horizon, calling it "a perfect unidirectional membrane: causal influences can cross it in only one direction". In this sense, events that occur inside of the black hole cannot affect events that occur outside of the black hole. 
Finkelstein created a new reference frame to include the point of view of infalling observers.: 103 Finkelstein's new frame of reference allowed events at the surface of an imploding star to be related to events far away. By 1962 the two points of view were reconciled, convincing many skeptics that implosion into a black hole made physical sense.: 226 The era from the mid-1960s to the mid-1970s was the "golden age of black hole research", when general relativity and black holes became mainstream subjects of research.: 258 In this period, more general black hole solutions were found. In 1963, Roy Kerr found the exact solution for a rotating black hole. Two years later, Ezra Newman found the axisymmetric solution for a black hole that is both rotating and electrically charged. In 1967, Werner Israel found that the Schwarzschild solution was the only possible solution for a nonspinning, uncharged black hole, meaning that a Schwarzschild black hole would be defined by its mass alone. Similar identities were later found for Reissner–Nordström and Kerr black holes, defined only by their mass and their charge or spin respectively. Together, these findings became known as the no-hair theorem, which states that a stationary black hole is completely described by the three parameters of the Kerr–Newman metric: mass, angular momentum, and electric charge. At first, it was suspected that the strange mathematical singularities found in each of the black hole solutions only appeared due to the assumption that a black hole would be perfectly spherically symmetric, and therefore the singularities would not appear in generic situations where black holes would not necessarily be symmetric. This view was held in particular by Vladimir Belinski, Isaak Khalatnikov, and Evgeny Lifshitz, who tried to prove that no singularities appear in generic solutions, although they would later reverse their positions. 
However, in 1965, Roger Penrose proved that general relativity without quantum mechanics requires that singularities appear in all black holes. Astronomical observations also made great strides during this era. In 1967, Antony Hewish and Jocelyn Bell Burnell discovered pulsars and by 1969, these were shown to be rapidly rotating neutron stars. Until that time, neutron stars, like black holes, were regarded as just theoretical curiosities, but the discovery of pulsars showed their physical relevance and spurred a further interest in all types of compact objects that might be formed by gravitational collapse. Based on observations in Greenwich and Toronto in the early 1970s, Cygnus X-1, a galactic X-ray source discovered in 1964, became the first astronomical object commonly accepted to be a black hole. Work by James Bardeen, Jacob Bekenstein, Carter, and Hawking in the early 1970s led to the formulation of black hole thermodynamics. These laws describe the behaviour of a black hole in close analogy to the laws of thermodynamics by relating mass to energy, area to entropy, and surface gravity to temperature. The analogy was completed: 442 when Hawking, in 1974, showed that quantum field theory implies that black holes should radiate like a black body with a temperature proportional to the surface gravity of the black hole, predicting the effect now known as Hawking radiation. While Cygnus X-1, a stellar-mass black hole, was generally accepted by the scientific community as a black hole by the end of 1973, it would be decades before a supermassive black hole would gain the same broad recognition. Although, as early as the 1960s, physicists such as Donald Lynden-Bell and Martin Rees had suggested that powerful quasars in the center of galaxies were powered by accreting supermassive black holes, little observational proof existed at the time. 
However, the Hubble Space Telescope, launched decades later, found that supermassive black holes were not only present in these active galactic nuclei, but that supermassive black holes in the center of galaxies were ubiquitous: almost every galaxy had a supermassive black hole at its center, many of which were quiescent. In 1999, David Merritt proposed the M–sigma relation, which related the dispersion of the velocity of matter in the center bulge of a galaxy to the mass of the supermassive black hole at its core. Subsequent studies confirmed this correlation. Around the same time, based on telescope observations of the velocities of stars at the center of the Milky Way galaxy, independent groups led by Andrea Ghez and Reinhard Genzel concluded that the compact radio source in the center of the galaxy, Sagittarius A*, was likely a supermassive black hole. On 11 February 2016, the LIGO Scientific Collaboration and Virgo Collaboration announced the first direct detection of gravitational waves, named GW150914, representing the first observation of a black hole merger. At the time of the merger, the black holes were approximately 1.4 billion light-years away from Earth and had masses of 30 and 35 solar masses.: 6 In 2017, Rainer Weiss, Kip Thorne, and Barry Barish, who had spearheaded the project, were awarded the Nobel Prize in Physics for their work. Since the initial discovery in 2015, hundreds more gravitational waves have been observed by LIGO and another interferometer, Virgo. On 10 April 2019, the first direct image of a black hole and its vicinity was published, following observations made by the Event Horizon Telescope (EHT) in 2017 of the supermassive black hole in Messier 87's galactic centre. In 2022, the Event Horizon Telescope collaboration released an image of the black hole in the center of the Milky Way galaxy, Sagittarius A*; the data had been collected in 2017. In 2020, the Nobel Prize in Physics was awarded for work on black holes. 
Andrea Ghez and Reinhard Genzel shared one-half for their discovery that Sagittarius A* is a supermassive black hole. Penrose received the other half for his work showing that the mathematics of general relativity requires the formation of black holes. Cosmologists lamented that Hawking's extensive theoretical work on black holes would not be honored since he died in 2018. In December 1967, a student reportedly suggested the phrase black hole at a lecture by John Wheeler; Wheeler adopted the term for its brevity and "advertising value", and Wheeler's stature in the field ensured it quickly caught on, leading some to credit Wheeler with coining the phrase. However, the term was used by others around that time. Science writer Marcia Bartusiak traces the term black hole to physicist Robert H. Dicke, who in the early 1960s reportedly compared the phenomenon to the Black Hole of Calcutta, notorious as a prison where people entered but never left alive. The term was used in print by Life and Science News magazines in 1963, and by science journalist Ann Ewing in her article "'Black Holes' in Space", dated 18 January 1964, which was a report on a meeting of the American Association for the Advancement of Science held in Cleveland, Ohio. Definition A black hole is generally defined as a region of spacetime from which no information-carrying signals or objects can escape. However, verifying an object as a black hole by this definition would require waiting for an infinite time and at an infinite distance from the black hole to verify that indeed, nothing has escaped, and thus cannot be used to identify a physical black hole. Broadly, physicists do not have a precisely-agreed-upon definition of a black hole. Among astrophysicists, a black hole is a compact object with a mass larger than four solar masses. A black hole may also be defined as a reservoir of information: 142 or a region where space is falling inwards faster than the speed of light. 
Properties The no-hair theorem postulates that, once it achieves a stable condition after formation, a black hole has only three independent physical properties: mass, electric charge, and angular momentum; the black hole is otherwise featureless. If the conjecture is true, any two black holes that share the same values for these properties, or parameters, are indistinguishable from one another. The degree to which the conjecture is true for real black holes is currently an unsolved problem. The simplest static black holes have mass but neither electric charge nor angular momentum. According to Birkhoff's theorem, these Schwarzschild black holes are the only vacuum solution that is spherically symmetric. Solutions describing more general black holes also exist. Non-rotating charged black holes are described by the Reissner–Nordström metric, while the Kerr metric describes a non-charged rotating black hole. The most general stationary black hole solution known is the Kerr–Newman metric, which describes a black hole with both charge and angular momentum. Contrary to the popular notion of a black hole "sucking in everything" in its surroundings, from far away, the external gravitational field of a black hole is identical to that of any other body of the same mass. While a black hole can theoretically have any positive mass, the charge and angular momentum are constrained by the mass. The total electric charge Q and the total angular momentum J are expected to satisfy the inequality Q²/(4πε₀) + c²J²/(GM²) ≤ GM² for a black hole of mass M. Black holes with the maximum possible charge or spin satisfying this inequality are called extremal black holes. Solutions of Einstein's equations that violate this inequality exist, but they do not possess an event horizon. 
These are so-called naked singularities that can be observed from the outside. Because these singularities make the universe inherently unpredictable, many physicists believe they could not exist. The weak cosmic censorship hypothesis, proposed by Sir Roger Penrose, rules out the formation of such singularities when they are created through the gravitational collapse of realistic matter. However, this hypothesis has not yet been proven, and some physicists believe that naked singularities could exist. It is also unknown whether black holes could even reach the extremal limit, beyond which naked singularities would form, since natural processes counteract increasing spin and charge when a black hole becomes near-extremal. The total mass of a black hole can be estimated by analyzing the motion of objects near the black hole, such as stars or gas. All black holes spin, often rapidly. One stellar-mass black hole, GRS 1915+105, has been estimated to spin at over 1,000 revolutions per second. The Milky Way's central black hole Sagittarius A* rotates at about 90% of the maximum rate. The spin rate can be inferred from measurements of atomic spectral lines in the X-ray range. As gas near the black hole plunges inward, high-energy X-ray emission from electron-positron pairs illuminates the gas further out, appearing red-shifted due to relativistic effects. Depending on the spin of the black hole, this plunge happens at different radii from the hole, with different degrees of redshift. Astronomers can use the gap between the X-ray emission of the outer disk and the redshifted emission from plunging material to determine the spin of the black hole. A newer way to estimate spin is based on the temperature of gases accreting onto the black hole. The method requires an independent measurement of the black hole mass and inclination angle of the accretion disk followed by computer modeling. 
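The extremality bound quoted under Properties, Q²/(4πε₀) + c²J²/(GM²) ≤ GM², separates black holes from naked singularities. A small sketch of checking it (not from the source; SI units, and Sagittarius A*'s reported ~90%-of-maximum spin used as an illustrative input):

```python
import math

G = 6.67430e-11          # gravitational constant, m³/(kg·s²)
c = 2.99792458e8         # speed of light, m/s
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
M_sun = 1.98892e30       # solar mass, kg

def has_horizon(M: float, Q: float = 0.0, J: float = 0.0) -> bool:
    """True if mass M, charge Q, and angular momentum J satisfy the bound
    Q²/(4πε₀) + c²J²/(GM²) ≤ GM²; equality would be an extremal black hole."""
    return Q**2 / (4 * math.pi * eps0) + c**2 * J**2 / (G * M**2) <= G * M**2

M_sgrA = 4.3e6 * M_sun        # Sagittarius A*, ~4.3 million solar masses
J_max = G * M_sgrA**2 / c     # uncharged spin limit, J ≤ GM²/c
print(has_horizon(M_sgrA, J=0.9 * J_max))  # spinning at ~90% of maximum
print(has_horizon(M_sgrA, J=1.1 * J_max))  # beyond extremal: no horizon
```

For an uncharged hole the bound reduces to J ≤ GM²/c, i.e. a dimensionless spin cJ/(GM²) between 0 and 1, which is the spin parameter astronomers report.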
Gravitational waves from coalescing binary black holes can also provide the spin of both progenitor black holes and the merged hole, but such events are rare. A spinning black hole has angular momentum. The supermassive black hole in the center of the Messier 87 (M87) galaxy appears to have an angular momentum very close to the maximum theoretical value. That uncharged limit is J ≤ GM²/c, allowing definition of a dimensionless spin magnitude such that 0 ≤ cJ/(GM²) ≤ 1. Most black holes are believed to have an approximately neutral charge. For example, Michal Zajaček, Arman Tursunov, Andreas Eckart, and Silke Britzen found the electric charge of Sagittarius A* to be at least ten orders of magnitude below the theoretical maximum. A charged black hole repels other like charges just like any other charged object. If a black hole were to become charged, particles with an opposite sign of charge would be pulled in by the extra electromagnetic force, while particles with the same sign of charge would be repelled, neutralizing the black hole. This effect may not be as strong if the black hole is also spinning. The presence of charge can reduce the diameter of the black hole by up to 38%. The charge Q for a nonspinning black hole is bounded by Q ≤ √G M, where G is the gravitational constant and M is the black hole's mass. Classification Black holes can have a wide range of masses. The minimum mass of a black hole formed by stellar gravitational collapse is governed by the maximum mass of a neutron star and is believed to be approximately two to four solar masses. However, theoretical primordial black holes, believed to have formed soon after the Big Bang, could be far smaller, with masses as little as 10⁻⁵ grams at formation. These very small black holes are sometimes called micro black holes. 
Black holes formed by stellar collapse are called stellar black holes. Estimates of their maximum mass at formation vary, but generally range from 10 to 100 solar masses, with higher estimates for black holes formed from low-metallicity stars. The mass of a black hole formed via a supernova has a lower bound: if the progenitor star is too small, the collapse may be stopped by the degeneracy pressure of the star's constituents, allowing the condensation of matter into an exotic denser state. Degeneracy pressure arises from the Pauli exclusion principle: identical particles resist being forced into the same quantum state. Smaller progenitor stars, with masses less than about 8 M☉, will be held together by the degeneracy pressure of electrons and will become a white dwarf. For more massive progenitor stars, electron degeneracy pressure is no longer strong enough to resist the force of gravity and the star will be held together by neutron degeneracy pressure, which can occur at much higher densities, forming a neutron star. If the star is still too massive, even neutron degeneracy pressure will not be able to resist the force of gravity and the star will collapse into a black hole.: 5.8 Stellar black holes can also gain mass via accretion of nearby matter, often from a companion object such as a star. Black holes that are larger than stellar black holes but smaller than supermassive black holes are called intermediate-mass black holes, with masses of approximately 10² to 10⁵ solar masses. These black holes seem to be rarer than their stellar and supermassive counterparts, with relatively few candidates having been observed. Physicists have speculated that such black holes may form from collisions in globular and star clusters or at the center of low-mass galaxies. They may also form as the result of mergers of smaller black holes, with several LIGO observations finding merged black holes within the 110–350 solar mass range. 
The black holes with the largest masses are called supermassive black holes, with masses more than 10⁶ times that of the Sun. These black holes are believed to exist at the centers of almost every large galaxy, including the Milky Way. Some scientists have proposed a subcategory of even larger black holes, called ultramassive black holes, with masses greater than 10⁹–10¹⁰ solar masses. Theoretical models predict that the accretion disc that feeds black holes will be unstable once a black hole reaches 50–100 billion times the mass of the Sun, setting a rough upper limit to black hole mass. Structure While black holes are conceptually invisible sinks of all matter and light, in astronomical settings, their enormous gravity alters the motion of surrounding objects and pulls nearby gas inwards at near-light speed, making the regions around black holes among the brightest objects in the universe. Some black holes have relativistic jets—thin streams of plasma travelling away from the black hole at more than one-tenth of the speed of light. A small fraction of the matter falling towards the black hole gets accelerated away along the hole's rotation axis. These jets can extend as far as millions of parsecs from the black hole itself. Black holes of any mass can have jets. However, they are typically observed around spinning black holes with strongly-magnetized accretion disks. Relativistic jets were more common in the early universe, when galaxies and their corresponding supermassive black holes were rapidly gaining mass. All black holes with jets also have an accretion disk, but the jets are usually brighter than the disk. Quasars, typically found in other galaxies, are believed to be supermassive black holes with jets; microquasars are believed to be stellar-mass objects with jets, typically observed in the Milky Way. The mechanism of formation of jets is not yet known, but several options have been proposed. 
One method proposed to fuel these jets is the Blandford–Znajek process, which suggests that the dragging of magnetic field lines by a black hole's rotation could launch jets of matter into space. The Penrose process, which involves extraction of a black hole's rotational energy, has also been proposed as a potential mechanism of jet propulsion. Due to conservation of angular momentum, gas falling into the gravitational well created by a massive object will typically form a disk-like structure around the object.: 242 As the disk's angular momentum is transferred outward due to internal processes, its matter falls farther inward, converting its gravitational energy into heat and releasing a large flux of X-rays. The temperature of these disks can range from thousands to millions of kelvin, and temperatures can differ throughout a single accretion disk. Accretion disks can also emit in other parts of the electromagnetic spectrum, depending on the disk's turbulence and magnetization and the black hole's mass and angular momentum. Accretion disks can be defined as geometrically thin or geometrically thick. Geometrically thin disks are mostly confined to the black hole's equatorial plane and have a well-defined edge at the innermost stable circular orbit (ISCO), while geometrically thick disks are supported by internal pressure and temperature and can extend inside the ISCO. Disks with high rates of electron scattering and absorption, appearing bright and opaque, are called optically thick; optically thin disks are more translucent and produce fainter images when viewed from afar. Accretion disks of black holes accreting beyond the Eddington limit are often referred to as Polish doughnuts due to their thick, toroidal shape that resembles that of a doughnut. Quasar accretion disks are expected to usually appear blue in color. The disk for a stellar black hole, on the other hand, would likely look orange, yellow, or red, with its inner regions being the brightest. 
Theoretical research suggests that the hotter a disk is, the bluer it should be, although this is not always supported by observations of real astronomical objects. Accretion disk colors may also be altered by the Doppler effect, with the part of the disk travelling towards an observer appearing bluer and brighter and the part of the disk travelling away from the observer appearing redder and dimmer. In Newtonian gravity, test particles can stably orbit at arbitrary distances from a central object. In general relativity, however, there exists a smallest possible radius at which a massive particle can orbit stably. Any infinitesimal inward perturbations to this orbit will lead to the particle spiraling into the black hole, and any outward perturbations will, depending on the energy, cause the particle to spiral in, move to a stable orbit further from the black hole, or escape to infinity. This orbit is called the innermost stable circular orbit, or ISCO. The location of the ISCO depends on the spin of the black hole and the spin of the particle itself. In the case of a Schwarzschild black hole (spin zero) and a particle without spin, the location of the ISCO is r_ISCO = 3 r_s = 6GM/c², where r_s is the Schwarzschild radius of the black hole, G is the gravitational constant, and c is the speed of light. The radius of this orbit changes slightly based on particle spin. For charged black holes, the ISCO moves inwards. For spinning black holes, the ISCO is moved inwards for particles orbiting in the same direction that the black hole is spinning (prograde) and outwards for particles orbiting in the opposite direction (retrograde). 
For example, the ISCO for a particle orbiting retrograde can be as far out as about 9 r_s, while the ISCO for a particle orbiting prograde can be as close as the event horizon itself. The photon sphere is a spherical boundary for which photons moving on tangents to that sphere are bent completely around the black hole, possibly orbiting multiple times. Light rays with impact parameters less than the radius of the photon sphere enter the black hole. For Schwarzschild black holes, the photon sphere has a radius 1.5 times the Schwarzschild radius; the radius for non-Schwarzschild black holes is at least 1.5 times the radius of the event horizon. When viewed from a great distance, the photon sphere creates an observable black hole shadow. Since no light emerges from within the black hole, this shadow is the limit for possible observations.: 152 The shadow of colliding black holes should have characteristic warped shapes, allowing scientists to detect black holes that are about to merge. While light can still escape from the photon sphere, any light that crosses the photon sphere on an inbound trajectory will be captured by the black hole. Therefore, any light that reaches an outside observer from the photon sphere must have been emitted by objects between the photon sphere and the event horizon. Light emitted towards the photon sphere may also curve around the black hole and return to the emitter. For a rotating, uncharged black hole, the radius of the photon sphere depends on the spin parameter and whether the photon is orbiting prograde or retrograde. For a photon orbiting prograde, the photon sphere will lie between 1 and 3 Schwarzschild radii from the center of the black hole, while for a photon orbiting retrograde, it will lie between 3 and 5 Schwarzschild radii from the center. The exact location of the photon sphere depends on the magnitude of the black hole's rotation.
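For the non-rotating case, the photon sphere and the apparent shadow can be computed directly. A sketch, assuming the standard Schwarzschild results (photon sphere at 1.5 r_s; shadow radius equal to the critical impact parameter, (3√3/2) r_s ≈ 2.6 r_s):

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8       # speed of light, m/s

def schwarzschild_radius(mass_kg):
    """r_s = 2GM/c^2."""
    return 2 * G * mass_kg / c**2

def photon_sphere_radius(mass_kg):
    """Photon sphere of a non-rotating, uncharged black hole: 1.5 r_s."""
    return 1.5 * schwarzschild_radius(mass_kg)

def shadow_radius(mass_kg):
    """Apparent shadow radius seen from far away: the critical impact
    parameter (3*sqrt(3)/2) r_s, larger than the photon sphere itself
    because light bending magnifies the capture region."""
    return 1.5 * math.sqrt(3) * schwarzschild_radius(mass_kg)
```

The shadow is thus √3 ≈ 1.73 times wider than the photon sphere, which is why observed black hole shadows appear larger than the horizon itself.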
For a charged, nonrotating black hole, there will only be one photon sphere, and the radius of the photon sphere will decrease for increasing black hole charge. For non-extremal, charged, rotating black holes, there will always be two photon spheres, with the exact radii depending on the parameters of the black hole. Near a rotating black hole, spacetime rotates like a vortex. The rotating spacetime will drag any matter and light into rotation around the spinning black hole. This effect of general relativity, called frame dragging, gets stronger closer to the spinning mass. The region of spacetime in which it is impossible to stay still is called the ergosphere. The ergosphere of a black hole is a volume bounded by the black hole's event horizon and the ergosurface, which coincides with the event horizon at the poles but bulges out from it around the equator. Matter and radiation can escape from the ergosphere. Through the Penrose process, objects can emerge from the ergosphere with more energy than they entered with. The extra energy is taken from the rotational energy of the black hole, slowing down the rotation of the black hole.: 268 A variation of the Penrose process in the presence of strong magnetic fields, the Blandford–Znajek process, is considered a likely mechanism for the enormous luminosity and relativistic jets of quasars and other active galactic nuclei. The observable region of spacetime around a black hole closest to its event horizon is called the plunging region. In this area it is no longer possible for free-falling matter to follow circular orbits or stop a final descent into the black hole. Instead, it will rapidly plunge toward the black hole at close to the speed of light, growing increasingly hot and producing a characteristic, detectable thermal emission. However, light and radiation emitted from this region can still escape from the black hole's gravitational pull.
For a nonspinning, uncharged black hole, the radius of the event horizon, or Schwarzschild radius, is proportional to the mass, M, through r_s = 2GM/c² ≈ 2.95 (M/M☉) km, where r_s is the Schwarzschild radius and M☉ is the mass of the Sun.: 124 For a black hole with nonzero spin or electric charge, the radius is smaller,[Note 1] until an extremal black hole could have an event horizon close to r₊ = GM/c², half the radius of a nonspinning, uncharged black hole of the same mass. Since the volume within the Schwarzschild radius increases with the cube of the radius, the average density of a black hole inside its Schwarzschild radius is inversely proportional to the square of its mass: supermassive black holes are much less dense than stellar black holes. The average density of a 10⁸ M☉ black hole is comparable to that of water. The defining feature of a black hole is the existence of an event horizon, a boundary in spacetime through which matter and light can pass only inward towards the center of the black hole. Nothing, not even light, can escape from inside the event horizon. The event horizon is referred to as such because if an event occurs within the boundary, information from that event cannot reach or affect an outside observer, making it impossible to determine whether such an event occurred.: 179 For non-rotating black holes, the geometry of the event horizon is precisely spherical, while for rotating black holes, the event horizon is oblate.
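The inverse-square scaling of average density follows directly from the formula for r_s. A short sketch (constants rounded; "density" here means mass divided by the Euclidean volume inside r_s, the usual convention for this comparison):

```python
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
M_SUN = 1.989e30    # mass of the Sun, kg

def schwarzschild_radius(mass_kg):
    """r_s = 2GM/c^2, about 2.95 km per solar mass."""
    return 2 * G * mass_kg / c**2

def mean_density(mass_kg):
    """Mass over the Euclidean volume inside r_s; since r_s grows
    linearly with M, this falls off as 1/M^2."""
    r = schwarzschild_radius(mass_kg)
    return mass_kg / ((4 / 3) * math.pi * r**3)

# A 1e8 solar-mass supermassive black hole:
print(f"{mean_density(1e8 * M_SUN):.0f} kg/m^3")  # ~1800 kg/m^3, the same order as water
```

By contrast, a 1 M☉ black hole by the same measure is denser than an atomic nucleus.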
To a distant observer, a clock near a black hole would appear to tick more slowly than one further from the black hole.: 217 This effect, known as gravitational time dilation, would also cause an object falling into a black hole to appear to slow as it approached the event horizon, never quite reaching the horizon from the perspective of an outside observer.: 218 All processes on this object would appear to slow down, and any light emitted by the object would appear redder and dimmer, an effect known as gravitational redshift. An object falling from half of a Schwarzschild radius above the event horizon would fade away until it could no longer be seen, disappearing from view within one hundredth of a second. It would also appear to flatten onto the black hole, joining all other material that had ever fallen into the hole. On the other hand, an observer falling into a black hole would not notice any of these effects as they cross the event horizon. Their own clocks appear to them to tick normally, and they cross the event horizon after a finite time without noting any singular behaviour. In general relativity, it is impossible to determine the location of the event horizon from local observations, due to Einstein's equivalence principle.: 222 Black holes that are rotating and/or charged have an inner horizon, often called the Cauchy horizon, inside the black hole. The inner horizon is divided into two segments: an ingoing section and an outgoing section. At the ingoing section of the Cauchy horizon, radiation and matter that fall into the black hole would build up at the horizon, causing the curvature of spacetime to go to infinity. This would cause an observer falling in to experience tidal forces. This phenomenon is often called mass inflation, since it is associated with a parameter dictating the black hole's internal mass growing exponentially, and the buildup of tidal forces is called the mass-inflation singularity or Cauchy horizon singularity.
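The time dilation and redshift described at the start of this section have a simple closed form for a static clock outside a Schwarzschild black hole. A sketch (radii in units of the Schwarzschild radius; the factor of 2 r_s is an illustrative choice):

```python
import math

def dilation_factor(r, r_s):
    """Rate of a static clock at radius r relative to one far away:
    d(tau)/dt = sqrt(1 - r_s/r), valid outside the horizon (r > r_s)."""
    return math.sqrt(1 - r_s / r)

def gravitational_redshift(r, r_s):
    """Redshift z of light emitted at r and received far away:
    1 + z = 1 / sqrt(1 - r_s/r)."""
    return 1 / dilation_factor(r, r_s) - 1

# A clock hovering at twice the Schwarzschild radius:
print(round(dilation_factor(2.0, 1.0), 3))        # 0.707: ticks ~29% slow
print(round(gravitational_redshift(2.0, 1.0), 3)) # 0.414: wavelengths stretched ~41%
```

Both quantities diverge as r approaches r_s, which is why the infalling object appears to freeze and fade at the horizon.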
Some physicists have argued that in realistic black holes, accretion and Hawking radiation would stop mass inflation from occurring. At the outgoing section of the inner horizon, infalling radiation would backscatter off of the black hole's spacetime curvature and travel outward, building up at the outgoing Cauchy horizon. This would cause an infalling observer to experience a gravitational shock wave and tidal forces as the spacetime curvature at the horizon grew to infinity. This buildup of tidal forces is called the shock singularity. Both of these singularities are weak, meaning that an object crossing them would only be deformed a finite amount by tidal forces, even though the spacetime curvature would still be infinite at the singularity. This is in contrast to a strong singularity, where an object hitting the singularity would be stretched and squeezed by an infinite amount. They are also null singularities, meaning that a photon could travel parallel to them without ever being intercepted. Ignoring quantum effects, every black hole has a singularity inside, points where the curvature of spacetime becomes infinite, and geodesics terminate within a finite proper time.: 205 For a non-rotating black hole, this region takes the shape of a single point; for a rotating black hole it is smeared out to form a ring singularity that lies in the plane of rotation.: 264 In both cases, the singular region has zero volume. All of the mass of the black hole ends up in the singularity.: 252 Since the singularity has nonzero mass in an infinitely small space, it can be thought of as having infinite density. Observers falling into a Schwarzschild black hole (i.e., non-rotating and not charged) cannot avoid being carried into the singularity once they cross the event horizon. As they fall further into the black hole, they will be torn apart by the growing tidal forces in a process sometimes referred to as spaghettification or the noodle effect.
Eventually, they will reach the singularity and be crushed into an infinitely small point.: 182 However, any perturbations, such as those caused by matter or radiation falling in, would cause space to oscillate chaotically near the singularity. Any matter falling in would experience intense tidal forces rapidly changing in direction, all while being compressed into an increasingly small volume. Alternative forms of general relativity, including the addition of some quantum effects, can lead to regular, or nonsingular, black holes without singularities. For example, the fuzzball model, based on string theory, states that black holes are actually made up of quantum microstates and need not have a singularity or an event horizon. The theory of loop quantum gravity proposes that the curvature and density at the center of a black hole are large, but not infinite.

Formation

Black holes are formed by gravitational collapse of massive stars, either by direct collapse or during a supernova explosion in a process called fallback. Black holes can result from the merger of two neutron stars or a neutron star and a black hole. Other more speculative mechanisms include primordial black holes created from density fluctuations in the early universe, the collapse of dark stars (hypothetical objects powered by annihilation of dark matter), or the collapse of hypothetical self-interacting dark matter. Gravitational collapse occurs when an object's internal pressure is insufficient to resist the object's own gravity. At the end of a star's life, it will run out of hydrogen to fuse, and will start fusing more and more massive elements, until it reaches iron. Since the fusion of elements heavier than iron would require more energy than it would release, nuclear fusion ceases. If the iron core of the star is too massive, the star will no longer be able to support itself and will undergo gravitational collapse.
While most of the energy released during gravitational collapse is emitted very quickly, an outside observer does not actually see the end of this process. Even though the collapse takes a finite amount of time from the reference frame of infalling matter, a distant observer would see the infalling material slow and halt just above the event horizon, due to gravitational time dilation. Light from the collapsing material takes longer and longer to reach the observer, with the delay growing to infinity as the emitting material reaches the event horizon. Thus the external observer never sees the formation of the event horizon; instead, the collapsing material seems to become dimmer and increasingly red-shifted, eventually fading away. Observations of quasars at redshift z ∼ 7, less than a billion years after the Big Bang, have led to investigations of other ways to form black holes. The accretion process to build supermassive black holes has a limiting rate of mass accumulation, and a billion years is not enough time to reach quasar status. One suggestion is direct collapse of the nearly pure hydrogen gas (low-metallicity) clouds characteristic of the young universe, forming a supermassive star which collapses into a black hole. It has been suggested that seed black holes with typical masses of ~10⁵ M☉ could have formed in this way, which could then grow to ~10⁹ M☉. However, the very large amount of gas required for direct collapse is typically unstable against fragmenting into multiple stars. Thus another approach suggests massive star formation followed by collisions that seed massive black holes which ultimately merge to create a quasar.: 85 A neutron star in a common envelope with a regular star can accrete sufficient material to collapse to a black hole, or two neutron stars can merge. These avenues for the formation of black holes are considered relatively rare.
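The timing argument for seeded growth can be made quantitative with a back-of-envelope sketch. This assumes continuous accretion at exactly the Eddington limit with an assumed radiative efficiency of 0.1 and the standard ~450 Myr Eddington timescale; real growth histories are far less regular:

```python
import math

EDDINGTON_TIME_YR = 4.5e8  # c * sigma_T / (4 * pi * G * m_p), ~450 Myr

def growth_time_yr(m_seed, m_final, efficiency=0.1):
    """Time for Eddington-limited exponential growth from m_seed to m_final.
    The e-folding (Salpeter) time is eff / (1 - eff) * t_Edd, ~50 Myr for eff = 0.1."""
    e_fold = (efficiency / (1 - efficiency)) * EDDINGTON_TIME_YR
    return e_fold * math.log(m_final / m_seed)

# Growing a ~1e5 solar-mass direct-collapse seed to a ~1e9 solar-mass quasar:
print(f"{growth_time_yr(1e5, 1e9) / 1e6:.0f} Myr")  # ~460 Myr, within a billion years
```

Starting instead from a ~10 M☉ stellar remnant would take roughly twice as long, which is the core of the timing problem the seed scenarios address.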
In the current epoch of the universe, conditions needed to form black holes are rare and are mostly found only in stars. However, in the early universe, conditions may have allowed for black hole formation via other means. Fluctuations of spacetime soon after the Big Bang may have formed areas that were denser than their surroundings. Initially, these regions would not have been compact enough to form a black hole, but eventually, the curvature of spacetime in the regions would become large enough to cause them to collapse into a black hole. Different models for the early universe vary widely in their predictions of the scale of these fluctuations. Various models predict the creation of primordial black holes ranging from a Planck mass (~2.2×10⁻⁸ kg) to hundreds of thousands of solar masses. Primordial black holes with masses less than 10¹⁵ g would have evaporated by now due to Hawking radiation. Despite the early universe being extremely dense, it did not re-collapse into a black hole during the Big Bang, since the universe was expanding rapidly and did not have the gravitational differential necessary for black hole formation. Models for the gravitational collapse of objects of relatively constant size, such as stars, do not necessarily apply in the same way to rapidly expanding space such as the Big Bang. In principle, black holes could be formed in high-energy particle collisions that achieve sufficient density, although no such events have been detected. These hypothetical micro black holes, which could form from the collision of cosmic rays with Earth's atmosphere or in particle accelerators like the Large Hadron Collider, would not be able to aggregate additional mass. Instead, they would evaporate in about 10⁻²⁵ seconds, posing no threat to the Earth.

Evolution

Black holes can also merge with other objects such as stars or even other black holes.
This is thought to have been important, especially in the early growth of supermassive black holes, which could have formed from the aggregation of many smaller objects. The process has also been proposed as the origin of some intermediate-mass black holes. Mergers of supermassive black holes may take a long time: as the two black holes in a supermassive binary approach each other, most nearby stars are ejected, leaving little matter whose gravitational interactions could carry away orbital energy and allow the black holes to draw closer together. This phenomenon has been called the final parsec problem, as the distance at which this happens is usually around one parsec. When a black hole accretes matter, the gas in the inner accretion disk orbits at very high speeds because of its proximity to the black hole. The resulting friction heats the inner disk to temperatures at which it emits vast amounts of electromagnetic radiation (mainly X-rays) detectable by telescopes. By the time the matter of the disk reaches the ISCO, between 5.7% and 42% of its mass will have been converted to energy, depending on the black hole's spin. About 90% of this energy is released within about 20 black hole radii. In many cases, accretion disks are accompanied by relativistic jets that are emitted along the black hole's poles, which carry away much of the energy. The mechanism for the creation of these jets is currently not well understood, in part due to insufficient data. Many of the universe's most energetic phenomena have been attributed to the accretion of matter onto black holes. Active galactic nuclei and quasars are believed to be the accretion disks of supermassive black holes. X-ray binaries are generally accepted to be binary systems in which one of the two objects is a compact object accreting matter from its companion. Ultraluminous X-ray sources may be the accretion disks of intermediate-mass black holes.
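The 5.7%–42% efficiency range quoted above corresponds to the binding energy a particle has given up by the time it reaches the ISCO, evaluated at the two extremes of spin. A sketch of those endpoint values:

```python
import math

def efficiency_schwarzschild():
    """Fraction of rest mass radiated away by the ISCO of a
    non-spinning black hole: 1 - sqrt(8/9) ~ 5.7%."""
    return 1 - math.sqrt(8 / 9)

def efficiency_extremal_kerr():
    """Same quantity for a maximally spinning black hole with a
    prograde disk: 1 - 1/sqrt(3) ~ 42%."""
    return 1 - 1 / math.sqrt(3)

print(f"{efficiency_schwarzschild():.1%}")  # 5.7%
print(f"{efficiency_extremal_kerr():.1%}")  # 42.3%
```

For comparison, hydrogen fusion releases only about 0.7% of rest mass, which is why accretion onto black holes powers the most luminous objects known.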
At a certain rate of accretion, the outward radiation pressure will become as strong as the inward gravitational force, and the black hole should be unable to accrete any faster. This limit is called the Eddington limit. However, many black holes accrete beyond this rate due to their non-spherical geometry or instabilities in the accretion disk. Accretion beyond the limit is called super-Eddington accretion and may have been commonplace in the early universe. Stars have been observed to be torn apart by tidal forces in the immediate vicinity of supermassive black holes in galaxy nuclei, in what is known as a tidal disruption event (TDE). Some of the material from the disrupted star forms an accretion disk around the black hole, which emits observable electromagnetic radiation. The correlation between the masses of supermassive black holes in the centres of galaxies with the velocity dispersion and mass of stars in their host bulges suggests that the formation of galaxies and the formation of their central black holes are related. Black hole winds from rapid accretion, particularly when the galaxy itself is still accreting matter, can compress gas nearby, accelerating star formation. However, if the winds become too strong, the black hole may blow nearly all of the gas out of the galaxy, quenching star formation. Black hole jets may also energize nearby cavities of plasma and eject low-entropy gas out of the galactic core, causing gas in galactic centers to be hotter than expected. If Hawking's theory of black hole radiation is correct, then black holes are expected to shrink and evaporate over time as they lose mass by the emission of photons and other particles. The temperature of this thermal spectrum (Hawking temperature) is proportional to the surface gravity of the black hole, which is inversely proportional to the mass. Hence, large black holes emit less radiation than small black holes.: Ch. 9.6 A stellar black hole of 1 M☉ has a Hawking temperature of 62 nanokelvins. This is far less than the 2.7 K temperature of the cosmic microwave background radiation. Stellar-mass or larger black holes receive more mass from the cosmic microwave background than they emit through Hawking radiation and thus will grow instead of shrinking. To have a Hawking temperature larger than 2.7 K (and be able to evaporate), a black hole would need a mass less than that of the Moon. Such a black hole would have a diameter of less than a tenth of a millimetre. The Hawking radiation for an astrophysical black hole is predicted to be very weak and would thus be exceedingly difficult to detect from Earth. A possible exception is the burst of gamma rays emitted in the last stage of the evaporation of primordial black holes. Searches for such flashes have proven unsuccessful and provide stringent limits on the possibility of existence of low-mass primordial black holes, with modern research predicting that primordial black holes must make up less than a fraction of 10⁻⁷ of the universe's total mass. NASA's Fermi Gamma-ray Space Telescope, launched in 2008, has searched for these flashes, but has not yet found any. The properties of a black hole are constrained and interrelated by the theories that predict these properties. When based on general relativity, these relationships are called the laws of black hole mechanics. For a black hole that is not still forming or accreting matter, the zeroth law of black hole mechanics states that the black hole's surface gravity is constant across the event horizon. The first law relates changes in the black hole's surface area, angular momentum, and charge to changes in its energy. The second law says the surface area of a black hole never decreases on its own. Finally, the third law says that the surface gravity of a black hole is never zero. These laws are mathematical analogs of the laws of thermodynamics.
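The Eddington limit and the Hawking temperature quoted above both reduce to one-line formulas. A sketch using standard constants (rounded to four significant figures):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
HBAR = 1.055e-34     # reduced Planck constant, J s
K_B = 1.381e-23      # Boltzmann constant, J/K
M_P = 1.673e-27      # proton mass, kg
SIGMA_T = 6.652e-29  # Thomson scattering cross-section, m^2
M_SUN = 1.989e30     # mass of the Sun, kg

def eddington_luminosity(m_kg):
    """Luminosity at which radiation pressure on ionized hydrogen
    balances gravity: L_Edd = 4 * pi * G * M * m_p * c / sigma_T."""
    return 4 * math.pi * G * m_kg * M_P * c / SIGMA_T

def hawking_temperature(m_kg):
    """Hawking temperature, inversely proportional to mass:
    T_H = hbar * c^3 / (8 * pi * G * M * k_B)."""
    return HBAR * c**3 / (8 * math.pi * G * m_kg * K_B)

print(f"{eddington_luminosity(M_SUN):.2e} W")        # ~1.26e31 W per solar mass
print(f"{hawking_temperature(M_SUN) * 1e9:.0f} nK")  # ~62 nK, as quoted in the text
# Mass below which T_H exceeds the 2.7 K microwave background:
print(f"{hawking_temperature(M_SUN) / 2.7 * M_SUN:.1e} kg")  # ~4.5e22 kg, less than the Moon
```

Both scalings are linear or inverse in mass, so the printed solar-mass values can simply be rescaled for any other black hole.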
They are not equivalent, however, because, according to general relativity without quantum mechanics, a black hole can never emit radiation, and thus its temperature must always be zero.: 11 Quantum mechanics predicts that a black hole will continuously emit thermal Hawking radiation, and therefore must always have a nonzero temperature. It also predicts that all black holes have entropy that scales with their surface area. When quantum mechanics is accounted for, the laws of black hole mechanics become equivalent to the classical laws of thermodynamics. However, these conclusions are derived without a complete theory of quantum gravity, although many potential theories do predict black holes having entropy and temperature. Thus, the true quantum nature of black hole thermodynamics continues to be debated.: 29

Observational evidence

Millions of black holes of around 30 solar masses, formed by stellar collapse, are expected to exist in the Milky Way. Even a dwarf galaxy like Draco should have hundreds. Only a few of these have been detected. By nature, black holes do not themselves emit any electromagnetic radiation other than the hypothetical Hawking radiation, so astrophysicists searching for black holes must generally rely on indirect observations. The defining characteristic of a black hole is its event horizon. The horizon itself cannot be imaged, so all other possible explanations for these indirect observations must be considered and eliminated before concluding that a black hole has been observed.: 11 The Event Horizon Telescope (EHT) is a global system of radio telescopes capable of directly observing a black hole shadow. The angular resolution of a telescope is based on its aperture and the wavelengths it is observing. Because the angular diameters of Sagittarius A* and Messier 87* in the sky are very small, a single telescope would need to be about the size of the Earth to clearly distinguish their horizons using radio wavelengths.
By combining data from several different radio telescopes around the world, the Event Horizon Telescope creates an effective aperture the diameter of the Earth. The EHT team used imaging algorithms to compute the most probable image from the data in its observations of Sagittarius A* and M87*. Gravitational-wave interferometry can be used to detect merging black holes and other compact objects. In this method, a laser beam is split down two long arms of a tunnel. The laser beams reflect off of mirrors in the tunnels and converge at the intersection of the arms, cancelling each other out. However, when a gravitational wave passes, it warps spacetime, changing the lengths of the arms themselves. Since each laser beam is now travelling a slightly different distance, they do not cancel out and produce a recognizable signal. Analysis of the signal can give scientists information about what caused the gravitational waves. Since gravitational waves are very weak, gravitational-wave observatories such as LIGO must have arms several kilometers long and carefully control for noise from Earth to be able to detect these gravitational waves. Since the first measurements in 2016, multiple gravitational waves from black holes have been detected and analyzed. The proper motions of stars near the centre of the Milky Way provide strong observational evidence that these stars are orbiting a supermassive black hole. Since 1995, astronomers have tracked the motions of 90 stars orbiting an invisible object coincident with the radio source Sagittarius A*. In 1998, by fitting the motions of the stars to Keplerian orbits, astronomers were able to infer that a 2.6×10⁶ M☉ object must be contained within a radius of 0.02 light-years at the position of Sagittarius A*. Since then, one of the stars—called S2—has completed a full orbit. From the orbital data, astronomers were able to refine the calculations of the mass of Sagittarius A* to 4.3×10⁶ M☉, with a radius of less than 0.002 light-years.
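The Keplerian-orbit fit described above amounts to Kepler's third law, which is especially simple in solar units. A sketch with illustrative orbital values loosely inspired by S2 (the round numbers below are assumptions for demonstration, not the published fit):

```python
def enclosed_mass_msun(a_au, period_yr):
    """Kepler's third law in solar units: M / M_sun = a^3 / P^2,
    with the semi-major axis a in AU and the period P in years.
    Gives the mass enclosed by the orbit (star's mass neglected)."""
    return a_au**3 / period_yr**2

# Illustrative S2-like orbit: semi-major axis ~1000 AU, period ~16 years.
print(f"{enclosed_mass_msun(1000, 16):.1e} M_sun")  # ~4e6 solar masses
```

Even these rough inputs reproduce the few-million-solar-mass scale inferred for Sagittarius A*.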
This upper limit radius is larger than the Schwarzschild radius for the estimated mass, so the combination does not prove Sagittarius A* is a black hole. Nevertheless, these observations strongly suggest that the central object is a supermassive black hole as there are no other plausible scenarios for confining so much invisible mass into such a small volume. Additionally, there is some observational evidence that this object might possess an event horizon, a feature unique to black holes. The Event Horizon Telescope image of Sagittarius A*, released in 2022, provided further confirmation that it is indeed a black hole. X-ray binaries are binary systems that emit a majority of their radiation in the X-ray part of the electromagnetic spectrum. These X-ray emissions result when a compact object accretes matter from an ordinary star. The presence of an ordinary star in such a system provides an opportunity to study the central object and to determine if it might be a black hole. By measuring the orbital period of the binary, the distance to the binary from Earth, and the mass of the companion star, scientists can estimate the mass of the compact object. The Tolman–Oppenheimer–Volkoff limit (TOV limit) dictates the largest mass a nonrotating neutron star can have, and is estimated to be about two solar masses. While a rotating neutron star can be slightly more massive, if the compact object is much more massive than the TOV limit, it cannot be a neutron star and is generally expected to be a black hole. The first strong candidate for a black hole, Cygnus X-1, was discovered in this way by Charles Thomas Bolton, Louise Webster, and Paul Murdin in 1972. Observations of rotational broadening of the optical star reported in 1986 led to a compact object mass estimate of 16 solar masses, with 7 solar masses as the lower bound. In 2011, this estimate was updated to 14.1±1.0 M☉ for the black hole and 19.2±1.9 M☉ for the optical stellar companion.
X-ray binaries can be categorized as either low-mass or high-mass; this classification is based on the mass of the companion star, not that of the compact object itself. In a class of X-ray binaries called soft X-ray transients, the companion star is of relatively low mass, allowing for more accurate estimates of the black hole mass. These systems actively emit X-rays for only several months once every 10–50 years. During the period of low X-ray emission, called quiescence, the accretion disk is extremely faint, allowing detailed observation of the companion star. Numerous black hole candidates have been measured by this method. Black holes are also sometimes found in binaries with other compact objects, such as white dwarfs, neutron stars, and other black holes. The centre of nearly every galaxy contains a supermassive black hole. The close observational correlation between the mass of this hole and the velocity dispersion of the host galaxy's bulge, known as the M–sigma relation, strongly suggests a connection between the formation of the black hole and that of the galaxy itself. Astronomers use the term active galaxy to describe galaxies with unusual characteristics, such as unusual spectral line emission and very strong radio emission. Theoretical and observational studies have shown that the high levels of activity in the centers of these galaxies, regions called active galactic nuclei (AGN), may be explained by accretion onto supermassive black holes. These AGN consist of a central black hole that may be millions or billions of times more massive than the Sun, a disk of interstellar gas and dust called an accretion disk, and two jets perpendicular to the accretion disk. Although supermassive black holes are expected to be found in most AGN, only some galaxies' nuclei have been more carefully studied in attempts to both identify and measure the actual masses of the central supermassive black hole candidates.
Some of the most notable galaxies with supermassive black hole candidates include the Andromeda Galaxy, Messier 32, Messier 87, the Sombrero Galaxy, and the Milky Way itself. Another way black holes can be detected is through observation of effects caused by their strong gravitational field. One such effect is gravitational lensing: the deformation of spacetime around a massive object causes light rays to be deflected, making objects behind it appear distorted. When the lensing object is a black hole, this effect can be strong enough to create multiple images of a star or other luminous source. However, the distance between the lensed images may be too small for contemporary telescopes to resolve—this phenomenon is called microlensing. Instead of seeing two images of a lensed star, astronomers see the star brighten slightly as the black hole moves towards the line of sight between the star and Earth and then return to its normal luminosity as the black hole moves away. The turn of the millennium saw the first three candidate detections of black holes in this way, and in January 2022, astronomers reported the first confirmed detection of a microlensing event from an isolated black hole. This was also the first determination of an isolated black hole mass, 7.1±1.3 M☉.

Alternatives

While there is a strong case for supermassive black holes, the model for stellar-mass black holes assumes an upper limit for the mass of a neutron star: objects observed to have more mass are assumed to be black holes. However, the properties of extremely dense matter are poorly understood. New exotic phases of matter could allow other kinds of massive objects. Quark stars would be made up of quark matter and supported by quark degeneracy pressure, a form of degeneracy pressure even stronger than neutron degeneracy pressure. This would halt gravitational collapse at a higher mass than for a neutron star.
Even stronger stars called electroweak stars would convert quarks in their cores into leptons, providing additional pressure to stop the star from collapsing. If, as some extensions of the Standard Model posit, quarks and leptons are made up of even-smaller fundamental particles called preons, a very compact star could be supported by preon degeneracy pressure. While none of these hypothetical models can explain all of the observations of stellar black hole candidates, a Q star is the only alternative that could significantly exceed the mass limit for neutron stars and thus provide an alternative for supermassive black holes.: 12 A few theoretical objects have been conjectured to match observations of astronomical black hole candidates identically or near-identically, but which function via a different mechanism. A dark energy star would convert infalling matter into vacuum energy; this vacuum energy would be much larger than the vacuum energy of outside space, exerting outwards pressure and preventing a singularity from forming. A black star would be gravitationally collapsing slowly enough that quantum effects would keep it just on the cusp of fully collapsing into a black hole. A gravastar would consist of a very thin shell and a dark-energy interior providing outward pressure to stop the collapse into a black hole or formation of a singularity; it could even have another gravastar inside, called a 'nestar'.

Open questions

According to the no-hair theorem, a black hole is defined by only three parameters: its mass, charge, and angular momentum. This seems to mean that all other information about the matter that went into forming the black hole is lost, as there is no way to determine anything about the black hole from outside other than those three parameters. When black holes were thought to persist forever, this information loss was not problematic, as the information can be thought of as existing inside the black hole.
However, black holes slowly evaporate by emitting Hawking radiation. This radiation does not appear to carry any additional information about the matter that formed the black hole, meaning that this information is seemingly gone forever. This is called the black hole information paradox. Theoretical studies analyzing the paradox have led to both further paradoxes and new ideas about the intersection of quantum mechanics and general relativity. While there is no consensus on the resolution of the paradox, work on the problem is expected to be important for a theory of quantum gravity. Observations of faraway galaxies have found that ultraluminous quasars, powered by supermassive black holes, existed in the early universe as far back as redshift z ≥ 7. These black holes have been assumed to be the products of the gravitational collapse of large Population III stars. However, these stellar remnants were not massive enough to produce the quasars observed at early times without accreting beyond the Eddington limit, the theoretical maximum rate of black hole accretion. Physicists have suggested a variety of different mechanisms by which these supermassive black holes may have formed. It has been proposed that smaller black holes may have also undergone mergers to produce the observed supermassive black holes. It is also possible that they were seeded by direct-collapse black holes, in which a large cloud of hot gas avoids the fragmentation that would lead to multiple stars, due to low angular momentum or heating from a nearby galaxy. Given the right circumstances, a single supermassive star forms and collapses directly into a black hole without undergoing typical stellar evolution. Additionally, these supermassive black holes in the early universe may be high-mass primordial black holes, which could have accreted further matter in the centers of galaxies.
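The Eddington limit mentioned above follows from balancing outward radiation pressure against gravity on infalling ionized hydrogen, which caps the luminosity at L_Edd = 4πGMm_p·c/σ_T, linear in the black hole mass M. A rough numerical sketch (the constant values and function name are mine, not from the text):

```python
import math

# Physical constants (SI units)
G       = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
c       = 2.998e8      # speed of light, m/s
m_p     = 1.673e-27    # proton mass, kg
sigma_T = 6.652e-29    # Thomson scattering cross-section, m^2
M_sun   = 1.989e30     # solar mass, kg

def eddington_luminosity(mass_kg: float) -> float:
    """Eddington luminosity for pure ionized hydrogen: the luminosity
    at which radiation pressure on electrons balances gravity on protons."""
    return 4 * math.pi * G * mass_kg * m_p * c / sigma_T

print(f"1 solar mass:     {eddington_luminosity(M_sun):.2e} W")
print(f"1e9 solar masses: {eddington_luminosity(1e9 * M_sun):.2e} W")
```

For one solar mass this comes out near 1.3 × 10³¹ W, and because the limit scales linearly with mass, a billion-solar-mass quasar engine can shine a billion times brighter without exceeding it, which is why the early-universe quasars constrain the seed masses so tightly.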
Finally, certain mechanisms allow black holes to grow faster than the theoretical Eddington limit, such as when dense gas in the accretion disk limits the outward radiation pressure that would otherwise throttle accretion. However, the formation of bipolar jets may prevent sustained super-Eddington rates.

In fiction

Black holes have been portrayed in science fiction in a variety of ways. Even before the advent of the term itself, objects with characteristics of black holes appeared in stories such as the 1928 novel The Skylark of Space with its "black Sun" and the "hole in space" in the 1935 short story Starship Invincible. As black holes grew in public recognition in the 1960s and 1970s, they began to be featured in films as well as novels, such as Disney's The Black Hole. Black holes have also been used in works of the 21st century, such as Christopher Nolan's science fiction epic Interstellar. Authors and screenwriters have exploited the relativistic effects of black holes, particularly gravitational time dilation. For example, Interstellar features a black hole planet with a time dilation factor of over 60,000:1, while the 1977 novel Gateway depicts a spaceship approaching but never crossing the event horizon of a black hole from the perspective of an outside observer due to time dilation effects. Black holes have also been appropriated as wormholes or other methods of faster-than-light travel, such as in the 1974 novel The Forever War, where a network of black holes is used for interstellar travel. Additionally, black holes can feature as hazards to spacefarers and planets: a black hole threatens a deep-space outpost in the 1978 short story The Black Hole Passes, and a binary black hole dangerously alters the orbit of a planet in the 2018 Netflix reboot of Lost in Space.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/BBC_News#cite_note-25] | [TOKENS: 8810] |
BBC News

BBC News is an operational business division of the British Broadcasting Corporation (BBC) responsible for the gathering and broadcasting of news and current affairs in the UK and around the world. The department is the world's largest broadcast news organisation and generates about 120 hours of radio and television output each day, as well as online news coverage. The service has over 5,500 journalists working across its output, including in 50 foreign news bureaus where more than 250 foreign correspondents are stationed. Deborah Turness has been the CEO of news and current affairs since September 2022. In 2019, an Ofcom report stated that the BBC spent £136m on news during the period April 2018 to March 2019. BBC News' domestic, global and online news divisions are housed within the largest live newsroom in Europe, in Broadcasting House in central London. Parliamentary coverage is produced and broadcast from studios in London. Through BBC English Regions, the BBC also has regional centres across England and national news centres in Northern Ireland, Scotland and Wales. All nations and English regions produce their own local news programmes and other current affairs and sport programmes. The BBC is a quasi-autonomous corporation authorised by royal charter, making it operationally independent of the government. As of 2024, the BBC reaches an average of 450 million people per week, with the BBC World Service accounting for 320 million people.

History

This is London calling – 2LO calling. Here is the first general news bulletin, copyright by Reuters, Press Association, Exchange Telegraph and Central News. — BBC news programme opening during the 1920s

The British Broadcasting Company broadcast its first radio bulletin from radio station 2LO on 14 November 1922.
Wishing to avoid competition, newspaper publishers persuaded the government to ban the BBC from broadcasting news before 7 pm, and to force it to use wire service copy instead of reporting on its own. The BBC gradually gained the right to edit the copy and, in 1934, created its own news operation. However, it could not broadcast news before 6 p.m. until World War II. In addition to news, Gaumont British and Movietone cinema newsreels had been broadcast on the TV service since 1936, with the BBC producing its own equivalent Television Newsreel programme from January 1948. A weekly Children's Newsreel was inaugurated on 23 April 1950, to around 350,000 receivers. The network began simulcasting its radio news on television in 1946, with a still picture of Big Ben. Televised bulletins began on 5 July 1954, broadcast from leased studios within Alexandra Palace in London. The public's interest in television and live events was stimulated by Elizabeth II's coronation in 1953. It is estimated that up to 27 million people viewed the programme in the UK, overtaking radio's audience of 12 million for the first time. Those live pictures were fed from 21 cameras in central London to Alexandra Palace for transmission, and then on to other UK transmitters opened in time for the event. That year, there were around two million TV Licences held in the UK, rising to over three million the following year, and four and a half million by 1955. Television news, although physically separate from its radio counterpart, was still firmly under radio news' control in the 1950s. Correspondents provided reports for both outlets, and the first televised bulletin, shown on 5 July 1954 on the then BBC television service and presented by Richard Baker, involved his providing narration off-screen while stills were shown. This was then followed by the customary Television Newsreel with a recorded commentary by John Snagge (and on other occasions by Andrew Timothy). 
On-screen newsreaders were introduced a year later in 1955 – Kenneth Kendall (the first to appear in vision), Robert Dougall, and Richard Baker—three weeks before ITN's launch on 21 September 1955. Mainstream television production had started to move out of Alexandra Palace in 1950 to larger premises – mainly at Lime Grove Studios in Shepherd's Bush, west London – taking Current Affairs (then known as Talks Department) with it. It was from here that the first Panorama, a new documentary programme, was transmitted on 11 November 1953, with Richard Dimbleby becoming anchor in 1955. In 1958, Hugh Carleton Greene became head of News and Current Affairs. On 1 January 1960, Greene became Director-General. Greene made changes that were aimed at making BBC reporting more similar to its competitor ITN, which had been highly rated by study groups held by Greene. A newsroom was created at Alexandra Palace, television reporters were recruited and given the opportunity to write and voice their own scripts, without having to cover stories for radio too. On 20 June 1960, Nan Winton, the first female BBC network newsreader, appeared in vision. 19 September 1960 saw the start of the radio news and current affairs programme The Ten O'clock News. BBC2 started transmission on 20 April 1964 and began broadcasting a new show, Newsroom. The World at One, a lunchtime news programme, began on 4 October 1965 on the then Home Service, and the year before News Review had started on television. News Review was a summary of the week's news, first broadcast on Sunday, 26 April 1964 on BBC 2 and harking back to the weekly Newsreel Review of the Week, produced from 1951, to open programming on Sunday evenings–the difference being that this incarnation had subtitles for the deaf and hard-of-hearing. 
As this was the decade before electronic caption generation, each superimposition ("super") had to be produced on paper or card, synchronised manually to studio and news footage, committed to tape during the afternoon, and broadcast early evening. Thus Sundays were no longer a quiet day for news at Alexandra Palace. The programme ran until the 1980s – by then using electronic captions, known as Anchor – to be superseded by Ceefax subtitling (a similar Teletext format), and the signing of such programmes as See Hear (from 1981). On Sunday 17 September 1967, The World This Weekend, a weekly news and current affairs programme, launched on what was then Home Service, but soon-to-be Radio 4. Preparations for colour began in the autumn of 1967 and on Thursday 7 March 1968 Newsroom on BBC2 moved to an early evening slot, becoming the first UK news programme to be transmitted in colour – from Studio A at Alexandra Palace. News Review and Westminster (the latter a weekly review of Parliamentary happenings) were "colourised" shortly after. However, much of the insert material was still in black and white, as initially only a part of the film coverage shot in and around London was on colour reversal film stock, and all regional and many international contributions were still in black and white. Colour facilities at Alexandra Palace were technically very limited for the next eighteen months, as it had only one RCA colour Quadruplex videotape machine and, eventually two Pye plumbicon colour telecines–although the news colour service started with just one. Black and white national bulletins on BBC 1 continued to originate from Studio B on weekdays, along with Town and Around, the London regional "opt out" programme broadcast throughout the 1960s (and the BBC's first regional news programme for the South East), until it started to be replaced by Nationwide on Tuesday to Thursday from Lime Grove Studios early in September 1969. 
Town and Around was never to make the move to Television Centre – instead it became London This Week, which aired on Mondays and Fridays only, from the new TVC studios. The BBC moved production out of Alexandra Palace in 1969; BBC Television News resumed operations the next day with a lunchtime bulletin on BBC1 – in black and white – from Television Centre, where it remained until March 2013. This move to a smaller studio with better technical facilities allowed Newsroom and News Review to replace back projection with colour-separation overlay. During the 1960s, satellite communication had become possible; however, it was some years before digital line-store conversion was able to undertake the process seamlessly. On 14 September 1970, the first Nine O'Clock News was broadcast on television. Robert Dougall presented the first week from studio N1 – described by The Guardian as "a sort of polystyrene padded cell" – the bulletin having been moved from the earlier time of 20.50 as a response to the ratings achieved by ITN's News at Ten, introduced three years earlier on the rival ITV. Richard Baker and Kenneth Kendall presented subsequent weeks, thus echoing those first television bulletins of the mid-1950s. Angela Rippon became the first female news presenter of the Nine O'Clock News in 1975. Her work outside the news was controversial at the time, appearing on The Morecambe and Wise Christmas Show in 1976 singing and dancing. The first edition of John Craven's Newsround, initially intended only as a short series and later renamed just Newsround, came from studio N3 on 4 April 1972. Afternoon television news bulletins during the mid to late 1970s were broadcast from the BBC newsroom itself, rather than one of the three news studios. The newsreader would present to camera while sitting on the edge of a desk; behind him staff would be seen working busily at their desks.
This period corresponded with when the Nine O'Clock News got its next makeover, and would use a CSO background of the newsroom from that very same camera each weekday evening. Also in the mid-1970s, the late night news on BBC2 was briefly renamed Newsnight, but this was short-lived and was not the same programme as today's Newsnight, which would launch in 1980; it soon reverted to being just a news summary, with the early evening BBC2 news expanded to become Newsday. News on radio was to change in the 1970s, and on Radio 4 in particular, brought about by the arrival of new editor Peter Woon from television news and the implementation of the Broadcasting in the Seventies report. These changes included the introduction of correspondents into news bulletins, where previously only a newsreader would present, as well as the inclusion of content gathered in the preparation process. New programmes were also added to the daily schedule, PM and The World Tonight, as part of the plan for the station to become a "wholly speech network". Newsbeat launched as the news service on Radio 1 on 10 September 1973. On 23 September 1974, Ceefax, a teletext system bringing text-only news content to television screens, was launched. Engineers had originally begun developing such a system to bring news to deaf viewers, but the system was expanded. The Ceefax service became much more diverse before it ceased on 23 October 2012: it not only had subtitling for all channels, it also gave information such as weather, flight times and film reviews. By the end of the decade, the practice of shooting on film for inserts in news broadcasts was declining, with the introduction of ENG technology into the UK. The equipment would gradually become less cumbersome – the BBC's first attempts had used a Philips colour camera with a backpack base station and a separate portable Sony U-matic recorder in the latter half of the decade.
In 1980, the Iranian Embassy Siege was shot electronically by the BBC Television News outside broadcast team, and the work of reporter Kate Adie, broadcasting live from Prince's Gate, was nominated for the BAFTA for actuality coverage, but was beaten by ITN for the 1980 award. Newsnight, the news and current affairs programme, was due to go on air on 23 January 1980, although trade union disagreements meant that its launch from Lime Grove was postponed by a week. On 27 August 1981, Moira Stuart became the first African Caribbean female newsreader to appear on British television. By 1982, ENG technology had become sufficiently reliable for Bernard Hesketh to use an Ikegami camera to cover the Falklands War, coverage for which he won the "Royal Television Society Cameraman of the Year" award and a BAFTA nomination – the first time that BBC News had relied upon an electronic camera, rather than film, in a conflict zone. BBC News won the BAFTA for its actuality coverage; however, the event is remembered in television terms for Brian Hanrahan's reporting, in which he coined the phrase "I'm not allowed to say how many planes joined the raid, but I counted them all out and I counted them all back" to circumvent restrictions – a line since cited as an example of good reporting under pressure. The first BBC breakfast television programme, Breakfast Time, also launched during the 1980s, on 17 January 1983 from Lime Grove Studio E and two weeks before its ITV rival TV-am. Frank Bough, Selina Scott, and Nick Ross helped to wake viewers with a relaxed style of presenting. The Six O'Clock News first aired on 3 September 1984, eventually becoming the most watched news programme in the UK (however, since 2006 it has been overtaken by the BBC News at Ten). In October 1984, images of millions of people starving to death in the Ethiopian famine were shown in Michael Buerk's Six O'Clock News reports.
The BBC News crew were the first to document the famine, with Buerk's report on 23 October describing it as "a biblical famine in the 20th century" and "the closest thing to hell on Earth". The BBC News report shocked Britain, motivating its citizens to inundate relief agencies, such as Save the Children, with donations, and to bring global attention to the crisis in Ethiopia. The news report was also watched by Bob Geldof, who would organise the charity single "Do They Know It's Christmas?" to raise money for famine relief followed by the Live Aid concert in July 1985. Starting in 1981, the BBC gave a common theme to its main news bulletins with new electronic titles–a set of computer-animated "stripes" forming a circle on a red background with a "BBC News" typescript appearing below the circle graphics, and a theme tune consisting of brass and keyboards. The Nine used a similar (striped) number 9. The red background was replaced by a blue from 1985 until 1987. By 1987, the BBC had decided to re-brand its bulletins and established individual styles again for each one with differing titles and music, the weekend and holiday bulletins branded in a similar style to the Nine, although the "stripes" introduction continued to be used until 1989 on occasions where a news bulletin was screened out of the running order of the schedule. In 1987, John Birt resurrected the practice of correspondents working for both TV and radio with the introduction of bi-media journalism. During the 1990s, a wider range of services began to be offered by BBC News, with the split of BBC World Service Television to become BBC World (news and current affairs), and BBC Prime (light entertainment). Content for a 24-hour news channel was thus required, followed in 1997 with the launch of domestic equivalent BBC News 24. Rather than set bulletins, ongoing reports and coverage was needed to keep both channels functioning and meant a greater emphasis in budgeting for both was necessary. 
In 1998, after 66 years at Broadcasting House, the BBC Radio News operation moved to BBC Television Centre. New technology, provided by Silicon Graphics, came into use in 1993 for a re-launch of the main BBC 1 bulletins, creating a virtual set which appeared to be much larger than it was physically. The relaunch also brought all bulletins into the same style of set with only small changes in colouring, titles, and music to differentiate each. A computer generated cut-glass sculpture of the BBC coat of arms was the centrepiece of the programme titles until the large scale corporate rebranding of news services in 1999. In November 1997, BBC News Online was launched, following individual webpages for major news events such as the 1996 Olympic Games, 1997 general election, and the death of Princess Diana. In 1999, the biggest relaunch occurred, with BBC One bulletins, BBC World, BBC News 24, and BBC News Online all adopting a common style. One of the most significant changes was the gradual adoption of the corporate image by the BBC regional news programmes, giving a common style across local, national and international BBC television news. This also included Newyddion, the main news programme of Welsh language channel S4C, produced by BBC News Wales. Following the relaunch of BBC News in 1999, regional headlines were included at the start of the BBC One news bulletins in 2000. The English regions did however lose five minutes at the end of their bulletins, due to a new headline round-up at 18:55. 2000 also saw the Nine O'Clock News moved to the later time of 22:00. This was in response to ITN who had just moved their popular News at Ten programme to 23:00. ITN briefly returned News at Ten but following poor ratings when head-to-head against the BBC's Ten O'Clock News, the ITN bulletin was moved to 22.30, where it remained until 14 January 2008. 
The retirement of Peter Sissons and the departure of Michael Buerk from the Ten O'Clock News led to changes in the BBC One bulletin presenting team on 20 January 2003. The Six O'Clock News became double headed with George Alagiah and Sophie Raworth after Huw Edwards and Fiona Bruce moved to present the Ten. A new set design featuring a projected fictional newsroom backdrop was introduced, followed on 16 February 2004 by new programme titles to match those of BBC News 24. BBC News 24 and BBC World introduced a new style of presentation in December 2003, which was slightly altered on 5 July 2004 to mark 50 years of BBC Television News. On 7 March 2005, director general Mark Thompson launched the "Creative Futures" project to restructure the organisation. The individual positions of editor of the One and Six O'Clock News were replaced by a new daytime position in November 2005. Kevin Bakhurst became the first Controller of BBC News 24, replacing the position of editor. Amanda Farnsworth became daytime editor while Craig Oliver was later named editor of the Ten O'Clock News. Bulletins received new titles and a new set design in May 2006, to allow for Breakfast to move into the main studio for the first time since 1997. The new set featured Barco videowall screens with a background of the London skyline used for main bulletins and originally an image of cirrus clouds against a blue sky for Breakfast. This was later replaced following viewer criticism. The studio bore similarities with the ITN-produced ITV News of 2004, though ITN uses a CSO virtual studio rather than the actual screens at BBC News. BBC News became part of a new BBC Journalism group in November 2006 as part of a restructuring of the BBC. The then-Director of BBC News, Helen Boaden, reported to the then-Deputy Director-General and head of the journalism group, Mark Byford, until he was made redundant in 2010.
On 18 October 2007, Director-General Mark Thompson announced a six-year plan, "Delivering Creative Futures" (based on his project begun in March 2005), merging the television current affairs department into a new "News Programmes" division. Thompson's announcement, in response to a £2 billion shortfall in funding, would, he said, deliver "a smaller but fitter BBC" in the digital age, by cutting its payroll and, in 2013, selling Television Centre. The various separate newsrooms for television, radio and online operations were merged into a single multimedia newsroom. Programme making within the newsrooms was brought together to form a multimedia programme making department. BBC World Service director Peter Horrocks said that the changes would achieve efficiency at a time of cost-cutting at the BBC. In his blog, he wrote that sharing the same resources across the various broadcast media meant that either fewer stories would be covered, or, if more stories were followed, there would be fewer ways to broadcast them. A new graphics and video playout system was introduced for production of television bulletins in January 2007. This coincided with a new structure to BBC World News bulletins, editors favouring a section devoted to analysing the news stories reported on. The first new BBC News bulletin since the Six O'Clock News was announced in July 2007 following a successful trial in the Midlands. The summary, lasting 90 seconds, has been broadcast at 20:00 on weekdays since December 2007 and bears similarities with 60 Seconds on BBC Three, but also includes headlines from the various BBC regions and a weather summary. As part of a long-term cost cutting programme, bulletins were renamed the BBC News at One, Six and Ten respectively in April 2008, while BBC News 24 was renamed BBC News and moved into the same studio as the bulletins at BBC Television Centre. BBC World was renamed BBC World News and regional news programmes were also updated with the new presentation style, designed by Lambie-Nairn.
2008 also saw tri-media introduced across TV, radio, and online. The studio moves also meant that Studio N9, previously used for BBC World, was closed, and operations moved to the previous studio of BBC News 24. Studio N9 was later refitted to match the new branding, and was used for the BBC's UK local elections and European elections coverage in early June 2009. A strategy review of the BBC in March 2010, confirmed that having "the best journalism in the world" would form one of five key editorial policies, as part of changes subject to public consultation and BBC Trust approval. After a period of suspension in late 2012, Helen Boaden ceased to be the Director of BBC News. On 16 April 2013, incoming BBC Director-General Tony Hall named James Harding, a former editor of The Times of London newspaper as Director of News and Current Affairs. From August 2012 to March 2013, all news operations moved from Television Centre to new facilities in the refurbished and extended Broadcasting House, in Portland Place. The move began in October 2012, and also included the BBC World Service, which moved from Bush House following the expiry of the BBC's lease. This new extension to the north and east, referred to as "New Broadcasting House", includes several new state-of-the-art radio and television studios centred around an 11-storey atrium. The move began with the domestic programme The Andrew Marr Show on 2 September 2012, and concluded with the move of the BBC News channel and domestic news bulletins on 18 March 2013. The newsroom houses all domestic bulletins and programmes on both television and radio, as well as the BBC World Service international radio networks and the BBC World News international television channel. BBC News and CBS News established an editorial and newsgathering partnership in 2017, replacing an earlier long-standing partnership between BBC News and ABC News. 
In an October 2018 Simmons Research survey of 38 news organisations, BBC News was ranked the fourth most trusted news organisation by Americans, behind CBS News, ABC News and The Wall Street Journal. In January 2020, the BBC announced a BBC News savings target of £80 million per year by 2022, involving about 450 staff reductions from the then-current 6,000. BBC director of news and current affairs Fran Unsworth said there would be further moves toward digital broadcasting, in part to attract back a youth audience, and more pooling of reporters to stop separate teams covering the same news. A further 70 staff reductions were announced in July 2020. BBC Three began airing the news programme The Catch Up in February 2022. It is presented by Levi Jouavel, Kirsty Grant, and Callum Tulley and aims to help the channel's target audience (16- to 34-year-olds) make sense of the world around them while also highlighting optimistic stories. Compared to its predecessor 60 Seconds, The Catch Up is three times longer, running for about three minutes, and does not air during weekends. According to its annual report, as of December 2021 India has the largest number of people using BBC services in the world. In May 2025, following the earthquake that hit Myanmar and Thailand, a television news bulletin (BBC News Myanmar) from the Burmese service began its broadcasts using a vacated Voice of America satellite frequency.
Programming and reporting

In November 2023, BBC News joined with the International Consortium of Investigative Journalists, Paper Trail Media and 69 media partners, including Distributed Denial of Secrets and the Organised Crime and Corruption Reporting Project (OCCRP), and more than 270 journalists in 55 countries and territories to produce the 'Cyprus Confidential' report on the financial network which supports the regime of Vladimir Putin, mostly with connections to Cyprus; the report showed Cyprus to have strong links with high-up figures in the Kremlin, some of whom have been sanctioned. Government officials, including Cyprus president Nikos Christodoulides and European lawmakers, began responding to the investigation's findings in less than 24 hours, calling for reforms and launching probes. BBC News is responsible for the news programmes and documentary content on the BBC's general television channels, as well as the news coverage on the BBC News Channel in the UK, and 22 hours of programming for the corporation's international BBC World News channel. Coverage for BBC Parliament is carried out on behalf of the BBC at Millbank Studios, though BBC News provides editorial and journalistic content. BBC News content is also output onto the BBC's digital interactive television services under the BBC Red Button brand, and until 2012, on the Ceefax teletext system. The music on all BBC television news programmes was introduced in 1999 and composed by David Lowe. It was part of the re-branding which commenced in 1999 and features the 'BBC Pips'. The general theme was used on bulletins on BBC One, News 24, BBC World and local news programmes in the BBC's Nations and Regions. Lowe was also responsible for the music on Radio One's Newsbeat. The theme has had several changes since 1999, the latest in March 2013.
The BBC Arabic Television news channel launched on 11 March 2008, and a Persian-language channel followed on 14 January 2009, broadcasting from the Peel wing of Broadcasting House; both include news, analysis, interviews, sports and cultural programmes, and are run by the BBC World Service and funded from a grant-in-aid from the British Foreign Office (and not the television licence). The BBC Verify service was launched in 2023 to fact-check news stories, followed by BBC Verify Live in 2025. BBC Radio News produces bulletins for the BBC's national radio stations and provides content for local BBC radio stations via the General News Service (GNS), a BBC-internal news distribution service. BBC News does not produce the BBC's regional news bulletins, which are produced individually by the BBC nations and regions themselves. The BBC World Service broadcasts to some 150 million people in English as well as 27 languages across the globe. BBC Radio News is a patron of the Radio Academy. BBC News Online is the BBC's news website. Launched in November 1997, it is one of the most popular news websites, with 1.2 billion website visits in April 2021, as well as being used by 60% of the UK's internet users for news. The website contains international news coverage as well as entertainment, sport, science, and political news. Mobile apps for Android, iOS and Windows Phone systems have been provided since 2010. Many television and radio programmes are also available to view on the BBC iPlayer and BBC Sounds services. The BBC News channel is also available to view 24 hours a day, while video and radio clips are also available within online news articles. In October 2019, BBC News Online launched a mirror on the dark web anonymity network Tor in an effort to circumvent censorship.

Criticism

The BBC is required by its charter to be free from both political and commercial influence and answers only to its viewers and listeners. This political objectivity is sometimes questioned.
For instance, The Daily Telegraph (3 August 2005) carried a letter from the KGB defector Oleg Gordievsky, referring to the BBC as "The Red Service". Books have been written on the subject, including anti-BBC works such as Truth Betrayed by W J West and The Truth Twisters by Richard Deacon. The BBC has been accused of bias by Conservative MPs. The BBC's Editorial Guidelines on Politics and Public Policy state that while "the voices and opinions of opposition parties must be routinely aired and challenged", "the government of the day will often be the primary source of news". The BBC is regularly accused by the government of the day of bias in favour of the opposition and, by the opposition, of bias in favour of the government. Similarly, during times of war, the BBC is often accused by the UK government, or by strong supporters of British military campaigns, of being overly sympathetic to the view of the enemy. An edition of Newsnight at the start of the Falklands War in 1982 was described as "almost treasonable" by John Page, MP, who objected to Peter Snow saying "if we believe the British". During the first Gulf War, critics of the BBC took to using the satirical name "Baghdad Broadcasting Corporation". During the Kosovo War, the BBC was labelled the "Belgrade Broadcasting Corporation" (suggesting favouritism towards the FR Yugoslavia government over ethnic Albanian rebels) by British ministers, although Slobodan Milošević (then FRY president) claimed that the BBC's coverage had been biased against his nation. Conversely, some of those who style themselves anti-establishment in the United Kingdom, or who oppose foreign wars, have accused the BBC of pro-establishment bias or of refusing to give an outlet to "anti-war" voices.
Following the 2003 invasion of Iraq, a study by the Cardiff University School of Journalism of the reporting of the war found that nine out of ten references to weapons of mass destruction during the war assumed that Iraq possessed them, and only one in ten questioned this assumption. It also found that, of the main British broadcasters covering the war, the BBC was the most likely to use the British government and military as its source, and the least likely to use independent sources such as the Red Cross, which were more critical of the war. When it came to reporting Iraqi casualties, the study found fewer reports on the BBC than on the other three main channels. The report's author, Justin Lewis, wrote: "Far from revealing an anti-war BBC, our findings tend to give credence to those who criticised the BBC for being too sympathetic to the government in its war coverage. Either way, it is clear that the accusation of BBC anti-war bias fails to stand up to any serious or sustained analysis." Prominent BBC appointments are constantly assessed by the British media and political establishment for signs of political bias. The appointment of Greg Dyke as Director-General was highlighted by press sources because Dyke was a Labour Party member and former activist, as well as a friend of Tony Blair. The BBC's former Political Editor, Nick Robinson, had some years earlier been a chairman of the Young Conservatives and, as a result, attracted informal criticism from the Labour government of the day, but his predecessor Andrew Marr faced similar claims from the right because he had been editor of The Independent, a liberal-leaning newspaper, before his appointment in 2000. Mark Thompson, former Director-General of the BBC, admitted the organisation had been biased "towards the left" in the past. He said, "In the BBC I joined 30 years ago, there was, in much of current affairs, in terms of people's personal politics, which were quite vocal, a massive bias to the left".
He then added, "The organization did struggle then with impartiality. Now it is a completely different generation. There is much less overt tribalism among the young journalists who work for the BBC." Following the EU referendum in 2016, some critics suggested that the BBC was biased in favour of leaving the EU. For instance, in 2018, the BBC received complaints from people who took issue with the BBC not sufficiently covering anti-Brexit marches while giving smaller-scale events hosted by former UKIP leader Nigel Farage more airtime. On the other hand, a poll released by YouGov showed that 45% of people who voted to leave the EU thought that the BBC was "actively anti-Brexit", compared with 13% of the same voters who thought the BBC was pro-Brexit. In 2008, BBC Hindi was criticised by some Indian outlets for referring to the terrorists who carried out the 2008 Mumbai attacks as "gunmen". The response to this added to prior criticism from some Indian commentators suggesting that the BBC may have an Indophobic bias. In March 2015, the BBC was criticised for a BBC Storyville documentary that interviewed one of the men convicted in the 2012 Delhi gang-rape case. In spite of a ban ordered by the Indian High Court, the BBC still aired the documentary, "India's Daughter", outside India. BBC News was at the centre of a political controversy following the 2003 invasion of Iraq. Three BBC News reports (Andrew Gilligan's on Today, Gavin Hewitt's on The Ten O'Clock News and another on Newsnight) quoted an anonymous source who stated that the British government (particularly the Prime Minister's office) had embellished the September Dossier with misleading exaggerations of Iraq's weapons of mass destruction capabilities. The government denounced the reports and accused the corporation of poor journalism. In subsequent weeks the corporation stood by the report, saying that it had a reliable source.
Following intense media speculation, David Kelly was named in the press as the source for Gilligan's story on 9 July 2003. Kelly was found dead, by suicide, in a field close to his home early on 18 July. An inquiry led by Lord Hutton was announced by the British government the following day to investigate the circumstances leading to Kelly's death; it concluded that "Dr. Kelly took his own life." In his report on 28 January 2004, Lord Hutton also concluded that Gilligan's original accusation was "unfounded" and the BBC's editorial and management processes were "defective". In particular, it specifically criticised the chain of management that caused the BBC to defend its story. The BBC Director of News, Richard Sambrook, the report said, had accepted Gilligan's word that his story was accurate in spite of his notes being incomplete. The chairman of the Board of Governors, Gavyn Davies, had then told the board that he was happy with the story and told the Prime Minister that a satisfactory internal inquiry had taken place. Under Davies's guidance, the Board of Governors accepted that further investigation of the Government's complaints was unnecessary. Because of the criticism in the Hutton report, Davies resigned on the day of publication. BBC News faced an important test in reporting on itself with the publication of the report, but by common consent (of the Board of Governors) managed this "independently, impartially and honestly". Davies' resignation was followed by the resignation of the Director-General, Greg Dyke, the following day, and the resignation of Gilligan on 30 January. While undoubtedly a traumatic experience for the corporation, an ICM poll in April 2003 indicated that it had sustained its position as the best and most trusted provider of news. The BBC has faced accusations of holding both anti-Israel and anti-Palestine bias.
Douglas Davis, the London correspondent of The Jerusalem Post, has described the BBC's coverage of the Arab–Israeli conflict as "a relentless, one-dimensional portrayal of Israel as a demonic, criminal state and Israelis as brutal oppressors [which] bears all the hallmarks of a concerted campaign of vilification that, wittingly or not, has the effect of delegitimising the Jewish state and pumping oxygen into a dark old European hatred that dared not speak its name for the past half-century." However, two large independent studies, one conducted by Loughborough University and the other by Glasgow University's Media Group, concluded that Israeli perspectives are given greater coverage. Critics of the BBC argue that the Balen Report proves systematic bias against Israel in headline news programming. The Daily Mail and The Daily Telegraph criticised the BBC for spending hundreds of thousands of British taxpayers' pounds on preventing the report from being released to the public. Jeremy Bowen, the Middle East Editor for BBC World News, was singled out specifically for bias by the BBC Trust, which concluded that he had violated "BBC guidelines on accuracy and impartiality". An independent panel appointed by the BBC Trust was set up in 2006 to review the impartiality of the BBC's coverage of the Israeli–Palestinian conflict. The panel's assessment was that "apart from individual lapses, there was little to suggest deliberate or systematic bias." While noting a "commitment to be fair, accurate and impartial" and praising much of the BBC's coverage, the independent panel concluded "that BBC output does not consistently give a full and fair account of the conflict. In some ways the picture is incomplete and, in that sense, misleading." It notes that "the failure to convey adequately the disparity in the Israeli and Palestinian experience [reflects] the fact that one side is in control and the other lives under occupation".
Writing in the Financial Times, Philip Stephens, one of the panellists, later accused the BBC's director-general, Mark Thompson, of misrepresenting the panel's conclusions. He further opined: "My sense is that BBC news reporting has also lost a once iron-clad commitment to objectivity and a necessary respect for the democratic process. If I am right, the BBC, too, is lost". Mark Thompson published a rebuttal in the FT the next day. The description by one BBC correspondent reporting on the funeral of Yasser Arafat that she had been left with tears in her eyes led to further questions of impartiality, particularly from Martin Walker in a guest opinion piece in The Times, who picked out the apparent case of Fayad Abu Shamala, the BBC Arabic Service correspondent, who told a Hamas rally on 6 May 2001 that journalists in Gaza were "waging the campaign shoulder to shoulder together with the Palestinian people". Walker argues that the independent inquiry was flawed for two reasons. Firstly, the time period over which it was conducted (August 2005 to January 2006) surrounded the Israeli withdrawal from Gaza and Ariel Sharon's stroke, which produced more positive coverage than usual. Furthermore, he wrote, the inquiry only looked at the BBC's domestic coverage, and excluded output on the BBC World Service and BBC World. Tom Gross accused the BBC of glorifying Hamas suicide bombers, and condemned its policy of inviting guests such as Jenny Tonge and Tom Paulin, who have compared Israeli soldiers to Nazis. Writing for the BBC, Paulin said Israeli soldiers should be "shot dead" like Hitler's SS, and said he could "understand how suicide bombers feel". The BBC also faced criticism for not airing a Disasters Emergency Committee aid appeal for Palestinians who suffered in Gaza during the 22-day war there between late 2008 and early 2009. Most other major UK broadcasters did air this appeal, though rival Sky News did not.
British journalist Julie Burchill has accused the BBC of creating a "climate of fear" for British Jews through its "excessive coverage" of Israel compared to other nations. In light of the Gaza war, the BBC suspended seven Arab journalists over allegations of expressing support for Hamas via social media. The BBC and ABC have shared video segments and reporters as needed in producing their newscasts, with the BBC showing ABC World News Tonight with David Muir in the UK. However, in July 2017, the BBC announced a new partnership with CBS News that allows both organisations to share video, editorial content, and additional newsgathering resources in New York, London, Washington and around the world. BBC News subscribes to wire services from leading international agencies including PA Media (formerly Press Association), Reuters, and Agence France-Presse. In April 2017, the BBC dropped Associated Press in favour of an enhanced service from AFP. BBC News reporters and broadcasts are now, and have in the past been, banned in several countries, primarily for reporting unfavourable to the ruling government. For example, correspondents were banned by the former apartheid regime of South Africa. The BBC was banned in Zimbabwe under Mugabe for eight years as a "terrorist organisation", until being allowed to operate again over a year after the 2008 elections. The BBC was banned in Burma (officially Myanmar) after its coverage and commentary on anti-government protests there in September 2007; the ban was lifted four years later, in September 2011. Other cases have included Uzbekistan, China, and Pakistan. BBC Persian, the BBC's Persian-language news site, was blocked from the Iranian internet in 2006. The BBC News website was made available in China again in March 2008 but, as of October 2014, was blocked again.
In June 2015, the Rwandan government placed an indefinite ban on BBC broadcasts following the airing of a controversial documentary regarding the 1994 Rwandan genocide, Rwanda's Untold Story, broadcast on BBC2 on 1 October 2014. The UK's Foreign Office recognised "the hurt caused in Rwanda by some parts of the documentary". In February 2017, reporters from the BBC (as well as the Daily Mail, The New York Times, Politico, CNN, and others) were denied access to a United States White House briefing. In 2017, BBC India was banned for a period of five years from covering all national parks and sanctuaries in India. Following the withdrawal of CGTN's UK broadcaster licence on 4 February 2021 by Ofcom, China banned BBC News from airing in China.
======================================== |
[SOURCE: https://www.theverge.com/tech/876717/samsung-galaxy-unpacked-february-2026-s26] | [TOKENS: 1410] |
Samsung's next Unpacked is confirmed for later this month
Get ready for a software-centered showcase.
by Allison Johnson | Feb 10, 2026, 11:00 PM UTC

Samsung's annual Galaxy S-series reveal is later than usual this year, but we finally have a confirmed date to circle on the calendar: February 25th. And if you were hoping for major hardware upgrades from the company's flagship phones, you probably shouldn't hold your breath: an extensive leak published by WinFuture has seemingly confirmed that the S26 series will be a software-focused affair. According to the leaked specs, the standard S26 will allegedly get a nominally larger battery — 4300mAh compared to 4000mAh — which is great news for the last "small" phone left standing. Once again, all three phones will lack built-in Qi2 magnets, outsourcing them to cases instead. Camera hardware looks to be much the same as last year, though the main and 5x telephoto lenses appear to have faster apertures. All things being equal, that would be a good thing for low-light photos.
Storage and RAM configurations look similar to last year, with a minimum of 16GB of RAM across the line, though WinFuture suggests that the cheaper 128GB version of the standard S26 may be going away. We'll find out in a couple of weeks just how accurate this leak is, but given the emphasis on Galaxy AI in the invitation graphics, I'm willing to bet it's pretty much spot on.
© 2026 Vox Media, LLC. All Rights Reserved
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Extraterrestrial_life#cite_ref-68] | [TOKENS: 11349] |
Extraterrestrial life

Extraterrestrial life, or alien life (colloquially aliens), is life that originates from another world rather than on Earth. No extraterrestrial life has yet been scientifically or conclusively detected. Such life might range from simple forms such as prokaryotes to intelligent beings, possibly bringing forth civilizations that might be far more, or far less, advanced than humans. The Drake equation speculates about the existence of sapient life elsewhere in the universe. The science of extraterrestrial life is known as astrobiology. Speculation about inhabited worlds beyond Earth dates back to antiquity. Early Christian writers, including Augustine, discussed ideas from thinkers like Democritus and Epicurus about countless worlds in the vast universe. Pre-modern writers typically assumed extraterrestrial "worlds" were inhabited by living beings. William Vorilong, in the 15th century, acknowledged the possibility that Jesus could have visited extraterrestrial worlds to redeem their inhabitants. In 1440, Nicholas of Cusa suggested Earth is a "brilliant star"; he theorized that all celestial bodies, even the Sun, could host life. Descartes wrote that there were no means to prove the stars were not inhabited by "intelligent creatures", but their existence was a matter of speculation. In comparison to the life-abundant Earth, the vast majority of intrasolar and extrasolar planets and moons have harsh surface conditions and disparate atmospheric chemistry, or lack an atmosphere. However, there are many extreme and chemically harsh ecosystems on Earth that do support forms of life and are often hypothesized to be the origin of life on Earth. Examples include life surrounding hydrothermal vents, acidic hot springs, and volcanic lakes, as well as halophiles and the deep biosphere. Since the mid-20th century, researchers have searched for extraterrestrial life and intelligence.
Solar system studies focus on Venus, Mars, Europa, and Titan, while exoplanet discoveries now total 6,022 confirmed planets in 4,490 systems as of October 2025. Depending on the category of search, methods range from analysis of telescope and specimen data to radios used to detect and transmit interstellar communication. Interstellar travel remains largely hypothetical, with only the Voyager 1 and Voyager 2 probes confirmed to have entered the interstellar medium. The concept of extraterrestrial life, especially intelligent life, has greatly influenced culture and fiction. A key debate centers on contacting extraterrestrial intelligence: some advocate active attempts, while others warn it could be risky, given the human history of exploiting other societies.

Context

Initially, after the Big Bang, the universe was too hot to allow life. It is estimated that the temperature of the universe was around 10 billion kelvin at the one-second mark. Roughly 15 million years later, it cooled to temperate levels, though the elements of organic life were yet nonexistent. The only freely available elements at that point were hydrogen and helium. Carbon and oxygen (and later, water) would not appear until 50 million years later, created through stellar fusion. At that point, the difficulty for life to appear was not the temperature, but the scarcity of free heavy elements. Planetary systems emerged, and the first organic compounds may have formed in the protoplanetary disk of dust grains that would eventually create rocky planets like Earth. Although Earth was in a molten state after its birth and may have burned any organics that fell on it, it would have been more receptive once it cooled down. Once the right conditions on Earth were met, life started by a chemical process known as abiogenesis. Alternatively, life may have formed less frequently, then spread—by meteoroids, for example—between habitable planets in a process called panspermia.
During most of their stellar evolution, stars combine hydrogen nuclei into helium nuclei by fusion; the slightly smaller mass of the resulting helium lets the star release the difference as energy. The process continues until the star uses all of its available fuel, with the speed of consumption being related to the size of the star. During its last stages, a star starts combining helium nuclei to form carbon nuclei. The larger stars can further combine carbon nuclei to create oxygen and silicon, oxygen into neon and sulfur, and so on until iron. Ultimately, the star blows much of its content back into the stellar medium, where it joins clouds that eventually become new generations of stars and planets. Many of those materials are the raw components of life on Earth. As this process takes place across the universe, such materials are ubiquitous in the cosmos and not a rarity of the Solar System. Earth is a planet in the Solar System, a planetary system formed by a star at the center, the Sun, and the objects that orbit it: other planets, moons, asteroids, and comets. The Sun is part of the Milky Way, a galaxy. The Milky Way is part of the Local Group, a galaxy group that is in turn part of the Laniakea Supercluster. The universe is composed of all similar structures in existence. The immense distances between celestial objects are a difficulty for studying extraterrestrial life. So far, humans have only set foot on the Moon and sent robotic probes to other planets and moons in the Solar System. Although probes can withstand conditions that may be lethal to humans, the distances cause time delays: New Horizons took nine years after launch to reach Pluto. No probe has ever reached an extrasolar planetary system. Voyager 2 left the Solar System at a speed of 50,000 kilometers per hour; if it headed towards the Alpha Centauri system, the closest one to Earth at 4.4 light years, it would reach it in about 100,000 years.
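The 100,000-year figure quoted for Voyager 2 is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch, using the article's own speed and distance figures and the standard kilometres-per-light-year constant:

```python
# Rough travel-time check: Voyager 2's speed over the distance to Alpha Centauri.
LIGHT_YEAR_KM = 9.4607e12            # kilometres in one light year (physical constant)

distance_km = 4.4 * LIGHT_YEAR_KM    # Earth to Alpha Centauri, as quoted above
speed_km_per_h = 50_000              # Voyager 2's quoted speed

hours = distance_km / speed_km_per_h
years = hours / (24 * 365.25)
print(f"about {years:,.0f} years")   # on the order of 100,000 years
```

The result comes out near 95,000 years, consistent with the rounded figure in the text.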
Under current technology, such systems can only be studied by telescopes, which have limitations. It is estimated that dark matter accounts for a larger amount of combined matter than stars and gas clouds, but as it plays no role in the stellar evolution of stars and planets, it is usually not taken into account by astrobiology. There is an area around a star, the circumstellar habitable zone or "Goldilocks zone", wherein water may be at the right temperature to exist in liquid form at a planetary surface. This area is neither too close to the star, where water would become steam, nor too far away, where water would be frozen as ice. However, although useful as an approximation, planetary habitability is complex and defined by several factors. Being in the habitable zone is not enough for a planet to be habitable, or even to actually have liquid water: Venus is located in the Solar System's habitable zone but does not have liquid water because of the conditions of its atmosphere. Jovian planets or gas giants are not considered habitable even if they orbit close enough to their stars as hot Jupiters, due to crushing atmospheric pressures. The actual distances of the habitable zones vary according to the type of star, and even the solar activity of each specific star influences the local habitability. The type of star also defines the time the habitable zone will exist, as its presence and limits change along with the star's stellar evolution. The Big Bang occurred 13.8 billion years ago, the Solar System was formed 4.6 billion years ago, and the first hominids appeared 6 million years ago. Life on other planets may have started, evolved, given birth to extraterrestrial intelligences, and perhaps even faced a planetary extinction event millions or billions of years ago. When considered from a cosmic perspective, the brief existence of Earth's species suggests that extraterrestrial life may be equally fleeting on such a scale.
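As noted above, habitable-zone distances vary with the type of star. A common first-order approximation, not stated in the article and used here purely as an illustrative assumption, scales the Earth-equivalent orbital distance with the square root of the star's luminosity, i.e. the orbit that receives the same stellar flux Earth does:

```python
import math

def earth_equivalent_distance_au(luminosity_solar):
    """First-order habitable-zone estimate: the orbital distance (in AU)
    receiving the same flux Earth gets, scaling as sqrt(luminosity)."""
    return math.sqrt(luminosity_solar)

# Illustrative luminosities in solar units; real habitable-zone models also
# account for the stellar spectrum, activity, and planetary atmospheres.
for name, lum in [("dim red dwarf", 0.01), ("Sun-like star", 1.0), ("bright F star", 4.0)]:
    print(f"{name}: ~{earth_equivalent_distance_au(lum):.2f} AU")
```

This captures why the zone hugs faint stars and lies far out from bright ones, though, as the text stresses, being at the right distance is only one of several habitability factors.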
During a period of about 7 million years, from about 10 to 17 million years after the Big Bang, the background temperature was between 373 and 273 K (100 and 0 °C; 212 and 32 °F), allowing the possibility of liquid water if any planets existed. Avi Loeb (2014) speculated that primitive life might in principle have appeared during this window, which he called "the Habitable Epoch of the Early Universe". Life on Earth is quite ubiquitous across the planet and has adapted over time to almost all the available environments in it; extremophiles and the deep biosphere thrive in even the most hostile ones. As a result, it is inferred that life on other celestial bodies may be equally adaptive. However, the origin of life is unrelated to its ease of adaptation and may have stricter requirements. A celestial body may not have any life on it, even if it were habitable.

Likelihood of existence

Life in the cosmos beyond Earth has not been observed. The hypothesis of ubiquitous extraterrestrial life relies on three main ideas. The first is that the size of the universe allows for plenty of planets with a similar habitability to Earth, and the age of the universe gives enough time for a long process analogous to the history of Earth to happen there. The second is that the substances that make life, such as carbon and water, are ubiquitous in the universe. The third is that the physical laws are universal, which means that the forces that would facilitate or prevent the existence of life would be the same as on Earth. According to this argument, made by scientists such as Carl Sagan and Stephen Hawking, it would be improbable for life not to exist somewhere other than Earth. This argument is embodied in the Copernican principle, which states that Earth does not occupy a unique position in the Universe, and the mediocrity principle, which states that there is nothing special about life on Earth.
Other authors consider instead that life in the cosmos, or at least multicellular life, may actually be rare. The Rare Earth hypothesis maintains that life on Earth is possible because of a series of factors that range from the location in the galaxy and the configuration of the Solar System to local characteristics of the planet, and that it is unlikely that another planet simultaneously meets all such requirements. The proponents of this hypothesis consider that very little evidence suggests the existence of extraterrestrial life and that, at this point, it is just a desired result and not a reasonable scientific explanation for any gathered data. In 1961, astronomer and astrophysicist Frank Drake devised the Drake equation as a way to stimulate scientific dialogue at a meeting on the search for extraterrestrial intelligence (SETI). The Drake equation is a probabilistic argument used to estimate the number of active, communicative extraterrestrial civilizations in the Milky Way galaxy. The Drake equation is:

N = R* · fp · ne · fl · fi · fc · L

where N is the number of detectable civilizations in the Milky Way, R* is the average rate of star formation, fp is the fraction of stars with planetary systems, ne is the number of planets per such system that could support life, fl is the fraction of those on which life actually appears, fi is the fraction of life-bearing planets that develop intelligent life, fc is the fraction of civilizations that release detectable signals, and L is the length of time over which such signals are released. Drake's proposed estimates are as follows, though the numbers on the right side of the equation are agreed to be speculative and open to substitution:

10,000 = 5 · 0.5 · 2 · 1 · 0.2 · 1 · 10,000

The Drake equation has proved controversial since, although it is written as a math equation, none of its values were known at the time. Although some values may eventually be measured, others are based on social sciences and are not knowable by their very nature. This does not allow one to make noteworthy conclusions from the equation. Based on observations from the Hubble Space Telescope, there are nearly 2 trillion galaxies in the observable universe. It is estimated that at least ten percent of all Sun-like stars have a system of planets. In other words, there are 6.25×10^18 stars with planets orbiting them in the observable universe.
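Drake's illustrative numbers multiply out directly. A minimal sketch using those same values (which, as noted, are speculative placeholders rather than measurements):

```python
# Drake's illustrative values, as quoted in the text above
R_star = 5       # average rate of star formation (stars per year)
f_p    = 0.5     # fraction of stars with planetary systems
n_e    = 2       # habitable planets per such system
f_l    = 1       # fraction of those on which life appears
f_i    = 0.2     # fraction of those developing intelligent life
f_c    = 1       # fraction releasing detectable signals
L      = 10_000  # years such civilizations remain detectable

N = R_star * f_p * n_e * f_l * f_i * f_c * L
print(N)  # 10000.0
```

Swapping in different guesses for any factor changes N wildly, which is exactly the criticism the text raises: the equation's form is simple, but its inputs are unknown.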
Even if it is assumed that only one out of a billion of these stars has planets supporting life, there would be some 6.25 billion life-supporting planetary systems in the observable universe. A 2013 study based on results from the Kepler spacecraft estimated that the Milky Way contains at least as many planets as it does stars, resulting in 100–400 billion exoplanets. The nebular hypothesis, which explains the formation of the Solar System and other planetary systems, suggests that planetary systems can have many configurations, and not all of them may have rocky planets within the habitable zone. The apparent contradiction between high estimates of the probability of the existence of extraterrestrial civilisations and the lack of evidence for such civilisations is known as the Fermi paradox. Dennis W. Sciama claimed that life's existence in the universe depends on various fundamental constants. Zhi-Wei Wang and Samuel L. Braunstein suggest that a random universe capable of supporting life is likely to be just barely able to do so, offering a potential explanation of the Fermi paradox.

Biochemical basis

If extraterrestrial life exists, it could range from simple microorganisms and multicellular organisms similar to animals or plants, to complex alien intelligences akin to humans. When scientists talk about extraterrestrial life, they consider all those types. Although it is possible that extraterrestrial life may have other configurations, scientists use the hierarchy of lifeforms from Earth for simplicity, as it is the only one known to exist. The first basic requirement for life is an environment out of thermodynamic equilibrium, which means that the equilibrium must be broken by a source of energy. The traditional sources of energy in the cosmos are the stars; life on Earth, for example, depends on the energy of the Sun. However, there are alternative energy sources, such as volcanoes, plate tectonics, and hydrothermal vents.
There are ecosystems on Earth in deep areas of the ocean that do not receive sunlight and take energy from black smokers instead. Magnetic fields and radioactivity have also been proposed as sources of energy, although they would be less efficient ones. Life on Earth requires water in a liquid state as a solvent in which biochemical reactions take place. It is highly unlikely that an abiogenesis process can start within a gaseous or solid medium: atoms move either too fast or too slow in those media, making it difficult for specific ones to meet and start chemical reactions. A liquid medium also allows the transport of nutrients and substances required for metabolism. Sufficient quantities of carbon and other elements, along with water, might enable the formation of living organisms on terrestrial planets with a chemical make-up and temperature range similar to that of Earth. Life based on ammonia rather than water has been suggested as an alternative, though this solvent appears less suitable than water. It is also conceivable that there are forms of life whose solvent is a liquid hydrocarbon, such as methane, ethane or propane. Another unknown aspect of potential extraterrestrial life is the chemical elements that would compose it. Life on Earth is largely composed of carbon, but there could be other hypothetical types of biochemistry. A replacement for carbon would need to be able to create complex molecules, store the information required for evolution, and be freely available in the medium. To create DNA, RNA, or a close analog, such an element should be able to bind its atoms with many others, creating complex and stable molecules. It should be able to form at least three covalent bonds: two for making long strings and at least a third to add new links and allow for diverse information. Only nine elements meet this requirement: boron, nitrogen, phosphorus, arsenic and antimony (three bonds), and carbon, silicon, germanium and tin (four bonds).
As for abundance, carbon, nitrogen, and silicon are the most abundant of these elements in the universe, far more so than the others. In Earth's crust the most abundant of these elements is silicon, in the hydrosphere it is carbon, and in the atmosphere it is carbon and nitrogen. Silicon, however, has disadvantages compared with carbon. The molecules formed with silicon atoms are less stable and more vulnerable to acids, oxygen, and light. An ecosystem of silicon-based lifeforms would require very low temperatures, high atmospheric pressure, an atmosphere devoid of oxygen, and a solvent other than water. The low temperatures required would add an extra problem: the difficulty of kickstarting a process of abiogenesis to create life in the first place. Norman Horowitz, head of the Jet Propulsion Laboratory bioscience section for the Mariner and Viking missions from 1965 to 1976, considered that the great versatility of the carbon atom makes it the element most likely to provide solutions, even exotic solutions, to the problems of survival of life on other planets. However, he also considered that the conditions found on Mars were incompatible with carbon-based life. Even if extraterrestrial life is based on carbon and uses water as a solvent, like Earth life, it may still have a radically different biochemistry. Life is generally considered to be a product of natural selection. It has been proposed that to undergo natural selection a living entity must have the capacity to replicate itself, the capacity to avoid damage and decay, and the capacity to acquire and process resources in support of the first two capacities. Life on Earth may have started with an RNA world and later evolved to its current form, where some of the RNA tasks were transferred to DNA and proteins. Extraterrestrial life may still be stuck using RNA, or may have evolved into other configurations. It is unclear whether our biochemistry is the most efficient one that could be generated, or which elements would follow a similar pattern.
However, it is likely that, even if cells had a different composition to those from Earth, they would still have a cell membrane. Life on Earth jumped from prokaryotes to eukaryotes and from unicellular organisms to multicellular organisms through evolution. So far no alternative process to achieve such a result has been conceived, even a hypothetical one. Evolution requires life to be divided into individual organisms, and no alternative organisation has been satisfactorily proposed either. At the basic level, membranes define the limit of a cell, between it and its environment, while remaining partially open to exchange energy and resources with it. The evolution from simple cells to eukaryotes, and from them to multicellular lifeforms, is not guaranteed. The Cambrian explosion took place billions of years after the origin of life, and its causes are not fully known yet. On the other hand, the jump to multicellularity took place several times, which suggests that it could be a case of convergent evolution, and so likely to take place on other planets as well. Palaeontologist Simon Conway Morris considers that convergent evolution would lead to kingdoms similar to our plants and animals, and that many features are likely to develop in alien animals as well, such as bilateral symmetry, limbs, digestive systems and heads with sensory organs. Scientists from the University of Oxford analysed the question from the perspective of evolutionary theory and wrote in a study in the International Journal of Astrobiology that aliens may be similar to humans. The planetary context would also have an influence: a planet with higher gravity would have smaller animals, and other types of stars could lead to non-green photosynthesizers.
The amount of energy available would also affect biodiversity, as an ecosystem sustained by black smokers or hydrothermal vents would have less energy available than one sustained by a star's light and heat, and so its lifeforms would not grow beyond a certain complexity. There is also research assessing the capacity of life to develop intelligence. It has been suggested that this capacity arises with the number of potential niches a planet contains, and that the complexity of life itself is reflected in the information density of planetary environments, which in turn can be computed from its niches. Conditions on the other planets of the Solar System, as well as on worlds in the many galaxies beyond the Milky Way, are very harsh and seem too extreme to harbor any life. The environmental conditions on these planets can include intense UV radiation, extreme temperatures, lack of water, and other factors that do not seem to favor the creation or maintenance of extraterrestrial life. However, there is considerable evidence that some of the earliest and most basic forms of life on Earth originated in extreme environments that seem unlikely to have harbored life. Fossil evidence, along with theories backed by years of research and study, has marked environments like hydrothermal vents and acidic hot springs as some of the first places where life could have originated on Earth. These environments can be considered extreme when compared to the typical ecosystems that the majority of life on Earth now inhabits, as hydrothermal vents are scorching hot where magma escaping from the Earth's mantle meets the much colder oceanic water.
Even today, a diverse population of bacteria can be found inhabiting the areas surrounding these hydrothermal vents, which suggests that some form of life could be supported even in the harshest of environments, such as those on other planets in the Solar System. What makes these harsh environments plausible sites for the origin of life on Earth, as well as for the possible emergence of life on other planets, is that chemical reactions form spontaneously in them. For example, the hydrothermal vents found on the ocean floor are known to support many chemosynthetic processes, which allow organisms to obtain energy from reduced chemical compounds that fix carbon. In turn, these reactions allow organisms to live in relatively low-oxygen environments while maintaining enough energy to support themselves. The early Earth environment was reducing, and therefore these carbon-fixing, reduced chemical compounds were important for the survival and possible origin of life on Earth. From the little information that scientists have gathered about the atmospheres of other planets in the Milky Way galaxy and beyond, those atmospheres are most likely reducing, or very low in oxygen, especially compared with Earth's atmosphere. If the necessary elements and ions were present on these planets, the same carbon-fixing chemistry occurring around hydrothermal vents could also occur on their surfaces and possibly result in the origin of extraterrestrial life.

Planetary habitability in the Solar System

The Solar System has a wide variety of planets, dwarf planets, and moons, and each one is studied for its potential to host life. Each one has its own specific conditions that may benefit or harm life. So far, the only lifeforms found are those from Earth. No intelligent civilization other than humanity's exists or has ever existed within the Solar System.
Astrobiologist Mary Voytek points out that it would be unlikely to find large ecosystems, as they would have already been detected by now. The inner Solar System is likely devoid of life. However, Venus is still of interest to astrobiologists, as it is a terrestrial planet that was likely similar to Earth in its early stages and developed in a different way. It has a runaway greenhouse effect and the hottest surface in the Solar System, with sulfuric acid clouds, no remaining surface liquid water, and a thick carbon-dioxide atmosphere under enormous pressure. Comparing the two planets helps to understand the precise differences that lead to beneficial or harmful conditions for life. Despite the conditions against life on Venus, there are suspicions that microbial life-forms may still survive in its high-altitude clouds. Mars is a cold and almost airless desert, inhospitable to life. However, recent studies revealed that water on Mars used to be quite abundant, forming rivers, lakes, and perhaps even oceans. Mars may have been habitable back then, and life on Mars may have been possible. But when the planetary core ceased to generate a magnetic field, solar winds removed the atmosphere and the planet became vulnerable to solar radiation. Ancient life-forms may still have left fossilised remains, and microbes may still survive deep underground. The gas giants and ice giants are unlikely to contain life. The most distant Solar System bodies, found in the Kuiper Belt and beyond, are locked in permanent deep-freeze, but cannot be ruled out completely. Although the giant planets themselves are highly unlikely to have life, there is much hope of finding it on the moons orbiting these planets. Europa, in the Jovian system, has a subsurface ocean below a thick layer of ice. Ganymede and Callisto also have subsurface oceans, but life is less likely in them because the water is sandwiched between layers of solid ice.
Europa's ocean would be in contact with the rocky seabed, which helps chemical reactions. It may be difficult to dig deep enough to study that ocean, though. Enceladus, a tiny moon of Saturn with another subsurface ocean, may not need to be drilled into, as it releases water into space in eruption columns. The space probe Cassini flew through one of these, but could not make a full study because NASA had not anticipated this phenomenon and did not equip the probe to study ocean water. Still, Cassini detected complex organic molecules, salts, evidence of hydrothermal activity, hydrogen, and methane. Titan is the only celestial body in the Solar System besides Earth that has liquid bodies on its surface. It has rivers, lakes, and rain of hydrocarbons such as methane and ethane, and even a cycle similar to Earth's water cycle. This special context encourages speculation about lifeforms with a different biochemistry, but the cold temperatures would make such chemistry take place at a very slow pace. Water is rock-solid on the surface, but Titan does have a subsurface water ocean like several other moons. However, it lies at such great depth that it would be very difficult to access for study.

Scientific search

The science that searches for and studies life in the universe, both on Earth and elsewhere, is called astrobiology. Through the study of Earth's life, the only known form of life, astrobiology seeks to understand how life starts and evolves and the requirements for its continued existence. This helps to determine what to look for when searching for life on other celestial bodies. This is a complex area of study and uses the combined perspectives of several scientific disciplines, such as astronomy, biology, chemistry, geology, oceanography, and atmospheric sciences. The scientific search for extraterrestrial life is being carried out both directly and indirectly.
As of September 2017, 3,667 exoplanets in 2,747 systems had been identified, and other planets and moons in the Solar System hold the potential for hosting primitive life such as microorganisms. As of 8 February 2021, an updated status of studies considering the possible detection of lifeforms on Venus (via phosphine) and Mars (via methane) was reported. Scientists search for biosignatures within the Solar System by studying planetary surfaces and examining meteorites. Some claim to have identified evidence that microbial life has existed on Mars. In 1996, a controversial report stated that structures resembling nanobacteria had been discovered in a meteorite, ALH84001, formed of rock ejected from Mars. Although all the unusual properties of the meteorite were eventually explained as the result of inorganic processes, the controversy over its discovery laid the groundwork for the development of astrobiology. An experiment on the two Viking Mars landers reported gas emissions from heated Martian soil samples that some scientists argue are consistent with the presence of living microorganisms. Lack of corroborating evidence from other experiments on the same samples suggests that a non-biological reaction is the more likely hypothesis. In February 2005, NASA scientists reported that they may have found some evidence of extraterrestrial life on Mars. The two scientists, Carol Stoker and Larry Lemke of NASA's Ames Research Center, based their claim on methane signatures found in Mars's atmosphere resembling the methane production of some forms of primitive life on Earth, as well as on their own study of primitive life near the Rio Tinto river in Spain. NASA officials soon distanced the agency from the scientists' claims, and Stoker herself backed off from her initial assertions. In November 2011, NASA launched the Mars Science Laboratory, which landed the Curiosity rover on Mars.
It is designed to assess the past and present habitability of Mars using a variety of scientific instruments. The rover landed in Gale Crater in August 2012. A group of scientists at Cornell University started a catalog of microorganisms recording the way each one reacts to sunlight. The goal is to help with the search for similar organisms on exoplanets, as the starlight reflected by planets rich in such organisms would have a specific spectrum, unlike that of starlight reflected from lifeless planets. If Earth were studied from afar with this system, it would reveal a shade of green, as a result of the abundance of plants using photosynthesis. In August 2011, NASA studied meteorites found in Antarctica, finding adenine, guanine, hypoxanthine, and xanthine. Adenine and guanine are components of DNA, and the others are used in other biological processes. The studies ruled out contamination of the meteorites on Earth, as those components would not be freely available in the way they were found in the samples. This discovery suggests that several organic molecules that serve as building blocks of life may be generated within asteroids and comets. In October 2011, scientists reported that cosmic dust contains complex organic compounds ("amorphous organic solids with a mixed aromatic-aliphatic structure") that could be created naturally, and rapidly, by stars. It is still unclear whether those compounds played a role in the creation of life on Earth, but Sun Kwok, of the University of Hong Kong, thinks so: "If this is the case, life on Earth may have had an easier time getting started as these organics can serve as basic ingredients for life." In August 2012, in a world first, astronomers at Copenhagen University reported the detection of a specific sugar molecule, glycolaldehyde, in a distant star system. The molecule was found around the protostellar binary IRAS 16293-2422, which is located 400 light-years from Earth.
Glycolaldehyde is needed to form ribonucleic acid, or RNA, which is similar in function to DNA. This finding suggests that complex organic molecules may form in stellar systems prior to the formation of planets, eventually arriving on young planets early in their formation. In December 2023, astronomers reported the first discovery, in the plumes of Saturn's moon Enceladus, of hydrogen cyanide, a chemical possibly essential for life as we know it, as well as other organic molecules, some of which are yet to be better identified and understood. According to the researchers, "these [newly discovered] compounds could potentially support extant microbial communities or drive complex organic synthesis leading to the origin of life." Although most searches are focused on the biology of extraterrestrial life, an extraterrestrial intelligence capable of developing a civilization may be detectable by other means as well. Technology may generate technosignatures: effects on the native planet that cannot be explained by natural causes. There are three main types of technosignatures considered: interstellar communications, effects on the atmosphere, and planetary-sized structures such as Dyson spheres. Organizations such as the SETI Institute search the cosmos for potential forms of communication. They started with radio waves, and now search for laser pulses as well. The challenge for this search is that there are natural sources of such signals as well, such as gamma-ray bursts and supernovae, and the difference between a natural signal and an artificial one would lie in its specific patterns. Astronomers intend to use artificial intelligence for this, as it can manage large amounts of data and is devoid of biases and preconceptions. Besides, even if there is an advanced extraterrestrial civilization, there is no guarantee that it is transmitting radio communications in the direction of Earth.
The length of time required for a signal to travel across space means that a potential answer may arrive decades or centuries after the initial message. The atmosphere of Earth is rich in nitrogen dioxide as a result of air pollution, which can be detectable. The natural abundance of carbon, which is also relatively reactive, makes it likely to be a basic component of the development of a potential extraterrestrial technological civilization, as it is on Earth. Fossil fuels might be generated and used on such worlds as well. The abundance of chlorofluorocarbons in the atmosphere can also be a clear technosignature, considering their role in ozone depletion. Light pollution may be another technosignature, as multiple lights on the night side of a rocky planet can be a sign of advanced technological development. However, modern telescopes are not strong enough to study exoplanets with the level of detail required to perceive it. The Kardashev scale proposes that a civilization may eventually start consuming energy directly from its local star. This would require giant structures built next to it, called Dyson spheres. Those speculative structures would cause excess infrared radiation, which telescopes could notice. Excess infrared radiation is typical of young stars, surrounded by dusty protoplanetary disks that will eventually form planets; an older star such as the Sun would have no natural reason to emit it. The presence of heavy elements in a star's light spectrum is another potential technosignature; such elements would, in theory, be found if the star were being used as an incinerator or repository for nuclear waste products. Some astronomers search for extrasolar planets that may be conducive to life, narrowing the search to terrestrial planets within the habitable zones of their stars.
Since 1992, over four thousand exoplanets have been discovered (6,128 planets in 4,584 planetary systems, including 1,017 multiple planetary systems, as of 30 October 2025). The extrasolar planets so far discovered range in size from terrestrial planets similar to Earth to gas giants larger than Jupiter. The number of observed exoplanets is expected to increase greatly in the coming years. The Kepler space telescope has also detected a few thousand candidate planets, of which about 11% may be false positives. There is at least one planet on average per star. About 1 in 5 Sun-like stars have an "Earth-sized" planet in the habitable zone, with the nearest expected to be within 12 light-years of Earth. Assuming 200 billion stars in the Milky Way, that would be 11 billion potentially habitable Earth-sized planets in the Milky Way, rising to 40 billion if red dwarfs are included. The rogue planets in the Milky Way possibly number in the trillions. The nearest known exoplanet is Proxima Centauri b, located 4.2 light-years (1.3 pc) from Earth in the southern constellation of Centaurus. As of March 2014, the least massive exoplanet known is PSR B1257+12 A, which is about twice the mass of the Moon. The most massive planet listed on the NASA Exoplanet Archive is DENIS-P J082303.1−491201 b, about 29 times the mass of Jupiter; according to most definitions of a planet, however, it is too massive to be a planet and may be a brown dwarf instead. Almost all of the planets detected so far are within the Milky Way, but there have also been a few possible detections of extragalactic planets. The study of planetary habitability also considers a wide range of other factors in determining the suitability of a planet for hosting life.
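The "11 billion" figure above follows from simple multiplication. The sketch below reproduces it from the numbers in the text; note that the fraction of Milky Way stars that are Sun-like is not given in the text, so the value used here is an assumption chosen purely for illustration.

```python
# Back-of-envelope check of the habitable-planet estimate: ~200 billion stars
# in the Milky Way, with roughly 1 in 5 Sun-like stars hosting an Earth-sized
# habitable-zone planet, yields on the order of 11 billion candidates.

stars_in_milky_way = 200e9
sun_like_fraction = 0.25        # ASSUMED value, not stated in the text
hz_earth_sized_fraction = 0.22  # roughly the "1 in 5" figure from the text

candidates = stars_in_milky_way * sun_like_fraction * hz_earth_sized_fraction
print(f"{candidates:.2e}")  # → 1.10e+10, i.e. about 11 billion
```

Including red dwarfs, which vastly outnumber Sun-like stars, is what raises the estimate toward the 40 billion figure cited in the text.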
One sign that a planet probably already contains life is the presence of an atmosphere with significant amounts of oxygen, since that gas is highly reactive and generally would not last long without constant replenishment. This replenishment occurs on Earth through photosynthetic organisms. One way to analyse the atmosphere of an exoplanet is through spectroscopy as the planet transits its star, though this might only be feasible with dim stars like white dwarfs.

History and cultural impact

The modern concept of extraterrestrial life is based on assumptions that were not commonplace during the early days of astronomy. The first explanations for the celestial objects seen in the night sky were based on mythology. Scholars from Ancient Greece were the first to consider that the universe is inherently understandable, rejecting explanations based on incomprehensible supernatural forces, such as the myth of the Sun being pulled across the sky in the chariot of Apollo. They had not yet developed the scientific method and based their ideas on pure thought and speculation, but they developed precursor ideas to it, such as the principle that explanations had to be discarded if they contradicted observable facts. The discussions of those Greek scholars established many of the pillars that would eventually lead to the idea of extraterrestrial life, such as Earth being round and not flat. The cosmos was first structured in a geocentric model, which held that the Sun and all other celestial bodies revolve around Earth. However, the Greeks did not consider those bodies to be worlds. In the Greek understanding, the world was composed of both Earth and the celestial objects with noticeable movements. Anaximander thought that the cosmos was made from apeiron, a substance that created the world, and that the world would eventually return to the cosmos.
Eventually two groups emerged: the atomists, who thought that matter both on Earth and in the cosmos was made of small atoms of the classical elements (earth, water, fire and air), and the Aristotelians, who thought that those elements were exclusive to Earth and that the cosmos was made of a fifth one, the aether. The atomist Epicurus thought that the processes that created the world, its animals and plants should have created other worlds elsewhere, along with their own animals and plants. Aristotle thought instead that all of the earth element naturally fell towards the center of the universe, which would make it impossible for other planets to exist elsewhere. Under that reasoning, Earth was not only at the center, it was also the only planet in the universe. Cosmic pluralism, the plurality of worlds, or simply pluralism, describes the philosophical belief in numerous "worlds" in addition to Earth, which might harbor extraterrestrial life. The earliest recorded assertion of extraterrestrial human life is found in the ancient scriptures of Jainism. There are multiple "worlds" mentioned in Jain scriptures that support human life. These include, among others, Bharat Kshetra, Mahavideh Kshetra, Airavat Kshetra, and Hari kshetra. Medieval Muslim writers like Fakhr al-Din al-Razi and Muhammad al-Baqir supported cosmic pluralism on the basis of the Qur'an. Chaucer's poem The House of Fame engaged in medieval thought experiments that postulated the plurality of worlds. However, those ideas about other worlds were different from current knowledge about the structure of the universe, and did not postulate the existence of planetary systems other than the Solar System. When those authors talked about other worlds, they meant places located at the center of their own systems, with their own stellar vaults and cosmos surrounding them. The Greek ideas and the disputes between atomists and Aristotelians outlived ancient Greek civilization itself.
The Great Library of Alexandria compiled information about these ideas, part of which was translated by Islamic scholars and thus survived the end of the Library. Baghdad combined the knowledge of the Greeks, the Indians, the Chinese and its own scholars, and the knowledge spread through the Byzantine Empire. From there it eventually returned to Europe by the time of the Middle Ages. However, as the Greek atomist doctrine held that the world was created by random movements of atoms, with no need for a creator deity, it became associated with atheism, and the dispute intertwined with religious ones. Still, the Church did not react to those topics in a homogeneous way, and there were stricter and more permissive views within the Church itself. The first known mention of the term 'panspermia' was in the writings of the 5th-century BC Greek philosopher Anaxagoras, who proposed the idea that life exists everywhere. By the time of the late Middle Ages there were many known inaccuracies in the geocentric model, but it was kept in use because naked-eye observations provided limited data. Nicolaus Copernicus started the Copernican Revolution by proposing that the planets revolve around the Sun rather than Earth. His proposal had little acceptance at first because, as he kept the assumption that orbits were perfect circles, his model led to as many inaccuracies as the geocentric one. Tycho Brahe improved the available data with naked-eye observatories, which worked with highly complex sextants and quadrants. Tycho could not make sense of his observations, but Johannes Kepler did: orbits were not perfect circles, but ellipses. This insight benefited the Copernican model, which now worked almost perfectly. The invention of the telescope a short time later, perfected by Galileo Galilei, dispelled the final doubts, and the paradigm shift was complete.
Under this new understanding, the notion of extraterrestrial life became feasible: if Earth is just one planet orbiting a star, there may be planets similar to Earth elsewhere. The astronomical study of distant bodies also proved that physical laws are the same elsewhere in the universe as on Earth, with nothing making the planet truly special. The new ideas were met with resistance from the Catholic Church. Galileo was tried for defending the heliocentric model, which was considered heretical, and was forced to recant it. The best-known early-modern proponent of ideas of extraterrestrial life was the Italian philosopher Giordano Bruno, who argued in the 16th century for an infinite universe in which every star is surrounded by its own planetary system. Bruno wrote that other worlds "have no less virtue nor a nature different to that of our earth" and, like Earth, "contain animals and inhabitants". Bruno's belief in the plurality of worlds was one of the charges leveled against him by the Inquisition, which tried and executed him. The heliocentric model was further strengthened by Isaac Newton's theory of gravity, which provided the mathematics that explains the motions of all things in the universe, including planetary orbits. By this point, the geocentric model had been definitively discarded. By this time, the use of the scientific method had become standard, and new discoveries were expected to provide evidence and rigorous mathematical explanations. Science also took a deeper interest in the mechanics of natural phenomena, trying to explain not just the way nature works but also the reasons it works that way. There had been very little actual discussion about extraterrestrial life before this point, as the Aristotelian ideas remained influential while geocentrism was still accepted.
When it was finally proved wrong, it not only meant that Earth was not the center of the universe, but also that the lights seen in the sky were not just lights, but physical objects. The notion that life might exist on them as well soon became an ongoing topic of discussion, although one with no practical means of investigation. The possibility of extraterrestrials remained a widespread speculation as scientific discovery accelerated. William Herschel, the discoverer of Uranus, was one of many 18th–19th-century astronomers who believed that the Solar System is populated by alien life. Other scholars of the period who championed "cosmic pluralism" included Immanuel Kant and Benjamin Franklin. At the height of the Enlightenment, even the Sun and Moon were considered candidates for extraterrestrial inhabitants. Speculation about life on Mars increased in the late 19th century, following telescopic observation of apparent Martian canals – which soon, however, turned out to be optical illusions. Despite this, in 1895, American astronomer Percival Lowell published his book Mars, followed by Mars and its Canals in 1906, proposing that the canals were the work of a long-gone civilisation. Spectroscopic analysis of Mars's atmosphere began in earnest in 1894, when U.S. astronomer William Wallace Campbell showed that neither water nor oxygen was present in the Martian atmosphere. By 1909 better telescopes and the best perihelic opposition of Mars since 1877 conclusively put an end to the canal hypothesis. As a consequence of the belief in spontaneous generation, there was little thought given to the conditions of each celestial body: it was simply assumed that life would thrive anywhere. That belief was disproved by Louis Pasteur in the 19th century. 
Popular belief in thriving alien civilisations elsewhere in the Solar System remained strong until Mariner 4 and Mariner 9 provided close images of Mars, which forever debunked the idea of the existence of Martians and lowered the previous expectations of finding alien life in general. The end of the belief in spontaneous generation forced investigation into the origin of life. Although abiogenesis is the more accepted theory, a number of authors reclaimed the term "panspermia" and proposed that life was brought to Earth from elsewhere. Among those authors are Jöns Jacob Berzelius (1834), Kelvin (1871), Hermann von Helmholtz (1879) and, somewhat later, Svante Arrhenius (1903). The science fiction genre, although not yet so named, developed during the late 19th century. The growing treatment of extraterrestrials in fiction influenced popular perception of the real-life topic, making people eager to jump to conclusions about the discovery of aliens. Science marched at a slower pace: some discoveries fueled expectations, while others dashed excessive hopes. For example, with the advent of telescopes, most structures seen on the Moon or Mars were immediately attributed to Selenites or Martians, but more powerful instruments later revealed all such discoveries to be natural features. A famous case is the Cydonia region of Mars, first imaged by the Viking 1 orbiter. The low-resolution photos showed a rock formation that resembled a human face, but later spacecraft took photos in higher detail showing that there was nothing special about the site. The search for and study of extraterrestrial life became a science of its own, astrobiology. Also known as exobiology, this discipline is pursued by NASA, ESA, INAF, and other agencies. Astrobiology studies life from Earth as well, but with a cosmic perspective. 
For example, abiogenesis is of interest to astrobiology not because of the origin of life on Earth itself, but for the chance that a similar process took place on other celestial bodies. Many aspects of life, from its definition to its chemistry, are analyzed as either likely to be similar in all forms of life across the cosmos or native only to Earth. Astrobiology, however, remains constrained by the current lack of extraterrestrial life-forms to study: all life on Earth comes from the same ancestor, and it is hard to infer general characteristics from a group with only a single example to analyse. The 20th century came with great technological advances, speculation about future hypothetical technologies, and an increased basic knowledge of science among the general population thanks to science popularization through the mass media. Public interest in extraterrestrial life and the lack of discoveries by mainstream science led to the emergence of pseudosciences that provided affirmative, if questionable, answers to the existence of aliens. Ufology claims that many unidentified flying objects (UFOs) are spaceships of alien species, and the ancient astronauts hypothesis claims that aliens visited Earth in antiquity and prehistoric times but that the people of the era failed to understand it. Most UFOs or UFO sightings can be readily explained as sightings of Earth-based aircraft (including top-secret aircraft), known astronomical objects, weather phenomena, or hoaxes. Looking beyond the pseudosciences, Lewis White Beck strove to elevate the level of public discourse on the topic of extraterrestrial life by tracing the evolution of philosophical thought over the centuries from ancient times into the modern era. 
His review of the contributions made by Lucretius, Plutarch, Aristotle, Copernicus, Immanuel Kant, John Wilkins, Charles Darwin and Karl Marx demonstrated that even in modern times, humanity could be profoundly influenced in its search for extraterrestrial life by subtle and comforting archetypal ideas which are largely derived from firmly held religious, philosophical and existential belief systems. On a positive note, however, Beck further argued that even if the search for extraterrestrial life proves unsuccessful, the endeavor itself could have beneficial consequences by assisting humanity in its attempt to actualize superior ways of living here on Earth. By the 21st century, it was accepted that multicellular life in the Solar System can exist only on Earth, but interest in extraterrestrial life increased regardless. This is a result of advances in several sciences. Knowledge of planetary habitability makes it possible to consider, in scientific terms, the likelihood of finding life on each specific celestial body, as it is now known which features are beneficial and which are harmful to life. Astronomy and telescopes have also improved to the point that exoplanets can be confirmed and even studied, increasing the number of places to search. Life may still exist elsewhere in the Solar System in unicellular form, and advances in spacecraft make it possible to send robots to study samples in situ, with tools of growing complexity and reliability. Although no extraterrestrial life has been found and life may yet prove to be unique to Earth, there are scientific reasons to suspect that it can exist elsewhere, and technological advances that may detect it if it does. Many scientists are optimistic about the chances of finding alien life. In the words of SETI's Frank Drake, "All we know for sure is that the sky is not littered with powerful microwave transmitters". 
Drake noted that it is entirely possible that advanced technology results in communication being carried out in some way other than conventional radio transmission. At the same time, the data returned by space probes, and giant strides in detection methods, have allowed science to begin delineating habitability criteria on other worlds and to confirm that planets, at least, are plentiful, though aliens remain a question mark. The Wow! signal, detected in 1977 by a SETI project, remains a subject of speculative debate. On the other hand, other scientists are pessimistic. Jacques Monod wrote that "Man knows at last that he is alone in the indifferent immensity of the universe, whence he has emerged by chance". In 2000, geologist and paleontologist Peter Ward and astrobiologist Donald Brownlee published a book entitled Rare Earth: Why Complex Life is Uncommon in the Universe. In it, they discussed the Rare Earth hypothesis, in which they claim that Earth-like life is rare in the universe, whereas microbial life is common. Ward and Brownlee are open to the idea of evolution on other planets that is not based on essential Earth-like characteristics such as DNA and carbon. As for the possible risks, theoretical physicist Stephen Hawking warned in 2010 that humans should not try to contact alien life forms, as aliens might pillage Earth for resources. "If aliens visit us, the outcome would be much as when Columbus landed in America, which didn't turn out well for the Native Americans", he said. Jared Diamond had earlier expressed similar concerns. On 20 July 2015, Hawking and Russian billionaire Yuri Milner, along with the SETI Institute, announced a well-funded effort, called the Breakthrough Initiatives, to expand efforts to search for extraterrestrial life. The group contracted the services of the 100-meter Robert C. 
Byrd Green Bank Telescope in West Virginia in the United States and the 64-meter Parkes Telescope in New South Wales, Australia. On 13 February 2015, scientists (including Geoffrey Marcy, Seth Shostak, Frank Drake and David Brin) at a convention of the American Association for the Advancement of Science discussed Active SETI and whether transmitting a message to possible intelligent extraterrestrials in the cosmos was a good idea; one result was a statement, signed by many, that a "worldwide scientific, political and humanitarian discussion must occur before any message is sent". Government responses The 1967 Outer Space Treaty and the 1979 Moon Agreement define rules of planetary protection against potentially hazardous extraterrestrial life. COSPAR also provides guidelines for planetary protection. In 1977, a committee of the United Nations Office for Outer Space Affairs spent a year discussing strategies for interacting with extraterrestrial life or intelligence, but the discussion ended without any conclusions. As of 2010, the UN lacks response mechanisms in the case of extraterrestrial contact. One of the NASA divisions is the Office of Safety and Mission Assurance (OSMA), also known as the Planetary Protection Office. A part of its mission is to "rigorously preclude backward contamination of Earth by extraterrestrial life." In 2016, the Chinese government released a white paper detailing its space program. According to the document, one of the research objectives of the program is the search for extraterrestrial life. It is also one of the objectives of the Chinese Five-hundred-meter Aperture Spherical Telescope (FAST) program. In 2020, Dmitry Rogozin, the head of the Russian space agency, said the search for extraterrestrial life is one of the main goals of deep space research. He also acknowledged the possibility that primitive life exists on other planets of the Solar System. 
The French space agency has an office for the study of "unidentified aerospace phenomena". The agency maintains a publicly accessible database of such phenomena, with over 1,600 detailed entries. According to the head of the office, the vast majority of entries have a mundane explanation; but for 25% of entries, an extraterrestrial origin can neither be confirmed nor denied. In 2020, the chairman of the Israel Space Agency, Isaac Ben-Israel, stated that the probability of detecting life in outer space is "quite large". He disagreed, however, with his former colleague Haim Eshed, who stated that there are contacts between an advanced alien civilisation and some of Earth's governments. In fiction Although the idea of extraterrestrial peoples became feasible once astronomy developed enough to understand the nature of planets, they were not thought of as being any different from humans. With no scientific explanation for the origin of mankind and its relation to other species, there was no reason to expect them to be any other way. This changed with the 1859 book On the Origin of Species by Charles Darwin, which proposed the theory of evolution. With the notion that evolution on other planets might take other directions, science fiction authors created bizarre aliens, clearly distinct from humans. A usual way to do this was to add body features from other animals, such as insects or octopuses. Costuming and special-effects feasibility, alongside budget considerations, forced films and TV series to tone down the fantasy, but these limitations have lessened since the 1990s with the advent of computer-generated imagery (CGI), and later as CGI became more effective and less expensive. Real-life events sometimes captivate people's imagination, and this influences works of fiction. 
For example, during the Barney and Betty Hill incident, the first recorded claim of an alien abduction, the couple reported that they were abducted and experimented on by aliens with oversized heads, big eyes, pale grey skin, and small noses, a description that, once taken up in works of fiction, eventually became the grey alien archetype. See also Notes References Further reading External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Internet.org] | [TOKENS: 1680] |
Contents Internet.org Internet.org is a partnership between social networking services company Meta Platforms and six companies (Samsung, Ericsson, MediaTek, Opera Software, Nokia and Qualcomm) that plans to bring affordable access to selected Internet services to less developed countries by increasing efficiency and facilitating the development of new business models around the provision of Internet access. The app delivering these services was renamed Free Basics in September 2015. As of April 2018, 100 million people were using Internet.org. It has been criticized for violating net neutrality and, by handpicking the Internet services included, for discriminating against companies not on the list, including competitors of Meta Platforms' subsidiary Facebook. In February 2016, regulators banned the Free Basics service in India based on the "Prohibition of Discriminatory Tariffs for Data Services Regulations". The Telecom Regulatory Authority of India (TRAI) accused Facebook of failing to pass on the four questions in the regulator's consultation paper and of blocking access to TRAI's designated email for feedback on Free Basics. On February 11, 2016, Facebook withdrew the Free Basics platform from India. In July 2017, Global Voices published the wide-ranging report "Free Basics in Real Life", analyzing its practices in Africa, Asia and Latin America and concluding that it violates net neutrality, focuses on "Western corporate content", and overall "it's not even very helpful". History Internet.org was launched on August 20, 2013. At the time of launch, Facebook's founder and CEO Mark Zuckerberg released a ten-page whitepaper he had written elaborating on the vision, which asserts that connectivity is a "human right". In the paper, he wrote that Internet.org was a further step in the direction of Facebook's past initiatives, such as Facebook Zero, to improve Internet access for people around the world. 
During TechCrunch Disrupt on September 11, 2013, Zuckerberg elaborated further on his vision. TechCrunch compared Internet.org with Google's Project Loon. Zuckerberg also released a video on September 30, 2013, explaining Internet.org's goal of making the Internet 100 times more affordable. On February 24, 2014, shortly before a keynote presentation by Zuckerberg at the Mobile World Congress in Barcelona, Internet.org unveiled several new projects: an education partnership called SocialEDU with Nokia, local carrier AirTel, edX, and the government of Rwanda; a project with Unilever in India; and a new Internet.org Innovation Lab with Ericsson in its Menlo Park HQ. In the presentation, Zuckerberg said that Facebook's recent acquisition of mobile messaging app WhatsApp for $19 billion was closely related to the Internet.org vision. In May 2015, Facebook announced the Internet.org Platform, an open program for developers to easily create services that integrate with Internet.org. This was seen by commentators as a response to concerns raised over net neutrality. Participating websites must meet three criteria: On March 27, 2014, Facebook announced a connectivity lab as part of the Internet.org initiative, with the goal of bringing the Internet to everybody via drones acquired from the company Ascenta. Connectivity Lab also stated that low-Earth orbit and geosynchronous satellites would be part of the project for establishing Internet connectivity in other areas. All three systems would rely on free-space optics, where the signal is sent in a compact bundle of infrared light. At Mobile World Congress in March 2015, Mark Zuckerberg said that the Internet.org initiative was "willing to work" with Project Loon, Google's project to use high-altitude balloons to provide people cheaper Internet access. 
However, he emphasized that in his view, the real work is in partnering with existing telecommunications companies to improve access and reduce costs for people already within range of a network, which he estimated at over 80% of the population. In October 2015, Facebook and Eutelsat leased the entire Ka-band capacity (36 spot beams with a total throughput of 18 Gbit/s) on the planned AMOS-6 satellite to provide access to parts of Africa. AMOS-6 was intended to be launched on flight 29 of a SpaceX Falcon 9 to geosynchronous transfer orbit on 3 September 2016. However, on 1 September 2016, during the run-up to a static fire test, there was an anomaly on the launch pad resulting in a fire and the loss of the vehicle and its payload, AMOS-6. There were no injuries. In January 2016, Google exited Facebook's Free Basics platform, having been included in the initial trial of the project, which was first launched in Zambia. The first Internet.org summit was held on 9 October 2014 in New Delhi, India. The primary objective of this summit was to bring together experts, officials and industry leaders to focus on ways to deliver more Internet services for people in languages other than English. Zuckerberg also met Indian Prime Minister Narendra Modi to talk about how Facebook and the Indian government can collaborate on Internet.org. In 2015, after criticism of Internet.org, which has a partnership with Reliance in India, Mark Zuckerberg stated in an article for Hindustan Times that Internet.org and net neutrality can co-exist, and that Internet.org will never differentiate between services. His claims were contested by many response articles, including one published in the Hindustan Times. In May 2015, the Internet.org Platform, open to participation by any developers meeting specified guidelines, was announced. Some commentators viewed this announcement as a response to the net neutrality concerns expressed. 
The PMO has expressed displeasure at Facebook's reaction to and handling of TRAI's consultation paper, calling it a crudely majoritarian and orchestrated opinion poll. An Indian journalist, in his reply to Mark Zuckerberg's article, criticized Internet.org as "being just a Facebook proxy targeting India's poor" as it provides restricted Internet access to Reliance Telecom's subscribers in India. Until April 2015, Internet.org users could access (for free) only a few websites, and Facebook's role as gatekeeper in determining what websites were in that list was criticised for violating net neutrality. In May 2015, Facebook announced that the Free Basics Platform would be opened to websites that met its criteria. In April 2015, some Indian startups started pulling out of Internet.org to protect net neutrality. The Telecom Regulatory Authority of India (TRAI) in January 2016 criticized Facebook for its misleading commercials and astroturfing the Free Basics campaign. TRAI accused Facebook of failing to pass on the four questions in the regulator's consultation paper and also blocking access to TRAI's designated email for feedback on Free Basics. On February 8, 2016, TRAI banned the Free Basics service in India based on "Prohibition of Discriminatory Tariffs for Data Services Regulations, 2016" notification. On February 11, 2016 Facebook withdrew the Free Basics platform from India. In May 2017, Facebook, in partnership with Indian telecoms operator Bharti Airtel, launched a service under the Express Wi-Fi banner. Participants Below is a selective history of launch dates and participating mobile networks: Reception An article published on Datamation in August 2013 discussed Internet.org in relation to past accessibility initiatives by Facebook and Google such as Facebook Zero, Google Free Zone, and Project Loon. Internet.org and Project Loon have been described as being engaged in an Internet space race. 
There have also been technical debates about the relative feasibility and value of using balloons (as championed by Project Loon) instead of drones, with Mark Zuckerberg favoring drones. In December 2013, David Talbot wrote a detailed article for Technology Review titled Facebook's Two Faces: Facebook and Google Aim to Fix Global Connectivity, but for Whom? about Internet.org and other Internet accessibility initiatives. In 2015, researchers evaluating how Facebook Zero shapes information and communication technologies (ICT) usage in the developing world found that 11% of Indonesians who said they used Facebook also said they did not use the Internet. 65% of Nigerians, and 61% of Indonesians agree with the statement that "Facebook is the Internet" compared with only 5% in the US. See also References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Category:CS1_maint:_DOI_inactive_as_of_January_2026] | [TOKENS: 511] |
Category:CS1 maint: DOI inactive as of January 2026 This category lists pages that have cs1|2 templates that use |doi=, where a digital object identifier doi value has been specified but then recognized as inactive. These are collected in Category:CS1 maint: DOI inactive. This may represent: Pages in this category should only be added by Module:Citation/CS1. By default, Citation Style 1 and Citation Style 2 error messages are visible to all readers and maintenance messages are hidden from all readers. To display maintenance messages in the rendered article, include the following text in your common CSS page (common.css) or your specific skin's CSS page (skin.css). (Note to new editors: those CSS pages are specific to you, and control your view of pages, by adding to your user account's CSS code. If you have not yet created such a page, then clicking one of the .css links above will yield a page that starts "Wikipedia does not have a user page with this exact name." Click the "Start the User:username/filename page" link, paste the text below, save the page, follow the instructions at the bottom of the new page on bypassing your browser's cache, and finally, in order to see the previously hidden maintenance messages, refresh the page you were editing earlier.) To display hidden-by-default error messages: Even with this CSS installed, older pages in Wikipedia's cache may not have been updated to show these error messages even though the page is listed in one of the tracking categories. A null edit will resolve that issue. After (error and/or maintenance) messages are displayed, it might still not be easy to find them in a large article with a lot of citations. Messages can then be found by searching (with Ctrl-F) for "(help)" or "cs1". To hide normally-displayed error messages: You can personalize the display of these messages (such as changing the color), but if you do not understand how, ask someone who knows CSS or ask at the technical village pump. 
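The display and hide rules described above can be sketched in CSS. The class names below follow the cs1-* conventions used by Module:Citation/CS1, but treat the exact selectors as assumptions and check the live help page before pasting them into your user CSS:

```css
/* Show hidden-by-default CS1 maintenance messages
   (e.g. "CS1 maint: DOI inactive"). */
.mw-parser-output span.cs1-maint {
    display: inline;
}

/* Conversely, hide the normally-displayed CS1 error messages. */
.mw-parser-output span.cs1-visible-error {
    display: none;
}
```

Either rule goes into your common.css or skin.css page as described above; remember to bypass your browser's cache afterwards.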
Nota bene: these CSS rules are not obeyed by Navigation popups. They also do not hide script warning messages in the Preview box that begin with "This is only a preview; your changes have not yet been saved". Contents Pages in category "CS1 maint: DOI inactive as of January 2026" The following 200 pages are in this category, out of approximately 557 total. This list may not reflect recent changes. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Eurodollar] | [TOKENS: 1256] |
Contents Eurodollar Eurodollars are U.S. dollars held in time deposit accounts in banks outside the United States. The term was originally applied to U.S. dollar accounts held in banks situated in Europe, but it expanded over the years to cover U.S. dollar accounts held anywhere outside the U.S. Thus, a U.S. dollar-denominated deposit in Dubai or Singapore would likewise be deemed a Eurodollar deposit (sometimes an Asiadollar). More generally, the euro- prefix can be used to indicate any currency held in a country where it is not the official currency, broadly termed "eurocurrency", for example, Euroyen or even Euroeuro. Eurodollars have different regulatory requirements than dollars held in U.S. banks. Eurodollars can be riskier than assets held in U.S. banks, which carry at least partial deposit insurance, and as a result demand a higher interest rate. There is no connection with the euro currency of the European Union. Eurodollars facilitate global trade, investment, and liquidity. History After World War II, the quantity of physical U.S. dollar banknotes outside the United States increased significantly, as a result of both the dollar funding of the Marshall Plan and the dollar proceeds of European exports to the U.S., which had become the largest consumer market. As a result, large amounts of U.S. dollar banknotes were in the custody of foreign banks outside the United States. Some foreign countries, including the Soviet Union, also had deposits in U.S. dollars in American banks, evidenced by certificates of deposit. Various accounts are given of the creation of the first eurodollar account, but most trace back to Communist governments keeping dollar deposits abroad. In one version, the first eurodollar account was created in France in favour of Communist China, which in 1949 managed to move almost all of its U.S. 
dollar banknotes to the Soviet-owned Banque Commerciale pour l'Europe du Nord – Eurobank in Paris before the United States froze its remaining U.S. situated assets during the Korean War. In another version, the first eurodollar account was created by an English bank in favour of the Soviet Union during the Cold War, following the Hungarian Revolution of 1956, as the Soviet Union feared that its deposits in North American banks would be frozen as a sanction. It therefore decided to move some of its U.S. dollars held directly in North American banks to the Moscow Narodny Bank, an English limited liability company registered in London in 1919, whose shares were owned by the Soviet Union. The English bank would then re-deposit the dollars into U.S. banks. Thus although in reality the dollars never left North America, there would be no chance of the U.S. confiscating that money, because now it belonged legally to the British bank and not directly to the Soviets, the beneficial owners. Accordingly, on 28 February 1957, the sum of $800,000 was duly transferred, creating the first eurodollars. Initially dubbed "Eurobank dollars" after the bank's telex address, they eventually became known as "eurodollars" as such deposits were at first held mostly by European banks and financial institutions. City of London banks, such as Midland Bank, now part of HSBC, and their offshore holding companies also played a major role in holding the deposits. In the mid-1950s, Eurodollar trading and its development into a dominant world currency began when the Soviet Union wanted better interest rates on their Eurodollars and convinced an Italian banking cartel to give them more interest than could have been earned if the dollars were deposited in the U.S. The Italian bankers then had to find customers ready to borrow the Soviet dollars and pay above the U.S. legal interest-rate caps for their use, and were able to do so; thus, Eurodollars began to be used increasingly in global finance. 
By the end of the 1960s, the eurodollar market was $70 billion. These deposits were lent on as U.S. dollar loans to businesses in other countries, where interest rates on loans in the local currency were perhaps much higher, and where the businesses were exporting to the U.S. and receiving payment in dollars, thereby avoiding foreign exchange risk on their funding arrangements. In 1974, after the Nixon shock, the 1970s energy crisis, and the collapse of Franklin National Bank, 10 central banks worldwide agreed to backstop the eurodollar market to prevent a run. By the mid-1980s, there were more eurodollars than dollars. Several factors led eurodollars to overtake certificates of deposit (CDs) issued by U.S. banks as the primary private short-term money market instruments by the 1980s. In 1997, nearly 90% of all international loans were made via Eurodollars. Until the repeal of Regulation Q on 21 July 2011, banks were not allowed to pay interest on corporate transactional accounts. Banks would automatically transfer, or sweep, funds from a corporation's checking account into an overnight investment option such as Eurodollar sweep accounts to effectively earn interest on those funds. In 2016, the Eurodollar market size was estimated at around $13.833 trillion. Since 2016, the use of Eurodollars has been in consistent decline. After reserve requirements were eliminated in 2020, U.S. banks began shifting toward selected deposits (domestic, offshore-style instruments) instead of Eurodollars. As of early 2024, selected deposits made up nearly 85% of overnight volume, compared to about 50–50 in 2019. Eurodollar futures contracts The Eurodollar futures contract was launched in 1981. It was the first cash-settled futures contract, and traded on the Chicago Mercantile Exchange. Eurodollar futures were an instrument used to wager on Federal Reserve policy or to hedge the direction of short-term interest rates. 
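Although the text does not spell it out, these contracts used the standard IMM index convention: the price is quoted as 100 minus the implied annualized three-month rate (in percent), so on the $1,000,000 notional with a 90/360 day count, each 0.01 of price movement ("one basis point") is worth $25 per contract. A minimal sketch of that arithmetic (the helper names are illustrative, not an exchange API):

```python
NOTIONAL = 1_000_000                       # standard contract size in dollars
BP_VALUE = NOTIONAL * 0.0001 * 90 / 360    # $25 per basis point of price

def implied_rate(price: float) -> float:
    """Annualized 3-month rate (in percent) implied by an IMM-style price."""
    return 100.0 - price

def long_pnl(buy_price: float, sell_price: float) -> float:
    """Dollar profit on one long contract, at $25 per basis point."""
    basis_points = (sell_price - buy_price) * 100
    return basis_points * BP_VALUE

print(implied_rate(97.50))      # a price of 97.50 implies a 2.50% rate
print(long_pnl(97.50, 97.75))   # a 25 bp rally earns $625.00 per contract
```

This inverse quoting is why the contracts could be used to wager on Federal Reserve policy: a trader expecting rate cuts buys the contract, since falling rates push the price up.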
In April 2023, in the aftermath of the Libor scandal, they were discontinued and transitioned to SOFR-based contracts. See also References |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Internet#cite_ref-142] | [TOKENS: 9291] |
Contents Internet The Internet (or internet)[a] is the global system of interconnected computer networks that uses the Internet protocol suite (TCP/IP)[b] to communicate between networks and devices. It is a network of networks that comprises private, public, academic, business, and government networks of local to global scope, linked by electronic, wireless, and optical networking technologies. The Internet carries a vast range of information services and resources, such as the interlinked hypertext documents and applications of the World Wide Web (WWW), electronic mail, discussion groups, internet telephony, streaming media and file sharing. Most traditional communication media, including telephone, radio, television, paper mail, newspapers, and print publishing, have been transformed by the Internet, giving rise to new media such as email, online music, digital newspapers, news aggregators, and audio and video streaming websites. The Internet has enabled and accelerated new forms of personal interaction through instant messaging, Internet forums, and social networking services. Online shopping has also grown to occupy a significant market across industries, enabling firms to extend brick and mortar presences to serve larger markets. Business-to-business and financial services on the Internet affect supply chains across entire industries. The origins of the Internet date back to research that enabled the time-sharing of computer resources, the development of packet switching, and the design of computer networks for data communication. The set of communication protocols to enable internetworking on the Internet arose from research and development commissioned in the 1970s by the Defense Advanced Research Projects Agency (DARPA) of the United States Department of Defense in collaboration with universities and researchers across the United States and in the United Kingdom and France. 
The Internet has no single centralized governance in either technological implementation or policies for access and usage. Each constituent network sets its own policies. The overarching definitions of the two principal name spaces on the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). The technical underpinning and standardization of the core protocols is an activity of the non-profit Internet Engineering Task Force (IETF).

Terminology

The word internetted was used as early as 1849, meaning interconnected or interwoven. The word Internet was used in 1945 by the United States War Department in a radio operator's manual, and in 1974 as the shorthand form of Internetwork. Today, the term Internet most commonly refers to the global system of interconnected computer networks, though it may also refer to any group of smaller networks. The word Internet may be capitalized as a proper noun, although this is becoming less common. This reflects the tendency in English to capitalize new terms and move them to lowercase as they become familiar. The word is sometimes still capitalized to distinguish the global internet from smaller networks, though many publications, including the AP Stylebook since 2016, recommend the lowercase form in every case. In 2016, the Oxford English Dictionary found that, based on a study of around 2.5 billion printed and online sources, "Internet" was capitalized in 54% of cases. The terms Internet and World Wide Web are often used interchangeably; it is common to speak of "going on the Internet" when using a web browser to view web pages. However, the World Wide Web, or the Web, is only one of a large number of Internet services. It is the global collection of web pages, documents and other web resources linked by hyperlinks and URLs.
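The two principal name spaces described above can be illustrated with a short Python sketch (an illustrative aside, not part of the encyclopedia text): the DNS maps a human-readable name into the IP address space. "localhost" is used here so that no network access is required; a public name such as wikipedia.org would resolve the same way.

```python
import ipaddress
import socket

# DNS lookup: map a host name into the IP address space.
# "localhost" avoids any real network traffic in this sketch.
infos = socket.getaddrinfo("localhost", 80, proto=socket.IPPROTO_TCP)
addresses = sorted({sockaddr[0] for _, _, _, _, sockaddr in infos})

# Each result is an address in either the IPv4 (32-bit) or IPv6 (128-bit) space.
for text in addresses:
    addr = ipaddress.ip_address(text)
    print(text, "IPv" + str(addr.version))
```

The same two-step structure (name resolution, then addressing) underlies essentially every Internet application.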
History

In the 1960s, computer scientists began developing systems for time-sharing of computer resources. J. C. R. Licklider proposed the idea of a universal network while working at Bolt Beranek & Newman and, later, leading the Information Processing Techniques Office at the Advanced Research Projects Agency (ARPA) of the United States Department of Defense. Research into packet switching, one of the fundamental Internet technologies, started in the work of Paul Baran at RAND in the early 1960s and, independently, Donald Davies at the United Kingdom's National Physical Laboratory in 1965. After the Symposium on Operating Systems Principles in 1967, packet switching from the proposed NPL network was incorporated into the design of the ARPANET, an experimental resource sharing network proposed by ARPA. ARPANET development began with two network nodes which were interconnected between the University of California, Los Angeles and the Stanford Research Institute on 29 October 1969. The third site was at the University of California, Santa Barbara, followed by the University of Utah. By the end of 1971, 15 sites were connected to the young ARPANET. Thereafter, the ARPANET gradually developed into a decentralized communications network, connecting remote centers and military bases in the United States. Other user networks and research networks, such as the Merit Network and CYCLADES, were developed in the late 1960s and early 1970s. Early international collaborations for the ARPANET were rare. Connections were made in 1973 to Norway (NORSAR and, later, NDRE) and to Peter Kirstein's research group at University College London, which provided a gateway to British academic networks, the first internetwork for resource sharing. ARPA projects, the International Network Working Group and commercial initiatives led to the development of various protocols and standards by which multiple separate networks could become a single network, or a network of networks.
In 1974, Vint Cerf at Stanford University and Bob Kahn at DARPA published a proposal for "A Protocol for Packet Network Intercommunication". Cerf and his graduate students used the term internet as a shorthand for internetwork in RFC 675. The Internet Experiment Notes and later RFCs repeated this use. The work of Louis Pouzin and Robert Metcalfe had important influences on the resulting TCP/IP design. National PTTs and commercial providers developed the X.25 standard and deployed it on public data networks. The ARPANET initially served as a backbone for the interconnection of regional academic and military networks in the United States to enable resource sharing. Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In 1982, the Internet Protocol Suite (TCP/IP) was standardized, which facilitated worldwide proliferation of interconnected networks. TCP/IP network access expanded again in 1986 when the National Science Foundation Network (NSFNet) provided access to supercomputer sites in the United States for researchers, first at speeds of 56 kbit/s and later at 1.5 Mbit/s and 45 Mbit/s. The NSFNet expanded into academic and research organizations in Europe, Australia, New Zealand and Japan in 1988–89. Although other network protocols such as UUCP and PTT public data networks had global reach well before this time, this marked the beginning of the Internet as an intercontinental network. Commercial Internet service providers emerged in 1989 in the United States and Australia. The ARPANET was decommissioned in 1990. The linking of commercial networks and enterprises by the early 1990s, as well as the advent of the World Wide Web, marked the beginning of the transition to the modern Internet. 
Steady advances in semiconductor technology and optical networking created new economic opportunities for commercial involvement in the expansion of the network in its core and for delivering services to the public. In mid-1989, MCI Mail and Compuserve established connections to the Internet, delivering email and public access products to the half million users of the Internet. Just months later, on 1 January 1990, PSInet launched an alternate Internet backbone for commercial use; one of the networks that added to the core of the commercial Internet of later years. In March 1990, the first high-speed T1 (1.5 Mbit/s) link between the NSFNET and Europe was installed between Cornell University and CERN, allowing much more robust communications than was possible with satellites. Later in 1990, Tim Berners-Lee began writing WorldWideWeb, the first web browser, after two years of lobbying CERN management. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the HyperText Transfer Protocol (HTTP) 0.9, the HyperText Markup Language (HTML), the first Web browser (which was also an HTML editor and could access Usenet newsgroups and FTP files), the first HTTP server software (later known as CERN httpd), the first web server, and the first Web pages that described the project itself. In 1991 the Commercial Internet eXchange was founded, allowing PSInet to communicate with the other commercial networks CERFnet and Alternet. Stanford Federal Credit Union was the first financial institution to offer online Internet banking services to all of its members in October 1994. In 1996, OP Financial Group, also a cooperative bank, became the second online bank in the world and the first in Europe. By 1995, the Internet was fully commercialized in the U.S. when the NSFNet was decommissioned, removing the last restrictions on use of the Internet to carry commercial traffic.
As technology advanced and commercial opportunities fueled reciprocal growth, the volume of Internet traffic began to exhibit growth characteristics similar to the scaling of MOS transistors, exemplified by Moore's law, doubling every 18 months. This growth, formalized as Edholm's law, was catalyzed by advances in MOS technology, laser light wave systems, and noise performance. Since 1995, the Internet has tremendously impacted culture and commerce, including the rise of near-instant communication by email, instant messaging, telephony (Voice over Internet Protocol or VoIP), two-way interactive video calls, and the World Wide Web. Increasing amounts of data are transmitted at higher and higher speeds over fiber optic networks operating at 1 Gbit/s, 10 Gbit/s, or more. The Internet continues to grow, driven by ever-greater amounts of online information and knowledge, commerce, entertainment and social networking services. During the late 1990s, it was estimated that traffic on the public Internet grew by 100 percent per year, while the mean annual growth in the number of Internet users was thought to be between 20% and 50%. This growth is often attributed to the lack of central administration, which allows organic growth of the network, as well as the non-proprietary nature of the Internet protocols, which encourages vendor interoperability and prevents any one company from exerting too much control over the network. In November 2006, the Internet was included on USA Today's list of the New Seven Wonders. As of 31 March 2011, the estimated total number of Internet users was 2.095 billion (30% of world population). It is estimated that in 1993 the Internet carried only 1% of the information flowing through two-way telecommunication. By 2000 this figure had grown to 51%, and by 2007 more than 97% of all telecommunicated information was carried over the Internet.
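As a back-of-the-envelope check (an illustrative aside, not from the article), a doubling period of 18 months compounds to roughly a hundredfold increase per decade:

```python
# Traffic that doubles every 18 months grows by 2^(120/18) over a 120-month decade.
months_per_decade = 120
doubling_period_months = 18
growth_factor = 2 ** (months_per_decade / doubling_period_months)
print(round(growth_factor, 1))  # roughly a hundredfold
```

This is the same compounding arithmetic that underlies Moore's law and Edholm's law.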
Modern smartphones can access the Internet through cellular carrier networks, and internet usage by mobile and tablet devices exceeded desktop worldwide for the first time in October 2016. As of 2018, 80% of the world's population were covered by a 4G network. The International Telecommunication Union (ITU) estimated that, by the end of 2017, 48% of individual users regularly connect to the Internet, up from 34% in 2012. Mobile Internet connectivity has played an important role in expanding access in recent years, especially in Asia and the Pacific and in Africa. The number of unique mobile cellular subscriptions increased from 3.9 billion in 2012 to 4.8 billion in 2016, two-thirds of the world's population, with more than half of subscriptions located in Asia and the Pacific. The limits that users face on accessing information via mobile applications coincide with a broader process of fragmentation of the Internet. Fragmentation restricts access to media content and tends to affect the poorest users the most. One solution, zero-rating, is the practice of Internet service providers allowing users free connectivity to access specific content or applications without cost.

Social impact

The Internet has enabled new forms of social interaction, activities, and social associations, giving rise to the scholarly study of the sociology of the Internet. Between 2000 and 2009, the number of Internet users globally rose from 390 million to 1.9 billion. By 2010, 22% of the world's population had access to computers with 1 billion Google searches every day, 300 million Internet users reading blogs, and 2 billion videos viewed daily on YouTube. In 2014 the world's Internet users surpassed 3 billion or 44 percent of world population, but two-thirds came from the richest countries, with 78 percent of Europeans using the Internet, followed by 57 percent of the Americas.
However, by 2018, Asia alone accounted for 51% of all Internet users, with 2.2 billion out of the 4.3 billion Internet users in the world. China's Internet users surpassed a major milestone in 2018, when the country's Internet regulatory authority, China Internet Network Information Centre, announced that China had 802 million users. China was followed by India, with some 700 million users, with the United States third with 275 million users. However, in terms of penetration, in 2022, China had a 70% penetration rate compared to India's 60% and the United States's 90%. In 2022, 54% of the world's Internet users were based in Asia, 14% in Europe, 7% in North America, 10% in Latin America and the Caribbean, 11% in Africa, 4% in the Middle East and 1% in Oceania. In 2019, Kuwait, Qatar, the Falkland Islands, Bermuda and Iceland had the highest Internet penetration by the number of users, with 93% or more of the population with access. As of 2022, it was estimated that 5.4 billion people use the Internet, more than two-thirds of the world's population. Early computer systems were limited to the characters in the American Standard Code for Information Interchange (ASCII), a subset of the Latin alphabet. After English (27%), the most requested languages on the World Wide Web are Chinese (25%), Spanish (8%), Japanese (5%), Portuguese and German (4% each), Arabic, French and Russian (3% each), and Korean (2%). Modern character encoding standards, such as Unicode, allow for development and communication in the world's widely used languages. However, some glitches such as mojibake (incorrect display of some languages' characters) still remain. 
Several neologisms exist that refer to Internet users: Netizen (as in "citizen of the net") refers to those actively involved in improving online communities, the Internet in general or surrounding political affairs and rights such as free speech, Internaut refers to operators or technically highly capable users of the Internet, digital citizen refers to a person using the Internet in order to engage in society, politics, and government participation. The Internet allows greater flexibility in working hours and location, especially with the spread of unmetered high-speed connections. The Internet can be accessed almost anywhere by numerous means, including through mobile Internet devices. Mobile phones, datacards, handheld game consoles and cellular routers allow users to connect to the Internet wirelessly. Educational material at all levels from pre-school (e.g. CBeebies) to post-doctoral (e.g. scholarly literature through Google Scholar) is available on websites. The internet has facilitated the development of virtual universities and distance education, enabling both formal and informal education. The Internet allows researchers to conduct research remotely via virtual laboratories, with profound changes in reach and generalizability of findings as well as in communication between scientists and in the publication of results. By the late 2010s the Internet had been described as "the main source of scientific information" for the majority of the global North population. Wikis have also been used in the academic community for sharing and dissemination of information across institutional and international boundaries. In those settings, they have been found useful for collaboration on grant writing, strategic planning, departmental documentation, and committee work. The United States Patent and Trademark Office uses a wiki to allow the public to collaborate on finding prior art relevant to examination of pending patent applications.
Queens, New York has used a wiki to allow citizens to collaborate on the design and planning of a local park. The English Wikipedia has the largest user base among wikis on the World Wide Web and ranks in the top 10 among all sites in terms of traffic. The Internet has been a major outlet for leisure activity since its inception, with entertaining social experiments such as MUDs and MOOs being conducted on university servers, and humor-related Usenet groups receiving much traffic. Many Internet forums have sections devoted to games and funny videos. Another area of leisure activity on the Internet is multiplayer gaming. This form of recreation creates communities, where people of all ages and origins enjoy the fast-paced world of multiplayer games. These range from MMORPG to first-person shooters, from role-playing video games to online gambling. While online gaming has been around since the 1970s, modern modes of online gaming began with subscription services such as GameSpy and MPlayer. Streaming media is the real-time delivery of digital media for immediate consumption or enjoyment by end users. Streaming companies (such as Netflix, Disney+, Amazon's Prime Video, Mubi, Hulu, and Apple TV+) now dominate the entertainment industry, eclipsing traditional broadcasters. Audio streamers such as Spotify and Apple Music also have significant market share in the audio entertainment market. Video sharing websites are also a major factor in the entertainment ecosystem. YouTube was founded on 15 February 2005 and is now the leading website for free streaming video with more than two billion users. It uses a web player to stream and show video files. YouTube users watch hundreds of millions, and upload hundreds of thousands, of videos daily. 
Other video sharing websites include Vimeo, Instagram and TikTok. Although many governments have attempted to restrict both Internet pornography and online gambling, this has generally failed to stop their widespread popularity. A number of advertising-funded ostensible video sharing websites known as "tube sites" have been created to host shared pornographic video content. Due to laws requiring the documentation of the origin of pornography, these websites now largely operate in conjunction with pornographic movie studios and their own independent creator networks, acting as de facto video streaming services. Major players in this field include the market leader Aylo, the operator of PornHub and numerous other branded sites, as well as other independent operators such as xHamster and Xvideos. As of 2023, Internet traffic to pornographic video sites rivalled that of mainstream video streaming and sharing services. Remote work is facilitated by tools such as groupware, virtual private networks, conference calling, videotelephony, and VoIP so that work may be performed from any location, such as the worker's home. The spread of low-cost Internet access in developing countries has opened up new possibilities for peer-to-peer charities, which allow individuals to contribute small amounts to charitable projects for other individuals. Websites, such as DonorsChoose and GlobalGiving, allow small-scale donors to direct funds to individual projects of their choice. A popular twist on Internet-based philanthropy is the use of peer-to-peer lending for charitable purposes. Kiva pioneered this concept in 2005, offering the first web-based service to publish individual loan profiles for funding. The low cost and nearly instantaneous sharing of ideas, knowledge, and skills have made collaborative work dramatically easier, with the help of collaborative software, which allows groups to easily form, cheaply communicate, and share ideas.
A notable example of such collaboration is the free software movement, which has produced, among other things, Linux, Mozilla Firefox, and OpenOffice.org (later forked into LibreOffice). Content management systems allow collaborating teams to work on shared sets of documents simultaneously without accidentally destroying each other's work. The internet also allows for cloud computing, virtual private networks, remote desktops, and remote work. The online disinhibition effect describes the tendency of many individuals to behave more stridently or offensively online than they would in person. A significant number of feminist women have been the target of various forms of harassment, ranging from insults and hate speech to, in extreme cases, rape and death threats, in response to posts they have made on social media. Social media companies have been criticized in the past for not doing enough to aid victims of online abuse. Children also face dangers online such as cyberbullying and approaches by sexual predators, who sometimes pose as children themselves. Due to naivety, they may also post personal information about themselves online, which could put them or their families at risk unless warned not to do so. Many parents choose to enable Internet filtering or supervise their children's online activities in an attempt to protect their children from pornography or violent content on the Internet. The most popular social networking services commonly forbid users under the age of 13. However, these policies can be circumvented by registering an account with a false birth date, and a significant number of children aged under 13 join such sites. Social networking services for younger children, which claim to provide better levels of protection for children, also exist. Internet usage has been correlated to users' loneliness.
Lonely people tend to use the Internet as an outlet for their feelings and to share their stories with others, such as in the "I am lonely will anyone speak to me" thread. Cyberslacking can become a drain on corporate resources; employees spend a significant amount of time surfing the Web while at work. Internet addiction disorder is excessive computer use that interferes with daily life. Nicholas G. Carr believes that Internet use has other effects on individuals, for instance improving skills of scan-reading and interfering with the deep thinking that leads to true creativity. Electronic business encompasses business processes spanning the entire value chain: purchasing, supply chain management, marketing, sales, customer service, and business relationships. E-commerce seeks to add revenue streams using the Internet to build and enhance relationships with clients and partners. According to International Data Corporation, the size of worldwide e-commerce, when global business-to-business and -consumer transactions are combined, equated to $16 trillion in 2013. A report by Oxford Economics added those two together to estimate the total size of the digital economy at $20.4 trillion, equivalent to roughly 13.8% of global sales. While much has been written of the economic advantages of Internet-enabled commerce, there is also evidence that some aspects of the Internet such as maps and location-aware services may serve to reinforce economic inequality and the digital divide. Electronic commerce may be responsible for consolidation and the decline of mom-and-pop, brick and mortar businesses resulting in increases in income inequality. A 2013 Institute for Local Self-Reliance report states that brick-and-mortar retailers employ 47 people for every $10 million in sales, while Amazon employs only 14. Similarly, the 700-employee room rental start-up Airbnb was valued at $10 billion in 2014, about half as much as Hilton Worldwide, which employs 152,000 people.
At that time, Uber employed 1,000 full-time employees and was valued at $18.2 billion, about the same valuation as Avis Rent a Car and The Hertz Corporation combined, which together employed almost 60,000 people. Advertising on popular web pages can be lucrative. Online advertising is a form of marketing and advertising which uses the Internet to deliver promotional marketing messages to consumers. It includes email marketing, search engine marketing (SEM), social media marketing, many types of display advertising (including web banner advertising), and mobile advertising. In 2011, Internet advertising revenues in the United States surpassed those of cable television and nearly exceeded those of broadcast television. Many common online advertising practices are controversial and increasingly subject to regulation. The Internet has achieved new relevance as a political tool. The presidential campaign of Howard Dean in 2004 in the United States was notable for its success in soliciting donations via the Internet. Many political groups use the Internet to achieve a new method of organizing for carrying out their mission, giving rise to Internet activism. Social media websites, such as Facebook and Twitter, helped people organize the Arab Spring, by helping activists organize protests, communicate grievances, and disseminate information. Many have understood the Internet as an extension of the Habermasian notion of the public sphere, observing how network communication technologies provide something like a global civic forum. However, incidents of politically motivated Internet censorship have now been recorded in many countries, including western democracies. E-government is the use of technological communications devices, such as the Internet, to provide public services to citizens and other persons in a country or region.
E-government offers opportunities for more direct and convenient citizen access to government and for government provision of services directly to citizens. Cybersectarianism is a new organizational form that involves: highly dispersed small groups of practitioners that may remain largely anonymous within the larger social context and operate in relative secrecy, while still linked remotely to a larger network of believers who share a set of practices and texts, and often a common devotion to a particular leader. Overseas supporters provide funding and support; domestic practitioners distribute tracts, participate in acts of resistance, and share information on the internal situation with outsiders. Collectively, members and practitioners of such sects construct viable virtual communities of faith, exchanging personal testimonies and engaging in the collective study via email, online chat rooms, and web-based message boards. In particular, the British government has raised concerns about the prospect of young British Muslims being indoctrinated into Islamic extremism by material on the Internet, being persuaded to join terrorist groups such as the so-called "Islamic State", and then potentially committing acts of terrorism on returning to Britain after fighting in Syria or Iraq.

Applications and services

The Internet carries many applications and services, most prominently the World Wide Web, including social media, electronic mail, mobile applications, multiplayer online games, Internet telephony, file sharing, and streaming media services. The World Wide Web is a global collection of documents, images, multimedia, applications, and other resources, logically interrelated by hyperlinks and referenced with Uniform Resource Identifiers (URIs), which provide a global system of named references. URIs symbolically identify services, web servers, databases, and the documents and resources that they can provide.
HyperText Transfer Protocol (HTTP) is the main access protocol of the World Wide Web. Web services also use HTTP for communication between software systems, for information transfer, and for sharing and exchanging business data and logistics; it is one of many protocols that can be used for communication on the Internet. World Wide Web browser software, such as Microsoft Edge, Mozilla Firefox, Opera, Apple's Safari, and Google Chrome, enable users to navigate from one web page to another via the hyperlinks embedded in the documents. These documents may also contain computer data, including graphics, sounds, text, video, multimedia and interactive content. Client-side scripts can include animations, games, office applications and scientific demonstrations. Email is an important communications service available via the Internet. The concept of sending electronic text messages between parties, analogous to mailing letters or memos, predates the creation of the Internet. Internet telephony is a common communications service realized with the Internet. The name of the principal internetworking protocol, the Internet Protocol, lends its name to voice over Internet Protocol (VoIP). VoIP systems now dominate many markets, being as easy and convenient as a traditional telephone, while having substantial cost savings, especially over long distances. File sharing is the practice of transferring large amounts of data in the form of computer files across the Internet, for example via file servers. The load of bulk downloads to many users can be eased by the use of "mirror" servers or peer-to-peer networks. Access to the file may be controlled by user authentication, the transit of the file over the Internet may be obscured by encryption, and money may change hands for access to the file. The price can be paid by the remote charging of funds from, for example, a credit card whose details are also passed—usually fully encrypted—across the Internet.
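The request-response exchange that HTTP defines can be sketched as the plain text the protocol actually transmits (an illustrative aside: example.org is a placeholder host and the response is canned, so no network traffic occurs):

```python
# A minimal HTTP/1.1 exchange, written out as raw protocol text.
request = (
    "GET /index.html HTTP/1.1\r\n"
    "Host: example.org\r\n"
    "Connection: close\r\n"
    "\r\n"
)

# A canned response illustrating a server's status line, headers, and body.
response = (
    "HTTP/1.1 200 OK\r\n"
    "Content-Type: text/html\r\n"
    "Content-Length: 5\r\n"
    "\r\n"
    "hello"
)

# Parse the response the way a simple client would.
status_line, remainder = response.split("\r\n", 1)
version, status, reason = status_line.split(" ", 2)
header_blob, body = remainder.split("\r\n\r\n", 1)
headers = dict(line.split(": ", 1) for line in header_blob.split("\r\n"))
print(version, status, headers["Content-Type"], body)
```

A real client would write the request bytes to a TCP connection and read the response back; the text format is the same either way.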
The origin and authenticity of the file received may be checked by a digital signature.

Governance

The Internet is a global network that comprises many voluntarily interconnected autonomous networks. It operates without a central governing body. The technical underpinning and standardization of the core protocols (IPv4 and IPv6) is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise. While the hardware components in the Internet infrastructure can often be used to support other software systems, it is the design and the standardization process of the software that characterizes the Internet and provides the foundation for its scalability and success. The responsibility for the architectural design of the Internet software systems has been assumed by the IETF. The IETF conducts standard-setting work groups, open to any individual, about the various aspects of Internet architecture. The resulting contributions and standards are published as Request for Comments (RFC) documents on the IETF web site. The principal methods of networking that enable the Internet are contained in specially designated RFCs that constitute the Internet Standards. Other less rigorous documents are simply informative, experimental, or historical, or document the best current practices when implementing Internet technologies. To maintain interoperability, the principal name spaces of the Internet are administered by the Internet Corporation for Assigned Names and Numbers (ICANN). ICANN is governed by an international board of directors drawn from across the Internet technical, business, academic, and other non-commercial communities. The organization coordinates the assignment of unique identifiers for use on the Internet, including domain names, IP addresses, application port numbers in the transport protocols, and many other parameters.
Globally unified name spaces are essential for maintaining the global reach of the Internet. This role of ICANN distinguishes it as perhaps the only central coordinating body for the global Internet. The National Telecommunications and Information Administration, an agency of the United States Department of Commerce, had final approval over changes to the DNS root zone until the IANA stewardship transition on 1 October 2016. Regional Internet registries (RIRs) were established for five regions of the world to assign IP address blocks and other Internet parameters to local registries, such as Internet service providers, from a designated pool of addresses set aside for each region. The Internet Society (ISOC) was founded in 1992 with a mission to "assure the open development, evolution and use of the Internet for the benefit of all people throughout the world". Its members include individuals as well as corporations, organizations, governments, and universities. Among other activities ISOC provides an administrative home for a number of less formally organized groups that are involved in developing and managing the Internet, including: the Internet Engineering Task Force (IETF), Internet Architecture Board (IAB), Internet Engineering Steering Group (IESG), Internet Research Task Force (IRTF), and Internet Research Steering Group (IRSG). On 16 November 2005, the United Nations-sponsored World Summit on the Information Society in Tunis established the Internet Governance Forum (IGF) to discuss Internet-related issues.

Infrastructure

The communications infrastructure of the Internet consists of its hardware components and a system of software layers that control various aspects of the architecture. As with any computer network, the Internet physically consists of routers, media (such as cabling and radio links), repeaters, and modems.
However, as an example of internetworking, many of the network nodes are not necessarily Internet equipment per se. Internet packets are carried by other full-fledged networking protocols, with the Internet acting as a homogeneous networking standard, running across heterogeneous hardware, with the packets guided to their destinations by IP routers.[citation needed] Internet service providers (ISPs) establish worldwide connectivity between individual networks at various levels of scope. At the top of the routing hierarchy are the tier 1 networks, large telecommunication companies that exchange traffic directly with each other via very-high-speed fiber-optic cables, governed by peering agreements. Tier 2 and lower-level networks buy Internet transit from other providers to reach at least some parties on the global Internet, though they may also engage in peering. End-users who only access the Internet when needed to perform a function or obtain information represent the bottom of the routing hierarchy.[citation needed] An ISP may use a single upstream provider for connectivity, or implement multihoming to achieve redundancy and load balancing. Internet exchange points are major traffic exchanges with physical connections to multiple ISPs. Large organizations, such as academic institutions, large enterprises, and governments, may perform the same function as ISPs, engaging in peering and purchasing transit on behalf of their internal networks. Research networks tend to interconnect with large subnetworks such as GEANT, GLORIAD, Internet2, and the UK's national research and education network, JANET.[citation needed] Common methods of Internet access by users include broadband over coaxial cable, fiber optics or copper wires, Wi-Fi, satellite, and cellular telephone technology.[citation needed] Grassroots efforts have led to wireless community networks.
Commercial Wi-Fi services that cover large areas are available in many cities, such as New York, London, Vienna, Toronto, San Francisco, Philadelphia, Chicago and Pittsburgh. Most servers that provide internet services are today hosted in data centers, and content is often accessed through high-performance content delivery networks. Colocation centers often host private peering connections between their customers, internet transit providers, cloud providers, meet-me rooms for connecting customers together, Internet exchange points, and landing points and terminal equipment for fiber optic submarine communication cables, connecting the internet. Internet Protocol Suite The Internet standards describe a framework known as the Internet protocol suite (also called TCP/IP, based on the first two components). This is a suite of protocols that are ordered into a set of four conceptual layers by the scope of their operation, originally documented in RFC 1122 and RFC 1123: the application layer, the transport layer, the internet layer, and the link layer. The most prominent component of the Internet model is the Internet Protocol. IP enables internetworking, essentially establishing the Internet itself. Two versions of the Internet Protocol exist, IPv4 and IPv6.[citation needed] Aside from the complex array of physical connections that make up its infrastructure, the Internet is facilitated by bi- or multi-lateral commercial contracts (e.g., peering agreements), and by technical specifications or protocols that describe the exchange of data over the network.[citation needed] For locating individual computers on the network, the Internet provides IP addresses. IP addresses are used by the Internet infrastructure to direct internet packets to their destinations. They consist of fixed-length numbers, which are found within the packet. IP addresses are generally assigned to equipment either automatically via the Dynamic Host Configuration Protocol (DHCP), or are configured manually.[citation needed] The Domain Name System (DNS) converts user-entered domain names (e.g.
"en.wikipedia.org") into IP addresses.[citation needed] Internet Protocol version 4 (IPv4) defines an IP address as a 32-bit number. IPv4 is the initial version used on the first generation of the Internet and is still in dominant use. It was designed in 1981 to address up to ≈4.3 billion (4.3×10⁹) hosts. However, the explosive growth of the Internet has led to IPv4 address exhaustion, which entered its final stage in 2011, when the global IPv4 address allocation pool was exhausted. Because of the growth of the Internet and the depletion of available IPv4 addresses, a new version of IP, IPv6, was developed in the mid-1990s, which provides vastly larger addressing capabilities and more efficient routing of Internet traffic. IPv6 uses 128 bits for the IP address and was standardized in 1998. IPv6 deployment has been ongoing since the mid-2000s and is currently in growing deployment around the world, since Internet address registries began to urge all resource managers to plan rapid adoption and conversion. By design, IPv6 is not directly interoperable with IPv4. Instead, it establishes a parallel version of the Internet not directly accessible with IPv4 software. Thus, translation facilities exist for internetworking, and some nodes have duplicate networking software for both networks. Essentially all modern computer operating systems support both versions of the Internet Protocol.[citation needed] Network infrastructure, however, has been lagging in this development.[citation needed] A subnet or subnetwork is a logical subdivision of an IP network. Computers that belong to a subnet are addressed with an identical most-significant bit-group in their IP addresses. This results in the logical division of an IP address into two fields, the network number or routing prefix and the rest field or host identifier.
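The prefix/host division can be sketched with Python's standard ipaddress module; as a minimal illustration, the 198.51.100.0/24 block used here is one of the IPv4 prefixes reserved for documentation:

```python
import ipaddress

# IPv4 addresses are 32-bit numbers; IPv6 addresses are 128-bit numbers.
v4 = ipaddress.ip_address("198.51.100.7")
v6 = ipaddress.ip_address("2001:db8::1")
print(v4.max_prefixlen, v6.max_prefixlen)  # 32 128

# A /24 prefix leaves 32 - 24 = 8 bits for host addressing.
net = ipaddress.ip_network("198.51.100.0/24")
print(net.network_address)  # 198.51.100.0
print(net.netmask)          # 255.255.255.0
print(net.num_addresses)    # 256
print(v4 in net)            # True

# The routing prefix is recovered by ANDing an address with the netmask.
prefix = ipaddress.ip_address(int(v4) & int(net.netmask))
print(prefix)               # 198.51.100.0
```

The same operations work unchanged for IPv6 prefixes such as 2001:db8::/32, since the module handles both address families.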
The rest field is an identifier for a specific host or network interface.[citation needed] The routing prefix may be expressed in Classless Inter-Domain Routing (CIDR) notation written as the first address of a network, followed by a slash character (/), and ending with the bit-length of the prefix. For example, 198.51.100.0/24 is the prefix of the Internet Protocol version 4 network starting at the given address, having 24 bits allocated for the network prefix, and the remaining 8 bits reserved for host addressing. Addresses in the range 198.51.100.0 to 198.51.100.255 belong to this network. The IPv6 address specification 2001:db8::/32 is a large address block with 2⁹⁶ addresses, having a 32-bit routing prefix.[citation needed] For IPv4, a network may also be characterized by its subnet mask or netmask, which is the bitmask that when applied by a bitwise AND operation to any IP address in the network, yields the routing prefix. Subnet masks are also expressed in dot-decimal notation like an address. For example, 255.255.255.0 is the subnet mask for the prefix 198.51.100.0/24.[citation needed] Computers and routers use routing tables in their operating system to forward IP packets to reach a node on a different subnetwork. Routing tables are maintained by manual configuration or automatically by routing protocols. End-nodes typically use a default route that points toward an ISP providing transit, while ISP routers use the Border Gateway Protocol to establish the most efficient routing across the complex connections of the global Internet.[citation needed] The default gateway is the node that serves as the forwarding host (router) to other networks when no other route specification matches the destination IP address of a packet. Security Internet resources, hardware, and software components are the target of criminal or malicious attempts to gain unauthorized control to cause interruptions, commit fraud, engage in blackmail or access private information.
Malware is malicious software used and distributed via the Internet. It includes computer viruses, which are copied with the help of humans; computer worms, which copy themselves automatically; software for denial-of-service attacks; ransomware; botnets; and spyware that reports on the activity and typing of users.[citation needed] Usually, these activities constitute cybercrime. Defense theorists have also speculated about the possibility of hackers waging cyber warfare using similar methods on a large scale. Malware poses serious problems to individuals and businesses on the Internet. According to Symantec's 2018 Internet Security Threat Report (ISTR), the number of malware variants rose to 669,947,865 in 2017, twice as many as in 2016. Cybercrime, which includes malware attacks as well as other crimes committed by computer, was predicted to cost the world economy US$6 trillion in 2021, and is increasing at a rate of 15% per year. Since 2021, malware has been designed to target computer systems that run critical infrastructure such as the electricity distribution network. Malware can be designed to evade antivirus software detection algorithms. The vast majority of computer surveillance involves the monitoring of data and traffic on the Internet. In the United States for example, under the Communications Assistance For Law Enforcement Act, all phone calls and broadband Internet traffic (emails, web traffic, instant messaging, etc.) are required to be available for unimpeded real-time monitoring by Federal law enforcement agencies. Under the Act, all U.S.
telecommunications providers are required to install packet sniffing technology to allow Federal law enforcement and intelligence agencies to intercept all of their customers' broadband Internet and VoIP traffic.[d] The large amount of data gathered from packet capture requires surveillance software that filters and reports relevant information, such as the use of certain words or phrases, the access to certain types of web sites, or communicating via email or chat with certain parties. Agencies, such as the Information Awareness Office, NSA, GCHQ and the FBI, spend billions of dollars per year to develop, purchase, implement, and operate systems for interception and analysis of data. Similar systems are operated by Iranian secret police to identify and suppress dissidents. The required hardware and software were allegedly installed by German Siemens AG and Finnish Nokia. Some governments, such as those of Myanmar, Iran, North Korea, Mainland China, Saudi Arabia and the United Arab Emirates, restrict access to content on the Internet within their territories, especially to political and religious content, with domain name and keyword filters. In Norway, Denmark, Finland, and Sweden, major Internet service providers have voluntarily agreed to restrict access to sites listed by authorities. While this list of forbidden resources is supposed to contain only known child pornography sites, the content of the list is secret. Many countries, including the United States, have enacted laws against the possession or distribution of certain material, such as child pornography, via the Internet but do not mandate filter software. 
Many free or commercially available software programs, called content-control software, are available to users to block offensive content on individual computers or networks, in order to limit access by children to pornographic material or depictions of violence.[citation needed] Performance As the Internet is a heterogeneous network, its physical characteristics, including, for example, the data transfer rates of connections, vary widely. It exhibits emergent phenomena that depend on its large-scale organization. [Chart: global Internet traffic volume in petabytes per month, 1990–2015] The volume of Internet traffic is difficult to measure because no single point of measurement exists in the multi-tiered, non-hierarchical topology. Traffic data may be estimated from the aggregate volume through the peering points of the Tier 1 network providers, but traffic that stays local in large provider networks may not be accounted for.[citation needed] An Internet blackout or outage can be caused by local signaling interruptions. Disruptions of submarine communications cables may cause blackouts or slowdowns to large areas, such as in the 2008 submarine cable disruption. Less-developed countries are more vulnerable due to the small number of high-capacity links. Land cables are also vulnerable, as in 2011 when a woman digging for scrap metal severed most connectivity for the nation of Armenia. Internet blackouts affecting almost entire countries can be achieved by governments as a form of Internet censorship, as in the blockage of the Internet in Egypt, whereby approximately 93% of networks were without access in 2011 in an attempt to stop mobilization for anti-government protests.
Estimates of the Internet's electricity usage have been the subject of controversy: a 2014 peer-reviewed research paper found claims published in the literature during the preceding decade that differed by a factor of 20,000, ranging from 0.0064 kilowatt-hours per gigabyte transferred (kWh/GB) to 136 kWh/GB. The researchers attributed these discrepancies mainly to the year of reference (i.e. whether efficiency gains over time had been taken into account) and to whether "end devices such as personal computers and servers are included" in the analysis. In 2011, academic researchers estimated the overall energy used by the Internet to be between 170 and 307 GW, less than two percent of the energy used by humanity. This estimate included the energy needed to build, operate, and periodically replace the estimated 750 million laptops, a billion smartphones and 100 million servers worldwide, as well as the energy that routers, cell towers, optical switches, Wi-Fi transmitters and cloud storage devices use when transmitting Internet traffic. According to a non-peer-reviewed study published in 2018 by The Shift Project (a French think tank funded by corporate sponsors), nearly 4% of global CO2 emissions could be attributed to global data transfer and the necessary infrastructure. The study also said that online video streaming alone accounted for 60% of this data transfer and therefore contributed to over 300 million tons of CO2 emissions per year, and argued for new "digital sobriety" regulations restricting the use and size of video files.
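The factor-of-20,000 spread follows directly from the survey's two endpoint figures; a quick arithmetic check on the numbers quoted above:

```python
# Endpoint estimates of Internet energy intensity reported in the
# literature surveyed by the 2014 paper, in kWh per GB transferred.
low_kwh_per_gb = 0.0064   # lowest published estimate
high_kwh_per_gb = 136.0   # highest published estimate

ratio = high_kwh_per_gb / low_kwh_per_gb
print(round(ratio))  # 21250 -- roughly the factor of 20,000 cited
```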
======================================== |
[SOURCE: https://www.theverge.com/games/881766/los-angeles-county-is-suing-roblox] | [TOKENS: 540] |
Roblox: all the news about the popular social and gaming platform. Posted Feb 20, 2026 at 1:17 AM UTC, by Jay Peters. Los Angeles County is suing Roblox. According to Fox 11 Los Angeles: The lawsuit alleges that “children in Los Angeles County have been repeatedly exposed to sexually explicit content, exploitation and grooming on Roblox because the company chooses to put corporate profit over the safety of children.” Some states have sued Roblox, too. [Fox 11 Los Angeles]
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Iran%E2%80%93Israel_proxy_conflict] | [TOKENS: 13583] |
Iran–Israel proxy conflict The Iran–Israel proxy conflict, also known as the Iran–Israel Cold War, is an ongoing proxy war between Iran and Israel. In the Israeli–Lebanese conflict, Iran has supported Lebanese Shia militias, most notably Hezbollah. In the Israeli–Palestinian conflict, Iran has backed Palestinian groups such as Hamas.[a] Israel has supported Iranian rebels, conducted airstrikes against Iranian allies in Syria, assassinated Iranian nuclear scientists, and directly attacked Iranian forces in Syria. In 2024 the proxy war escalated to a series of direct confrontations between the two countries, and in June 2025, the Iran–Israel war began, involving the United States. Motivated by the periphery doctrine, Imperial Iran and Israel had close relations, seeing Arab powers as a common threat. After the 1979 Islamic revolution, Iran cut off relations, but covert ties continued during the subsequent Iran–Iraq War. Iran trained and armed Hezbollah to resist Israel's 1982 invasion of Lebanon, and continued to back Shia militias throughout the Israeli occupation of Southern Lebanon. Even before 1979, Iranian Islamists had materially supported the Palestinians; after 1979 Iran pursued relations with the Palestine Liberation Organization, and later with Palestinian Islamic Jihad and Hamas. Israel fought a war with Hezbollah in 2006. Israel has fought several wars with Palestinians in and around the Gaza Strip: in 2008–2009, 2012, 2014, 2021 and since 2023. The 1982 Lebanon War and the Gaza war have been the deadliest wars of the Arab–Israeli conflict. Various reasons have been given for the Iran–Israel conflict. Iran and Israel had previously enjoyed warm ties due to common threats, but by the 1990s the USSR had dissolved and Iraq had been weakened.
Iranian Islamists have long championed the Palestinian people, whom they perceive as oppressed. Scholars believe that by supporting the Palestinians, Iran seeks greater acceptance among Sunnis and Arabs, both of whom dominate the Middle East. At times, Iran has supported the one-state and the two-state solution as a response to the plight of Palestinians, while the country has also used more inflammatory language to predict Israel's demise. Israel sees Iran as an existential threat. Israel has accused Iran of harboring genocidal intentions, while Iran has accused Israel of conducting a genocide in Gaza. Consequently, Israel has sought sanctions and military action against Iran to stop it from acquiring nuclear weapons. News outlets reported that Iranian proxy militias stayed largely silent and left Iran "isolated in war" during the 2025 war with Israel. Background Iranian Islamists have a long history of sympathizing with the Palestinians. In 1949, Iranian ayatollah Mahmoud Taleghani visited the West Bank and was moved by the plight of Palestinian refugees. Taleghani began advocating for Palestinians in the 1950s and 1960s. After the Six-Day War in 1967, he raised funds (e.g. zakat) inside Iran to be sent to Palestinians. The Iranian government at the time was alarmed at these activities, and SAVAK documents indicate that the government believed that the Iranian public was sympathetic to the Palestinian people. Likewise Ruhollah Khomeini championed the Palestinian people before he became Iran's Supreme Leader in 1979. He also criticized the Pahlavi dynasty's ties with Israel, viewing Israel as a supporter of the Pahlavi regime. Following the 1979 Iranian revolution, Khomeini's new government adopted a policy of hostility towards Israel. The new Iranian government saw Israel as a colonial outpost.
Iran withdrew recognition of Israel as a state, and severed all diplomatic, commercial and other ties with Israel, referring to its government as the "Zionist regime" and Israel as "occupied Palestine". Despite the tension between the two countries, Israel provided support to Iran during the Iran–Iraq War from 1980 to 1988. During the war, Israel was one of the main suppliers of military equipment to Iran and also provided military instructors. Israel gave direct support to Iran's war effort when it bombed and destroyed Iraq's Osirak nuclear reactor in Operation Babylon. The nuclear reactor was considered a central component of Iraq's nuclear weapons program. The 1982 Israeli invasion of Lebanon resulted in the departure of the Palestine Liberation Organization (PLO) from Lebanon. The ensuing Israeli occupation of Southern Lebanon temporarily benefited Israeli allies in Lebanon and the civilian Israeli population with fewer violent attacks on Northern Israel by Hezbollah than previously by the PLO in the 1970s.[citation needed] However, the Sabra and Shatila massacre perpetrated by Israeli proxies (the Maronite Lebanese Forces and right-wing Phalangists) against Lebanese Shias had as a long-term consequence the emergence of a homegrown Lebanese, rather than Palestinian, resistance movement within South Lebanon, which by the second half of the 1990s was posing more strategic trouble to Israel than the PLO could pose in the 1970s. Iran has established a network of allies and proxy forces across the Middle East, which it describes as part of an "axis of resistance" aimed at opposing US and Israeli interests in the area. Israel views Iran as an existential threat on account of Tehran's rhetoric, its support for proxy forces in the region, and its arming and financing of Palestinian groups such as Hamas. In some cases, proxy groups evolved into political parties, a transition that was both encouraged and nurtured by Iran.
These dual-role proxies earned political legitimacy while masking terrorist activities. U.S. intelligence officials said they believe Iran does not seek a broader conflict, arguing that the primary goal of Iranian proxies is to target Israel and the United States in a way that avoids triggering a large-scale war. The United States is considered to be Israel's largest "military backer". Germany, Britain, and Italy have also supplied weapons to Israel. As detailed by the Stimson Center, Iran has historically employed at least four main fronts: Hamas in Gaza, Hezbollah in Lebanon, Shiʿite militias in Iraq, and the Houthis in Yemen. According to the report, the goal is to compel Israel to defend on multiple fronts simultaneously, reducing its ability to focus on Iranian nuclear or military capabilities, and to pressure Israel indirectly. History Starting in the 1960s, many Iranians (both leftist and religious) had volunteered to fight against Israel with various Palestinian organizations, including the Palestine Liberation Organization (PLO). Some of these volunteers, who had received training in Lebanon and Jordan, then returned to Iran to fight against the Shah. Yasser Arafat visited Iran on 17 February 1979, becoming the first foreign leader to visit the country after the Islamic Revolution. During Arafat's visit, Iran severed ties with Israel and expelled Israeli diplomats. The PLO found Iran's revolution inspiring, given that Khomeini, who had been exiled from his homeland, defeated a militarily powerful enemy supported by the US, something that the PLO thought it could replicate against Israel. On the other hand, Palestinians felt Arab nationalism was at a dead end. Arabs were defeated in the 1967 war, Jordan expelled the PLO in 1970 and Egypt recognized Israel in 1978. During the Iran hostage crisis, the PLO attempted to mediate with the Iranian students, but failed.
In addition, secret documents were allegedly discovered at the US embassy detailing Israeli support for the Shah's regime. At the start of the Iran–Iraq War, Yasser Arafat tried to mediate between Saddam Hussein and Khomeini. Arafat feared the war would distract from the Palestinian cause. Arafat travelled personally on 20 September 1980, to Baghdad and Tehran, but his efforts were unsuccessful. Arafat eventually sided with Iraq during the war. Despite this, Iranian leaders kept a pro-Palestinian stance. Following the Iranian Revolution and the fall of the Pahlavi dynasty in 1979, Iran adopted a strong anti-Israel stance. Iran cut off all official relations with Israel. Iran also ceased to accept Israeli passports, and the holders of Iranian passports were banned from travelling to "the occupied Palestine". The Israeli Embassy in Tehran was closed and handed over to the PLO. Ayatollah Khomeini declared Israel an "enemy of Islam" and the "Little Satan". The United States was called the "Great Satan" while the Soviet Union was called the "Lesser Satan". Iran provided support for Islamist-Shia Lebanese parties, helping to consolidate them into a single political and military organization, Hezbollah, and providing them the ideological indoctrination, military training and equipment to attack Israeli and American targets. In 1982, Israel invaded Lebanon. The leaders of the Lebanese Shia community appealed to Iran for help. Khomeini sent his defense minister and military leaders to Syria to assist; however, he eventually concluded that Iran could not fight a two-front war given its ongoing war with Iraq. Despite Israeli success in eradicating PLO bases and partial withdrawal in 1985, the Israeli invasion had actually increased the severity of conflict with local Lebanese militias and resulted in the consolidation of several local Shia Muslim movements in Lebanon, including Hezbollah and Amal, from a previously unorganized guerrilla movement in the south.
Over the years, military casualties of both sides grew higher, as both parties used more modern weaponry, and Hezbollah progressed in its tactics. Iran supplied the militant organization Hezbollah with substantial amounts of training, weapons, explosives, financial, political, diplomatic, and organizational aid while persuading Hezbollah to take action against Israel. Hezbollah's 1985 manifesto listed among its four main goals "Israel's final departure from Lebanon as a prelude to its final obliteration." According to reports released in February 2010, Hezbollah received $400 million from Iran. By the early 1990s, Hezbollah, with support from Syria and Iran, emerged as the leading group and military power, monopolizing the directorship of the guerrilla activity in South Lebanon.[citation needed] Since Israel withdrew from Southern Lebanon and Hezbollah took over the assets of the South Lebanon Army in May 2000, the conflict continued at low-level, centering around the Shabaa Farms region. With the election of Iranian hardliner Mahmoud Ahmadinejad in 2005, relations between Iran and Israel became increasingly tense as the countries engaged in a series of proxy conflicts and covert operations against each other. During the 2006 Lebanon War, Iranian Revolutionary Guards (IRGC) were believed to have directly assisted Hezbollah fighters in their attacks on Israel. Multiple sources suggested that hundreds of Revolutionary Guard operatives participated in the firing of rockets into Israel during the war, and secured Hezbollah's long-range missiles. Revolutionary Guard operatives were allegedly seen operating openly at Hezbollah outposts during the war. In addition, Revolutionary Guard operatives were alleged to have supervised Hezbollah's attack on the INS Hanit with a C-802 anti-ship missile. The attack severely damaged the warship and killed four crewmen. 
It is alleged that between six and nine Revolutionary Guard operatives were killed by the Israeli military during the war. According to the Israeli media, their bodies were transferred to Syria and from there flown to Tehran. On 6 September 2007, the Israeli Air Force destroyed a suspected nuclear reactor in Syria, with ten North Koreans reportedly killed. During and immediately after the Gaza War, the Israeli Air Force, with the assistance of Israeli commandos, was reported to have allegedly carried out three airstrikes against Iranian arms being smuggled to Hamas through Sudan, as Iran launched an intensive effort to supply Hamas with weapons and ammunition. Israel hinted that it was behind the attacks. Two truck convoys were destroyed, and an arms-laden ship was sunk in the Red Sea. On 4 November 2009, Israel captured a ship in the eastern Mediterranean Sea and its cargo of hundreds of tons of weapons allegedly bound from Iran to Hezbollah. In June 2010 Stuxnet, an advanced computer worm, was discovered. It is believed that it had been developed by the US and Israel to attack Iran's nuclear facilities. In a study conducted by the Institute for Science and International Security (ISIS) it was estimated that Stuxnet might have damaged as many as 1,000 centrifuges (10% of all installed) in the Natanz enrichment plant. Other computer viruses and malware, including Duqu and Flame, were reportedly related to Stuxnet. Iran claims that its adversaries regularly engineer sales of faulty equipment and attacks by computer viruses to sabotage its nuclear program. On 15 March 2011, Israel seized a ship from Syria bringing Iranian weapons to Gaza. In addition, the Mossad was also suspected of being responsible for an explosion that reportedly damaged the nuclear facility at Isfahan. Iran denied that any explosion had occurred, but The Times reported damage to the nuclear plant based on satellite images, and quoted Israeli intelligence sources as saying that the blast indeed targeted a nuclear site, and was "no accident".
Hours after the blast took place, Hezbollah fired two rockets into northern Israel. The Israel Defense Forces reacted by firing four artillery shells at the area from where the launch originated. It was speculated that the attack was ordered by Iran and Syria as a warning to Israel. The Israeli attack was reported to have killed 7 people, including foreign nationals. Another 12 people were injured, of whom 7 later died in hospital. The Mossad was suspected of being behind an explosion at a Revolutionary Guard missile base in November 2011. The blast killed 17 Revolutionary Guard operatives, including General Hassan Moqaddam, described as a key figure in Iran's missile program. Israeli journalist Ron Ben-Yishai wrote that several lower-ranked Iranian missile experts had probably been previously killed in several explosions at various sites. In response to Israeli covert operations, Iranian agents reportedly began trying to hit Israeli targets; potential targets were then placed on high alert. Yoram Cohen, the head of Shin Bet, claimed that three planned attacks in Turkey, Azerbaijan and Thailand were thwarted at the last minute. On 11 October 2011, the United States claimed to have foiled an alleged Iranian plot that included bombing the Israeli and Saudi embassies in Washington DC and Buenos Aires. On 13 February 2012, Israeli embassy staff in Georgia and India were targeted. In Georgia, a car bomb failed to explode near the embassy and was safely detonated by Georgian police. In India, the car bomb exploded, injuring four people. Amongst the wounded was the wife of an Israeli Defense Ministry employee. Israel accused Iran of being behind the attacks. The following day, three alleged Iranian agents were uncovered in Bangkok, Thailand, thought to have been planning to kill Israeli diplomatic officials, including the ambassador, by attaching bombs to embassy cars. The cell was uncovered when one of their bombs exploded. 
Police responded, and the Iranian agent present at the house threw an explosive device at the officers that blew off his own legs; he was subsequently taken into custody. A second suspect was arrested as he tried to catch a flight out of the country, and the third escaped to Malaysia, where he was arrested by the Royal Malaysian Police. Thai police subsequently arrested two people suspected of involvement. Indian police arrested a Delhi-based journalist, Syed Mohammed Kazmi, on 6 March 2012 in connection with February's car bombing, after he had been in contact with a suspect who police believe may have attached the magnetic bomb to the diplomat's car. Kazmi was reportedly an Indian citizen who worked for an Iranian publication. On 18 July 2012, a bus carrying Israeli tourists in Bulgaria was destroyed in a bombing attack that killed five Israeli tourists and the Bulgarian driver and injured 32 people. Israeli prime minister Benjamin Netanyahu blamed Iran and Hezbollah for the attack. In July 2012, a senior Israeli defense official stated that since May 2011, more than 20 terrorist attacks planned by Iranians or suspected Hezbollah agents against Israeli targets worldwide had been foiled, including in South Africa, Azerbaijan, Kenya, Turkey, Thailand, Cyprus, Bulgaria, Nepal, Nigeria, and Peru, and that Iranian and Hezbollah operatives were incarcerated in jails throughout the world. On 6 October 2012, Israeli airplanes shot down a small UAV as it flew over the northern Negev. Hezbollah confirmed it had sent the drone, and Nasrallah said in a televised speech that the drone's parts were manufactured in Iran. On 24 October 2012, Sudan claimed that Israel had bombed a munitions factory, allegedly belonging to Iran's Revolutionary Guard, south of Khartoum. In November 2012, Israel reported that an Iranian ship was being loaded with rockets to be exported to countries within range of Israel and that Israel "will attack and destroy any shipment of arms". 
In January 2013, rumors circulated that the Fordow Fuel Enrichment Plant had been hit by an explosion; subsequent IAEA reports concluded that there had been no such incident. On 25 April 2013, an Israeli aircraft shot down a drone off the coast of Haifa, allegedly belonging to Hezbollah. On 7 May 2013, residents of Tehran reported hearing three blasts in an area where Iran maintains its missile research facilities and depots. Later, an Iranian website said the blasts occurred at a privately owned chemical factory. On 10 December, Hamas announced that it had resumed ties with Iran after a brief cutoff over the Syrian conflict. A court in Jerusalem sentenced an Israeli man, Yitzhak Bergel, to four and a half years in prison for offering to spy for Iran; Bergel belonged to the anti-Zionist Jewish group Neturei Karta. On 5 March 2014, the Israeli navy intercepted the Klos-C cargo ship. Israel stated Iran was using the vessel to smuggle dozens of long-range rockets to Gaza, including Syrian-manufactured M-302 rockets. The operation, named Full Disclosure and carried out by Shayetet 13 special forces, took place in the Red Sea, 1,500 kilometers from Israel and some 160 kilometers from Port Sudan. Iranian media reported that on 24 August 2014, the IRGC had shot down an Israeli drone near the Natanz fuel enrichment plant; the Israeli military did not comment on the reports. Two workers were killed in an explosion at a military explosives factory southeast of Tehran, near the suspected nuclear site at Parchin. In what appeared to be a response ordered by Iran, Hezbollah set off an explosive device on the border between Lebanon and the Israeli-controlled side of the Shebaa farms, wounding two Israeli soldiers. Israel responded with artillery fire toward two Hezbollah positions in southern Lebanon. 
Israel and Syria have observed a truce since Israel reaffirmed its control of most of the Golan Heights in the 1973 Yom Kippur War, but the Syrian Civil War, which began in 2011, has led to several exchanges of fire across the border. The Israeli military was reported to be preparing itself for potential threats in the event of a power vacuum in Syria. "After Assad and after establishing or strengthening their foothold in Syria, they are going to move and deflect their effort and attack Israel," an Israeli official told the Associated Press in January 2014. Some experts say that while the encroaching militant forces on Israel's border will heighten security measures, the advances are not likely to create significant changes to Israel's policy of disengagement from the Syrian crisis. In the Syrian Civil War, hoping to bolster its logistics and force-projection capabilities in the area, Tehran aimed to clear a path from the Iranian capital to Damascus and the Mediterranean coast. The Israeli government is convinced that Iran is interested in creating territorial contiguity from Iran to the Mediterranean and in transferring military forces – including naval vessels, fighter planes and thousands of troops – to permanent bases in Syria, and that it is trying to "Lebanonize" Syria and take over using Shi'ite militias, as it did with Hezbollah in Lebanon. As Israeli Defence Minister Avigdor Lieberman warned, "everything possible will be done to prevent the existence of a Shi'ite corridor from Tehran to Damascus". In 2017, Israeli intelligence discovered an Iranian base being built in Syria just 50 km from the Israeli border. 
The assistance provided by Iran's IRGC Quds Force under General Qasem Soleimani, Hezbollah, and Russia to the Syrian government enabled Bashar al-Assad to emerge victorious from the war in 2017, which meant that the "worst-case scenario" for Israel – a contiguous "Axis of Resistance" stretching from Iran and Iraq through Syria to the Lebanese-Israeli border – had been realized. Mossad director Yossi Cohen said in 2018 that Israel's failure to prevent an Assad victory in Syria, together with its failure to defeat Hezbollah in 2006, meant that "[Iranian General] Qasem Soleimani, should he be so minded, could drive his car from Tehran to Lebanon's border with Israel without being stopped. And the same route would be open to truckloads of rockets bound for Iran's main regional proxy, Hezbollah." On several occasions between 2013 and 2017, Israel reportedly carried out or supported attacks on Hezbollah and Iranian targets within Syrian territory or Lebanon. One of the first reliably reported incidents took place on 30 January 2013, when Israeli aircraft struck a Syrian convoy allegedly transporting Iranian weapons to Hezbollah. As was its habit, Israel refused to comment on the incident, a stance believed to be aimed at ensuring that the Syrian government did not feel obliged to retaliate. More incidents were attributed to the Israeli Air Force (IAF) in May 2013, December 2014 and April 2015. Some of those reports were confirmed by the Syrian Arab Republic, whereas others were denied. Israel systematically refused to comment on its alleged targeting of Hezbollah and Ba'athist Syrian targets in Syrian territory. In 2015, suspected Hezbollah militants launched a retaliatory attack on Israeli forces in the Shebaa farms in response to an Israeli airstrike in the Syrian Golan that killed senior Hezbollah and IRGC operatives. 
In March 2017, Syria launched anti-aircraft missiles towards the Israeli-occupied part of the Golan Heights, allegedly targeting IAF aircraft, which Syria claimed were on their way to attack targets in Palmyra. After the incident, Israel stated it had been targeting weapons shipments headed toward anti-Israeli forces, specifically Hezbollah, in Lebanon. Israel denied Syria's claim that one jet fighter was shot down and another damaged, and has not reported any pilots or aircraft missing in Syria or anywhere else in the Middle East following the incident. According to some sources, the incident was the first time Israeli officials clearly confirmed an Israeli strike on a Hezbollah convoy during the Syrian Civil War; as of September 2017, it remained the only such confirmation. In January 2014, Israeli prime minister Benjamin Netanyahu warned that Iran's nuclear program would be set back only six weeks as a result of its interim agreement with the international community. In one of the region's oddest pairings, Israel and the Gulf Arab states led by Saudi Arabia increasingly found common ground – and a common political language – in their mutual dismay over the prospect of a nuclear deal in Geneva that could curb Tehran's atomic program but leave its main elements, such as uranium enrichment, intact. In June 2017, former Israeli Defense Minister Moshe Ya'alon stated that "We and the Arabs, the same Arabs who organized in a coalition in the Six-Day War to try to destroy the Jewish state, today find themselves in the same boat with us ... The Sunni Arab countries, apart from Qatar, are largely in the same boat with us since we all see a nuclear Iran as the number one threat against all of us". In January 2017, the Israeli Air Force began flying almost daily attack missions against Iranian targets in Syria, dropping about 2,000 bombs in 2018 alone. 
Some Iranian targets were also attacked by Israeli surface-to-surface missiles or in raids by Israeli special forces. According to former IDF chief of staff Gadi Eizenkot, the decision to strike Iranian bases in Syria was made after Iran changed its strategy in 2016 as the US-led military intervention against ISIL was drawing to an end, planning to exploit the power vacuum to establish hegemony in Syria, building bases and bringing in foreign Shiite fighters. Although the full extent of the campaign would not be revealed until 2019, by early December 2017 the Israeli Air Force confirmed it had attacked arms convoys of Ba'athist Syria and Lebanon's Hezbollah nearly 100 times during more than six years of the conflict in Syria. In January 2019, Eizenkot claimed that up to that point, only a few dozen Iranian military personnel had been killed in the attacks, as Israel had taken care to primarily target Iranian infrastructure while sparing personnel so as not to give Iran any pretext to retaliate. It was reported that the Mossad stole nuclear secrets from a secure warehouse in Tehran in January 2018. According to reports, the agents came in a truck semitrailer at midnight, cut into dozens of safes with "high intensity torches", and carted out "50,000 pages and 163 compact discs of memos, videos and plans" before leaving in time to make their escape when the guards came for the morning shift at 7 am. According to the Israelis, the documents and files (which it shared with European countries and the United States), demonstrated that the Iranian AMAD Project aimed to develop nuclear weapons, that Iran had a nuclear program when it claimed to have "largely suspended it", and that there were two nuclear sites in Iran that had been hidden from inspectors. This was followed by the Trump administration withdrawing the United States from the JCPOA and reimposing US sanctions on Iran. 
Shortly after retiring as head of the Mossad, Yossi Cohen admitted in a televised interview in June 2021 that he had overseen the operation to steal the Iranian documents. Benjamin Netanyahu's 2022 book revealed several new details of the operation, including that the theft was intended not only to prove the nuclear program's existence but also to sabotage it by removing irreplaceable documents. In July 2019, it was reported that Israel had expanded its strikes against Iranian missile shipments to Iraq, with Israeli F-35 combat planes allegedly striking Iranian targets in Iraq twice. Israeli airstrikes reportedly targeted Iran-backed militias in Iraq during 2019. On 16 September 2019, air strikes targeting three positions of the Iranian Revolutionary Guards and allied Iraqi militias killed at least 10 pro-Iranian militiamen in Albu Kamal, Syria. The strikes were attributed to Israel. According to Time, the increase in Iran–Israel tension coincided with discussion of a possible rapprochement between Iran and the U.S. According to Lebanese media reports, on 26 August 2019, Israeli drones attacked a Popular Front for the Liberation of Palestine – General Command (PFLP-GC) position in Qousaya, located in the Beqaa Valley of Lebanon, close to the border with Syria. The attack came a day after two drones exploded in the Lebanese capital Beirut. According to a Palestinian official in the town, three airstrikes hit the PFLP-GC military position in Qousaya in the early morning of 26 August 2019, causing only material damage. On 27 July 2020, explosions and an exchange of fire were heard during a "security incident" at the border between the Israeli-occupied Golan Heights and Lebanon. The incident involved Israeli soldiers and four Hezbollah fighters who allegedly crossed the border, and came days after a Hezbollah member was killed by Israeli airstrikes in Syria and an Israeli drone crashed in Lebanon. 
The Israel Defense Forces said that there were no Israeli casualties and that the four Hezbollah fighters fled back to Lebanon after being shot at. However, Hezbollah denied that its forces had attacked the Israeli army, saying that its fighters had not crossed the border and that Israel had opened fire first. Two dozen explosions were heard in Lebanon; an Israeli shell smashed into a civilian home, narrowly missing the family inside, but nobody was hurt. The US assassinated Qasem Soleimani on 3 January 2020, reversing the policy of the prior administration, which had warned Iran of Israeli attempts to assassinate Soleimani. The Iranians retaliated with Operation Martyr Soleimani, in which 11 Qiam 1 missiles hit Al-Asad Airbase, causing traumatic brain injuries to 110 American soldiers. On the same day, the IRGC mistakenly shot down Ukraine International Airlines Flight 752, killing all 176 passengers and crew aboard, including 82 Iranian citizens. This triggered another wave of Iranian anti-government protests (part of the larger 2019–2020 Iranian protests), with many Iranians calling for the removal of Supreme Leader Ali Khamenei. When giant U.S. and Israeli flags were painted on the ground for crowds of protesters to trample, the crowds outside Beheshti University refused to walk over them, according to video filmed at the scene and verified by NBC News. On 9 May 2020, Israel was reportedly behind a cyberattack that disrupted the Shahid Rajaee port in the Strait of Hormuz, causing traffic jams of delivery trucks and delays in shipments. It was suggested that the attack was a response to a failed Iranian cyberattack on an Israeli water facility in the central Sharon region in April. In June and July, a series of explosions targeted Iran's nuclear and missile programs and various other infrastructure. 
Accidents and damage were reported at the Parchin military complex near Tehran on 26 June, the Sina At'har clinic in northern Tehran on 30 June, the Natanz nuclear facility on 2 July, and the Shahid Medhaj (Zargan) power plant in Ahvaz and the Karun petrochemical center in the city of Mahshahr on 4 July. It has been speculated that Israel was involved, and according to intelligence officials the damage at the centrifuge plant in Natanz alone could delay the Iranian nuclear weapons program by one or two years. On 6 July, another explosion occurred at the Sepahan Boresh factory in the city of Baqershahr. On 9 July, explosions were reported at a missile depot belonging to Iran's Revolutionary Guards Corps west of Tehran. On 11 July, an explosion took place in the basement of an old two-story house containing gas cylinders in northern Tehran. On 12 July, a fire broke out at the Shahid Tondgooyan Petrochemical Company in southwest Iran. On 13 July, an explosion occurred at a gas condensate plant in the Kavian Fariman industrial zone of Razavi Khorasan province. On 15 July, a large fire broke out at a shipyard in the city of Bushehr, spreading to seven wooden boats. On 18 July, an oil pipeline exploded in the Ahvaz region of southern Iran. On 19 July, another explosion took place at a power station in Isfahan. On 4 August, a massive explosion caused by stored ammonium nitrate took place at the Beirut port. According to the German newspaper Die Welt, Iran supplied Hezbollah with hundreds of tons of ammonium nitrate between 2013 and 2014, while around that time Lebanon confiscated thousands of tons of the explosive substance that years later led to the blast. Abdullah Ahmed Abdullah, the second-in-command of al-Qaeda, was killed on 7 August 2020 in Tehran. Mohsen Fakhrizadeh, head of Iran's nuclear weapons program, was assassinated on 27 November 2020 in Absard. 
Israeli commandos carried out attacks that damaged numerous Iranian cargo ships carrying oil and weapons to Syria from late 2019 to 2021. Israeli-owned ships were attacked in the Gulf of Oman and the Arabian Sea, allegedly by Iran. Israel was also reportedly behind an attack on an Iranian intelligence ship of the Islamic Revolutionary Guard Corps Navy in the Red Sea, which was heavily damaged by a limpet mine in April 2021. On 10 April 2021, Iran began injecting uranium hexafluoride gas into advanced IR-6 and IR-5 centrifuges at Natanz, but the next day an accident occurred in the facility's electricity distribution network, attributed to Mossad activity by Western and Iranian sources. On 13 April 2021, in what seemed to be an Iranian response, an Israeli-owned ship was attacked by a missile or a drone near the shores of the Fujairah emirate in the United Arab Emirates, causing light damage to the vessel. On 24 April, an Iranian fuel tanker was reportedly attacked off the Syrian coast by an Israeli drone, causing damage but no casualties. On 7 May, a massive fire broke out in Iran's southwestern city of Bushehr near the country's only functioning nuclear power plant. The IRGC-affiliated Tasnim News Agency reported that the fire was intentional, although its cause was unknown. On 9 May, an explosion occurred on an oil tanker off the coast of Syria, causing a small fire in one of its engines. On 23 May, at least nine people were injured in a blast at an Iranian plant in Isfahan that reportedly produces UAVs. The blast occurred after Prime Minister Netanyahu said a drone armed with explosives that had been downed by Israeli forces earlier in the week was launched by Iran toward Israel from either Syria or Iraq, amid the fighting in Gaza. On 26 May, an explosion took place at a petrochemical complex in the city of Asaluyeh in southern Iran, killing a worker and injuring two. 
On 2 June, a fire broke out aboard an Iranian navy vessel, the IRIS Kharg, near the port of Jask in the Gulf of Oman; the entire crew was able to safely disembark before the ship sank. Later the same day, a gas leak at an oil refinery in Tehran caused a massive fire, with no injuries reported. On 5 June, an explosion took place at the Zarand Iranian Steel Company in eastern Iran; no injuries were reported. On 20 June, it was reported that Iran's sole nuclear power plant at Bushehr had undergone an emergency shutdown that would last between three and four days. On 23 June, major damage was caused to one of the buildings of Iran's Atomic Energy Organization, although Iranian authorities denied there had been any damage or casualties as a result of the sabotage attempt. On 3 July, an Israeli-owned cargo ship was struck by an "unknown weapon" in the northern Indian Ocean, causing a fire to erupt onboard the vessel, although no injuries were reported; Israeli sources suspected that Iran was behind the attack. On 5 July, a large fire was reported at a warehouse or factory near the city of Karaj, where an alleged previous attack had targeted a nuclear facility reportedly used to produce centrifuges. On 14 July, Iranian media reported an explosion at an office building in western Tehran, causing heavy damage to part of the building. On 29 July, an Israeli-operated oil tanker was attacked near the coast of Oman; according to senior Israeli officials, the attack was conducted by Iran. On 10 August, a major explosion took place on a commercial ship docked at the Latakia port in Syria, with some reports identifying the targeted ship as Iranian. The same day, a fire broke out at an Iranian petrochemical factory on Kharg Island in the Persian Gulf. On 26 September, three people were injured in a fire at an IRGC research center west of Tehran. On 26 October, a cyberattack crippled gas stations across Iran. 
It was reported that some hacked systems displayed messages addressing Iranian Supreme Leader Ali Khamenei, demanding to know "where is the gas." On 7 November, it was reported that the Mossad had thwarted multiple Iranian attacks on Israelis in Tanzania, Senegal and Ghana; African authorities arrested five suspects. Iran has also attempted to strike Israeli targets overseas and through cyberattacks. In February 2022, an Israeli attack against an Iranian base destroyed hundreds of drones, which prompted Iran to fire missiles at an American consulate in Irbil (Iraqi Kurdistan) the following month. An Iranian cyberattack on Israeli websites was also reported. In March 2022, Reuters reported that Israel was carrying out airstrikes against Iranian personnel and Iran-backed militias in Syria; the report said that Israel "seeks to prevent Iran from transferring weapons to Hezbollah". On 22 May, Col. Hassan Sayyad Khodaei, a senior member of Iran's Islamic Revolutionary Guard Corps, was shot dead in his car in Tehran. He was among those responsible within the Guard's elite Quds Force for carrying out Iranian operations in Iraq and Syria. On 25 May, an engineer was killed and another employee wounded during an incident at the Parchin military facility south of Tehran. Also in May, Israeli and Turkish security agencies foiled an Iranian plot to kidnap Israeli tourists in Turkey; another plot was foiled in June following a Mossad rescue operation in Istanbul. On 12 June, Argentinian authorities immobilized an Iranian Mahan Air cargo plane that had been leased to a Venezuelan state-owned airline. The passports of five Iranian passengers traveling on the plane were confiscated, some of whom were purportedly linked to the IRGC. On 13 June, Mohammad Abdous, an Iranian Air Force scientist of the Aerospace Unit working on several projects, was killed during a mission at a base in northern Iran. 
The incident occurred less than 24 hours after Ali Kamani, another member of the air force's Aerospace Unit, died in a car accident in the city of Khomein. The New York Times reported that Iranian officials suspected Israel had poisoned engineer Ayoob Entezari and geologist Kamran Aghamolaei. On 14 June, an explosion at a chemical factory in the southern city of Firouzabad injured over 100 Iranian workers, most of them lightly. On 15 June, another IRGC officer of the aerospace division, Wahab Premarzian, died in the city of Maragheh. On 19 June, an explosion was reported at an IRGC missile base in west Tehran, a site that had also been targeted the previous year. On 27 June, a large cyberattack forced the Iranian state-owned Khuzestan Steel Company to halt production, with two other major steel producers also being targeted. Israeli military correspondents hinted that Israel was responsible for the assault, in retaliation for a suspected Iranian cyberattack that had caused rocket sirens to sound in Jerusalem and Eilat the previous week. In July, IRGC engineer Said Thamardar Mutlak was killed in a suspected Mossad assassination in Shiraz, while Iranian state media reported that a Mossad-linked spy network planning to carry out "unprecedented acts of sabotage and terrorist operations" in Iran had been captured by IRGC intelligence. On 22 August, IRGC Brigadier General Abolfazal Alijani was killed in the Aleppo region of Syria. On 1 September, an explosion occurred at a key oil refinery in Abadan that supplies 25% of Iran's fuel needs; no injuries were reported. On 15 November, an oil tanker owned by an Israeli billionaire was attacked off the coast of Oman by an Iranian drone belonging to the IRGC, causing damage but no injuries. The same day, security services in Georgia announced they had foiled an Iranian plot to assassinate an Israeli businessman in that country, which was to have been carried out by a Pakistani hit squad hired by Iran and assisted by the IRGC. 
On 23 November, Iran blamed Israel for the death of a senior adviser of the IRGC's aerospace division who was killed by a roadside bomb near Damascus. On 28 January 2023, a series of bomb-carrying drones attacked an Iranian defense factory in Isfahan, causing material damage at the plant, while a fire broke out at a refinery in the country's northwest the same day. According to The Wall Street Journal, Israel was responsible for the strike. On 17 February, the IRGC launched a drone attack against an Israeli-owned vessel in the Persian Gulf, causing minor damage. In March, Greek authorities – with help from the Mossad – arrested two foreigners who were part of an Iranian cell looking to attack Jews and Israelis in that country. In April, the Shin Bet announced the arrest of two Palestinians in the West Bank who had been recruited by Hezbollah and the Iranian Quds Force to carry out attacks on their behalf. In June, a planned attack by IRGC members against Jews and Israelis in Cyprus was foiled by Cypriot intelligence services in cooperation with US and Israeli agencies. In September, a suspected attack caused an explosion at an Iranian missile base in the city of Khorramabad, with some observers suggesting it was a Mossad operation. On 27 September, the Shin Bet arrested a five-person cell, consisting of three Palestinians and two Israeli citizens, that security officials claimed was an Iranian-led cell gathering intelligence to assassinate far-right Israeli minister Itamar Ben-Gvir and far-right activist Yehuda Glick. On 28 September, a fire broke out at a car battery factory owned by the Iranian Defense Ministry for the second time in less than a week. In 2024, former Iranian president Mahmoud Ahmadinejad said that Iran's intelligence service had created a unit to counter Mossad operations, but that its leader was later revealed to be a Mossad agent himself. He added that around 20 Iranian operatives had been supplying intelligence to Israel, serving as double agents. 
The Gaza war between Israel and Palestinian militant groups led by Hamas began on 7 October 2023, with a coordinated surprise offensive on Israel. The attack began with a barrage of at least 3,000 rockets launched from the Hamas-controlled Gaza Strip against Israel. In parallel, approximately 2,500 Palestinian militants breached the Gaza–Israel barrier, attacking military bases and massacring civilians in neighboring Israeli communities. At least 1,200 people were killed, including 360 at a music festival. Unarmed civilian hostages, including women and children, and captured Israeli soldiers were taken to the Gaza Strip. The surprise Palestinian attack was met with Israeli retaliatory strikes, and Israel formally declared war on Hamas and its allies a day later. Iran, which reportedly assisted Hamas with planning the attack, threatened Israel, demanding an immediate halt to the war in Gaza. Clashes between militants in Lebanon, including Hezbollah, and Israeli forces were reported on 8 and 9 October. Iran and its proxies, Russia, and to a lesser extent China launched a disinformation campaign against Israel, Ukraine – which condemned the attacks – and their main ally, the United States. The Institute for Strategic Dialogue's report singled out Iranian accounts on Facebook and X that glorified the crimes of Hamas and encouraged more violence against Israeli civilians. Researchers have documented at least 40,000 bots or fake social media accounts, as well as strategic use of state-controlled media outlets like RT, Sputnik and Tasnim. Iran also accused Israel and the United States of committing war crimes in Gaza. Amidst the war, the Houthi insurgent group extended the conflict's reach by launching missile attacks toward Israeli territory. In December, two Iranian suspects linked to the IRGC were arrested in Cyprus for planning to target Israelis there. 
On 16 December, Iran reportedly executed a spy working for the Mossad in Sistan and Baluchestan province. On 18 December, Iranian media reported disruptions at 60% of gas stations across the country after a cyberattack by the hacktivist group Predatory Sparrow. On 23 December, a drone reportedly launched from Iran struck an Israeli-affiliated, Liberian-flagged chemical products tanker, identified by Asian News International as MV Chem Pluto, in the Arabian Sea, 200 nautical miles (370 km) southwest of the coast of Veraval, India. The attack caused a fire on board, but no injuries were reported. The tanker, which was carrying crude oil, had a crew of 20 Indians and had sailed from a port in Saudi Arabia. The Indian Navy responded to the incident, and the coast guard vessel ICGS Vikram was sent to the tanker. On 25 December, Sayyed Razi Mousavi, a top commander and senior adviser of the IRGC, was killed by an Israeli airstrike in the Sayyida Zeinab area. On 28 December, eleven leaders of the IRGC were killed in an airstrike targeting the Damascus International Airport, according to Saudi media; the IRGC commanders were reportedly meeting high-ranking delegates at the time of the strike. On 29 December, Iran executed four people for allegedly spying for the Mossad, and arrested several others. On 20 January 2024, an airstrike killed five elite Iranian Revolutionary Guards in a building in Damascus; Iran blamed Israel. On 29 January, Iran executed four people accused of planning to carry out a Mossad-ordered bomb plot against a factory producing military equipment. In February, it was reported that the IRGC had recruited British Shia Muslims visiting religious sites in Iran and Iraq to spy on or carry out attacks against Jews and Iranian dissidents living in the UK. On 14 February, explosions struck a natural gas pipeline in Iran, with an official blaming "sabotage and terrorist action". 
On 1 April 2024, the Iranian consulate annex building adjacent to the Iranian embassy in Damascus, Syria, was struck by an Israeli airstrike, killing 16 people, including Brigadier General Mohammad Reza Zahedi, a senior Quds Force commander of the Islamic Revolutionary Guard Corps (IRGC), and seven other IRGC officers. On 13 April, Iran retaliated with missile and drone strikes on Israel. On 19 April, Israel launched a series of retaliatory missile strikes on Iranian military sites, and Iranian officials also reported explosions at military sites in Syria and Iraq. Iran is the third-largest producer in the oil cartel OPEC, so there was concern about rising oil prices. At the end of April, it was reported that Israel had killed an IRGC operative in Tehran who was allegedly involved in targeting Jews in Germany. In May 2024, reports emerged detailing Iran's alleged orchestration of terror attacks targeting Israeli embassies in Europe, facilitated by local criminal networks. The Swedish Security Service verified Iran's involvement in these security-threatening activities aimed at Israeli and Jewish establishments in Sweden, and Swedish authorities subsequently reinforced security protocols at these sites. On the night of 30–31 July, two senior figures of Iranian-backed proxy groups were killed in assassinations attributed to Israel by Iran. On 14 August, several Iranian banks, including the Central Bank, were targeted in a significant cyberattack that led to widespread disruptions in the Iranian banking system; it was described as one of the largest-ever cyberattacks against Iran's state infrastructure. On 25 August, the Islamic Resistance in Iraq said that it had launched a drone strike on Haifa, and on 27 August the group claimed that its drone strike had hit a "vital target" in Haifa. 
In September, a couple was detained and charged by a Paris court for involvement in an Iranian plot to kill Israelis and Jews in Germany and France; Iran has been accused of recruiting criminals, including drug lords, to conduct such operations on European soil. On 8 September, Israeli commandos raided an underground facility near Masyaf used by Iran and Hezbollah to build precision-guided missiles. On 17 September, the Shin Bet claimed that it had thwarted a Hezbollah attempt to assassinate a former senior defence official with a claymore mine. At least 11 people were killed and some 4,000 wounded, mostly Hezbollah members, when their pagers exploded nationwide, including in Beirut; among those reported injured was the Iranian ambassador, Mojtaba Amani. On 19 October 2024, a drone strike reportedly targeted Israeli PM Benjamin Netanyahu's residence, but no injuries were reported, as neither he nor his wife was present at the time. Netanyahu remarked, "the proxies of Iran who today tried to assassinate me and my wife made a bitter mistake." Iran attributed the reported attack to Hezbollah, with the state-run IRNA news agency quoting Iran's mission to the UN as saying: "The action in question has been carried out by Hezbollah in Lebanon." In February 2026, Israel's Shin Bet and National Cyber Directorate reported foiling hundreds of cyberattack attempts attributed to Iranian intelligence over the prior year. These mainly involved targeted phishing and account takeovers against senior government officials, defense figures, academics, journalists, and others, with increased activity since the June 2025 Israel–Iran war. Israeli authorities stated that most attempts were thwarted, with no major successful breaches reported.
Iranian supporters and alleged proxies
Iran and Ba'athist Syria were close strategic allies, and Iran provided significant support for Bashar al-Assad's government in the Syrian Civil War, including logistical, technical and financial support, as well as training and some combat troops. Iran saw the survival of the Syrian government as crucial to its regional interests; however, the government collapsed after a rebel offensive in December 2024. The Iranian government downplayed the extent of its strategic loss, although IRGC commander Brig. Gen. Behrouz Esbati admitted that Iran was "defeated very badly". The Supreme Leader of Iran, Ali Khamenei, was reported in September 2011 to be vocally in favor of the Syrian government. When the uprising developed into the Syrian Civil War, there were increasing reports of Iranian military support, and of Iranian training of the National Defence Forces (NDF) both in Syria and Iran. Iranian security and intelligence services advised and assisted the Syrian military in order to preserve Assad's hold on power; those efforts included training, technical support, and combat troops. Thousands of Iranian operatives – as many as 10,000 by the end of 2013 – fought in the Syrian civil war on the pro-government side, including regular troops and militia members. In 2018, Tehran said that 2,100 Iranian soldiers had been killed in Syria and Iraq over the preceding seven years. Iran also sponsored and facilitated the involvement of Shia militias from across the region to fight in Syria, including Lebanese Hezbollah, the Afghan Liwa Fatemiyoun, the Pakistani Liwa Zainebiyoun, the Iraqi Harakat al-Nujaba, Kataib Seyyed al-Shuhada and Kataib Hezbollah, and the Bahraini Saraya Al-Mukhtar. Hezbollah was established as a hybrid organization, with political and social components aimed at gaining legitimacy. Similar to Iran's bonyads, Hezbollah set up an extensive welfare network for impoverished Shiites.
The group provided discounted or free medical care, education, and cultural events. Hezbollah has grown into an organization with seats in the Lebanese government, a radio station and a satellite television station, programs for social development, and large-scale military deployment of fighters beyond Lebanon's borders. The organization has been called a "state within a state". Hezbollah is part of the March 8 Alliance within Lebanon, in opposition to the March 14 Alliance. Hezbollah maintains strong support among Lebanon's Shi'a population, while Sunnis have disagreed with the group's agenda. Following the end of the Israeli occupation of South Lebanon in 2000, its military strength grew significantly, such that its paramilitary wing is considered more powerful than the Lebanese Army. Iran has supported Hezbollah since its founding in 1982; Hezbollah was the first significant proxy nurtured by the Islamic republic as part of Iran's "strategy of confronting Israel on multiple fronts". Hezbollah receives military training, weapons, and financial support from Iran, and political support from Syria. Hezbollah also fought against Israel in the 2006 Lebanon War. Hezbollah has been a major combatant in the Syrian Civil War, helping to ensure the survival of the Iran-backed Assad government. Active support and troop deployment began in 2012 and steadily increased thereafter. Hezbollah deployed several thousand fighters in Syria, and by 2015 had lost up to 1,500 fighters in combat. Hezbollah was also very active in preventing rebel penetration from Syria into Lebanon, being one of the most active forces in the Syrian Civil War spillover in Lebanon. By March 2019, 1,677 Lebanese Hezbollah fighters had reportedly been killed in Syria.
Lina Khatib, director of the SOAS Middle East Institute in London, said in an interview with the Associated Press: "Iran's support has helped Hezbollah consolidate its position as Lebanon's most powerful political actor as well as the most equipped military actor supported by Iran in the whole of the Middle East". Iran has backed Hamas since the 1990s. The Washington Institute details that during the second intifada, the IRGC, the Quds Force, and Hezbollah cooperated closely with Hamas. This integration positioned Hamas as a strategic arm within Iran's "Axis of Resistance", operating closely with other Iranian proxies. Between 2005 and 2011, Iran was one of the main funders and suppliers of Hamas. Israel estimates that Hamas's brigades have a core of several hundred members who received military training, including training in Iran and in Syria (before the Syrian Civil War). In 2011, after the outbreak of the Syrian Civil War, Hamas distanced itself from the Syrian government and its members began leaving Syria. In a 2014 speech marking Hamas's 27th anniversary, the spokesman of the Qassam Brigades thanked Iran for aid in finance and weapons. In 2008, Sudan and Iran signed a military cooperation agreement. The agreement was signed by Iran's Defense Minister Mostafa Mohammad-Najjar and his Sudanese counterpart Abdelrahim Mohamed Hussein. In 2011, however, Sudan reduced its cooperation with Iran after the start of the Syrian Civil War.[citation needed] In 2015, Sudan completely severed its ties with Iran by participating in the Saudi-led intervention in the Yemeni Crisis on behalf of the Sunni Arab alliance. Iran is a major financial supporter of the Islamic Jihad Movement in Palestine (PIJ). Following the Israeli and Egyptian squeeze on Hamas in early 2014, PIJ saw its power steadily increase with the backing of funds from Iran; its financial backing is believed to also come from Syria.
The 2003 invasion of Iraq unintentionally allowed Iran to strengthen its influence by gaining political leverage and establishing loyal militias within Iraq. Asa'ib Ahl al-Haq and Harakat Hezbollah al-Nujaba are Iraqi militias backed by Iran. Iran has supplied weapons to the Shia Houthi movement in Yemen, known as "Ansar Allah". The Houthis' control of Al Hudaydah's port and Sanaa solved the Quds Force's logistics for delivering weapons into Yemen. By 2012, the IRGC's Weapons Transfer Unit (Unit 190), under the leadership of Brigadier Generals Behnam Shahariyari and Sayyed Jabar Hosseini, began seeking methods to smuggle weapons into Yemen. The IRGC set up an "air bridge," initially operating two flights per day; later, an Iranian fleet began transporting military supplies to the Hudaydah Port.
Israeli supporters and alleged proxies
Israel's closest military ally, the United States, has a long history of violence against Iran, including the August 1953 overthrow of the Mossadegh government by U.S. and U.K. covert operatives and the decades-long U.S. support for the authoritarian rule of the Shah. The U.S. provided major military and other support to Saddam Hussein's Iraq after Iraq attacked Iran, and in 1988 the United States launched Operation Praying Mantis against Iran, the largest American naval combat operation since World War II. The United States maintains military bases that virtually encircle Iran. On 22 June 2025, the United States Air Force and Navy attacked three nuclear facilities in Iran as part of the Iran–Israel war. While Iran is the world's main Shia Muslim-led country, Saudi Arabia is considered a leading Sunni Muslim nation. In what has been described as a cold war, the Iran–Saudi Arabia proxy conflict, waged on multiple levels over geopolitical, economic, and sectarian influence in pursuit of regional hegemony, has been a major feature of Western Asia since 1979.
American support for Saudi Arabia and its allies as well as Russian and Chinese support for Iran and its allies have drawn comparisons to the dynamics of the Cold War era, and the proxy conflict has been characterized as a front in what Russian prime minister Dmitry Medvedev has referred to as the "New Cold War". The rivalry today is primarily a political and economic struggle exacerbated by religious differences, and sectarianism in the region is exploited by both countries for geopolitical purposes as part of the conflict. Israel and Saudi Arabia do not have any official diplomatic relations. However, news reports have indicated extensive behind-the-scenes diplomatic and intelligence cooperation between the countries, in pursuit of mutual goals against regional enemy Iran. The Gaza war had a significant impact on diplomatic efforts. Speculation arose that Iran was trying to sabotage relations between Israel and Saudi Arabia, with former head of research for Shin Bet Neomi Neumann saying the attack could have been timed in part due to Iran's hopes to scuttle efforts to normalize relations between Israel and its Sunni rival. On 9 October, Iranian Foreign Ministry spokesperson Nasser Kanaani denied claims of Tehran's involvement in Hamas's attack. On 12 October, Saudi Crown Prince Mohammed bin Salman discussed the Israel-Gaza situation with Iranian president Ebrahim Raisi. On 13 October, Saudi Arabia criticized Israel for the displacement of Palestinians from Gaza and the attacks on civilians. On 14 October, Saudi Arabia suspended talks on the possible normalization of relations with Israel. Following the escalation of clashes between Israel and Iran, the Kurdistan Freedom Party declared its support for the Israeli strikes on Iran and called for an uprising. Other non-state actors opposed to the Iranian government have also been linked to developments related to Israel’s confrontation with Iran. 
The People's Mojahedin Organization of Iran has been reported in some media and intelligence accounts to have had alleged links to Israeli intelligence operations targeting Iranian nuclear scientists, including claims of training, funding, and operational support.
Notable wars and violent events
In 2010, a wave of assassinations targeting Iranian nuclear scientists began. The assassinations were widely believed to be the work of Mossad, Israel's foreign intelligence service. According to Iran and global media sources, the methods used to kill the scientists were reminiscent of the ways Mossad had previously assassinated targets. The assassinations were alleged to be an attempt to stop Iran's nuclear program, or to ensure that it could not recover following a strike on Iranian nuclear facilities. In the first attack, particle physicist Masoud Alimohammadi was killed on 12 January 2010 when a booby-trapped motorcycle parked near his car exploded. On 12 October 2010, an explosion occurred at an IRGC military base near the city of Khorramabad, killing 18 soldiers. On 29 November 2010, two senior Iranian nuclear scientists, Majid Shahriari and Fereydoon Abbasi, were targeted by hitmen on motorcycles, who attached bombs to their cars and detonated them from a distance. Shahriari was killed, while Abbasi was severely wounded. On 23 July 2011, Darioush Rezaeinejad was shot dead in eastern Tehran. On 11 January 2012, Mostafa Ahmadi Roshan and his driver were killed by a bomb attached to their car from a motorcycle. Iran blamed Israel and the U.S. for the assassinations; Iranian officials also blamed British intelligence agencies. Mahmoud Alavi, Iran's intelligence minister, said the person who planned the killing was "a member of the armed forces", indirectly suggesting that the perpetrator might have been from the Islamic Revolutionary Guard Corps (IRGC).
In 2014, NBC reported that two US officials said the MEK had received financing and training from Israel for killing nuclear scientists, while a senior State Department official later said they never claimed the MEK was involved in the assassinations of Iranian nuclear scientists. In early 2011, Majid Jamali Fashi confessed on Iranian state television to the killing of Masoud Alimohammadi, saying that he had trained for the operation at a Mossad facility near Tel Aviv. Fashi was executed in May 2012. That month, Iranian authorities announced the arrest of another 14 Iranians – eight men and six women – described as an Israeli-trained terror cell responsible for five of the attacks on Iranian scientists. Iran's IRTV Channel 1 broadcast a half-hour documentary, Terror Club, which included "the televised confessions of the 12 suspects allegedly involved in the killings of Ali-Mohammadi, Shahriari, Rezaeinejad, and Roshan, and the attempted killing of Abbasi." In 2024, the proxy war escalated into a series of direct confrontations between the two countries. On 1 April, Israel bombed an Iranian consulate complex in Damascus, Syria, killing multiple senior Iranian officials. In response, Iran and its Axis of Resistance allies seized the Israeli-linked ship MSC Aries and launched strikes inside Israel on 13 April. Israel then carried out retaliatory strikes in Iran and Syria on 19 April. The Israeli strikes were limited, and analysts say they signaled a desire to de-escalate. Iran did not respond to the attack, and tensions de-escalated back to the level of the proxy war. On 13 June 2025, the conflict escalated further into open armed conflict pitting Iran and the Houthis against Israel and the United States, when Israel launched a surprise attack targeting key Iranian military and nuclear facilities.
The opening hours of the war saw targeted assassinations and attempted assassinations of Iran's top military leaders, nuclear scientists, and politicians (including Ali Shamkhani, who was overseeing nuclear talks with the United States), airstrikes on nuclear and military facilities, and the destruction of Iran's air defenses. Iran retaliated by launching missiles at military sites and cities in Israel. The United States, which had been defending Israel since the beginning of the war by shooting down Iranian missiles and drones, took offensive action on 22 June 2025 by striking three Iranian nuclear sites. In response, the Houthis in Yemen declared the American strikes a "declaration of war" and fired several missiles at Israel. The New York Times, France24 and other news outlets reported that Iranian proxy militias stayed largely silent and left Iran "isolated in war" during the 2025 war with Israel.
Post-2025 war
Following Israeli military operations during the Iran–Israel war that resulted in the deaths of key Iranian military figures, including senior members of the Islamic Revolutionary Guard Corps (IRGC), Iran has continued to support allied groups across the Middle East. According to the United States Central Command, the Iranian Revolutionary Guard has supplied weapons, including drones and missiles, to groups such as Hezbollah in Lebanon, the Houthis in Yemen, and Shiite militias in Iraq. In Yemen, US authorities reported intercepting a vessel carrying 750 tons of Iranian weapons, including drone engines and radar systems, allegedly destined for the Houthis. In Iraq, the Kurdistan Regional Government (KRG) accused suspected Iranian-backed militias of carrying out drone attacks on oil infrastructure, including facilities operated by US companies. In Syria, the Syrian Interior Ministry and Syrian police reported intercepting multiple weapons shipments allegedly bound for Hezbollah, including anti-tank missiles hidden in commercial trucks.
These developments have occurred as nuclear negotiations between Iran and the United States remain stalled.
International responses
Russian foreign policy in the Middle East evolved during the early 2000s in light of the Iran–Israel proxy war. After 2001, the government of Vladimir Putin intensified Russia's involvement in the region, supporting Iran's nuclear programs and forgiving 73% of Syria's $13 billion debt. According to a March 2007 brief entitled Russia's New Middle Eastern Policy: Back to Bismarck? by Ariel Cohen (Institute for Contemporary Affairs), Syria "... was supplying Hizbullah with Russian weapons. In 2006, Israeli forces found evidence of the Russian-made Kornet-E and Metis-M anti-tank systems in Hizbullah's possession in southern Lebanon. The Russian response to accusations that it was supplying terrorist groups with weapons was an announcement, in February 2007, that Russia's military will conduct inspections of Syrian weapons storage facilities with the goal of preventing the weapons from reaching unintended customers. Predictably, such developments placed considerable strain on the already-deteriorating relations between Russia and Israel... For several years Russia has been attempting to engage in military cooperation with both Israel and Syria. However, the levels of cooperation with the two states are inversely related and an escalation of arms sales to Syria can only damage the relationship with Israel. Russian-Syrian military cooperation has gone through numerous stages: high levels of cooperation during the Soviet era, which was virtually halted until 2005, and now Russia's attempt to balance its relationship with both Israel and Syria. However, Russia's recent eastward leanings might indicate that Moscow is prepared to enter a new stage in its military cooperation with Syria, even if this is to the detriment of its relationship with Israel." Israel–Russia relations improved after the Russian military intervention in Syria in September 2015.
From then until July 2018, Israeli prime minister Benjamin Netanyahu and Putin met a total of nine times. Prior to and immediately after the 2016 United States presidential election, Israel lobbied the United States to strike a deal with Russia restricting the Iranian military presence in Syria in exchange for removing U.S. sanctions against Russia. In 2019, Russia rejected an Iranian request to buy the S-400 missile defense system. Ruslan Pukhov, head of the Center of Analysis of Strategies and Technologies in Moscow, said: "If Russia decides to provide Iran with S-400, it will be a direct challenge to Saudi Arabia and Israel, so it will be against Russia's own national interests."
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Fox_News#cite_note-4]
Fox News
The Fox News Channel (FNC), often referred to as Fox News, is an American multinational conservative news and political commentary television channel and website based in New York City. Owned by the Fox News Media subsidiary of Fox Corporation, it is the most-watched cable news network in the United States, and as of 2023 it generates approximately 70% of its parent company's pre-tax profit. The channel broadcasts primarily from studios at 1211 Avenue of the Americas in Midtown Manhattan. Fox News provides service to 86 countries and territories, with international broadcasts featuring Fox Extra segments during advertising breaks. The channel was created by Australian-born American media mogul Rupert Murdoch in 1996 to appeal to a conservative audience, hiring former Republican media consultant and CNBC executive Roger Ailes as its founding CEO. It launched on October 7, 1996, to 17 million cable subscribers. Fox News grew during the late 1990s and 2000s to become the dominant United States cable news subscription network. By September 2018, 87 million U.S. households (91% of television subscribers) could receive Fox News. In 2019, it was the top-rated cable network, averaging 2.5 million viewers in prime time. Murdoch, the executive chairman since 2016, said in 2023 that he would step down and hand responsibilities to his son, Lachlan. Suzanne Scott has been the CEO since 2018. The channel has been criticized for biased and false reporting in favor of the Republican Party, its politicians, and conservative causes, while portraying the Democratic Party in a negative light. Some researchers have argued that the channel is damaging to the integrity of news overall and acts as the de facto broadcasting arm of the Republican Party. Since its formation, the channel has shifted politically further rightward over time, and by 2016 it had become solidly pro-Trump. The channel has knowingly endorsed false conspiracy theories to promote Republican and conservative causes.
These include, but are not limited to, false claims regarding fraud with Dominion voting machines during its reporting on the 2020 presidential election, climate change denial,[a] and COVID-19 misinformation. It has also been involved in multiple controversies, including accusations of permitting sexual harassment and racial discrimination by on-air hosts, executives, and employees, ultimately paying out millions of dollars in legal settlements.
History
In May 1985, Australian publisher Rupert Murdoch announced that he and American industrialist and philanthropist Marvin Davis intended to develop "a network of independent stations as a fourth marketing force" to directly compete with CBS, NBC, and ABC through the purchase of six television stations owned by Metromedia. In July 1985, 20th Century Fox announced that Murdoch had completed his purchase of 50% of Fox Filmed Entertainment, the parent company of 20th Century Fox Film Corporation. Subsequently, and prior to founding FNC, Murdoch gained experience in the 24-hour news business when News Corporation's BSkyB subsidiary began Europe's first 24-hour news channel (Sky News) in the United Kingdom in 1989. With the success of his efforts establishing Fox as a TV network in the United States, experience gained from Sky News, and the turnaround of 20th Century Fox, Murdoch announced on January 30, 1996, that News Corp. would launch a 24-hour news channel on cable and satellite systems in the United States as part of a News Corp. "worldwide platform" for Fox programming: "The appetite for news – particularly news that explains to people how it affects them – is expanding enormously". In February 1996, after former U.S. Republican Party political strategist and NBC executive Roger Ailes left the cable television channel America's Talking (now MSNBC), Murdoch asked him to start Fox News Channel. Ailes demanded five months of 14-hour workdays and several weeks of rehearsal shows before its launch on October 7, 1996.
At its debut, 17 million households were able to watch FNC; however, it was absent from the largest U.S. media markets of New York City and Los Angeles. Rolling news coverage during the day consisted of 20-minute single-topic shows such as Fox on Crime or Fox on Politics, surrounded by news headlines. Interviews featured facts at the bottom of the screen about the topic or the guest. The flagship newscast at the time was The Schneider Report, with Mike Schneider's fast-paced delivery of the news. During the evening, Fox featured opinion shows: The O'Reilly Report (later The O'Reilly Factor), The Crier Report (hosted by Catherine Crier) and Hannity & Colmes. From the beginning, FNC has placed heavy emphasis on visual presentation. Graphics were designed to be colorful and gain attention; this helped the viewer to grasp the main points of what was being said, even if they could not hear the host (with on-screen text summarizing the position of the interviewer or speaker, and "bullet points" when a host was delivering commentary). Fox News also created the "Fox News Alert", which interrupted its regular programming when a breaking news story occurred. To accelerate its adoption by cable providers, Fox News paid systems up to $11 per subscriber to distribute the channel. This contrasted with the normal practice, in which cable operators paid stations carriage fees for programming. When Time Warner bought Ted Turner's Turner Broadcasting System, a federal antitrust consent decree required Time Warner to carry a second all-news channel in addition to its own CNN on its cable systems. Time Warner selected MSNBC as the secondary news channel, not Fox News. Fox News claimed this violated an agreement (to carry Fox News). Citing its agreement to keep its U.S. 
headquarters and a large studio in New York City, News Corporation enlisted the help of Mayor Rudolph Giuliani's administration to pressure Time Warner Cable (one of the city's two cable providers) to transmit Fox News on a city-owned channel. City officials threatened to take action affecting Time Warner's cable franchises in the city. In 2001, during the September 11 attacks, Fox News was the first news organization to run a news ticker on the bottom of the screen to keep up with the flow of information that day. The ticker has remained, and has proven popular with viewers. In January 2002, Fox News surpassed CNN in ratings for the first time. Accelerating in the 2000s, the role of conservative media and Fox News led to the channel being trusted by the Republican Party's base over traditional conservative elites, and partly led to Donald Trump's victory in the Republican primaries against the wishes of a very weak party establishment and traditional power brokers. Fox News subsequently became solidly pro-Trump, and cultivated deep ties between itself and the government. During Trump's first term, nearly 20 current and former Fox News hosts received administrative and cabinet-level positions in his administration, and his second term also featured 23 current and former Fox News hosts appointed and nominated. In 2023, The Economist reported that Murdoch had "ditched a plan" to remerge News Corporation with Fox because it "faced resistance from News Corp investors unhappy at the prospect of being lumped together with Fox News, which they consider a toxic brand." Later that year, Murdoch said he would step down and that his son Lachlan would take over both Fox Corporation and News Corp, although the succession was disputed legally. In September 2025, Lachlan Murdoch secured control of Fox News, the New York Post and The Wall Street Journal in a $3.3 billion deal as part of a renegotiated trust.
The new trust and Lachlan's control were described as ensuring the channel's conservative slant until the trust's expiration in 2050.
Political alignment
Fox News has been identified as practicing biased and false reporting in favor of the Republican Party, its politicians, and conservative causes, while portraying the Democratic Party in a negative light. Fox News has been characterized by critics, commentators, and researchers as an advocacy news organization[b] and as damaging to the integrity of news overall. It has been criticized for sharing propaganda.[c] The network is pro-Trump. During and after the 2020 presidential election, its primetime hosts promoted Trump and the Republican Party, and host Jeanine Pirro was in communication with the chair of the Republican National Committee. By 2017, a growing number of studies and academic literature found Fox's prime-time programming engaging in rhetorical and nonfactual themes closer to propaganda than to journalism or persuasion. Academic studies have argued that it has played a major role in boosting Republican turnout in American elections and that its role in American politics has been underestimated by political and communications scholars. Fox has been described as operating in an information silo in which its audience views other media sources as "too liberal" and thus relies on Fox and no other forms of news media. Researchers and commentators have compared conservative Fox News to liberal MSNBC as similar in purpose, but note that "the proportion of Fox News statements that are mostly false or worse is almost 50 percent higher than for MSNBC, and more than twice that of CNN". Its news coverage has gradually shifted further rightward over time. Fox's most popular programs, such as Hannity and Tucker Carlson Tonight, do not make any claims to be accurate or fact-checked, and have little to no distinction between news and commentary.
Media analyst Brian Stelter, who has written extensively about the network, observed in 2021 that in recent years it had adjusted its programming to present "less news on the air and more opinions-about-the-news" throughout the day, on concerns it was losing viewers to more conservative competitors that were presenting such content.
Outlets
FNC maintains an archive of most of its programs. This archive also includes the Movietone News series of newsreels from its now Disney-owned namesake movie studio, 20th Century Studios. Licensing for the Fox News archive is handled by ITN Source, the archiving division of ITN. FNC presents a variety of programming, with up to 15 hours of live broadcasting per day, in addition to programming and content for the Fox Broadcasting Company. Most programs are broadcast from Fox News headquarters in New York City (at 1211 Avenue of the Americas), in its streetside studio on Sixth Avenue in the west wing of Rockefeller Center, sharing its headquarters with sister channel Fox Business Network. Fox News Channel has eight studios at its New York City headquarters that are used for its and Fox Business' programming: Studio B (used for Fox Business programming), Studio D (which has an area for studio audiences; no longer in current use), Studio E (used for Gutfeld! and The Journal Editorial Report), Studio F (used for The Story with Martha MacCallum, The Five, Fox Democracy 2020, Fox & Friends, Outnumbered, The Faulkner Focus, and Fox News Primetime), Studio G (which houses Fox Business shows, The Fox Report, Your World with Neil Cavuto, and Cavuto Live), Studio H (the Fox News Deck, used for breaking news coverage; no longer in current use), and Studio J (used for America's Newsroom, Hannity, Fox News Live, Fox & Friends First, and Sunday Morning Futures). Starting in 2018, Thursday Night Football had its pregame show, Fox NFL Thursday, originating from Studio F. Another Fox Sports program, First Things First, also broadcasts from Studio E.
Other such programs (such as Special Report with Bret Baier, The Ingraham Angle, Fox News @ Night, Media Buzz, and editions of Fox News Live not broadcast from the New York City studios) are broadcast from Fox News's Washington, D.C. studios, located on Capitol Hill across from Washington Union Station in a secured building shared by a number of other television networks, including NBC News and C-SPAN. The Next Revolution is broadcast from Fox News' Los Angeles bureau studio, which is also used for news updates from Los Angeles. Life, Liberty & Levin is produced from Levin's personal studio in Virginia. Audio simulcasts of the channel are aired on SiriusXM Satellite Radio. In an October 11, 2009, New York Times article, Fox said its hard-news programming runs from "9 AM to 4 PM and 6 to 8 PM on weekdays". However, it makes no such claims for its other broadcasts, which primarily consist of editorial journalism and commentary. Fox News Channel began broadcasting in the 720p resolution format on May 1, 2008. This format is available on all major cable and satellite providers. Fox News Media produces Fox News Sunday, which airs on Fox Broadcasting and re-airs on the Fox News Channel. Fox News also produces occasional special event coverage that is broadcast on Fox Business. With the growth of the FNC, the company introduced a radio division, Fox News Radio, in 2003. Syndicated throughout the United States, the division provides short newscasts and talk radio programs featuring personalities from the television and radio divisions. In 2006, the company also introduced Fox News Talk, a satellite radio station featuring programs syndicated by (and featuring) Fox News personalities. Introduced in December 1995, the Fox News website features news articles and videos about national and international news. Content on the website is divided into politics, media, U.S., and business.
Fox News' articles are based on the network's broadcasts, reports from Fox affiliates, and articles produced by other news agencies, such as the Associated Press. Articles are usually accompanied by a video related to the article. Fox News Latino is the version aimed at a Hispanic audience, although presented almost entirely in English, with a Spanish section. According to NewsGuard, "Much of FoxNews.com's content, particularly articles produced by beat reporters and broadcasts produced by network correspondents, is accurate and well-sourced ... However, FoxNews.com has regularly advanced false and misleading claims on topics including the Jan. 6, 2021, attack on the U.S. Capitol, the Russo-Ukrainian War, COVID-19, and U.S. elections". In September 2008, FNC joined other channels in introducing live streaming to its website with The Strategy Room, designed to appeal to older viewers. It airs weekdays from 9 AM to 5 PM and takes the form of an informal discussion, with running commentary on the news. Regular discussion programs include Business Hour, News With a View and God Talk. In March 2009, The Fox Nation was launched as a website intended to encourage readers to post articles commenting on the news. Fox News Mobile is the portion of the FNC website dedicated to streaming news clips formatted for video-enabled mobile phones. In 2018, Fox News announced that it would launch a subscription video on demand service known as Fox Nation. It serves as a companion service to FNC, carrying original and acquired talk, documentary, and reality programming designed to appeal to Fox News viewers. Some of its original programs feature Fox News personalities and contributors.
Ratings and reception
In 2003, Fox News saw a large ratings jump during the early stages of the U.S. invasion of Iraq. At the height of the conflict, according to some reports, Fox News had as much as a 300% increase in viewership (averaging 3.3 million viewers daily).
In 2004, Fox News' ratings for its broadcast of the Republican National Convention exceeded those of the three major broadcast networks. During President George W. Bush's address, Fox News attracted 7.3 million viewers nationally; NBC, ABC, and CBS had viewerships of 5.9 million, 5.1 million, and 5.0 million respectively. Between late 2005 and early 2006, Fox News saw brief declines in ratings. One came in the second quarter of 2006, when it lost viewers for every prime-time program compared with the previous quarter. The audience for Special Report with Brit Hume, for example, dropped 19%. Several weeks later, in the wake of the 2006 North Korean missile test and the 2006 Lebanon War, Fox saw a surge in viewership and remained the top-rated cable news channel. Fox produced eight of the top ten most-watched nightly cable news shows, with The O'Reilly Factor and Hannity & Colmes finishing first and second respectively. FNC ranked No. 8 in viewership among all cable channels in 2006, and No. 7 in 2007. The channel ranked number one during the week of Barack Obama's election (November 3–9, 2008), and reached the top spot again in January 2010 (during the week of the special Senate election in Massachusetts). Comparing Fox to its 24-hour news channel competitors in May 2010, the channel drew an average daily prime-time audience of 1.8 million viewers (versus 747,000 for MSNBC and 595,000 for CNN). In September 2009, the Pew Research Center published a report on the public view of national news organizations. In the report, 72% of polled Republican Fox viewers rated the channel as "favorable", while 43% of polled Democratic viewers and 55% of all polled viewers shared that opinion. However, Fox was given the highest "unfavorable" rating of all national outlets studied (25% of all polled viewers). The report went on to say that "partisan differences in views of Fox News have increased substantially since 2007".
A January 2020 Pew Research Center study found that 43% of all American adults trusted Fox News, including 65% of Republicans and Republican leaners, while 61% of Democrats and Democratic leaners distrusted it. A Public Policy Polling poll concluded in 2013 that positive perceptions of FNC had declined from 2010: 41% of polled voters said they trusted it, down from 49% in 2010, while 46% said they distrusted it, up from 37% in 2010. It was also called the "most trusted" network by 34% of those polled, more than said the same of any other network. On the night of October 22, 2012, Fox set a record for its highest-rated telecast, with 11.5 million viewers for the third U.S. presidential debate. In prime time the week before, Fox averaged almost 3.7 million viewers, with a total day average of 1.66 million viewers. In prime time and total day ratings for the week of April 15 to 21, 2013, Fox News, propelled by its coverage of the Boston Marathon bombing, was the highest-ranked network on U.S. cable television for the first time since August 2005, when Hurricane Katrina hit the Gulf Coast of the United States. January 2014 marked Fox News's 145th consecutive month as the highest-rated cable news channel. During that month, Fox News beat CNN and MSNBC combined in overall viewers in both prime time hours and the total day. In the third quarter of 2014, the network was the most-watched cable channel during prime time hours. During the final week of the campaign for the 2014 United States elections, Fox News had the highest ratings of any cable channel, news or otherwise. On election night itself, Fox News' coverage had higher ratings than that of any of the other five cable or network news sources among viewers between 25 and 54 years of age. The network hosted the first prime-time GOP candidates' forum of the 2016 campaign on August 6, 2015. The debate reached a record-breaking 24 million viewers, by far the largest audience for any cable news event.
A 2017 study by the Berkman Klein Center for Internet & Society at Harvard University found that Fox News was the third most-shared source among supporters of Donald Trump on Twitter during the 2016 presidential election, behind The Hill and Breitbart News. In 2018, Fox News was rated by Nielsen as America's most watched cable network, averaging a record 2.4 million viewers in prime time and total day during the period of January 1 to December 30, 2018. In an October 2018 Simmons Research survey of trust in 38 news organizations, Fox News was ranked roughly in the center, with 44.7% of surveyed Americans saying they trusted it. The COVID-19 pandemic led to increased viewership for all cable news networks. For the first calendar quarter of 2020 (January 1 – March 31), Fox News had its highest-rated quarter in the network's history, with Nielsen showing a prime time average total audience of 3.387 million viewers. Sean Hannity's program, Hannity, airing weeknights at 9 pm ET, was the top-rated show in cable news for the quarter, averaging 4.2 million viewers, a figure that not only beat out all of its cable news competition but also placed it ahead of network competition in the same time slot. Fox ended the quarter with the top five shows in prime time: Tucker Carlson Tonight finished the quarter in second overall with an average audience of 4.2 million viewers, followed by The Five, The Ingraham Angle, and Special Report with Bret Baier. The Rachel Maddow Show was the highest-rated non-Fox show on cable, coming in sixth place. Finishing the quarter in 22nd place was The Lead with Jake Tapper, CNN's highest-rated show. According to a Fox News article on the subject, Fox & Friends averaged 1.8 million viewers, topping CNN's New Day and MSNBC's Morning Joe combined.
The same Fox News article said that the Fox Business Network also had its highest-rated quarter in history and that Fox News finished March as the highest-rated network in cable for the 45th consecutive month. According to the Los Angeles Times on August 19, 2020: "Fox News Channel had six of last week's 11 highest-rated prime-time programs to finish first in the network ratings race for the third time since June" 2020. A Morning Consult survey the week after Election Day 2020 showed 30 percent of Republicans in the United States had an unfavorable opinion of Fox News, while 54 percent of Republicans viewed the network favorably, compared to 67 percent before the election. A McClatchy news story suggested criticism from Donald Trump as a major reason, as well as the network's early calling of Arizona for Joe Biden, and its later joining other networks in declaring Biden the winner of the 2020 election. Ratings were also down for Fox News. Although it remained ahead of other networks overall, its morning show fell out of first place for the first time since 2001. Trump recommended OANN, which was gaining viewers, and Newsmax was also increasing in popularity. Following the post-election decline, Fox News regained its lead in cable news ratings ahead of CNN and MSNBC in 2021. As indicated by a 2013 New York Times article based on Nielsen statistics, Fox appears to have a mostly older audience. In March 2024, Fox was the most watched news network in both primetime and total day, averaging 2.135 million viewers in primetime and 1.306 million in total day, compared with MSNBC's 1.307 million in primetime and 830,000 in total day, and CNN's 601,000 in primetime and 462,000 in total day.
In the adults 25–54 category, Fox also leads with 246,000 viewers in primetime and 158,000 in total day, followed by MSNBC with 133,000 in primetime and 86,000 in total day, and CNN with 124,000 in primetime and 85,000 in total day. According to the same Nielsen analysis, MSNBC is the second most watched news network. In 2008, in the 25–54 age group, Fox News had an average of 557,000 viewers, but dropped to 379,000 in 2013 while increasing its overall audience from 1.89 million in 2010 to 2.02 million in 2013. The median age of a prime-time viewer was 68 as of 2015. A 2019 Pew Research Center survey showed that among those who named Fox News as their main source for political news, 69% are aged 50 or older. According to a 2013 Gallup poll, 94% of Fox viewers "either identify as or lean Republican". The 2019 Pew survey showed that among people who named Fox News as their main source for political and election news, 93% identify as Republicans. Among the top eight political news sources named by at least 2% of American adults, the results show Fox News and MSNBC as the two news channels with the most partisan audiences.
Slogan
Fox News Channel originally used the slogan "Fair and Balanced", which was coined by network co-founder Roger Ailes while the network was being established. The New York Times described the slogan as a "blunt signal that Fox News planned to counteract what Mr. Ailes and many others viewed as a liberal bias ingrained in television coverage by establishment news networks". In a 2013 interview with Peter Robinson of the Hoover Institution, Rupert Murdoch defended the company's "Fair and Balanced" slogan, saying, "In fact, you'll find just as many Democrats as Republicans on and so on". In August 2003, Fox News sued comedian Al Franken over his use of the slogan as a subtitle for his book, Lies and the Lying Liars Who Tell Them: A Fair and Balanced Look at the Right, which is critical of Fox News Channel.
The lawsuit was dropped three days later, after Judge Denny Chin refused its request for an injunction. In his decision, Chin ruled the case was "wholly without merit, both factually and legally". He went on to suggest that Fox News' trademark on the phrase "fair and balanced" could be invalid. In December 2003, FNC won a legal battle concerning the slogan, when AlterNet filed a cancellation petition with the United States Patent and Trademark Office (USPTO) to have FNC's trademark rescinded as inaccurate. AlterNet included Robert Greenwald's documentary film Outfoxed (2004) as supporting evidence in its case. After losing early motions, AlterNet withdrew its petition; the USPTO dismissed the case. In 2008, FNC used the slogan "We Report, You Decide", referring to "You Decide 2008" (FNC's original slogan for its coverage of election issues). In August 2016, Fox News Channel began to quietly phase out the "Fair and Balanced" slogan in favor of "Most Watched, Most Trusted"; when these changes were reported in June 2017 by Gabriel Sherman (a writer who had written a biography of Ailes), a network executive said the change "has nothing to do with programming or editorial decisions". Media outlets speculated that Fox News Channel wished to distance itself from Ailes' tenure at the network. In March 2018, the network introduced a new ad campaign, Real News. Real Honest Opinion. The ad campaign is intended to promote the network's opinion-based programming and counter perceptions surrounding "fake news". In mid-November 2020, following the election, Fox News began to use the slogan "Standing Up For What's Right" to promote its primetime lineup.
Content
Fox News provided extensive coverage of the 2012 Benghazi attack, which host Sean Hannity described in December 2012 as "the story that the mainstream media ignores" and "obviously, a cover-up. And we will get to the bottom of it."
Programming analysis by media watchdog Media Matters, which has declared a "War on Fox News", found that during the twenty months following the Benghazi attacks, FNC ran 1,098 segments on the issue. Over nearly four years after the Benghazi attack, there were ten official investigations, including six by Republican-controlled House committees. None of the investigations found any evidence of scandal, cover-up or lying by Obama administration officials. From 2015 into 2018, Fox News broadcast extensive coverage of an alleged scandal surrounding the sale of Uranium One to Russian interests, which host Sean Hannity characterized as "one of the biggest scandals in American history". According to Media Matters, the Fox News coverage extended throughout the programming day, with particular emphasis by Hannity. The network promoted an ultimately unfounded narrative asserting that, as Secretary of State, Hillary Clinton personally approved the Uranium One sale in exchange for $145 million in bribes paid to the Clinton Foundation. Donald Trump repeated these allegations as a candidate and as president. No evidence of wrongdoing by Clinton had been found after four years of allegations, an FBI investigation, and the 2017 appointment of a federal attorney to evaluate the investigation. In November 2017, Fox News host Shepard Smith concisely debunked the alleged scandal, infuriating viewers who suggested he should work for CNN or MSNBC. Hannity later called Smith "clueless", while Smith stated: "I get it, that some of our opinion programming is there strictly to be entertaining. I get that. I don't work there. I wouldn't work there." Fox News has been described as conservative media, and as providing biased reporting in favor of conservative political positions, the Republican Party, and President Donald Trump. Political scientist Jonathan Bernstein described Fox News as an expanded part of the Republican Party. Political scientists Matt Grossmann and David A.
Hopkins wrote that Fox News helped "Republicans communicate with their base and spread their ideas, and they have been effective in mobilizing voters to participate in midterm elections (as in 2010 and 2014)." Prior to 2000, Fox News lacked an ideological tilt, and more Democrats than Republicans watched the channel. During the 2004 United States presidential election, Fox News was markedly more hostile in its coverage of Democratic presidential nominee John Kerry, and distinguished itself among cable news outlets for heavy coverage of the Swift Boat smear campaign against Kerry. During President Obama's first term in office, Fox News helped launch and amplify the Tea Party movement, a conservative movement within the Republican Party that organized protests against Obama and his policies. In the 2004 documentary Outfoxed, four people identified as former employees said that Fox News made them "slant the news in favor of conservatives". Fox News said that the film misrepresented the employment of these employees. During the 2016 Republican primaries, Fox News was perceived as trying to prevent Trump from clinching the nomination. Under Trump's presidency, Fox News remade itself in his image, as hardly any criticism of Trump could be heard on its prime-time shows. In its news reporting, the network dedicated far more coverage to Hillary Clinton-related stories, which critics argued was intended to deflect attention from the investigation into Russian interference in the 2016 United States elections. Trump provided significant access to Fox News during his presidency, giving 19 interviews to the channel but only six in total to other news channels by November 2017; The New York Times described Trump's Fox News interviews as "softball interviews" and some of the interviewers' styles as "fawning". In July 2018, The Economist described the network's coverage of Trump's presidency as "reliably fawning".
From 2015 to 2017, the Fox News prime-time lineup changed from being skeptical and questioning of Trump to a "Trump safe space, with a dose of Bannonist populism once considered on the fringe". The Fox News website has also become more extreme in its rhetoric since Trump's election; according to Columbia University's Tow Center for Digital Journalism, the Fox News website has "gone a little Breitbart" over time. At the start of 2018, Fox News mostly ignored high-profile scandals in the Trump administration which received ample coverage in other national media outlets, such as White House Staff Secretary Rob Porter's resignation amid domestic abuse allegations, the downgrading of Jared Kushner's security clearance, and the existence of a non-disclosure agreement between Trump and the porn star Stormy Daniels. In March 2019, Jane Mayer reported in The New Yorker that Fox News.com reporter Diana Falzone had the story of the Stormy Daniels–Donald Trump scandal before the 2016 election, but that Fox News executive Ken LaCorte told her: "Good reporting, kiddo. But Rupert [Murdoch] wants Donald Trump to win. So just let it go." The story was killed; LaCorte denied making the statement to Falzone, but conceded: "I was the person who made the call. I didn't run it upstairs to Roger Ailes or others. ... I didn't do it to protect Donald Trump." He added that "[Falzone] had put up a story that just wasn't anywhere close to being something I was comfortable publishing." Nik Richie, who claimed to be one of the sources for the story, called LaCorte's account "complete bullshit", adding that "Fox News was culpable. I voted for Trump, and I like Fox, but they did their own 'catch and kill' on the story to protect him." A 2008 study found Fox News gave disproportionate attention to polls suggesting low approval for President Bill Clinton.
A 2009 study found Fox News was less likely to pick up stories that reflected well on Democrats, and more likely to pick up stories that reflected well on Republicans. A 2010 study comparing Fox News Channel's Special Report With Brit Hume and NBC's Nightly News coverage of the wars in Iraq and Afghanistan during 2005 concluded "Fox News was much more sympathetic to the administration than NBC", suggesting "if scholars continue to find evidence of a partisan or ideological bias at FNC ... they should consider Fox as alternative, rather than mainstream, media". Research finds that Fox News increases Republican vote shares and makes Republican politicians more partisan. A 2007 study, using the introduction of Fox News into local markets (1996–2000) as an instrumental variable, found that in the 2000 presidential election "Republicans gained 0.4 to 0.7 percentage points in the towns that broadcast Fox News", suggesting "Fox News convinced 3 to 28 percent of its viewers to vote Republican, depending on the audience measure". These results were confirmed by a 2015 study. A 2014 study, using the same instrumental variable, found congressional "representatives become less supportive of President Clinton in districts where Fox News begins broadcasting than similar representatives in similar districts where Fox News was not broadcast." Another 2014 paper found Fox News viewing increased Republican vote shares among voters who identified as Republican or independent. A 2017 study, using channel positions as an instrumental variable, found "Fox News increases Republican vote shares by 0.3 points among viewers induced into watching 2.5 additional minutes per week by variation in position." 
This study used a different methodology for a later period and found an even bigger effect, leading Matthew Yglesias to write in the academic journal Political Communication that the results "suggest that conventional wisdom may be greatly underestimating the significance of Fox as a factor in American politics." Fox News publicly denies that it is biased; both Murdoch and Ailes have rejected the charge, with Murdoch stating that Fox has "given room to both sides, whereas only one side had it before". In June 2009, Fox News host Chris Wallace said: "I think we are the counter-weight [to NBC News] ... they have a liberal agenda, and we tell the other side of the story." In 2004, Robert Greenwald's documentary film Outfoxed: Rupert Murdoch's War on Journalism argued Fox News had a conservative bias and featured clips from Fox News and internal memos from editorial vice president John Moody directing Fox News staff on how to report certain subjects. Fox News' most popular programs, such as those hosted by Sean Hannity and Tucker Carlson, make no claim to be accurate or fact-checked, and draw little to no distinction between news and commentary. A leaked memo from Fox News vice president Bill Sammon to news staff at the height of the debate over health care reform in the United States has been cited as an example of the network's pro-Republican bias. His memo asked the staff to "use the term 'government-run health insurance,' or, when brevity is a concern, 'government option,' whenever possible". The memo was sent shortly after Republican pollster Frank Luntz advised Sean Hannity on his Fox show: "If you call it a public option, the American people are split. If you call it the government option, the public is overwhelmingly against it." Surveys suggest Fox News is widely perceived to be ideological.
A 2009 Pew survey found Fox News is viewed as the most ideological channel in America, with 47 percent of those surveyed saying Fox News is "mostly conservative", 14 percent saying "mostly liberal", and 24 percent saying "neither". In comparison, MSNBC had 36 percent identify it as "mostly liberal", 11 percent as "mostly conservative", and 27 percent as "neither". CNN had 37 percent describe it as "mostly liberal", 11 percent as "mostly conservative", and 33 percent as "neither". A 2004 Pew Research Center survey found FNC was cited (unprompted) by 69 percent of national journalists as a conservative news organization. A Rasmussen poll found 31 percent of Americans felt Fox News had a conservative bias, and 15 percent that it had a liberal bias. It found 36 percent believed Fox News delivers news with neither a conservative nor a liberal bias, compared with 37 percent who said the same of NPR and 32 percent who said the same of CNN. David Carr, media critic for The New York Times, praised Fox News' coverage of the 2012 United States presidential election results for the network's response to Republican adviser and Fox News contributor Karl Rove challenging its call that Barack Obama would win Ohio and the election. Fox's prediction was correct. Carr wrote: "Over many months, Fox lulled its conservative base with agitprop: that President Obama was a clear failure, that a majority of Americans saw [Mitt] Romney as a good alternative in hard times, and that polls showing otherwise were politically motivated and not to be believed. But on Tuesday night, the people in charge of Fox News were confronted with a stark choice after it became clear that Mr. Romney had fallen short: was Fox, first and foremost, a place for advocacy or a place for news? In this moment, at least, Fox chose news."
A May 2017 study conducted by Harvard University's Shorenstein Center on Media, Politics and Public Policy examined coverage of Trump's first 100 days in office by several major mainstream media outlets including Fox. It found Trump received 80% negative coverage from the overall media, and received the least negative coverage on Fox – 52% negative and 48% positive. On March 14, 2017, Andrew Napolitano, a Fox News commentator, claimed on Fox & Friends that British intelligence agency GCHQ had wiretapped Trump on behalf of Barack Obama during the 2016 United States presidential election. On March 16, 2017, White House spokesman Sean Spicer repeated the claim. When Trump was questioned about the claim at a news conference, he said "All we did was quote a certain very talented legal mind who was the one responsible for saying that on television. I didn't make an opinion on it." On March 17, 2017, Shepard Smith, a Fox News anchor, admitted the network had no evidence that Trump was under surveillance. British officials said the White House was backing off the claim. Napolitano was later suspended by Fox News for making the claim. In June 2018, Fox News executives instructed producers to head off inappropriate remarks made on the shows aired by the network by hosts and commentators. The instructions came after a number of Fox News hosts and guests made incendiary comments about the Trump administration's policy of separating migrant children from their parents. Fox News host Laura Ingraham had likened the child detention centers that the children were in to "summer camps". Guest Corey Lewandowski mocked the story of a 10-year-old child with Down syndrome being separated from her mother; the Fox News host did not address Lewandowski's statement. Guest Ann Coulter falsely claimed that the separated children were "child actors"; the Fox News host did not challenge her claim. 
In a segment on Trump's alleged use of racial dog whistles, one Fox News contributor told an African-American whom he was debating: "You're out of your cotton-picking mind." According to the 2016 book Asymmetric Politics by political scientists Matt Grossmann and David A. Hopkins, "Fox News tends to raise the profile of scandals and controversies involving Democrats that receive scant attention in other media, such as the relationship between Barack Obama and William Ayers ... Hillary Clinton's role in the fatal 2012 attacks on the American consulate in Benghazi, Libya; the gun-running scandal known as 'Fast and Furious'; the business practices of federal loan guarantee recipient Solyndra; the past activism of Obama White House operative Van Jones; the 2004 attacks on John Kerry by the Swift Boat Veterans for Truth; the controversial sermons of Obama's Chicago pastor Jeremiah Wright; the filming of undercover videos of supposed wrongdoing by the liberal activist group ACORN; and the 'war on Christmas' supposedly waged every December by secular, multicultural liberals." In October 2018, Fox News ran laudatory coverage of a meeting between Trump-supporting rapper Kanye West and President Trump in the Oval Office. Fox News had previously run negative coverage of rappers and their involvement with Democratic politicians and causes, such as when Fox News ran headlines describing conscious hip-hop artist Common as "vile" and a "cop-killer rapper", and when Fox News ran negative coverage of Kanye West before he became a Trump supporter. On November 4, 2018, Trump's website, DonaldJTrump.com, announced in a press release that Fox News host Sean Hannity would make a "special guest appearance" with Trump at a midterm campaign rally the following night in Cape Girardeau, Missouri. The following morning, Hannity tweeted "To be clear, I will not be on stage campaigning with the President." 
Hannity appeared at the president's lectern on stage at the rally, immediately mocking the "fake news" at the back of the auditorium, Fox News reporters among them. Several Fox News employees expressed outrage at Hannity's actions, with one stating that "a new line was crossed". Hannity later asserted that his action was not pre-planned, and Fox News stated it "does not condone any talent participating in campaign events". Fox News host Jeanine Pirro also appeared on stage with Trump at the rally. The Trump press release was later removed from Trump's website. Fox News released a poll of registered voters, jointly conducted by two polling organizations, on June 16, 2019. The poll found some results unfavorable to Trump: a record-high 50% thought the Trump campaign had coordinated with the Russian government, and 50% thought he should be impeached (43% saying he should also be removed from office), while 48% said they did not favor impeachment. The next morning on Fox & Friends First, host Heather Childers twice misrepresented the poll results, stating "a new Fox News poll shows most voters don't want impeachment" and "at least half of U.S. voters do not think President Trump should be impeached," while the on-screen display of the actual poll question was also incorrect. Later that morning on America's Newsroom, the on-screen display showed the correct poll question and results, but highlighted the 48% of respondents who opposed impeachment rather than the 50% who supported it (the latter being broken out into two figures). As host Bill Hemmer drew guest Byron York's attention to the 48% opposed figure, they did not discuss the 50% support figure, while the on-screen chyron read: "Fox News Poll: 43% Support Trump's Impeachment and Removal, 48% Oppose." Later that day, Trump tweeted: "@FoxNews Polls are always bad for me...Something weird going on at Fox."
In April 2017, it became known that former Obama administration national security advisor Susan Rice had sought the unmasking of Trump associates who were unidentified in intelligence reports, notably Trump's incoming national security advisor Michael Flynn, during the presidential transition. In May 2020, acting Director of National Intelligence Richard Grenell, a Trump loyalist, declassified a list of Obama administration officials who had also requested unmasking of Trump associates, which was subsequently publicly released by Republican senators. That month, Attorney General Bill Barr appointed federal prosecutor John Bash to examine the unmaskings. Fox News primetime hosts declared the unmaskings a "domestic spying operation" for which the Obama administration was "exposed" in the "biggest abuse of power" in American history. The Bash inquiry closed months later with no findings of substantive wrongdoing. However, certain Fox personalities have not had as favorable a reception from Trump: news anchors Shepard Smith (who left Fox in 2019) and Chris Wallace have been criticized by Trump for allegedly being adversarial, alongside Fox analyst Andrew Napolitano, who said Trump's actions in the Trump–Ukraine scandal were "both criminal and impeachable behavior". Trump was also critical of the network's 2019 hiring of former DNC chair Donna Brazile. The relationship between Trump and Fox News, as well as other Rupert Murdoch-controlled outlets, soured after the 2020 United States presidential election, as Trump refused to concede that Joe Biden had been elected president. This tonal shift, and Trump supporters' growing antipathy toward Fox, led to increased viewership of Newsmax and One America News; in response, Fox released promotional videos of its opinion hosts disputing the election results, promoting a Trump-affiliated conspiracy theory about voter fraud.
By one measure, Newsmax saw a 497% spike in viewership, while Fox News saw a 38% decline. Writing for the Poynter Institute for Media Studies in February 2021, senior media writer Tom Jones argued that the primary distinction between Fox News and MSNBC is not right bias vs. left bias, but rather that much of the content on Fox News, especially during its primetime programs, "is not based in truth". The Tampa Bay Times reported in August 2021 that it had reviewed four months of emails indicating Fox News producers had coordinated with aides of Florida governor Ron DeSantis to promote his political prospects by inviting him for frequent network appearances, exchanging talking points and, in one case, helping him to stage an exclusive news event. In February 2024, Alan Rosenblatt of Johns Hopkins University said that Fox News "is an entertainment company that has a news division, not a news company", adding that it "not only does not provide that distinction, it goes out of its way to make it difficult to see the difference. They make their opinion programs look like news programs, and they incorporate enough opinion content on their news programs to further that deception." In early 2024, Fox News host Jesse Watters promoted a conspiracy theory involving Taylor Swift, Travis Kelce, and the Democratic Party in hopes of influencing voters ahead of the U.S. presidential primary season. Fox News has published headlines accusing the English Wikipedia of having a left-wing and socialist bias. On October 30, 2017, when special counsel Robert Mueller indicted Paul Manafort and Rick Gates, and revealed George Papadopoulos had pleaded guilty (all of whom were involved in the Trump 2016 campaign), this was the focus of most media's coverage, except Fox News'. Hosts and guests on Fox News called for Mueller to be fired. 
Sean Hannity and Tucker Carlson focused their shows on unsubstantiated allegations that Clinton sold uranium to Russia in exchange for donations to the Clinton Foundation and on the Clinton campaign's role in funding the Steele dossier. Hannity asserted: "The very thing they are accusing President Trump of doing, they did it themselves." During the segment, Hannity mistakenly referred to Clinton as President Clinton. Fox News dedicated extensive coverage to the uranium story, which Democrats said was an attempt to distract from Mueller's intensifying investigation. CNN described the coverage as "a tour de force in deflection and dismissal". On October 31, CNN reported Fox News employees were dissatisfied with their outlet's coverage of the Russia investigation, with employees calling it an "embarrassment", "laughable", and saying it "does the viewer a huge disservice and further divides the country" and that it is "another blow to journalists at Fox who come in every day wanting to cover the news in a fair and objective way". When the investigation by special counsel Robert Mueller into Russian interference in the 2016 presidential election intensified in October 2017, the focus of Fox News coverage turned to "what they see as the scandal and wrongdoing of President Trump's political opponents. In reports like these, Bill and Hillary Clinton are prominent and recurring characters because they are considered the real conspirators working with the Russians to undermine American democracy." Paul Waldman of The Washington Post described the coverage as "No puppet. You're the puppet", saying it was a "careful, coordinated, and comprehensive strategy" to distract from Mueller's investigation. German Lopez of Vox said Fox News' coverage had reached "levels of self-parody" as it dedicated coverage to low-key stories, such as a controversial Newsweek op-ed and hamburger emojis, while other networks had wall-to-wall coverage of Mueller's indictments.
A FiveThirtyEight analysis of Russia-related media coverage in cable news found most mentions of Russia on Fox News were spoken in close proximity to "uranium" and "dossier". On November 1, 2017, Vox analyzed the transcripts of Fox News, CNN and MSNBC, and found Fox News "was unable to talk about the Mueller investigation without bringing up Hillary Clinton", "talked significantly less about George Papadopoulos—the Trump campaign adviser whose plea deal with Mueller provides the most explicit evidence thus far that the campaign knew of the Russian government's efforts to help Trump—than its competitors", and "repeatedly called Mueller's credibility into question". In December 2017, Fox News escalated its attacks on the Mueller investigation, with hosts and guest commentators suggesting the investigation amounted to a coup. Guest co-host Kevin Jackson referred to a right-wing conspiracy theory claiming FBI agent Peter Strzok's messages were evidence of a plot by FBI agents to assassinate Trump, a claim the other Fox co-hosts quickly said was not supported by any credible evidence. Fox News host Jeanine Pirro called the Mueller investigation team a "criminal cabal" and said the team ought to be arrested. Other Fox News figures referred to the investigation as "corrupt", "crooked", and "illegitimate", and likened the FBI to the KGB, the Soviet-era spy organization that routinely tortured and summarily executed people. Political scientists and scholars of coups rejected the notion that the Mueller investigation amounted to a coup, describing the Fox News rhetoric as dangerous to democracy and mirroring the kind of rhetoric that occurs before purges. A number of observers argued the Fox News rhetoric was intended to discredit the Mueller investigation and sway President Donald Trump to fire Mueller.
In August 2018, Fox News was criticized for giving more prominent coverage to a murder committed by an undocumented immigrant than to the convictions of Donald Trump's former campaign manager, Paul Manafort, and his long-term personal attorney, Michael Cohen. At the same time, most other national mainstream media gave wall-to-wall coverage of the convictions. Fox News hosts Dana Perino and Jason Chaffetz argued that voters care far more about the murder than the convictions of the President's former top aides, and hosts Tucker Carlson and Sean Hannity downplayed the convictions. In November 2017, following the 2017 New York City truck attack wherein a terrorist shouted "Allahu Akbar", Fox News distorted a statement by Jake Tapper to make it appear as if he had said "Allahu Akbar" can be used under the most "beautiful circumstances". Fox News omitted that Tapper had said the use of "Allahu Akbar" in the terrorist attack was not one of these beautiful circumstances. A headline on FoxNews.com was preceded by a tag reading "OUTRAGEOUS". The Fox News Twitter account distorted the statement further, saying "Jake Tapper Says 'Allahu Akbar' Is 'Beautiful' Right After NYC Terror Attack" in a tweet that was later deleted. Tapper chastised Fox News for choosing to "deliberately lie" and said "there was a time when one could tell the difference between Fox and the nutjobs at Infowars. It's getting tougher and tougher. Lies are lies." In 2009, Tapper had come to the defense of Fox News while he was a White House correspondent for ABC News, after the Obama administration claimed that the network was not a legitimate news organization. Fox News guest host Jason Chaffetz apologized to Tapper for misrepresenting his statement. After Fox News had deleted the tweet, Sean Hannity repeated the misrepresentation, called Tapper "liberal fake news CNN's fake Jake Tapper", and mocked his ratings.
In July 2017, a report by Fox & Friends falsely said The New York Times had disclosed intelligence in one of its stories and that this intelligence disclosure helped Abu Bakr al-Baghdadi, the leader of the Islamic State, to evade capture. The report cited an inaccurate assertion by Gen. Tony Thomas, the head of the United States Special Operations Command, that a major newspaper had disclosed the intelligence. Fox News said it was The New York Times, repeatedly running the chyron "NYT Foils U.S. Attempt To Take Out Al-Baghdadi". Pete Hegseth, one of the show's hosts, criticized the "failing New York Times". President Donald Trump tweeted about the Fox & Friends report shortly after it first aired, saying "The Failing New York Times foiled U.S. attempt to kill the single most wanted terrorist, Al-Baghdadi. Their sick agenda over National Security." Fox News later updated the story, but without apologizing to The New York Times or responding directly to the inaccuracies. In a Washington Post column, Erik Wemple said Chris Wallace had covered The New York Times story himself on Fox News Sunday, adding: "Here's another case of the differing standards between Fox News's opinion operation", which has given "a state-run vibe on all matters related to Trump", compared to Fox News's news operation, which has provided "mostly sane coverage". Fox News has often been described as a major platform for climate change denial.[a] A 2011 study by Lauren Feldman and Anthony Leiserowitz found Fox News "takes a more dismissive tone toward climate change than CNN and MSNBC". A 2008 study found Fox News emphasized the scientific uncertainty of climate change more than CNN, was less likely to say climate change was real, and more likely to interview climate change skeptics.
Leaked emails showed that in 2009 Bill Sammon, the Fox News Washington managing editor, instructed Fox News journalists to dispute the scientific consensus on climate change and "refrain from asserting that the planet has warmed (or cooled) in any given period without IMMEDIATELY pointing out that such theories are based upon data that critics have called into question." According to climate scientist Michael E. Mann, Fox News "has constructed an alternative universe where the laws of physics no longer apply, where the greenhouse effect is a myth, and where climate change is a hoax, the product of a massive conspiracy among scientists, who somehow have gotten the polar bears, glaciers, sea levels, superstorms, and megadroughts to play along." According to James Lawrence Powell's 2011 study of the climate science denial movement, Fox News provides "the deniers with a platform to say whatever they like without fear of contradiction." Fox News employs Steve Milloy, a prominent climate change denier with close financial and organizational ties to oil companies, as a contributor, and has failed to disclose his substantial funding from oil companies in his columns about climate change for FoxNews.com. In 2011, the hosts of Fox & Friends described climate change as "unproven science" and a "disputed fact", and criticized the Department of Education for working together with the children's network Nickelodeon to teach children about climate change. In 2001, Sean Hannity described the scientific consensus on climate change as "phony science from the left". In 2004, he falsely alleged that "scientists still can't agree on whether the global warming is scientific fact or fiction".
In 2010, Hannity called the so-called "Climategate" – the leak of e-mails from climate scientists that climate change skeptics claimed demonstrated scientific misconduct, though subsequent inquiries found no evidence of misconduct or wrongdoing – a "scandal" that "exposed global warming as a myth cooked up by alarmists". Hannity frequently invites contrarian fringe scientists and critics of climate change to his shows. In 2019, a widely shared Fox News report falsely claimed that new climate science research showed the Earth might be heading toward a new Ice Age; the author of the study Fox News cited said that Fox News "utterly misrepresents our research" and that the study did not in any way suggest Earth was heading toward an Ice Age. Fox News later corrected the story. Shepard Smith drew attention as one of the few voices formerly on Fox News to forcefully state that climate change is real, that human activities are a primary contributor to it, and that there is a scientific consensus on the issue. His acceptance of the scientific consensus on climate change drew criticism from Fox News viewers and conservatives. Smith left Fox News in October 2019. In a 2021 interview with Christiane Amanpour on her eponymous show on CNN, he stated that his presence on Fox had become "untenable" due to the "falsehoods" and "lies" intentionally spread on the network's opinion shows. On May 16, 2017, a day when other news organizations were extensively covering Donald Trump's revelation of classified information to Russia, Fox News ran a lead story about a private investigator's uncorroborated claims about the murder of Seth Rich, a DNC staffer. The private investigator said he had uncovered evidence that Rich was in contact with WikiLeaks and that law enforcement were covering it up.
The killing of Rich gave rise to conspiracy theories in right-wing circles alleging that Hillary Clinton and the Democratic Party had Rich killed because he was the source of the DNC leaks; U.S. intelligence agencies determined Russia was the source of the leaks. In reporting the investigator's claims, the Fox News report reignited right-wing conspiracy theories about the killing. The Fox News story fell apart within hours. Other news organizations quickly revealed the investigator was a Donald Trump supporter and had, according to NBC News, "developed a reputation for making outlandish claims, such as one appearance on Fox News in 2007 in which he warned that underground networks of pink pistol-toting lesbian gangs were raping young women." The family of Seth Rich, the Washington D.C. police department, the Washington D.C. mayor's office, the FBI, and law enforcement sources familiar with the case rebuked the investigator's claims. Rich's relatives said: "We are a family who is committed to facts, not fake evidence that surfaces every few months to fill the void and distract law enforcement and the general public from finding Seth's murderers." The spokesperson for the family criticized Fox News for its reporting, alleging the outlet was motivated by a desire to deflect attention from the Trump–Russia story: "I think there's a very special place in hell for people that would use the memory of a murder victim in order to pursue a political agenda." The family called for retractions and apologies from Fox News for the inaccurate reporting. Over the course of the day, Fox News altered the contents of the story and the headline, but did not issue corrections. When CNN contacted the private investigator later that day, he said he had no evidence that Rich had contacted WikiLeaks and claimed he had only learned about the possible existence of the evidence from a Fox News reporter.
Fox News did not respond to inquiries from CNN and The Washington Post. On May 23, seven days after the story was published, Fox News retracted its original report, saying it did not meet the network's standards. Nicole Hemmer, then assistant professor at the Miller Center of Public Affairs, wrote that the promotion of the conspiracy theory demonstrated how Fox News was "remaking itself in the image of fringe media in the age of Trump, blurring the lines between real and fake news." Max Boot of the Council on Foreign Relations said that while the intent behind Fox News as a counterweight to the liberal media was laudable, the culmination of those efforts has been to create an alternative news source that promotes hoaxes and myths, of which the promotion of the Seth Rich conspiracy is an example. Fox News was also criticized by conservative outlets, such as The Weekly Standard and National Review, and by conservative columnists, such as Jennifer Rubin, Michael Gerson, and John Podhoretz. Rich's parents, Joel and Mary Rich, sued Fox News for the emotional distress it had caused them by its false reporting. In 2020, Fox News settled with the Rich family, making a payment that was not officially disclosed but was reported to be in the seven figures. Although the settlement had been agreed to earlier in the year, Fox News arranged to delay the public announcement until after the 2020 presidential election. Fox News hosts and contributors defended Trump's remarks that "many sides" were to blame for violence at a gathering of hundreds of white nationalists in Charlottesville, Virginia; some criticized Trump. In a press conference on August 15, Trump used the term "alt-left" to describe counterprotesters at the white supremacist rally, a term which had been used in Fox News' coverage of the rally. Several of Trump's comments at the press conference mirrored those appearing earlier on Fox News.
According to Dylan Byers of CNN, Fox News' coverage on the day of the press conference "was heavy with 'whataboutism'. The average Fox viewer was likely left with the impression that the media's criticism of Trump and leftist protestors' toppling of some Confederate statues were far greater threats to America than white supremacism or the president's apparent defense of bigotry." Byers wrote that "it showed that if Fox News has a line when it comes to Trump's presidency, it was not crossed on Tuesday." During Glenn Beck's tenure at Fox News, he became one of the most high-profile proponents of conspiracy theories about George Soros, a Jewish Hungarian-American businessman and philanthropist known for his donations to American liberal political causes. Beck regularly described Soros as a "puppet-master" and used common anti-Semitic tropes to describe Soros and his activities. In a 2010 three-part series, Beck depicted Soros as a cartoonish villain trying to "form a shadow government, using humanitarian aid as a cover", and claimed that Soros wanted a one-world government. Beck promoted the false and anti-Semitic conspiracy theory that Soros was a Nazi collaborator as a 14-year-old in Nazi-occupied Hungary. Beck also characterized Soros's mother as a "wildly anti-Semitic" Nazi collaborator. According to The Washington Post: "Beck's series was largely considered obscene and delusional, if not outright anti-Semitic", but Beck's conspiracy theory became common on the right wing of American politics. Amid criticism of Beck's false smears, Fox News defended Beck, stating "information regarding Mr. Soros's experiences growing up were taken directly from his writings and from interviews given by him to the media, and no negative opinion was offered as to his actions as a child."
Roger Ailes, then-head of Fox News, dismissed criticism levied at Beck by hundreds of rabbis, saying that they were "left-wing rabbis who basically don't think that anybody can ever use the word, Holocaust, on the air." During the first few weeks of the COVID-19 pandemic in the United States, Fox News was considerably more likely than other mainstream news outlets to promote misinformation about COVID-19. The network promoted the narrative that the emergency response to the pandemic was politically motivated or otherwise unwarranted, with Sean Hannity explicitly calling it a "hoax" (he later denied doing so) and other hosts downplaying it. This coverage was consistent with the messaging of Trump at the time. Only in mid-March did the network change the tone of its coverage, after President Trump declared a national emergency. At the same time that Fox News commentators downplayed the threat of the virus in public, Fox's management and the Murdoch family took a broad range of internal measures to protect themselves and their employees against it. Sean Hannity and Laura Ingraham, two of Fox News's primetime hosts, promoted use of the drug hydroxychloroquine for the treatment of COVID-19, an off-label usage which at the time was supported only by anecdotal evidence, after it was touted by Trump as a possible cure. Fox News promoted a conspiracy theory that coronavirus death toll numbers were inflated with people who would have died anyway from preexisting conditions. This was disputed by White House coronavirus task force members Anthony Fauci and Deborah Birx, with Fauci describing conspiracy theories as "nothing but distractions" during public health crises. Later in the pandemic, Hannity, Ingraham and Carlson promoted the use of livestock dewormer ivermectin as a possible COVID-19 treatment. Studies have linked trust in Fox News, as well as viewership of Fox News, with fewer preventive behaviors and more risky behaviors related to COVID-19.
Once a COVID-19 vaccine became widely available, Fox News consistently questioned the efficacy and safety of the vaccine, celebrated evidence-free skepticism, and blasted attempts to promote vaccinations. More than 90% of Fox Corporation's full-time employees had been fully vaccinated by September 2021. After Trump's defeat in the 2020 presidential election, Fox News host Jeanine Pirro promoted baseless allegations on her program that voting machine company Smartmatic and its competitor Dominion Voting Systems had conspired to rig the election against Trump. Hosts Lou Dobbs and Maria Bartiromo also promoted the allegations on their programs on sister network Fox Business. In December 2020, Smartmatic sent a letter to Fox News demanding retractions and threatening legal action, specifying that retractions "must be published on multiple occasions" so as to "match the attention and audience targeted with the original defamatory publications." Days later, each of the three programs aired the same three-minute video segment consisting of an interview with an election technology expert who refuted the allegations promoted by the hosts, responding to questions from an unseen and unidentified man. None of the three hosts personally issued retractions. Smartmatic filed a $2.7 billion defamation suit against the network, the three hosts, attorney Sidney Powell, and Trump attorney Rudy Giuliani in February 2021. In an April 2021 court brief seeking dismissal of the suit, Fox attorney Paul Clement argued that the network was simply "reporting allegations made by a sitting President and his lawyers." A New York State Supreme Court judge ruled in March 2022 that the suit could proceed, though he dismissed the allegations against Powell and Pirro, and some claims against Giuliani. The judge allowed the allegations against Bartiromo and Dobbs to stand. The New York Supreme Court, Appellate Division unanimously rejected a Fox News bid to dismiss the Smartmatic suit in February 2023.
The court reinstated defamation allegations against Giuliani and Pirro. In December 2020, Dominion Voting Systems sent a similar letter demanding retractions to Trump attorney Sidney Powell, who had promoted the allegations on Fox programs. On March 26, 2021, Dominion filed a $1.6 billion defamation lawsuit against Fox News, alleging that Fox and some of its pundits spread conspiracy theories about Dominion, and allowed guests to make false statements about the company. On May 18, 2021, Fox News filed a motion to dismiss the Dominion Voting Systems lawsuit, asserting a First Amendment right "to inform the public about newsworthy allegations of paramount public concern." The motion to dismiss was denied on December 16, 2021, by a Delaware Superior Court judge. In addition to Bartiromo, Dobbs, and Pirro, the suit also names primetime hosts Tucker Carlson and Sean Hannity. Venezuelan businessman Majed Khalil sued Fox, Dobbs and Powell for $250 million in December 2021, alleging they had falsely implicated him in rigging Dominion and Smartmatic machines. Dobbs and Fox News reached a confidential settlement with Khalil in April 2023. Fox News was the only major network or cable news outlet to not carry the first televised prime time hearing of the January 6 committee live; its regular programming of Tucker Carlson Tonight and Hannity was aired without commercial breaks. During the weeks following the election, Carlson and Hannity often amplified Trump's election falsehoods on their programs; previously disclosed text messages between Hannity and White House press secretary Kayleigh McEnany were presented during the hearing. Hannity told his audience, "Unlike this committee and their cheerleaders in the media mob, we will actually be telling you the truth," while Carlson said, "This is the only hour on an American news channel that won't be covering their propaganda live. They are lying and we are not going to help them do it." 
In June 2022, a Delaware Superior Court judge again declined to dismiss the Dominion suit against Fox News, and also allowed Dominion to sue the network's corporate parent, Fox Corporation. The judge ruled that Rupert and Lachlan Murdoch may have acted with actual malice because there was a reasonable inference they "either knew Dominion had not manipulated the election or at least recklessly disregarded the truth when they allegedly caused Fox News to propagate its claims about Dominion." He noted a report that Rupert Murdoch spoke with Trump a few days after the election and informed him that he had lost. The New York Times reported in December 2022 that Dominion had acquired communications between Fox News executives and hosts, and between a Fox Corporation employee and the Trump White House, showing they knew that what the network was reporting was untrue. Dominion attorneys said hosts Sean Hannity and Tucker Carlson, and Fox executives, attested to this in sworn depositions. In November 2020, Hannity hosted Sidney Powell, who asserted Dominion machines had been rigged; Hannity later said in his deposition, "I did not believe it for one second." A February 2023 Dominion court filing showed Fox News primetime hosts messaging each other to insult and mock Trump advisers, indicating the hosts knew the allegations made by Powell and Giuliani were false. Rupert Murdoch messaged that Trump's voter fraud claims were "really crazy stuff," telling Fox News CEO Suzanne Scott that it was "terrible stuff damaging everybody, I fear." As a January 2021 Georgia runoff election approached that would determine party control of the U.S. Senate, Murdoch told Scott, "Trump will concede eventually and we should concentrate on Georgia, helping any way we can." After the 2016 election, the network developed a cutting-edge system to call elections, which proved very successful during the 2018 midterm elections.
The network was the first to call the 2020 Arizona race for Biden, angering many viewers. Washington managing editor Bill Sammon supervised the network's Decision Desk that made the call. Bret Baier and Martha MacCallum, the network's main news anchors, suggested during a high-level conference call that relying solely on data to make the call was inadequate and that viewer reaction should also be considered; MacCallum said, "in a Trump environment, the game is just very, very different." Sammon stood by the 2020 call and was fired by the network after the January 2021 Georgia runoff. In 2023, Rupert Murdoch was deposed and testified that some Fox News commentators were endorsing election fraud claims they knew were false. In February 2023, Fox's internal communications were released, showing that its presenters and senior executives privately doubted Donald Trump's claims of a stolen election. Chairman Rupert Murdoch once described Trump's voter fraud claims as "really crazy stuff", and also said that Trump advisers Rudy Giuliani and Sidney Powell's television appearances were "terrible stuff damaging everybody". One November 2020 exchange showed Tucker Carlson accusing Powell of "lying ... I caught her. It's insane", with Laura Ingraham responding that "Sidney is a complete nut. No one will work with her. Ditto with Rudy". In another exchange that month, Carlson called for Fox journalist Jacqui Heinrich to be "fired" because she fact-checked Trump and said that there was no evidence of voter fraud from Dominion. Carlson said the fact-checking "needs to stop immediately, like tonight. It's measurably hurting the company. The stock price is down"; Heinrich deleted the fact-check the next morning. In March 2023, more of Fox's internal communications were released.
One November 2020 communication showed Fox CEO Suzanne Scott criticizing fact-checking, stating that she cannot "keep defending these reporters who don't understand our viewers and how to handle stories ... The audience feels like we crapped on" them, and that Fox was losing the audience's "trust and belief". Another December 2020 communication showed Scott responding to Fox presenter Eric Shawn's fact-checking of Donald Trump's false 2020 election claims by demanding that the fact-checking "has to stop now ... This is bad business ... The audience is furious." On March 31, 2023, Delaware Superior Court judge Eric Davis ruled in a summary judgment that it "is CRYSTAL clear that none of the statements relating to Dominion about the 2020 election are true" and ordered the case to go to trial. On April 18, 2023, Fox News reached a settlement with Dominion just before the trial started, concluding the lawsuit; Fox agreed to pay Dominion $787.5 million and further stated: "We acknowledge the Court's rulings finding certain claims about Dominion to be false". In April 2021, at least five Fox News and Fox Business personalities amplified a story published by the Daily Mail, a British tabloid, that incorrectly linked a university study to President Joe Biden's climate change agenda, falsely asserting that Americans would be compelled to dramatically reduce their meat consumption to mitigate greenhouse gas emissions caused by flatulence. Fox News aired a graphic detailing the supposed compulsory reductions, falsely indicating the information came from the Agriculture Department, which numerous Republican politicians and commentators tweeted. Fox News anchor John Roberts told viewers to "say goodbye to your burgers if you want to sign up to the Biden climate agenda." Days later, Roberts acknowledged on air that the story was false.
According to analysis by Media Matters, on May 12, 2021, Fox News reported on its website: "Biden resumes border wall construction after promising to halt it". Correspondent Bill Melugin then appeared on Special Report with Bret Baier to report "the U.S. Army Corps of Engineers is actually going to be restarting border wall construction down in the Rio Grande Valley" after "a lot of blowback and pressure from local residents and local politicians." After the Corps of Engineers tweeted a clarification, Melugin deleted a tweet about the story and tweeted an "update" clarifying that a levee wall was being constructed to mitigate damage to flood control systems caused by uncompleted wall construction, and the website story headline was changed to "Biden administration to resume border wall levee construction as crisis worsens." Later on Fox News Primetime, host Brian Kilmeade briefly noted the levee but commented to former Trump advisor Stephen Miller: "They're going to restart building the wall again, Stephen." Fox News host Sean Hannity later broadcast the original Melugin story without any mention of the levee. Media Matters reported in September 2024 that during the Biden presidency Fox News had promoted a false "crime crisis" narrative, particularly directed toward undocumented migrants, which reflected Donald Trump's political rhetoric. The Fox News narrative consisted of reported violent crime anecdotes rather than FBI crime rate statistics showing violent crime had declined significantly since 2020. One Fox host, Ainsley Earhardt, said that even if the FBI data were right, "we're all a little bit more scared than we used to be." Later that month, weeks before the 2024 presidential election, the FBI released crime data for 2023 showing that violent crime had declined 3% from 2022. The report was widely covered by mainstream news outlets that day, though the Fox News coverage was limited to a 28-second segment by evening anchor Bret Baier. 
He reported "critics say the report is not accurate because it does not include big cities," echoing a false assertion made by Elon Musk and other Trump supporters on social media.

Controversies

The network has been accused of permitting sexual harassment and racial discrimination by on-air hosts, executives, and employees, paying out millions of dollars in legal settlements. Prominent Fox News figures such as Roger Ailes, Bill O'Reilly and Eric Bolling were fired after many women accused them of sexual harassment. At least four lawsuits alleged Fox News co-president Bill Shine ignored, enabled or concealed Roger Ailes' alleged sexual harassment. Fox News CEO Rupert Murdoch dismissed the high-profile sexual misconduct allegations as "largely political" and speculated they were made "because we are conservative". Bill O'Reilly and Fox News reached six settlements, totaling $45 million, with women who accused O'Reilly of sexual harassment. In January 2017, shortly after O'Reilly settled a sexual harassment lawsuit for $32 million ("an extraordinarily large amount for such cases"), Fox News renewed his contract. Fox News's parent company, 21st Century Fox, said it was aware of the lawsuit. The contract between O'Reilly and Fox News stipulated that he could not be fired from the network unless sexual harassment allegations were proven in court. Fox News's extensive coverage of the Harvey Weinstein scandal in October 2017 was seen by some as hypocritical: the network dedicated at least 12 hours of coverage to the Weinstein scandal but only 20 minutes to Bill O'Reilly, who, like Weinstein, had been accused of sexual harassment by a multitude of women. A few weeks later, when several women accused Alabama Senate candidate Roy Moore of making sexual advances toward them when they were under the age of 18, including one who was 14 at the time, Hannity dismissed the sexual misconduct allegations and dedicated coverage on his television show to casting doubt on the accusers.
Other prime-time Fox News hosts Tucker Carlson and Laura Ingraham questioned The Washington Post's reporting or instead brought up sexual misconduct allegations against show business figures such as Harvey Weinstein and Louis C.K. Fox News figures Jeanine Pirro and Gregg Jarrett questioned both the validity of The Washington Post's reporting and the credibility of the women. In December 2017, a few days before the Alabama Senate election, Fox News, along with the conspiracy websites Breitbart News and The Gateway Pundit, ran an inaccurate headline claiming that one of Roy Moore's accusers had admitted to forging an inscription by Moore in her yearbook; Fox News later added a correction to the story. A number of Fox News hosts have welcomed Bill O'Reilly to their shows and paid tribute to Roger Ailes after his death. In May 2017, Hannity called Ailes "a second father" and said to Ailes's "enemies" that he was "preparing to kick your a** in the next life". Ailes had been fired from Fox News the year before, after women alleged that he had sexually harassed them. In September 2017, several months after O'Reilly was fired from Fox News in the wake of similar allegations, Hannity hosted O'Reilly on his show; some Fox News employees criticized the decision. According to CNN, during the interview Hannity found kinship with O'Reilly, appearing "to feel that he and O'Reilly have both become victims of liberals looking to silence them."

In September 2009, the Obama administration engaged in a verbal conflict with Fox News Channel. On September 20, President Barack Obama appeared on all major news programs except Fox News, a snub partially in response to remarks about him by commentators Glenn Beck and Sean Hannity and Fox coverage of Obama's health-care proposal. In late September 2009, Obama's senior advisor David Axelrod and Roger Ailes met in secret to attempt to smooth out tensions between the two camps.
Two weeks later, White House chief of staff Rahm Emanuel referred to FNC as "not a news network" and communications director Anita Dunn said "Fox News often operates as either the research arm or the communications arm of the Republican Party". Obama commented: "If media is operating basically as a talk radio format, then that's one thing, and if it's operating as a news outlet, then that's another." Emanuel said it was important "to not have the CNNs and the others in the world basically be led in following Fox". Within days, it was reported that Fox had been excluded from an interview with administration official Ken Feinberg, with bureau chiefs from the White House press pool (ABC, CBS, NBC, and CNN) coming to Fox's defense. A bureau chief said: "If any member had been excluded it would have been the same thing, it has nothing to do with Fox or the White House or the substance of the issues." Shortly after the story broke, the White House admitted to a low-level mistake, saying Fox had not made a specific request to interview Feinberg. Fox White House correspondent Major Garrett said he had not made a specific request, but had a "standing request from me as senior White House correspondent on Fox to interview any newsmaker at the Treasury at any given time news is being made". On November 8, 2009, the Los Angeles Times reported an unnamed Democratic consultant was warned by the White House not to appear on Fox News again. According to the article, Dunn claimed in an e-mail to have checked with colleagues who "deal with TV issues" who denied telling anyone to avoid Fox. Patrick Caddell, a Fox News contributor and former pollster for President Jimmy Carter, said he had spoken with other Democratic consultants who had received similar warnings from the White House. 
On October 2, 2013, Fox News host Anna Kooiman cited on the air a fake story from the National Report parody site, which claimed Obama had offered to keep the International Museum of Muslim Cultures open with cash from his own pocket. Fox News attracted controversy in April 2018 when it was revealed that primetime host Sean Hannity had defended Trump's then-personal attorney Michael Cohen on air without disclosing that Cohen was also his own lawyer. On April 9, 2018, federal agents from the U.S. Attorney's office served a search warrant on Cohen's office and residence. On the air, Hannity defended Cohen and criticized the federal action, calling it "highly questionable" and "an unprecedented abuse of power". On April 16, 2018, in a court hearing, Cohen's lawyers told the judge that Cohen had ten clients in 2017–2018 but did "traditional legal tasks" for only three, including Trump. The federal judge ordered the third client revealed, whom Cohen's lawyers named as Hannity. Hannity was not sanctioned by Fox News for this breach of journalistic ethics; Fox News released a statement that the channel was unaware of Hannity's relationship to Cohen and that it had "spoken to Sean and he continues to have our full support." Media ethics experts said that Hannity's failure to disclose was a major breach of journalistic ethics and that the network should have suspended or fired him for it. In mid-2021, Fox News agreed to pay a $1 million settlement to New York City after the city's Commission on Human Rights cited "a pattern of violating the NYC Human Rights Law". A Fox News spokesperson claimed that "FOX News Media has already been in full compliance across the board, but [settled] to continue enacting extensive preventive measures against all forms of discrimination and harassment."

International transmission

The Fox News Channel feed has international availability via multiple providers, while Fox Extra segments provide alternate programming.
Fox News is carried in more than 40 countries. In Australia, FNC is broadcast on the dominant pay television provider Foxtel. In Brazil, FNC launched on Sky Brasil on November 1, 2002, after being introduced at ABTA 2002; commercials on FNC are replaced with Fox Extra, and the channel is also available on Vivo TV. In Canada, Fox had initially planned to launch a joint venture with Canwest's Global Television Network, tentatively named Fox News Canada, which would have featured a mixture of U.S. and Canadian news programming; because of that planned venture, the CRTC denied a 2003 application requesting permission for Fox News Channel to be carried in Canada. However, in March 2004, a Fox executive said the venture had been shelved, and in November of that year the CRTC added Fox News to its whitelist of foreign channels that may be carried by television providers. In May 2023, the CRTC announced that it would open a public consultation regarding the channel's carriage in Canada, acting upon complaints by the LGBT advocacy group Egale Canada surrounding an episode of Tucker Carlson Tonight that contained content described as "malicious misinformation" regarding trans, non-binary, gender non-conforming, and two-spirit communities, including "the inflammatory and false claim that trans people are 'targeting' Christians." In India, the channel is available through the streaming service Disney+ Hotstar. In Indonesia, it is available on Channel 397 on pay-TV provider First Media. In Israel, FNC is broadcast on Channel 105 of the satellite provider Yes, as well as being carried on Cellcom TV and Partner TV; it is also broadcast on channel 200 on cable operator HOT. In Italy, FNC is broadcast on Sky Italia; Fox News was launched on Stream TV in 2001 and moved to Sky Italia in 2003. Although service to Japan ceased in summer 2003, the channel can still be seen on Americable (distributor for American bases), Mediatti (Kadena Air Base) and Pan Global TV Japan. In Mexico, the channel's international feed is carried by cable provider Izzi Telecom.
In the Netherlands, Fox News has been carried by cable providers UPC Nederland and CASEMA, and satellite provider Canaldigitaal; all have dropped the channel in recent years. Currently, only cable provider Caiway (available in a limited number of towns in the central part of the country) broadcasts the channel. The channel was also carried by IPTV provider KNIPPR (owned by T-Mobile). In New Zealand, FNC is broadcast on Channel 088 of pay satellite operator SKY Network Television's digital platform. It was formerly broadcast overnight on free-to-air UHF New Zealand TV channel Prime; this was discontinued in January 2010, reportedly due to an expiring broadcasting license. In Pakistan, Fox News Channel is available on PTCL Smart TV and a number of cable and IPTV operators. In the Philippines, Fox News Channel is available on Sky Cable, Cablelink and G Sat Channel 50. It was available on Cignal until January 1, 2021, when their contract expired, but the channel returned on June 16, 2022. In Portugal, Fox News was available on Meo; however, the channel is no longer available on that operator, and it is not carried by any other Portuguese TV operator. Between 2003 and 2006, in Sweden and the other Scandinavian countries, FNC was broadcast 16 hours a day on TV8 (with Fox News Extra segments replacing U.S. advertising); Fox News was dropped by TV8 in September 2006. In Singapore, FNC is broadcast on pay-TV operator StarHub TV, as well as on Singtel TV. In South Africa, FNC is broadcast on StarSat; the most popular pay television operator, DStv, does not offer FNC in its channel bouquet. In Spain, Fox News was available on Movistar Plus+. The channel had been part of the operator since its first incarnation as Canal Satélite Digital in the early 2000s, but was removed from the operator's satellite offer by March 2023 and ceased transmission to the remaining offers on July 9, 2024. The channel is not carried by other Spanish TV operators.
In the United Kingdom, FNC was carried by Sky. On August 29, 2017, Sky dropped Fox News; the broadcaster said its carriage was not "commercially viable" due to average viewership of fewer than 2,000 viewers per day. The company said the decision was unrelated to 21st Century Fox's proposed acquisition of the remainder of Sky plc (which ultimately led to a bidding war that resulted in its acquisition by Comcast instead). The potential co-ownership had prompted concerns from critics of the deal, who felt Sky News could similarly undergo a shift to an opinionated format with a right-wing viewpoint. However, such a move would violate Ofcom broadcast codes, which require all news programming to show due impartiality. The channel's broadcasts in the country have violated this rule on several occasions.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Internet#cite_ref-143] | [TOKENS: 9291] |
Internet

The Internet (or internet)[a] is the global system of interconnected computer networks that uses the Internet protocol suite (TCP/IP)[b] to communicate between networks and devices. It is a network of networks that comprises private, public, academic, business, and government networks of local to global scope, linked by electronic, wireless, and optical networking technologies. The Internet carries a vast range of information services and resources, such as the interlinked hypertext documents and applications of the World Wide Web (WWW), electronic mail, discussion groups, internet telephony, streaming media and file sharing. Most traditional communication media, including telephone, radio, television, paper mail, newspapers, and print publishing, have been transformed by the Internet, giving rise to new media such as email, online music, digital newspapers, news aggregators, and audio and video streaming websites. The Internet has enabled and accelerated new forms of personal interaction through instant messaging, Internet forums, and social networking services. Online shopping has also grown to occupy a significant market across industries, enabling firms to extend brick and mortar presences to serve larger markets. Business-to-business and financial services on the Internet affect supply chains across entire industries. The origins of the Internet date back to research that enabled the time-sharing of computer resources, the development of packet switching, and the design of computer networks for data communication. The set of communication protocols to enable internetworking on the Internet arose from research and development commissioned in the 1970s by the Defense Advanced Research Projects Agency (DARPA) of the United States Department of Defense in collaboration with universities and researchers across the United States and in the United Kingdom and France.
The Internet has no single centralized governance in either technological implementation or policies for access and usage. Each constituent network sets its own policies. The overarching definitions of the two principal name spaces on the Internet, the Internet Protocol address (IP address) space and the Domain Name System (DNS), are directed by a maintainer organization, the Internet Corporation for Assigned Names and Numbers (ICANN). The technical underpinning and standardization of the core protocols is an activity of the non-profit Internet Engineering Task Force (IETF).

Terminology

The word internetted was used as early as 1849, meaning interconnected or interwoven. The word Internet was used in 1945 by the United States War Department in a radio operator's manual, and in 1974 as the shorthand form of Internetwork. Today, the term Internet most commonly refers to the global system of interconnected computer networks, though it may also refer to any group of smaller networks. The word Internet may be capitalized as a proper noun, although this is becoming less common. This reflects the tendency in English to capitalize new terms and move them to lowercase as they become familiar. The word is sometimes still capitalized to distinguish the global internet from smaller networks, though many publications, including the AP Stylebook since 2016, recommend the lowercase form in every case. In 2016, the Oxford English Dictionary found that, based on a study of around 2.5 billion printed and online sources, "Internet" was capitalized in 54% of cases. The terms Internet and World Wide Web are often used interchangeably; it is common to speak of "going on the Internet" when using a web browser to view web pages. However, the World Wide Web, or the Web, is only one of a large number of Internet services. It is the global collection of web pages, documents and other web resources linked by hyperlinks and URLs.
History

In the 1960s, computer scientists began developing systems for time-sharing of computer resources. J. C. R. Licklider proposed the idea of a universal network while working at Bolt Beranek & Newman and, later, leading the Information Processing Techniques Office at the Advanced Research Projects Agency (ARPA) of the United States Department of Defense. Research into packet switching,[c] one of the fundamental Internet technologies, started in the work of Paul Baran at RAND in the early 1960s and, independently, Donald Davies at the United Kingdom's National Physical Laboratory in 1965. After the Symposium on Operating Systems Principles in 1967, packet switching from the proposed NPL network was incorporated into the design of the ARPANET, an experimental resource sharing network proposed by ARPA. ARPANET development began with two network nodes which were interconnected between the University of California, Los Angeles and the Stanford Research Institute on 29 October 1969. The third site was at the University of California, Santa Barbara, followed by the University of Utah. By the end of 1971, 15 sites were connected to the young ARPANET. Thereafter, the ARPANET gradually developed into a decentralized communications network, connecting remote centers and military bases in the United States. Other user networks and research networks, such as the Merit Network and CYCLADES, were developed in the late 1960s and early 1970s. Early international collaborations for the ARPANET were rare. Connections were made in 1973 to Norway (NORSAR and, later, NDRE) and to Peter Kirstein's research group at University College London, which provided a gateway to British academic networks, the first internetwork for resource sharing. ARPA projects, the International Network Working Group and commercial initiatives led to the development of various protocols and standards by which multiple separate networks could become a single network, or a network of networks.
In 1974, Vint Cerf at Stanford University and Bob Kahn at DARPA published a proposal for "A Protocol for Packet Network Intercommunication". Cerf and his graduate students used the term internet as a shorthand for internetwork in RFC 675. The Internet Experiment Notes and later RFCs repeated this use. The work of Louis Pouzin and Robert Metcalfe had important influences on the resulting TCP/IP design. National PTTs and commercial providers developed the X.25 standard and deployed it on public data networks. The ARPANET initially served as a backbone for the interconnection of regional academic and military networks in the United States to enable resource sharing. Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In 1982, the Internet Protocol Suite (TCP/IP) was standardized, which facilitated worldwide proliferation of interconnected networks. TCP/IP network access expanded again in 1986 when the National Science Foundation Network (NSFNet) provided access to supercomputer sites in the United States for researchers, first at speeds of 56 kbit/s and later at 1.5 Mbit/s and 45 Mbit/s. The NSFNet expanded into academic and research organizations in Europe, Australia, New Zealand and Japan in 1988–89. Although other network protocols such as UUCP and PTT public data networks had global reach well before this time, this marked the beginning of the Internet as an intercontinental network. Commercial Internet service providers emerged in 1989 in the United States and Australia. The ARPANET was decommissioned in 1990. The linking of commercial networks and enterprises by the early 1990s, as well as the advent of the World Wide Web, marked the beginning of the transition to the modern Internet. 
Steady advances in semiconductor technology and optical networking created new economic opportunities for commercial involvement in the expansion of the network in its core and for delivering services to the public. In mid-1989, MCI Mail and Compuserve established connections to the Internet, delivering email and public access products to the half million users of the Internet. Just months later, on 1 January 1990, PSInet launched an alternate Internet backbone for commercial use, one of the networks that added to the core of the commercial Internet of later years. In March 1990, the first high-speed T1 (1.5 Mbit/s) link between the NSFNET and Europe was installed between Cornell University and CERN, allowing much more robust communications than satellites could provide. Later in 1990, Tim Berners-Lee began writing WorldWideWeb, the first web browser, after two years of lobbying CERN management. By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the HyperText Transfer Protocol (HTTP) 0.9, the HyperText Markup Language (HTML), the first Web browser (which was also an HTML editor and could access Usenet newsgroups and FTP files), the first HTTP server software (later known as CERN httpd), the first web server, and the first Web pages that described the project itself. In 1991 the Commercial Internet eXchange was founded, allowing PSInet to communicate with the other commercial networks CERFnet and Alternet. Stanford Federal Credit Union was the first financial institution to offer online Internet banking services to all of its members in October 1994. In 1996, OP Financial Group, also a cooperative bank, became the second online bank in the world and the first in Europe. By 1995, the Internet was fully commercialized in the U.S. when the NSFNet was decommissioned, removing the last restrictions on use of the Internet to carry commercial traffic.
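HTTP 0.9 was strikingly simple: a request was a single "GET <path>" line, and the response was the raw document itself, with no status line or headers, the connection closing to mark the end. The sketch below is a hypothetical illustration of such an exchange over a loopback TCP connection (it is not CERN httpd, and the page content is invented):

```python
import socket
import threading

# Hypothetical document served by our toy HTTP/0.9-style server.
PAGE = b"<html><body>Hello, early Web</body></html>"

def serve_once(server_sock):
    """Accept one connection, answer one HTTP/0.9-style GET, then close."""
    conn, _ = server_sock.accept()
    with conn:
        request = conn.recv(1024).decode("ascii")  # e.g. "GET /\r\n"
        if request.startswith("GET "):
            conn.sendall(PAGE)  # body only; closing the socket ends the response

def fetch(host, port, path="/"):
    """Send a bare 'GET <path>' line and read until the server closes."""
    with socket.create_connection((host, port)) as s:
        s.sendall(f"GET {path}\r\n".encode("ascii"))
        chunks = []
        while True:
            data = s.recv(1024)
            if not data:  # connection closed: response complete
                break
            chunks.append(data)
    return b"".join(chunks)

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))  # loopback; OS picks a free port
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=serve_once, args=(server,))
t.start()
body = fetch("127.0.0.1", port)
t.join()
server.close()
print(body.decode("ascii"))
```

Note how the "end of response" is signaled only by the TCP connection closing; status codes, content lengths, and headers all arrived with later HTTP versions.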
As technology advanced and commercial opportunities fueled reciprocal growth, the volume of Internet traffic began growing in a pattern similar to the scaling of MOS transistors exemplified by Moore's law: doubling every 18 months. This growth, formalized as Edholm's law, was catalyzed by advances in MOS technology, laser light wave systems, and noise performance. Since 1995, the Internet has tremendously impacted culture and commerce, including the rise of near-instant communication by email, instant messaging, telephony (Voice over Internet Protocol or VoIP), two-way interactive video calls, and the World Wide Web. Increasing amounts of data are transmitted at higher and higher speeds over fiber optic networks operating at 1 Gbit/s, 10 Gbit/s, or more. The Internet continues to grow, driven by ever-greater amounts of online information and knowledge, commerce, entertainment and social networking services. During the late 1990s, it was estimated that traffic on the public Internet grew by 100 percent per year, while the mean annual growth in the number of Internet users was thought to be between 20% and 50%. This growth is often attributed to the lack of central administration, which allows organic growth of the network, as well as the non-proprietary nature of the Internet protocols, which encourages vendor interoperability and prevents any one company from exerting too much control over the network. In November 2006, the Internet was included on USA Today's list of the New Seven Wonders. As of 31 March 2011[update], the estimated total number of Internet users was 2.095 billion (30% of world population). It is estimated that in 1993 the Internet carried only 1% of the information flowing through two-way telecommunication. By 2000 this figure had grown to 51%, and by 2007 more than 97% of all telecommunicated information was carried over the Internet.
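The growth rates quoted above are easy to make concrete: a quantity that doubles every 18 months grows by a factor of 2^(m/18) after m months. The short sketch below (my own arithmetic, not figures from the article) compares that with the late-1990s "100 percent per year" traffic growth, which is simply doubling every 12 months:

```python
def growth_factor(months, doubling_period_months=18):
    """Multiplicative growth after `months`, given a fixed doubling period."""
    return 2 ** (months / doubling_period_months)

# Doubling every 18 months: 16x over six years (72 months).
print(growth_factor(72))       # 2**4
# "100 percent per year" = doubling every 12 months: 64x over six years.
print(growth_factor(72, 12))   # 2**6
```

The gap between 16x and 64x over the same six years shows why small differences in doubling period compound dramatically.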
Modern smartphones can access the Internet through cellular carrier networks, and internet usage by mobile and tablet devices exceeded desktop worldwide for the first time in October 2016. As of 2018[update], 80% of the world's population were covered by a 4G network. The International Telecommunication Union (ITU) estimated that, by the end of 2017, 48% of individual users regularly connect to the Internet, up from 34% in 2012. Mobile Internet connectivity has played an important role in expanding access in recent years, especially in Asia and the Pacific and in Africa. The number of unique mobile cellular subscriptions increased from 3.9 billion in 2012 to 4.8 billion in 2016, two-thirds of the world's population, with more than half of subscriptions located in Asia and the Pacific. The limits that users face on accessing information via mobile applications coincide with a broader process of fragmentation of the Internet. Fragmentation restricts access to media content and tends to affect the poorest users the most. One solution, zero-rating, is the practice of Internet service providers allowing users free connectivity to access specific content or applications without cost.

Social impact

The Internet has enabled new forms of social interaction, activities, and social associations, giving rise to the scholarly study of the sociology of the Internet. Between 2000 and 2009, the number of Internet users globally rose from 390 million to 1.9 billion. By 2010, 22% of the world's population had access to computers with 1 billion Google searches every day, 300 million Internet users reading blogs, and 2 billion videos viewed daily on YouTube. In 2014 the world's Internet users surpassed 3 billion or 44 percent of world population, but two-thirds came from the richest countries, with 78 percent of Europeans using the Internet, followed by 57 percent of the Americas.
However, by 2018, Asia alone accounted for 51% of all Internet users, with 2.2 billion out of the 4.3 billion Internet users in the world. China's Internet users surpassed a major milestone in 2018, when the country's Internet regulatory authority, China Internet Network Information Centre, announced that China had 802 million users. China was followed by India, with some 700 million users, with the United States third with 275 million users. However, in terms of penetration, in 2022, China had a 70% penetration rate compared to India's 60% and the United States's 90%. In 2022, 54% of the world's Internet users were based in Asia, 14% in Europe, 7% in North America, 10% in Latin America and the Caribbean, 11% in Africa, 4% in the Middle East and 1% in Oceania. In 2019, Kuwait, Qatar, the Falkland Islands, Bermuda and Iceland had the highest Internet penetration by the number of users, with 93% or more of the population with access. As of 2022, it was estimated that 5.4 billion people use the Internet, more than two-thirds of the world's population. Early computer systems were limited to the characters in the American Standard Code for Information Interchange (ASCII), a subset of the Latin alphabet. After English (27%), the most requested languages on the World Wide Web are Chinese (25%), Spanish (8%), Japanese (5%), Portuguese and German (4% each), Arabic, French and Russian (3% each), and Korean (2%). Modern character encoding standards, such as Unicode, allow for development and communication in the world's widely used languages. However, some glitches such as mojibake (incorrect display of some languages' characters) still remain. 
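Mojibake arises when bytes produced under one character encoding are decoded under another. A minimal Python illustration (the sample word is my own, chosen to include a non-ASCII character):

```python
# "é" is code point U+00E9; UTF-8 encodes it as the two bytes 0xC3 0xA9.
original = "café"
utf8_bytes = original.encode("utf-8")      # b'caf\xc3\xa9'

# Decoding those bytes with a mismatched legacy encoding (ISO-8859-1 /
# Latin-1) reads each byte as its own character, producing mojibake.
garbled = utf8_bytes.decode("latin-1")
print(garbled)                             # cafÃ©

# Decoding with the correct encoding recovers the original text.
print(utf8_bytes.decode("utf-8"))          # café
```

Because Latin-1 maps every byte to some character, the wrong decode never raises an error; it silently yields garbage, which is why mojibake persists in the wild.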
Several neologisms exist that refer to Internet users: Netizen (as in "citizen of the net") refers to those actively involved in improving online communities, the Internet in general, or surrounding political affairs and rights such as free speech; Internaut refers to operators or technically highly capable users of the Internet; digital citizen refers to a person using the Internet in order to engage in society, politics, and government participation. The Internet allows greater flexibility in working hours and location, especially with the spread of unmetered high-speed connections. The Internet can be accessed almost anywhere by numerous means, including through mobile Internet devices. Mobile phones, datacards, handheld game consoles and cellular routers allow users to connect to the Internet wirelessly.[citation needed] Educational material at all levels from pre-school (e.g. CBeebies) to post-doctoral (e.g. scholarly literature through Google Scholar) is available on websites. The internet has facilitated the development of virtual universities and distance education, enabling both formal and informal education. The Internet allows researchers to conduct research remotely via virtual laboratories, with profound changes in reach and generalizability of findings as well as in communication between scientists and in the publication of results. By the late 2010s the Internet had been described as "the main source of scientific information" for the majority of the global North population. Wikis have also been used in the academic community for sharing and dissemination of information across institutional and international boundaries. In those settings, they have been found useful for collaboration on grant writing, strategic planning, departmental documentation, and committee work. The United States Patent and Trademark Office uses a wiki to allow the public to collaborate on finding prior art relevant to examination of pending patent applications.
Queens, New York has used a wiki to allow citizens to collaborate on the design and planning of a local park. The English Wikipedia has the largest user base among wikis on the World Wide Web and ranks in the top 10 among all sites in terms of traffic. The Internet has been a major outlet for leisure activity since its inception, with entertaining social experiments such as MUDs and MOOs being conducted on university servers, and humor-related Usenet groups receiving much traffic. Many Internet forums have sections devoted to games and funny videos. Another area of leisure activity on the Internet is multiplayer gaming. This form of recreation creates communities, where people of all ages and origins enjoy the fast-paced world of multiplayer games. These range from MMORPG to first-person shooters, from role-playing video games to online gambling. While online gaming has been around since the 1970s, modern modes of online gaming began with subscription services such as GameSpy and MPlayer. Streaming media is the real-time delivery of digital media for immediate consumption or enjoyment by end users. Streaming companies (such as Netflix, Disney+, Amazon's Prime Video, Mubi, Hulu, and Apple TV+) now dominate the entertainment industry, eclipsing traditional broadcasters. Audio streamers such as Spotify and Apple Music also have significant market share in the audio entertainment market. Video sharing websites are also a major factor in the entertainment ecosystem. YouTube was founded on 15 February 2005 and is now the leading website for free streaming video with more than two billion users. It uses a web player to stream and show video files. YouTube users watch hundreds of millions, and upload hundreds of thousands, of videos daily. 
Other video sharing websites include Vimeo, Instagram and TikTok.[citation needed] Although many governments have attempted to restrict both Internet pornography and online gambling, this has generally failed to stop their widespread popularity. A number of advertising-funded ostensible video sharing websites known as "tube sites" have been created to host shared pornographic video content. Due to laws requiring the documentation of the origin of pornography, these websites now largely operate in conjunction with pornographic movie studios and their own independent creator networks, acting as de facto video streaming services. Major players in this field include the market leader Aylo, the operator of PornHub and numerous other branded sites, as well as other independent operators such as xHamster and Xvideos. As of 2023[update], Internet traffic to pornographic video sites rivalled that of mainstream video streaming and sharing services. Remote work is facilitated by tools such as groupware, virtual private networks, conference calling, videotelephony, and VoIP so that work may be performed from any location, such as the worker's home.[citation needed] The spread of low-cost Internet access in developing countries has opened up new possibilities for peer-to-peer charities, which allow individuals to contribute small amounts to charitable projects for other individuals. Websites, such as DonorsChoose and GlobalGiving, allow small-scale donors to direct funds to individual projects of their choice. A popular twist on Internet-based philanthropy is the use of peer-to-peer lending for charitable purposes. Kiva pioneered this concept in 2005, offering the first web-based service to publish individual loan profiles for funding. The low cost and nearly instantaneous sharing of ideas, knowledge, and skills have made collaborative work dramatically easier, with the help of collaborative software, which allows groups to easily form, cheaply communicate, and share ideas.
A prominent example of such collaboration is the free software movement, which has produced, among other things, Linux, Mozilla Firefox, and OpenOffice.org (later forked into LibreOffice). Content management systems allow collaborating teams to work on shared sets of documents simultaneously without accidentally destroying each other's work. The Internet also enables cloud computing, virtual private networks, remote desktops, and remote work. The online disinhibition effect describes the tendency of many individuals to behave more stridently or offensively online than they would in person. A significant number of feminist women have been the target of various forms of harassment, ranging from insults and hate speech to, in extreme cases, rape and death threats, in response to posts they have made on social media. Social media companies have been criticized in the past for not doing enough to aid victims of online abuse. Children also face dangers online, such as cyberbullying and approaches by sexual predators, who sometimes pose as children themselves. Out of naivety, children may also post personal information about themselves online, which could put them or their families at risk unless they are warned not to do so. Many parents choose to enable Internet filtering or supervise their children's online activities in an attempt to protect their children from pornography or violent content on the Internet. The most popular social networking services commonly forbid users under the age of 13. However, these policies can be circumvented by registering an account with a false birth date, and a significant number of children aged under 13 join such sites. Social networking services for younger children, which claim to provide better levels of protection for children, also exist. Internet usage has been correlated with users' loneliness. 
Lonely people tend to use the Internet as an outlet for their feelings and to share their stories with others, such as in the "I am lonely will anyone speak to me" thread. Cyberslacking can become a drain on corporate resources; employees spend a significant amount of time surfing the Web while at work. Internet addiction disorder is excessive computer use that interferes with daily life. Nicholas G. Carr believes that Internet use has other effects on individuals, for instance improving skills of scan-reading and interfering with the deep thinking that leads to true creativity. Electronic business encompasses business processes spanning the entire value chain: purchasing, supply chain management, marketing, sales, customer service, and business relationships. E-commerce seeks to add revenue streams using the Internet to build and enhance relationships with clients and partners. According to International Data Corporation, the size of worldwide e-commerce, when global business-to-business and business-to-consumer transactions are combined, equated to $16 trillion in 2013. A report by Oxford Economics added those two together to estimate the total size of the digital economy at $20.4 trillion, equivalent to roughly 13.8% of global sales. While much has been written of the economic advantages of Internet-enabled commerce, there is also evidence that some aspects of the Internet, such as maps and location-aware services, may serve to reinforce economic inequality and the digital divide. Electronic commerce may be responsible for consolidation and the decline of mom-and-pop, brick-and-mortar businesses, resulting in increases in income inequality. A 2013 Institute for Local Self-Reliance report states that brick-and-mortar retailers employ 47 people for every $10 million in sales, while Amazon employs only 14. Similarly, the 700-employee room rental start-up Airbnb was valued at $10 billion in 2014, about half as much as Hilton Worldwide, which employs 152,000 people. 
At that time, Uber employed 1,000 full-time employees and was valued at $18.2 billion, about the same valuation as Avis Rent a Car and The Hertz Corporation combined, which together employed almost 60,000 people. Advertising on popular web pages can be lucrative, and e-commerce, the sale of products and services directly via the Web, continues to grow. Online advertising is a form of marketing and advertising which uses the Internet to deliver promotional marketing messages to consumers. It includes email marketing, search engine marketing (SEM), social media marketing, many types of display advertising (including web banner advertising), and mobile advertising. In 2011, Internet advertising revenues in the United States surpassed those of cable television and nearly exceeded those of broadcast television. Many common online advertising practices are controversial and increasingly subject to regulation. The Internet has achieved new relevance as a political tool. The presidential campaign of Howard Dean in 2004 in the United States was notable for its success in soliciting donations via the Internet. Many political groups use the Internet to achieve a new method of organizing for carrying out their mission, giving rise to Internet activism. Social media websites such as Facebook and Twitter helped people organize during the Arab Spring by enabling activists to organize protests, communicate grievances, and disseminate information. Many have understood the Internet as an extension of the Habermasian notion of the public sphere, observing how network communication technologies provide something like a global civic forum. However, incidents of politically motivated Internet censorship have now been recorded in many countries, including western democracies. E-government is the use of technological communications devices, such as the Internet, to provide public services to citizens and other persons in a country or region. 
E-government offers opportunities for more direct and convenient citizen access to government and for government provision of services directly to citizens. Cybersectarianism is a new organizational form that involves highly dispersed small groups of practitioners who may remain largely anonymous within the larger social context and operate in relative secrecy, while still linked remotely to a larger network of believers who share a set of practices and texts, and often a common devotion to a particular leader. Overseas supporters provide funding and support; domestic practitioners distribute tracts, participate in acts of resistance, and share information on the internal situation with outsiders. Collectively, members and practitioners of such sects construct viable virtual communities of faith, exchanging personal testimonies and engaging in collective study via email, online chat rooms, and web-based message boards. In particular, the British government has raised concerns about the prospect of young British Muslims being indoctrinated into Islamic extremism by material on the Internet, being persuaded to join terrorist groups such as the so-called "Islamic State", and then potentially committing acts of terrorism on returning to Britain after fighting in Syria or Iraq.

Applications and services

The Internet carries many applications and services, most prominently the World Wide Web, including social media, electronic mail, mobile applications, multiplayer online games, Internet telephony, file sharing, and streaming media services. The World Wide Web is a global collection of documents, images, multimedia, applications, and other resources, logically interrelated by hyperlinks and referenced with Uniform Resource Identifiers (URIs), which provide a global system of named references. URIs symbolically identify services, web servers, databases, and the documents and resources that they can provide. 
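The component structure of such a URI can be inspected with Python's standard urllib.parse module; a minimal sketch (the example URL and its query/fragment are illustrative):

```python
from urllib.parse import urlsplit

# Split an illustrative URI into the components defined by the URI syntax.
parts = urlsplit("https://en.wikipedia.org/wiki/Internet?action=view#History")

print(parts.scheme)    # "https" - the access protocol
print(parts.netloc)    # "en.wikipedia.org" - the web server being named
print(parts.path)      # "/wiki/Internet" - the resource on that server
print(parts.query)     # "action=view" - parameters passed to the resource
print(parts.fragment)  # "History" - a location within the resource
```

Each named part corresponds to one element of the "global system of named references" described above: the scheme selects the service, the network location names the server, and the path identifies the document or resource it can provide.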
HyperText Transfer Protocol (HTTP) is the main access protocol of the World Wide Web. Web services also use HTTP for communication between software systems, for information transfer and for sharing and exchanging business data; it is one of many protocols that can be used for communication on the Internet. World Wide Web browser software, such as Microsoft Edge, Mozilla Firefox, Opera, Apple's Safari, and Google Chrome, enables users to navigate from one web page to another via the hyperlinks embedded in the documents. These documents may also contain computer data, including graphics, sounds, text, video, multimedia and interactive content. Client-side scripts can include animations, games, office applications and scientific demonstrations. Email is an important communications service available via the Internet. The concept of sending electronic text messages between parties, analogous to mailing letters or memos, predates the creation of the Internet. Internet telephony is a common communications service realized with the Internet. The name of the principal internetworking protocol, the Internet Protocol, lends its name to voice over Internet Protocol (VoIP). VoIP systems now dominate many markets, being as easy and convenient as a traditional telephone while offering substantial cost savings, especially over long distances. File sharing is the practice of transferring large amounts of data in the form of computer files across the Internet, for example via file servers. The load of bulk downloads to many users can be eased by the use of "mirror" servers or peer-to-peer networks. Access to the file may be controlled by user authentication, the transit of the file over the Internet may be obscured by encryption, and money may change hands for access to the file. The price can be paid by the remote charging of funds from, for example, a credit card whose details are also passed—usually fully encrypted—across the Internet. 
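Integrity checking in such transfers is commonly done by comparing cryptographic digests: the publisher advertises a hash of the file, and the downloader recomputes it locally. A minimal sketch with Python's standard hashlib (the payload and the `verify` helper are illustrative, not part of any real file-sharing protocol):

```python
import hashlib

# A downloaded file's bytes (illustrative payload).
payload = b"example file contents\n"

# The publisher advertises the SHA-256 digest alongside the download link.
advertised = hashlib.sha256(payload).hexdigest()

def verify(data: bytes, expected_hex: str) -> bool:
    """Recompute the digest locally and compare with the advertised one."""
    return hashlib.sha256(data).hexdigest() == expected_hex

print(verify(payload, advertised))                # True: file arrived intact
print(verify(payload + b"tampered", advertised))  # False: any change is detected
```

A bare hash only detects accidental or in-transit corruption; a digital signature additionally binds the digest to the publisher's key, which requires an asymmetric-cryptography library rather than hashlib alone.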
The origin and authenticity of the file received may be checked by a digital signature.

Governance

The Internet is a global network that comprises many voluntarily interconnected autonomous networks. It operates without a central governing body. The technical underpinning and standardization of the core protocols (IPv4 and IPv6) is an activity of the Internet Engineering Task Force (IETF), a non-profit organization of loosely affiliated international participants that anyone may associate with by contributing technical expertise. While the hardware components in the Internet infrastructure can often be used to support other software systems, it is the design and the standardization process of the software that characterizes the Internet and provides the foundation for its scalability and success. The responsibility for the architectural design of the Internet software systems has been assumed by the IETF. The IETF conducts standard-setting work groups, open to any individual, about the various aspects of Internet architecture. The resulting contributions and standards are published as Request for Comments (RFC) documents on the IETF web site. The principal methods of networking that enable the Internet are contained in specially designated RFCs that constitute the Internet Standards. Other less rigorous documents are simply informative, experimental, or historical, or document the best current practices when implementing Internet technologies. To maintain interoperability, the principal name spaces of the Internet are administered by the Internet Corporation for Assigned Names and Numbers (ICANN). ICANN is governed by an international board of directors drawn from across the Internet technical, business, academic, and other non-commercial communities. The organization coordinates the assignment of unique identifiers for use on the Internet, including domain names, IP addresses, application port numbers in the transport protocols, and many other parameters. 
Globally unified name spaces are essential for maintaining the global reach of the Internet. This role of ICANN distinguishes it as perhaps the only central coordinating body for the global Internet. The National Telecommunications and Information Administration, an agency of the United States Department of Commerce, had final approval over changes to the DNS root zone until the IANA stewardship transition on 1 October 2016. Regional Internet registries (RIRs) were established for five regions of the world to assign IP address blocks and other Internet parameters to local registries, such as Internet service providers, from a designated pool of addresses set aside for each region. The Internet Society (ISOC) was founded in 1992 with a mission to "assure the open development, evolution and use of the Internet for the benefit of all people throughout the world". Its members include individuals as well as corporations, organizations, governments, and universities. Among other activities, ISOC provides an administrative home for a number of less formally organized groups that are involved in developing and managing the Internet, including the Internet Engineering Task Force (IETF), Internet Architecture Board (IAB), Internet Engineering Steering Group (IESG), Internet Research Task Force (IRTF), and Internet Research Steering Group (IRSG). On 16 November 2005, the United Nations-sponsored World Summit on the Information Society in Tunis established the Internet Governance Forum (IGF) to discuss Internet-related issues.

Infrastructure

The communications infrastructure of the Internet consists of its hardware components and a system of software layers that control various aspects of the architecture. As with any computer network, the Internet physically consists of routers, media (such as cabling and radio links), repeaters, and modems. 
However, as an example of internetworking, many of the network nodes are not necessarily Internet equipment per se. Internet packets are carried by other full-fledged networking protocols, with the Internet acting as a homogeneous networking standard, running across heterogeneous hardware, with the packets guided to their destinations by IP routers. Internet service providers (ISPs) establish worldwide connectivity between individual networks at various levels of scope. At the top of the routing hierarchy are the tier 1 networks, large telecommunication companies that exchange traffic directly with each other via very high-speed fiber-optic cables, governed by peering agreements. Tier 2 and lower-level networks buy Internet transit from other providers to reach at least some parties on the global Internet, though they may also engage in peering. End-users who only access the Internet when needed to perform a function or obtain information represent the bottom of the routing hierarchy. An ISP may use a single upstream provider for connectivity, or implement multihoming to achieve redundancy and load balancing. Internet exchange points are major traffic exchanges with physical connections to multiple ISPs. Large organizations, such as academic institutions, large enterprises, and governments, may perform the same function as ISPs, engaging in peering and purchasing transit on behalf of their internal networks. Research networks tend to interconnect with large subnetworks such as GEANT, GLORIAD, Internet2, and the UK's national research and education network, JANET. Common methods of Internet access by users include broadband over coaxial cable, fiber optics or copper wires, Wi-Fi, satellite, and cellular telephone technology. Grassroots efforts have led to wireless community networks. 
Commercial Wi-Fi services that cover large areas are available in many cities, such as New York, London, Vienna, Toronto, San Francisco, Philadelphia, Chicago and Pittsburgh. Most servers that provide Internet services are today hosted in data centers, and content is often accessed through high-performance content delivery networks. Colocation centers often host private peering connections between their customers, Internet transit providers, cloud providers, meet-me rooms for connecting customers together, Internet exchange points, and landing points and terminal equipment for the fiber-optic submarine communication cables that connect the Internet.

Internet Protocol Suite

The Internet standards describe a framework known as the Internet protocol suite (also called TCP/IP, after its two foundational protocols). This is a suite of protocols that are ordered into a set of four conceptual layers by the scope of their operation, originally documented in RFC 1122 and RFC 1123. The most prominent component of the Internet model is the Internet Protocol. IP enables internetworking, essentially establishing the Internet itself. Two versions of the Internet Protocol exist, IPv4 and IPv6. Aside from the complex array of physical connections that make up its infrastructure, the Internet is facilitated by bi- or multi-lateral commercial contracts (e.g., peering agreements) and by technical specifications or protocols that describe the exchange of data over the network. For locating individual computers on the network, the Internet provides IP addresses. IP addresses are used by the Internet infrastructure to direct Internet packets to their destinations. They consist of fixed-length numbers, which are found within the packet. IP addresses are generally assigned to equipment either automatically via the Dynamic Host Configuration Protocol (DHCP) or configured manually. The Domain Name System (DNS) converts user-entered domain names (e.g. 
"en.wikipedia.org") into IP addresses.[citation needed] Internet Protocol version 4 (IPv4) defines an IP address as a 32-bit number. IPv4 is the initial version used on the first generation of the Internet and is still in dominant use. It was designed in 1981 to address up to ≈4.3 billion (109) hosts. However, the explosive growth of the Internet has led to IPv4 address exhaustion, which entered its final stage in 2011, when the global IPv4 address allocation pool was exhausted. Because of the growth of the Internet and the depletion of available IPv4 addresses, a new version of IP IPv6, was developed in the mid-1990s, which provides vastly larger addressing capabilities and more efficient routing of Internet traffic. IPv6 uses 128 bits for the IP address and was standardized in 1998. IPv6 deployment has been ongoing since the mid-2000s and is currently in growing deployment around the world, since Internet address registries began to urge all resource managers to plan rapid adoption and conversion. By design, IPv6 is not directly interoperable with IPv4. Instead, it establishes a parallel version of the Internet not directly accessible with IPv4 software. Thus, translation facilities exist for internetworking, and some nodes have duplicate networking software for both networks. Essentially all modern computer operating systems support both versions of the Internet Protocol.[citation needed] Network infrastructure, however, has been lagging in this development.[citation needed] A subnet or subnetwork is a logical subdivision of an IP network.: 1, 16 Computers that belong to a subnet are addressed with an identical most-significant bit-group in their IP addresses. This results in the logical division of an IP address into two fields, the network number or routing prefix and the rest field or host identifier. 
The rest field is an identifier for a specific host or network interface. The routing prefix may be expressed in Classless Inter-Domain Routing (CIDR) notation, written as the first address of a network, followed by a slash character (/), and ending with the bit-length of the prefix. For example, 198.51.100.0/24 is the prefix of the Internet Protocol version 4 network starting at the given address, having 24 bits allocated for the network prefix and the remaining 8 bits reserved for host addressing. Addresses in the range 198.51.100.0 to 198.51.100.255 belong to this network. The IPv6 address specification 2001:db8::/32 is a large address block with 2⁹⁶ addresses, having a 32-bit routing prefix. For IPv4, a network may also be characterized by its subnet mask or netmask, which is the bitmask that, when applied by a bitwise AND operation to any IP address in the network, yields the routing prefix. Subnet masks are also expressed in dot-decimal notation like an address. For example, 255.255.255.0 is the subnet mask for the prefix 198.51.100.0/24. Computers and routers use routing tables in their operating system to forward IP packets to reach a node on a different subnetwork. Routing tables are maintained by manual configuration or automatically by routing protocols. End-nodes typically use a default route that points toward an ISP providing transit, while ISP routers use the Border Gateway Protocol to establish the most efficient routing across the complex connections of the global Internet. The default gateway is the node that serves as the forwarding host (router) to other networks when no other route specification matches the destination IP address of a packet.

Security

Internet resources, hardware, and software components are the target of criminal or malicious attempts to gain unauthorized control to cause interruptions, commit fraud, engage in blackmail or access private information. 
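The CIDR and netmask arithmetic above, and the longest-prefix-match rule that routing tables apply, can be checked with Python's standard ipaddress module; a minimal sketch using the same example prefix (the three-entry route list is illustrative):

```python
import ipaddress

# The example prefix from the text: 24 network bits, 8 host bits.
net = ipaddress.ip_network("198.51.100.0/24")

print(net.netmask)        # 255.255.255.0
print(net.num_addresses)  # 256: 198.51.100.0 through 198.51.100.255

# ANDing the netmask with any address in the network yields the routing prefix.
addr = ipaddress.ip_address("198.51.100.42")
prefix = ipaddress.ip_address(int(addr) & int(net.netmask))
print(prefix)             # 198.51.100.0
print(addr in net)        # True

# Forwarding uses longest-prefix match: the most specific matching route
# wins, with 0.0.0.0/0 acting as the default route of last resort.
routes = [ipaddress.ip_network(p)
          for p in ("0.0.0.0/0", "198.51.0.0/16", "198.51.100.0/24")]
best = max((r for r in routes if addr in r), key=lambda r: r.prefixlen)
print(best)               # 198.51.100.0/24
```

The same module handles IPv6 prefixes such as 2001:db8::/32 with identical semantics, only with 128-bit addresses.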
Malware is malicious software used and distributed via the Internet. It includes computer viruses, which are copied with the help of humans; computer worms, which copy themselves automatically; software for denial-of-service attacks; ransomware; botnets; and spyware that reports on the activity and typing of users. Usually, these activities constitute cybercrime. Defense theorists have also speculated about the possibility of hackers waging cyber warfare using similar methods on a large scale. Malware poses serious problems to individuals and businesses on the Internet. According to Symantec's 2018 Internet Security Threat Report (ISTR), the number of malware variants rose to 669,947,865 in 2017, twice as many as in 2016. Cybercrime, which includes malware attacks as well as other crimes committed by computer, was predicted to cost the world economy US$6 trillion in 2021, and is increasing at a rate of 15% per year. Since 2021, malware has been designed to target computer systems that run critical infrastructure such as the electricity distribution network. Malware can be designed to evade antivirus software detection algorithms. The vast majority of computer surveillance involves the monitoring of data and traffic on the Internet. In the United States, for example, under the Communications Assistance For Law Enforcement Act, all phone calls and broadband Internet traffic (emails, web traffic, instant messaging, etc.) are required to be available for unimpeded real-time monitoring by Federal law enforcement agencies. Under the Act, all U.S. 
telecommunications providers are required to install packet sniffing technology to allow Federal law enforcement and intelligence agencies to intercept all of their customers' broadband Internet and VoIP traffic. The large amount of data gathered from packet capture requires surveillance software that filters and reports relevant information, such as the use of certain words or phrases, access to certain types of web sites, or communication via email or chat with certain parties. Agencies such as the Information Awareness Office, NSA, GCHQ and the FBI spend billions of dollars per year to develop, purchase, implement, and operate systems for interception and analysis of data. Similar systems are operated by the Iranian secret police to identify and suppress dissidents. The required hardware and software were allegedly installed by Germany's Siemens AG and Finland's Nokia. Some governments, such as those of Myanmar, Iran, North Korea, Mainland China, Saudi Arabia and the United Arab Emirates, restrict access to content on the Internet within their territories, especially to political and religious content, with domain name and keyword filters. In Norway, Denmark, Finland, and Sweden, major Internet service providers have voluntarily agreed to restrict access to sites listed by authorities. While this list of forbidden resources is supposed to contain only known child pornography sites, the content of the list is secret. Many countries, including the United States, have enacted laws against the possession or distribution of certain material, such as child pornography, via the Internet, but do not mandate filter software. 
Many free or commercially available software programs, called content-control software, are available to users to block offensive content on individual computers or networks, in order to limit access by children to pornographic material or depictions of violence.

Performance

As the Internet is a heterogeneous network, its physical characteristics, including, for example, the data transfer rates of connections, vary widely. It exhibits emergent phenomena that depend on its large-scale organization.

[Figure: global Internet traffic volume in petabytes per month, 1990–2015]

The volume of Internet traffic is difficult to measure because no single point of measurement exists in the multi-tiered, non-hierarchical topology. Traffic data may be estimated from the aggregate volume through the peering points of the Tier 1 network providers, but traffic that stays local in large provider networks may not be accounted for. An Internet blackout or outage can be caused by local signaling interruptions. Disruptions of submarine communications cables may cause blackouts or slowdowns to large areas, such as in the 2008 submarine cable disruption. Less-developed countries are more vulnerable due to the small number of high-capacity links. Land cables are also vulnerable, as in 2011 when a woman digging for scrap metal severed most connectivity for the nation of Armenia. Internet blackouts affecting almost entire countries can be achieved by governments as a form of Internet censorship, as in the blockage of the Internet in Egypt, whereby approximately 93% of networks were without access in 2011 in an attempt to stop mobilization for anti-government protests. 
Estimates of the Internet's electricity usage have been the subject of controversy: a 2014 peer-reviewed research paper found claims published in the literature during the preceding decade differing by a factor of 20,000, ranging from 0.0064 kilowatt-hours per gigabyte transferred (kWh/GB) to 136 kWh/GB. The researchers attributed these discrepancies mainly to the year of reference (i.e. whether efficiency gains over time had been taken into account) and to whether "end devices such as personal computers and servers are included" in the analysis. In 2011, academic researchers estimated the overall energy used by the Internet to be between 170 and 307 GW, less than two percent of the energy used by humanity. This estimate included the energy needed to build, operate, and periodically replace the estimated 750 million laptops, a billion smartphones and 100 million servers worldwide, as well as the energy that routers, cell towers, optical switches, Wi-Fi transmitters and cloud storage devices use when transmitting Internet traffic. According to a non-peer-reviewed study published in 2018 by The Shift Project (a French think tank funded by corporate sponsors), nearly 4% of global CO2 emissions could be attributed to global data transfer and the necessary infrastructure. The study also said that online video streaming alone accounted for 60% of this data transfer and therefore contributed over 300 million tons of CO2 emissions per year, and argued for new "digital sobriety" regulations restricting the use and size of video files. |
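The factor-of-20,000 spread quoted for the electricity estimates follows directly from the two endpoint figures; a quick arithmetic check using the values as stated in the text:

```python
# Endpoint estimates of energy intensity from the 2014 survey, in kWh/GB.
low, high = 0.0064, 136.0

# Ratio between the highest and lowest published claims.
spread = high / low
print(round(spread))  # 21250, i.e. on the order of the quoted factor of 20,000
```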
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Huddles_(app)] | [TOKENS: 1008] |
Huddles (app)

Huddles (originally Byte, and later Clash via acquisition) was an American short-form video hosting service, creator monetization platform, and social network where users could create looping videos between 2 and 16 seconds long. It was created by a team led by Brendon McNerney and PJ Leimgruber, who formerly worked together at NeoReach, Inc. Dom Hofmann, co-founder of Vine and founder of its successor Byte, was the architect of much of the code until the project was sold to Clash App, Inc., and subsequently renamed. Initially teased as v2, it was branded as Byte in November 2018. After a three-year closed beta, it officially launched on Apple's App Store and the Google Play Store on January 24, 2020. It was sold to Clash, another short-form video app, a year later. Both apps thus merged into a single app called Clash, which was later renamed to Huddles. It was discontinued on May 3, 2023.

History

Byte's predecessor, Vine, was founded in June 2012. It was acquired by Twitter in October 2012. It underwent a staggered update on iOS, Android, and Windows Phone systems throughout much of 2013. The main Vine app was shut down by Twitter in January 2017, disallowing all new videos from being uploaded. The Vine homepage was made into an archive, with users being able to view previously uploaded content. As of 2019, the archive is no longer available, though individual videos can still be accessed via their direct links. Vine co-founder Dom Hofmann announced in December 2017 that he intended to launch a successor to Vine. At the time, he called it "v2". In May 2018, he posted an update that the project was being put on hold. Among other things, he said that the biggest reason for this was "financial and legal hurdles". 
He said that his intention was to fund the new service himself as a personal project, but the attention that the announcement generated suggested that the cost to build and run a service that was sustainable at launch would be too high. In November, he announced that the project was moving forward again with funding and a team, under the new "Byte" branding. At the time, the website invited users to sign up for updates and for content creators to join its "creator program". The partner program was shut down in August, with the Byte team announcing that they "will be using this time to take everything [they've] learned and apply it toward future opportunities and programs". Byte was officially launched to the public on the iOS and Android platforms in over 40 countries on January 24, 2020, with the tagline "creativity first". Additionally, the company had promised a program that intended to compensate creators for their work.

In the media

Byte was referred to as a direct competitor to TikTok and Likee, similar video sharing platforms popular with teens. On January 26, 2021, it was announced that Clash, another short-form video app, would be acquiring Byte. The deal was finalized the following month, with both apps merging into a single one called Clash. After months of beta testing, Clash became publicly available on the App Store on October 12, 2021. It became available for Android two months later in 41 languages. On May 3, 2023, Huddles announced its discontinuation as a standalone service via a series of tweets and a Medium blog post. The company began to remove the Huddles app from Apple's App Store and the Google Play Store in a phased manner, with the process commencing immediately upon announcement. According to Huddles, the aim of this decision was to avoid having an inactive login screen visible to users. 
The Medium post, titled "Huddles is joining a larger Creator family", provided more context on the development, revealing that Huddles was transitioning towards becoming part of a broader 'Creator family'. Further details about what this 'Creator family' entails or how Huddles' integration would unfold were not disclosed in the initial announcement.

Features

Huddles allowed users to publish videos between 2 and 16 seconds long, either captured through the app or previously recorded and stored on their devices. Similarly to other social media platforms, Huddles allowed users to follow other accounts. New accounts automatically followed Huddles's official account on the service. The main home screen featured a scrollable feed of content from accounts that the user was following. The platform also supported the ability to "like" and "rebyte" videos. In November 2020, a color customizer and a chat feature were added. The app also featured a search screen with tiles for popular and latest content, along with video categories like comedy, animation and others. |
======================================== |
[SOURCE: https://www.theverge.com/news/760162/roblox-louisiana-lawsuit-child-predators-safety] | [TOKENS: 2289] |
Louisiana sues Roblox for creating an environment where ‘child predators thrive’
In a lawsuit, the state alleges that Roblox ‘fails to implement basic safety controls to protect child users.’
by Jay Peters, Aug 15, 2025, 6:38 PM UTC
Jay Peters is a senior reporter covering technology, gaming, and more. He joined The Verge in 2019 after nearly two years at Techmeme.

The state of Louisiana has filed a lawsuit against Roblox, alleging that the company has “permitted and perpetuated an online environment in which child predators thrive, directly contributing to the widespread victimization of minor children in Louisiana.” Roblox sees more than 111.8 million daily active users, and it’s hugely popular with children, with users under 13 comprising nearly 40 percent of players last quarter.
However, the platform has come under significant scrutiny over reported failures to protect children, with Bloomberg publishing a major report last year about predators on Roblox and the investment firm Hindenburg Research alleging that its research revealed “an X-rated pedophile hellscape.” In recent months, Roblox has taken steps to bolster its child safety features, including introducing parent accounts that can manage their child’s account and the ability for parents to block people on their child’s friend list.

But Louisiana alleges that Roblox’s “deliberate failure to implement effective safety measures to protect child users from well-documented predatory threats, along with its ongoing failure to warn parents and children of the foreseeable dangers posed by its platform, has directly facilitated the widespread sexual exploitation of minors and inflicted severe, lasting harm upon the children of Louisiana,” according to the complaint.

“Roblox is overrun with harmful content and child predators because it prioritizes user growth, revenue, and profits over child safety,” Attorney General Liz Murrill says. “Every parent should be aware of the clear and present danger posed to their children by Roblox so they can prevent the unthinkable from ever happening in their own home.”

Roblox spokesperson Kadia Koroma sent the following statement about the lawsuit to The Verge:

The assertion that Roblox would intentionally put our users at risk of exploitation is categorically untrue. Every day, tens of millions of people around the world use Roblox to learn STEM skills, play, imagine, and have a safe experience on our platform. We dedicate vast resources to supporting a safe infrastructure, including advanced technology and 24/7 human moderation, to detect and prevent inappropriate content and behavior — not only because it’s important to us but because it is such a critical issue and so important to our community.
We dispute these allegations and we remain committed to working with Attorney General Murrill to keep kids safe. While no system is perfect, Roblox has implemented rigorous safeguards—such as restrictions on sharing personal information, links, and user-to-user image sharing—to help protect our community. Unfortunately, bad actors will try to circumvent our systems to try to direct users off the platform, where safety standards and moderation practices may differ. We continuously work to block those efforts and to enhance our moderation approaches to promote a safe and enjoyable environment for all users.

The company also published an article responding to the lawsuit on its newsroom.

Seven lawsuits regarding child predator issues have been filed against Roblox in a “little over two weeks,” Bloomberg reports.

This week, Roblox published a post on its website about why it removes what it describes as “vigilantes” who try to catch bad actors on Roblox by using tactics “similar to actual predators.” One user, who goes by “Schlep” and says on his X profile that he has contributed to “6 Roblox arrests and counting,” posted screenshots on X last week of what he says was a cease-and-desist notice by Roblox.

Update, August 15th: Added statement from Roblox.
© 2026 Vox Media, LLC. All Rights Reserved
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/PlayStation_(console)#cite_ref-GPro87_185-0] | [TOKENS: 10728] |
PlayStation (console)

The PlayStation[a] (codenamed PSX, abbreviated as PS, and retroactively PS1 or PS one) is a home video game console developed and marketed by Sony Computer Entertainment. It was released in Japan on 3 December 1994, followed by North America on 9 September 1995, Europe on 29 September 1995, and other regions thereafter. As a fifth-generation console, the PlayStation primarily competed with the Nintendo 64 and the Sega Saturn. Sony began developing the PlayStation after a failed venture with Nintendo to create a CD-ROM peripheral for the Super Nintendo Entertainment System in the early 1990s. The console was primarily designed by Ken Kutaragi and Sony Computer Entertainment in Japan, while additional development was outsourced in the United Kingdom. 3D polygon graphics were placed at the forefront of the console's design. PlayStation game production was designed to be streamlined and inclusive, enticing the support of many third-party developers. The console proved popular for its extensive game library, popular franchises, low retail price, and aggressive youth marketing which advertised it as the preferable console for adolescents and adults. Critically acclaimed games that defined the console include Gran Turismo, Crash Bandicoot, Spyro the Dragon, Tomb Raider, Resident Evil, Metal Gear Solid, Tekken 3, and Final Fantasy VII. Sony ceased production of the PlayStation on 23 March 2006—over eleven years after it had been released, and in the same year that the PlayStation 3 debuted. More than 4,000 PlayStation games were released, with cumulative sales of 962 million copies. The PlayStation signaled Sony's rise to power in the video game industry. It received acclaim and sold strongly; in less than a decade, it became the first computer entertainment platform to ship over 100 million units. Its use of compact discs heralded the game industry's transition from cartridges.
The PlayStation's success led to a line of successors, beginning with the PlayStation 2 in 2000. In the same year, Sony released a smaller and cheaper model, the PS one.

History

The PlayStation was conceived by Ken Kutaragi, a Sony executive who managed a hardware engineering division and was later dubbed "the Father of the PlayStation". Kutaragi's interest in working with video games stemmed from seeing his daughter play games on Nintendo's Famicom. Kutaragi convinced Nintendo to use his SPC-700 sound processor in the Super Nintendo Entertainment System (SNES) through a demonstration of the processor's capabilities. His willingness to work with Nintendo was derived from both his admiration of the Famicom and his conviction that video game consoles would become the main home-use entertainment systems. Although Kutaragi was nearly fired because he worked with Nintendo without Sony's knowledge, president Norio Ohga recognised the potential in Kutaragi's chip and decided to keep him as a protégé. The inception of the PlayStation dates back to a 1988 joint venture between Nintendo and Sony. Nintendo had produced floppy disk technology to complement cartridges in the form of the Family Computer Disk System, and wanted to continue this complementary storage strategy for the SNES. Since Sony was already contracted to produce the SPC-700 sound processor for the SNES, Nintendo contracted Sony to develop a CD-ROM add-on, tentatively titled the "Play Station" or "SNES-CD". The PlayStation name had already been trademarked by Yamaha, but Nobuyuki Idei liked it so much that he agreed to acquire it for an undisclosed sum rather than search for an alternative. Sony was keen to obtain a foothold in the rapidly expanding video game market. Having been the primary manufacturer of the MSX home computer format, Sony had wanted to use their experience in consumer electronics to produce their own video game hardware.
Although the initial agreement between Nintendo and Sony was about producing a CD-ROM drive add-on, Sony had also planned to develop a SNES-compatible Sony-branded console. This iteration was intended to be more of a home entertainment system, playing both SNES cartridges and a new CD format named the "Super Disc", which Sony would design. Under the agreement, Sony would retain sole international rights to every Super Disc game, giving them a large degree of control despite Nintendo's leading position in the video game market. Furthermore, Sony would also be the sole beneficiary of licensing related to music and film software that it had been aggressively pursuing as a secondary application. The Play Station was to be announced at the 1991 Consumer Electronics Show (CES) in Las Vegas. However, Nintendo president Hiroshi Yamauchi was wary of Sony's increasing leverage at this point and deemed the original 1988 contract unacceptable upon realising it essentially handed Sony control over all games written on the SNES CD-ROM format. Although Nintendo was dominant in the video game market, Sony possessed a superior research and development department. Wanting to protect Nintendo's existing licensing structure, Yamauchi cancelled all plans for the joint Nintendo–Sony SNES CD attachment without telling Sony. He sent Nintendo of America president Minoru Arakawa (his son-in-law) and chairman Howard Lincoln to Amsterdam to form a more favourable contract with the Dutch conglomerate Philips, Sony's rival. This contract would give Nintendo total control over their licences on all Philips-produced machines. Kutaragi and Nobuyuki Idei, Sony's director of public relations at the time, learned of Nintendo's actions two days before the CES was due to begin. Kutaragi telephoned numerous contacts, including Philips, to no avail. On the first day of the CES, Sony announced their partnership with Nintendo and their new console, the Play Station.
At 9 am on the next day, in what has been called "the greatest ever betrayal" in the industry, Howard Lincoln stepped onto the stage and revealed that Nintendo was now allied with Philips and would abandon its work with Sony. Incensed by Nintendo's reversal, Ohga and Kutaragi decided that Sony would develop their own console. Nintendo's contract-breaking was met with consternation in the Japanese business community, as it had broken an "unwritten law" of native companies not turning against each other in favour of foreign ones. Sony's American branch considered allying with Sega to produce a CD-ROM-based machine called the Sega Multimedia Entertainment System, but the Sega board of directors in Tokyo vetoed the idea when Sega of America CEO Tom Kalinske presented them the proposal. Kalinske recalled them saying: "That's a stupid idea, Sony doesn't know how to make hardware. They don't know how to make software either. Why would we want to do this?" Sony halted their research but decided to turn what it had developed with Nintendo and Sega into a console of its own, based on the SNES. Despite the tumultuous events at the 1991 CES, negotiations between Nintendo and Sony were still ongoing. A deal was proposed: the Play Station would still have a port for SNES games, on the condition that it would still use Kutaragi's audio chip and that Nintendo would own the rights and receive the bulk of the profits. Roughly two hundred prototype machines were created, and some software entered development. Many within Sony were still opposed to the company's involvement in the video game industry, with some resenting Kutaragi for jeopardising the company. Kutaragi remained adamant that Sony not retreat from the growing industry and that a deal with Nintendo would never work. Knowing that they had to take decisive action, Sony severed all ties with Nintendo on 4 May 1992.
To determine the fate of the PlayStation project, Ohga chaired a meeting in June 1992, consisting of Kutaragi and several senior Sony board members. Kutaragi unveiled a proprietary CD-ROM-based system he had been secretly working on, which played games with immersive 3D graphics. Kutaragi was confident that his LSI chip could accommodate one million logic gates, which exceeded the capabilities of Sony's semiconductor division at the time. Despite gaining Ohga's enthusiasm, a majority of those present remained opposed, including older Sony executives who saw Nintendo and Sega as "toy" manufacturers. The opponents felt the game industry was too culturally offbeat and asserted that Sony should remain a central player in the audiovisual industry, where companies were familiar with one another and could conduct "civili[s]ed" business negotiations. After Kutaragi reminded him of the humiliation he had suffered from Nintendo, Ohga retained the project and became one of Kutaragi's staunchest supporters. Ohga shifted Kutaragi and nine of his team from Sony's main headquarters to Sony Music Entertainment Japan (SMEJ), a subsidiary of the main Sony group, so as to retain the project and maintain relationships with Philips for the MMCD development project. The involvement of SMEJ proved crucial to the PlayStation's early development, as the process of manufacturing games on CD-ROM was similar to that used for audio CDs, with which Sony's music division had considerable experience. While at SMEJ, Kutaragi worked with Epic/Sony Records founder Shigeo Maruyama and Akira Sato; both later became vice-presidents of the division that ran the PlayStation business. Sony Computer Entertainment (SCE) was jointly established by Sony and SMEJ to handle the company's ventures into the video game industry. On 27 October 1993, Sony publicly announced that it was entering the game console market with the PlayStation.
According to Maruyama, there was uncertainty over whether the console should primarily focus on 2D, sprite-based graphics or 3D polygon graphics. After Sony witnessed the success of Sega's Virtua Fighter (1993) in Japanese arcades, the direction of the PlayStation became "instantly clear" and 3D polygon graphics became the console's primary focus. SCE president Teruhisa Tokunaka expressed gratitude for Sega's timely release of Virtua Fighter, as it proved "just at the right time" that making games with 3D imagery was possible. Maruyama claimed that Sony further wanted to emphasise the new console's ability to utilise Red Book audio from the CD-ROM format in its games, alongside high-quality visuals and gameplay. Wishing to distance the project from the failed enterprise with Nintendo, Sony initially branded the PlayStation the "PlayStation X" (PSX). Sony formed their European and North American divisions, known as Sony Computer Entertainment Europe (SCEE) and Sony Computer Entertainment America (SCEA), in January and May 1995 respectively. The divisions planned to market the new console under the alternative branding "PSX" following the negative feedback regarding "PlayStation" in focus group studies. Early advertising prior to the console's launch in North America referenced PSX, but the term was scrapped before launch. Unlike Nintendo's consoles, the console was not marketed under the Sony name. According to Phil Harrison, much of Sony's upper management feared that the Sony brand would be tarnished if associated with the console, which they considered a "toy". Since Sony had no experience in game development, it had to rely on the support of third-party game developers. This was in contrast to Sega and Nintendo, which had versatile and well-equipped in-house software divisions for their arcade games and could easily port successful games to their home consoles.
Recent consoles like the Atari Jaguar and 3DO had suffered low sales due to a lack of developer support, prompting Sony to redouble their efforts in gaining the endorsement of arcade-savvy developers. A team from Epic Sony visited more than a hundred companies throughout Japan in May 1993 in hopes of attracting game creators with the PlayStation's technological appeal. Sony found that many disliked Nintendo's practices, such as favouring their own games over others. Through a series of negotiations, Sony acquired initial support from Namco, Konami, and Williams Entertainment, as well as 250 other development teams in Japan alone. Namco in particular was interested in developing for the PlayStation since Namco rivalled Sega in the arcade market. Signing these companies secured influential games such as Ridge Racer (1993) and Mortal Kombat 3 (1995). Ridge Racer was one of the most popular arcade games at the time, and by December 1993 it had already been confirmed behind closed doors as the PlayStation's first game, despite Namco being a longstanding Nintendo developer. Namco's research managing director Shigeichi Nakamura met with Kutaragi in 1993 to discuss the preliminary PlayStation specifications, with Namco subsequently basing the Namco System 11 arcade board on PlayStation hardware and developing Tekken to compete with Virtua Fighter. The System 11 launched in arcades several months before the PlayStation's release, with the arcade release of Tekken in September 1994. Despite securing the support of various Japanese studios, Sony had no developers of their own during the PlayStation's development. This changed in 1993 when Sony acquired the Liverpudlian company Psygnosis (later renamed SCE Liverpool) for US$48 million, securing their first in-house development team. The acquisition meant that Sony could have more launch games ready for the PlayStation's release in Europe and North America.
Ian Hetherington, Psygnosis' co-founder, was disappointed after receiving early builds of the PlayStation and recalled that the console "was not fit for purpose" until his team got involved with it. Hetherington frequently clashed with Sony executives over broader ideas; at one point it was suggested that a television with a built-in PlayStation be produced. In the months leading up to the PlayStation's launch, Psygnosis had around 500 full-time staff working on games and assisting with software development. The purchase of Psygnosis marked another turning point for the PlayStation, as the studio played a vital role in creating the console's development kits. While Sony had provided MIPS R4000-based Sony NEWS workstations for PlayStation development, Psygnosis employees disliked the thought of developing on these expensive workstations and asked Bristol-based SN Systems to create an alternative PC-based development system. Andy Beveridge and Martin Day, owners of SN Systems, had previously supplied development hardware for other consoles such as the Mega Drive, Atari ST, and the SNES. When Psygnosis arranged an audience for SN Systems with Sony's Japanese executives at the January 1994 CES in Las Vegas, Beveridge and Day presented their prototype of the condensed development kit, which could run on an ordinary personal computer with two extension boards. Impressed, Sony decided to abandon its plans for a workstation-based development system in favour of SN Systems's, thus securing a cheaper and more efficient method for designing software. An order of over 600 systems followed, and SN Systems supplied Sony with additional software such as an assembler, linker, and debugger. SN Systems produced development kits for future PlayStation systems, including the PlayStation 2, and was bought out by Sony in 2005. Sony strived to make game production as streamlined and inclusive as possible, in contrast to the relatively isolated approach of Sega and Nintendo.
Phil Harrison, representative director of SCEE, believed that Sony's emphasis on developer assistance reduced the most time-consuming aspects of development. As well as providing programming libraries, SCE offices in London, California, and Tokyo housed technical support teams that could work closely with third-party developers if needed. Sony did not favour their own products over non-Sony ones, unlike Nintendo; Peter Molyneux of Bullfrog Productions admired Sony's open-handed approach to software developers and lauded their decision to use PCs as a development platform, remarking that "[it was] like being released from jail in terms of the freedom you have". Another strategy that helped attract software developers was the PlayStation's use of the CD-ROM format instead of traditional cartridges. Nintendo cartridges were expensive to manufacture, and the company controlled all production, prioritising their own games, while inexpensive compact disc manufacturing occurred at dozens of locations around the world. The PlayStation's architecture and its interoperability with PCs were beneficial to many software developers. The use of the programming language C proved useful, as it safeguarded future compatibility of the machine should Sony decide to make further hardware revisions. Despite the inherent flexibility, some developers found themselves restricted by the console's lack of RAM. While working on beta builds of the PlayStation, Molyneux observed that its MIPS processor was not "quite as bullish" compared to that of a fast PC and said that it took his team two weeks to port their PC code to the PlayStation development kits and another fortnight to achieve a four-fold speed increase. An engineer from Ocean Software, one of Europe's largest game developers at the time, thought that allocating RAM was a challenging aspect given the 3.5 megabyte restriction.
Kutaragi said that while it would have been easy to double the amount of RAM for the PlayStation, the development team refrained from doing so to keep the retail cost down. Kutaragi saw the biggest challenge in developing the system as balancing the conflicting goals of high performance, low cost, and ease of programming, and felt he and his team were successful in this regard. The console's technical specifications were finalised in 1993 and its design during 1994. The PlayStation name and its final design were confirmed during a press conference on 10 May 1994, although the price and release dates had not yet been disclosed. Sony released the PlayStation in Japan on 3 December 1994, a week after the release of the Sega Saturn, at a price of ¥39,800. Sales in Japan began with a "stunning" success, with long queues in shops. Ohga later recalled that he realised how important the PlayStation had become for Sony when friends and relatives begged for consoles for their children. The PlayStation sold 100,000 units on the first day and two million units within six months, although the Saturn outsold the PlayStation in the first few weeks due to the success of Virtua Fighter. By the end of 1994, 300,000 PlayStation units had been sold in Japan compared to 500,000 Saturn units. A grey market emerged for PlayStations shipped from Japan to North America and Europe, with buyers of such consoles paying up to £700.

One American retailer later recalled: "When September 1995 arrived and Sony's Playstation roared out of the gate, things immediately felt different than [sic] they did with the Saturn launch earlier that year. Sega dropped the Saturn $100 to match the Playstation's $299 debut price, but sales weren't even close—Playstations flew out the door as fast as we could get them in stock."

Before the release in North America, Sega and Sony presented their consoles at the first Electronic Entertainment Expo (E3) in Los Angeles on 11 May 1995.
At their keynote presentation, Sega of America CEO Tom Kalinske revealed that their Saturn console would be released immediately to select retailers at a price of $399. Next came Sony's turn: Olaf Olafsson, the head of SCEA, summoned Steve Race, the head of development, to the conference stage; Race simply said "$299" and left the stage to a round of applause. The attention on the Sony conference was further bolstered by the surprise appearance of Michael Jackson and the showcase of highly anticipated games, including Wipeout (1995), Ridge Racer and Tekken (1994). In addition, Sony announced that no games would be bundled with the console. Although the Saturn had been released early in the United States to gain an advantage over the PlayStation, the surprise launch upset many retailers who were not informed in time, harming sales. Some retailers such as KB Toys responded by dropping the Saturn entirely. The PlayStation went on sale in North America on 9 September 1995. It sold more units within two days than the Saturn had in five months, with almost all of the initial shipment of 100,000 units sold in advance and shops across the country running out of consoles and accessories. The well-received Ridge Racer contributed to the PlayStation's early success — with some critics considering it superior to Sega's arcade counterpart Daytona USA (1994) — as did Battle Arena Toshinden (1995). There were over 100,000 pre-orders placed and 17 games available on the market by the time of the PlayStation's American launch, in comparison to the Saturn's six launch games. The PlayStation was released in Europe on 29 September 1995 and in Australia on 15 November 1995. By November it had already outsold the Saturn by three to one in the United Kingdom, where Sony had allocated a £20 million marketing budget for the Christmas season compared to Sega's £4 million.
Sony found early success in the United Kingdom by securing listings with independent shop owners as well as prominent high-street chains such as Comet and Argos. Within its first year, the PlayStation secured over 20% of the entire American video game market. From September to the end of 1995, sales in the United States amounted to 800,000 units, giving the PlayStation a commanding lead over the other fifth-generation consoles,[b] though the SNES and Mega Drive from the fourth generation still outsold it. Sony reported an attach rate of four games sold per console. To meet increasing demand, Sony chartered jumbo jets and ramped up production in Europe and North America. By early 1996, the PlayStation had grossed $2 billion (equivalent to $4.106 billion in 2025) from worldwide hardware and software sales. By late 1996, sales in Europe totalled 2.2 million units, including 700,000 in the UK. Approximately 400 PlayStation games were in development, compared to around 200 for the Saturn and 60 for the Nintendo 64. In India, the PlayStation was test-marketed during 1999–2000 through Sony showrooms, selling 100 units; Sony finally launched the console countrywide, as the PS One model, on 24 January 2002 at a price of Rs 7,990 with 26 games available from the start. The PlayStation also did well in markets where it was never officially released. In Brazil, for example, a third party's registration of the trademark prevented an official release, so the officially distributed Sega Saturn initially dominated the market; as the Saturn withdrew, however, PlayStation imports and widespread piracy increased. In China, the Sega Saturn was likewise the most popular 32-bit console at first, but after it left the market the PlayStation's user base grew to 300,000 by January 2000, even though Sony China had no plans to release it.
The PlayStation was backed by a successful marketing campaign, allowing Sony to gain an early foothold in Europe and North America. Initially, PlayStation demographics skewed towards adults, but the audience broadened after the first price drop. While the Saturn was positioned towards 18- to 34-year-olds, the PlayStation was initially marketed exclusively towards teenagers. Executives from both Sony and Sega reasoned that because younger players typically looked up to older, more experienced players, advertising targeted at teens and adults would draw them in too. Additionally, Sony found that adults responded best to advertising aimed at teenagers; Lee Clow surmised that people entering adulthood regressed and became "17 again" when they played video games. The console was marketed with advertising slogans that replaced letters with the geometric symbols from the controller's four face buttons, rendered in print as "LIVE IN YUR WRLD. PLY IN URS" (Live in Your World. Play in Ours.) and "U R NOT E" (with a red E). Clow thought that by invoking such provocative statements, gamers would respond to the contrary and say "'Bullshit. Let me show you how ready I am.'" As the console's appeal widened, Sony's marketing efforts broadened from their earlier focus on mature players to specifically target younger children as well. Shortly after the PlayStation's release in Europe, Sony tasked marketing manager Geoff Glendenning with assessing the desires of a new target audience. Sceptical of Nintendo and Sega's reliance on television campaigns, Glendenning theorised that young adults transitioning from fourth-generation consoles would feel neglected by marketing directed at children and teenagers. Recognising the influence that early 1990s underground clubbing and rave culture had on young people, especially in the United Kingdom, Glendenning felt that the culture had become mainstream enough to help cultivate the PlayStation's emerging identity.
Sony partnered with prominent nightclubs, such as Ministry of Sound, and with festival promoters to organise dedicated PlayStation areas where attendees could try select games. Sheffield-based graphic design studio The Designers Republic was contracted by Sony to produce promotional materials aimed at a fashionable, club-going audience. Psygnosis' Wipeout in particular became associated with nightclub culture, as it was widely featured in venues. By 1997, there were 52 nightclubs in the United Kingdom with dedicated PlayStation rooms. Glendenning recalled that he had discreetly spent at least £100,000 a year in slush-fund money on impromptu marketing. In 1996, Sony expanded their CD production facilities in the United States to meet the high demand for PlayStation games, increasing monthly output from 4 million to 6.5 million discs. This was necessary because PlayStation sales were running at twice the rate of Saturn sales, and its lead increased dramatically when both consoles dropped in price to $199 that year. The PlayStation also outsold the Saturn at a similar ratio in Europe during 1996, with 2.2 million consoles sold in the region by the end of the year. Sales of PlayStation hardware and software only increased following the launch of the Nintendo 64. Tokunaka speculated that the Nintendo 64 launch had actually helped PlayStation sales by raising public awareness of the gaming market through Nintendo's added marketing efforts. Despite this, the PlayStation took longer to achieve dominance in Japan. Tokunaka said that, even after the PlayStation and Saturn had been on the market for nearly two years, the competition between them was still "very close", and neither console had led in sales for any meaningful length of time. By 1998, Sega, spurred by its declining market share and significant financial losses, launched the Dreamcast in a last-ditch attempt to stay in the industry.
Although its launch was successful, the technically superior 128-bit console was unable to dent Sony's dominance of the industry; Sony still held 60% of the overall North American video game market at the end of 1999. Sega's initial confidence in the new console was undermined when Japanese sales were lower than expected, with disgruntled Japanese consumers reportedly returning their Dreamcasts in exchange for PlayStation software. On 2 March 1999, Sony officially revealed details of the PlayStation 2, which Kutaragi announced would feature a graphics processor designed to push more raw polygons than any console in history, effectively rivalling most supercomputers. The PlayStation continued to sell strongly at the turn of the millennium: in July 2000, Sony released the PS One, a smaller, redesigned variant which went on to outsell every other console that year, including the PlayStation 2. In 2005, the PlayStation became the first console to ship 100 million units, with the PlayStation 2 later achieving this faster than its predecessor. The combined successes of both PlayStation consoles led to Sega retiring the Dreamcast in 2001 and abandoning the console business entirely. The PlayStation was eventually discontinued on 23 March 2006, over eleven years after its release and less than a year before the debut of the PlayStation 3.

Hardware

The main microprocessor is an R3000 CPU made by LSI Logic, operating at a clock rate of 33.8688 MHz and delivering 30 MIPS. This 32-bit CPU relies heavily on the "cop2" 3D and matrix math coprocessor on the same die to provide the speed necessary to render complex 3D graphics. The role of the separate GPU chip is to draw 2D polygons and apply shading and textures to them: the rasterisation stage of the graphics pipeline. Sony's custom 16-bit sound chip supports ADPCM sources with up to 24 sound channels, and offers sampling rates of up to 44.1 kHz as well as music sequencing.
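As an aside, the two clock figures quoted above are arithmetically related. The sketch below is an observation, not a claim from the article; whether the relationship was a deliberate design choice is not stated here.

```python
# Hypothetical sanity check: the quoted CPU clock is an exact integer
# multiple of the 44.1 kHz CD audio sampling rate quoted for the sound chip.
CD_SAMPLE_RATE_HZ = 44_100      # Red Book CD audio rate (44.1 kHz)
CPU_CLOCK_HZ = 33_868_800       # 33.8688 MHz, as quoted above

multiple = CPU_CLOCK_HZ / CD_SAMPLE_RATE_HZ
print(multiple)  # 768.0 -- an exact multiple, with no remainder
```

The console's other components described below also share this clock domain, which makes the coincidence at least plausible as engineering convenience.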
It features 2 MB of main RAM and an additional 1 MB of video RAM. The PlayStation has a maximum colour depth of 16.7 million true colours, with 32 levels of transparency and unlimited colour look-up tables. It can output composite, S-Video or RGB video signals through its AV Multi connector (older models also have RCA connectors for composite), displaying resolutions from 256×224 to 640×480 pixels; different games can use different resolutions. Earlier models also had proprietary parallel and serial ports that could be used to connect accessories or link multiple consoles together; these were later removed due to lack of use. The PlayStation uses a proprietary video decompression unit, the MDEC, which is integrated into the CPU and allows the presentation of full-motion video at a higher quality than other consoles of its generation. Unusually for the time, the PlayStation lacks a dedicated 2D graphics processor; 2D elements are instead calculated as polygons by the Geometry Transfer Engine (GTE) so that they can be processed and displayed on screen by the GPU. The GPU can generate up to 4,000 sprites and 180,000 polygons per second, or 360,000 polygons per second flat-shaded. The PlayStation went through a number of variants during its production run. Externally, the most notable change was the gradual reduction in the number of connectors on the rear of the unit. This started with the original Japanese launch units: the SCPH-1000, released on 3 December 1994, was the only model with an S-Video port, which was removed from the next model. Subsequent models continued to shed ports, with the final version retaining only one serial port. Sony also marketed a development kit for amateur developers known as the Net Yaroze (meaning "Let's do it together" in Japanese). It was launched in June 1996 in Japan and, following public interest, was released the next year in other countries.
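The display figures quoted above can be cross-checked with simple arithmetic: "16.7 million true colours" corresponds to 24-bit RGB (2^24 values), and the per-second polygon rates can be converted to per-frame budgets. The 60 Hz refresh rate used below is an assumption (typical of NTSC output) and is not stated in the article.

```python
# 24-bit RGB: 8 bits per channel for red, green and blue
colours = 2 ** 24
print(colours)         # 16777216, i.e. the "16.7 million true colours" above

# Per-frame polygon budgets, assuming a 60 Hz display (assumption)
FPS = 60
print(180_000 // FPS)  # 3000 polygons per frame at the quoted rate
print(360_000 // FPS)  # 6000 flat-shaded polygons per frame
```

These per-frame budgets help explain why contemporary games traded polygon counts against frame rate and shading quality.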
The Net Yaroze allowed hobbyists to create their own games and upload them to an online forum run by Sony. The console was available only through an ordering service and came with the documentation and software, including a C compiler, needed to program PlayStation games and applications. On 7 July 2000, Sony released the PS One (stylised as "PS one" or "PSone"), a smaller, redesigned version of the original PlayStation. It was the highest-selling console through the end of the year, outselling all other consoles, including the PlayStation 2. In 2002, Sony released a 5-inch (130 mm) LCD screen add-on for the PS One, referred to as the "Combo pack", which also included a car cigarette-lighter adaptor, adding a degree of portability. Production of the LCD "Combo pack" ceased in 2004, when the popularity of the PlayStation began to wane in markets outside Japan. A total of 28.15 million PS One units had been sold by the time it was discontinued in March 2006. Three iterations of the PlayStation's controller were released over the console's lifespan. The first, the PlayStation controller, was released alongside the console in December 1994. It features four individual directional buttons (as opposed to a conventional D-pad), two shoulder buttons on each side, Start and Select buttons in the centre, and four face buttons bearing simple geometric shapes: a green triangle, a red circle, a blue cross and a pink square. Rather than the letters or numbers traditionally depicted on buttons, these shapes established a trademark that would be incorporated heavily into the PlayStation brand. Teiyu Goto, the designer of the original PlayStation controller, said that the circle and cross represent "yes" and "no" respectively (though this layout is reversed in Western versions); the triangle symbolises a point of view, and the square is equated to a sheet of paper, used to access menus.
The European and North American models of the original PlayStation controller are roughly 10% larger than the Japanese variant, to account for the larger average hand size of people in those regions. Sony's first analogue gamepad, the PlayStation Analog Joystick (often erroneously referred to as the "Sony Flightstick"), was first released in Japan in April 1996. Featuring two parallel joysticks, it uses potentiometer technology previously employed on consoles such as the Vectrex: instead of relying on binary eight-way switches, the controller detects minute angular changes through the entire range of motion. It also features a thumb-operated digital hat switch on the right joystick, corresponding to the traditional D-pad, for use when simple digital movement is sufficient. The Analog Joystick sold poorly in Japan due to its high cost and cumbersome size. The increasing popularity of 3D games prompted Sony to add analogue sticks to its controller design to give users finer control of movement in virtual 3D environments. The first official analogue controller, the Dual Analog Controller, was revealed to the public in a small glass booth at the 1996 PlayStation Expo in Japan, and released in April 1997 to coincide with the Japanese releases of the analogue-capable games Tobal 2 and Bushido Blade. In addition to the two analogue sticks (which also introduced two new buttons, mapped to clicking the sticks in), the Dual Analog Controller features an "Analog" button and LED beneath the "Start" and "Select" buttons which toggles analogue functionality on or off. The controller also features rumble support, though Sony decided to remove haptic feedback from all overseas versions before the United States release.
A Sony spokesman stated that the feature was removed for "manufacturing reasons", although rumours circulated that Nintendo had attempted to legally block the release of the controller outside Japan due to similarities with the Nintendo 64 controller's Rumble Pak; a Nintendo spokesman denied that any legal action was taken. Next Generation's Chris Charla theorised that Sony dropped vibration feedback to keep the price of the controller down. In November 1997, Sony introduced the DualShock controller, its name deriving from its two ("dual") vibration ("shock") motors. Unlike its predecessor, it features textured rubber grips on the analogue sticks, longer handles, slightly different shoulder buttons, and rumble feedback as standard on all versions. The DualShock later replaced its predecessors as the default controller. Sony released a series of peripherals to extend the PlayStation's functionality. These include memory cards, the PlayStation Mouse, the PlayStation Link Cable, the Multiplayer Adapter (a four-player multitap), the Memory Drive (a disk drive for 3.5-inch floppy disks), the GunCon (a light gun) and the Glasstron (a monoscopic head-mounted display). Released exclusively in Japan, the PocketStation is a memory-card peripheral which acts as a miniature personal digital assistant, featuring a monochrome liquid-crystal display (LCD), infrared communication, a real-time clock, built-in flash memory and sound capability. Sharing similarities with the Dreamcast's VMU peripheral, the PocketStation was typically distributed with certain PlayStation games, enhancing them with added features. It proved popular in Japan, selling over five million units. Sony planned to release the peripheral outside Japan, but the release was cancelled despite promotion in Europe and North America. In addition to playing games, most PlayStation models can play CD Audio.
The Asian model SCPH-5903 can also play Video CDs. Like most CD players, the PlayStation can play songs in a programmed order, shuffle the playback order of the disc, and repeat one song or the entire disc. Later PlayStation models include a music visualisation function called SoundScope. This function, along with a memory card manager, is accessed by starting the console without inserting a game or with the CD tray open, bringing up a graphical user interface (GUI) for the PlayStation BIOS. The GUI differs between firmware versions: the original PlayStation GUI had a dark blue background with rainbow graffiti used as buttons, while the early PAL PlayStation and PS One GUI had a grey blocked background with two icons in the middle. PlayStation emulation is versatile and runs on numerous modern devices. Bleem! was a commercial emulator released for IBM-compatible PCs and the Dreamcast in 1999. It was notable for being aggressively marketed during the PlayStation's lifetime, and was the centre of multiple controversial lawsuits filed by Sony. Bleem! was programmed in assembly language, which allowed it to emulate PlayStation games with improved visual fidelity, enhanced resolutions and filtered textures that were not possible on original hardware. Sony sued Bleem! two days after its release, citing copyright infringement and accusing the company of unfair competition and patent infringement for allowing PlayStation BIOSs to be used on a Sega console. Bleem! was subsequently forced to shut down in November 2001. Sony was aware that using CDs for game distribution could leave games vulnerable to piracy, given the growing popularity of CD-R discs and optical drives with burning capability.
To preclude illegal copying, a proprietary process for PlayStation disc manufacturing was developed that, in conjunction with an augmented optical drive in Tiger H/E assembly, prevented burned copies of games from booting on an unmodified console. All genuine PlayStation discs were pressed with a small section of deliberately irregular data, which the PlayStation's optical pick-up was capable of detecting and decoding: consoles would not boot a game disc unless a specific wobble frequency was present in the data of the disc's pregap sector (the same system also encoded a disc's regional lockout). The signal was within Red Book CD tolerances, so a PlayStation disc's actual content could still be read by a conventional disc drive. However, such a drive could not detect the wobble frequency, and therefore produced copies that omitted it, because the laser pick-up system of any optical drive interprets the wobble as an oscillation of the disc surface and compensates for it while reading. Early PlayStations, particularly early SCPH-1000 models, suffer from skipping full-motion video or physical "ticking" noises from the unit. The problems stem from poorly placed vents leading to overheating in some environments, which causes the plastic mouldings inside the console to warp slightly, with knock-on effects for the laser assembly. The solution is to sit the console on a surface that dissipates heat efficiently, in a well-ventilated area, or to raise the unit slightly from its resting surface. Sony representatives also recommended unplugging the PlayStation when not in use, as the system draws a small amount of power (and therefore generates heat) even when turned off. The first batch of PlayStations use a KSM-440AAM laser unit whose case and movable parts are all built out of plastic. Over time, the plastic lens-sled rail wears out, usually unevenly, due to friction.
The placement of the laser unit close to the power supply accelerates wear, as the additional heat makes the plastic more vulnerable to friction. Eventually, one side of the lens sled becomes so worn that the laser tilts and no longer points directly at the CD; after this, games fail to load due to data read errors. Sony fixed the problem on later PlayStation models by making the sled out of die-cast metal and placing the laser unit further away from the power supply. Due to an engineering oversight, the PlayStation does not produce a proper signal on several older models of television, causing the display to flicker or bounce around the screen. Sony decided not to change the console design, since only a small percentage of PlayStation owners used such televisions, and instead gave consumers the option of sending their unit to a Sony service centre to have an official modchip installed, allowing play on older televisions.

Game library

The PlayStation featured a diverse game library which grew to appeal to all types of players. Critically acclaimed PlayStation games included Final Fantasy VII (1997), Crash Bandicoot (1996), Spyro the Dragon (1998) and Metal Gear Solid (1998), all of which became established franchises. Final Fantasy VII is credited with allowing role-playing games to gain mass-market appeal outside Japan, and is considered one of the most influential and greatest video games ever made. The PlayStation's bestselling game is Gran Turismo (1997), which sold 10.85 million units. After the PlayStation's discontinuation in 2006, cumulative software shipments stood at 962 million units. Following the 1994 launch in Japan, early games included Ridge Racer, Crime Crackers, King's Field, Motor Toon Grand Prix, Toh Shin Den (i.e. Battle Arena Toshinden) and Kileak: The Blood. The first two games available at the later North American launch were Jumping Flash! (1995) and Ridge Racer, with Jumping Flash!
heralded as an ancestor of 3D graphics in console gaming. Wipeout, Air Combat, Twisted Metal, Warhawk and Destruction Derby were among the popular first-year games, and the first to be reissued as part of Sony's Greatest Hits or Platinum ranges. By the PlayStation's first Christmas season, Psygnosis had produced around 70% of its launch catalogue; the studio's breakthrough racing game Wipeout was acclaimed for its techno soundtrack and helped raise awareness of Britain's underground music community. Eidos Interactive's action-adventure game Tomb Raider contributed substantially to the success of the console in 1996, with its protagonist Lara Croft becoming an early gaming icon and garnering unprecedented media promotion. Licensed tie-ins of popular films were also prevalent; Argonaut Games' 2001 adaptation of Harry Potter and the Philosopher's Stone went on to sell over eight million copies late in the console's lifespan. Third-party developers remained largely committed to the console's wide-ranging game catalogue even after the launch of the PlayStation 2; notable exclusives from this era include Harry Potter and the Philosopher's Stone, Fear Effect 2: Retro Helix, Syphon Filter 3, C-12: Final Resistance, Dance Dance Revolution Konamix and Digimon World 3.[c] Sony assisted with game reprints as late as 2008 with Metal Gear Solid: The Essential Collection, the last PlayStation game officially released and licensed by Sony. Initially, in the United States, PlayStation games were packaged in long cardboard boxes, similar to non-Japanese 3DO and Saturn games. Sony later switched to the jewel-case format typically used for audio CDs and Japanese video games, as it took up less retailer shelf space (which was at a premium due to the large number of PlayStation games being released) and focus testing showed that most consumers preferred it.

Reception

The PlayStation was mostly well received upon release.
Critics in the West generally welcomed the new console. The staff of Next Generation reviewed the PlayStation a few weeks after its North American launch, commenting that, while the CPU is "fairly average", the supplementary custom hardware, such as the GPU and sound processor, is stunningly powerful. They praised the PlayStation's focus on 3D, and complimented the comfort of its controller and the convenience of its memory cards. Giving the system 4½ out of 5 stars, they concluded, "To succeed in this extremely cut-throat market, you need a combination of great hardware, great games, and great marketing. Whether by skill, luck, or just deep pockets, Sony has scored three out of three in the first salvo of this war." Albert Kim of Entertainment Weekly praised the PlayStation as a technological marvel rivalling the offerings of Sega and Nintendo. Famicom Tsūshin scored the console 19 out of 40 in May 1995, lower than the Saturn's 24 out of 40. In a 1997 year-end review, a team of five Electronic Gaming Monthly editors gave the PlayStation scores of 9.5, 8.5, 9.0, 9.0 and 9.5, the highest score each editor gave to any of the five consoles reviewed in the issue. They lauded the breadth and quality of the games library, saying it had vastly improved over previous years as developers mastered the system's capabilities and Sony revised its stance on 2D and role-playing games. They also complimented the low price of its games compared to the Nintendo 64's, and noted that it was the only console on the market that could be relied upon to deliver a solid stream of games in the coming year, primarily because third-party developers almost unanimously favoured it over its competitors.

Legacy

SCE was an upstart in the video game industry in late 1994, as the market in the early 1990s was dominated by Nintendo and Sega.
Nintendo had been the clear leader in the industry since the introduction of the Nintendo Entertainment System in 1985, and the Nintendo 64 was initially expected to maintain this position. The PlayStation's target audience included the first generation to grow up with mainstream video games, along with 18- to 29-year-olds who were not Nintendo's primary focus. By the late 1990s, Sony had become a highly regarded console brand on the strength of the PlayStation, with a significant lead over second-place Nintendo, while Sega was relegated to a distant third. The PlayStation became the first "computer entertainment platform" to ship over 100 million units worldwide, with many critics attributing the console's success to third-party developers. As of 2025, it remains the sixth best-selling console of all time, with a total of 102.49 million units sold. Around 7,900 individual games were published for the console during its 11-year lifespan, the second-most ever produced for a console. Its success was a significant financial boon for Sony, with the video game division coming to contribute 23% of the company's profits. Sony's next-generation PlayStation 2, which is backward compatible with the PlayStation's DualShock controller and games, was announced in 1999 and launched in 2000. The PlayStation's lead in installed base and developer support paved the way for the success of its successor, which overcame the earlier launch of Sega's Dreamcast and then fended off competition from Microsoft's newcomer Xbox and Nintendo's GameCube. The PlayStation 2's immense success and the failure of the Dreamcast were among the main factors that led Sega to abandon the console market. To date, five PlayStation home consoles have been released, continuing the same numbering scheme, as well as two portable systems. The PlayStation 3 also maintained backward compatibility with original PlayStation discs.
Hundreds of PlayStation games have been digitally re-released on the PlayStation Portable, PlayStation 3, PlayStation Vita, PlayStation 4 and PlayStation 5. The PlayStation has often ranked among the best video game consoles. In 2018, Retro Gamer named it the third-best console, crediting its sophisticated 3D capabilities as a key factor in its mass success and lauding it as a "game-changer in every sense possible". In 2009, IGN ranked the PlayStation the seventh-best console in its list, noting its appeal to older audiences as a crucial factor in propelling the video game industry, as well as its role in transitioning the game industry to the CD-ROM format. Keith Stuart of The Guardian likewise named it the seventh-best console in 2020, declaring that its success was so profound it "ruled the 1990s". In January 2025, Lorentio Brodesco announced the nsOne project, an attempt to reverse-engineer the PlayStation's motherboard; Brodesco stated that "detailed documentation on the original motherboard was either incomplete or entirely unavailable". The project was successfully crowdfunded via Kickstarter. In June, Brodesco manufactured the first working motherboard, promising a fully routed version with multilayer routing, as well as documentation and design files, in the near future. The success of the PlayStation contributed to the demise of cartridge-based home consoles. While not the first system to use an optical disc format, it was the first highly successful one, and it ended up going head-to-head with the cartridge-based Nintendo 64,[d] which the industry had expected to use CDs like the PlayStation. After the demise of the Sega Saturn, Nintendo was left as Sony's main competitor in Western markets.
Nintendo chose not to use CDs for the Nintendo 64, likely out of concern for the proprietary cartridge format's ability to help enforce copy protection, given the company's substantial reliance on licensing and exclusive games for its revenue. Besides their larger capacity, CD-ROMs could be produced in bulk at a much faster rate than ROM cartridges: a week, compared to two to three months. The cost of production per unit was also far lower, allowing Sony to offer games at about 40% lower cost to the user than ROM cartridges while still making the same net revenue. In Japan, Sony published smaller runs of a wide variety of games for the PlayStation as a risk-limiting step, a model Sony Music had used for audio CDs. The production flexibility of CD-ROMs meant that Sony could quickly produce larger volumes of popular games to get them onto the market, something that could not be done with cartridges because of their manufacturing lead time. The lower production cost of CD-ROMs also gave publishers an additional source of profit: budget-priced reissues of games that had already recouped their development costs. Tokunaka remarked in 1996: "Choosing CD-ROM is one of the most important decisions that we made. As I'm sure you understand, PlayStation could just as easily have worked with masked ROM [cartridges]. The 3D engine and everything—the whole PlayStation format—is independent of the media. But for various reasons (including the economies for the consumer, the ease of the manufacturing, inventory control for the trade, and also the software publishers) we deduced that CD-ROM would be the best media for PlayStation." The increasing complexity of developing games pushed cartridges to their storage limits and gradually discouraged some third-party developers; part of the CD format's appeal to publishers was that discs could be produced at significantly lower cost and offered more flexibility to meet demand.
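The pricing claim above (about 40% lower cost to the user while making the same net revenue) can be illustrated with arithmetic. Every figure in the sketch below is a hypothetical assumption chosen for illustration; none of them appear in the article.

```python
# All figures are illustrative assumptions, not sourced data.
CART_RETAIL = 60.0     # hypothetical cartridge retail price ($)
CART_UNIT_COST = 25.0  # hypothetical per-unit cartridge manufacturing cost
CD_UNIT_COST = 1.0     # hypothetical per-unit CD pressing cost

# A CD game sold at roughly 40% below the cartridge price...
cd_retail = CART_RETAIL * 0.6          # $36.00

# ...can still yield the same net revenue per unit, because the far
# cheaper medium absorbs the entire price cut.
cart_net = CART_RETAIL - CART_UNIT_COST   # $35.00 per cartridge
cd_net = cd_retail - CD_UNIT_COST         # $35.00 per disc
print(cart_net, cd_net)  # 35.0 35.0
```

The point of the sketch is structural rather than numerical: when the medium's unit cost is near zero, a large retail price cut need not reduce the publisher's per-unit take.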
As a result, some third-party developers switched to the PlayStation, including Square and Enix, whose Final Fantasy VII and Dragon Quest VII respectively had been planned for the Nintendo 64 (the two companies later merged to form Square Enix). Other developers released fewer games for the Nintendo 64: Konami, for example, released only thirteen N64 games but over fifty for the PlayStation. Nintendo 64 game releases were less frequent than the PlayStation's, with many being developed either by Nintendo itself or by second parties such as Rare. The PlayStation Classic is a dedicated video game console made by Sony Interactive Entertainment that emulates PlayStation games. It was announced in September 2018 at the Tokyo Game Show and released on 3 December 2018, the 24th anniversary of the original console's release. As a dedicated console, the PlayStation Classic features 20 pre-installed games, which run on the open-source emulator PCSX. The console is bundled with two replica wired PlayStation controllers (the version without analogue sticks), an HDMI cable and a USB Type-A cable. Internally, the console uses a MediaTek MT8167a system on a chip with four A35 central processing cores clocked at 1.5 GHz and a PowerVR GE8300 graphics processing unit, together with 16 GB of eMMC flash storage and 1 GB of DDR3 SDRAM. The PlayStation Classic is 45% smaller than the original console. It received negative reviews from critics and was compared unfavourably to Nintendo's rival Nintendo Entertainment System Classic Edition and Super Nintendo Entertainment System Classic Edition. Criticism was directed at its meagre game library, user interface, emulation quality, use of PAL versions for certain games, use of the original controller, and high retail price, though the console's design received praise. The console sold poorly.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Manhattan]
Manhattan

Manhattan[b] is the most densely populated and geographically smallest of the five boroughs of New York City. Coextensive with New York County, Manhattan is the smallest county by area in the U.S. state of New York, and one of the smallest in the United States. Located almost entirely on Manhattan Island near the southern tip of the state, Manhattan is centrally located in the Northeast megalopolis and represents the urban core of the New York metropolitan area. Manhattan serves as New York City's economic and administrative center and has been described as the cultural, financial, media, and entertainment capital of the world. Before European colonization, present-day Manhattan was part of Lenape territory. European settlement began with the establishment of a trading post by Dutch colonists in 1624 on Manhattan Island; the post was named New Amsterdam in 1626. The territory came under English control in 1664 and was renamed New York after King Charles II of England granted the lands to his brother, the Duke of York. New York, based in present-day Lower Manhattan, served as the capital of the United States from 1785 until 1790. The Statue of Liberty in New York Harbor greeted millions of arriving immigrants in the late 19th century and is a world symbol of the United States and its ideals. Manhattan became a borough during the consolidation of New York City in 1898, and houses New York City Hall, the seat of the city's government. Harlem in Upper Manhattan became the center of the Harlem Renaissance in the 1920s. The Stonewall Inn in Greenwich Village, part of the Stonewall National Monument, is considered the birthplace, in 1969, of the modern gay-rights movement, cementing Manhattan's central role in LGBTQ culture. Manhattan was the site of the original World Trade Center, which was destroyed during the September 11 terrorist attacks in 2001.
Situated on one of the world's largest natural harbors, the borough is bounded by the Hudson, East, and Harlem rivers and includes several small adjacent islands, including Roosevelt, U Thant, and Randalls and Wards Islands. It also includes the small neighborhood of Marble Hill, now on the U.S. mainland. Manhattan Island is divided into three informally bounded components, each cutting across the borough's long axis: Lower Manhattan, Midtown, and Upper Manhattan. Manhattan is one of the most densely populated locations in the world, with a 2020 census population of 1,694,250 living in a land area of 22.66 square miles (58.69 km2), or 72,918 residents per square mile (28,154 residents/km2), and its residential property has the highest sale price per square foot in the United States. Manhattan is home to Wall Street as well as the world's two largest stock exchanges by total market capitalization, the New York Stock Exchange and Nasdaq. Many multinational media conglomerates are based in Manhattan, as are numerous colleges and universities, such as Columbia University, New York University, Rockefeller University, and the City University of New York. The headquarters of the United Nations is located in the Turtle Bay neighborhood of Midtown Manhattan. Manhattan hosts three of the world's ten most-visited tourist attractions: Times Square, Central Park, and Grand Central Terminal. New York Penn Station is the busiest transportation hub in the Western Hemisphere. Chinatown has the highest concentration of Chinese people in the Western Hemisphere. Fifth Avenue was ranked as the most expensive shopping street in the world before falling to second place in 2024. The borough hosts many prominent bridges, tunnels, and skyscrapers, including the Empire State Building, the Chrysler Building, and One World Trade Center. It is also home to the National Basketball Association's New York Knicks and the National Hockey League's New York Rangers.
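The two density figures quoted above are the same quantity in different units; a small sketch (illustrative only) converts one into the other:

```python
# Convert a population density from residents per square mile to
# residents per square kilometre (1 square mile = 2.589988 km2).
SQ_KM_PER_SQ_MI = 2.589988

def per_sq_mi_to_per_sq_km(density: float) -> float:
    """Return the equivalent density in residents per square kilometre."""
    return density / SQ_KM_PER_SQ_MI

# Manhattan's 2020 figure of 72,918 residents per square mile:
print(round(per_sq_mi_to_per_sq_km(72_918)))  # 28154
```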
History

Manhattan was historically part of the Lenapehoking territory inhabited by the Munsee, Lenape, and Wappinger tribes. There were several Lenape settlements in the area, including Sapohanikan, Nechtanc, and Konaande Kongh, which were interconnected by a series of trails. The primary trail on the island, which would later become Broadway, ran from what is now Inwood in the north to Battery Park in the south. The Lenape established various sites for fishing and planting throughout Manhattan. The name Manhattan most likely derives, by way of Dutch borrowing, from manaháhtaan in Munsee, the local Lenape language (where manah- means "gather", -aht- means "bow", and -aan is used to form verb stems), though other etymologies have been proposed. The Lenape word has been translated as "the place where we get bows" or "place for gathering the (wood to make) bows". According to a Munsee tradition recorded by Albert Seqaqkind Anthony in the 19th century, the island was named for a grove of hickory trees that was considered ideal for bowmaking. An alternate theory claims a "Delaware source akin to Munsee munahan ("island")." In April 1524, the Florentine explorer Giovanni da Verrazzano, sailing in the service of Francis I of France, became the first documented European to visit the area that would become New York City. Verrazzano entered the tidal strait now known as The Narrows and named the land around Upper New York Harbor New Angoulême, in reference to the family name of King Francis I; he sailed far enough into the harbor to sight the Hudson River, and he named what is now Upper New York Bay the Bay of Santa Margarita, after Marguerite de Navarre, the elder sister of the king. Manhattan was first mapped during a 1609 voyage of Henry Hudson, who came across Manhattan Island and the native people living there and continued up the river that would later bear his name, the Hudson River.
Manhattan was first recorded in writing as Manna-hata, in the logbook of Robert Juet, an officer on the voyage. A permanent European presence in New Netherland began in 1624, with the founding of a Dutch fur trading settlement on Governors Island. In 1625, construction was started on the citadel of Fort Amsterdam on Manhattan Island, later called New Amsterdam (Nieuw Amsterdam), in what is now Lower Manhattan. The establishment of Fort Amsterdam is recognized as the birth of New York City. In 1647, Peter Stuyvesant was appointed as the last Dutch Director-General of the colony. New Amsterdam was formally incorporated as a city on February 2, 1653. In 1664, English forces conquered New Netherland and renamed it "New York" after the English Duke of York and Albany, the future King James II. In August 1673, the Dutch reconquered the colony, renaming it "New Orange", but permanently relinquished it back to England the following year under the terms of the Treaty of Westminster that ended the Third Anglo-Dutch War. Manhattan was at the heart of the New York Campaign, a series of major battles in the early stages of the American Revolutionary War. The Continental Army was forced to abandon Manhattan after the Battle of Fort Washington on November 16, 1776. The city, greatly damaged by the Great Fire of New York during the campaign, became the British military and political center of operations in North America for the remainder of the war. British occupation lasted until November 25, 1783, when George Washington returned to Manhattan, a day celebrated as Evacuation Day, marking when the last British forces left the city. From January 11, 1785, until 1789, New York City was the fifth of five capitals of the United States under the Articles of Confederation, with the Continental Congress meeting at New York City Hall (then at Fraunces Tavern). 
New York was the first capital under the newly enacted Constitution of the United States, from March 4, 1789, to August 12, 1790, at Federal Hall. Federal Hall was where the United States Supreme Court met for the first time, where the United States Bill of Rights was drafted and ratified, and where the Northwest Ordinance was adopted, establishing measures for the admission of new states to the Union. New York grew as an economic center, first as a result of the policies and practices of Alexander Hamilton, the first Secretary of the Treasury, which expanded the city's role as a center of commerce and industry. In 1810, New York City, then confined to Manhattan, surpassed Philadelphia as the most populous city in the United States. The Commissioners' Plan of 1811 laid out the island of Manhattan in its familiar grid plan. The city's role as an economic center grew with the opening of the Erie Canal in 1825, which cut transportation costs by 90% compared to road transport and connected the Atlantic port to the vast agricultural markets of the Midwestern United States and Canada. Tammany Hall, a Democratic Party political machine, began to grow in influence with the support of many of the immigrant Irish, culminating in the election of the first Tammany mayor, Fernando Wood, in 1854. Covering 840 acres (340 ha) in the center of the island, Central Park, which opened its first portions to the public in 1858, became the first landscaped public park in an American city. New York City played a complex role in the American Civil War. The city had strong commercial ties to the South, but anger over conscription, resentment of Lincoln's war policies, and paranoia about free Blacks taking the jobs of poor immigrants culminated in the three-day New York Draft Riots of July 1863, among the worst incidents of civil disorder in American history.
The rate of immigration from Europe grew steeply after the Civil War, and Manhattan became the first stop for millions seeking a new life in the United States, a role acknowledged by the dedication of the Statue of Liberty in 1886. This immigration brought further social upheaval. In a city of tenements packed with poorly paid laborers from dozens of nations, the city became a hotbed of revolution (including anarchists and communists, among others), syndicalism, racketeering, and unionization. In 1883, the opening of the Brooklyn Bridge across the East River established a road connection to Brooklyn and the rest of Long Island. In 1898, New York City consolidated with three neighboring counties to form "the City of Greater New York", and Manhattan was established as one of the five boroughs of New York City. The Bronx remained part of New York County until 1914, when Bronx County was established. The construction of the New York City Subway, which opened in 1904, helped bind the new city together, as did the completion of the Williamsburg Bridge (1903) and Manhattan Bridge (1909) connecting to Brooklyn and the Queensboro Bridge (1909) connecting to Queens. In the 1920s, Manhattan saw the large-scale arrival of African Americans as part of the Great Migration from the southern United States, and the Harlem Renaissance, part of a larger boom time in the Prohibition era that included new skyscrapers competing for the skyline, with the Woolworth Building (1913), 40 Wall Street (1930), the Chrysler Building (1930), and the Empire State Building (1931) leapfrogging each other to take their place as the world's tallest building. The white share of Manhattan's population declined from 98.7% in 1900 to 58.3% by 1990. On March 25, 1911, the Triangle Shirtwaist Factory fire in Greenwich Village killed 146 garment workers, leading to overhauls of the city's fire department, building codes, and workplace safety regulations.
In 1912, about 20,000 workers, a quarter of them women, marched upon Washington Square Park to commemorate the fire. Many of the women wore fitted tucked-front blouses like those manufactured by the company, a clothing style that became the working woman's uniform and a symbol of women's liberation, reflecting the alliance of the labor and suffrage movements. Despite the Great Depression, some of the world's tallest skyscrapers were completed in Manhattan during the 1930s, including numerous Art Deco masterpieces that are still part of the city's skyline, most notably the Empire State Building, the Chrysler Building, and 30 Rockefeller Plaza. A postwar economic boom led to the development of huge housing developments targeted at returning veterans, the largest being Stuyvesant Town–Peter Cooper Village, which opened in 1947. The United Nations relocated to a new headquarters that was completed in 1952 along the East River. The Stonewall riots were a series of spontaneous, violent protests by members of the gay community against a police raid that took place in the early morning hours of June 28, 1969, at the Stonewall Inn in the Greenwich Village neighborhood of Lower Manhattan. They are widely considered to constitute the single most important event leading to the gay liberation movement and the modern fight for LGBT rights. In the 1970s, job losses due to industrial restructuring caused New York City, including Manhattan, to suffer from economic problems and rising crime rates. While a resurgence in the financial industry greatly improved the city's economic health in the 1980s, New York's crime rate continued to increase through the decade and into the beginning of the 1990s. The 1980s saw a rebirth of Wall Street, and Manhattan reclaimed its role as the world's financial center, with Wall Street employment doubling from 1977 to 1987. The 1980s also saw Manhattan at the heart of the AIDS crisis, with Greenwich Village at its epicenter. 
In the 1970s, Times Square and 42nd Street – with its sex shops, peep shows, and adult theaters, along with its sex trade, street crime, and public drug use – became emblematic of the city's decline, with a 1981 article in Rolling Stone magazine calling the stretch of West 42nd Street between 7th and 8th Avenues the "sleaziest block in America". By the late 1990s, led by efforts by the city and the Walt Disney Company, the area had been revived as a center of tourism to the point where it was described by The New York Times as "arguably the most sought-after 13 acres of commercial property in the world." By the 1990s, crime rates began to drop dramatically and the city once again became the destination of immigrants from around the world, joining with low interest rates and Wall Street bonus payments to fuel the growth of the real estate market. Important new sectors, such as Silicon Alley, emerged in the Flatiron District, cementing technology as a key component of Manhattan's economy. The 1993 World Trade Center bombing, described by the FBI as "something of a deadly dress rehearsal for 9/11", was a terrorist attack in which six people were killed when a van bomb filled with explosives was detonated in a parking lot below the North Tower of the World Trade Center complex. On September 11, 2001, the Twin Towers of the original World Trade Center were struck by hijacked aircraft and collapsed in the September 11 attacks launched by al-Qaeda terrorists. The collapse caused extensive damage to surrounding buildings and skyscrapers in Lower Manhattan, and resulted in the deaths of 2,606 of the 17,400 who had been in the buildings when the planes hit, in addition to those on the planes. Since 2001, most of Lower Manhattan has been restored, although there has been controversy surrounding the rebuilding. 
In 2014, the new One World Trade Center, at 1,776 feet (541 m) measured to the top of its spire, became the tallest building in the Western Hemisphere and is the world's seventh-tallest building (as of 2023). The Occupy Wall Street protests in Zuccotti Park in the Financial District of Lower Manhattan began on September 17, 2011, receiving global attention and spawning the Occupy movement against social and economic inequality worldwide. On October 29 and 30, 2012, Hurricane Sandy caused extensive destruction in the borough, ravaging portions of Lower Manhattan with record-high storm surge from New York Harbor, severe flooding, and high winds, causing power outages for hundreds of thousands of city residents and leading to gasoline shortages and disruption of mass transit systems. The storm and its profound impacts have prompted discussion of constructing seawalls and other coastal barriers around the shorelines of the borough and the metropolitan area to minimize the risk of destructive consequences from another such event in the future.

Geography

According to the United States Census Bureau, New York County has a total area of 33.6 square miles (87 km2), of which 22.8 square miles (59 km2) is land and 10.8 square miles (28 km2) (32%) is water. The northern segment of Upper Manhattan represents a geographic panhandle. Manhattan Island is 22.7 square miles (59 km2) in area, 13.4 miles (21.6 km) long, and 2.3 miles (3.7 km) wide at its widest point, near 14th Street. The borough consists primarily of Manhattan Island, along with the Marble Hill neighborhood and several small islands, including Randalls Island, Wards Island, and Roosevelt Island in the East River, and Governors Island and Liberty Island to the south in New York Harbor.
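As a quick, purely illustrative cross-check, the stated 32% water share follows directly from the county's land and water areas:

```python
# Cross-check of the Census Bureau area figures quoted above.
total_sq_mi = 33.6   # total area of New York County
water_sq_mi = 10.8   # water area

water_share_pct = water_sq_mi / total_sq_mi * 100
print(round(water_share_pct))  # 32
```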
Manhattan Island is loosely divided into Downtown (Lower Manhattan), Midtown (Midtown Manhattan), and Uptown (Upper Manhattan), with Fifth Avenue dividing Manhattan lengthwise into its East Side and West Side. Manhattan Island is bounded by the Hudson River to the west and the East River to the east. To the north, the Harlem River divides Manhattan Island from the Bronx and the mainland United States. Early in the 19th century, land reclamation was used to expand Lower Manhattan from the natural Hudson shoreline at Greenwich Street to West Street. When building the World Trade Center in 1968, 1.2 million cubic yards (920,000 m3) of material excavated from the site was used to expand the Manhattan shoreline across West Street, creating Battery Park City. Constructed on piers at a cost of $260 million, Little Island opened on the Hudson River in May 2021, connected to the western termini of 13th and 14th Streets by footbridges. Marble Hill was part of the northern tip of Manhattan Island, but the Harlem River Ship Canal, dug in 1895 to better connect the Harlem and Hudson rivers, separated it from the remainder of Manhattan. Before World War I, the section of the original Harlem River channel separating Marble Hill from the Bronx was filled in, and Marble Hill became part of the mainland. After a May 1984 court ruling that Marble Hill was simultaneously part of the Borough of Manhattan (not the Borough of the Bronx) and part of Bronx County (not New York County), the matter was definitively settled later that year when the New York Legislature overwhelmingly passed legislation declaring the neighborhood part of both New York County and the Borough of Manhattan. 
Three smaller islands lie within New York Harbor, and several other small islands lie in the East River. The bedrock underlying much of Manhattan consists of three rock formations: Inwood marble, Fordham gneiss, and Manhattan schist, and is well suited for the foundations of Manhattan's skyscrapers. It is part of the Manhattan Prong physiographic region.

Under the Köppen climate classification, New York City features both a humid subtropical climate (Cfa) and a humid continental climate (Dfa). It is the northernmost major city on the North American continent with a humid subtropical climate. The city averages 234 days with at least some sunshine annually. Winters are cold and damp, and prevailing wind patterns that blow offshore temper the moderating effects of the Atlantic Ocean. The Atlantic and the partial shielding from colder air by the Appalachians keep the city warmer in the winter than inland North American cities at similar or lesser latitudes. The daily mean temperature in January, the area's coldest month, is 32.6 °F (0.3 °C). Temperatures usually drop to 10 °F (−12 °C) several times per winter, yet reach 60 °F (16 °C) on several days in the coldest winter month. Spring and autumn are unpredictable and can range from chilly to warm, although they are usually mild with low humidity. Summers are typically warm to hot and humid, with a daily mean temperature of 76.5 °F (24.7 °C) in July. Nighttime conditions are often exacerbated by the urban heat island phenomenon, which causes heat absorbed during the day to be radiated back at night, raising temperatures by as much as 7 °F (4 °C) when winds are slow. Daytime temperatures exceed 90 °F (32 °C) on an average of 17 days each summer and in some years exceed 100 °F (38 °C). Extreme temperatures have ranged from −15 °F (−26 °C), recorded on February 9, 1934, up to 106 °F (41 °C), on July 9, 1936. Manhattan lies in USDA plant hardiness zone 7b (5 to 10 °F; −15 to −12.2 °C).
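The paired Fahrenheit/Celsius figures in this climate summary are straight unit conversions; a small sketch (illustrative only) reproduces them:

```python
def f_to_c(fahrenheit: float) -> float:
    """Convert a temperature from degrees Fahrenheit to degrees Celsius."""
    return (fahrenheit - 32) * 5 / 9

print(round(f_to_c(32.6), 1))  # 0.3   (January daily mean)
print(round(f_to_c(76.5), 1))  # 24.7  (July daily mean)
print(round(f_to_c(-15)))      # -26   (record low, 1934)
print(round(f_to_c(106)))      # 41    (record high, 1936)
```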
Manhattan receives 49.9 inches (1,270 mm) of precipitation annually, which is relatively evenly spread throughout the year. Average winter snowfall between 1981 and 2010 was 25.8 inches (66 cm); this varies considerably from year to year.

Manhattan's many neighborhoods are not named according to any particular convention, nor do they have official boundaries. Some names are geographical (the Upper East Side) or ethnically descriptive (Little Italy). Others are acronyms, such as TriBeCa (for "TRIangle BElow CAnal Street"), SoHo ("SOuth of HOuston"), NoLIta ("NOrth of Little ITAly"), and NoMad ("NOrth of MADison Square Park"). Harlem is a name from the Dutch colonial era, after Haarlem, a city in the Netherlands. Some neighborhoods have simple folkloric names, such as Hell's Kitchen, alongside a more official but lesser-used title (in this case, Clinton). Some neighborhoods, such as the mixed-use SoHo, are known for upscale shopping as well as residential use. Others, such as Greenwich Village, the Lower East Side, Alphabet City, and the East Village, have long been associated with the Bohemian subculture. Chelsea is one of several Manhattan neighborhoods with large gay populations and has become a center of both the international art industry and New York's nightlife. Chinatown has the highest concentration of people of Chinese descent outside of Asia. Koreatown is roughly centered on 32nd Street between Fifth and Sixth Avenues. Rose Hill features a growing number of Indian restaurants and spice shops along a stretch of Lexington Avenue between 25th and 30th Streets that has become known as Curry Hill. Washington Heights in Uptown Manhattan is home to the largest Dominican immigrant community in the United States. Harlem, also in Upper Manhattan, is the historical epicenter of African American culture. Since 2010, a Little Australia has emerged and is growing in Nolita, Lower Manhattan.
Manhattan has two central business districts, the Financial District at the southern tip of the island and Midtown Manhattan. The term uptown also refers to the northern part of Manhattan above 72nd Street and downtown to the southern portion below 14th Street, with Midtown covering the area in between, though definitions can be fluid. Fifth Avenue roughly bisects Manhattan Island and acts as the demarcation line for east/west designations; south of Waverly Place, where Fifth Avenue terminates, Broadway becomes the east/west demarcation line. In Manhattan, uptown means north and downtown means south. This usage differs from that of most American cities, where downtown refers to the central business district.

Demographics

As of the 2020 census, Manhattan's population had increased by 6.8% over the decade to 1,694,251, representing 19.2% of New York City's population of 8,804,190 and 8.4% of New York State's population of 20,201,249. The population density of New York County was 70,450.8 inhabitants per square mile (27,201.2/km2) in 2022, the highest population density of any county in the United States and higher than the density of any individual U.S. city. At the 2010 census, there were 1,585,873 people living in Manhattan, an increase of 3.2% from the 1,537,195 counted in the 2000 census. In 2010, the largest organized religious group in Manhattan was the Archdiocese of New York, with 323,325 Catholics worshiping at 109 parishes, followed by 64,000 Orthodox Jews with 77 congregations, an estimated 42,545 Muslims with 21 congregations, 42,502 non-denominational adherents with 54 congregations, 26,178 TEC Episcopalians with 46 congregations, 25,048 ABC-USA Baptists with 41 congregations, 24,536 Reform Jews with 10 congregations, 23,982 Mahayana Buddhists with 35 congregations, 10,503 PC-USA Presbyterians with 30 congregations, and 10,268 RCA Presbyterians with 10 congregations.
Altogether, religious congregations claimed 44.0% of the population as members, although members of historically African-American denominations were underrepresented due to incomplete information. In 2014, Manhattan had 703 religious organizations, the seventeenth-highest total among all U.S. counties. There is a large Buddhist temple in Manhattan at the foot of the Manhattan Bridge in Chinatown. As of 2015, 60.0% (927,650) of Manhattan residents aged five and older spoke only English at home, while 22.63% (350,112) spoke Spanish, 5.37% (83,013) Chinese, 2.21% (34,246) French, 0.85% (13,138) Korean, 0.72% (11,135) Russian, and 0.70% (10,766) Japanese. In total, 40.0% of Manhattan's population aged five and older spoke a language other than English at home.

Landmarks and architecture

Points of interest on Manhattan Island include the American Museum of Natural History; the Battery; Broadway and the Theater District; Bryant Park; Central Park; Chinatown; the Chrysler Building; The Cloisters; Columbia University; Curry Hill; the Empire State Building; the Flatiron Building; the Financial District (including the New York Stock Exchange Building, Wall Street, and the South Street Seaport); Grand Central Terminal; Greenwich Village (including New York University, Washington Square Arch, and Stonewall Inn); Harlem and Spanish Harlem; the High Line; Koreatown; Lincoln Center; Little Australia; Little Italy; Madison Square Garden; Museum Mile on Fifth Avenue (including the Metropolitan Museum of Art); New York Penn Station; Port Authority Bus Terminal; Rockefeller Center (including Radio City Music Hall); Times Square; and the World Trade Center (including the National September 11 Museum and One World Trade Center). There are also numerous iconic bridges across the rivers that connect to Manhattan Island, as well as a growing number of supertall skyscrapers.
The Statue of Liberty rests on Liberty Island, an exclave of Manhattan, and part of Ellis Island is also an exclave of Manhattan. The borough has many energy-efficient office buildings, such as the Hearst Tower, the rebuilt 7 World Trade Center, and the Bank of America Tower, the first skyscraper designed to attain a Platinum LEED Certification. The skyscraper, which has shaped Manhattan's distinctive skyline, has been closely associated with New York City's identity since the end of the 19th century. Structures such as the Equitable Building of 1915, which rises vertically forty stories from the sidewalk, prompted the passage of the 1916 Zoning Resolution, which required new buildings to incorporate setbacks withdrawing progressively at a defined angle from the street as they rose, in order to preserve a view of the sky at street level. Manhattan's skyline includes several buildings that are symbolic of New York, in particular the Chrysler Building and the Empire State Building, which sees about 4 million visitors a year. In 1961, the struggling Pennsylvania Railroad unveiled plans to tear down the old Penn Station and replace it with a new Madison Square Garden and office building complex. Organized protests aimed to preserve the McKim, Mead & White-designed structure, completed in 1910 and widely considered a masterpiece of the Beaux-Arts style and one of the architectural jewels of New York City. Despite these efforts, demolition of the structure began in October 1963. The loss of Penn Station led directly to the enactment in 1965 of a local law establishing the New York City Landmarks Preservation Commission, which is responsible for preserving the "city's historic, aesthetic, and cultural heritage". The historic preservation movement triggered by Penn Station's demise has been credited with the retention of some one million structures nationwide, including over 1,000 in New York City.
In 2017, a multibillion-dollar rebuilding plan was unveiled to restore the historic grandeur of Penn Station and upgrade the landmark's status as a critical transportation hub. The 700,000-square-foot (65,000 m2) Moynihan Train Hall, developed as a $1.6 billion renovation and expansion of Penn Station into the James A. Farley Building, the city's former main post office building, opened in January 2021.

Parkland covers a total of 2,659 acres (10.76 km2), accounting for 18.2% of the borough's land area; the 840-acre (3.4 km2) Central Park is the borough's largest park, comprising 31.6% of Manhattan's parkland. Designed by Frederick Law Olmsted and Calvert Vaux, the park is anchored by the 12-acre (4.9 ha) Great Lawn and offers extensive walking tracks, two ice-skating rinks, a wildlife sanctuary, and several lawns and sporting areas, as well as 21 playgrounds and a 6-mile (9.7 km) road from which automobile traffic has been banned since 2018. While much of the park looks natural, it is almost entirely landscaped; the construction of Central Park in the 1850s was one of the era's most massive public works projects, with some 20,000 workers moving 5 million cubic yards (3.8 million cubic meters) of material to shape the topography and create the English-style pastoral landscape that Olmsted and Vaux sought. The remainder of Manhattan's parkland includes 204 playgrounds, 251 Greenstreets, 371 basketball courts, and many other amenities. The next-largest park in Manhattan, the Hudson River Park, stretches 4.5 miles (7.2 km) along the Hudson River and comprises 550 acres (220 ha). The borough contains several other major parks as well.

Economy

Manhattan is the economic engine of New York City, with its 2.45 million workers drawn from the entire New York metropolitan area accounting for approximately half of all jobs in New York City. Manhattan's workforce is overwhelmingly focused on white-collar professions.
In 2010, Manhattan's daytime population swelled to 3.94 million, with commuters adding a net 1.48 million people, along with visitors, tourists, and commuting students. The commuter influx of 1.61 million workers coming into Manhattan was the largest of any county or city in the country. Anchored by Manhattan's financial institutions, New York City has been described as the financial capital of the world. Manhattan's most important economic sector lies in its role as the headquarters of the U.S. financial industry, metonymously known as Wall Street. Manhattan is home to the New York Stock Exchange (NYSE), at 11 Wall Street in Lower Manhattan, and the Nasdaq, now located at 4 Times Square in Midtown Manhattan, representing the world's largest and second-largest stock exchanges, respectively, when measured both by overall share trading value and by the total market capitalization of their listed companies in 2023. The NYSE American (formerly the American Stock Exchange, AMEX), the New York Board of Trade, and the New York Mercantile Exchange (NYMEX) are also located downtown. New York City is home to the most corporate headquarters of any city in the United States, the overwhelming majority based in Manhattan. Manhattan had more than 520 million square feet (48 million square meters) of office space in 2022, making it the largest office market in the United States, while Midtown Manhattan, with more than 400 million square feet (37 million square meters), is the largest central business district in the world. Lower Manhattan is the third-largest U.S. central business district (after the Chicago Loop). New York City's role as the top global center for the advertising industry is metonymously known as "Madison Avenue". Manhattan has driven New York's status as a top-tier global high-technology hub.
Silicon Alley, once a metonym for the metropolitan region's high-tech industries, is no longer a distinct moniker, as the city's tech environment has expanded dramatically in both location and scope. New York City's tech sphere now encompasses a broad array of applications involving artificial intelligence, the internet, new media, financial technology (fintech) and cryptocurrency, biotechnology, game design, and other fields within information technology, supported by the city's entrepreneurship ecosystem and venture capital investments. As of 2014, New York City's tech sector hosted 300,000 employees. In 2015, Silicon Alley generated over US$7.3 billion in venture capital investment, most of it based in Manhattan, with the remainder in Brooklyn, Queens, and elsewhere in the region. High-technology startups and employment are growing in Manhattan and across New York City, bolstered by the city's emergence as a global node of creativity, entrepreneurship, social tolerance, and environmental sustainability, as well as by New York's position as the leading Internet hub and telecommunications center in North America, including its proximity to several transatlantic fiber-optic trunk lines, the city's intellectual capital, and its extensive outdoor wireless connectivity. Verizon Communications, headquartered at 140 West Street in Lower Manhattan, was in the final stages in 2014 of completing a US$3 billion fiber-optic telecommunications upgrade throughout New York City. The biotechnology sector is also growing in Manhattan, based on the city's strength in academic scientific research and public and commercial financial support.
By mid-2014, Accelerator, a biotech investment firm, had raised more than US$30 million from investors, including Eli Lilly and Company, Pfizer, and Johnson & Johnson, for initial funding to create biotechnology startups at the Alexandria Center for Life Science, which encompasses more than 700,000 square feet (65,000 m2) on East 29th Street and promotes collaboration among scientists and entrepreneurs at the center and with nearby academic, medical, and research institutions. The New York City Economic Development Corporation's Early Stage Life Sciences Funding Initiative and venture capital partners, including Celgene, General Electric Ventures, and Eli Lilly, committed a minimum of US$100 million to help launch 15 to 20 ventures in life sciences and biotechnology. In 2011, Mayor Michael R. Bloomberg announced his choice of Cornell University and the Technion-Israel Institute of Technology to build a US$2 billion graduate school of applied sciences on Roosevelt Island, with the goal of transforming New York City into the world's premier technology capital.

Tourism is vital to Manhattan's economy, and the landmarks of Manhattan are the focus of New York City's tourists; a record 66.6 million visited the city in 2019, bringing in $47.4 billion in tourism revenue. Visitor numbers dropped by two-thirds in 2020 during the COVID-19 pandemic before climbing back to 63.3 million visitors in 2023. According to The Broadway League, shows on Broadway sold approximately US$1.54 billion worth of tickets in each of the 2022–2023 and 2023–2024 seasons, with attendance of approximately 12.3 million each.

Real estate is a major force driving Manhattan's economy.
Manhattan has perennially been home to some of the world's most valuable real estate. The Time Warner Center had the highest-listed market value in the city in 2006, at US$1.1 billion; it was surpassed in October 2014 by the Waldorf Astoria New York, which became the most expensive hotel ever sold after being purchased for US$1.95 billion by the China-based Anbang Insurance Group. When 450 Park Avenue was sold on July 2, 2007, for US$510 million, about US$1,589 per square foot (US$17,104/m²), it broke the barely month-old record for an American office building of US$1,476 per square foot (US$15,887/m²) set by the sale of 660 Madison Avenue. In 2014, Manhattan was home to six of the top ten ZIP codes in the United States by median housing price. In 2019, the most expensive home sale ever in the United States occurred in Manhattan, at a selling price of US$238 million for a 24,000-square-foot (2,200 m2) penthouse apartment overlooking Central Park. Central Park Tower, topped out at 1,550 feet (472 m) in 2019, is the world's tallest residential building, followed globally in height by 111 West 57th Street and 432 Park Avenue, both also located in Midtown Manhattan.

Manhattan has been described as the media capital of the world. A significant array of media outlets and their journalists report on international, American, business, entertainment, and New York metropolitan area–related matters from Manhattan. Manhattan is served by the major New York City daily news publications, including The New York Times, which has won the most Pulitzer Prizes for journalism and is considered the U.S. media's newspaper of record; the New York Daily News; and the New York Post, all headquartered in the borough. The nation's largest newspaper by circulation, The Wall Street Journal, is also based in Manhattan. Other daily newspapers include AM New York and The Villager.
The New York Amsterdam News, based in Harlem, is one of the leading Black-owned weekly newspapers in the United States. The Village Voice, historically the largest alternative newspaper in the United States, announced in 2017 that it would cease publication of its print edition and convert to a fully digital venture. The television industry developed in Manhattan and is a significant employer in the borough's economy. The four major American broadcast networks, ABC, CBS, NBC, and Fox, as well as Univision, are all headquartered in Manhattan, as are many cable channels, including CNN, MSNBC, MTV, Fox News, HBO, and Comedy Central. WLIB began broadcasts geared toward the African-American community in 1949 and became New York City's first Black-owned radio station in 1971. WQHT, also known as Hot 97, claims to be the premier hip-hop station in the United States. WNYC, broadcasting on both an AM and an FM signal, has the largest public radio audience in the nation and is the most-listened-to commercial or non-commercial radio station in Manhattan. WBAI, owned by the non-profit Pacifica Foundation, broadcasts eclectic music as well as political news, talk, and opinion from a left-leaning viewpoint. The Manhattan Neighborhood Network, founded in 1971, is the oldest public-access cable television channel in the United States; it offers eclectic local programming ranging from a jazz hour to discussions of labor issues to foreign-language and religious programming. NY1, Charter Communications's local news channel, is known for its beat coverage of City Hall and state politics.

Education

Education in Manhattan is provided by a vast number of public and private institutions. Non-charter public schools in the borough are operated by the New York City Department of Education, the largest public school system in the United States. Charter schools include Success Academy Harlem 1 through 5, Success Academy Upper West, and Public Prep.
Several notable New York City public high schools are located in Manhattan, including A. Philip Randolph Campus High School, Beacon High School, Stuyvesant High School, Fiorello H. LaGuardia High School, High School of Fashion Industries, Eleanor Roosevelt High School, NYC Lab School, Manhattan Center for Science and Mathematics, Hunter College High School, and High School for Math, Science and Engineering at City College. Bard High School Early College, a hybrid school created by Bard College, serves students from around the city. Many private preparatory schools are also situated in Manhattan, including the Upper East Side's Brearley School, Dalton School, Browning School, Spence School, Chapin School, Nightingale-Bamford School, Convent of the Sacred Heart, Hewitt School, Saint David's School, Loyola School, and Regis High School. The Upper West Side is home to the Collegiate School and Trinity School. The borough is also home to Manhattan Country School, Trevor Day School, Xavier High School and the United Nations International School. Based on data from the 2011–2015 American Community Survey, 59.9% of Manhattan residents over age 25 have a bachelor's degree. As of 2005, about 60% of residents were college graduates and some 25% had earned advanced degrees, giving Manhattan one of the nation's densest concentrations of highly educated people. Manhattan has various colleges and universities, including Columbia University (and its affiliate Barnard College), Cooper Union, Marymount Manhattan College, New York Institute of Technology, New York University (NYU), The Juilliard School, Pace University, Berkeley College, The New School, Yeshiva University, and a campus of Fordham University. Other schools include Bank Street College of Education, Boricua College, Jewish Theological Seminary of America, Manhattan School of Music, Metropolitan College of New York, Parsons School of Design, School of Visual Arts, Touro College, and Union Theological Seminary. 
Several other private institutions maintain a Manhattan presence, among them Adelphi University, Mercy University, King's College, St. John's University, and Pratt Institute. Cornell Tech, part of Cornell University, is developing on Roosevelt Island. The City University of New York (CUNY), the municipal college system of New York City, is the largest urban university system in the United States, serving more than 226,000 degree students and a roughly equal number of adult, continuing and professional education students. A third of college graduates in New York City graduate from CUNY, with the institution enrolling about half of all college students in New York City. CUNY senior colleges located in Manhattan include: Baruch College, City College of New York, Hunter College, John Jay College of Criminal Justice and William E. Macaulay Honors College; graduate studies and doctorate-granting institutions are Craig Newmark Graduate School of Journalism at the City University of New York, CUNY Graduate Center, CUNY Graduate School of Public Health & Health Policy, CUNY School of Labor and Urban Studies and CUNY School of Professional Studies. The only CUNY community college located in Manhattan is the Borough of Manhattan Community College. The State University of New York is represented by the Fashion Institute of Technology, State University of New York State College of Optometry, and Stony Brook University – Manhattan. Manhattan is a world center for training and education in medicine and the life sciences. The city as a whole receives the second-highest amount of annual funding from the National Institutes of Health among all U.S. cities, the bulk of which goes to Manhattan's research institutions, including Memorial Sloan-Kettering Cancer Center, Rockefeller University, Mount Sinai School of Medicine, Columbia University College of Physicians and Surgeons, Weill Cornell Medical College, and New York University School of Medicine. 
Manhattan is served by the New York Public Library, which has the largest collection of any public library system in the country. The five units of the Central Library—the Mid-Manhattan Library, the 53rd Street Library, the New York Public Library for the Performing Arts, the Andrew Heiskell Braille and Talking Book Library, and the Science, Industry and Business Library—are all located in Manhattan. More than 35 other branch libraries are located in the borough.

Culture

Manhattan is the borough most closely associated with New York City by non-residents; residents of the New York metropolitan area, including those in New York City's other boroughs, often describe a trip to Manhattan as "going to the City". Poet Walt Whitman characterized the streets of Manhattan as being traversed by "hurrying, feverish, electric crowds". Manhattan has been the scene of many important global and American cultural movements. The Harlem Renaissance of the 1920s established the African-American literary canon in the United States and introduced writers Langston Hughes and Zora Neale Hurston. Manhattan's visual art scene in the 1950s and 1960s was a center of the pop art movement, which gave birth to such giants as Jasper Johns and Roy Lichtenstein. The downtown pop art movement of the late 1970s included artist Andy Warhol and clubs like Serendipity 3 and Studio 54, where he socialized. Broadway theater is considered the highest professional form of theater in the United States. Plays and musicals are staged in one of the 39 larger professional theaters with at least 500 seats, almost all in and around Times Square. Off-Broadway theaters feature productions in venues with 100–500 seats.
Lincoln Center for the Performing Arts, anchoring Lincoln Square on the Upper West Side of Manhattan, is home to 12 influential arts organizations, including the Metropolitan Opera, the New York Philharmonic, and the New York City Ballet, as well as the Vivian Beaumont Theater, the Juilliard School, Jazz at Lincoln Center, and Alice Tully Hall. Performance artists displaying diverse skills are ubiquitous on the streets of Manhattan. Manhattan is also home to some of the most extensive art collections in the world, both contemporary and classical, including those on Museum Mile on Fifth Avenue, such as the Metropolitan Museum of Art, the Frick Collection, the Frank Lloyd Wright–designed Guggenheim Museum, the Neue Galerie, and the Jewish Museum. Other major museums in Manhattan include the Whitney Museum of American Art and the Museum of Modern Art (MoMA). The Upper East Side has many art galleries, and the neighborhood of Chelsea is known for its more than 200 galleries exhibiting modern art from both emerging and established artists. Many of the world's most lucrative art auctions are held in Manhattan.

Manhattan is the epicenter of LGBTQ culture and the central node of the LGBTQ+ sociopolitical ecosystem. The borough is widely acclaimed as the cradle of the modern LGBTQ rights movement, which began with the 1969 Stonewall Riots. Brian Silverman, the author of Frommer's New York City from $90 a Day, wrote that the city has "one of the world's largest, loudest, and most powerful LGBT communities" and that "Gay and lesbian culture is as much a part of New York's basic identity as yellow cabs, high-rise buildings, and Broadway theatre". As the LGBT travel guide Queer in the World states, "The fabulosity of Gay New York is unrivaled on Earth, and queer culture seeps into every corner of its five boroughs".
Multiple gay villages have developed, spanning the length of the borough from the Lower East Side, the East Village, and Greenwich Village, through Chelsea and Hell's Kitchen, uptown to Morningside Heights. The annual NYC Pride March (or gay pride parade) proceeds southward down Fifth Avenue and ends at Greenwich Village; the Manhattan parade is the largest pride parade in the world, attracting tens of thousands of participants and millions of sidewalk spectators each June. Stonewall 50 – WorldPride NYC 2019, produced by Heritage of Pride in partnership with the I ❤ NY program's LGBT division, was the largest international Pride celebration in history, commemorating the 50th anniversary of the Stonewall uprising, with 150,000 participants and five million spectators attending in Manhattan.

The borough is represented in several prominent idioms. The phrase "New York minute" conveys an extremely short time, sometimes hyperbolically, as in "perhaps faster than you would believe is possible", referring to the rapid pace of life in Manhattan. The expression "melting pot" was first popularized by Israel Zangwill's 1908 play The Melting Pot, an adaptation of William Shakespeare's Romeo and Juliet set in New York City, as a description of the densely populated immigrant neighborhoods of the Lower East Side. The iconic Flatiron Building is said to be the source of the phrase "23 skidoo", meaning to scram, from what police would shout at men who tried to catch glimpses of women's dresses being blown up by the winds swirling around the triangular building. "The Big Apple" dates back to the 1920s, when a reporter heard the term used by New Orleans stable hands to refer to New York City's horse racetracks and named his racing column "Around the Big Apple". Jazz musicians adopted the term to refer to the city as the world's jazz capital, and a 1970s ad campaign by the New York Convention and Visitors Bureau helped popularize the term.
Manhattan is well known for its street parades, which celebrate a broad array of themes, including holidays, nationalities, human rights, and major league sports championships. The majority of New York City's higher-profile parades are held in Manhattan. The annual Macy's Thanksgiving Day Parade, the world's largest parade, begins alongside Central Park and proceeds southward to the flagship Macy's Herald Square store; it is telecast worldwide and draws millions of spectators in person. Other notable parades include the world's oldest St. Patrick's Day Parade, held annually in March since 1762; the Greenwich Village Halloween Parade in October; and numerous parades commemorating the independence days of many nations. Ticker-tape parades celebrating sporting championships as well as other national accomplishments march northward along the Canyon of Heroes on Broadway, from Bowling Green to City Hall Park in Lower Manhattan. New York Fashion Week, held at various locations in Manhattan, is a high-profile semiannual event featuring models displaying the latest wardrobes created by prominent fashion designers worldwide.

Sports

Manhattan is home to the NBA's New York Knicks and the NHL's New York Rangers, both of which play their home games at Madison Square Garden, the only major professional sports arena in the borough. The Garden was also home to the WNBA's New York Liberty through the 2017 season, but that team's primary home is now the Barclays Center in Brooklyn. The New York Jets proposed a West Side Stadium for their home field, but the proposal was defeated in June 2005, and they now play at MetLife Stadium in East Rutherford, New Jersey. Manhattan does not currently host a professional baseball franchise. The original New York Giants played primarily in the various incarnations of the Polo Grounds from their inception in 1883 until they headed to California with the Brooklyn Dodgers after the 1957 season.
The New York Yankees began their franchise as the Highlanders, named for Hilltop Park, where they played from their creation in 1903 until 1912. The team moved to the Polo Grounds with the 1913 season, where they were officially christened the New York Yankees, remaining there until they moved across the Harlem River in 1923 to Yankee Stadium. The New York Mets played in the Polo Grounds in 1962 and 1963, their first two seasons, before Shea Stadium was completed in 1964. After the Mets departed, the Polo Grounds was demolished in April 1964. The first national college-level basketball championship, the National Invitation Tournament, was held in New York in 1938 and remains in the city. The New York Knicks started play in 1946 as one of the National Basketball Association's original teams, playing their first home games at the 69th Regiment Armory, before making Madison Square Garden their permanent home. The New York Liberty of the WNBA shared the Garden with the Knicks from their creation in 1997 as one of the league's original eight teams through the 2017 season, after which the team moved nearly all of its home schedule to White Plains, New York. Rucker Park in Harlem is a playground court, famed for its streetball style of play, where many NBA athletes have played in the summer league. Although both of New York City's football teams play today in MetLife Stadium in East Rutherford, New Jersey, both teams started out playing in the Polo Grounds. The New York Giants played side-by-side with their baseball namesakes from the time they entered the National Football League in 1925, until crossing over to Yankee Stadium in 1956. The New York Jets, originally known as the Titans of New York, started out in 1960 at the Polo Grounds, before joining the Mets in Queens at Shea Stadium in 1964. The New York Rangers of the National Hockey League have played in the various locations of Madison Square Garden since the team's founding in the 1926–1927 season. 
The Rangers were predated by the New York Americans, who began play in the Garden the previous season and lasted until the team folded after the 1941–1942 NHL season, a season it played in the Garden as the Brooklyn Americans. The New York Cosmos of the North American Soccer League played their home games at Downing Stadium for two seasons, starting in 1974. The playing pitch and facilities at Downing Stadium were in unsatisfactory condition, however, and as the team's popularity grew, they too left for Yankee Stadium, and then Giants Stadium. Downing Stadium was demolished in 2002 to make way for the $45 million, 4,754-seat Icahn Stadium.

Government

Since New York City's consolidation in 1898, Manhattan has been governed by the New York City Charter; its 1989 revision provided for a strong mayor–council system. The centralized New York City government is responsible for public education, correctional institutions, libraries, public safety, recreational facilities, sanitation, water supply, and welfare services in Manhattan. The office of Borough President was created in the consolidation of 1898 to balance centralization with local authority. Each borough president had a powerful administrative role derived from having a vote on the New York City Board of Estimate, which was responsible for creating and approving the city's budget and proposals for land use. In 1989, the U.S. Supreme Court declared the Board of Estimate unconstitutional because Brooklyn, the most populous borough, had no greater effective representation on the Board than Staten Island, the least populous borough, a violation of the Equal Protection Clause. Since 1990, the largely powerless Borough President has acted as an advocate for the borough at the mayoral agencies, the City Council, the New York state government, and corporations. Manhattan's current Borough President is Brad Hoylman-Sigal, elected as a Democrat in November 2025.
Alvin Bragg, a Democrat, is the District Attorney of New York County. Manhattan has ten City Council members, the third-largest contingent among the five boroughs. It also has twelve administrative districts, each served by a local Community Board, a representative body that fields complaints and advocates for local residents. As the host of the United Nations, the borough is home to the world's largest international consular corps, comprising 105 consulates, consulates general, and honorary consulates. It is also home to New York City Hall, the seat of New York City government, housing the Mayor of New York City and the New York City Council. The mayor's staff and thirteen municipal agencies are located in the nearby Manhattan Municipal Building, completed in 1914 and one of the largest governmental buildings in the world.

The Democratic Party holds most public offices. Registered Republicans are a minority in the borough, constituting 9.88% of the electorate as of April 2016, and as of 2016 they exceeded 20% of the electorate only in the neighborhoods of the Upper East Side and the Financial District. Democrats accounted for 68.41% of registered voters, while 17.94% were unaffiliated. Manhattan is heavily urbanized and thus powerfully Democratic in federal elections. Over three-quarters of its vote has gone to Democratic presidential candidates in every election since 1988, and over 80% in every election since 2004. Manhattan has repeatedly held the record as the most Democratic county in New York at the federal level; this was the case in 2020 and 2024, for example. It voted solidly for Democrat Bill Clinton in both 1992 and 1996, defying national trends by giving negligible votes to third-party candidate Ross Perot.
Democrats continued to see large gains in Manhattan in the 21st century: Barack Obama's 2008 performance and Hillary Clinton's 2016 performance of over 86% were the best ever by Democratic presidential nominees, and Joe Biden was not far behind in 2020. In 2024, Republican Donald Trump received the highest percentage of the vote (and the largest number of raw votes) of any Republican since 1988, even though Kamala Harris still won 80% of the vote; this reflected similar rightward shifts across the state of New York and the nation as a whole in that election. As of 2023, three Democrats represented Manhattan in the United States House of Representatives. The United States Postal Service operates post offices in Manhattan; the James Farley Post Office in Midtown Manhattan is New York City's main post office. Both the United States District Court for the Southern District of New York and the United States Court of Appeals for the Second Circuit are located in Lower Manhattan's Foley Square, and the U.S. Attorney and other federal offices and agencies maintain locations in that area.

Starting in the mid-19th century, the United States became a magnet for immigrants seeking to escape poverty in their home countries. After arriving in New York, many new arrivals ended up living in squalor in the slums of the Five Points neighborhood, an area between Broadway and the Bowery, northeast of New York City Hall. By the 1820s, the area was home to many gambling dens and brothels and was known as a dangerous place to go. In 1842, Charles Dickens visited the area and was appalled at the horrendous living conditions he saw. The predominantly Irish Five Points Gang was one of the country's first major organized crime entities. As Italian immigration grew in the early 20th century, many immigrants joined ethnic gangs, including Al Capone, who got his start in crime with the Five Points Gang.
The Mafia (also known as Cosa Nostra) first developed in the mid-19th century in Sicily and spread to the U.S. East Coast during the late 19th century, following waves of Sicilian and Southern Italian emigration. Lucky Luciano established Cosa Nostra in Manhattan, forming alliances with other criminal enterprises, including the Jewish mob led by Meyer Lansky, the leading Jewish gangster of that period. From 1920 to 1933, Prohibition helped create a thriving black market in liquor, on which the Mafia was quick to capitalize. New York City as a whole experienced a sharp increase in crime during the post-war period. The murder rate in Manhattan hit an all-time high of 42 murders per 100,000 residents in 1979, and Manhattan retained the highest murder rate in the city until 1985, when it was surpassed by the Bronx. Most serious violent crime has historically been concentrated in Upper Manhattan and the Lower East Side, though robbery in particular was a major quality-of-life concern throughout the borough. Through the 1990s and 2000s, levels of violent crime in Manhattan plummeted to levels not seen since the 1950s, with murders in Manhattan dropping from 503 in 1990, at the citywide peak, to 78 in 2022, a decline of 84%. Today, crime rates in most of Lower Manhattan, Midtown, the Upper East Side, and the Upper West Side are consistent with those of other major U.S. city centers. However, despite significant reductions, crime rates remain high in the Upper Manhattan neighborhoods of East Harlem, Harlem, Washington Heights, and Inwood, and in New York City Housing Authority developments across the borough. After the start of the COVID-19 pandemic in March 2020, there was an increase in violent crime, particularly in Upper Manhattan. Mirroring a nationwide trend, rates of shootings and violent crimes in 2023 declined from their pandemic peaks.
Housing

The rise of immigration near the turn of the 20th century left major portions of Manhattan, especially the Lower East Side, densely packed with recent arrivals crammed into unhealthy and unsanitary housing. Tenements were usually five stories high, constructed on the then-typical 25-by-100-foot (7.6 by 30.5 m) lots, with "cockroach landlords" exploiting the new immigrants. By 1929, a new housing code had effectively ended construction of tenements, though some survive today on the East Side of the borough. Conversely, there were also areas with luxury apartment developments, the first of which was the Dakota on the Upper West Side.

Manhattan offers a wide array of private housing as well as public housing, which is administered by the New York City Housing Authority (NYCHA). Affordable rental and co-operative housing units throughout the borough were created under the Mitchell–Lama Housing Program. There were 928,714 housing units in 2023, at an average density of 40,988 units per square mile (15,826/km2). As of 2003, only 24.3% of Manhattan residents lived in owner-occupied housing, the second-lowest rate of all counties in the nation, after the Bronx. Public housing administered by NYCHA housed nearly 100,000 residents in more than 50,000 units in 2023. Completed in 1935, the First Houses in the East Village were one of the country's first publicly funded low-income housing projects. At an average rent of $2,024 in 2022, Manhattan has the highest rental cost of any county in the US, although this represents a lower percentage of annual income than in several other American cities. Manhattan's luxury housing market continues to be among the most expensive in the world, and Manhattan residential property continues to have the highest sale price per square foot in the United States.
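As a quick consistency check, the land area implied by the unit count and density figures above can be recovered by simple division (a minimal sketch; both inputs are taken directly from the paragraph above):

```python
# Back out the land area implied by the 2023 housing figures.
units = 928_714                # housing units in Manhattan, 2023
density_per_sq_mi = 40_988     # average units per square mile

area_sq_mi = units / density_per_sq_mi
print(round(area_sq_mi, 1))    # ≈ 22.7 sq mi, consistent with Manhattan's land area
```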
Manhattan's apartments cost $1,773 per square foot ($19,080/m2), compared to San Francisco housing at $1,185 per square foot ($12,760/m2), Boston housing at $751 per square foot ($8,080/m2), and Los Angeles housing at $451 per square foot ($4,850/m2). As of the fourth quarter of 2021, the median value of homes in Manhattan was $1,306,208, the second highest among US counties.

Infrastructure

Manhattan is unique in the U.S. for its intense use of public transportation and low rate of private car ownership. While 88% of Americans nationwide drive to their jobs and only 5% use public transport, mass transit is the dominant form of travel for residents of Manhattan: 72% of borough residents use public transport to get to work, while only 18% drive. According to the 2000 United States Census, 77.5% of Manhattan households do not own a car. Congestion pricing was implemented in Lower and Midtown Manhattan in January 2025 and applies to nearly all motor vehicle traffic in Manhattan south of 60th Street. The New York City Subway, the primary means of travel within the city, links every borough except Staten Island; 151 of the system's 472 stations are in Manhattan. A second rapid transit system, PATH, connects six stations in Manhattan to northern New Jersey. Commuter rail services operating to and from Manhattan are the Long Island Rail Road (LIRR), which connects Manhattan and other New York City boroughs to Long Island; the Metro-North Railroad, which connects Manhattan to Upstate New York and southwestern Connecticut; and NJ Transit trains, which run to various points in New Jersey. The commuter rail lines converge at New York Penn Station and Grand Central Terminal, on the west and east sides of Midtown Manhattan, respectively; they are the two busiest rail stations in the United States.
Amtrak provides inter-city passenger rail service from Penn Station to Boston, Philadelphia, Baltimore, and Washington, D.C.; Upstate New York and New England; cross-Canadian border service to Toronto and Montreal; and destinations in the Southern and Midwestern United States. The East Side Access project, which brings LIRR trains to Grand Central Terminal, opened in 2023. Four multi-billion-dollar projects were completed in the mid-2010s: the $1.4 billion Fulton Center in November 2014, the $2.4 billion 7 Subway Extension in September 2015, the $4 billion World Trade Center Transportation Hub in March 2016, and Phase 1 of the $4.5 billion Second Avenue Subway in January 2017. MTA New York City Transit operates local buses within Manhattan under the brand New York City Bus. An extensive network of express bus routes serves commuters and other travelers heading into Manhattan. The Roosevelt Island Tramway, one of two commuter cable car systems in North America, takes commuters between Roosevelt Island and Manhattan Island. The Staten Island Ferry carries over 21 million passengers annually on the 5.2-mile (8.4 km) run between Manhattan and Staten Island. The ferry has been fare-free since 1997. Manhattan is also served by NYC Ferry, which began operating in 2017. All of the system's routes serve Manhattan. The Port Authority Bus Terminal is the city's main intercity bus terminal and the world's busiest bus station. It serves 250,000 passengers on 7,000 buses each workday in a 1950 building designed to accommodate 60,000 daily passengers. A 2021 plan announced by the Port Authority would spend $10 billion to expand capacity and modernize the facility. In 2024, the Port Authority announced plans for a new terminal that would be completed by 2032 and include a pair of office buildings to defray the costs of the project. 
New York's iconic yellow taxicabs, which number 13,087 citywide and must have a medallion authorizing the pickup of street hails, are ubiquitous in the borough. Private vehicle for hire companies provide significant competition for taxicabs. According to the government of New York City, Manhattan had 19,676 bicycle commuters in 2017, roughly doubling from its total of 9,613 in 2012. The Commissioners' Plan of 1811 called for twelve numbered "avenues" running north and south roughly parallel to the Hudson River, each 100 feet (30 m) wide, with First Avenue on the east side and Twelfth Avenue on the west side. There are several intermittent avenues east of First Avenue, including four additional lettered avenues running from Avenue A eastward to Avenue D in an area now known as Alphabet City. The numbered streets in Manhattan run east–west, and are generally 60 feet (18 m) wide, with about 200 feet (61 m) between streets. The address algorithm of Manhattan is used to estimate the closest east–west cross street for building numbers on north–south avenues. According to the original Commissioner's Plan, there were 155 numbered crosstown streets, but later the grid was extended up to the northernmost corner of Manhattan Island, where the last numbered street is 220th Street, though the grid continues to 228th Street in the borough's Marble Hill neighborhood. Fifteen crosstown streets were designated as 100 feet (30 m) wide, including 34th, 42nd, 57th and 125th Streets, which became some of the borough's most significant transportation and shopping venues. Broadway, following the route of a Native American trail, is the most notable of many exceptions to the grid, starting at Bowling Green in Lower Manhattan and continuing north for 13 miles (21 km) into the Bronx. In much of Midtown Manhattan, Broadway runs at a diagonal to the grid, creating major named intersections at Union Square, Madison Square, Herald Square, Times Square, and Columbus Circle. 
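The address algorithm mentioned above follows a simple recipe: drop the last digit of the building number, halve what remains, and add an avenue-specific key number. A minimal sketch, with the caveat that the key numbers shown are illustrative values from commonly published tables, not an authoritative list, and that several avenues (Fifth Avenue, Broadway, and others) use different rules entirely:

```python
# Hedged sketch of Manhattan's cross-street address algorithm.
# Recipe: drop the building number's last digit, halve, add a key number.
# The key numbers below are illustrative assumptions, not a complete table.
KEY_NUMBERS = {
    "First Avenue": 3,
    "Third Avenue": 10,
    "Lexington Avenue": 22,
    "Madison Avenue": 26,
    "Eighth Avenue": 10,
}

def nearest_cross_street(building_number: int, avenue: str) -> int:
    """Estimate the numbered cross street nearest an avenue address."""
    return (building_number // 10) // 2 + KEY_NUMBERS[avenue]

# 500 Madison Avenue: 50 // 2 + 26 = 51, i.e. an estimate near 51st Street.
print(nearest_cross_street(500, "Madison Avenue"))
```

The result is only an estimate of the nearest cross street; real addresses can land a block or so away from the computed number.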
"Crosstown streets" refers primarily to major east–west streets connecting Manhattan's East Side and West Side. The trip is notoriously frustrating for drivers because of heavy congestion on narrow local streets; the absence of express roads other than the Trans-Manhattan Expressway at the far north end of Manhattan Island; and the very limited crosstown automobile travel permitted within Central Park. Proposals to build highways traversing the island through Manhattan's densest neighborhoods, namely the Mid-Manhattan Expressway across 34th Street and the Lower Manhattan Expressway through SoHo, failed in the 1960s. In New York City, all turns at red lights are illegal unless a sign permitting such maneuvers is present, significantly shaping traffic patterns in Manhattan. Another consequence of the strict grid plan of most of Manhattan, and the grid's skew of approximately 28.9 degrees, is a phenomenon sometimes referred to as Manhattanhenge (by analogy with Stonehenge). On May 28 and July 12, the sunset is aligned with the street grid lines, with the result that the sun is visible at or near the western horizon from street level. A similar phenomenon occurs with the sunrise on the eastern horizon on December 5 and January 8. The FDR Drive and Harlem River Drive, both designed by controversial New York master planner Robert Moses, comprise a single, long limited-access parkway skirting the east side of Manhattan along the East River and Harlem River south of Dyckman Street. The Henry Hudson Parkway is the corresponding parkway on the West Side north of 57th Street. Being primarily an island, Manhattan is linked to New York City's outer boroughs by bridges. Manhattan has fixed highway connections with New Jersey to its west by way of the George Washington Bridge, the Holland Tunnel, and the Lincoln Tunnel, and to three of the four other New York City boroughs—the Bronx to the northeast, and Brooklyn and Queens (both on Long Island) to the east and south.
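The Manhattanhenge alignment described above reduces to one line of arithmetic: because the cross streets are rotated about 28.9 degrees from true east–west, the setting sun lines up with them when its azimuth is roughly 270° plus the skew. A small sketch; the azimuth convention (degrees clockwise from true north) is a standard assumption, and the actual alignment dates also depend on the observer's horizon:

```python
# Estimate the solar azimuth at which Manhattanhenge sunsets occur.
# The grid is skewed ~28.9 degrees east of true north (per the text),
# so "grid west" sits the same angle north of true west (270 degrees).
GRID_SKEW_DEG = 28.9

def manhattanhenge_sunset_azimuth(skew_deg: float = GRID_SKEW_DEG) -> float:
    """Azimuth (degrees clockwise from true north) along the cross streets."""
    return 270.0 + skew_deg

print(f"{manhattanhenge_sunset_azimuth():.1f} degrees")  # 298.9 degrees
```

The sun's setting azimuth reaches this value twice a year as it swings between the solstices, which is why the phenomenon occurs on two dates in late May and mid-July.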
Its only direct connection with the fifth New York City borough, Staten Island, is the free Staten Island Ferry across New York Harbor, located near Battery Park at Manhattan's southern tip. It is also possible to travel on land to Staten Island by way of Brooklyn, via the Verrazzano–Narrows Bridge. The 14-lane George Washington Bridge, the world's busiest motor vehicle bridge, connects Washington Heights, in Upper Manhattan to Bergen County in New Jersey. There are numerous bridges to the Bronx across the Harlem River, and five (listed north to south)—the Triborough (known officially as the Robert F. Kennedy Bridge), Ed Koch Queensboro (also known as the 59th Street Bridge), Williamsburg, Manhattan, and Brooklyn Bridges—that cross the East River to connect Manhattan to Long Island. Several tunnels also link Manhattan Island to New York City's outer boroughs and New Jersey. The Lincoln Tunnel, which carries 120,000 vehicles a day under the Hudson River between New Jersey and Midtown Manhattan, is the busiest vehicular tunnel in the world. The tunnel was built instead of a bridge to allow unfettered passage of large passenger and cargo ships that sail through New York Harbor and up the Hudson River to Manhattan's piers. The Holland Tunnel, connecting Lower Manhattan to Jersey City, New Jersey, was the world's first mechanically ventilated vehicular tunnel. The Queens–Midtown Tunnel, built to relieve congestion on the bridges connecting Manhattan with Queens and Brooklyn, was the largest non-federal project in its time when it was completed in 1940; President Franklin D. Roosevelt was the first person to drive through it. The Brooklyn–Battery Tunnel runs underneath Battery Park and connects the Financial District at the southern tip of Manhattan to Red Hook in Brooklyn. Several ferry services operate between New Jersey and Manhattan. These ferries mainly serve midtown, Battery Park City, and Wall Street. 
Manhattan has three public heliports: the East 34th Street Heliport (also known as the Atlantic Metro-port), owned by New York City and run by the New York City Economic Development Corporation (NYCEDC); the Port Authority Downtown Manhattan/Wall Street Heliport, owned by the Port Authority of New York and New Jersey and run by the NYCEDC; and the West 30th Street Heliport, owned by the Hudson River Park Trust. Gas and electric service is provided by Consolidated Edison. Manhattan's natural gas supply doubled when a new gas pipeline opened on November 1, 2013. Con Edison operates the world's largest district steam system, which consists of 105 miles (169 km) of steam pipes, providing steam for heating, hot water, and air conditioning to some 1,800 Manhattan customers. Cable service is provided by Time Warner Cable and telephone service is provided by Verizon Communications, although AT&T is available as well. The New York City Department of Sanitation is responsible for garbage removal. The bulk of the city's trash is disposed of at mega-dumps in Pennsylvania, Virginia, South Carolina, and Ohio (via transfer stations in New Jersey, Brooklyn and Queens) since the 2001 closure of the Fresh Kills Landfill on Staten Island. A small amount of trash processed at transfer sites in New Jersey is sometimes incinerated at waste-to-energy facilities. New York City has the country's largest clean-air diesel-hybrid and compressed natural gas bus fleet, which also operates in Manhattan. It also has some of the first hybrid taxis, most of which operate in Manhattan. There are many hospitals in Manhattan, including two of the 25 largest in the United States (as of 2017). New York City is supplied with drinking water by the protected Catskill Mountains watershed.
As a result of the watershed's integrity and undisturbed natural water filtration system, New York is one of only four major cities in the US with a majority of drinking water pure enough not to require water treatment. In the Croton Watershed north of the city, a US$3.2 billion water purification plant is under construction to augment New York City's water supply by an estimated 290 million gallons daily, a greater than 20% addition to the city's water availability. Water comes to Manhattan through City Water Tunnels No. 1 and No. 2, and in the future will also arrive through Tunnel No. 3, on which construction began in 1970.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Thirty-seventh_government_of_Israel#cite_note-219]
Thirty-seventh government of Israel The thirty-seventh government of Israel is the current cabinet of Israel, formed on 29 December 2022, following the Knesset election the previous month. The coalition government currently consists of five parties — Likud, Shas, Otzma Yehudit, Religious Zionist Party and New Hope — and is led by Benjamin Netanyahu, who took office as the prime minister of Israel for the sixth time. The government is widely regarded as the most right-wing government in the country's history, and includes far-right politicians. Several of the government's policy proposals have led to controversies, both within Israel and abroad, with the government's attempts at reforming the judiciary leading to a wave of demonstrations across the country. Following the outbreak of the Gaza war, opposition leader Yair Lapid initiated discussions with Netanyahu on the formation of an emergency government. On 11 October 2023, National Unity MKs Benny Gantz, Gadi Eisenkot, Gideon Sa'ar, Hili Tropper, and Yifat Shasha-Biton joined the Security Cabinet of Israel to form an emergency national unity government. Their accession to the Security Cabinet and to the government (as ministers without portfolio) was approved by the Knesset the following day. Gantz, Netanyahu, and Defense Minister Yoav Gallant became part of the newly formed Israeli war cabinet, with Eisenkot and Ron Dermer serving as observers. National Unity left the government in June 2024. New Hope rejoined the government in September. Otzma Yehudit announced on 19 January 2025 that it had withdrawn from the government, which took effect on 21 January, following the cabinet's acceptance of the three-phase Gaza war ceasefire proposal, though it rejoined two months later. United Torah Judaism left the government in July 2025 over dissatisfaction with the government's draft conscription law. Shas left the government several days later, though it remains part of the coalition.
Background The right-wing bloc of parties, led by Benjamin Netanyahu, known in Israel as the national camp, won 64 of the 120 seats in the elections for the Knesset, while the coalition led by the incumbent prime minister Yair Lapid won 51 seats. The new majority has been variously described as the most right-wing government in Israeli history, as well as Israel's most religious government. Shortly after the elections, Lapid conceded to Netanyahu, and congratulated him, wishing him luck "for the sake of the Israeli people". On 15 November, the swearing-in ceremony for the newly elected members of the 25th Knesset was held during the opening session. The vote to appoint a new Speaker of the Knesset, which is usually conducted at the opening session, as well as the swearing in of cabinet members were postponed since ongoing coalition negotiations had not yet resulted in agreement on these positions. Government formation Yair Lapid Yesh Atid Benjamin Netanyahu Likud On 3 November 2022, Netanyahu told his aide Yariv Levin to begin informal coalition talks with allied parties, after 97% of the vote was counted. The leader of the Shas party Aryeh Deri met with Yitzhak Goldknopf, the leader of United Torah Judaism and its Agudat Yisrael faction, on 4 November. The two parties agreed to cooperate as members of the next government. The Degel HaTorah faction of United Torah Judaism stated on 5 November that it will maintain its ideological stance about not seeking any ministerial posts, as per the instruction of its spiritual leader Rabbi Gershon Edelstein, but will seek other senior posts like Knesset committee chairmen and deputy ministers. Netanyahu himself started holding talks on 6 November. He first met with Moshe Gafni, the leader of Degel HaTorah, and then with Goldknopf. Meanwhile, the Religious Zionist Party leader Bezalel Smotrich and the leader of its Otzma Yehudit faction Itamar Ben-Gvir pledged that they would not enter the coalition without the other faction. 
Gafni later met with Smotrich for coalition talks. Smotrich then met with Netanyahu. On 7 November, Netanyahu met with Ben-Gvir who demanded the Ministry of Public Security with expanded powers for himself and the Ministry of Education or Transport and Road Safety for Yitzhak Wasserlauf. A major demand among all of Netanyahu's allies was that the Knesset be allowed to ignore the rulings of the Supreme Court. Netanyahu met with the Noam faction leader and its sole MK Avi Maoz on 8 November after he threatened to boycott the coalition. He demanded complete control of the Western Wall by the Haredi rabbinate and removal of what he considered as anti-Zionist and anti-Jewish content in schoolbooks. President Isaac Herzog began consultations with heads of all the political parties on 9 November after the election results were certified. During the consultations, he expressed his reservations about Ben-Gvir becoming a member in the next government. Shas met with Likud for coalition talks on 10 November. By 11 November, Netanyahu had secured recommendations from 64 MKs, which constituted a majority. He was given the mandate to form the thirty-seventh government of Israel by President Herzog on 13 November. Otzma Yehudit and Noam officially split from Religious Zionism on 20 November as per a pre-election agreement. On 25 November, Otzma Yehudit and Likud signed a coalition agreement, under which Ben-Gvir will assume the newly created position of National Security Minister, whose powers would be more expansive than that of the Minister of Public Security, including overseeing the Israel Police and the Israel Border Police in the West Bank, as well as giving powers to authorities to shoot thieves stealing from military bases. Yitzhak Wasserlauf was given the Ministry for the Development of the Negev and the Galilee with expanded powers to regulate new West Bank settlements, while separating it from the "Periphery" portfolio, which will be given to Shas. 
The deal also includes giving the Ministry of Heritage to Amihai Eliyahu, separating it from the "Jerusalem Affairs" portfolio, the chairmanship of the Knesset's Public Security Committee to Zvika Fogel and that of the Special Committee for the Israeli Citizens' Fund to Limor Son Har-Melech, the post of Deputy Economic Minister to Almog Cohen, establishment of a national guard, and expansion of mobilization of reservists in the Border Police. Netanyahu and Maoz signed a coalition agreement on 27 November, under which the latter would become a deputy minister, would head an agency on Jewish identity in the Prime Minister's Office, and would also head Nativ, which processes the aliyah from the former Soviet Union. The agency for Jewish identity would have authority over educational content taught outside the regular curriculum in schools, in addition to the department of the Ministry of Education overseeing external teaching and partnerships, which would bring nonofficial organisations permitted to teach and lecture at schools under its purview. Likud signed a coalition agreement with the Religious Zionist Party on 1 December. Under the deal, Smotrich would serve as the Minister of Finance in rotation with Aryeh Deri, and the party would receive the post of a minister within the Ministry of Defense with control over the departments administering settlement and open lands under the Coordinator of Government Activities in the Territories, in addition to another post of a deputy minister. The deal also includes giving the post of Minister of Aliyah and Integration to Ofir Sofer, the newly created National Missions Ministry to Orit Strook, and the chairmanship of the Knesset's Constitution, Law and Justice Committee to Simcha Rothman. Likud and United Torah Judaism signed a coalition agreement on 6 December, in time to allow a request for an extension of the deadline.
Under it, the party would receive the Ministry of Construction and Housing, the chairmanship of the Knesset Finance Committee, which will be given to Moshe Gafni, and the Ministry of Jerusalem and Tradition (which would replace the Ministry of Jerusalem Affairs and Heritage), in addition to several posts of deputy ministers and chairmanships of Knesset committees. Likud also signed a deal with Shas by 8 December, securing interim coalition agreements with all of their allies. Under the deal, Deri will first serve as the Minister of Interior and Health, before rotating posts with Smotrich after two years. The party will also receive the Ministries of Religious Services and Welfare, as well as posts of deputy ministers in the Ministries of Education and Interior. The vote to replace then-incumbent Knesset speaker Mickey Levy was scheduled for 13 December, after Likud and its allies secured the necessary number of signatures for it. Yariv Levin of Likud was elected as an interim speaker by 64 votes, while his opponents Merav Ben-Ari of Yesh Atid and Ayman Odeh of Hadash received 45 and five votes respectively. Netanyahu asked Herzog for a 14-day extension after the agreement with Shas to finalise the roles his allied parties would play. On 9 December, Herzog extended the deadline to 21 December. On that date, Netanyahu informed Herzog that he had succeeded in forming a coalition, with the new government expected to be sworn in by 2 January 2023. The government was sworn in on 29 December 2022. Timeline Israeli law stated that people convicted of crimes cannot serve in the government. An amendment to that law was made in late 2022, known colloquially as the Deri Law, to allow those who had been convicted without prison time to serve. This allowed Deri to be appointed to the cabinet. Shas leader Aryeh Deri was appointed to be Minister of Health, Minister of the Interior, and Vice Prime Minister in December 2022.
He was fired in January 2023, following a Supreme Court decision that his appointment was unreasonable, since he had been convicted of fraud, and had promised not to seek government roles through a plea deal. In March 2023, Defence Minister Yoav Gallant called on the government to delay legislation related to the judicial reform. Prime Minister Netanyahu announced that Gallant had been dismissed from his position, leading to the continuation of mass protests across the country (which had started in January in Tel Aviv). Gallant continued to serve as a minister as he had not received formal notice of dismissal, and two weeks later it was announced that Netanyahu had reversed his decision. Public Safety Minister Itamar Ben-Gvir (Otzma Yehudit leader) and Minister of Justice Yariv Levin (Likud) both threatened to resign if the judicial reform was delayed. After the outbreak of the Gaza war, five members of the National Unity party joined the government as ministers without portfolio, with leader Benny Gantz being made a member of the new Israeli war cabinet (along with Netanyahu and Gallant). As the war progressed, Minister of National Security Itamar Ben-Gvir threatened to leave the government if the war was ended. A month later, in mid-December, he again threatened to leave if the war did not maintain "full strength". Gideon Sa'ar stated on 16 March that his New Hope party would resign from the government and join the opposition if Prime Minister Benjamin Netanyahu did not appoint him to the Israeli war cabinet. Netanyahu did not do so, resulting in Sa'ar's New Hope party leaving the government nine days later, reducing the size of the coalition from 76 MKs to 72. Ben-Gvir and Bezalel Smotrich, of the National Religious Party–Religious Zionism party, indicated that they would withdraw their parties from the government if the January 2025 Gaza war ceasefire was adopted, which would have brought down the government.
Ben-Gvir announced on 5 June that the members of his party would be allowed to vote as they wish, though his party resumed support on 9 June. On 18 May, Gantz set an 8 June deadline for withdrawal from the coalition, which was delayed by a day following the 2024 Nuseirat rescue operation. Gantz and his party left the government on 9 June, giving the government 64 seats in the Knesset. Sa'ar and his New Hope party rejoined the Netanyahu government on 30 September, increasing the number of seats held by the government to 68. The High Court of Justice ruled on 28 March 2024 that yeshiva funds would no longer be available for students who are "eligible for enlistment", effectively allowing ultra-Orthodox Jews to be drafted into the IDF. Attorney general Gali Baharav-Miara indicated on 31 March that the conscription process must begin on 1 April. The court ruled on 25 June that the IDF must begin to draft yeshiva students. Likud announced on 7 July that it would not put forward any legislation after Shas and United Torah Judaism said that they would boycott the plenary session over the lack of legislation dealing with the Haredi draft. The Ultra-Orthodox boycott continued for a second day, with UTJ briefly ending its boycott on 9 July to unsuccessfully vote in favor of a bill which would have weakened the Law of Return. Yuli Edelstein, who was replaced by Boaz Bismuth on the Foreign Affairs and Defense Committee in early August, published a draft version of the conscription law shortly before his ouster. Bismuth cancelled the work on the draft law in September 2025, which Edelstein called "a shame." Bismuth released the official version of the draft law in late November 2025. It weakened penalties for draft evaders, with Edelstein saying it was "the exact opposite" of the bill which he attempted to pass. Members of Otzma Yehudit resigned from the government on 19 January 2025 over the January 2025 Gaza war ceasefire, which took effect on 21 January. 
The members rejoined in March, following the "resumption" of the war in Gaza. Avi Maoz of the Noam party left the government in March 2025. On 4 June 2025, senior rabbis for United Torah Judaism, Dov Lando and Moshe Hillel Hirsch, instructed the party's MKs to pass a bill which would dissolve the Knesset. Yesh Atid, Yisrael Beytenu and The Democrats announced that they would "submit a bill" for dissolution on 11 June, with Yesh Atid tabling the bill on 4 June. There were also reports that Shas would vote in favor of Knesset dissolution amidst division within the governing coalition on Haredi conscription. This jeopardized the coalition's majority and would have triggered new elections if the bill passed. The following day, Agudat Yisrael, one of the United Torah Judaism factions, confirmed that it would submit a bill to dissolve the Knesset. Asher Medina, a Shas spokesman, indicated on 9 June that the party would vote in favor of a preliminary bill to dissolve the Knesset. The rabbis of Degel HaTorah instructed the party's MKs on 12 June 2025 to oppose the dissolution of the Knesset, which was followed by Yuli Edelstein and the Shas and Degel HaTorah parties announcing that a deal had been reached, with "rabbinical leaders" telling their parties to delay the dissolution vote by a week. Shas and Degel HaTorah voted against the dissolution bill, which led to the bill failing its preliminary reading in a vote of 61 against and 53 in favor. MKs Ya'akov Tessler and Moshe Roth of Agudat Yisrael voted in favor of dissolution. Another dissolution bill cannot be brought forward for six months. If the bill had passed its preliminary reading, in addition to three more readings, an election would have been held in approximately three months; The Jerusalem Post posited it would have been held in October.
Degel HaTorah announced on 14 July 2025 that it would leave the government because members of the party were dissatisfied after viewing the proposed draft bill by Yuli Edelstein regarding Haredi exemptions from the Israeli draft. Several hours later, Agudat Yisrael announced that it would also leave the government. Deputy Transportation Minister Uri Maklev, Moshe Gafni, the head of the Knesset Finance Committee, Ya'akov Asher, the head of the Knesset Interior and Environment Protection Committee, and Jerusalem Affairs minister Meir Porush all submitted their resignations, with their resignations taking effect in 48 hours. Sports Minister Ya'akov Tessler and "Special Committee for Public Petitions Chair" Yitzhak Pindrus also submitted resignations. Yisrael Eichler submitted his resignation as the "head of the Knesset Labor and Welfare Committee" the same day. The resignations left Netanyahu's government with 60 seats in the Knesset, as Avi Maoz, of the Noam party, had already left the government in March 2025. Despite Edelstein's ouster in August, a spokesman for UTJ head Yitzhak Goldknopf remarked that it would not change the faction's withdrawal from the government. The religious council for Shas, called the Moetzet Chachmei HaTorah, instructed the party on 16 July to leave the government, but stay in the coalition. The following day, various cabinet ministers submitted their resignations, including "Interior Minister Moshe Arbel, Social Affairs Minister Ya'akov Margi and Religious Services Minister Michael Malchieli." Malchieli reportedly postponed his resignation so that he could attend a 20 July meeting of the panel investigating whether attorney general Gali Baharav-Miara should be dismissed. Deputy Minister of Agriculture Moshe Abutbul, Minister of Health Uriel Buso and Haim Biton, a minister in the Education Ministry, also submitted their resignation letters, while Arbel retracted his resignation letter.
The last cabinet member from the party to submit a resignation was Labor Minister Yoav Ben-Tzur. The ministers who resigned will return to the Knesset, replacing MKs Moshe Roth, Yitzhak Pindrus and Eliyahu Baruchi. Members of government Listed below are the current ministers in the government: Principles and priorities According to the agreements signed between Likud and each of its coalition partners, and the incoming government's published guideline principles, its stated priorities are to combat the cost of living, further centralize Orthodox control over the state religious services, pass judicial reforms which include legislation to reduce judicial controls on executive and legislative power, expand settlements in the West Bank, and consider an annexation of the West Bank. Before the vote of confidence in his new government in the Knesset, Netanyahu presented three top priorities for the new government: internal security and governance, halting the nuclear program of Iran, and the development of infrastructure, with a focus on further connecting the center of the country with its periphery. Policies The government's flagship program, centered around reforms in the judicial branch, drew widespread criticism. Critics said it would have negative effects on the separation of powers, the office of the Attorney General, the economy, public health, women and minorities, workers' rights, scientific research, the overall strength of Israel's democracy and its foreign relations. After weeks of public protests on Israel's streets, joined by a growing number of military reservists, Minister of Defense Yoav Gallant spoke against the reform on 25 March, calling for a halt of the legislative process "for the sake of Israel's security". The next day, Netanyahu announced that Gallant would be removed from his post, sparking another wave of protest across Israel and ultimately leading to Netanyahu agreeing to pause the legislation.
On 10 April, Netanyahu announced that Gallant would keep his post. On 27 March 2023, after the public protests and general strikes, Netanyahu announced a pause in the reform process to allow for dialogue with opposition parties. However, negotiations aimed at reaching a compromise collapsed in June, and the government resumed its plans to unilaterally pass parts of the legislation. On 24 July 2023, the Knesset passed a bill that curbs the power of the Supreme Court to declare government decisions unreasonable; on 1 January 2024, the Supreme Court struck the bill down. The Knesset passed a "watered-down" version of the judicial reform package in late March 2025 which "changes the composition" of the judicial selection committee. In December 2022 Minister of National Security Itamar Ben-Gvir sought to amend the law that regulates the operations of the Israel Police, such that the ministry will have more direct control of its forces and policies, including its investigative priorities. Attorney General Gali Baharav-Miara objected to the draft proposal, raising concerns that the law would enable the politicization of police work, and the draft was amended to partially address those concerns. Nevertheless, in March 2023 Deputy Attorney General Gil Limon stated that the Attorney General's fears had been realized, referring to several instances of ministerial involvement in the day-to-day work of the otherwise independent police force – statements that were repeated by the Attorney General herself two days later. Separately, Police Commissioner Kobi Shabtai instructed Deputy Commissioners to avoid direct communication with the minister, later stating that "the Israel Police will remain apolitical, and act only according to law". Following appeals by the Association for Civil Rights in Israel and the Movement for Quality Government in Israel, the High Court of Justice instructed Ben-Gvir "to refrain from giving operational directions to the police... 
[especially] as regards to protests and demonstrations against the government." As talks of halting the judicial reform gained wind during March 2023, Minister of National Security Itamar Ben-Gvir threatened to resign if the legislation implementing the changes was suspended. To appease Ben-Gvir, Prime Minister Netanyahu announced that the government would promote the creation of a new National Guard, to be headed by Ben-Gvir. On 29 March, thousands of Israelis demonstrated in Tel Aviv, Haifa and Jerusalem against this decision. On 1 April, the New York Times quoted Gadeer Nicola, head of the Arab department at the Association for Civil Rights in Israel, as saying "If this thing passes, it will be an imminent danger to the rights of Arab citizens in this country. This will create two separate systems of applying the law. The regular police which will operate against Jewish citizens — and a militarized militia to deal only with Arab citizens." The same day, while speaking on Israel's Channel 13 about those whom he'd like to see enlist in the National Guard, Ben-Gvir specifically mentioned La Familia, the far-right fan club of the Beitar Jerusalem soccer team. On 2 April, Israel's cabinet approved the establishment of a law enforcement body that would operate independently of the police, under Ben-Gvir's authority. According to the decision, the Minister was to establish a committee chaired by the Director General of the Ministry of National Security, with representatives of the ministries of defense, justice and finance, as well as the police and the IDF, to outline the operations of the new organization. The committee's recommendations will be submitted to the government for consideration. Addressing a conference on 4 April, Police Commissioner Kobi Shabtai said that he is not opposed to the establishment of a security body which would answer to the police, but "a separate body? Absolutely not." 
The police chief said he had warned Ben-Gvir that the establishment of a security body separate from the police is "unnecessary, with extremely high costs that may harm citizens' personal security." During a press conference on 10 April, Prime Minister Netanyahu said, in what was seen by some news outlets as a concession to the protesters, that "This will not be anyone's militia, it will be a security body, orderly, professional, that will be subordinate to one of the [existing] security bodies." The committee established by the government recommended that the government order the immediate establishment of the National Guard while allocating budgets. The National Guard, to be commanded by a police superintendent, will not be subordinate to Ben-Gvir; it will be subordinate to the police commissioner and form part of the Israel Border Police. The Ministries of Defense and Finance opposed the conclusions, and the Israeli National Security Council called for further discussion.

The coalition's efforts to expand the purview of Rabbinical courts; force some organizations, such as hospitals, to enforce certain religious practices; amend the Law Prohibiting Discrimination to allow gender segregation and discrimination on the grounds of religious belief; expand funding for religious causes; and put into law the exemption of yeshiva and kolel students from conscription have drawn criticism. According to a Haaretz op-ed of 7 March 2023, "the current coalition is interested... in modifying the public space so it suits the religious lifestyle. The legal coup is meant to castrate anyone who can prevent it, most of all the HCJ." Several banks and institutional investors, including the Israel Discount Bank and AIG, have committed to avoid investing in, or providing credit to, any organization that discriminates against others on grounds of religion, race, gender or sexual orientation. 
A series of technology companies and investment firms, including Wiz, Intel Israel, Salesforce and Microsoft Israel Research and Development, have criticized the proposed changes to the Law Prohibiting Discrimination, with Wiz stating that it will require its suppliers to commit to preventing discrimination. Over sixty prominent law firms pledged that they will neither represent, nor do business with, discriminating individuals and organizations. Insight Partners, a major private equity fund operating in Israel, released a statement warning against intolerance and any attempt to harm personal liberties.

Orit Lahav, chief executive of the women's rights organization Mavoi Satum ("Dead End"), said that "the Rabbinical courts are the most discriminatory institution in the State of Israel... Limiting the HCJ while expanding the jurisdiction of the Rabbinical courts would... cause significant harm to women." Anat Thon Ashkenazy, Director of the Center for Democratic Values and Institutions at the Israel Democracy Institute, said that "almost every part of the reform could harm women... the meaning of an override clause is that even if the court says that the law on gender segregation is illegitimate, is harmful, the Knesset could say 'Okay, we say otherwise'". She added that "there is a very broad institutional framework here, after which there will come legislation that harms women's rights and we will have no way of protecting or stopping it." During July 2023, 20 professional medical associations signed a position paper warning against the ramifications to public health that would result from the exclusion of women from the public sphere. They cited, among other concerns, a rise in the prevalence of risk factors for cardiovascular disease, pregnancy-related ailments, psychological distress, and the risk of suicide. 
On 30 July the Knesset passed an amendment to penal law adding sexual offenses to those offenses whose penalty can be doubled if committed on grounds of "nationalistic terrorism, racism or hostility towards a certain community". According to MK Limor Son Har-Melech, the bill is meant to penalize any individual who "[intends to] harm a woman sexually based on her Jewishness". The law was criticized by MK Gilad Kariv as "populist, nationalistic, and dangerous towards the Arab citizens of Israel", and by MK Ahmad Tibi as a "race law", and was objected to by legal advisors at the Ministry of Justice and the Knesset Committee on National Security. Activist Orit Kamir wrote that "the amendment... is neither feminist, equal, nor progressive, but the opposite: it subordinates women's sexuality to the nationalistic, racist patriarchy. It hijacks the Law for Prevention of Sexual Harassment to serve a world view that tags women as sexual objects that personify the nation's honor." Yael Sherer, director of the Lobby to Combat Sexual Violence, criticized the law as being informed by dated ideas about sexual assault, and proposed that MKs "dedicate a session... to give victims of sexual assault an opportunity to come out of the darkness... instead of [submitting] declarative bills that change nothing and are not meant but for grabbing headlines".

In Israel, during 2022, 24 women "were murdered because they were women," an increase of 50% compared to 2021. A law permitting courts to order men subject to a restraining order following domestic violence offenses to wear electronic tags was drafted during the previous Knesset and had passed its first reading unanimously. On 22 March 2023, the Knesset voted to reject the bill. It had been urged to do so by National Security Minister Itamar Ben-Gvir, who said that the bill was unfair to men. Earlier in the week, Ben-Gvir had blocked the measure from advancing in the ministerial legislative committee. 
The MKs voting against the bill included Prime Minister Netanyahu. The Association of Families of Murder Victims said that by rejecting the law, National Security Minister Itamar Ben-Gvir "brings joy to violent men and abandons the women threatened with murder… unsupervised restraining orders endanger women's lives even more. They give women the illusion of being protected, and then they are murdered." MK Pnina Tamano-Shata, chairwoman of the Knesset Committee on the Status of Women and Gender Equality, said that "the coalition proved today that it despises women's lives." The NGO Amutat Bat Melech, which assists Orthodox and ultra-Orthodox women who suffer from domestic violence, said: "Rejecting the electronic bracelet bill is disconnected from the terrible reality of seven femicides since the beginning of the year. This is a first-rate, effective tool that could have saved lives and reduced the threat to women suffering from domestic violence. This is a matter of life and death, whose whole purpose is to provide a solution to defend women."

The agreement signed by the coalition parties includes the setting up of a committee to draft changes to the Law of Return. Israeli religious parties have long demanded that the "grandchild clause" of the Law of Return be cancelled. This clause grants citizenship to anyone with at least one Jewish grandparent, as long as they do not practice another religion. If the grandchild clause were removed from the Law of Return, around 3 million people who are currently eligible for aliyah would no longer be eligible. 
The heads of the Jewish Agency, the Jewish Federations of North America, the World Zionist Organization and Keren Hayesod sent a joint letter to Prime Minister Netanyahu, expressing their "deep concern" about any changes to the Law of Return, adding that "Any change in the delicate and sensitive status quo on issues such as the Law of Return or conversion could threaten to unravel the ties between us and keep us away from each other." The Executive Council of Australian Jewry and the Zionist Federation of Australia issued a joint statement saying "We… view with deep concern… proposals in relation to religious pluralism and the law of return that risk damaging Israel's… relationship with Diaspora Jewry."

On 19 March 2023, Israeli Finance Minister Bezalel Smotrich spoke in Paris at a memorial service for a Likud activist. The lectern at which Smotrich spoke was covered with a flag depicting the 'Greater Land of Israel,' encompassing the whole of Mandatory Palestine as well as Trans-Jordan. During his speech, Smotrich said that "there's no such thing as Palestinians because there's no such thing as a Palestinian people." He added that the Palestinian people are a fictitious nation invented only to fight the Zionist movement, asking "Is there a Palestinian history or culture? There isn't any." The event received widespread media coverage. On 21 March, a spokesman for the US State Department sharply criticized Smotrich's comments. "The comments, which were delivered at a podium adorned with an inaccurate and provocative map, are offensive, they are deeply concerning, and, candidly, they're dangerous. The Palestinians have a rich history and culture, and the United States greatly values our partnership with the Palestinian people," he said. 
The Jordanian Foreign Ministry also voiced disapproval: "The Israeli Minister of Finance's use, during his participation in an event held yesterday in Paris, of a map of Israel that includes the borders of the Hashemite Kingdom of Jordan and the occupied Palestinian territories represents a reckless inflammatory act, and a violation of international norms and the Jordanian-Israeli peace treaty." Additionally, a map encompassing Mandatory Palestine and Trans-Jordan with a Jordanian flag on it was placed on a central lectern in the Jordanian Parliament, and Jordan's parliament voted to expel the Israeli ambassador. Israel's Ministry of Foreign Affairs released a clarification, stating that "Israel is committed to the 1994 peace agreement with Jordan. There has been no change in the position of the State of Israel, which recognizes the territorial integrity of the Hashemite Kingdom of Jordan".

Ahead of a Europe Day event due to take place on 9 May 2023, far-right National Security Minister Itamar Ben-Gvir was assigned as a government representative and speaker at the event by the government secretariat, which assigns ministers to receptions marking the national days of foreign embassies. The European Union requested that Ben-Gvir not attend, but the government did not change the plan. On 8 May, the European delegation to Israel cancelled the reception, stating: "The EU Delegation to Israel is looking forward to celebrating Europe Day on May 9, as it does every year. Regrettably, this year we have decided to cancel the diplomatic reception, as we do not want to offer a platform to someone whose views contradict the values the European Union stands for. However, the Europe Day cultural event for the Israeli public will be maintained to celebrate with our friends and partners in Israel the strong and constructive bilateral relationship". 
Israel's Opposition Leader Yair Lapid stated: "Sending Itamar Ben-Gvir to a gathering of EU ambassadors is a serious professional mistake. The government is embarrassing a large group of friendly countries, jeopardizing future votes in international institutions, and damaging our foreign relations. Last year, after a decade of efforts, we succeeded in signing an economic-political agreement with the European Union that will contribute to the Israeli economy and our foreign relations. Why risk it, and for what? Ben-Gvir is not a legitimate person in the international community (and not really in Israel either), and sometimes you have to be both wise and just and simply send someone else".

On 23 February 2023, Defense Minister Gallant signed an agreement assigning governmental powers in the West Bank to a body to be headed by Minister Bezalel Smotrich, who would effectively become the governor of the West Bank, controlling almost all areas of life in the area, including planning, building and infrastructure. Israeli governments had hitherto been careful to keep the occupation as a military government. The temporary holding of power by an occupying military force, pending a negotiated settlement, is a principle of international law – an expression of the prohibition against obtaining sovereignty through conquest that was introduced in the wake of World War II. An editorial in Haaretz noted that the assignment of governmental powers in the West Bank to a civilian governor, alongside the plan to expand the dual justice system so that Israeli law will apply fully to settlers in the West Bank, constitutes de jure annexation of the West Bank. 
On 26 February 2023, following the 2023 Huwara shooting in which two Israelis were killed by an unidentified attacker, hundreds of Israeli settlers attacked the Palestinian town of Huwara and three nearby villages, setting alight hundreds of Palestinian homes (some with people in them), businesses, a school, and numerous vehicles, killing one Palestinian man and injuring 100 others. Bezalel Smotrich subsequently called on Twitter for Huwara to be "wiped out" by the Israeli government. Zvika Fogel MK, of the ultra-nationalist Otzma Yehudit, which forms part of the governing coalition, said that he "looks very favorably upon" the results of the rampage.

Members of the coalition proposed an amendment to the Disengagement Law, which would allow Israelis to resettle settlements vacated during the 2005 Israeli disengagement from Gaza and the northern West Bank. The evacuated settlements were considered illegal under international law by most countries. The proposal was approved for voting by the Foreign Affairs and Defense Committee on 9 March 2023, while the committee was still waiting for briefing materials from the NSS, IDF, MFA and Shin Bet, and was passed on 21 March. The US requested clarification from Israeli ambassador Michael Herzog. A US State Department spokesman stated that "The U.S. strongly urges Israel to refrain from allowing the return of settlers to the area covered by the legislation, consistent with both former Prime Minister Sharon and the current Israeli Government's commitment to the United States," noting that the actions represent a clear violation of undertakings given to the Bush administration by the Sharon government in 2005, and to the Biden administration by Netanyahu's far-right coalition the previous week. 
Minister of Communication Shlomo Karhi had initially intended to cut the funding of the Israeli Public Broadcasting Corporation (also known by its blanket branding Kan) by 400 million shekels – roughly half of its total budget – close several departments, and privatize content creation. In response, the Director-General of the European Broadcasting Union, Noel Curran, sent two urgent letters to Netanyahu, expressing his concerns and calling on the Israeli government to "safeguard the independence of our Member KAN and ensure it is allowed to operate in a sustainable way, with funding that is both stable, adequate, fair, and transparent." On 25 January 2023, nine journalist organizations representing some of Kan's competitors issued a statement of concern, acknowledging the "important contribution of public broadcasting in creating a worthy, unbiased and non-prejudicial journalistic platform", and noting that "the existence of the [broadcasting] corporation as a substantial public broadcast organization strengthens media as a whole, adding to the competition in the market rather than weakening it." They also expressed their concern that the "real reason" for the proposal was actually "an attempt to silence voices from which... [the Minister] doesn't always draw satisfaction". The same day, hundreds of journalists, actors and filmmakers protested in Tel Aviv. The proposal was eventually put on hold.

On 22 February 2023 it was reported that Prime Minister Netanyahu was attempting to appoint his close associate Yossi Shelley as the deputy to the National Statistician — a highly sensitive position in charge of providing accurate data for decision makers. The appointment of Shelley, who did not possess the required qualifications for the role, was withdrawn following publication. 
In its daily editorial, Haaretz tied this attempt to the judicial reform: "once they take control of the judiciary, law enforcement and public media, they wish to control the state's data base, the dry numerical data it uses to plan its future". Netanyahu also proposed Avi Simhon for the role, and eventually froze all appointments at the Israel Central Bureau of Statistics.

Also on 22 February 2023, it was revealed that Yoav Kish, the Minister of Education, was promoting a draft government decision that would change the composition of the National Library of Israel's board of directors and grant him more power over the institution. In response, the Hebrew University – which owned the library until 2008 – announced that if the draft were accepted, it would withdraw its collections from the library. The university's collections, which according to the university constitute some 80% of the library's collection, include the Agnon archive, the original manuscript of Hatikvah, and the Rothschild Haggadah, the oldest known Haggadah. A group of 300 authors and poets signed an open letter against the move, further noting their objection to "political takeover" of public broadcasting, as well as "any legislation that will castrate the judiciary and damage the democratic foundations of the state of Israel". Several days later, it was reported that a series of donors had decided to withhold their donations to the library, totaling some 80 million shekels. On 3 March a petition against the move by 1,500 academics, including Israel Prize laureates, was sent to Kish. The proposal has been seen by some as retribution against Shai Nitzan, the former State Attorney and the library's current rector. On 5 March it was reported that the Legal Advisor to the Ministry of Finance, Asi Messing, was withholding the proposal. According to Messing, the proposal – which was being promoted as part of the Economic Arrangements Law – "was not reviewed... 
by the qualified personnel in the Ministry of Finance, does not align with any of the common goals of the economic plan, was not agreed to by myself and was not approved by the Attorney General."

As of February 2023, the government has been debating several proposals that would significantly weaken the Ministry of Environmental Protection, including reducing the environmental regulation of planning and development and of electricity production. One of the main proposals, the transfer of a 3 billion shekel fund meant to finance waste management plants from the Ministry of Environmental Protection to the Ministry of the Interior, was eventually withdrawn. The Minister of Environmental Protection, Idit Silman, has been criticized for meeting with climate change denialists, for wasteful and personally-motivated travel at the ministry's expense, for politicizing the role, and for engaging in political activity on the ministry's time.

The government has been noted for an unusually high number of dismissals and resignations of senior career civil servants, and for frequent attempts to replace them with candidates with known political associations, who are often less competent. According to sources, Netanyahu and people in his vicinity are seeking out civil servants who were appointed by the previous government, intent on replacing them with people loyal to him. Governmental nominees for various positions have been criticized for lack of expertise. In addition to the nominee to the position of Deputy National Statistician (see above), the Director General of the Ministry of Finance, Shlomi Heisler; the Director General of the Ministry of Justice, Itamar Donenfeld; and the Director General of the Ministry of Transport, Moshe Ben Zaken, have all been criticized for incompetence, lack of familiarity with their Ministries' subject matter, lack of interest in the job, or lack of experience in managing large organizations. 
It has been reported that in some ministries, senior officials were enacting slowdowns as a means of dealing with the new ministers and director generals. On 28 July the director general of the Ministry of Education, Asaf Zalel, a retired Air Force Brigadier General who had been appointed in January, resigned, citing the societal "rift". When asked about attempts to appoint his personal friend and attorney to the board of directors of a state-owned company, Minister David Amsalem replied: "that is my job, due to my authority to appoint directors. I put forward people that I know and hold in esteem".

Under Minister of Transport Miri Regev, the ministry has either dismissed or lost the heads of the National Public Transport Authority, Israel Airports Authority, National Road Safety Authority, Israel Railways, and several officials in Netivei Israel. The current chair of Netivei Israel is Likud member and Regev associate Yigal Amadi, and the legal counsel is Einav Abuhzira, daughter of a former Likud branch chair. Abuhzira was appointed instead of Elad Berdugo, nephew of Netanyahu surrogate Yaakov Bardugo, after he was disqualified for the role by the Israel Government Companies Authority.

In July 2023 the Minister of Communications, Shlomo Karhi, and the minister in charge of the Israel Government Companies Authority, Dudi Amsalem, deposed the chair of the Israel Postal Company, Michael Vaknin. The chair, who was hired to lead the company's financial recovery after years of operational loss and towards privatization, had gained the support of officials at the Authority and at the Ministry of Finance; nevertheless, the ministers claimed that his performance was inadequate, and nominated in his place Yiftah Ron-Tal, who has known ties to Netanyahu and Smotrich. They also nominated four new directors, two of whom have known political associations, and a third who was a witness in Netanyahu's trial. 
The coalition is allowed to spend a portion of the state's budget on a discretionary basis, meant to coax member parties into reaching an agreement on the budget. As of May 2023, the government was pushing an allocation of over 13 billion shekels over two years – almost seven times the amount allocated by the previous government. Most of the funds will be allocated for uses associated with the religious, orthodox and settler communities. The head of the Budget Department at the Ministry of Finance, Yoav Gardos, objected to the allocations, claiming they would exacerbate unemployment in the Orthodox community, which is projected to cost the economy a total of 6.7 trillion shekels in lost output by 2065. At the onset of the Gaza war and the declaration of a state of national emergency, Minister of Finance Bezalel Smotrich instructed government agencies to continue with the planned distribution of discretionary funds.

Corruption

During March 2023, the government was promoting an amendment to the Law on Public Service (Gifts) that would allow Netanyahu to receive donations to fund his legal defense. The amendment follows a decision by the High Court of Justice (HCJ) that forced Netanyahu to refund US$270,000 given to him and his wife by his late cousin, Nathan Mileikowsky, for their legal defense. This is in contrast to past statements by Minister of Justice Yariv Levin, who spoke against the possible conflict of interests that can result from such transactions. The bill was opposed by Attorney General Gali Baharav-Miara, who stressed that it could "create a real opportunity for governmental corruption", and was eventually withdrawn at the end of March. As of March 2023, the coalition was promoting a bill that would prevent judicial review of ministerial appointments. 
The bill is intended to prevent the HCJ from reviewing the appointment of the twice-convicted chairman of Shas, Aryeh Deri (convicted of bribery, fraud, and breach of trust), to a ministerial position, after his previous appointment was annulled on grounds of unreasonableness. The bill follows on the heels of another amendment, which relaxed the ban on the appointment of convicted criminals so that Deri – who was handed a suspended sentence after his second conviction – could be appointed. The bill is opposed by the Attorney General, as well as by the Knesset Legal Adviser, Sagit Afik.

Israeli law allows for declaring a Prime Minister (as well as several other high-ranking public officials) temporarily or permanently incapacitated, but does not specify the conditions that can lead to a declaration of incapacitation. In the case of the Prime Minister, the authority to do so rests with the Attorney General. In March 2023, the coalition advanced a bill that transfers this authority from the Attorney General to the government, subject to the approval of the Knesset committee, and clarifies that incapacitation can only result from medical or mental conditions. On 3 January 2024, the Supreme Court ruled by a majority of 6 out of 11 that the law's entry into force would be postponed to the next Knesset, because the bill in its immediate application is a personal law intended to serve a distinct personal purpose. Later, the court rejected a petition seeking to declare Netanyahu an incapacitated prime minister due to his ongoing trial and conflict of interests. 
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_ref-67] 
Elon Musk

Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026, Forbes estimates his net worth to be around US$852 billion.

Born into a wealthy family in Pretoria, South Africa, Musk emigrated to Canada in 1989; he has held Canadian citizenship since birth through his Canadian-born mother. He received bachelor's degrees in 1997 from the University of Pennsylvania before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002. Musk also became an American citizen in 2002. In 2002, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research, but later left; growing discontent with the organization's direction and its leadership in the AI boom of the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, a Tesla pay package worth $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Musk was the largest donor in the 2024 U.S. presidential election, where he supported Donald Trump. 
After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published in 2025 and 2026 and became a topic of worldwide debate.

Early life

Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa. Musk therefore holds both South African and Canadian citizenship from birth. His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer, who partly owned a rental lodge at Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa. 
Despite both Elon and Errol previously stating that Errol was a part owner of a Zambian emerald mine, in 2023 Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party, and has said that his children shared their father's dislike of apartheid.

After his parents divorced in 1979, Elon, aged around 9, chose to live with his father because Errol Musk had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies" where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten severely, leading to him being hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?"

Elon was an enthusiastic reader of books, and has attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself how to program from the VIC-20 user manual. At age twelve, Elon sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, where he graduated. Musk was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification. 
Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, as well as to ease his path to immigration to the United States. While waiting for his application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill.

In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997 – a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School. He reportedly hosted large, ticketed house parties to help pay for tuition, and wrote a business plan for an electronic book-scanning service similar to Google Books.

In 1994, Musk held two internships in Silicon Valley: one at energy storage startup Pinnacle Research Institute, which investigated electrolytic supercapacitors for energy storage, and another at Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided to join the Internet boom of the 1990s, applying for a job at Netscape, to which he reportedly never received a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa transitioned to an H-1B. According to numerous former business associates and shareholders, Musk said he was on a student visa at the time. 
Business career In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors. They housed the venture at a small rented office in Palo Alto. Replying to Rolling Stone, Musk denounced the notion that they started their company with funds borrowed from Errol Musk, but in a tweet, he recognized that his father contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune, and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590,000,000 in 2025), and Musk received $22 million (equivalent to $43,000,000 in 2025) for his 7-percent share. In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and, in its initial months of operation, over 200,000 customers joined the service. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with online bank Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign. 
Due to resulting technological issues and lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000.[b] Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2,700,000,000 in 2025) in stock, of which Musk—the largest shareholder with 11.72% of shares—received $175.8 million (equivalent to $320,000,000 in 2025). In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value. In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth-chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune (equivalent to $180,000,000 in 2025), Musk founded SpaceX in May 2002 and became the company's CEO and chief engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, SpaceX was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2,400,000,000 in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, in 2015 SpaceX successfully landed the first stage of a Falcon 9 on a land platform.
Later landings were achieved on autonomous spaceport drone ships, ocean-based recovery platforms. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, the Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million (equivalent to $865,000,000 in 2025) contract to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan. In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large constellation was deployed in May 2019. As of May 2025, over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 to be $10 billion (equivalent to $12,000,000,000 in 2025).[c] During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440,000,000 in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response. Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement.
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11,000,000 in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm. Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass-production all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several well-selling electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In 2018, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over him tweeting that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in summer 2020, and it entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1,200,000,000,000 in 2025), becoming the sixth company in U.S. history to do so. Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second largest provider of solar power systems in the United States.
In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2,700,000,000 in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor. In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence (AI) by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions like spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials. Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trials—which have caused the deaths of some monkeys—have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. 
Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink. In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018. It used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021. Local officials have approved further expansions of the tunnel system. In early 2017, Musk expressed interest in buying Twitter and questioned the platform's commitment to freedom of speech. By 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder.[d] Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter. By the end of April, Musk had successfully concluded his bid for approximately $44 billion. This included approximately $12.5 billion in loans and $21 billion in equity financing. After attempting to back out of the deal, Musk completed the purchase on October 27, 2022.
Immediately after the acquisition, Musk fired several top Twitter executives, including CEO Parag Agrawal, and became CEO himself. Under Musk, Twitter instituted monthly subscriptions for a "blue check" and laid off a significant portion of the company's staff. Musk loosened content moderation, and hate speech increased on the platform after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of the Hunter Biden laptop controversy in the lead-up to the 2020 presidential election. After a Twitter poll, Musk promised to step down as CEO; five months later, he did so, transitioning to the roles of executive chairman and chief technology officer (CTO). Since Musk stepped down as CEO, X has continued to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence some of his critics, such as Twitch streamer Asmongold, who criticized him during one of his streams, by removing their accounts' blue checkmarks (which hinders visibility and is considered a form of shadow banning) or by suspending their accounts without justification. Other activities In August 2013, Musk announced plans for a version of a vactrain and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances. In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company, and initially gave $50 million. In 2018, Musk left the OpenAI board.
Since 2018, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings like OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI. Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets, and the consequent fossil fuel usage, has received criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter and temporarily banned the accounts of journalists who posted stories about the incident, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept. In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content while framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example, the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian, and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies. Politics Musk is an outlier among business leaders, who typically avoid partisan political advocacy. Musk was a registered independent voter when he lived in California.
Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 Texas's 34th congressional district special election. In 2024, he started supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income; he also endorsed Kanye West's 2020 presidential campaign. In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign, and hosted DeSantis's campaign announcement on a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips. In October 2025, former vice president Kamala Harris commented that it was a mistake on the Democrats' part not to invite Musk to a White House electric vehicle event held in August 2021, which featured executives from General Motors, Ford, and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space."
Fortune remarked that this was a nod to the United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation, and suggested that the non-invitation affected Musk's perspective. Fortune noted that, at the time, Musk said, "Yeah, seems odd that Tesla wasn't invited." A month later, he criticized the Biden White House as "not the friendliest administration." Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen, and other capitalists actually flourished under Biden, but that the tech leaders chose Trump for their common ground on cultural issues. By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying, "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a campaign rally, and promoted conspiracy theories and falsehoods about Democrats, election fraud, and immigration in support of Trump. Musk was the largest individual donor of the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers. In 2023, Musk said he shunned the World Economic Forum because it was boring; the organization commented that it had not invited him since 2015. He has, however, participated in Dialog, an event dubbed "Tech Bilderberg" and organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain, and the United Kingdom, particularly due to his position in the U.S. government as well as his ownership of X.
An NBC News analysis found he had boosted far-right political movements to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023. During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or a fascist Roman salute.[e] He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided on partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries. The concept of DOGE emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, with Musk accepting the offer. In November and December 2024, Musk suggested that the organization could help to cut the U.S. federal budget, consolidate the number of federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role is not clear. 
In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE. A federal judge has ruled that Musk acted as the de facto leader of DOGE. Musk's role in the second Trump administration, particularly his work with DOGE, has attracted public backlash. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He has prioritized secrecy within the organization and has accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by this time, most of them of children. By November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when his 130-day term as a special government employee expired, with a White House official confirming that Musk's offboarding from the administration was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025. After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit.
A feud began between Musk and Trump, most notably when Musk alleged on X (formerly Twitter) on June 5, 2025, that Trump had ties to sex offender Jeffrey Epstein. Trump responded on Truth Social, stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away, and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk, and added that Musk would face "very serious consequences" if he funded Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have received criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates up until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth taxes, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration and regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and he identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially the colonization of Mars; he has repeatedly pushed for humanity to colonize Mars in order to become an interplanetary species and lower the risk of human extinction.
Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. While he has described himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdown restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and has been described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024. Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was in turn criticized for spreading misinformation and amplifying the far right. He has also voiced his support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S.
Securities and Exchange Commission (SEC) for a tweet stating that funding had been secured for potentially taking Tesla private.[f] The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were fined $20 million each, and Musk was forced to step down for three years as Tesla chairman but was able to remain as CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the previous agreement details, including a list of topics about which Musk needed preclearance. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla stock price ("too high imo") violated the agreement. Freedom of Information Act (FOIA)-released records showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting regarding "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded. 
McCormick called the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger's syndrome in a TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ... but then that turned out to be wrong — [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated he uses doctor-prescribed ketamine for occasional depression and that he doses "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine and other drugs. Musk at first refused to comment on his alleged drug use, before responding that he had not tested positive for drugs, and that if drugs somehow improved his productivity, "I would definitely take them!". 
The New York Times' investigations revealed Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concerns from close associates who have become troubled by his public behavior as he became more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label, Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he has said have a "restoring effect" that helps his "mental calibration". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings. Musk has justified the boosting by claiming that all top accounts do it, so he must as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its creator Ubisoft for "woke" content. Musk posted to X that "DEI kills art" and cited the inclusion of the historical figure Yasuke in the Assassin's Creed game as offensive; he also called the game "terrible". Ubisoft responded by saying that Musk's comments were "just feeding hatred" and that they were focused on producing a game, not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest that the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000.
In 2002, their first child Nevada Musk died of sudden infant death syndrome at the age of 10 weeks. After his death, the couple used in vitro fertilization (IVF) to continue their family; they had twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and have shared custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year. After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters that are not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the pregnancy, Musk confirmed reports that the couple were "semi-separated" in September 2021; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Elon Musk has taken X Æ A-Xii to multiple official events in Washington, D.C. during Trump's second term in office. 
In July 2022, The Wall Street Journal reported that Musk allegedly had an affair in 2021 with Nicole Shanahan, the wife of Google co-founder Sergey Brin, leading to the couple's divorce the following year. Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported that Musk had bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born in 2024 via surrogacy, and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, whom media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St. Clair had filed for sole custody of her five-month-old son and for Musk to be recognized as the child's father. On March 31, 2025, Musk wrote that, while he was unsure whether he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] Later reporting by The Wall Street Journal indicated that $1 million of these payments was structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island.
In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place after Epstein later cancelled the plans.[k] On Christmas Day in 2012, Musk emailed Epstein asking, "Do you have any parties planned? I've been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I'm looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable, to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein emailed Musk asking if he had any plans to come to New York for the opening of the United Nations General Assembly, where many "interesting people" would be coming to his house, to which Musk responded, "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein replied, "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 if he had introduced Epstein to Mark Zuckerberg, Musk responded: "I don't recall introducing Epstein to anyone, as I don't know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred.
Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l]
Wealth
Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. Musk was first listed on the Forbes Billionaires List in 2012; by November 2020, around 75% of his wealth was derived from Tesla stock, although he has described himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021; $400 billion in December 2024; $500 billion in October 2025; $600 billion in mid-December 2025; $700 billion later that month; and $800 billion in February 2026. In November 2025, Tesla shareholders approved a pay package for Musk worth potentially $1 trillion, which he is to receive over 10 years if he meets specific goals.
Public image
Although his ventures had been highly influential within their respective industries since the 2000s, Musk only became a public figure in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions and often controversial statements, in contrast to other billionaires who prefer reclusiveness to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance attributed the polarized opinions of Musk to his "part philosopher, part troll" persona on Twitter. He has drawn denunciation for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to matters like British survivors of grooming gangs.
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president", or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021. Musk was selected as Time's "Person of the Year" for 2021. Time's then editor-in-chief Edward Felsenthal wrote that "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too."
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Europe_Day] | [TOKENS: 1025] |
Europe Day
Europe Day is a day celebrating "peace and unity in Europe", observed on 5 May by the Council of Europe and on 9 May by the European Union. The first recognition of Europe Day was by the Council of Europe, which introduced it in 1964. The European Union later began celebrating its own Europe Day in commemoration of the 1950 Schuman Declaration, which first proposed the European Coal and Steel Community, leading some to refer to it as "Schuman Day" or "Day of the united Europe". Both days are celebrated by displaying the flag of Europe.
History
The Council of Europe was founded on 5 May 1949, and hence chose that day for its celebrations when it established the holiday in 1964. The "Europe Day" of the EU was introduced in 1985 by the European Communities (the predecessor organisation of the EU). The date commemorates the Schuman Declaration of 9 May 1950, put forward by Robert Schuman, which proposed the pooling of the French, Italian and West German coal and steel industries. This led to the creation of the European Coal and Steel Community, the first European Community, established on 18 April 1951. A "raft of cultural icons" was launched by the European Commission in 1985, in reaction to the report by the ad hoc commission "for a People's Europe" chaired by Pietro Adonnino. The aim was to facilitate European integration by fostering a pan-European identity among the populations of the EC member states. The European Council adopted "Europe Day" along with the flag of Europe and other items on 29 June 1985, in Milan. Following the foundation of the European Union in 1993, observance of Europe Day by national and regional authorities increased significantly. Germany in particular has gone beyond celebrating just the day, since 1995 extending the observance to an entire "Europe Week" (Europawoche) centred on 9 May.
In Poland, the Schuman Foundation, a Polish organisation advocating European integration established in 1991, first organised its Warsaw Schuman Parade on Europe Day 1999, at the time advocating the accession of Poland to the EU. Observance of 9 May as "Europe Day" was reported "across Europe" as of 2008. In 2019, 9 May became an official annual public holiday in Luxembourg, to mark Europe Day. The EU's choice of the date of foundation of the European Coal and Steel Community rather than that of the EU itself established a narrative in which Schuman's speech, concerned with inducing economic growth and cementing peace between France and Germany, is presented as anticipating a "vocation of the European Union to be the main institutional framework" for the much further-reaching European integration of later decades. The European Constitution would have legally enshrined all the European symbols in the EU treaties; however, the treaty failed to be ratified in 2005, and usage continues only in the present de facto manner. The Constitution's replacement, the Treaty of Lisbon, contains a declaration by sixteen members supporting the symbols. The European Parliament "formally recognised" Europe Day in October 2008.
Celebrations and commemorations
The EU institutions open their doors to the public every year in Brussels and Strasbourg, and many of them organise commemorative events to honour the historical importance of the date. In 2020 and 2021, due to the COVID-19 pandemic and the consequent inability to host physical events, the EU institutions organised virtual acts to pay tribute to the Europeans collaborating in the fight against the pandemic. Furthermore, 2020 marked the 70th anniversary of the Schuman declaration and the 75th anniversary of the end of the Second World War.
Given the occasion, the above-mentioned EU institutions launched several online events to commemorate the importance of the date.
Legal recognition
Europe Day is a public holiday for employees of European Union institutions. In 2019, it was declared a public holiday in Luxembourg, and it is also a public holiday in Kosovo. It is a "memorial day" in Croatia, a legally recognised day that is not a public holiday, and a legally recognised commemorative day in Lithuania. In Germany and Austria, Europe Day is considered a flag day (German: Beflaggungstag), on which flags are ordered to be flown by federal decree, while in Finland it is a customary flag flying day. Europe Day is also celebrated in Romania, where it coincides with the State Independence Day of Romania (Romania's independence day). Between 2003 and 2023, Europe Day was celebrated in Ukraine on the third Saturday of May. On 8 May 2023, the President of Ukraine issued a decree moving the celebration of Europe Day to 9 May, coinciding with EU member states.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/PlayStation_(console)#cite_ref-GPro87_185-1] | [TOKENS: 10728] |
PlayStation (console)
The PlayStation[a] (codenamed PSX, abbreviated as PS, and retroactively PS1 or PS one) is a home video game console developed and marketed by Sony Computer Entertainment. It was released in Japan on 3 December 1994, followed by North America on 9 September 1995, Europe on 29 September 1995, and other regions thereafter. As a fifth-generation console, the PlayStation primarily competed with the Nintendo 64 and the Sega Saturn. Sony began developing the PlayStation after a failed venture with Nintendo to create a CD-ROM peripheral for the Super Nintendo Entertainment System in the early 1990s. The console was primarily designed by Ken Kutaragi and Sony Computer Entertainment in Japan, while additional development was outsourced in the United Kingdom. An emphasis on 3D polygon graphics was placed at the forefront of the console's design. PlayStation game production was designed to be streamlined and inclusive, enticing the support of many third-party developers. The console proved popular for its extensive game library, popular franchises, low retail price, and aggressive youth marketing which advertised it as the preferable console for adolescents and adults. Critically acclaimed games that defined the console include Gran Turismo, Crash Bandicoot, Spyro the Dragon, Tomb Raider, Resident Evil, Metal Gear Solid, Tekken 3, and Final Fantasy VII. Sony ceased production of the PlayStation on 23 March 2006—over eleven years after it had been released, and in the same year the PlayStation 3 debuted. More than 4,000 PlayStation games were released, with cumulative software sales of 962 million units. The PlayStation signaled Sony's rise to power in the video game industry. It received acclaim and sold strongly; in less than a decade, it became the first computer entertainment platform to ship over 100 million units. Its use of compact discs heralded the game industry's transition from cartridges.
The PlayStation's success led to a line of successors, beginning with the PlayStation 2 in 2000. In the same year, Sony released a smaller and cheaper model, the PS one.
History
The PlayStation was conceived by Ken Kutaragi, a Sony executive who managed a hardware engineering division and was later dubbed "the Father of the PlayStation". Kutaragi's interest in working with video games stemmed from seeing his daughter play games on Nintendo's Famicom. Kutaragi convinced Nintendo to use his SPC-700 sound processor in the Super Nintendo Entertainment System (SNES) through a demonstration of the processor's capabilities. His willingness to work with Nintendo derived from both his admiration of the Famicom and his conviction that video game consoles would become the main home entertainment systems. Although Kutaragi was nearly fired because he had worked with Nintendo without Sony's knowledge, president Norio Ohga recognised the potential in Kutaragi's chip and decided to keep him as a protégé. The inception of the PlayStation dates back to a 1988 joint venture between Nintendo and Sony. Nintendo had produced floppy disk technology to complement cartridges in the form of the Family Computer Disk System, and wanted to continue this complementary storage strategy for the SNES. Since Sony was already contracted to produce the SPC-700 sound processor for the SNES, Nintendo contracted Sony to develop a CD-ROM add-on, tentatively titled the "Play Station" or "SNES-CD". The PlayStation name had already been trademarked by Yamaha, but Nobuyuki Idei liked it so much that he agreed to acquire it for an undisclosed sum rather than search for an alternative. Sony was keen to obtain a foothold in the rapidly expanding video game market. Having been the primary manufacturer of the MSX home computer format, Sony wanted to use its experience in consumer electronics to produce its own video game hardware.
Although the initial agreement between Nintendo and Sony was about producing a CD-ROM drive add-on, Sony had also planned to develop a SNES-compatible Sony-branded console. This iteration was intended to be more of a home entertainment system, playing both SNES cartridges and a new CD format named the "Super Disc", which Sony would design. Under the agreement, Sony would retain sole international rights to every Super Disc game, giving it a large degree of control despite Nintendo's leading position in the video game market. Furthermore, Sony would also be the sole beneficiary of licensing related to the music and film software that it had been aggressively pursuing as a secondary application. The Play Station was to be announced at the 1991 Consumer Electronics Show (CES) in Las Vegas. However, Nintendo president Hiroshi Yamauchi was wary of Sony's increasing leverage at this point and deemed the original 1988 contract unacceptable upon realising it essentially handed Sony control over all games written on the SNES CD-ROM format. Although Nintendo was dominant in the video game market, Sony possessed a superior research and development department. Wanting to protect Nintendo's existing licensing structure, Yamauchi cancelled all plans for the joint Nintendo–Sony SNES CD attachment without telling Sony. He sent Nintendo of America president Minoru Arakawa (his son-in-law) and chairman Howard Lincoln to Amsterdam to form a more favourable contract with the Dutch conglomerate Philips, Sony's rival. This contract would give Nintendo total control over its licences on all Philips-produced machines. Kutaragi and Nobuyuki Idei, Sony's director of public relations at the time, learned of Nintendo's actions two days before the CES was due to begin. Kutaragi telephoned numerous contacts, including Philips, to no avail. On the first day of the CES, Sony announced their partnership with Nintendo and their new console, the Play Station.
At 9 am the next day, in what has been called "the greatest ever betrayal" in the industry, Howard Lincoln stepped onto the stage and revealed that Nintendo was now allied with Philips and would abandon its work with Sony. Incensed by Nintendo's renouncement, Ohga and Kutaragi decided that Sony would develop its own console. Nintendo's contract-breaking was met with consternation in the Japanese business community, as the company had broken an "unwritten law" that native companies do not turn against each other in favour of foreign ones. Sony's American branch considered allying with Sega to produce a CD-ROM-based machine called the Sega Multimedia Entertainment System, but the Sega board of directors in Tokyo vetoed the idea when Sega of America CEO Tom Kalinske presented them the proposal. Kalinske recalled them saying: "That's a stupid idea, Sony doesn't know how to make hardware. They don't know how to make software either. Why would we want to do this?" Sony halted its research but decided to build on the work from the Nintendo and Sega ventures to create a console based on the SNES. Despite the tumultuous events at the 1991 CES, negotiations between Nintendo and Sony were still ongoing. A deal was proposed: the Play Station would still have a port for SNES games, on the condition that it would still use Kutaragi's audio chip and that Nintendo would own the rights and receive the bulk of the profits. Roughly two hundred prototype machines were created, and some software entered development. Many within Sony were still opposed to involvement in the video game industry, with some resenting Kutaragi for jeopardising the company. Kutaragi remained adamant that Sony not retreat from the growing industry and that a deal with Nintendo would never work. Knowing that it had to take decisive action, Sony severed all ties with Nintendo on 4 May 1992.
To determine the fate of the PlayStation project, Ohga chaired a meeting in June 1992 with Kutaragi and several senior Sony board members. Kutaragi unveiled a proprietary CD-ROM-based system he had been secretly working on, which played games with immersive 3D graphics. Kutaragi was confident that his LSI chip could accommodate one million logic gates, which exceeded the capabilities of Sony's semiconductor division at the time. Despite gaining Ohga's enthusiasm, the project faced opposition from a majority of those present at the meeting. Older Sony executives, who saw Nintendo and Sega as "toy" manufacturers, also opposed it. The opposers felt the game industry was too culturally offbeat and asserted that Sony should remain a central player in the audiovisual industry, where companies were familiar with one another and could conduct "civili[s]ed" business negotiations. After Kutaragi reminded him of the humiliation he had suffered from Nintendo, Ohga retained the project and became one of Kutaragi's staunchest supporters. Ohga shifted Kutaragi and nine of his team from Sony's main headquarters to Sony Music Entertainment Japan (SMEJ), a subsidiary of the main Sony group, so as to retain the project and maintain relationships with Philips for the MMCD development project. The involvement of SMEJ proved crucial to the PlayStation's early development, as the process of manufacturing games on CD-ROM was similar to that used for audio CDs, with which Sony's music division had considerable experience. While at SMEJ, Kutaragi worked with Epic/Sony Records founder Shigeo Maruyama and Akira Sato; both later became vice-presidents of the division that ran the PlayStation business. Sony Computer Entertainment (SCE) was jointly established by Sony and SMEJ to handle the company's ventures into the video game industry. On 27 October 1993, Sony publicly announced that it was entering the game console market with the PlayStation.
According to Maruyama, there was uncertainty over whether the console should primarily focus on 2D, sprite-based graphics or 3D polygon graphics. After Sony witnessed the success of Sega's Virtua Fighter (1993) in Japanese arcades, the direction of the PlayStation became "instantly clear" and 3D polygon graphics became the console's primary focus. SCE president Teruhisa Tokunaka expressed gratitude for Sega's timely release of Virtua Fighter, as it proved "just at the right time" that making games with 3D imagery was possible. Maruyama claimed that Sony further wanted to emphasise the new console's ability to use Red Book audio from the CD-ROM format in its games alongside high-quality visuals and gameplay. Wishing to distance the project from the failed enterprise with Nintendo, Sony initially branded the PlayStation the "PlayStation X" (PSX). Sony formed its European and North American divisions, Sony Computer Entertainment Europe (SCEE) and Sony Computer Entertainment America (SCEA), in January and May 1995 respectively. The divisions planned to market the new console under the alternative branding "PSX" following negative feedback regarding "PlayStation" in focus group studies. Early advertising prior to the console's launch in North America referenced PSX, but the term was scrapped before launch. In contrast to Nintendo's consoles, the console was not marketed under Sony's name. According to Phil Harrison, much of Sony's upper management feared that the Sony brand would be tarnished if associated with the console, which they considered a "toy". Since Sony had no experience in game development, it had to rely on the support of third-party game developers. This was in contrast to Sega and Nintendo, which had versatile and well-equipped in-house software divisions for their arcade games and could easily port successful games to their home consoles.
Recent consoles like the Atari Jaguar and 3DO had suffered low sales due to a lack of developer support, prompting Sony to redouble its efforts to gain the endorsement of arcade-savvy developers. A team from Epic Sony visited more than a hundred companies throughout Japan in May 1993 in hopes of attracting game creators with the PlayStation's technological appeal. Sony found that many disliked Nintendo's practices, such as favouring its own games over others'. Through a series of negotiations, Sony acquired initial support from Namco, Konami, and Williams Entertainment, as well as 250 other development teams in Japan alone. Namco in particular was interested in developing for the PlayStation since it rivalled Sega in the arcade market. Securing these companies brought influential games such as Ridge Racer (1993), one of the most popular arcade games at the time, and Mortal Kombat 3 (1995). By December 1993, it had already been confirmed behind closed doors that Ridge Racer would be the PlayStation's first game, despite Namco being a longstanding Nintendo developer. Namco's research managing director Shigeichi Nakamura met with Kutaragi in 1993 to discuss the preliminary PlayStation specifications, with Namco subsequently basing the Namco System 11 arcade board on PlayStation hardware and developing Tekken to compete with Virtua Fighter. The System 11 launched in arcades several months before the PlayStation's release, with the arcade release of Tekken in September 1994. Despite securing the support of various Japanese studios, Sony had no developers of its own by the time the PlayStation was in development. This changed in 1993 when Sony acquired the Liverpudlian company Psygnosis (later renamed SCE Liverpool) for US$48 million, securing its first in-house development team. The acquisition meant that Sony could have more launch games ready for the PlayStation's release in Europe and North America.
Ian Hetherington, Psygnosis' co-founder, was disappointed after receiving early builds of the PlayStation and recalled that the console "was not fit for purpose" until his team got involved with it. Hetherington frequently clashed with Sony executives over broader ideas; at one point it was suggested that a television with a built-in PlayStation be produced. In the months leading up to the PlayStation's launch, Psygnosis had around 500 full-time staff working on games and assisting with software development. The purchase of Psygnosis marked another turning point for the PlayStation, as it played a vital role in creating the console's development kits. While Sony had provided MIPS R4000-based Sony NEWS workstations for PlayStation development, Psygnosis employees disliked the thought of developing on these expensive workstations and asked Bristol-based SN Systems to create an alternative PC-based development system. Andy Beveridge and Martin Day, owners of SN Systems, had previously supplied development hardware for other consoles such as the Mega Drive, Atari ST, and the SNES. When Psygnosis arranged an audience for SN Systems with Sony's Japanese executives at the January 1994 CES in Las Vegas, Beveridge and Day presented their prototype of the condensed development kit, which could run on an ordinary personal computer with two extension boards. Impressed, Sony decided to abandon its plans for a workstation-based development system in favour of SN Systems', thus securing a cheaper and more efficient method for designing software. An order of over 600 systems followed, and SN Systems supplied Sony with additional software such as an assembler, linker, and debugger. SN Systems produced development kits for future PlayStation systems, including the PlayStation 2, and was bought out by Sony in 2005. Sony strived to make game production as streamlined and inclusive as possible, in contrast to the relatively isolated approach of Sega and Nintendo.
Phil Harrison, representative director of SCEE, believed that Sony's emphasis on developer assistance reduced the most time-consuming aspects of development. As well as providing programming libraries, SCE headquarters in London, California, and Tokyo housed technical support teams that could work closely with third-party developers if needed. Unlike Nintendo, Sony did not favour its own products over non-Sony ones; Peter Molyneux of Bullfrog Productions admired Sony's open-handed approach to software developers and lauded its decision to use PCs as a development platform, remarking that "[it was] like being released from jail in terms of the freedom you have". Another strategy that helped attract software developers was the PlayStation's use of the CD-ROM format instead of traditional cartridges. Nintendo cartridges were expensive to manufacture, and the company controlled all production, prioritising its own games, while inexpensive compact disc manufacturing occurred at dozens of locations around the world. The PlayStation's architecture and interconnectability with PCs were beneficial to many software developers. The use of the programming language C proved useful, as it safeguarded the future compatibility of the machine should developers decide to make further hardware revisions. Despite this inherent flexibility, some developers found themselves restricted by the console's lack of RAM. While working on beta builds of the PlayStation, Molyneux observed that its MIPS processor was not "quite as bullish" compared to that of a fast PC and said that it took his team two weeks to port their PC code to the PlayStation development kits and another fortnight to achieve a four-fold speed increase. An engineer from Ocean Software, one of Europe's largest game developers at the time, thought that allocating RAM was a challenging aspect given the 3.5 megabyte restriction.
Kutaragi said that while it would have been easy to double the amount of RAM in the PlayStation, the development team refrained from doing so to keep the retail cost down. Kutaragi saw the biggest challenge in developing the system as balancing the conflicting goals of high performance, low cost, and ease of programming, and felt he and his team were successful in this regard. The console's technical specifications were finalised in 1993 and its design during 1994. The PlayStation name and its final design were confirmed during a press conference on 10 May 1994, although the price and release dates had not yet been disclosed. Sony released the PlayStation in Japan on 3 December 1994, a week after the release of the Sega Saturn, at a price of ¥39,800. Sales in Japan began with a "stunning" success, with long queues in shops. Ohga later recalled that he realised how important the PlayStation had become for Sony when friends and relatives begged him for consoles for their children. The PlayStation sold 100,000 units on the first day and two million units within six months, although the Saturn outsold the PlayStation in the first few weeks due to the success of Virtua Fighter. By the end of 1994, 300,000 PlayStation units had been sold in Japan, compared to 500,000 Saturn units. A grey market emerged for PlayStations shipped from Japan to North America and Europe, with buyers of such consoles paying up to £700.
"When September 1995 arrived and Sony's Playstation roared out of the gate, things immediately felt different than [sic] they did with the Saturn launch earlier that year. Sega dropped the Saturn $100 to match the Playstation's $299 debut price, but sales weren't even close—Playstations flew out the door as fast as we could get them in stock."
Before the release in North America, Sega and Sony presented their consoles at the first Electronic Entertainment Expo (E3) in Los Angeles on 11 May 1995.
At their keynote presentation, Sega of America CEO Tom Kalinske revealed that the Saturn would be released immediately to select retailers at a price of $399. Next came Sony's turn: Olaf Olafsson, president of Sony Interactive Entertainment, summoned SCEA president Steve Race to the conference stage, who simply said "$299" and left the stage to a round of applause. Attention on the Sony conference was further bolstered by the surprise appearance of Michael Jackson and the showcase of highly anticipated games, including Wipeout (1995), Ridge Racer and Tekken (1994). In addition, Sony announced that no games would be bundled with the console. Although the Saturn had been released early in the United States to gain an advantage over the PlayStation, the surprise launch upset many retailers who were not informed in time, harming sales. Some retailers, such as KB Toys, responded by dropping the Saturn entirely. The PlayStation went on sale in North America on 9 September 1995. It sold more units within two days than the Saturn had in five months, with almost all of the initial shipment of 100,000 units sold in advance and shops across the country running out of consoles and accessories. The well-received Ridge Racer, which some critics considered superior to Sega's arcade counterpart Daytona USA (1994), contributed to the PlayStation's early success, as did Battle Arena Toshinden (1995). There were over 100,000 pre-orders placed and 17 games available by the time of the PlayStation's American launch, in comparison to the Saturn's six launch games. The PlayStation was released in Europe on 29 September 1995 and in Australia on 15 November 1995. By November, it had outsold the Saturn three to one in the United Kingdom, where Sony had allocated a £20 million marketing budget for the Christmas season compared to Sega's £4 million.
Sony found early success in the United Kingdom by securing listings with independent shop owners as well as prominent high street chains such as Comet and Argos. Within its first year, the PlayStation secured over 20% of the entire American video game market. From September to the end of 1995, sales in the United States amounted to 800,000 units, giving the PlayStation a commanding lead over the other fifth-generation consoles,[b] though the SNES and Mega Drive from the fourth generation still outsold it. Sony reported an attach rate of four games sold per console. To meet increasing demand, Sony chartered jumbo jets and ramped up production in Europe and North America. By early 1996, the PlayStation had grossed $2 billion (equivalent to $4.106 billion in 2025) from worldwide hardware and software sales. By late 1996, sales in Europe totalled 2.2 million units, including 700,000 in the UK. Approximately 400 PlayStation games were in development, compared to around 200 for the Saturn and 60 for the Nintendo 64. In India, the PlayStation was test-marketed during 1999–2000 through Sony showrooms, selling 100 units. Sony launched the console countrywide (as the PS One model) on 24 January 2002 at a price of Rs 7,990, with 26 games available from the start. The PlayStation also performed well in markets where it was never officially released. In Brazil, a third company's registration of the trademark prevented an official release, so the market was initially taken over by the officially distributed Sega Saturn; as Sega withdrew, however, PlayStation imports and widespread piracy increased. In China, the most popular 32-bit console had been the Sega Saturn, but after Sega left the market the PlayStation grew to an installed base of 300,000 users by January 2000, even though Sony China had no plans to release it.
The PlayStation was backed by a successful marketing campaign, allowing Sony to gain an early foothold in Europe and North America. Initially, PlayStation demographics were skewed towards adults, but the audience broadened after the first price drop. While the Saturn was positioned towards 18- to 34-year-olds, the PlayStation was initially marketed exclusively towards teenagers. Executives from both Sony and Sega reasoned that because younger players typically looked up to older, more experienced players, advertising targeted at teens and adults would draw them in too. Additionally, Sony found that adults reacted best to advertising aimed at teenagers; Lee Clow surmised that people who started to grow into adulthood regressed and became "17 again" when they played video games. The console was marketed with advertising slogans in which the controller's button symbols stood in for missing letters, stylised as "LIVE IN YUR WRLD. PLY IN URS" (Live in Your World. Play in Ours.) and "U R NOT E" (with a red "E", read as "you are not ready"). The four geometric shapes were derived from the symbols for the four buttons on the controller. Clow thought that by invoking such provocative statements, gamers would respond to the contrary and say "'Bullshit. Let me show you how ready I am.'" As the console's appeal widened, Sony's marketing efforts broadened from their earlier focus on mature players to specifically target younger children as well. Shortly after the PlayStation's release in Europe, Sony tasked marketing manager Geoff Glendenning with assessing the desires of a new target audience. Sceptical of Nintendo and Sega's reliance on television campaigns, Glendenning theorised that young adults transitioning from fourth-generation consoles would feel neglected by marketing directed at children and teenagers. Recognising the influence that early 1990s underground clubbing and rave culture had on young people, especially in the United Kingdom, Glendenning felt that the culture had become mainstream enough to help cultivate the PlayStation's emerging identity.
Sony partnered with prominent nightclubs such as Ministry of Sound and with festival promoters to organise dedicated PlayStation areas where select games could be demonstrated and played. The Sheffield-based graphic design studio The Designers Republic was contracted by Sony to produce promotional materials aimed at a fashionable, club-going audience. Psygnosis' Wipeout in particular became associated with nightclub culture, as it was widely featured in venues. By 1997, there were 52 nightclubs in the United Kingdom with dedicated PlayStation rooms. Glendenning recalled that he had discreetly spent at least £100,000 a year in slush fund money on impromptu marketing. In 1996, Sony expanded their CD production facilities in the United States due to the high demand for PlayStation games, increasing their monthly output from 4 million discs to 6.5 million discs. This was necessary because PlayStation sales were running at twice the rate of Saturn sales, and its lead dramatically increased when both consoles dropped in price to $199 that year. The PlayStation also outsold the Saturn at a similar ratio in Europe during 1996. Sales figures for PlayStation hardware and software only increased following the launch of the Nintendo 64. Tokunaka speculated that the Nintendo 64 launch had actually helped PlayStation sales by raising public awareness of the gaming market through Nintendo's added marketing efforts. Despite this, the PlayStation took longer to achieve dominance in Japan. Tokunaka said that, even after the PlayStation and Saturn had been on the market for nearly two years, the competition between them was still "very close", with neither console leading in sales for any meaningful length of time. By 1998, Sega, spurred by their declining market share and significant financial losses, launched the Dreamcast in a last-ditch attempt to stay in the industry.
Although its launch was successful, the technically superior 128-bit console was unable to overcome Sony's dominance of the industry. Sony still held 60% of the overall video game market share in North America at the end of 1999. Sega's initial confidence in their new console was undermined when Japanese sales were lower than expected, with disgruntled Japanese consumers reportedly returning their Dreamcasts in exchange for PlayStation software. On 2 March 1999, Sony officially revealed details of the PlayStation 2, which Kutaragi announced would feature a graphics processor designed to push more raw polygons than any console in history, effectively rivalling most supercomputers. The PlayStation continued to sell strongly at the turn of the millennium: in 2000, Sony released the PS One, a smaller, redesigned variant which went on to outsell all other consoles that year, including the PlayStation 2. In 2005, the PlayStation became the first console to ship 100 million units, with the PlayStation 2 later achieving this faster than its predecessor. The combined successes of both PlayStation consoles led to Sega retiring the Dreamcast in 2001 and abandoning the console business entirely. The PlayStation was eventually discontinued on 23 March 2006, over eleven years after its release and less than a year before the debut of the PlayStation 3.
Hardware
The main microprocessor is an R3000 CPU made by LSI Logic, operating at a clock rate of 33.8688 MHz and delivering 30 MIPS. This 32-bit CPU relies heavily on the "cop2" 3D and matrix math coprocessor on the same die to provide the speed necessary to render complex 3D graphics. The role of the separate GPU chip is to draw 2D polygons and apply shading and textures to them: the rasterisation stage of the graphics pipeline. Sony's custom 16-bit sound chip supports ADPCM sources with up to 24 sound channels, offering a sampling rate of up to 44.1 kHz and music sequencing.
It features 2 MB of main RAM, with an additional 1 MB of video RAM. The PlayStation has a maximum colour depth of 16.7 million true colours, with 32 levels of transparency and unlimited colour look-up tables. The PlayStation can output composite, S-Video or RGB video signals through its AV Multi connector (with older models also having RCA connectors for composite), displaying resolutions from 256×224 to 640×480 pixels; different games can use different resolutions. Earlier models also had proprietary parallel and serial ports that could be used to connect accessories or multiple consoles together; these were later removed due to a lack of usage. The PlayStation uses a proprietary video compression unit, the MDEC, which is integrated into the CPU and allows the presentation of full-motion video at a higher quality than other consoles of its generation. Unusually for the time, the PlayStation lacks a dedicated 2D graphics processor; 2D elements are instead calculated as polygons by the Geometry Transfer Engine (GTE) so that they can be processed and displayed on screen by the GPU. The GPU can generate up to 4,000 sprites and 180,000 texture-mapped polygons per second, or 360,000 flat-shaded polygons per second. The PlayStation went through a number of variants during its production run. Externally, the most notable change was the gradual reduction in the number of connectors on the rear of the unit. This started with the original Japanese launch units; the SCPH-1000, released on 3 December 1994, was the only model with an S-Video port, which was removed from the next model. Subsequent revisions removed further ports, with the final version retaining only a serial port. Sony also marketed a development kit for amateur developers known as the Net Yaroze (meaning "Let's do it together" in Japanese). It was launched in June 1996 in Japan and, following public interest, was released the next year in other countries.
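For a sense of scale, the per-second GPU throughput figures quoted above translate into per-frame polygon budgets. A brief illustrative sketch; the 30 and 60 fps targets are assumptions for the sake of the arithmetic, not figures from this article:

```python
# Convert the quoted per-second GPU throughput into per-frame budgets.
# The 30/60 fps frame rates are illustrative assumptions, not sourced specs.
FLAT_PER_SEC = 360_000      # flat-shaded polygons per second
TEXTURED_PER_SEC = 180_000  # texture-mapped polygons per second

def per_frame(polys_per_sec: int, fps: int) -> int:
    """Maximum polygons drawable in a single frame at a given frame rate."""
    return polys_per_sec // fps

print(per_frame(FLAT_PER_SEC, 30))      # → 12000
print(per_frame(TEXTURED_PER_SEC, 30))  # → 6000
print(per_frame(TEXTURED_PER_SEC, 60))  # → 3000
```

A few thousand polygons per frame was the practical envelope games of the era designed around, which is why texture use and frame rate were traded off against scene complexity.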
The Net Yaroze allowed hobbyists to create their own games and upload them via an online forum run by Sony. The console was available only through an ordering service and came with the documentation and software necessary to program PlayStation games and applications, including C compilers. On 7 July 2000, Sony released the PS One (stylised as "PS one" or "PSone"), a smaller, redesigned version of the original PlayStation. It was the highest-selling console through the end of the year, outselling all other consoles, including the PlayStation 2. In 2002, Sony released a 5-inch (130 mm) LCD screen add-on for the PS One, referred to as the "Combo pack", which also included a car cigarette-lighter adaptor, adding an extra layer of portability. Production of the LCD "Combo pack" ceased in 2004, when the popularity of the PlayStation began to wane in markets outside Japan. A total of 28.15 million PS One units had been sold by the time it was discontinued in March 2006. Three iterations of the PlayStation's controller were released over the console's lifespan. The first, the PlayStation controller, was released alongside the PlayStation in December 1994. It features four individual directional buttons (as opposed to a conventional D-pad), a pair of shoulder buttons on both sides, Start and Select buttons in the centre, and four face buttons consisting of simple geometric shapes: a green triangle, a red circle, a blue cross, and a pink square. Rather than depicting the traditionally used letters or numbers on its buttons, the PlayStation controller established a visual trademark that would be incorporated heavily into the PlayStation brand. Teiyu Goto, the designer of the original PlayStation controller, said that the circle and cross represent "yes" and "no" respectively (though this layout is reversed in Western versions); the triangle symbolises a point of view, and the square is equated to a sheet of paper used to access menus.
The European and North American models of the original PlayStation controller are roughly 10% larger than the Japanese variant, to account for the larger average hand size in those regions. Sony's first analogue gamepad, the PlayStation Analog Joystick (often erroneously referred to as the "Sony Flightstick"), was first released in Japan in April 1996. Featuring two parallel joysticks, it uses potentiometer technology previously employed on consoles such as the Vectrex; instead of relying on binary eight-way switches, the controller detects minute angular changes through the entire range of motion. It also features a thumb-operated digital hat switch on the right joystick, corresponding to the traditional D-pad and used when simple digital movements were sufficient. The Analog Joystick sold poorly in Japan due to its high cost and cumbersome size. The increasing popularity of 3D games prompted Sony to add analogue sticks to its controller design to give users more freedom of movement in virtual 3D environments. The first official analogue controller, the Dual Analog Controller, was revealed to the public in a small glass booth at the 1996 PlayStation Expo in Japan and released in April 1997 to coincide with the Japanese releases of the analogue-capable games Tobal 2 and Bushido Blade. In addition to the two analogue sticks (which introduced two new buttons mapped to clicking the sticks in), the Dual Analog Controller features an "Analog" button and LED beneath the Start and Select buttons which toggles analogue functionality on or off. The controller also features rumble support, though Sony decided that haptic feedback would be removed from all overseas iterations before the United States release.
A Sony spokesman stated that the feature was removed for "manufacturing reasons", although rumours circulated that Nintendo had attempted to legally block the release of the controller outside Japan due to similarities with the Nintendo 64 controller's Rumble Pak; a Nintendo spokesman denied that any legal action was taken. Next Generation's Chris Charla theorised that Sony dropped vibration feedback to keep the price of the controller down. In November 1997, Sony introduced the DualShock controller, its name deriving from its use of two (dual) vibration motors (shock). Unlike its predecessor, the DualShock features analogue sticks with textured rubber grips, longer handles, slightly different shoulder buttons, and rumble feedback included as standard on all versions. The DualShock later replaced its predecessors as the default controller. Sony released a series of peripherals to add extra layers of functionality to the PlayStation. These peripherals include memory cards, the PlayStation Mouse, the PlayStation Link Cable, the Multiplayer Adapter (a four-player multitap), the Memory Drive (a disk drive for 3.5-inch floppy disks), the GunCon (a light gun), and the Glasstron (a monoscopic head-mounted display). Released exclusively in Japan, the PocketStation is a memory card peripheral which acts as a miniature personal digital assistant. The device features a monochrome liquid crystal display (LCD), infrared communication capability, a real-time clock, built-in flash memory, and sound capability. Sharing similarities with the Dreamcast's VMU peripheral, the PocketStation was typically distributed with certain PlayStation games, enhancing them with added features. It proved popular in Japan, selling over five million units. Sony planned to release the peripheral outside Japan, but the release was cancelled despite promotion in Europe and North America. In addition to playing games, most PlayStation models are equipped to play audio CDs.
The Asian model SCPH-5903 can also play Video CDs. Like most CD players, the PlayStation can play songs in a programmed order, shuffle the playback order of the disc, and repeat one song or the entire disc. Later PlayStation models feature a music visualisation function called SoundScope. This function, as well as a memory card manager, is accessed by starting the console without inserting a game or closing the CD tray, bringing up a graphical user interface (GUI) for the PlayStation BIOS. The GUI differs between firmware versions: the original PlayStation GUI had a dark blue background with rainbow graffiti used as buttons, while the early PAL PlayStation and PS One GUI had a grey blocked background with two icons in the middle. PlayStation emulation is versatile and runs on numerous modern devices. Bleem! was a commercial emulator released for IBM-compatible PCs and the Dreamcast in 1999. It was notable for being aggressively marketed during the PlayStation's lifetime and was at the centre of multiple controversial lawsuits filed by Sony. Bleem! was programmed in assembly language, which allowed it to emulate PlayStation games with improved visual fidelity, enhanced resolutions, and filtered textures that were not possible on the original hardware. Sony sued Bleem! two days after its release, citing copyright infringement and accusing the company of engaging in unfair competition and patent infringement by allowing the use of PlayStation BIOSes on a Sega console. Bleem! was subsequently forced to shut down in November 2001. Sony was aware that using CDs for game distribution could leave games vulnerable to piracy, due to the growing popularity of CD-R discs and optical disc drives with burning capability.
To preclude illegal copying, a proprietary process for PlayStation disc manufacturing was developed that, in conjunction with an augmented optical drive assembly, prevented burned copies of games from booting on an unmodified console. Specifically, all genuine PlayStation discs were pressed with a small section of deliberately irregular data, which the PlayStation's optical pick-up was capable of detecting and decoding. Consoles would not boot game discs without a specific wobble frequency contained in the data of the disc's pregap sector (the same system was also used to encode discs' regional lockouts). This signal was within Red Book CD tolerances, so the actual content of PlayStation discs could still be read by a conventional disc drive; however, a conventional drive could not detect the wobble frequency, and therefore duplicated discs omitted it, since the laser pick-up system of any optical disc drive would interpret the wobble as an oscillation of the disc surface and compensate for it while reading. Early PlayStations, particularly early 1000-series models, experience skipping full-motion video or physical "ticking" noises from the unit. The problems stem from poorly placed vents leading to overheating in some environments, causing the plastic mouldings inside the console to warp slightly and create knock-on effects with the laser assembly. The solution is to sit the console on a surface which dissipates heat efficiently in a well-ventilated area, or to raise the unit slightly from its resting surface. Sony representatives also recommended unplugging the PlayStation when not in use, as the system draws a small amount of power (and therefore generates heat) even when turned off. The first batch of PlayStations use a KSM-440AAM laser unit, whose case and movable parts are all built out of plastic. Over time, the plastic lens sled rail wears out—usually unevenly—due to friction.
The placement of the laser unit close to the power supply accelerates wear, due to the additional heat, which makes the plastic more vulnerable to friction. Eventually, one side of the lens sled becomes so worn that the laser tilts and no longer points directly at the CD; after this, games will no longer load due to data read errors. Sony fixed the problem by making the sled out of die-cast metal and placing the laser unit further away from the power supply on later PlayStation models. Due to an engineering oversight, the PlayStation does not produce a proper signal on several older models of televisions, causing the display to flicker or bounce around the screen. Sony decided not to change the console design, since only a small percentage of PlayStation owners used such televisions, and instead gave consumers the option of sending their PlayStation unit to a Sony service centre to have an official modchip installed, allowing play on older televisions.
Game library
The PlayStation featured a diverse game library which grew to appeal to all types of players. Critically acclaimed PlayStation games included Final Fantasy VII (1997), Crash Bandicoot (1996), Spyro the Dragon (1998), and Metal Gear Solid (1998), all of which became established franchises. Final Fantasy VII is credited with allowing role-playing games to gain mass-market appeal outside Japan, and is considered one of the most influential and greatest video games ever made. The PlayStation's bestselling game is Gran Turismo (1997), which sold 10.85 million units. After the PlayStation's discontinuation in 2006, the cumulative software shipment stood at 962 million units. Following its 1994 launch in Japan, early games included Ridge Racer, Crime Crackers, King's Field, Motor Toon Grand Prix, Toh Shin Den (i.e. Battle Arena Toshinden), and Kileak: The Blood. The first two games available at its later North American launch were Jumping Flash! (1995) and Ridge Racer, with Jumping Flash!
heralded as a forerunner of 3D graphics in console gaming. Wipeout, Air Combat, Twisted Metal, Warhawk and Destruction Derby were among the popular first-year games, and the first to be reissued as part of Sony's Greatest Hits or Platinum ranges. At the time of the PlayStation's first Christmas season, Psygnosis had produced around 70% of its launch catalogue; their breakthrough racing game Wipeout was acclaimed for its techno soundtrack and helped raise awareness of Britain's underground music community. Eidos Interactive's action-adventure game Tomb Raider contributed substantially to the success of the console in 1996, with its protagonist Lara Croft becoming an early gaming icon and garnering unprecedented media promotion. Licensed tie-in video games of popular films were also prevalent; Argonaut Games' 2001 adaptation of Harry Potter and the Philosopher's Stone went on to sell over eight million copies late in the console's lifespan. Third-party developers remained largely committed to the console's wide-ranging game catalogue even after the launch of the PlayStation 2; notable exclusives from this era include Harry Potter and the Philosopher's Stone, Fear Effect 2: Retro Helix, Syphon Filter 3, C-12: Final Resistance, Dance Dance Revolution Konamix and Digimon World 3.[c] Sony assisted with game reprints as late as 2008 with Metal Gear Solid: The Essential Collection, the last PlayStation game officially released and licensed by Sony. Initially, in the United States, PlayStation games were packaged in long cardboard boxes, similar to non-Japanese 3DO and Saturn games. Sony later switched to the jewel case format typically used for audio CDs and Japanese video games, as this format took up less retailer shelf space (which was at a premium due to the large number of PlayStation games being released), and focus testing showed that most consumers preferred it.
Reception
The PlayStation was mostly well received upon release.
Critics in the West generally welcomed the new console. The staff of Next Generation reviewed the PlayStation a few weeks after its North American launch, commenting that, while the CPU is "fairly average", the supplementary custom hardware, such as the GPU and sound processor, is stunningly powerful. They praised the PlayStation's focus on 3D, and complimented the comfort of its controller and the convenience of its memory cards. Giving the system 4½ out of 5 stars, they concluded, "To succeed in this extremely cut-throat market, you need a combination of great hardware, great games, and great marketing. Whether by skill, luck, or just deep pockets, Sony has scored three out of three in the first salvo of this war." Albert Kim of Entertainment Weekly praised the PlayStation as a technological marvel rivalling the offerings of Sega and Nintendo. Famicom Tsūshin scored the console 19 out of 40 in May 1995, lower than the Saturn's 24 out of 40. In a 1997 year-end review, a team of five Electronic Gaming Monthly editors gave the PlayStation scores of 9.5, 8.5, 9.0, 9.0, and 9.5—for all five editors, the highest score they gave to any of the five consoles reviewed in the issue. They lauded the breadth and quality of the games library, saying it had vastly improved over previous years due to developers mastering the system's capabilities and Sony revising their stance on 2D and role-playing games. They also complimented the low price of its games compared to the Nintendo 64's, and noted that it was the only console on the market that could be relied upon to deliver a solid stream of games for the coming year, primarily because third-party developers almost unanimously favoured it over its competitors.
Legacy
SCE was an upstart in the video game industry in late 1994, as the video game market in the early 1990s was dominated by Nintendo and Sega.
Nintendo had been the clear leader in the industry since the introduction of the Nintendo Entertainment System in 1985, and the Nintendo 64 was initially expected to maintain this position. The PlayStation's target audience included the first generation to grow up with mainstream video games, along with 18- to 29-year-olds who were not Nintendo's primary focus. By the late 1990s, Sony had become a highly regarded console brand due to the PlayStation, with a significant lead over second-place Nintendo, while Sega was relegated to a distant third. The PlayStation became the first "computer entertainment platform" to ship over 100 million units worldwide, with many critics attributing the console's success to third-party developers. It remains the sixth best-selling console of all time as of 2025, with a total of 102.49 million units sold. Around 7,900 individual games were published for the console during its 11-year lifespan, the second-most games ever produced for a console. Its success was a significant financial boon for Sony, with the video game division coming to contribute 23% of the company's profits. Sony's next-generation PlayStation 2, which is backward compatible with the PlayStation's DualShock controller and games, was announced in 1999 and launched in 2000. The PlayStation's lead in installed base and developer support paved the way for the success of its successor, which overcame the earlier launch of Sega's Dreamcast and then fended off competition from Microsoft's newcomer Xbox and Nintendo's GameCube. The PlayStation 2's immense success and the failure of the Dreamcast were among the main factors that led to Sega abandoning the console market. To date, five PlayStation home consoles have been released, all continuing the same numbering scheme, as well as two portable systems. The PlayStation 3 also maintained backward compatibility with original PlayStation discs.
Hundreds of PlayStation games have been digitally re-released on the PlayStation Portable, PlayStation 3, PlayStation Vita, PlayStation 4, and PlayStation 5. The PlayStation has often ranked among the best video game consoles. In 2018, Retro Gamer named it the third best console, crediting its sophisticated 3D capabilities as a key factor in its mass success and lauding it as a "game-changer in every sense possible". In 2009, IGN ranked the PlayStation the seventh best console on their list, noting its appeal to older audiences as a crucial factor in propelling the video game industry, as well as its role in transitioning the game industry to the CD-ROM format. Keith Stuart of The Guardian likewise named it the seventh best console in 2020, declaring that its success was so profound that it "ruled the 1990s". In January 2025, Lorentio Brodesco announced the nsOne project, an attempt to reverse-engineer the PlayStation's motherboard. Brodesco stated that "detailed documentation on the original motherboard was either incomplete or entirely unavailable". The project was successfully crowdfunded via Kickstarter. In June, Brodesco manufactured the first working motherboard, promising a fully routed version with multilayer routing, as well as documentation and design files, in the near future. The success of the PlayStation contributed to the demise of cartridge-based home consoles. While not the first system to use an optical disc format, it was the first highly successful one, and it ended up going head-to-head with the cartridge-based Nintendo 64,[d] which the industry had expected to use CDs like the PlayStation. After the demise of the Sega Saturn, Nintendo was left as Sony's main competitor in Western markets.
Nintendo chose not to use CDs for the Nintendo 64, likely because the proprietary cartridge format helped enforce copy protection, given Nintendo's substantial reliance on licensing and exclusive games for revenue. Besides their larger capacity, CD-ROMs could be produced in bulk at a much faster rate than ROM cartridges: a week compared to two to three months. Further, the cost of production per unit was far lower, allowing Sony to offer games at about 40% lower cost to the user than ROM cartridge games while still making the same amount of net revenue. In Japan, Sony published small runs of a wide variety of games for the PlayStation as a risk-limiting step, a model that had been used by Sony Music for audio CDs. The production flexibility of CD-ROMs meant that Sony could quickly produce larger volumes of popular games to get them onto the market, something that could not be done with cartridges due to their manufacturing lead time. The lower production costs of CD-ROMs also gave publishers an additional source of profit: budget-priced reissues of games which had already recouped their development costs. Tokunaka remarked in 1996: "Choosing CD-ROM is one of the most important decisions that we made. As I'm sure you understand, PlayStation could just as easily have worked with masked ROM [cartridges]. The 3D engine and everything—the whole PlayStation format—is independent of the media. But for various reasons (including the economies for the consumer, the ease of the manufacturing, inventory control for the trade, and also the software publishers) we deduced that CD-ROM would be the best media for PlayStation." The increasing complexity of developing games pushed cartridges to their storage limits and gradually discouraged some third-party developers. Part of the CD format's appeal to publishers was that discs could be produced at a significantly lower cost and offered more production flexibility to meet demand.
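The pricing claim above (games roughly 40% cheaper on CD while yielding the same net revenue per unit) can be made concrete with deliberately hypothetical numbers; none of the dollar figures below come from this article.

```python
# Illustrative cartridge-vs-CD unit economics. All dollar figures are
# hypothetical, chosen only so the arithmetic matches the claims in the
# text (about 40% lower retail price, same net revenue per unit).
cart_retail = 70   # hypothetical cartridge game retail price
cart_cost = 30     # hypothetical per-unit cartridge manufacturing cost

cd_cost = 2                                         # hypothetical per-disc cost
cd_retail = cart_retail - cart_retail * 40 // 100   # "about 40% lower cost"

# Net revenue per unit (ignoring licensing, distribution, and so on)
cart_net = cart_retail - cart_cost
cd_net = cd_retail - cd_cost
print(cd_retail, cart_net, cd_net)  # → 42 40 40
```

The point of the sketch is that a large gap in manufacturing cost lets the retail price fall sharply without shrinking the per-unit margin, which is the mechanism the text attributes to Sony's CD-ROM strategy.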
As a result, some third-party developers switched to the PlayStation, including Square and Enix, whose Final Fantasy VII and Dragon Quest VII respectively had been planned for the Nintendo 64 (the two companies later merged to form Square Enix). Other developers released fewer games for the Nintendo 64: Konami, for example, released only thirteen N64 games but over fifty for the PlayStation. Nintendo 64 game releases were less frequent than the PlayStation's, with many developed by either Nintendo itself or second parties such as Rare. The PlayStation Classic is a dedicated video game console made by Sony Interactive Entertainment that emulates PlayStation games. It was announced in September 2018 at the Tokyo Game Show and released on 3 December 2018, the 24th anniversary of the release of the original console. As a dedicated console, the PlayStation Classic features 20 pre-installed games, which run on the open-source emulator PCSX. The console is bundled with two replica wired PlayStation controllers (the version without analogue sticks), an HDMI cable, and a USB Type-A cable. Internally, the console uses a MediaTek MT8167A system on a chip with four Cortex-A35 central processing cores clocked at 1.5 GHz and a PowerVR GE8300 graphics processing unit. It includes 16 GB of eMMC flash storage and 1 GB of DDR3 SDRAM. The PlayStation Classic is 45% smaller than the original console. It received negative reviews from critics and was compared unfavorably to Nintendo's rival Nintendo Entertainment System Classic Edition and Super Nintendo Entertainment System Classic Edition. Criticism was directed at its meagre game library, user interface, emulation quality, use of PAL versions of certain games, use of the original controller, and high retail price, though the console's design received praise. The console sold poorly.
======================================== |
[SOURCE: https://www.theverge.com/news/816549/texas-roblox-lawsuit-child-safety] | [TOKENS: 2142] |
Texas sues Roblox for allegedly failing to protect children on its platform. Roblox is being accused of lying about its safety features as children were put in harm’s way. By Stevie Bonifield, News Writer. Nov 7, 2025, 5:51 PM UTC. Illustration: The Verge.
Texas AG Ken Paxton is accusing Roblox of “putting pixel pedophiles and profits over the safety of Texas children,” alleging in a lawsuit filed this week that it is “flagrantly ignoring state and federal online safety laws while deceiving parents about the dangers of its platform.” The lawsuit accuses Roblox of deceptive trade practices for misleading parents and users about its safety features, and of creating a common nuisance by harboring a space “that has become a habitual destination for child predators engaging in grooming and child sexual exploitation.” The lawsuit’s examples focus on instances of children who have been abused by predators they met via Roblox, and on the activities of groups like 764, which have used online platforms to identify and blackmail victims into sexually explicit acts or self-harm. According to the suit, Roblox’s parental controls push began only after a number of lawsuits and a report released last fall by the short seller Hindenburg, which said its “in-game research revealed an X-rated pedophile hellscape, exposing children to grooming, pornography, violent content and extremely abusive speech.” In August, Louisiana filed a similar lawsuit, alleging that Roblox “permitted and perpetuated an online environment in which child predators thrive.” A couple of months later, the state of Kentucky also sued Roblox, calling it “a hunting ground for child predators.” Last month, Florida Attorney General James Uthmeier subpoenaed Roblox over similar allegations. It’s not just states suing Roblox, either.
Numerous families and Roblox players have also sued the platform for alleged abuse, such as in the cases detailed in Texas’s lawsuit. Eric Porterfield, Senior Director of Policy Communications at Roblox, responded to the lawsuit in a statement to The Verge, saying, “We are disappointed that, rather than working collaboratively with Roblox on this industry-wide challenge and seeking real solutions, the AG has chosen to file a lawsuit based on misrepresentations and sensationalized claims.” He added, “We have introduced over 145 safety measures on the platform this year alone.” Roblox reported in September that it has over 111 million daily active users, many of whom are children. Earlier this year, Roblox announced plans to roll out age verification using IDs and facial scans, along with an AI system intended to “detect early signals of potential child endangerment.” This echoes similar changes on social media platforms like Discord, which also began rolling out age verification this year and has been cited in some of the same lawsuits filed against Roblox, including one case involving a 13-year-old from Texas.
Social media platforms have often successfully used Section 230 to shield themselves from liability for individual users’ actions on their platforms, however — a barrier this Roblox suit, like others against the company, will face.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Quark_matter] | [TOKENS: 2836] |
Quark matter or QCD matter (quantum chromodynamics matter) refers to any of a number of hypothetical phases of matter whose degrees of freedom include quarks and gluons, the most prominent example being quark–gluon plasma. Several series of conferences in 2019, 2020, and 2021 were devoted to this topic. Quarks are liberated into quark matter at extremely high temperatures and/or densities; some of these phases remain purely theoretical, as they require conditions so extreme that they cannot be produced in any laboratory, especially not at equilibrium. Under these extreme conditions, the familiar structure of matter, whose basic constituents are nuclei (consisting of nucleons, which are bound states of quarks) and electrons, is disrupted. In quark matter it is more appropriate to treat the quarks themselves as the basic degrees of freedom. In the Standard Model of particle physics, the strong force is described by the theory of QCD. At ordinary temperatures or densities this force merely confines the quarks into composite particles (hadrons) of size around 10^-15 m = 1 femtometer = 1 fm (corresponding to the QCD energy scale Λ_QCD ≈ 200 MeV), and its effects are not noticeable at longer distances. However, when the temperature reaches the QCD energy scale (T of order 10^12 kelvins) or the density rises to the point where the average inter-quark separation is less than 1 fm (quark chemical potential μ around 400 MeV), the hadrons melt into their constituent quarks, and the strong interaction becomes the dominant feature of the physics. Such phases are called quark matter or QCD matter. The strength of the color force makes the properties of quark matter unlike those of a gas or plasma, instead leading to a state of matter more reminiscent of a liquid. At high densities, quark matter is a Fermi liquid, but it is predicted to exhibit color superconductivity at high densities and temperatures below 10^12 K.
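The correspondence stated above between the QCD energy scale and a temperature "of order 10^12 kelvins" is a one-line unit conversion, T = E / k_B. A minimal sketch, assuming only the standard value of the Boltzmann constant:

```python
# Convert an energy scale in MeV to a temperature in kelvin via T = E / k_B.
K_B_EV_PER_K = 8.617333262e-5  # Boltzmann constant in eV/K (CODATA value)

def mev_to_kelvin(energy_mev: float) -> float:
    """Temperature equivalent of an energy given in MeV."""
    return energy_mev * 1e6 / K_B_EV_PER_K

# Lambda_QCD ~ 200 MeV lands at roughly 2.3e12 K, i.e. "of order 10^12 K".
t_qcd = mev_to_kelvin(200.0)
```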
Occurrence. At this time no star with the properties expected of these objects has been observed, although some evidence has been provided for quark matter in the cores of large neutron stars. Laboratory experiments suggest that the inevitable interaction of high-energy cosmic rays with heavy noble gas nuclei in the upper atmosphere would lead to quark–gluon plasma formation. Even though quark–gluon plasma can only occur under quite extreme conditions of temperature and/or pressure, it is being actively studied at particle colliders, such as the Large Hadron Collider (LHC) at CERN and the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory. In these collisions, the plasma occurs only for a very short time before it spontaneously disintegrates. The plasma's physical characteristics are studied by detecting the debris emanating from the collision region with large particle detectors. Heavy-ion collisions at very high energies can produce small, short-lived regions of space whose energy density is comparable to that of the 20-microsecond-old universe. This has been achieved by colliding heavy nuclei, such as lead nuclei, at high speeds; the first claim of quark–gluon plasma formation came from the SPS accelerator at CERN in February 2000. This work has continued at more powerful accelerators, such as RHIC in the US and, as of 2010, the LHC at CERN on the border of Switzerland and France. There is good evidence that the quark–gluon plasma has also been produced at RHIC. Thermodynamics. The context for understanding the thermodynamics of quark matter is the Standard Model of particle physics, which contains six different flavors of quarks, as well as leptons such as electrons and neutrinos. These interact via the strong interaction, electromagnetism, and the weak interaction, which allows one flavor of quark to turn into another.
Electromagnetic interactions occur between particles that carry electrical charge; strong interactions occur between particles that carry color charge. The correct thermodynamic treatment of quark matter depends on the physical context. For large quantities that exist for long periods of time (the "thermodynamic limit"), we must take into account the fact that the only conserved charges in the standard model are quark number (equivalent to baryon number), electric charge, the eight color charges, and lepton number. Each of these can have an associated chemical potential. However, large volumes of matter must be electrically and color-neutral, which determines the electric and color charge chemical potentials. This leaves a three-dimensional phase space, parameterized by quark chemical potential, lepton chemical potential, and temperature. In compact stars quark matter would occupy cubic kilometers and exist for millions of years, so the thermodynamic limit is appropriate. However, the neutrinos escape, violating lepton number, so the phase space for quark matter in compact stars only has two dimensions, temperature (T) and quark number chemical potential μ. A strangelet is not in the thermodynamic limit of large volume, so it is like an exotic nucleus: it may carry electric charge. A heavy-ion collision is in neither the thermodynamic limit of large volumes nor long times. Putting aside questions of whether it is sufficiently equilibrated for thermodynamics to be applicable, there is certainly not enough time for weak interactions to occur, so flavor is conserved, and there are independent chemical potentials for all six quark flavors. The initial conditions (the impact parameter of the collision, the number of up and down quarks in the colliding nuclei, and the fact that they contain no quarks of other flavors) determine the chemical potentials.
Phase diagram. Based on rigorous theoretical calculations valid at ultra-high density and a few ultrarelativistic heavy-ion collision experiments, an outline of the phase diagram of quark matter has been worked out, as shown in the figure to the right. It is relevant for understanding the core of neutron stars, where the only relevant thermodynamic potentials are the quark chemical potential μ and the temperature T. For guidance the figure also shows the typical values of μ and T in heavy-ion collisions and in the early universe. For readers not familiar with the concept of a chemical potential, it is helpful to think of μ as a measure of the imbalance between quarks and antiquarks in the system. Higher μ means a stronger bias favoring quarks over antiquarks. At low temperatures there are no antiquarks, and then higher μ generally means a higher density of quarks. Ordinary atomic matter as we know it is really a mixed phase, droplets of nuclear matter (nuclei) surrounded by vacuum, which exists at the low-temperature phase boundary between vacuum and nuclear matter, at μ = 310 MeV and T close to zero. If we increase the quark density (i.e. increase μ) while keeping the temperature low, we move into a phase of more and more compressed nuclear matter. Following this path corresponds to burrowing more and more deeply into a neutron star. Eventually, at an unknown critical value of μ, there is a transition to quark matter. At ultra-high densities we expect to find the color-flavor-locked (CFL) phase of color-superconducting quark matter. At intermediate densities we expect some other phases (labelled "non-CFL quark liquid" in the figure) whose nature is presently unknown. They might be other forms of color-superconducting quark matter, or something different. Now, imagine starting at the bottom left corner of the phase diagram, in the vacuum where μ = T = 0.
If we heat up the system without introducing any preference for quarks over antiquarks, this corresponds to moving vertically upwards along the T axis. At first, quarks are still confined and we create a gas of hadrons (mostly pions). Then around T = 150 MeV there is a crossover to the quark–gluon plasma: thermal fluctuations break up the pions, and we find a gas of quarks, antiquarks, and gluons, as well as lighter particles such as photons, electrons, positrons, etc. Following this path corresponds to travelling far back in time, to the state of the universe shortly after the Big Bang (where there was a very tiny preference for quarks over antiquarks). The line that rises up from the nuclear/quark matter transition and then bends back towards the T axis, with its end marked by a star, is the conjectured boundary between confined and unconfined phases. Until recently it was also believed to be a boundary between phases where chiral symmetry is broken (low temperature and density) and phases where it is unbroken (high temperature and density). It is now known that the CFL phase exhibits chiral symmetry breaking, and other quark matter phases may also break chiral symmetry, so it is not clear whether this is really a chiral transition line. The line ends at the "chiral critical point", marked by a star in the figure, a special temperature and density at which striking physical phenomena, analogous to critical opalescence, are expected. A complete description of the phase diagram requires a full understanding of dense, strongly interacting hadronic matter and strongly interacting quark matter from an underlying theory, e.g. quantum chromodynamics (QCD). However, because such a description requires a proper understanding of QCD in its non-perturbative regime, which is still far from complete, any theoretical advance remains very challenging.
Theoretical challenges: calculation techniques. The phase structure of quark matter remains mostly conjectural because it is difficult to perform calculations predicting its properties. The reason is that QCD, the theory describing the dominant interaction between quarks, is strongly coupled at the densities and temperatures of greatest physical interest, and hence it is very hard to obtain any predictions from it. Here are brief descriptions of some of the standard approaches. The only first-principles calculational tool currently available is lattice QCD, i.e. brute-force computer calculation. Because of a technical obstacle known as the fermion sign problem, this method can only be used at low density and high temperature (μ < T), and it predicts that the crossover to the quark–gluon plasma will occur around T = 150 MeV. However, it cannot be used to investigate the interesting color-superconducting phase structure at high density and low temperature. Because QCD is asymptotically free, it becomes weakly coupled at unrealistically high densities, and diagrammatic methods can be used. Such methods show that the CFL phase occurs at very high density. At high temperatures, however, diagrammatic methods are still not under full control. To obtain a rough idea of what phases might occur, one can use a model that has some of the same properties as QCD but is easier to manipulate. Many physicists use Nambu–Jona-Lasinio models, which contain no gluons and replace the strong interaction with a four-fermion interaction. Mean-field methods are commonly used to analyse the phases. Another approach is the bag model, in which the effects of confinement are simulated by an additive energy density that penalizes unconfined quark matter. Many physicists simply give up on a microscopic approach and make informed guesses about the expected phases (perhaps based on NJL model results).
For each phase, they then write down an effective theory for the low-energy excitations, in terms of a small number of parameters, and use it to make predictions that could allow those parameters to be fixed by experimental observations. There are other methods that are sometimes used to shed light on QCD but, for various reasons, have not yet yielded useful results in studying quark matter. One is to treat the number of colors N, which is actually 3, as a large number, and expand in powers of 1/N; it turns out that at high density the higher-order corrections are large, and the expansion gives misleading results. Another is to add scalar quarks (squarks) and fermionic gluons (gluinos) to the theory, which makes it more tractable; however, the thermodynamics of quark matter depends crucially on the fact that only fermions can carry quark number, and on the number of degrees of freedom in general. Experimental challenges. Experimentally, it is hard to map the phase diagram of quark matter because it is difficult to reach high enough temperatures and densities in laboratory experiments using collisions of relativistic heavy ions. However, these collisions will ultimately provide information about the crossover from hadronic matter to quark–gluon plasma. It has been suggested that observations of compact stars may also constrain the high-density, low-temperature region. Models of the cooling, spin-down, and precession of these stars offer information about the relevant properties of their interiors. As observations become more precise, physicists hope to learn more. One natural subject for future research is the search for the exact location of the chiral critical point. Some ambitious lattice QCD calculations may have found evidence for it, and future calculations will clarify the situation. Heavy-ion collisions might be able to measure its position experimentally, but this will require scanning across a range of values of μ and T.
Evidence. In 2020, evidence was provided that the cores of neutron stars with mass ~2 M⊙ were likely composed of quark matter. The result was based on neutron-star tidal deformability during a neutron-star merger as measured by gravitational-wave observatories, leading to an estimate of the star radius, combined with calculations of the equation of state relating the pressure and energy density of the star's core. The evidence was strongly suggestive but did not conclusively prove the existence of quark matter.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Hyves] | [TOKENS: 996] |
Hyves was a Dutch social media and social networking site with mainly Dutch visitors and members, competing with sites such as Facebook and MySpace. Hyves was founded in 2004 by Raymond Spanjar and Floris Rost van Tonningen. The service was available in both Dutch and English. In May 2010 Hyves had more than 10.3 million accounts, corresponding to about two thirds of the Dutch population (which stood at over 16 million in 2010), although this figure included multiple accounts per person and inactive accounts. The number of accounts had grown by over two million compared with 1.5 years earlier. Hyves could be used free of charge, but there was an option for a paid premium membership (called "Gold Membership"). Gold members had access to extra features, such as a wider variety of smileys in their messages and more upload space for pictures. The creators said that a basic Hyves account would always be free. In 2013, the social network was officially discontinued, owing to the large decrease in accounts caused by the growing popularity of other social networks such as Facebook and Twitter in the Netherlands. The site continued as Hyves Games, where members could use their Hyves accounts to play social games. History. Hyves started in September 2004. The name Hyves was chosen because the desired domain name hives.nl was already taken; it referred to beehives and the fact that social networks are built the same way. In May 2006 it became public that the Dutch police were using Hyves as a tool to investigate possible suspects; only information uploaded by suspects was checked. On December 13, 2007, Hyves was awarded the title of "most popular site of the year". In April 2008, Dutch media tycoon Joop van den Ende took a large interest in Hyves, with the intention of expanding abroad and providing mobile services. Hyves changed its design in July 2009.
The site got a new look and feel, described as calmer and more orderly, and the profile picture format was changed to a standard square shape. In her Christmas speech of 2009, Queen Beatrix of the Netherlands expressed negative views about online social networks; in response, the founder of Hyves offered her a free account, so that she could experience Hyves herself. In 2010, it became clear that the fast growth of Hyves was slowing because of the growth of Twitter and Facebook, and Hyves announced extra measures to stay ahead of the competition. These measures appeared successful: Hyves welcomed its 10 millionth user in April 2010. Despite the perception that Hyves mostly had young members, its user base was ageing faster than it was attracting young users; the average age of a Hyves member was 30. In the same month, Hyves announced "Hyves Payments" and "Hyves Games", which allowed users to play games and pay friends through the social network. Although Facebook was growing rapidly in the Netherlands, in 2010 Hyves was still the most popular social network, with 10.6 million users and 68% penetration. Hyves was sold to the Telegraaf Media Groep in November that same year. In September 2011, Facebook received more unique visitors than Hyves for the first time since Hyves' creation. In the same year, Hyves shifted its focus from a pure social network to a content platform, adding a news section, sports results, and radio channels. In 2013, the Telegraaf Media Groep announced that it would turn Hyves from a social network into an exclusive gaming platform. All users had the ability to download their social profile; on December 1, 2013, all accounts that had not been downloaded were removed. In February 2006, Wouter Bos became the first Dutch politician with a Hyves account, starting a trend. The Prime Minister at the time, Jan Peter Balkenende, also saw the potential to communicate with his target audience through social media.
He created an account in 2006. During the Dutch elections of 2010, Hyves was used in a broad range of ways: every leader of every political party had an account on Hyves, and the world's first debate between political leaders on a social network was organised and hosted by Hyves. Features. User profiles could be created without knowledge of HTML, by filling in questionnaires and uploading content. Hyves users had the option to make their profiles available only (to a degree chosen by the user) to friends or friends of friends (the so-called 'connections'). Users could also protect their messages by making them visible only to friends or connections.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/United_States_dollar#Countries_that_use_US_dollar] | [TOKENS: 8224] |
The United States dollar (symbol: $; currency code: USD[a]) is the official currency of the United States and several other countries. The Coinage Act of 1792 introduced the U.S. dollar at par with the Spanish silver dollar, divided it into 100 cents, and authorized the minting of coins denominated in dollars and cents. U.S. banknotes are issued in the form of Federal Reserve Notes, popularly called greenbacks due to their predominantly green color. The U.S. dollar was originally defined under a bimetallic standard of 371.25 grains (24.057 g) (0.7734375 troy ounces) fine silver or, from 1834, 23.22 grains (1.505 g) fine gold, or $20.67 per troy ounce. The Gold Standard Act of 1900 linked the dollar solely to gold. From 1934, its equivalence to gold was revised to $35 per troy ounce. In 1971 all links to gold were repealed. The U.S. dollar became an important international reserve currency after the First World War, and displaced the pound sterling as the world's primary reserve currency by the Bretton Woods Agreement towards the end of the Second World War. The dollar is the most widely used currency in international transactions, and a free-floating currency. It is also the official currency in several countries and the de facto currency in many others, with Federal Reserve Notes (and, in a few cases, U.S. coins) used in circulation. The monetary policy of the United States is conducted by the Federal Reserve System, which acts as the nation's central bank. As of February 10, 2021, currency in circulation amounted to US$2.10 trillion, $2.05 trillion of which is in Federal Reserve Notes (the remaining $50 billion is in the form of coins and older-style United States Notes). As of January 1, 2025, the Federal Reserve estimated that the total amount of currency in circulation was approximately US$2.37 trillion. Overview. Article I, Section 8 of the U.S.
Constitution provides that Congress has the power "to coin money". Laws implementing this power are currently codified in Title 31 of the U.S. Code, under Section 5112, which prescribes the forms in which United States dollars should be issued, including both copper-alloy and silver dollar coins, each designated in the section as legal tender in payment of debts. The Sacagawea dollar is one example of the copper-alloy dollar, in contrast to the American Silver Eagle, which is pure silver. Section 5112 also provides for the minting and issuance of other coins, with values ranging from one cent (the U.S. penny) to 100 dollars. These other coins are more fully described in Coins of the United States dollar. Article I, Section 9 of the Constitution provides that "a regular Statement and Account of the Receipts and Expenditures of all public Money shall be published from time to time", which is further specified by Section 331 of Title 31 of the U.S. Code. The sums of money reported in the "Statements" are currently expressed in U.S. dollars, thus the U.S. dollar may be described as the unit of account of the United States. "Dollar" is one of the first words of Section 9, in which the term refers to the Spanish milled dollar, or the coin worth eight Spanish reales. In 1792, the U.S. Congress passed the Coinage Act, of which Section 9 authorized the production of various coins, including "Dollars or Units—each to be of the value of a Spanish milled dollar as the same is now current, and to contain three hundred and seventy-one grains and four sixteenth parts of a grain of pure, or four hundred and sixteen grains of standard silver." Section 20 of the Act designates the United States dollar as the unit of currency of the United States: "[T]he money of account of the United States shall be expressed in dollars, or units...and that all accounts in the public offices and all proceedings in the courts of the United States shall be kept and had in conformity to this regulation."
Unlike the Spanish milled dollar, the Continental Congress and the Coinage Act prescribed a decimal system of units to go with the unit dollar, as follows: the mill, or one-thousandth of a dollar; the cent, or one-hundredth of a dollar; the dime, or one-tenth of a dollar; and the eagle, or ten dollars. The current relevance of these units: the Spanish peso, or dollar, was historically divided into eight reales (colloquially, bits) – hence pieces of eight. Americans also learned counting in non-decimal bits of 12+1⁄2 cents before 1857, when Mexican bits were more frequently encountered than American cents; in fact this practice survived in New York Stock Exchange quotations until 2001. In 1854, Secretary of the Treasury James Guthrie proposed creating $100, $50, and $25 gold coins, to be referred to as a union, half union, and quarter union, respectively, thus implying a denomination of 1 union = $100. However, no such coins were ever struck, and only patterns for the $50 half union exist. When currently issued in circulating form, denominations less than or equal to a dollar are emitted as U.S. coins, while denominations greater than or equal to a dollar are emitted as Federal Reserve Notes, with a few special cases.

Etymology

In the 16th century, Count Hieronymus Schlick of Bohemia began minting coins known as joachimstalers, named for Joachimstal, the valley in which the silver was mined. The valley's name in turn derives from Saint Joachim, whereby thal or tal, a cognate of the English word dale, is German for 'valley'. The joachimstaler was later shortened to the German taler, a word that eventually found its way into many languages, including: tolar (Czech, Slovak and Slovenian); daler (Danish and Swedish); talar (Polish); dalar and daler (Norwegian); daler or daalder (Dutch); talari (Ethiopian); tallér (Hungarian); tallero (Italian); دولار (Arabic); and dollar (English).
Though the Dutch pioneered the use and counting of money in silver dollars in modern-day New York in the 17th century, in the form of German-Dutch reichsthalers and native Dutch leeuwendaalders ('lion dollars'), it was the ubiquitous Spanish American eight-real coin that became known simply as the dollar from the 18th century onward. The colloquialism buck(s) (much like the British quid for the pound sterling) is often used to refer to dollars of various nations, including the U.S. dollar. This term, dating to the 18th century, may have originated with the colonial leather trade, or it may have originated from a poker term. Greenback is another nickname, originally applied specifically to the 19th-century Demand Note dollars, which were printed in black with green backs and issued under Abraham Lincoln to finance the Union's Civil War effort. It is still used to refer to the U.S. dollar (but not to the dollars of other countries). The term greenback is also used by the financial press in other countries, such as Australia, New Zealand, South Africa, and India. Other well-known names for dollars in various denominations include greenmail, green, and dead presidents, the latter referring to the deceased presidents pictured on most bills. Dollars in general have also been known as bones (e.g. "twenty bones" = $20). The newer designs, with portraits displayed in the main body of the obverse (rather than in cameo insets), upon paper color-coded by denomination, are sometimes referred to as bigface notes or Monopoly money.[citation needed] Piastre was the original French word for the U.S. dollar, used for example in the French text of the Louisiana Purchase. Though the U.S. dollar is called dollar in Modern French, the term piastre is still used among speakers of Cajun French and New England French, as well as speakers in Haiti and other French Caribbean islands.
Dollar sign

The symbol $, usually written before the numerical amount, is used for the U.S. dollar (as well as for many other currencies). The sign was perhaps the result of a late 18th-century evolution of the scribal abbreviation ps for the peso, the common name for the Spanish dollars that were in wide circulation in the New World from the 16th to the 19th centuries. The p and the s eventually came to be written over each other, giving rise to $. Another popular explanation is that it is derived from the Pillars of Hercules on the Spanish coat of arms of the Spanish dollar. These Pillars of Hercules on the silver Spanish dollar coins take the form of two vertical bars (||) and a swinging cloth band in the shape of an S. Yet another explanation suggests that the dollar sign was formed from the capital letters U and S written or printed one on top of the other. This theory, popularized by novelist Ayn Rand in Atlas Shrugged, does not consider the fact that the symbol was already in use before the formation of the United States.

History

The U.S. dollar was introduced at par with the Spanish-American silver dollar (or Spanish peso, Spanish milled dollar, eight-real coin, piece-of-eight). The latter was produced from the rich silver mine output of Spanish America, was minted in Mexico City, Potosí (Bolivia), Lima (Peru), and elsewhere, and was in wide circulation throughout the Americas, Asia, and Europe from the 16th to the 19th centuries. The minting of machine-milled Spanish dollars since 1732 boosted its worldwide reputation as a trade coin and positioned it to be the model for the new currency of the United States.[citation needed] Even after the United States Mint commenced issuing coins in 1792, locally minted dollars and cents were less abundant in circulation than Spanish American pesos and reales; hence Spanish, Mexican, and American dollars all remained legal tender in the United States until the Coinage Act of 1857.
In particular, colonists' familiarity with the Spanish two-real quarter peso was the reason for issuing a quasi-decimal 25-cent quarter dollar coin rather than a 20-cent coin.[citation needed] For the relationship between the Spanish dollar and the individual state colonial currencies, see Connecticut pound, Delaware pound, Georgia pound, Maryland pound, Massachusetts pound, New Hampshire pound, New Jersey pound, New York pound, North Carolina pound, Pennsylvania pound, Rhode Island pound, South Carolina pound, and Virginia pound.[citation needed] On July 6, 1785, the Continental Congress resolved that the money unit of the United States, the dollar, would contain 375.64 grains of fine silver; on August 8, 1786, the Continental Congress continued that definition and further resolved that the money of account, corresponding with the division of coins, would proceed in a decimal ratio, with the sub-units being mills at 0.001 of a dollar, cents at 0.010 of a dollar, and dimes at 0.100 of a dollar. After the adoption of the United States Constitution, the U.S. dollar was defined by the Coinage Act of 1792. It specified a "dollar" based on the Spanish milled dollar to contain 371+4⁄16 grains of fine silver, or 416.0 grains (26.96 g) of "standard silver" of fineness 371.25/416 = 89.24%; as well as an "eagle" to contain 247+4⁄8 grains of fine gold, or 270.0 grains (17.50 g) of 22 karat or 91.67% fine gold. Alexander Hamilton arrived at these numbers based on a treasury assay of the average fine silver content of a selection of worn Spanish dollars, which came out to be 371 grains. Combined with the prevailing gold-silver ratio of 15, the standard for gold was calculated at 371/15 = 24.73 grains fine gold or 26.98 grains 22K gold. Rounding the latter to 27.0 grains finalized the dollar's standard to 24.75 grains of fine gold or 24.75 × 15 = 371.25 grains = 24.0566 grams = 0.7735 troy ounces of fine silver. 
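Hamilton's derivation above is simple arithmetic, and the quoted figures can be checked directly. A minimal sketch (the grain-to-gram factor is the standard definition, 1 grain = 0.06479891 g; everything else comes from the figures in the text):

```python
# Sketch: reproducing Hamilton's 1792 derivation of the dollar standard.
GRAIN_G = 0.06479891                 # grams per grain (exact, by definition)

silver_assay = 371.0                 # grains fine silver, from the treasury assay
gold_silver_ratio = 15               # prevailing gold-silver price ratio

gold_fine_raw = silver_assay / gold_silver_ratio   # ~24.73 grains fine gold
gold_22k_raw = gold_fine_raw * 24 / 22             # ~26.98 grains of 22K gold

# Rounding the 22-karat weight up to 27.0 grains fixes the final standard:
gold_fine = 27.0 * 22 / 24                         # 24.75 grains fine gold
silver_standard = gold_fine * gold_silver_ratio    # 371.25 grains fine silver

assert round(gold_fine_raw, 2) == 24.73
assert round(gold_22k_raw, 2) == 26.98
assert gold_fine == 24.75
assert silver_standard == 371.25
assert round(silver_standard * GRAIN_G, 4) == 24.0566   # grams, as quoted
```

The rounding step is why the final silver standard (371.25 grains) slightly exceeds the assayed average of 371 grains.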
The same coinage act also set the value of an eagle at 10 dollars, and the dollar at 1⁄10 eagle. It called for silver coins in denominations of 1, 1⁄2, 1⁄4, 1⁄10, and 1⁄20 dollar, as well as gold coins in denominations of 1, 1⁄2 and 1⁄4 eagle. The value of gold or silver contained in the dollar was then converted into relative value in the economy for the buying and selling of goods. This allowed the value of things to remain fairly constant over time, except as affected by the inflow and outflow of gold and silver in the nation's economy. Though a Spanish dollar freshly minted after 1772 theoretically contained 417.7 grains of silver of fineness 130/144 (or 377.1 grains fine silver), reliable assays of the period in fact confirmed a fine silver content of 370.95 grains (24.037 g) for the average Spanish dollar in circulation. The new U.S. silver dollar of 371.25 grains (24.057 g) therefore compared favorably and was received at par with the Spanish dollar for foreign payments; after 1803 the United States Mint, with its limited resources, had to suspend making the coin since it failed to stay in domestic circulation. Only after Mexican independence in 1821 was the peso's fine silver content of 377.1 grains firmly upheld, a standard the U.S. later had to compete with using a heavier Trade dollar coin of 378.0 grains (24.49 g). The early currency of the United States did not exhibit faces of presidents, as is the custom now, although today, by law, only the portrait of a deceased individual may appear on United States currency. In fact, the newly formed government was against having portraits of leaders on the currency, a practice compared to the policies of European monarchs. U.S. currency did not get the faces it currently has until the early 20th century; before that, the "heads" side of coinage used profile faces and striding, seated, and standing figures from Greek and Roman mythology, as well as composite Native Americans.
The last coins to be converted to profiles of historic Americans were the dime (1946), the half dollar (1948), and the dollar (1971). After the American Revolution, the Thirteen Colonies became independent. Freed from British monetary regulations, they each issued £sd paper money to pay for military expenses. The Continental Congress also began issuing "Continental Currency" denominated in Spanish dollars. For its value relative to states' currencies, see Early American currency. Continental currency depreciated badly during the war, giving rise to the famous phrase "not worth a continental". A primary problem was that monetary policy was not coordinated between Congress and the states, which continued to issue bills of credit. Additionally, neither Congress nor the governments of the several states had the will or the means to retire the bills from circulation through taxation or the sale of bonds. The currency was ultimately replaced by the silver dollar at the rate of 1 silver dollar to 1000 continental dollars. This experience resulted in the clause "No state shall... make anything but gold and silver coin a tender in payment of debts" being written into the United States Constitution, Article 1, Section 10. From implementation of the 1792 Mint Act to the 1900 implementation of the gold standard, the dollar was on a bimetallic silver-and-gold standard, defined as either 371.25 grains (24.056 g) of fine silver or 24.75 grains of fine gold (gold-silver ratio 15). Subsequent to the Coinage Act of 1834 the dollar's fine gold equivalent was revised to 23.2 grains; it was slightly adjusted to 23.22 grains (1.505 g) in 1837 (gold-silver ratio ≈16). The same act also resolved the difficulty in minting the "standard silver" of 89.24% fineness by revising the dollar's alloy to 412.5 grains, 90% silver, still containing 371.25 grains fine silver. Gold was also revised to 90% fineness: 25.8 grains gross, 23.22 grains fine gold.
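These revised figures determine the mint price of gold that prevailed until 1933. A quick arithmetic check using only the numbers above (the grains-per-troy-ounce constant is the standard definition):

```python
# Sketch: checking the post-1837 gold standard figures quoted above.
TROY_OZ_GRAINS = 480          # grains per troy ounce (standard definition)

gross_grains = 25.8           # gross weight of gold per dollar
fineness = 0.90               # 90% fine
fine_grains = gross_grains * fineness            # 23.22 grains fine gold

# Implied mint price of gold: dollars per troy ounce of fine gold
mint_price = TROY_OZ_GRAINS / fine_grains        # ~$20.67 per troy ounce

assert round(fine_grains, 2) == 23.22
assert round(mint_price, 2) == 20.67
```

The $20.67 figure recovered here is the classic pre-1933 gold parity quoted in the article's introduction.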
Following the rise in the price of silver during the California Gold Rush and the disappearance of circulating silver coins, the Coinage Act of 1853 reduced the standard for silver coins of less than $1 from 412.5 grains to 384 grains (24.9 g), 90% silver, per 100 cents (slightly revised to 25.0 g, 90% silver in 1873). The Act also limited the free silver right of individuals to convert bullion into only one coin, the silver dollar of 412.5 grains; smaller coins of lower standard could only be produced by the United States Mint using its own bullion. In order to finance the War of 1812, Congress authorized the issuance of Treasury Notes, interest-bearing short-term debt that could be used to pay public dues. While they were intended to serve as debt, they did function "to a limited extent" as money. Treasury Notes were again printed to help resolve the reduction in public revenues resulting from the Panic of 1837 and the Panic of 1857, as well as to help finance the Mexican–American War and the Civil War. Paper money was issued again in 1862 without the backing of precious metals, due to the Civil War. In addition to Treasury Notes, Congress in 1861 authorized the Treasury to borrow $50 million in the form of Demand Notes, which did not bear interest but could be redeemed on demand for precious metals. However, by December 1861, the Union government's supply of specie was outstripped by demand for redemption, and it was forced to suspend redemption temporarily. In February 1862 Congress passed the Legal Tender Act of 1862, issuing United States Notes, which were not redeemable on demand and bore no interest, but were legal tender, meaning that creditors had to accept them at face value for any payment except import tariffs and interest on public debts. However, silver and gold coins continued to be issued, resulting in the depreciation of the newly printed notes through Gresham's law.
In 1869, the Supreme Court ruled in Hepburn v. Griswold that Congress could not require creditors to accept United States Notes, but overturned that ruling the next year in the Legal Tender Cases. In 1875, Congress passed the Specie Payment Resumption Act, requiring the Treasury to allow U.S. Notes to be redeemed for gold after January 1, 1879. Though the dollar came under the gold standard de jure only after 1900, the bimetallic era ended de facto when the Coinage Act of 1873 suspended the minting of the standard silver dollar of 412.5 troy grains (26.73 g; 0.859 ozt), the only fully legal tender coin that individuals could convert bullion into in unlimited quantities (the free silver right),[b] right at the onset of the silver rush from the Comstock Lode in the 1870s. This was the so-called "Crime of '73". The Gold Standard Act of 1900 repealed the U.S. dollar's historic link to silver and defined it solely as 23.22 grains (1.505 g) of fine gold (or $20.67 per troy ounce of 480 grains). In 1933, gold coins were confiscated by Executive Order 6102 under Franklin D. Roosevelt, and in 1934 the standard was changed to $35 per troy ounce of fine gold, or 13.71 grains (0.888 g) per dollar. After 1968 a series of revisions to the gold peg was implemented, culminating in the Nixon Shock of August 15, 1971, which suddenly ended the convertibility of dollars to gold. The U.S. dollar has since floated freely on the foreign exchange markets.[citation needed] Congress continued to issue paper money after the Civil War, the latest form of which is the Federal Reserve Note, authorized by the Federal Reserve Act of 1913. Since the discontinuation of all other types of notes (Gold Certificates in 1933, Silver Certificates in 1963, and United States Notes in 1971), U.S. dollar notes have been issued exclusively as Federal Reserve Notes. The U.S.
dollar first emerged as an important international reserve currency in the 1920s, displacing the British pound sterling, as it emerged from the First World War relatively unscathed and the United States was a significant recipient of wartime gold inflows. After the United States emerged as an even stronger global superpower during the Second World War, the Bretton Woods Agreement of 1944 established the U.S. dollar as the world's primary reserve currency and the only post-war currency linked to gold. Despite all links to gold being severed in 1971, the dollar continues to be the world's foremost reserve currency for international trade to this day. The Bretton Woods Agreement of 1944 also defined the post-World War II monetary order and relations among modern-day independent states, by setting up a system of rules, institutions, and procedures to regulate the international monetary system. The agreement founded the International Monetary Fund and other institutions of the modern-day World Bank Group, establishing the infrastructure for conducting international payments and accessing the global capital markets using the U.S. dollar. The monetary policy of the United States is conducted by the Federal Reserve System, which acts as the nation's central bank. It was founded in 1913 under the Federal Reserve Act in order to furnish an elastic currency for the United States and to supervise its banking system, particularly in the aftermath of the Panic of 1907. For most of the post-war period, the U.S. government has financed its own spending by borrowing heavily from the dollar-lubricated global capital markets, in debts denominated in its own currency and at minimal interest rates. This ability to borrow heavily without facing a significant balance of payments crisis has been described as the United States's exorbitant privilege.

Coins

The United States Mint has issued legal tender coins every year from 1792 to the present.
From 1934 to the present, the only denominations produced for circulation have been the familiar penny, nickel, dime, quarter, half dollar, and dollar. Gold and silver coins were previously minted for general circulation from the 18th to the 20th centuries. The last gold coins were minted in 1933. The last 90% silver coins were minted in 1964, and the last 40% silver half dollar was minted in 1970. The United States Mint currently produces circulating coins at the Philadelphia and Denver Mints, and commemorative and proof coins for collectors at the San Francisco and West Point Mints. Mint mark conventions for these and for past mint branches are discussed in Coins of the United States dollar#Mint marks. The one-dollar coin has never circulated widely from 1794 to the present, despite several attempts since the 1970s to increase its usage; the most important reason is the continued production and popularity of the one-dollar bill. It has not been produced for circulation since 2012. Half dollar coins were commonly used currency from their inception in 1794, but fell out of use from the mid-1960s, when all silver half dollars began to be hoarded. The nickel is the only coin whose size and composition (5 grams, 75% copper, and 25% nickel) has remained in use from 1865 to today, except for the wartime 1942–1945 Jefferson nickels, which contained silver. Due to the penny's low value, debate exists over its status as circulating coinage. In 2025, the Mint halted the production of pennies for circulation, but pennies remain in circulation, as only an act of Congress can eliminate a denomination. For a discussion of other discontinued and canceled denominations, see Obsolete denominations of United States currency and Canceled denominations of United States currency. Collector coins are technically legal tender at face value but are usually worth far more due to their numismatic value or their precious metal content.
Banknotes

The U.S. Constitution provides that Congress shall have the power to "borrow money on the credit of the United States." Congress has exercised that power by authorizing Federal Reserve Banks to issue Federal Reserve Notes. Those notes are "obligations of the United States" and "shall be redeemed in lawful money on demand at the Treasury Department of the United States, in the city of Washington, District of Columbia, or at any Federal Reserve bank". Federal Reserve Notes are designated by law as "legal tender" for the payment of debts. Congress has also authorized the issuance of more than 10 other types of banknotes, including the United States Note and the Federal Reserve Bank Note. The Federal Reserve Note is the only type that has remained in circulation since the 1970s. Federal Reserve Notes are printed by the Bureau of Engraving and Printing and are made from cotton fiber paper (as opposed to the wood fiber used to make common paper). The "large-sized notes" issued before 1928 measured 7.42 in × 3.125 in (188.5 mm × 79.4 mm), while the small-sized notes introduced that year measure 6.14 in × 2.61 in × 0.0043 in (155.96 mm × 66.29 mm × 0.11 mm). The dimensions of modern (small-size) U.S. currency are identical to those of the Philippine peso banknotes issued under United States administration after 1903, which had proven highly successful. The American large-note bills became known as "horse blankets" or "saddle blankets". Currently printed denominations are $1, $2, $5, $10, $20, $50, and $100. Notes above the $100 denomination stopped being printed in 1946 and were officially withdrawn from circulation in 1969. These notes were used primarily in inter-bank transactions or by organized crime; it was the latter usage that prompted President Richard Nixon to issue an executive order in 1969 halting their use. With the advent of electronic banking, they became less necessary.
Notes in denominations of $500, $1,000, $5,000, $10,000, and $100,000 were all produced at one time; see large denomination bills in U.S. currency for details. Though discontinued, the $500 through $10,000 notes remain legal tender. With the exception of the $100,000 bill (which was only issued as a Series 1934 Gold Certificate, was never publicly circulated, and is thus illegal to own), these notes are now collector's items and are worth more than their face value to collectors. Though still predominantly green, the post-2004 series incorporate other colors to better distinguish different denominations. As a result of a 2008 decision in an accessibility lawsuit filed by the American Council of the Blind, the Bureau of Engraving and Printing is planning to implement a raised tactile feature in the next redesign of each note, except the $1 and the current version of the $100 bill. It also plans larger, higher-contrast numerals, more color differences, and distribution of currency readers to assist the visually impaired during the transition period.

Countries that use US dollar

Several countries and territories use the US dollar as their official currency, and several others widely accept it unofficially as a secondary currency.

Monetary policy

The Federal Reserve Act created the Federal Reserve System in 1913 as the central bank of the United States. Its primary task is to conduct the nation's monetary policy to promote maximum employment, stable prices, and moderate long-term interest rates in the U.S. economy. It is also tasked with promoting the stability of the financial system, regulating financial institutions, and acting as lender of last resort. The monetary policy of the United States is conducted by the Federal Open Market Committee, which is composed of the Federal Reserve Board of Governors and 5 of the 12 Federal Reserve Bank presidents, and is implemented by all twelve regional Federal Reserve Banks.
Monetary policy refers to actions taken by central banks that determine the size and growth rate of the money supply available in the economy, with the aim of achieving objectives such as low inflation, low unemployment, and a stable financial system. The economy's aggregate money supply is the total of currency in circulation plus deposits held within the banking system. The FOMC influences the level of money available to the economy chiefly through open market operations, along with the discount rate and reserve requirements. Monetary policy directly affects interest rates; it indirectly affects stock prices, wealth, and currency exchange rates. Through these channels, monetary policy influences spending, investment, production, employment, and inflation in the United States. Effective monetary policy complements fiscal policy to support economic growth. The adjusted monetary base has increased from approximately $400 billion in 1994, to $800 billion in 2005, and to over $3 trillion in 2013. When the Federal Reserve makes a purchase, it credits the seller's reserve account (with the Federal Reserve). This money is not transferred from any existing funds; it is at this point that the Federal Reserve has created new high-powered money. Commercial banks then decide how much money to keep in deposit with the Federal Reserve and how much to hold as physical currency. In the latter case, the Federal Reserve places an order for printed money from the U.S. Treasury Department. The Treasury Department, in turn, sends these requests to the Bureau of Engraving and Printing (to print new dollar bills) and the Bureau of the Mint (to stamp the coins). The Federal Reserve's monetary policy objectives of keeping prices stable and unemployment low are often called the dual mandate.
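The monetary base figures quoted above imply very different expansion rates before and after the 2008 financial crisis. A rough compound-growth sketch (the year endpoints and dollar figures are the approximate values quoted in the text; the calculation itself is only an illustration):

```python
# Sketch: implied compound annual growth of the adjusted monetary base,
# using the approximate figures quoted above ($400B in 1994, $800B in
# 2005, ~$3T in 2013). Values in billions of dollars.
def cagr(start, end, years):
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

pre_crisis = cagr(400, 800, 2005 - 1994)     # ~6.5% per year
post_crisis = cagr(800, 3000, 2013 - 2005)   # ~18% per year

assert round(pre_crisis, 3) == 0.065
assert round(post_crisis, 2) == 0.18
```

The roughly threefold jump in the growth rate after 2005 reflects the Federal Reserve's large-scale asset purchases following the crisis.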
This replaces past practices under a gold standard, where the main concern was the gold equivalent of the local currency, or under a gold exchange standard, where the concern was fixing the exchange rate versus another gold-convertible currency (previously practiced worldwide under the Bretton Woods Agreement of 1944 via fixed exchange rates to the U.S. dollar).

International use as reserve currency

The primary currency used for global trade between Europe, Asia, and the Americas had historically been the Spanish-American silver dollar, which created a global silver standard system from the 16th to 19th centuries due to abundant silver supplies in Spanish America. The U.S. dollar itself was derived from this coin. The Spanish dollar was later displaced by the British pound sterling with the advent of the international gold standard in the last quarter of the 19th century. The U.S. dollar began to displace the pound sterling as international reserve currency from the 1920s, since it emerged from the First World War relatively unscathed and the United States was a significant recipient of wartime gold inflows. After the U.S. emerged as an even stronger global superpower during the Second World War, the Bretton Woods Agreement of 1944 established the post-war international monetary system, with the U.S. dollar ascending to become the world's primary reserve currency for international trade, and the only post-war currency linked to gold, at $35 per troy ounce. The U.S. dollar is joined by the world's other major currencies – the euro, pound sterling, Japanese yen and Chinese renminbi – in the currency basket of the special drawing rights of the International Monetary Fund. Central banks worldwide hold huge reserves of U.S. dollars and are significant buyers of U.S. treasury bills and notes. Foreign companies, entities, and private individuals hold U.S.
dollars in foreign deposit accounts called eurodollars (not to be confused with the euro), which are outside the jurisdiction of the Federal Reserve System. Private individuals also hold dollars outside the banking system, mostly in the form of US$100 bills, some 80% of which are held overseas. The United States Department of the Treasury exercises considerable oversight over the SWIFT financial transfers network, and consequently has huge sway over the global financial transactions systems, with the ability to impose sanctions on foreign entities and individuals. The U.S. dollar is predominantly the standard currency unit in which goods are quoted and traded, and with which payments are settled, in the global commodity markets. The U.S. Dollar Index is an important indicator of the dollar's strength or weakness versus a basket of six foreign currencies. The United States Government is capable of borrowing trillions of dollars from the global capital markets in U.S. dollars issued by the Federal Reserve, which is itself under U.S. government purview, at minimal interest rates and with virtually zero default risk. In contrast, foreign governments and corporations incapable of raising money in their own local currencies are forced to issue debt denominated in U.S. dollars, with consequent higher interest rates and risks of default. The United States's ability to borrow in its own currency without facing a significant balance of payments crisis has been frequently described as its exorbitant privilege. A frequent topic of debate is whether the strong dollar policy of the United States is indeed in America's own best interests, as well as in the best interest of the international community. For a more exhaustive discussion of countries using the U.S. dollar as official or customary currency, or using currencies which are pegged to the U.S. dollar, see International use of the U.S.
dollar#Dollarization and fixed exchange rates and Currency substitution#US dollar. Among the countries using the U.S. dollar together with other foreign currencies and their local currency are Cambodia and Zimbabwe.

Value

The 6th paragraph of Section 8 of Article 1 of the U.S. Constitution provides that the U.S. Congress shall have the power to "coin money" and to "regulate the value" of domestic and foreign coins. Congress exercised those powers when it enacted the Coinage Act of 1792. That Act provided for the minting of the first U.S. dollar, and it declared that the U.S. dollar shall have "the value of a Spanish milled dollar as the same is now current". Measured by the equivalent amount of goods that could be purchased with $1 in a given year, the U.S. dollar lost about 97.0% of its buying power from 1774 through 2012. The decline in the value of the U.S. dollar corresponds to price inflation, which is a rise in the general level of prices of goods and services in an economy over a period of time. A consumer price index (CPI) is a measure estimating the average price of consumer goods and services purchased by households. The United States Consumer Price Index, published by the Bureau of Labor Statistics, is a measure estimating the average price of consumer goods and services in the United States. It reflects inflation as experienced by consumers in their day-to-day living expenses. The value of the U.S. dollar declined significantly during wartime, especially during the American Civil War, World War I, and World War II.
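The roughly 97% loss in buying power from 1774 through 2012 sounds dramatic, but compounded over 238 years it corresponds to a modest average annual inflation rate. A back-of-the-envelope sketch (an illustration derived from the quoted figure, not an official statistic):

```python
# Sketch: average annual inflation implied by a ~97% loss of buying
# power between 1774 and 2012, as quoted above.
remaining_power = 0.03                  # dollar retains ~3% of its 1774 value
years = 2012 - 1774                     # 238 years

# The price level rose by a factor of 1/0.03 (about 33x); annualize it:
implied_inflation = (1 / remaining_power) ** (1 / years) - 1

assert round(implied_inflation, 3) == 0.015   # ~1.5% per year on average
```

Most of that cumulative loss is concentrated in the post-1914 era; under the earlier metallic standards the long-run price level was roughly flat, as the following paragraph notes.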
The Federal Reserve, established in 1913, was designed to furnish an "elastic" currency subject to "substantial changes of quantity over short periods", which differed significantly from previous forms of high-powered money such as gold, national banknotes, and silver coins. Over the very long run, the prior gold standard kept prices stable: the price level in 1914, for instance, was not very different from the price level of the 1880s. The Federal Reserve initially succeeded in maintaining the value of the U.S. dollar and price stability, reversing the inflation caused by the First World War and stabilizing the value of the dollar during the 1920s, before presiding over a 30% deflation in U.S. prices in the 1930s.

Under the Bretton Woods system established after World War II, the price of gold was fixed at $35 per ounce, and the value of the U.S. dollar was thus anchored to the value of gold. Rising government spending in the 1960s, however, led to doubts about the ability of the United States to maintain this convertibility; gold stocks dwindled as banks and international investors began to convert dollars to gold, and the value of the dollar began to decline. Facing an emerging currency crisis and the imminent danger that the United States would no longer be able to redeem dollars for gold, convertibility was finally terminated in 1971 by President Nixon, in what became known as the "Nixon shock". The value of the U.S. dollar was therefore no longer anchored to gold, and it fell upon the Federal Reserve to maintain the value of the U.S. currency. The Federal Reserve, however, continued to increase the money supply, resulting in stagflation and a rapidly declining value of the U.S. dollar in the 1970s. This was largely due to the prevailing economic view at the time that inflation and real economic growth were linked (the Phillips curve), so inflation was regarded as relatively benign.
Between 1965 and 1981, the U.S. dollar lost two thirds of its value. In 1979, President Carter appointed Paul Volcker Chairman of the Federal Reserve. The Federal Reserve tightened the money supply, inflation was substantially lower in the 1980s, and the value of the U.S. dollar stabilized. Even so, between 1981 and 2009 the U.S. dollar lost over half its value. This is because the Federal Reserve has targeted not zero inflation but a low, stable rate of inflation: between 1987 and 1997 the rate of inflation was approximately 3.5%, and between 1997 and 2007 it was approximately 2%. The so-called "Great Moderation" of economic conditions since the 1980s is credited to monetary policy targeting price stability.

There is an ongoing debate about whether central banks should target zero inflation (which would mean a constant value for the U.S. dollar over time) or low, stable inflation (which would mean a continuously but slowly declining value of the dollar, as is the case now). Although some economists favor a zero-inflation policy and therefore a constant value for the U.S. dollar, others contend that such a policy limits the ability of the central bank to control interest rates and stimulate the economy when needed.
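The cumulative losses quoted above imply average annual inflation rates that can be checked by compounding. The sketch below solves (1 + r)^years = price multiple for r, using the round figures from the text rather than official statistics.

```python
# Sketch: back out the average annual inflation rate implied by a
# cumulative loss of purchasing power, solving (1 + r)^years = 1 / power_remaining.
# Inputs are the article's round figures, not official CPI data.

def implied_annual_inflation(power_remaining: float, years: int) -> float:
    """Average annual inflation consistent with a dollar retaining
    `power_remaining` of its purchasing power after `years` years."""
    # If a dollar keeps a fraction of its value, prices rose by the reciprocal.
    return (1 / power_remaining) ** (1 / years) - 1

# 1965-1981: the dollar lost two thirds of its value over 16 years.
r1 = implied_annual_inflation(1 / 3, 1981 - 1965)
# 1981-2009: the dollar lost roughly half its value over 28 years.
r2 = implied_annual_inflation(1 / 2, 2009 - 1981)

print(f"1965-1981 implied inflation: {r1:.1%}")  # roughly 7% per year
print(f"1981-2009 implied inflation: {r2:.1%}")  # roughly 2.5% per year
```

The contrast between the two rates makes the Volcker-era shift concrete: the same "dollar lost half its value" headline corresponds to a far milder annual rate when spread over 28 years instead of 16.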