[SOURCE: https://en.wikipedia.org/wiki/Fan_club] | [TOKENS: 560] |
Fan club A fan club is an organized group of fans, generally of a celebrity. Most fan clubs are run by fans who devote considerable time and resources to support them. There are also "official" fan clubs that are run by someone associated with the person or organization the club is centered on. This is the case for many musicians, sports teams, etc. People in a fan club usually have either a T-shirt or a pin to indicate which fan club they are a part of. All fan clubs have unique paraphernalia that is given or sold to fans as an indication of membership. Barbz, who support Nicki Minaj, Hollanders, who support Tom Holland, Carats, who support Seventeen, and Swifties, who support Taylor Swift, are examples of fan clubs. Etymology The origin of the term fan in reference to a dedicated zealot is unclear. The word may have emerged in the 1800s, when boxing supporters were said to take a "fancy" to pugilistic sports. Among modern sports fans, however, the title is considered a shortened version of the word fanatic, as in "boxing fanatic", an indication of the dedication of fan club members. Functions Larger fan clubs may organize events and fundraising relating to what they are based on. In some cases, the money that is raised goes directly to fan club members or to fund the club itself. There are two main kinds of fan clubs: those that do not require an official registration process and those that do. Fan clubs that require formal registration usually charge a membership fee. Different fan clubs have different systems, but most clubs charge an annual membership fee. These fees are used to run the club. If a fan club is for a certain fashion brand, it may use those fees for advertising. The term groupie is slang commonly used in reference to fans of a particular musician, band, or celebrity who follow the group or individual while they are touring, or who attend as many of their public appearances as possible. The word is often used to describe female fans seeking sexual relationships with musicians. They often value the musicians themselves over their music. Groupies are personally involved with the band or celebrity they follow; fans, being more reserved, usually are not. Most fan clubs are online, and fans who are part of these clubs do not usually have personal connections with the people they admire. Internet Today, many fan clubs have websites to support their efforts. Technology allows individuals in fan clubs to communicate across the world. These sites usually have photos and information on the object of their affection. For example, a fan site dedicated to musicians might have photos, videos, discussion boards, and information on upcoming concerts. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Cape_Mesurado] | [TOKENS: 757] |
Cape Mesurado Cape Mesurado, also called Cape Montserrado, is a headland on the coast of Liberia near the capital Monrovia and the mouth of the Saint Paul River. It was named Cape Mesurado by Portuguese sailors in the 1560s. It is the promontory on which African-American settlers established the city now called Monrovia on 25 April 1822. There is a lighthouse on Cape Mesurado, located in the Mamba Point neighborhood of Monrovia and in the cape's northwestern portion, that was established in 1855. It is currently inactive, although the Liberian government is seeking financial assistance to restore and reactivate the lighthouse. History Because Cape Mesurado was being used as a base for the illegal slave trade, in 1815 Governor William Maxwell of Sierra Leone sent an armed force there to interfere with it, seizing ships and merchandise and rescuing enslaved Africans who were working in the factories there. For their crimes, the factory owners, Robert Bostock and John McQueen, were sentenced to fourteen years' transportation to New South Wales by the Vice admiralty court. Interference with the illegal slave trade was furthered the following year, when HMS Queen Charlotte of the British West Africa Squadron seized Le Louis, which was suspected of being engaged in the slave trade. Settlers had previously landed at Sherbro Island in Sierra Leone, but they were experiencing a high death rate there due to the island's swampy, unhealthy conditions, so, in 1821, the American Colonization Society dispatched a representative, Dr. Eli Ayers, to purchase land farther south down the coast from Sierra Leone that would provide better living conditions. With the aid of Robert F. Stockton, a U.S. naval officer, Ayers sought out land to establish a new colony. Stockton led negotiations with leaders of the Dei and Bassa peoples who lived in the area of Cape Mesurado. At first, the local ruler, Zolu Duma (King Peter), was reluctant to surrender his people's land to the strangers, but he was forcefully persuaded (some accounts claim at gunpoint) to sell them a "36 mile long and 3 mile wide" strip of coastal land, in exchange for trade goods, supplies, weapons, and rum worth approximately $300 (a considerable sum at the time). The Cape Mesurado colony faced many of the same challenges that had faced the previous colony at Sherbro Island: scarce supplies and swampy, unhealthy conditions. The new colony’s Americo-Liberian residents (who had been slaves or the children of former slaves in the United States before their emigration to Africa) were also sporadically attacked by local tribes who resented the newcomers’ efforts to put an end to the slave trade. Led by Lott Carey and Elijah Johnson, the Americo-Liberians organized a defense against local attacks, rejecting an offer of British military assistance that would have required them to hoist the Union Jack on Cape Mesurado. During the Battle of Fort Hill on 1 December 1822, a colonist named Matilda Newport is supposed to have repelled an attack by lighting a cannon with an ember from her pipe. To commemorate her action, a local holiday, called “Matilda Newport Day,” was established in 1916. (It was abolished in 1980 by a government that came to power in a military coup.) The cape is located at 6°18′48″N 10°48′28″W. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Computer#cite_ref-babbageonline_25-0] | [TOKENS: 10628] |
Computer A computer is a machine that can be programmed to automatically carry out sequences of arithmetic or logical operations (computation). Modern digital electronic computers can perform generic sets of operations known as programs, which enable computers to perform a wide range of tasks. The term computer system may refer to a nominally complete computer that includes the hardware, operating system, software, and peripheral equipment needed and used for full operation, or to a group of computers that are linked and function together, such as a computer network or computer cluster. A broad range of industrial and consumer products use computers as control systems, including simple special-purpose devices like microwave ovens and remote controls, and factory devices like industrial robots. Computers are at the core of general-purpose devices such as personal computers and mobile devices such as smartphones. Computers power the Internet, which links billions of computers and users. Early computers were meant to be used only for calculations. Simple manual instruments like the abacus have aided people in doing calculations since ancient times. Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century. The first digital electronic calculating machines were developed during World War II, some electromechanical and some using thermionic valves. The first semiconductor transistors in the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and monolithic integrated circuit chip technologies in the late 1950s, leading to the microprocessor and the microcomputer revolution in the 1970s. The speed, power, and versatility of computers have been increasing dramatically ever since then, with transistor counts increasing at a rapid pace (Moore's law noted that counts doubled every two years), leading to the Digital Revolution during the late 20th and early 21st centuries. Conventionally, a modern computer consists of at least one processing element, typically a central processing unit (CPU) in the form of a microprocessor, together with some type of computer memory, typically semiconductor memory chips. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joysticks, etc.), output devices (monitors, printers, etc.), and input/output devices that perform both functions (e.g. touchscreens). Peripheral devices allow information to be retrieved from an external source, and they enable the results of operations to be saved and retrieved. Etymology It was not until the mid-20th century that the word acquired its modern definition; according to the Oxford English Dictionary, the first known use of the word computer was in a different sense, in a 1613 book called The Yong Mans Gleanings by the English writer Richard Brathwait: "I haue [sic] read the truest computer of Times, and the best Arithmetician that euer [sic] breathed, and he reduceth thy dayes into a short number." This usage of the term referred to a human computer, a person who carried out calculations or computations. The word continued to have the same meaning until the middle of the 20th century. 
During the latter part of this period, women were often hired as computers because they could be paid less than their male counterparts. By 1943, most human computers were women. The Online Etymology Dictionary gives the first attested use of computer in the 1640s, meaning 'one who calculates'; this is an "agent noun from compute (v.)". The Online Etymology Dictionary states that the use of the term to mean "'calculating machine' (of any type) is from 1897." The Online Etymology Dictionary indicates that the "modern use" of the term, to mean 'programmable digital electronic computer' dates from "1945 under this name; [in a] theoretical [sense] from 1937, as Turing machine". The name has remained, although modern computers are capable of many higher-level functions. History Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was most likely a form of tally stick. Later record keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented counts of items, likely livestock or grains, sealed in hollow unbaked clay containers.[a] The use of counting rods is one example. The abacus was initially used for arithmetic tasks. The Roman abacus was developed from devices used in Babylonia as early as 2400 BCE. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money. The Antikythera mechanism is believed to be the earliest known mechanical analog computer, according to Derek J. de Solla Price. It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to c. 100 BCE. Devices of comparable complexity to the Antikythera mechanism would not reappear until the fourteenth century. Many mechanical aids to calculation and measurement were constructed for astronomical and navigation use. The planisphere was a star chart invented by Abū Rayhān al-Bīrūnī in the early 11th century. The astrolabe was invented in the Hellenistic world in either the 1st or 2nd century BCE and is often attributed to Hipparchus. A combination of the planisphere and dioptra, the astrolabe was effectively an analog computer capable of working out several different kinds of problems in spherical astronomy. An astrolabe incorporating a mechanical calendar computer and gear-wheels was invented by Abi Bakr of Isfahan, Persia in 1235. Abū Rayhān al-Bīrūnī invented the first mechanical geared lunisolar calendar astrolabe, an early fixed-wired knowledge processing machine with a gear train and gear-wheels, c. 1000 AD. The sector, a calculating instrument used for solving problems in proportion, trigonometry, multiplication and division, and for various functions, such as squares and cube roots, was developed in the late 16th century and found application in gunnery, surveying and navigation. The planimeter was a manual instrument to calculate the area of a closed figure by tracing over it with a mechanical linkage. The slide rule was invented around 1620–1630 by the English clergyman William Oughtred, shortly after the publication of the concept of the logarithm. It is a hand-operated analog computer for doing multiplication and division. 
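The principle at work in the slide rule is that placing two logarithmic scales end to end adds lengths, and adding logarithms multiplies the underlying numbers. A minimal sketch of that identity, written here in Python for illustration (not part of the original article):

    import math

    # A slide rule multiplies by adding logarithms: log(a*b) = log(a) + log(b).
    # Sliding one log-scaled ruler along another adds the two lengths, and the
    # product is read back off the logarithmic scale.
    def slide_rule_multiply(a: float, b: float) -> float:
        offset = math.log10(a) + math.log10(b)  # combined physical offset
        return 10 ** offset                     # read the result off the scale

    print(slide_rule_multiply(2.0, 8.0))  # 16.0, up to reading precision
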
As slide rule development progressed, added scales provided reciprocals, squares and square roots, cubes and cube roots, as well as transcendental functions such as logarithms and exponentials, circular and hyperbolic trigonometry and other functions. Slide rules with special scales are still used for quick performance of routine calculations, such as the E6B circular slide rule used for time and distance calculations on light aircraft. In the 1770s, Pierre Jaquet-Droz, a Swiss watchmaker, built a mechanical doll (automaton) that could write holding a quill pen. By switching the number and order of its internal wheels, different letters, and hence different messages, could be produced. In effect, it could be mechanically "programmed" to read instructions. Along with two other complex machines, the doll is at the Musée d'Art et d'Histoire of Neuchâtel, Switzerland, and still operates. In 1831–1835, mathematician and engineer Giovanni Plana devised a Perpetual Calendar machine, which through a system of pulleys and cylinders could predict the perpetual calendar for every year from 0 CE (that is, 1 BCE) to 4000 CE, keeping track of leap years and varying day length. The tide-predicting machine invented by the Scottish scientist Sir William Thomson in 1872 was of great utility to navigation in shallow waters. It used a system of pulleys and wires to automatically calculate predicted tide levels for a set period at a particular location. The differential analyser, a mechanical analog computer designed to solve differential equations by integration, used wheel-and-disc mechanisms to perform the integration. In 1876, Sir William Thomson had already discussed the possible construction of such calculators, but he had been stymied by the limited output torque of the ball-and-disk integrators. In a differential analyzer, the output of one integrator drove the input of the next integrator, or a graphing output. The torque amplifier was the advance that allowed these machines to work. Starting in the 1920s, Vannevar Bush and others developed mechanical differential analyzers. In the 1890s, the Spanish engineer Leonardo Torres Quevedo began to develop a series of advanced analog machines that could solve real and complex roots of polynomials; his designs were published in 1901 by the Paris Academy of Sciences. Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer. Considered the "father of the computer", he conceptualized and invented the first mechanical computer in the early 19th century. After working on his difference engine he announced his invention in 1822, in a paper to the Royal Astronomical Society, titled "Note on the application of machinery to the computation of astronomical and mathematical tables". The difference engine was also designed to aid in navigational calculations; in 1833 he realized that a much more general design, an analytical engine, was possible. The input of programs and data was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. The engine would incorporate an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete. 
The machine was about a century ahead of its time. All the parts for his machine had to be made by hand – this was a major problem for a device with thousands of parts. Eventually, the project was dissolved with the decision of the British Government to cease funding. Babbage's failure to complete the analytical engine can be chiefly attributed to political and financial difficulties as well as his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow. Nevertheless, his son, Henry Babbage, completed a simplified version of the analytical engine's computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in 1906. In his work Essays on Automatics, published in 1914, Leonardo Torres Quevedo wrote a brief history of Babbage's efforts at constructing a mechanical Difference Engine and Analytical Engine. The paper contains a design of a machine capable of calculating formulas like a^x(y − z)^2, for a sequence of sets of values. The whole machine was to be controlled by a read-only program, which was complete with provisions for conditional branching. He also introduced the idea of floating-point arithmetic. In 1920, to celebrate the 100th anniversary of the invention of the arithmometer, Torres presented in Paris the Electromechanical Arithmometer, which allowed a user to input arithmetic problems through a keyboard, and computed and printed the results, demonstrating the feasibility of an electromechanical analytical engine. During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers. The first modern analog computer was a tide-predicting machine, invented by Sir William Thomson (later to become Lord Kelvin) in 1872. The differential analyser, a mechanical analog computer designed to solve differential equations by integration using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson, the elder brother of the more famous Sir William Thomson. The art of mechanical analog computing reached its zenith with the differential analyzer, completed in 1931 by Vannevar Bush at MIT. By the 1950s, the success of digital electronic computers had spelled the end for most analog computing machines, but analog computers remained in use during the 1950s in some specialized applications such as education (slide rule) and aircraft (control systems). Claude Shannon's 1937 master's thesis laid the foundations of digital computing, with his insight of applying Boolean algebra to the analysis and synthesis of switching circuits being the basic concept which underlies all electronic digital computers. By 1938, the United States Navy had developed the Torpedo Data Computer, an electromechanical analog computer for submarines that used trigonometry to solve the problem of firing a torpedo at a moving target. During World War II, similar devices were developed in other countries. Early digital computers were electromechanical; electric switches drove mechanical relays to perform the calculation. These devices had a low operating speed and were eventually superseded by much faster all-electric computers, originally using vacuum tubes. 
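Shannon's insight, mentioned above, can be made concrete: switching contacts wired in series conduct only when both are closed (Boolean AND), while contacts wired in parallel conduct when either is closed (Boolean OR). A small illustrative sketch in Python (the function names are mine, not from the article):

    from itertools import product

    # Relay contacts in series conduct only if both are closed: AND.
    def series(a: bool, b: bool) -> bool:
        return a and b

    # Relay contacts in parallel conduct if either is closed: OR.
    def parallel(a: bool, b: bool) -> bool:
        return a or b

    # Tabulate both circuits over all switch settings (a truth table).
    for a, b in product([False, True], repeat=2):
        print(a, b, series(a, b), parallel(a, b))
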
The Z2, created by German engineer Konrad Zuse in 1939 in Berlin, was one of the earliest examples of an electromechanical relay computer. In 1941, Zuse followed up his earlier machine with the Z3, the world's first working electromechanical programmable, fully automatic digital computer. The Z3 was built with 2000 relays, implementing a 22-bit word length that operated at a clock frequency of about 5–10 Hz. Program code was supplied on punched film while data could be stored in 64 words of memory or supplied from the keyboard. It was quite similar to modern machines in some respects, pioneering numerous advances such as floating-point numbers. Because it used a binary system rather than the harder-to-implement decimal system (used in Charles Babbage's earlier design), Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time. The Z3 was not itself a universal computer but could be extended to be Turing complete. Zuse's next computer, the Z4, became the world's first commercial computer; after initial delay due to the Second World War, it was completed in 1950 and delivered to the ETH Zurich. The computer was manufactured by Zuse's own company, Zuse KG, which was founded in Berlin in 1941 as the first company with the sole purpose of developing computers. The Z4 served as the inspiration for the construction of the ERMETH, the first Swiss computer and one of the first in Europe. Purely electronic circuit elements soon replaced their mechanical and electromechanical equivalents, at the same time that digital calculation replaced analog. The engineer Tommy Flowers, working at the Post Office Research Station in London in the 1930s, began to explore the possible use of electronics for the telephone exchange. Experimental equipment that he built in 1934 went into operation five years later, converting a portion of the telephone exchange network into an electronic data processing system, using thousands of vacuum tubes. In the US, John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed and tested the Atanasoff–Berry Computer (ABC) in 1942, the first "automatic electronic digital computer". This design was also all-electronic and used about 300 vacuum tubes, with capacitors fixed in a mechanically rotating drum for memory. During World War II, the British code-breakers at Bletchley Park achieved a number of successes at breaking encrypted German military communications. The German encryption machine, Enigma, was first attacked with the help of the electro-mechanical bombes, which were often run by women. To crack the more sophisticated German Lorenz SZ 40/42 machine, used for high-level Army communications, Max Newman and his colleagues commissioned Flowers to build the Colossus. He spent eleven months from early February 1943 designing and building the first Colossus. After a functional test in December 1943, Colossus was shipped to Bletchley Park, where it was delivered on 18 January 1944 and attacked its first message on 5 February. Colossus was the world's first electronic digital programmable computer. It used a large number of valves (vacuum tubes). It had paper-tape input and was capable of being configured to perform a variety of boolean logical operations on its data, but it was not Turing-complete. Nine Mk II Colossi were built (the Mk I was converted to a Mk II, making ten machines in total). 
Colossus Mark I contained 1,500 thermionic valves (tubes), but Mark II, with 2,400 valves, was both five times faster and simpler to operate than Mark I, greatly speeding the decoding process. The ENIAC (Electronic Numerical Integrator and Computer) was the first electronic programmable computer built in the U.S. Although the ENIAC was similar to the Colossus, it was much faster, more flexible, and it was Turing-complete. Like the Colossus, a "program" on the ENIAC was defined by the states of its patch cables and switches, a far cry from the stored program electronic machines that came later. Once a program was written, it had to be mechanically set into the machine with manual resetting of plugs and switches. The programmers of the ENIAC were six women, often known collectively as the "ENIAC girls". It combined the high speed of electronics with the ability to be programmed for many complex problems. It could add or subtract 5000 times a second, a thousand times faster than any other machine. It also had modules to multiply, divide, and square root. High speed memory was limited to 20 words (about 80 bytes). Built under the direction of John Mauchly and J. Presper Eckert at the University of Pennsylvania, ENIAC's development and construction lasted from 1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons, using 200 kilowatts of electric power and contained over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors. The principle of the modern computer was proposed by Alan Turing in his seminal 1936 paper, On Computable Numbers. Turing proposed a simple device that he called "Universal Computing machine" and that is now known as a universal Turing machine. He proved that such a machine is capable of computing anything that is computable by executing instructions (program) stored on tape, allowing the machine to be programmable. The fundamental concept of Turing's design is the stored program, where all the instructions for computing are stored in memory. Von Neumann acknowledged that the central concept of the modern computer was due to this paper. Turing machines are to this day a central object of study in theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine. Early computing machines had fixed programs. Changing a machine's function required re-wiring and re-structuring it. With the proposal of the stored-program computer, this changed. A stored-program computer includes by design an instruction set and can store in memory a set of instructions (a program) that details the computation. The theoretical basis for the stored-program computer was laid out by Alan Turing in his 1936 paper. In 1945, Turing joined the National Physical Laboratory and began work on developing an electronic stored-program digital computer. His 1945 report "Proposed Electronic Calculator" was the first specification for such a device. John von Neumann at the University of Pennsylvania also circulated his First Draft of a Report on the EDVAC in 1945. The Manchester Baby was the world's first stored-program computer. It was built at the University of Manchester in England by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948. 
It was designed as a testbed for the Williams tube, the first random-access digital storage device. Although the computer was described as "small and primitive" by a 1998 retrospective, it was the first working machine to contain all of the elements essential to a modern electronic computer. As soon as the Baby had demonstrated the feasibility of its design, a project began at the university to develop it into a practically useful computer, the Manchester Mark 1. The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the world's first commercially available general-purpose computer. Built by Ferranti, it was delivered to the University of Manchester in February 1951. At least seven of these later machines were delivered between 1953 and 1957, one of them to Shell labs in Amsterdam. In October 1947 the directors of British catering company J. Lyons & Company decided to take an active role in promoting the commercial development of computers. Lyons's LEO I computer, modelled closely on the Cambridge EDSAC of 1949, became operational in April 1951 and ran the world's first routine office computer job. The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John Bardeen and Walter Brattain, while working under William Shockley at Bell Labs, built the first working transistor, the point-contact transistor, in 1947, which was followed by Shockley's bipolar junction transistor in 1948. From 1955 onwards, transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Compared to vacuum tubes, transistors have many advantages: they are smaller and require less power than vacuum tubes, so they give off less heat. Junction transistors were much more reliable than vacuum tubes and had a longer, effectively indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialized applications. At the University of Manchester, a team under the leadership of Tom Kilburn designed and built a machine using the newly developed transistors instead of valves. Their first transistorized computer, and the first in the world, was operational by 1953, and a second version was completed there in April 1955. However, the machine did make use of valves to generate its 125 kHz clock waveforms and in the circuitry to read and write on its magnetic drum memory, so it was not the first completely transistorized computer. That distinction goes to the Harwell CADET of 1955, built by the electronics division of the Atomic Energy Research Establishment at Harwell. The metal–oxide–silicon field-effect transistor (MOSFET), also known as the MOS transistor, was invented at Bell Labs between 1955 and 1960 and was the first truly compact transistor that could be miniaturized and mass-produced for a wide range of uses. With its high scalability, and much lower power consumption and higher density than bipolar junction transistors, the MOSFET made it possible to build high-density integrated circuits. In addition to data processing, it also enabled the practical use of MOS transistors as memory cell storage elements, leading to the development of MOS semiconductor memory, which replaced earlier magnetic-core memory in computers. 
The MOSFET led to the microcomputer revolution, and became the driving force behind the computer revolution. The MOSFET is the most widely used transistor in computers, and is the fundamental building block of digital electronics. The next great advance in computing power came with the advent of the integrated circuit (IC). The idea of the integrated circuit was first conceived by a radar scientist working for the Royal Radar Establishment of the Ministry of Defence, Geoffrey W.A. Dummer. Dummer presented the first public description of an integrated circuit at the Symposium on Progress in Quality Electronic Components in Washington, D.C., on 7 May 1952. The first working ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor. Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958. In his patent application of 6 February 1959, Kilby described his new device as "a body of semiconductor material ... wherein all the components of the electronic circuit are completely integrated". However, Kilby's invention was a hybrid integrated circuit (hybrid IC), rather than a monolithic integrated circuit (IC) chip. Kilby's IC had external wire connections, which made it difficult to mass-produce. Noyce also came up with his own idea of an integrated circuit half a year later than Kilby. Noyce's invention was the first true monolithic IC chip. His chip solved many practical problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium. Noyce's monolithic IC was fabricated using the planar process, developed by his colleague Jean Hoerni in early 1959. In turn, the planar process was based on Carl Frosch and Lincoln Derick's work on semiconductor surface passivation by silicon dioxide. Modern monolithic ICs are predominantly MOS (metal–oxide–semiconductor) integrated circuits, built from MOSFETs (MOS transistors). The earliest experimental MOS IC to be fabricated was a 16-transistor chip built by Fred Heiman and Steven Hofstein at RCA in 1962. General Microelectronics later introduced the first commercial MOS IC in 1964, developed by Robert Norman. Following the development of the self-aligned gate (silicon-gate) MOS transistor by Robert Kerwin, Donald Klein and John Sarace at Bell Labs in 1967, the first silicon-gate MOS IC with self-aligned gates was developed by Federico Faggin at Fairchild Semiconductor in 1968. The MOSFET has since become the most critical device component in modern ICs. The development of the MOS integrated circuit led to the invention of the microprocessor, and heralded an explosion in the commercial and personal use of computers. While the subject of exactly which device was the first microprocessor is contentious, partly due to lack of agreement on the exact definition of the term "microprocessor", it is largely undisputed that the first single-chip microprocessor was the Intel 4004, designed and realized by Federico Faggin with his silicon-gate MOS IC technology, along with Ted Hoff, Masatoshi Shima and Stanley Mazor at Intel.[b] In the early 1970s, MOS IC technology enabled the integration of more than 10,000 transistors on a single chip. Systems on a chip (SoCs) are complete computers on a microchip (or chip) the size of a coin. They may or may not have integrated RAM and flash memory. 
If not integrated, the RAM is usually placed directly above (known as Package on package) or below (on the opposite side of the circuit board) the SoC, and the flash memory is usually placed right next to the SoC. This is done to improve data transfer speeds, as the data signals do not have to travel long distances. Since ENIAC in 1945, computers have advanced enormously, with modern SoCs (such as the Snapdragon 865) being the size of a coin while also being hundreds of thousands of times more powerful than ENIAC, integrating billions of transistors, and consuming only a few watts of power. The first mobile computers were heavy and ran from mains power. The 50 lb (23 kg) IBM 5100 was an early example. Later portables such as the Osborne 1 and Compaq Portable were considerably lighter but still needed to be plugged in. The first laptops, such as the Grid Compass, removed this requirement by incorporating batteries – and with the continued miniaturization of computing resources and advancements in portable battery life, portable computers grew in popularity in the 2000s. The same developments allowed manufacturers to integrate computing resources into cellular mobile phones by the early 2000s. These smartphones and tablets run on a variety of operating systems and recently became the dominant computing device on the market. These are powered by systems on a chip (SoCs), which are complete computers on a microchip the size of a coin. Types Computers can be classified in a number of different ways, including by architecture and by size, form factor, and purpose. A computer does not need to be electronic, nor even have a processor, nor RAM, nor even a hard disk. While popular usage of the word "computer" is synonymous with a personal electronic computer,[c] a typical modern definition of a computer is: "A device that computes, especially a programmable [usually] electronic machine that performs high-speed mathematical or logical operations or that assembles, stores, correlates, or otherwise processes information." According to this definition, any device that processes information qualifies as a computer. Hardware The term hardware covers all of those parts of a computer that are tangible physical objects. Circuits, computer chips, graphic cards, sound cards, memory (RAM), motherboard, displays, power supplies, cables, keyboards, printers and "mice" input devices are all hardware. A general-purpose computer has four main components: the arithmetic logic unit (ALU), the control unit, the memory, and the input and output devices (collectively termed I/O). These parts are interconnected by buses, often made of groups of wires. Inside each of these parts are thousands to trillions of small electrical circuits which can be turned off or on by means of an electronic switch. Each circuit represents a bit (binary digit) of information so that when the circuit is on it represents a "1", and when off it represents a "0" (in positive logic representation). The circuits are arranged in logic gates so that one or more of the circuits may control the state of one or more of the other circuits. Input devices are the means by which the operations of a computer are controlled and by which it is provided with data. Examples include keyboards, mice, and joysticks. Output devices are the means by which a computer provides the results of its calculations in a human-accessible form. 
Examples include monitors and printers. The control unit (often called a control system or central controller) manages the computer's various components; it reads and interprets (decodes) the program instructions, transforming them into control signals that activate other parts of the computer.[e] Control systems in advanced computers may change the order of execution of some instructions to improve performance. A key component common to all CPUs is the program counter, a special memory cell (a register) that keeps track of which location in memory the next instruction is to be read from.[f] The control system's function is as follows (this is a simplified description, and some of these steps may be performed concurrently or in a different order depending on the type of CPU): it reads the instruction from the memory cell indicated by the program counter, decodes the instruction into control signals, increments the program counter, fetches whatever data the instruction requires from memory, supplies that data to the ALU or a register, writes the result back to memory, and then returns to the first step. Since the program counter is (conceptually) just another set of memory cells, it can be changed by calculations done in the ALU. Adding 100 to the program counter would cause the next instruction to be read from a place 100 locations further down the program. Instructions that modify the program counter are often known as "jumps" and allow for loops (instructions that are repeated by the computer) and often conditional instruction execution (both examples of control flow). The sequence of operations that the control unit goes through to process an instruction is in itself like a short computer program, and indeed, in some more complex CPU designs, there is another yet smaller computer called a microsequencer, which runs a microcode program that causes all of these events to happen. The control unit, ALU, and registers are collectively known as a central processing unit (CPU). Early CPUs were composed of many separate components. Since the 1970s, CPUs have typically been constructed on a single MOS integrated circuit chip called a microprocessor. The ALU is capable of performing two classes of operations: arithmetic and logic. The set of arithmetic operations that a particular ALU supports may be limited to addition and subtraction, or might include multiplication, division, trigonometric functions such as sine, cosine, etc., and square roots. Some can operate only on whole numbers (integers) while others use floating point to represent real numbers, albeit with limited precision. However, any computer that is capable of performing just the simplest operations can be programmed to break down the more complex operations into simple steps that it can perform. Therefore, any computer can be programmed to perform any arithmetic operation—although it will take more time to do so if its ALU does not directly support the operation. An ALU may also compare numbers and return Boolean truth values (true or false) depending on whether one is equal to, greater than or less than the other ("is 64 greater than 65?"). Logic operations involve Boolean logic: AND, OR, XOR, and NOT. These can be useful for creating complicated conditional statements and processing Boolean logic. Superscalar computers may contain multiple ALUs, allowing them to process several instructions simultaneously. Graphics processors and computers with SIMD and MIMD features often contain ALUs that can perform arithmetic on vectors and matrices. A computer's memory can be viewed as a list of cells into which numbers can be placed or read. Each cell has a numbered "address" and can store a single number. 
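The fetch–execute cycle just described, with memory as a list of numbered cells and the program counter selecting the next instruction, can be sketched as a toy interpreter. The three-field instruction format and the opcode names below are invented for illustration and do not correspond to any real instruction set:

    # Toy stored-program machine: instructions live in numbered cells and the
    # program counter (pc) holds the index of the next one to execute.
    def run(program, data):
        pc = 0
        while True:
            op, a, b = program[pc]
            if op == "ADD":      # data[a] += data[b]
                data[a] += data[b]
            elif op == "ADDI":   # data[a] += b, where b is a constant
                data[a] += b
            elif op == "JNZ":    # jump: if data[a] != 0, set pc to cell b
                if data[a] != 0:
                    pc = b
                    continue     # a jump simply rewrites the program counter
            elif op == "HALT":
                return data
            pc += 1              # otherwise step to the next cell

    # Add 5 + 4 + 3 + 2 + 1 into 'acc' by looping until 'n' reaches zero.
    print(run(
        [("ADD", "acc", "n"),   # cell 0
         ("ADDI", "n", -1),     # cell 1
         ("JNZ", "n", 0),       # cell 2: loop back to cell 0 while n != 0
         ("HALT", 0, 0)],       # cell 3
        {"acc": 0, "n": 5},
    ))  # prints {'acc': 15, 'n': 0}
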
The computer can be instructed to "put the number 123 into the cell numbered 1357" or to "add the number that is in cell 1357 to the number that is in cell 2468 and put the answer into cell 1595." The information stored in memory may represent practically anything. Letters, numbers, even computer instructions can be placed into memory with equal ease. Since the CPU does not differentiate between different types of information, it is the software's responsibility to give significance to what the memory sees as nothing but a series of numbers. In almost all modern computers, each memory cell is set up to store binary numbers in groups of eight bits (called a byte). Each byte is able to represent 256 different numbers (2^8 = 256); either from 0 to 255 or −128 to +127. To store larger numbers, several consecutive bytes may be used (typically, two, four or eight). When negative numbers are required, they are usually stored in two's complement notation. Other arrangements are possible, but are usually not seen outside of specialized applications or historical contexts. A computer can store any kind of information in memory if it can be represented numerically. Modern computers have billions or even trillions of bytes of memory. The CPU contains a special set of memory cells called registers that can be read and written to much more rapidly than the main memory area. There are typically between two and one hundred registers depending on the type of CPU. Registers are used for the most frequently needed data items to avoid having to access main memory every time data is needed. As data is constantly being worked on, reducing the need to access main memory (which is often slow compared to the ALU and control units) greatly increases the computer's speed. Computer main memory comes in two principal varieties: random-access memory (RAM) and read-only memory (ROM). RAM can be read and written to anytime the CPU commands it, but ROM is preloaded with data and software that never changes, therefore the CPU can only read from it. ROM is typically used to store the computer's initial start-up instructions. In general, the contents of RAM are erased when the power to the computer is turned off, but ROM retains its data indefinitely. In a PC, the ROM contains a specialized program called the BIOS that orchestrates loading the computer's operating system from the hard disk drive into RAM whenever the computer is turned on or reset. In embedded computers, which frequently do not have disk drives, all of the required software may be stored in ROM. Software stored in ROM is often called firmware, because it is notionally more like hardware than software. Flash memory blurs the distinction between ROM and RAM, as it retains its data when turned off but is also rewritable. It is typically much slower than conventional ROM and RAM however, so its use is restricted to applications where high speed is unnecessary.[g] In more sophisticated computers there may be one or more RAM cache memories, which are slower than registers but faster than main memory. Generally computers with this sort of cache are designed to move frequently needed data into the cache automatically, often without the need for any intervention on the programmer's part. I/O is the means by which a computer exchanges information with the outside world. Devices that provide input or output to the computer are called peripherals. On a typical personal computer, peripherals include input devices like the keyboard and mouse, and output devices such as the display and printer. 
Hard disk drives, floppy disk drives and optical disc drives serve as both input and output devices. Computer networking is another form of I/O. I/O devices are often complex computers in their own right, with their own CPU and memory. A graphics processing unit might contain fifty or more tiny computers that perform the calculations necessary to display 3D graphics. Modern desktop computers contain many smaller computers that assist the main CPU in performing I/O. A 2016-era flat screen display contains its own computer circuitry. While a computer may be viewed as running one gigantic program stored in its main memory, in some systems it is necessary to give the appearance of running several programs simultaneously. This is achieved by multitasking, i.e. having the computer switch rapidly between running each program in turn. One means by which this is done is with a special signal called an interrupt, which can periodically cause the computer to stop executing instructions where it was and do something else instead. By remembering where it was executing prior to the interrupt, the computer can return to that task later. If several programs are running "at the same time", then the interrupt generator might be causing several hundred interrupts per second, causing a program switch each time. Since modern computers typically execute instructions several orders of magnitude faster than human perception, it may appear that many programs are running at the same time, even though only one is ever executing in any given instant. This method of multitasking is sometimes termed "time-sharing" since each program is allocated a "slice" of time in turn. Before the era of inexpensive computers, the principal use for multitasking was to allow many people to share the same computer. Seemingly, multitasking would cause a computer that is switching between several programs to run more slowly, in direct proportion to the number of programs it is running, but most programs spend much of their time waiting for slow input/output devices to complete their tasks. If a program is waiting for the user to click on the mouse or press a key on the keyboard, then it will not take a "time slice" until the event it is waiting for has occurred. This frees up time for other programs to execute so that many programs may be run simultaneously without unacceptable speed loss. Some computers are designed to distribute their work across several CPUs in a multiprocessing configuration, a technique once employed in only large and powerful machines such as supercomputers, mainframe computers and servers. Multiprocessor and multi-core (multiple CPUs on a single integrated circuit) personal and laptop computers are now widely available, and are being increasingly used in lower-end markets as a result. Supercomputers in particular often have highly distinctive architectures that differ significantly from the basic stored-program architecture and from general-purpose computers.[h] They often feature thousands of CPUs, customized high-speed interconnects, and specialized computing hardware. Such designs tend to be useful for only specialized tasks due to the large scale of program organization required to use most of the available resources at once. Supercomputers usually see usage in large-scale simulation, graphics rendering, and cryptography applications, as well as with other so-called "embarrassingly parallel" tasks. 
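The interrupt-driven time-sharing described above can be sketched with a round-robin scheduler, using Python generators to stand in for suspended programs; the scheduler plays the role of the interrupt mechanism, and all names here are invented for illustration:

    from collections import deque

    # Each "program" is a generator that yields when its time slice ends,
    # mimicking a program suspended by a timer interrupt.
    def counter(name, steps):
        for i in range(steps):
            print(f"{name}: step {i}")
            yield  # suspend here; the scheduler resumes us later

    def round_robin(programs):
        ready = deque(programs)
        while ready:
            prog = ready.popleft()   # restore the next program's saved state
            try:
                next(prog)           # run it for one time slice
                ready.append(prog)   # unfinished: back of the queue
            except StopIteration:
                pass                 # finished: drop it

    # Interleaves A and B, giving the appearance of simultaneity.
    round_robin([counter("A", 3), counter("B", 2)])
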
Software Software is the part of a computer system that consists of the encoded information that determines the computer's operation, such as data or instructions on how to process the data. In contrast to the physical hardware from which the system is built, software is immaterial. Software includes computer programs, libraries and related non-executable data, such as online documentation or digital media. It is often divided into system software and application software. Computer hardware and software require each other and neither is useful on its own. When software is stored in hardware that cannot easily be modified, such as with BIOS ROM in an IBM PC compatible computer, it is sometimes called "firmware". The defining feature of modern computers which distinguishes them from all other machines is that they can be programmed. That is to say that some type of instructions (the program) can be given to the computer, and it will process them. Modern computers based on the von Neumann architecture often have machine code in the form of an imperative programming language. In practical terms, a computer program may be just a few instructions or extend to many millions of instructions, as do the programs for word processors and web browsers for example. A typical modern computer can execute billions of instructions per second (gigaflops) and rarely makes a mistake over many years of operation. Large computer programs consisting of several million instructions may take teams of programmers years to write, and due to the complexity of the task almost certainly contain errors. This section applies to most common RAM machine–based computers. In most cases, computer instructions are simple: add one number to another, move some data from one location to another, send a message to some external device, etc. These instructions are read from the computer's memory and are generally carried out (executed) in the order they were given. However, there are usually specialized instructions to tell the computer to jump ahead or backwards to some other place in the program and to carry on executing from there. These are called "jump" instructions (or branches). Furthermore, jump instructions may be made to happen conditionally so that different sequences of instructions may be used depending on the result of some previous calculation or some external event. Many computers directly support subroutines by providing a type of jump that "remembers" the location it jumped from and another instruction to return to the instruction following that jump instruction. Program execution might be likened to reading a book. While a person will normally read each word and line in sequence, they may at times jump back to an earlier place in the text or skip sections that are not of interest. Similarly, a computer may sometimes go back and repeat the instructions in some section of the program over and over again until some internal condition is met. This is called the flow of control within the program and it is what allows the computer to perform tasks repeatedly without human intervention. Comparatively, a person using a pocket calculator can perform a basic arithmetic operation such as adding two numbers with just a few button presses. But to add together all of the numbers from 1 to 1,000 would take thousands of button presses and a lot of time, with a near certainty of making a mistake. On the other hand, a computer may be programmed to do this with just a few simple instructions. 
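A program of the kind just described can be written in the MIPS assembly language. The listing below is a reconstructed sketch of such a summation program rather than a verbatim original; the register assignments and labels are illustrative:

      begin:
            addi $8, $0, 0        # initialize the running sum to 0
            addi $9, $0, 1        # set the first number to add to 1
      loop:
            slti $10, $9, 1001    # is the current number still <= 1000?
            beq  $10, $0, finish  # if not, the summation is done
            add  $8, $8, $9       # add the current number to the running sum
            addi $9, $9, 1        # advance to the next number
            j    loop             # jump back and repeat
      finish:
            add  $2, $8, $0       # copy the final sum into the output register
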
Once told to run this program, the computer will perform the repetitive addition task without further human intervention. It will almost never make a mistake and a modern PC can complete the task in a fraction of a second. In most computers, individual instructions are stored as machine code with each instruction being given a unique number (its operation code or opcode for short). The command to add two numbers together would have one opcode; the command to multiply them would have a different opcode, and so on. The simplest computers are able to perform any of a handful of different instructions; the more complex computers have several hundred to choose from, each with a unique numerical code. Since the computer's memory is able to store numbers, it can also store the instruction codes. This leads to the important fact that entire programs (which are just lists of these instructions) can be represented as lists of numbers and can themselves be manipulated inside the computer in the same way as numeric data. The fundamental concept of storing programs in the computer's memory alongside the data they operate on is the crux of the von Neumann, or stored program, architecture. In some cases, a computer might store some or all of its program in memory that is kept separate from the data it operates on. This is called the Harvard architecture after the Harvard Mark I computer. Modern von Neumann computers display some traits of the Harvard architecture in their designs, such as in CPU caches. While it is possible to write computer programs as long lists of numbers (machine language) and while this technique was used with many early computers,[i] it is extremely tedious and potentially error-prone to do so in practice, especially for complicated programs. Instead, each basic instruction can be given a short name that is indicative of its function and easy to remember – a mnemonic such as ADD, SUB, MULT or JUMP. These mnemonics are collectively known as a computer's assembly language. Converting programs written in assembly language into something the computer can actually understand (machine language) is usually done by a computer program called an assembler. A programming language is a notation system for writing the source code from which a computer program is produced. Programming languages provide various ways of specifying programs for computers to run. Unlike natural languages, programming languages are designed to permit no ambiguity and to be concise. They are purely written languages and are often difficult to read aloud. They are generally either translated into machine code by a compiler or an assembler before being run, or translated directly at run time by an interpreter. Sometimes programs are executed by a hybrid method of the two techniques. There are thousands of programming languages—some intended for general purpose programming, others useful for only highly specialized applications. Machine languages and the assembly languages that represent them (collectively termed low-level programming languages) are generally unique to the particular architecture of a computer's central processing unit (CPU). 
For instance, an ARM architecture CPU (such as may be found in a smartphone or a hand-held videogame) cannot understand the machine language of an x86 CPU that might be in a PC.[j] Historically, a significant number of other CPU architectures were created and saw extensive use, notably including the MOS Technology 6502 and 6510 in addition to the Zilog Z80. Although considerably easier than in machine language, writing long programs in assembly language is often difficult and is also error prone. Therefore, most practical programs are written in more abstract high-level programming languages that are able to express the needs of the programmer more conveniently (and thereby help reduce programmer error). High level languages are usually "compiled" into machine language (or sometimes into assembly language and then into machine language) using another computer program called a compiler.[k] High level languages are less related to the workings of the target computer than assembly language, and more related to the language and structure of the problem(s) to be solved by the final program. It is therefore often possible to use different compilers to translate the same high level language program into the machine language of many different types of computer. This is part of the means by which software like video games may be made available for different computer architectures such as personal computers and various video game consoles. Program design of small programs is relatively simple and involves the analysis of the problem, collection of inputs, using the programming constructs within languages, devising or using established procedures and algorithms, providing data for output devices and solutions to the problem as applicable. As problems become larger and more complex, features such as subprograms, modules, formal documentation, and new paradigms such as object-oriented programming are encountered. Large programs involving thousands of lines of code and more require formal software methodologies. The task of developing large software systems presents a significant intellectual challenge. Producing software with an acceptably high reliability within a predictable schedule and budget has historically been difficult; the academic and professional discipline of software engineering concentrates specifically on this challenge. Errors in computer programs are called "bugs". They may be benign and not affect the usefulness of the program, or have only subtle effects. However, in some cases they may cause the program or the entire system to "hang", becoming unresponsive to input such as mouse clicks or keystrokes, to completely fail, or to crash. Otherwise benign bugs may sometimes be harnessed for malicious intent by an unscrupulous user writing an exploit, code designed to take advantage of a bug and disrupt a computer's proper execution. Bugs are usually not the fault of the computer. Since computers merely execute the instructions they are given, bugs are nearly always the result of programmer error or an oversight made in the program's design.[l] Admiral Grace Hopper, an American computer scientist and developer of the first compiler, is credited for having first used the term "bugs" in computing after a dead moth was found shorting a relay in the Harvard Mark II computer in September 1947. Networking and the Internet Computers have been used to coordinate information between multiple physical locations since the 1950s. The U.S. 
Networking and the Internet

Computers have been used to coordinate information between multiple physical locations since the 1950s. The U.S. military's SAGE system was the first large-scale example of such a system, which led to a number of special-purpose commercial systems such as Sabre. In the 1970s, computer engineers at research institutions throughout the United States began to link their computers together using telecommunications technology. The effort was funded by ARPA (now DARPA), and the computer network that resulted was called the ARPANET.

Logic gates are a common abstraction which can apply to most of the above digital or analog paradigms. The ability to store and execute lists of instructions called programs makes computers extremely versatile, distinguishing them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a minimum capability (being Turing-complete) is, in principle, capable of performing the same tasks that any other computer can perform. Therefore, any type of computer (netbook, supercomputer, cellular automaton, etc.) is able to perform the same computational tasks, given enough time and storage capacity.

In the 20th century, artificial intelligence systems were predominantly symbolic: they executed code that was explicitly programmed by software developers. Machine learning models, however, have a set of parameters that are adjusted throughout training, so that the model learns to accomplish a task based on the provided data (a toy illustration appears at the end of this passage). The efficiency of machine learning (and in particular of neural networks) has rapidly improved with progress in hardware for parallel computing, mainly graphics processing units (GPUs). Some large language models are able to control computers or robots. AI progress may lead to the creation of artificial general intelligence (AGI), a type of AI that could accomplish virtually any intellectual task at least as well as humans.

Professions and organizations

As the use of computers has spread throughout society, there is an increasing number of careers involving computers. The need for computers to work well together and to be able to exchange information has spawned the need for many standards organizations, clubs and societies of both a formal and informal nature.
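A toy illustration (mine, not the article's) of parameters being adjusted throughout training: fitting a one-variable linear model to data by gradient descent in Python. The data, learning rate, and iteration count are arbitrary choices for the sketch:

    # Fit y = w*x + b to synthetic data by gradient descent.
    data = [(x, 3.0 * x + 1.0) for x in range(10)]   # generated with w=3, b=1

    w, b = 0.0, 0.0        # the model's parameters, initially arbitrary
    lr = 0.01              # learning rate (step size)
    for _ in range(2000):  # the training loop
        # gradients of the mean squared error with respect to w and b
        gw = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
        gb = sum(2 * (w * x + b - y) for x, y in data) / len(data)
        w -= lr * gw       # adjust each parameter against its gradient
        b -= lr * gb

    print(round(w, 2), round(b, 2))   # approaches the generating values 3.0 and 1.0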
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/X-12-ARIMA] | [TOKENS: 553] |
X-13ARIMA-SEATS

X-13ARIMA-SEATS, successor to X-12-ARIMA and X-11, is a set of statistical methods for seasonal adjustment and other descriptive analysis of time series data that are implemented in the U.S. Census Bureau's software package. These methods are or have been used by Statistics Canada, the Australian Bureau of Statistics, and the statistical offices of many other countries. X-12-ARIMA can be used together with many statistical packages, such as SAS in its econometric and time series (ETS) package, R in its seasonal package, Gretl, or EViews, which provides a graphical user interface for X-12-ARIMA, and NumXL, which provides X-12-ARIMA functionality in Microsoft Excel. There is also a version for MATLAB. Notable statistical agencies presently using X-12-ARIMA for seasonal adjustment include Statistics Canada, the U.S. Bureau of Labor Statistics, and the Census and Statistics Department (Hong Kong). The Brazilian Institute of Geography and Statistics uses X-13-ARIMA. X-12-ARIMA was the successor to X-11-ARIMA; the current version is X-13ARIMA-SEATS. X-13ARIMA-SEATS's source code can be found on the Census Bureau's website.

Methods

The default method for seasonal adjustment is based on the X-11 algorithm. It is assumed that the observations in a time series Y_t can be decomposed either additively, Y_t = T_t + S_t + I_t, or multiplicatively, Y_t = T_t × S_t × I_t. In this decomposition, T_t is the trend (or "trend cycle", because it also includes cyclical movements such as business cycles) component, S_t is the seasonal component, and I_t is the irregular (or random) component. The goal is to estimate each of the three components and then remove the seasonal component from the time series, producing a seasonally adjusted time series. The decomposition is accomplished through the iterative application of centered moving averages: for an additive decomposition of a monthly time series, for example, the algorithm first estimates the trend with a centered moving average, removes the trend to leave the seasonal and irregular components, estimates seasonal factors by averaging the detrended values month by month, and then refines the trend and seasonal estimates over further passes (a simplified sketch follows this passage). The method also includes a number of tests, diagnostics and other statistics for evaluating the quality of the seasonal adjustments.

Copyright and conditions

The software is a work of the US government, and such works are in the public domain within the United States; for other countries copyright has been asserted, with use of the software permitted under conditions including that the "User agrees to make a good faith effort to use the Software in a way that does not cause damage, harm, or embarrassment to the United States/Commerce."
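A simplified sketch, in Python with NumPy, of one additive decomposition pass of the kind described above. It is illustrative only: the real X-13ARIMA-SEATS program uses specific seasonal and Henderson trend filters, handles the ends of the series with asymmetric weights or ARIMA forecast extension, and iterates these steps.

    import numpy as np

    def trend_2x12(y):
        # 2x12 centered moving average, a standard trend estimate for
        # monthly data (series ends are distorted in this simple version).
        w = np.ones(13)
        w[0] = w[-1] = 0.5
        w /= 12.0
        return np.convolve(y, w, mode="same")

    def decompose_additive(y):
        # Y_t = T_t + S_t + I_t
        trend = trend_2x12(y)
        detrended = y - trend
        # Average the detrended values month by month for the seasonal
        # factors, then normalize them to sum to zero over the year.
        monthly = np.array([detrended[m::12].mean() for m in range(12)])
        monthly -= monthly.mean()
        seasonal = np.resize(monthly, len(y))
        irregular = y - trend - seasonal
        return trend, seasonal, irregular

    # Toy monthly series, 10 years long: trend + seasonality + noise.
    rng = np.random.default_rng(0)
    t = np.arange(120)
    y = 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, 120)
    trend, seasonal, irregular = decompose_additive(y)
    adjusted = y - seasonal   # the seasonally adjusted series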
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/The_New_York_Times#Circulation] | [TOKENS: 13653] |
The New York Times

The New York Times (NYT)[b] is a newspaper based in Manhattan, New York City. The New York Times covers domestic, national, and international news, and publishes opinion pieces and reviews. As one of the longest-running newspapers in the United States, the Times serves as one of the country's newspapers of record. As of August 2025, The New York Times had 11.88 million total and 11.3 million online subscribers, both by significant margins the highest numbers for any newspaper in the United States; the total also included 580,000 print subscribers. The New York Times is published by the New York Times Company; since 1896, the company has been chaired by the Ochs-Sulzberger family, whose current chairman and the paper's publisher is A. G. Sulzberger. The Times is headquartered at The New York Times Building in Midtown Manhattan.

The Times was founded as the conservative New-York Daily Times in 1851, and came to national recognition in the 1870s with its aggressive coverage of corrupt politician Boss Tweed. Following the Panic of 1893, Chattanooga Times publisher Adolph Ochs gained a controlling interest in the company. In 1935, Ochs was succeeded by his son-in-law, Arthur Hays Sulzberger, who began a push into European news. Sulzberger's son Arthur Ochs Sulzberger became publisher in 1963, adapting to a changing newspaper industry and introducing radical changes. The New York Times was involved in the landmark 1964 U.S. Supreme Court case New York Times Co. v. Sullivan, which restricted the ability of public officials to sue the media for defamation. In 1971, The New York Times published the Pentagon Papers, an internal Department of Defense document detailing the United States's historical involvement in the Vietnam War, despite pushback from then-president Richard Nixon. In the landmark decision New York Times Co. v. United States (1971), the Supreme Court ruled that the First Amendment guaranteed the right to publish the Pentagon Papers. In the 1980s, the Times began a two-decade progression to digital technology and launched nytimes.com in 1996. In the 21st century, it shifted its publication online amid the global decline of newspapers.

Currently, the Times maintains several regional bureaus staffed with journalists across six continents. It has expanded to several other publications, including The New York Times Magazine, The New York Times International Edition, and The New York Times Book Review. In addition, the paper has produced several television series, podcasts—including The Daily—and games through The New York Times Games. The New York Times has been involved in a number of controversies in its history. Among other accolades, it has been awarded the Pulitzer Prize 135 times since 1918, the most of any publication. According to a 2025 Pew Research Center study on educational differences among audiences of 30 major U.S. news outlets, The New York Times had the highest proportion of college-educated readers among the daily newspapers surveyed, with 56% of its audience holding at least a bachelor's degree.

History

The New York Times was established in 1851 as the New-York Daily Times by New-York Tribune journalists Henry Jarvis Raymond and George Jones. The Times achieved significant circulation, particularly among conservatives; New-York Tribune publisher Horace Greeley praised the Times. During the American Civil War, Times correspondents gathered information directly from Confederate states.
In 1869, Jones inherited the paper from Raymond, who had changed its name to The New-York Times. Under Jones, the Times began to publish a series of articles criticizing Tammany Hall political boss William M. Tweed, despite vehement opposition from other New York newspapers. In 1871, The New-York Times published Tammany Hall's accounting books; Tweed was tried in 1873 and sentenced to twelve years in prison. The Times earned national recognition for its coverage of Tweed. In 1891, Jones died, creating a management imbroglio in which his children had insufficient business acumen to inherit the company and his will prevented an acquisition of the Times. Editor-in-chief Charles Ransom Miller, editorial editor Edward Cary, and correspondent George F. Spinney established a company to manage The New-York Times, but faced financial difficulties during the Panic of 1893.

In August 1896, Chattanooga Times publisher Adolph Ochs acquired The New-York Times, implementing significant alterations to the newspaper's structure. Ochs established the Times as a merchant's newspaper and removed the hyphen from the newspaper's name. In 1905, The New York Times opened Times Tower, marking its expansion. The Times experienced a political realignment in the 1910s amid several disagreements within the Republican Party. The New York Times reported on the sinking of the Titanic while other newspapers remained cautious about bulletins circulated by the Associated Press. Through managing editor Carr Van Anda, the Times paid considerable attention to advances in science, reporting on Albert Einstein's then-obscure theory of general relativity and becoming involved in the discovery of the tomb of Tutankhamun.

In April 1935, Ochs died, leaving his son-in-law Arthur Hays Sulzberger as publisher. The Great Depression forced Sulzberger to reduce The New York Times's operations, and developments in the New York newspaper landscape resulted in the formation of larger newspapers, such as the New York Herald Tribune and the New York World-Telegram. In contrast to Ochs, Sulzberger encouraged wirephotography. The New York Times extensively covered World War II through large headlines, reporting on exclusive stories such as the Yugoslav coup d'état. Amid the war, Sulzberger began expanding the Times's operations further, acquiring WQXR-FM in 1944—the first non-Times investment since the Jones era—and establishing a fashion show in Times Hall. Despite reductions as a result of conscription, The New York Times retained the largest journalism staff of any newspaper. The Times's print edition became available internationally during the war through the Army & Air Force Exchange Service; The New York Times Overseas Weekly later became available in Japan through The Asahi Shimbun and in Germany through the Frankfurter Zeitung. The international edition would develop into a separate newspaper.

Journalist William L. Laurence publicized the atomic bomb race between the United States and Germany, resulting in the Federal Bureau of Investigation seizing copies of the Times. The United States government recruited Laurence to document the Manhattan Project in April 1945. Laurence became the only journalist to witness the Manhattan Project, a detail realized by employees of The New York Times following the atomic bombing of Hiroshima. Following World War II, The New York Times continued to expand.
The Times was subject to investigations from the Senate Internal Security Subcommittee, a McCarthyist subcommittee that investigated purported communism within press institutions. Arthur Hays Sulzberger's decision to dismiss a copyreader who had pleaded the Fifth Amendment drew ire from within the Times and from external organizations. In April 1961, Sulzberger resigned, appointing as publisher his son-in-law, The New York Times Company president Orvil Dryfoos. Under Dryfoos, The New York Times established a newspaper based in Los Angeles. In 1962, the implementation of automated printing presses in response to increasing costs stoked fears of technological unemployment. The New York Typographical Union staged a strike in December, altering the media consumption of New Yorkers. The strike left New York with three remaining newspapers—the Times, the Daily News, and the New York Post—by its conclusion in March 1963. In May, Dryfoos died of a heart ailment. Following weeks of ambiguity, Arthur Ochs Sulzberger became The New York Times's publisher. Technological advancements leveraged by newspapers such as the Los Angeles Times, and improvements in coverage at The Washington Post and The Wall Street Journal, necessitated the Times's adaptation to nascent computing.

The New York Times published "Heed Their Rising Voices" in 1960, a full-page advertisement purchased by supporters of Martin Luther King Jr. criticizing law enforcement in Montgomery, Alabama, for its response to the civil rights movement. Montgomery public safety commissioner L. B. Sullivan sued the Times for defamation. In New York Times Co. v. Sullivan (1964), the U.S. Supreme Court ruled that the verdicts in Alabama county court and the Supreme Court of Alabama violated the First Amendment. The decision is considered a landmark ruling. After financial losses, The New York Times ended its international edition, acquiring a stake in the Paris Herald Tribune and forming the International Herald Tribune.

The Times was the first to publish the Pentagon Papers, facing opposition from then-president Richard Nixon. The Supreme Court ruled in The New York Times's favor in New York Times Co. v. United States (1971), allowing the Times and The Washington Post to publish the papers. The New York Times remained cautious in its initial coverage of the Watergate scandal. As Congress began investigating the scandal, the Times furthered its coverage, publishing details on the Huston Plan, alleged wiretapping of reporters and officials, and testimony from James W. McCord Jr. that the Committee for the Re-Election of the President had paid off the conspirators.

The exodus of readers to suburban New York newspapers, such as Newsday and Gannett papers, adversely affected The New York Times's circulation. Contemporary newspapers balked at the Times's additional sections; Time devoted a cover to its criticism and New York wrote that the Times was engaging in "middle-class self-absorption". The New York Times, the Daily News, and the New York Post were the subject of a strike in 1978, allowing emerging newspapers to capitalize on the halted coverage. The Times deliberately avoided coverage of the AIDS epidemic, running its first front-page article on it in May 1983. Max Frankel's editorial coverage of the epidemic, with mentions of anal intercourse, contrasted with then-executive editor A. M. Rosenthal's puritanical approach, which intentionally avoided describing the luridity of gay venues.
Following years of waning interest in The New York Times, Sulzberger resigned in January 1992, appointing his son, Arthur Ochs Sulzberger Jr., as publisher. The Internet represented a generational shift within the Times; Sulzberger, who negotiated The New York Times Company's acquisition of The Boston Globe in 1993, derided the Internet, while his son expressed antithetical views. @times appeared on America Online in May 1994 as an extension of The New York Times, featuring news articles, film reviews, sports news, and business articles. Despite opposition, several employees of the Times had begun to access the Internet. The online success of publications that traditionally co-existed with the Times—such as America Online, Yahoo, and CNN—and the expansion of websites such as Monster.com and Craigslist that threatened The New York Times's classified advertisement model increased efforts to develop a website. nytimes.com debuted on January 19, 1996, and was formally announced three days later. The Times published domestic terrorist Ted Kaczynski's essay Industrial Society and Its Future in 1995, contributing to his arrest after his brother David recognized the essay's writing style.

Following the establishment of nytimes.com, The New York Times retained its journalistic hesitancy under executive editor Joseph Lelyveld, refusing to publish an article on the Clinton–Lewinsky scandal based on reporting from the Drudge Report. nytimes.com editors conflicted with print editors on several occasions, including in wrongly naming security guard Richard Jewell as the suspect in the Centennial Olympic Park bombing and in covering the death of Diana, Princess of Wales in greater detail than the print edition. The New York Times Electronic Media Company was adversely affected by the dot-com crash.

The Times extensively covered the September 11 attacks. The following day's print issue contained sixty-six articles, the work of over three hundred dispatched reporters. Journalist Judith Miller received a package containing a white powder during the 2001 anthrax attacks, furthering anxiety within The New York Times. In September 2002, Miller and military correspondent Michael R. Gordon wrote an article for the Times claiming that Iraq had purchased aluminum tubes. The article was cited by then-president George W. Bush to claim that Iraq was constructing weapons of mass destruction; the theoretical use of the aluminum tubes to produce nuclear material was speculation. In March 2003, the United States invaded Iraq, beginning the Iraq War.

The New York Times attracted controversy after thirty-six articles by journalist Jayson Blair were discovered to be plagiarized. Criticism of then-executive editor Howell Raines and then-managing editor Gerald M. Boyd mounted following the scandal, culminating in a town hall in which a deputy editor criticized Raines for failing to question Blair's sources in an article he wrote on the D.C. sniper attacks. In June 2003, Raines and Boyd resigned. Arthur Ochs Sulzberger Jr. appointed Bill Keller as executive editor. Miller continued to report on the Iraq War as a journalistic embed covering the country's weapons of mass destruction program. Keller and then-Washington bureau chief Jill Abramson unsuccessfully attempted to quell the criticism. Conservative media criticized the Times over its coverage of missing explosives from the Al Qa'qaa weapons facility.
An article in December 2005 disclosing warrantless surveillance by the National Security Agency contributed to further criticism from the George W. Bush administration and to the Senate's refusal to renew the Patriot Act. In the Plame affair, a Central Intelligence Agency inquiry found that Miller had become aware of Valerie Plame's identity through then-vice president Dick Cheney's chief of staff Scooter Libby, resulting in Miller's resignation.

During the Great Recession, The New York Times suffered significant fiscal difficulties as a consequence of the subprime mortgage crisis and a decline in classified advertising. Exacerbated by Rupert Murdoch's revitalization of The Wall Street Journal through his acquisition of Dow Jones & Company, The New York Times Company began enacting measures to reduce the newsroom budget. The company was forced to borrow $250 million (equivalent to $373.84 million in 2025) from Mexican billionaire Carlos Slim and had fired over one hundred employees by 2010. nytimes.com's coverage of the Eliot Spitzer prostitution scandal, which resulted in the resignation of then-New York governor Eliot Spitzer, furthered the legitimacy of the website as a journalistic medium. The Times's economic downturn renewed discussions of an online paywall; The New York Times implemented a paywall in March 2011. Abramson succeeded Keller as executive editor, bringing her characteristic investigations into corporate and government malfeasance to the Times's coverage. Following conflicts with newly appointed chief executive Mark Thompson over his ambitions, Abramson was dismissed by Sulzberger Jr., who named Dean Baquet as her replacement.

Leading up to the 2016 presidential election, The New York Times elevated the Hillary Clinton email controversy into a national issue. Donald Trump's upset victory contributed to an increase in subscriptions to the Times. The New York Times experienced unprecedented indignation from Trump, who referred to publications such as the Times as "enemies of the people" at the Conservative Political Action Conference and tweeted his disdain for the newspaper and CNN. In October 2017, The New York Times published an article by journalists Jodi Kantor and Megan Twohey alleging that dozens of women had accused film producer and The Weinstein Company co-chairman Harvey Weinstein of sexual misconduct. The investigation resulted in Weinstein's resignation and conviction, precipitated the Weinstein effect, and served as a catalyst for the #MeToo movement. The New York Times Company vacated the public editor position and eliminated the copy desk in November.

Sulzberger Jr. announced his resignation in December 2017, appointing his son, A. G. Sulzberger, as publisher. Sulzberger's tenure was marked by his relationship with Trump, at turns diplomatic and antagonistic. In September 2018, The New York Times published "I Am Part of the Resistance Inside the Trump Administration", an anonymous essay by a self-described Trump administration official later revealed to be Department of Homeland Security chief of staff Miles Taylor. The animosity—which extended to nearly three hundred instances of Trump disparaging the Times by May 2019—culminated in Trump ordering federal agencies to cancel their subscriptions to The New York Times and The Washington Post in October 2019. Trump's tax returns have been the subject of three separate investigations.[c] During the COVID-19 pandemic, the Times began incorporating data services and graphics into its coverage.
On May 23, 2020, The New York Times's front page solely featured "U.S. Deaths Near 100,000, An Incalculable Loss", naming a subset of the 100,000 people in the United States who had died of COVID-19; it was the first time that the Times's front page had lacked images since their introduction. Since 2020, The New York Times has focused on broader diversification, developing online games and producing television series. The New York Times Company acquired The Athletic in January 2022.

Organization

Since 1896, The New York Times has been published by the Ochs-Sulzberger family, having previously been published by Henry Jarvis Raymond until 1869 and by George Jones until 1896. Adolph Ochs published the Times until his death in 1935, when he was succeeded by his son-in-law, Arthur Hays Sulzberger. Sulzberger was publisher until 1961 and was succeeded by Orvil Dryfoos, his son-in-law, who served in the position until his death in 1963. Arthur Ochs Sulzberger succeeded Dryfoos and served until his resignation in 1992. His son, Arthur Ochs Sulzberger Jr., served as publisher until 2018. The New York Times's current publisher is A. G. Sulzberger, Sulzberger Jr.'s son. As of 2023, the Times's executive editor is Joseph Kahn and the paper's managing editors are Marc Lacey and Carolyn Ryan, who were appointed in June 2022. The New York Times's deputy managing editors are Sam Dolnick, Monica Drake, and Steve Duenes, and the paper's assistant managing editors are Matthew Ericson, Jonathan Galinsky, Hannah Poferl, Sam Sifton, Karron Skog, and Michael Slackman.

The New York Times is owned by The New York Times Company, a publicly traded company. The New York Times Company, in addition to the Times, owns Wirecutter, The Athletic, The New York Times Cooking, and The New York Times Games, and has acquired Serial Productions and Audm. The New York Times Company holds undisclosed minority investments in multiple other businesses, and formerly owned The Boston Globe and several radio and television stations. The New York Times Company is majority-owned by the Ochs-Sulzberger family through elevated shares in the company's dual-class stock structure, held largely in a trust in effect since the 1950s; as of 2022, the family holds ninety-five percent of The New York Times Company's Class B shares, allowing it to elect seventy percent of the company's board of directors. Class A shareholders have restricted voting rights. As of 2023, The New York Times Company's chief executive is Meredith Kopit Levien, the company's former chief operating officer, who was appointed in September 2020. As of March 2023, The New York Times Company employs 5,800 individuals, including 1,700 journalists, according to deputy managing editor Sam Dolnick.

Journalists for The New York Times may not run for public office, provide financial support to political candidates or causes, endorse candidates, or demonstrate public support for causes or movements. Journalists are subject to the guidelines established in "Ethical Journalism" and "Guidelines on Integrity". According to the former, Times journalists must abstain from using sources with a personal relationship to them and must not accept reimbursements or inducements from individuals who may be written about in The New York Times, with exceptions for gifts of nominal value. The latter requires attribution and exact quotations, though exceptions are made for linguistic anomalies. Staff writers are expected to ensure the veracity of all written claims, but may delegate researching obscure facts to the research desk.
In March 2021, the Times established a committee to avoid journalistic conflicts of interest with work written for The New York Times, following columnist David Brooks's resignation from the Aspen Institute over his undisclosed work on the initiative Weave.

The New York Times editorial board was established in 1896 by Adolph Ochs. With the opinion department, the editorial board is independent of the newsroom. Then-editor-in-chief Charles Ransom Miller served as opinion editor from 1883 until his death in 1922. Rollo Ogden succeeded Miller until his death in 1937. From 1937 to 1938, John Huston Finley served as opinion editor; in a prearranged plan, Charles Merz succeeded Finley. Merz served in the position until his retirement in 1961. John Bertram Oakes served as opinion editor from 1961 to 1976, when then-publisher Arthur Ochs Sulzberger appointed Max Frankel. Frankel served in the position until 1986, when he was appointed as executive editor. Jack Rosenthal was the opinion editor from 1986 to 1993. Howell Raines succeeded Rosenthal until 2001, when he was made executive editor. Gail Collins succeeded Raines until her resignation in 2006. From 2007 to 2016, Andrew Rosenthal was the opinion editor. James Bennet succeeded Rosenthal until his resignation in 2020. As of July 2024, the editorial board comprises thirteen opinion writers. The New York Times's opinion editor is Kathleen Kingsbury and the deputy opinion editor is Patrick Healy.

The New York Times's editorial board was initially opposed to liberal beliefs, opposing women's suffrage in 1900 and 1914. The editorial board began to espouse progressive beliefs during Oakes's tenure, conflicting with the Ochs-Sulzberger family, of which Oakes was a member as Adolph Ochs's nephew; in 1976, Oakes publicly disagreed with Sulzberger's endorsement of Daniel Patrick Moynihan over Bella Abzug in the 1976 Senate Democratic primary in a letter sent from Martha's Vineyard. Under Rosenthal, the editorial board took positions supporting assault weapons legislation and the legalization of marijuana, but publicly criticized the Obama administration over its portrayal of terrorism. In presidential elections, The New York Times has endorsed a total of twelve Republican candidates and thirty-two Democratic candidates, and has endorsed the Democrat in every election since 1960.[j] With the exception of Wendell Willkie, every Republican endorsed by the Times has won the presidency. In 2016, the editorial board issued an anti-endorsement against Donald Trump for the first time in its history. In February 2020, the editorial board reduced its presence from several editorials each day to occasional editorials for events deemed particularly significant. Since August 2024, the board no longer endorses candidates in local or congressional races in New York.

Since 1940, editorial, media, and technology workers of The New York Times have been represented by the New York Times Guild. The Times Guild, along with the Times Tech Guild, is represented by the NewsGuild-CWA. In 1940, Arthur Hays Sulzberger was called before the National Labor Relations Board amid accusations that he had discouraged Guild membership at the Times. Over the next few years, the Guild ratified several contracts, expanding to editorial and news staff in 1942 and maintenance workers in 1943.
The New York Times Guild has walked out several times in its history, including for six and a half hours in 1981 and in 2017, when copy editors and reporters walked out at lunchtime in response to the elimination of the copy desk. On December 7, 2022, the union held a one-day strike, the first interruption to The New York Times since 1978. The New York Times Guild reached an agreement in May 2023 to increase minimum salaries for employees and provide a retroactive bonus. The Times Tech Guild is the largest technology union with collective bargaining rights in the United States. The guild held a second strike beginning on November 4, 2024, threatening the Times's coverage of the 2024 United States presidential election.

Content

As of August 2025, The New York Times has 11.8 million subscribers, with 11.3 million online-only subscribers and 580,000 print subscribers. The New York Times Company intends to have 15 million subscribers by 2027. The Times's shift towards subscription-based revenue with the debut of an online paywall in 2011 contributed to subscription revenue exceeding advertising revenue the following year, furthered by the 2016 presidential election and Donald Trump. In 2022, Vox wrote that The New York Times's subscribers skew "older, richer, whiter, and more liberal"; to better reflect the general population of the United States, the Times has attempted to alter its audience by acquiring The Athletic, investing in verticals such as The New York Times Games, and beginning a marketing campaign showing diverse subscribers to the Times. The New York Times Company chief executive Meredith Kopit Levien stated that the average age of subscribers has remained constant.

In October 2001, The New York Times began publishing DealBook, a financial newsletter edited by Andrew Ross Sorkin. The Times had intended to publish the newsletter in September, but delayed its debut following the September 11 attacks. A website for DealBook was established in March 2006. The New York Times began shifting towards DealBook as part of the newspaper's financial coverage in November 2010 with a renewed website and a presence in the Times's print edition. In 2011, the Times began hosting the DealBook Summit, an annual conference hosted by Sorkin. During the COVID-19 pandemic, The New York Times hosted the DealBook Online Summit in 2020 and 2021. The 2022 DealBook Summit featured—among other speakers—former vice president Mike Pence and Israeli prime minister Benjamin Netanyahu, culminating in an interview with former FTX chief executive Sam Bankman-Fried; FTX had filed for bankruptcy several weeks prior. The 2023 DealBook Summit's speakers included vice president Kamala Harris, Israeli president Isaac Herzog, and businessman Elon Musk.

In June 2010, The New York Times licensed the political blog FiveThirtyEight in a three-year agreement. The blog, written by Nate Silver, had garnered attention during the 2008 presidential election for correctly predicting the result in forty-nine of fifty states. FiveThirtyEight appeared on nytimes.com in August. According to Silver, several offers were made for the blog; Silver wrote that a merger of unequals must allow for editorial sovereignty and resources from the acquirer, comparing himself to Groucho Marx. According to The New Republic, FiveThirtyEight drew as much as a fifth of the traffic to nytimes.com during the 2012 presidential election. In July 2013, FiveThirtyEight was sold to ESPN.
In an article following Silver's exit, public editor Margaret Sullivan wrote that he had been disruptive to the Times's culture for his perspective on probability-based predictions and his scorn for punditry—having stated that punditry is "fundamentally useless"—and compared him to Billy Beane, who implemented sabermetrics in baseball. According to Sullivan, his work was criticized by several notable political journalists. The New Republic obtained a memo in November 2013 revealing then-Washington bureau chief David Leonhardt's ambitions to establish a data-driven newsletter with presidential historian Michael Beschloss, graphic designer Amanda Cox, economist Justin Wolfers, and The New Republic journalist Nate Cohn. By March, Leonhardt had amassed fifteen employees from within The New York Times; the newsletter's staff included individuals who had created the Times's dialect quiz, fourth-down analyzer, and a calculator for deciding whether to buy or rent a home. The Upshot debuted in April 2014. Fast Company reviewed an article about Illinois Secure Choice—a state-funded retirement saving system—as "neither a terse news item, nor a formal financial advice column, nor a politically charged response to economic policy", citing its informal and neutral tone. The Upshot developed "the needle" for the 2016 and 2020 presidential elections, a thermometer dial displaying the probability of a candidate winning. In January 2016, Cox was named editor of The Upshot. Kevin Quealy was named editor in June 2022.

The New York Times has said it is perceived as a liberal newspaper. An analysis by Pew Research Center in October 2014 placed the Times readership as ideologically liberal based on a scale of 10 political values questions. According to an internal readership poll conducted by The New York Times in 2019, eighty-four percent of readers identified as liberal. The New York Times has struggled internally with how to balance its coverage, dismissing criticism from the left that it "sanewashes" right-wing viewpoints in its coverage of Donald Trump. In covering Israel's war on the Gaza Strip that began in 2023, The New York Times instructed its reporters to restrict use of the terms 'Palestine', 'genocide', and 'refugee camps' to specific usages, with data analysis showing a pattern of articles emphasizing Israeli civilians killed by Palestinians over a much larger number of Palestinian civilians killed by Israelis. The group Writers Against the War on Gaza wrote in the blog Mondoweiss that this contrasts with The New York Times's coverage of Russia's invasion of Ukraine, in which Russia is considered a threat to U.S. foreign policy interests, while Israel is considered an ally.

In February 1942, The New York Times crossword debuted in The New York Times Magazine; according to Richard Shepard, the attack on Pearl Harbor in December 1941 convinced then-publisher Arthur Hays Sulzberger of the necessity of a crossword.

The New York Times has published recipes since the 1850s and has had a separate food section since the 1940s. In 1961, restaurant critic Craig Claiborne published The New York Times Cookbook, an unauthorized cookbook that drew from the Times's recipes. Since 2010, former food editor Amanda Hesser has published The Essential New York Times Cookbook, a compendium of recipes from The New York Times. The Innovation Report in 2014 revealed that the Times had attempted to establish a cooking website since 1998, but faced difficulties with the absence of a defined data structure for recipes.
In September 2014, The New York Times introduced NYT Cooking, an application and website. Edited by food editor Sam Sifton, the Times's cooking website features 21,000 recipes as of 2022. NYT Cooking features videos, an effort for which Sifton hired two former employees of BuzzFeed's Tasty. In August 2023, NYT Cooking added personalized recommendations based on the cosine similarity of text embeddings of recipe titles (a toy sketch of the idea appears after this passage). The website also features no-recipe recipes, a concept proposed by Sifton. In May 2016, The New York Times Company announced a partnership with the startup Chef'd to form a meal delivery service that would deliver ingredients for The New York Times Cooking recipes to subscribers; Chef'd shut down in July 2018 after failing to raise capital and secure financing. The Hollywood Reporter reported in September 2022 that the Times would expand its delivery options to US$95 cooking kits curated by chefs such as Nina Compton, Chintan Pandya, and Naoko Takei Moore. That month, the staff of NYT Cooking went on tour with Compton, Pandya, and Moore in Los Angeles, New Orleans, and New York City, culminating in a food festival.

In addition, The New York Times offered its own wine club, originally operated by the Global Wine Company. The New York Times Wine Club was established in August 2009, during a dramatic decrease in advertising revenue. By 2021, the wine club was managed by Lot18, a company that provides proprietary labels. Lot18 managed the Williams Sonoma Wine Club and its own wine club, Tasting Room.

The New York Times archives its articles in a basement annex beneath its building known as "the morgue", a venture started by managing editor Carr Van Anda in 1907. The morgue comprises news clippings, a pictures library, and the Times's book and periodicals library. As of 2014, it was the largest library of any media company, dating back to 1851. In November 2018, The New York Times partnered with Google to digitize the Archival Library. Additionally, The New York Times has maintained a virtual microfilm reader known as TimesMachine since 2014. The service launched with archives from 1851 to 1980; in 2016, TimesMachine expanded to include archives from 1981 to 2002. The Times built a pipeline that takes in TIFF images, article metadata in XML, and an INI file of Cartesian geometry describing the boundaries of each page, and converts them into PNG image tiles and JSON containing the information from the XML and INI files. The image tiles are generated using GDAL and displayed using Leaflet, using data from a content delivery network. The Times ran optical character recognition on the articles using Tesseract, then shingled and fuzzy-string-matched the result (a toy sketch of these two techniques also appears after this passage).

The New York Times uses a proprietary content management system known as Scoop for its online content and the Microsoft Word-based content management system CCI for its print content. Scoop was developed in 2008 to serve as a secondary content management system for editors working in CCI to publish their content on the Times's website; as part of The New York Times's online endeavors, editors now write their content in Scoop and send their work to CCI for print publication. Since its introduction, Scoop has superseded several processes within the Times, including print edition planning and collaboration, and features tools such as multimedia integration, notifications, content tagging, and drafts.
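A toy sketch of cosine-similarity recommendation as described in the cooking passage above. The article does not say how NYT Cooking produces its embeddings, so crude word-count vectors stand in here for real learned embeddings, and the recipe titles are invented:

    from collections import Counter
    from math import sqrt

    def embed(title):
        # Stand-in "embedding": a word-count vector of the recipe title.
        return Counter(title.lower().split())

    def cosine(u, v):
        # Cosine similarity: the dot product divided by the vector norms.
        dot = sum(u[k] * v[k] for k in u.keys() & v.keys())
        norm = sqrt(sum(c * c for c in u.values())) * sqrt(sum(c * c for c in v.values()))
        return dot / norm if norm else 0.0

    recipes = ["Roast Chicken With Lemon", "Lemon Chicken Soup", "Chocolate Babka"]
    liked = embed("Lemon Roast Chicken")
    ranked = sorted(recipes, key=lambda r: cosine(liked, embed(r)), reverse=True)
    print(ranked)   # titles sharing words with the liked title rank first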
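And a toy sketch of the shingling and fuzzy string matching the TimesMachine passage names for reconciling noisy OCR output with article text; the shingle size, threshold, and sample strings are illustrative guesses, not details from the article:

    from difflib import SequenceMatcher

    def shingles(text, k):
        # Overlapping k-word "shingles" of a text.
        words = text.split()
        return [" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))]

    def fuzzy_contains(ocr_text, phrase, threshold=0.8):
        # True if some shingle of the OCR text approximately matches the phrase.
        k = len(phrase.split())
        return any(
            SequenceMatcher(None, s.lower(), phrase.lower()).ratio() >= threshold
            for s in shingles(ocr_text, k)
        )

    ocr = "Tne Nevv York Times was established in 1851 as the NewYork Daily Times"
    print(fuzzy_contains(ocr, "The New York Times"))   # True despite the OCR errors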
The New York Times uses private articles for high-profile opinion pieces, such as those written by Russian president Vladimir Putin and actress Angelina Jolie, and for high-level investigations. In January 2012, the Times released Integrated Content Editor (ICE), a revision tracking tool for WordPress and TinyMCE. ICE is integrated within the Times's workflow by providing a unified text editor for print and online editors, reducing the divide between print and online operations. By 2017, The New York Times had begun developing a new authoring tool for its content management system, known as Oak, in an attempt to further the Times's visual efforts in articles and reduce the discrepancy between print and online articles. The system reduces the input required of editors and supports additional visual mediums in an editor that resembles the appearance of the finished article. Oak is based on ProseMirror, a JavaScript rich-text editor toolkit, and retains the revision tracking and commenting functionalities of The New York Times's previous systems. Additionally, Oak supports predefined article headers. In 2019, Oak was updated to support collaborative editing, using Firebase to update each editor's cursor status. Several Google Cloud Functions and Google Cloud Tasks allow articles to be previewed as they will be printed, and the Times's primary MySQL database is regularly updated to keep editors informed of an article's status.

Style and design

Since 1895, The New York Times has maintained a manual of style in several forms. The New York Times Manual of Style and Usage was published on the Times's intranet in 1999. The New York Times uses honorifics when referring to individuals. With the AP Stylebook's removal of honorifics in 2000 and The Wall Street Journal's omission of courtesy titles in May 2023, the Times is the only national newspaper that continues to use honorifics. According to former copy editor Merrill Perlman, The New York Times continues to use honorifics as a "sign of civility". The Times's use of courtesy titles led to an apocryphal rumor that the paper had referred to singer Meat Loaf as "Mr. Loaf". Several exceptions have been made; the former sports section and The New York Times Book Review do not use honorifics. A leaked memo following the killing of Osama bin Laden in May 2011 revealed that editors were given a last-minute instruction to omit the honorific from Osama bin Laden's name, consistent with the paper's practice for deceased figures of historic significance, such as Adolf Hitler, Napoleon, and Vladimir Lenin. The New York Times uses academic and military titles for individuals prominently serving in that position. In 1986, the Times began to use Ms., and introduced the gender-neutral title Mx. in 2015. The New York Times uses initials when a subject has expressed a preference, such as Donald Trump.

The New York Times maintains a strict but not absolute obscenity policy, covering words and phrases. In a review of the Canadian hardcore punk band Fucked Up, music critic Kelefa Sanneh wrote that the band's name—entirely rendered in asterisks—would not be printed in the Times "unless an American president, or someone similar, says it by mistake"; The New York Times did not repeat then-vice president Dick Cheney's use of "fuck" against then-senator Patrick Leahy in 2004 or then-vice president Joe Biden's remark that the passage of the Affordable Care Act in 2010 was a "big fucking deal". The Times's profanity policy has been tested by former president Donald Trump.
The New York Times published Trump's Access Hollywood tape in October 2016, containing the words "fuck", "pussy", "bitch", and "tits", the first time the publication had printed an expletive on its front page, and repeated an explicit phrase for fellatio stated by then-White House communications director Anthony Scaramucci in July 2017. The New York Times omitted Trump's use of the phrase "shithole countries" from its headline in favor of "vulgar language" in January 2018. The Times banned certain words, such as "bitch", "whore", and "sluts", from Wordle in 2022.

Journalists for The New York Times do not write their own headlines; headlines are written by copy editors who specialize in them. The Times's guidelines insist headline editors get to the main point of an article but avoid giving away endings, if present. Other guidelines include using slang "sparingly", avoiding tabloid headlines, not ending a line on a preposition, article, or adjective, and, chiefly, not punning. The New York Times Manual of Style and Usage states that wordplay, such as "Rubber Industry Bounces Back", is to be tested on a colleague as a canary is tested in a coal mine; "when no song bursts forth, start rewriting". The New York Times has amended headlines due to controversy. In 2019, following two back-to-back mass shootings in El Paso and Dayton, the Times used the headline "Trump Urges Unity vs. Racism" to describe then-president Donald Trump's words after the shootings. After criticism from FiveThirtyEight founder Nate Silver, the headline was changed to "Assailing Hate But Not Guns".

Online, The New York Times's headlines do not face the same length restrictions as headlines that appear in print; print headlines must fit within a column, often six words. Additionally, headlines must "break" properly, containing a complete thought on each line without splitting up prepositions and adverbs. Writers may edit a headline to fit an article more aptly if further developments occur. The Times uses A/B testing for articles on the front page, placing two headlines against each other; at the end of the test, the headline that receives more traffic is chosen. The alteration of a headline regarding intercepted Russian data used in the Mueller special counsel investigation was noted by Trump in a March 2017 interview with Time, in which he claimed that the headline used the word "wiretapped" in the print version of the paper on January 20, while the digital article on January 19 omitted the word. The headline had been intentionally changed in the print version to use "wiretapped" in order to fit within the print guidelines.

The nameplate of The New York Times has been unaltered since 1967. In creating the initial nameplate, Henry Jarvis Raymond took as his model the British newspaper The Times, which used a blackletter style called Textura, popularized following the fall of the Western Roman Empire in regional variations of Alcuin's script, and which ended with a period. With the change to The New-York Times on September 14, 1857, the nameplate followed suit. Under George Jones, the terminals of the "N", "r", and "s" were intentionally exaggerated into swashes. The nameplate in the January 15, 1894, issue trimmed the terminals once more, smoothed the edges, and turned the stem supporting the "T" into an ornament. The hyphen was dropped on December 1, 1896, after Adolph Ochs purchased the paper. The descender of the "h" was shortened on December 30, 1914.
The largest change to the nameplate was introduced on February 21, 1967, when type designer Ed Benguiat redesigned the logo, most prominently turning the arrow ornament into a diamond. Notoriously, the new logo dropped the period that had followed the word Times up until that point; one reader compared the omission of the period to "performing plastic surgery on Helen of Troy." Picture editor John Radosta worked with a New York University professor to determine that dropping the period saved the paper US$41.28 (equivalent to $398.59 in 2025).

Print edition

As of December 2023, The New York Times has printed sixty thousand issues, a count represented in the paper's masthead to the right of the volume number, which gives the Times's years in publication in Roman numerals. The volume and issue numbers are separated by four dots representing the edition number of that issue; on the day of the 2000 presidential election, the Times was revised four separate times, necessitating the use of an em dash in place of an ellipsis. The em dash issue was printed hundreds of times over before being replaced by the one-dot issue. Despite efforts by newsroom employees to recycle copies sent to The New York Times's office, several copies were kept, including one put on display at the Museum at The Times. From February 7, 1898, to December 31, 1999, the Times's issue number was off by five hundred issues, an error suspected by The Atlantic to be the result of a careless front-page type editor. The misnumbering was noticed by news editor Aaron Donovan, who was calculating the number of issues in a spreadsheet and noticed the discrepancy. The New York Times celebrated fifty thousand issues on March 14, 1995, an observance that should have occurred on July 26, 1996.

The New York Times has reduced the physical size of its print edition while retaining its broadsheet format. The New-York Daily Times debuted at 18 inches (460 mm) across. By the 1950s, the Times was being printed at 16 inches (410 mm) across. In 1953, an increase in paper costs of US$10 (equivalent to $120.34 in 2025) a ton raised newsprint costs to US$21.7 million (equivalent to $326,110,074.63 in 2025). On December 28, 1953, the pages were reduced to 15.5 inches (390 mm). On February 14, 1955, a further reduction to 15 inches (380 mm) occurred, followed by reductions to 14.5 and 13.5 inches (370 and 340 mm). On August 6, 2007, the largest cut occurred when the pages were reduced to 12 inches (300 mm),[k] a decision that other broadsheets had previously considered. Then-executive editor Bill Keller stated that a narrower paper would be more beneficial to the reader but acknowledged a net loss in article space of five percent. In 1985, The New York Times Company established a minority stake in a US$21.7 million newsprint plant in Clermont, Quebec, through Donahue Malbaie. The company sold its equity interest in Donahue Malbaie in 2017.

The New York Times often uses large, bolded headlines for major events. For the print version of the Times, these headlines are written by one copy editor, reviewed by two other copy editors, approved by the masthead editors, and polished by other print editors. The process is completed before 8 p.m., but it may be repeated if further developments occur, as took place during the 2020 presidential election. On the day Joe Biden was declared the winner, The New York Times used a "hammer headline" reading "Biden Beats Trump", set in bold capitals.
A dozen journalists discussed several potential headlines, such as "It's Biden" or "Biden's Moment", and prepared for a Donald Trump victory, in which case they would have used "Trump Prevails". During Trump's first impeachment, the Times drafted the hammer headline "Trump Impeached". The New York Times tightened the kerning between the E and the A, as not doing so would leave a noticeable gap due to the stem of the A sloping away from the E. The Times reused the tight kerning for "Biden Beats Trump" and for Trump's second impeachment, which simply read "Impeached".

In cases where two major events occur on the same day or immediately after each other, The New York Times has used a "paddle wheel" headline, where both headlines are used but split by a line. The term dates back to August 8, 1959, when it was revealed that the United States was monitoring Soviet missile firings and when Explorer 6—shaped like a paddle wheel—launched. Since then, the paddle wheel has been used several times, including on January 21, 1981, when Ronald Reagan was sworn in minutes before Iran released fifty-two American hostages, ending the Iran hostage crisis. At the time, most newspapers favored the end of the hostage crisis, but the Times placed the inauguration above the crisis. Other occasions on which the paddle wheel has been used include July 26, 2000, when the 2000 Camp David Summit ended without an agreement and Bush announced that Dick Cheney would be his running mate, and June 24, 2016, when the United Kingdom's European Union membership referendum passed, beginning Brexit, and the Supreme Court deadlocked in United States v. Texas.

The New York Times has run editorials from its editorial board on the front page twice. On June 13, 1920, the Times ran an editorial opposing Warren G. Harding, who was nominated during that year's Republican Party presidential primaries. Amid growing acceptance of front-page editorials at publications such as the Detroit Free Press, The Patriot-News, The Arizona Republic, and The Indianapolis Star, The New York Times ran an editorial on its front page on December 5, 2015, following a terrorist attack in San Bernardino, California, in which fourteen people were killed. The editorial advocated the prohibition of "slightly modified combat rifles" of the kind used in the San Bernardino shooting and of "certain kinds of ammunition". Conservative figures, including Texas senator Ted Cruz, The Weekly Standard editor Bill Kristol, Fox & Friends co-anchor Steve Doocy, and then-New Jersey governor Chris Christie, criticized the Times. Talk radio host Erick Erickson fired several rounds into a copy of The New York Times and posted a picture online.

Since 1997, The New York Times's primary distribution center has been located in College Point, Queens. The facility is 300,000 ft2 (28,000 m2) and employs 170 people as of 2017. The College Point distribution center prints 300,000 to 800,000 newspapers daily. On most occasions, presses start before 11 p.m. and finish before 3 a.m. A robotic crane grabs a roll of newsprint, and several rollers ensure that ink can be printed onto the paper. The final newspapers are wrapped in plastic and shipped out. As of 2018, the College Point facility accounted for 41 percent of production. Other copies are printed at the plants of 26 other publications, such as The Atlanta Journal-Constitution, The Dallas Morning News, The Santa Fe New Mexican, and the Courier Journal.
With the decline of newspapers, particularly regional publications, the Times's copies must travel farther; for example, newspapers for Hawaii are flown from San Francisco on United Airlines, and Sunday papers are flown from Los Angeles on Hawaiian Airlines. Computer glitches, mechanical issues, and weather phenomena affect circulation but do not stop the paper from reaching customers. The College Point facility prints over two dozen other papers, including The Wall Street Journal and USA Today.

The New York Times has halted its printing process several times to account for major developments. The first printing stoppage occurred on March 31, 1968, when then-president Lyndon B. Johnson announced that he would not seek a second term. Other press stoppages include May 19, 1994, for the death of former first lady Jacqueline Kennedy Onassis, and July 17, 1996, for Trans World Airlines Flight 800. The 2000 presidential election necessitated two press stoppages. Al Gore appeared to concede on November 8, forcing then-executive editor Joseph Lelyveld to stop the Times's presses to print a new headline, "Bush Appears to Defeat Gore", with a story that stated George W. Bush was elected president. However, Gore held off his concession speech amid doubts about the result in Florida, and Lelyveld reran the presses with the headline "Bush and Gore Vie for an Edge". Since 2000, three printing stoppages have occurred: for the death of William Rehnquist on September 3, 2005, for the killing of Osama bin Laden on May 1, 2011, and for the passage of the Marriage Equality Act in the New York State Assembly and its subsequent signing by then-governor Andrew Cuomo on June 24, 2011.

Online platforms

The New York Times website is hosted at nytimes.com. It has undergone several major redesigns and infrastructure developments since its debut. In April 2006, The New York Times redesigned its website with an emphasis on multimedia. In preparation for Super Tuesday in February 2008, the Times developed a live election system using the Associated Press's File Transfer Protocol (FTP) service and a Ruby on Rails application; nytimes.com experienced its largest traffic to date on Super Tuesday and the day after.

The NYTimes application debuted with the introduction of the App Store on July 10, 2008. Engadget's Scott McNulty wrote critically of the app, negatively comparing it to The New York Times's mobile website. An iPad version with select articles was released on April 3, 2010, with the release of the first-generation iPad. In October, The New York Times expanded NYT Editors' Choice to include the paper's full articles. NYT for iPad was free until 2011. The Times applications on iPhone and iPad began offering in-app subscriptions in July 2011. The Times released a web application for iPad—featuring a format summarizing trending headlines on Twitter—and a Windows 8 application in October 2012. Efforts to ensure profitability through an online magazine and a "Need to Know" subscription were reported by Adweek in July 2013. In March 2014, The New York Times announced three applications—NYT Now, an application that offers pertinent news in a blog format, and two unnamed applications, later known as NYT Opinion and NYT Cooking—to diversify its product verticals.

The Daily has been described as the modern front page of The New York Times. The New York Times manages several podcasts, including multiple podcasts with Serial Productions. The Times's longest-running podcast is The Book Review Podcast, which debuted as Inside The New York Times Book Review in April 2006.
The New York Times's defining podcast is The Daily, a daily news podcast hosted by Michael Barbaro that debuted on February 1, 2017, and has been called the paper's modern front page. Between March 2022 and March 2025, Barbaro co-hosted the approximately 30-minute program with Sabrina Tavernise. Beginning in April 2025, Barbaro was joined by two new regular co-hosts, Natalie Kitroeff and Rachel Abrams. The Interview was launched in 2024 and is hosted weekly by David Marchese and Lulu Garcia-Navarro. Episodes typically last 40 to 50 minutes. Condensed versions of the interviews are published simultaneously in The New York Times Magazine. Guests have included politicians, actors, influential experts, media figures, and high-profile writers. In October 2021, The New York Times began testing "New York Times Audio", an application featuring podcasts from the Times, audio versions of articles (including from other publications through Audm), and archives from This American Life. The application debuted in May 2023 exclusively on iOS for Times subscribers. New York Times Audio includes exclusive podcasts such as The Headlines, a daily news recap, and Shorts, short audio stories under ten minutes. In addition, a "Reporter Reads" section features Times journalists reading their articles and providing commentary. The New York Times has used video games as part of its journalistic efforts, among the first publications to do so, contributing to an increase in Internet traffic; the publication has also developed its own video games. In 2014, The New York Times Magazine introduced Spelling Bee, a word game in which players guess words from a set of letters in a honeycomb, earning points based on word length and extra points if the word is a pangram (a scoring sketch follows below). The game was proposed by Will Shortz, created by Frank Longo, and has been maintained by Sam Ezersky. In May 2018, Spelling Bee was published on nytimes.com, furthering its popularity. In February 2019, the Times introduced Letter Boxed, in which players form words from letters placed on the edges of a square box, followed in June 2019 by Tiles, a matching game in which players form sequences of tile pairings, and Vertex, in which players connect vertices to assemble an image. In April 2023, the Times introduced Digits, a game that required combining values with arithmetic operations to reach a set number (see the solver sketch below); Digits was shut down that August. In July 2023, The New York Times introduced Connections, in which players identify groups of words that are connected by a common property. In March 2024, The New York Times released Strands, a themed word search. In January 2022, The New York Times Company acquired Wordle, a word game developed by Josh Wardle in 2021, at a valuation in the "low-seven figures". The acquisition was suggested by David Perpich, a member of the Sulzberger family, who pitched the purchase to games head Jonathan Knight over Slack after reading about the game. The Washington Post purportedly considered acquiring Wordle, according to Vanity Fair. At the 2022 Game Developers Conference, Wardle stated that he was overwhelmed by the volume of Wordle facsimiles and by overzealous monetization practices in other games. Concerns mounted over The New York Times monetizing Wordle by implementing a paywall; Wordle is a client-side browser game and can be played offline by downloading its webpage. Wordle moved to the Times's servers and website in February 2022. The game was added to the NYT Games application that August, necessitating that it be rewritten in the JavaScript library React.
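The Spelling Bee scoring just described can be made concrete with a short sketch. The following Python function is illustrative only: the four-letter minimum, the mandatory center letter, and the exact point values (one point for a four-letter word, one point per letter for longer words, a seven-point pangram bonus) are assumptions drawn from the published game's conventions rather than details stated in this article.

def score_spelling_bee(word: str, letters: set[str], center: str) -> int:
    # Assumed rules: at least 4 letters, must contain the center
    # letter, and may use only the 7 honeycomb letters.
    word = word.lower()
    if len(word) < 4 or center not in word or not set(word) <= letters:
        return 0
    points = 1 if len(word) == 4 else len(word)  # length-based score
    if set(word) == letters:
        points += 7  # pangram bonus: the word uses all 7 letters
    return points

letters = set("planting")  # 7 distinct honeycomb letters; center is "n"
print(score_spelling_bee("plant", letters, "n"))     # 5
print(score_spelling_bee("planting", letters, "n"))  # 15 (pangram)

Under these assumed rules, "plant" scores its five letters, while "planting" earns eight points plus the seven-point pangram bonus.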
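Digits, as described above, reduces to a small search problem: repeatedly combine two of the remaining numbers with an arithmetic operation until the target appears. The brute-force solver below is a hypothetical sketch of that idea, not the Times's implementation; the starting numbers and target in the example are invented, and it simplifies the game by allowing absolute differences and only exact divisions.

from itertools import combinations

def solve_digits(values, target):
    # Return a list of steps reaching the target, or None.
    if target in values:
        return []
    for i, j in combinations(range(len(values)), 2):
        a, b = values[i], values[j]
        rest = tuple(v for k, v in enumerate(values) if k not in (i, j))
        candidates = [(a + b, f"{a}+{b}"), (a * b, f"{a}*{b}"),
                      (abs(a - b), f"|{a}-{b}|")]
        if b and a % b == 0:
            candidates.append((a // b, f"{a}/{b}"))
        if a and b % a == 0:
            candidates.append((b // a, f"{b}/{a}"))
        for value, step in candidates:
            steps = solve_digits(rest + (value,), target)
            if steps is not None:
                return [step] + steps
    return None

print(solve_digits((3, 5, 10, 25, 4), 75))  # prints one valid sequence of steps

Because each combination step shrinks the pool by one number, the recursion always terminates, and for five starting values the search space is small enough to explore exhaustively in well under a second.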
In November 2022, The New York Times announced that Tracy Bennett would be Wordle's editor. Other publications The New York Times Magazine and The Boston Globe Magazine are the only remaining weekly Sunday newspaper magazines following The Washington Post Magazine's cancellation in December 2022. In February 2016, The New York Times introduced a Spanish-language website, The New York Times en Español. The website, intended to be read on mobile devices, would contain translated articles from the Times and reporting from journalists based in Mexico City. The Times en Español's style editor is Paulina Chavira, who has advocated for a pluralistic Spanish that accommodates the variety of nationalities among the newsroom's journalists and who wrote a stylebook for The New York Times en Español. Articles the Times intends to publish in Spanish are sent to a translation agency and adapted to Spanish writing conventions; the present progressive tense may be used for forthcoming events in English, but other tenses are preferable in Spanish. The Times en Español consults the Real Academia Española and Fundéu, frequently modifies the use of diacritics—such as using an acute accent for the Cártel de Sinaloa but not the Cartel de Medellín—and uses the gender-neutral pronoun elle. Headlines in The New York Times en Español are not capitalized in title case. The Times en Español publishes El Times, a newsletter led by Elda Cantú and intended for all Spanish speakers. In September 2019, The New York Times ended The New York Times en Español's separate operations. A study published in The Translator in 2023 found that the Times en Español engaged in tabloidization. In June 2012, The New York Times introduced a Chinese-language website, 纽约时报中文, in response to Chinese editions created by The Wall Street Journal and the Financial Times. Conscious of censorship, the Times established servers outside of China and affirmed that the website would uphold the paper's journalistic standards; the government of China had previously blocked articles from nytimes.com through the Great Firewall, and the website had been blocked in China until August 2001, when then-general secretary Jiang Zemin met with journalists from The New York Times. Then-foreign editor Joseph Kahn assisted in the establishment of cn.nytimes.com, an effort that contributed to his appointment as executive editor in April 2022. In October 2012, 纽约时报中文 published an article detailing the wealth of then-premier Wen Jiabao's family. In response, the government of China blocked access to nytimes.com and cn.nytimes.com, and references to the Times and Wen were censored on the microblogging service Sina Weibo. In March 2015, a mirror of 纽约时报中文 and the website of GreatFire were the targets of a government-sanctioned distributed denial-of-service attack on GitHub, disabling access to the service for several days. Chinese authorities requested the removal of The New York Times's news applications from the App Store in December 2016. Awards and recognition As of 2023, The New York Times has received 137 Pulitzer Prizes, the most of any publication. The New York Times is considered a newspaper of record in the United States.[l] The Times is the largest metropolitan newspaper in the United States; as of 2022, The New York Times is the second-largest newspaper by print circulation in the United States, behind The Wall Street Journal.
A study published in Science, Technology, & Human Values in 2013 found that The New York Times received more citations in academic journals than the American Sociological Review, Research Policy, or the Harvard Law Review. With sixteen million unique records, the Times is the third-most-referenced source in Common Crawl, a collection of online material used in datasets such as GPT-3, behind Wikipedia and a United States patent database. The New Yorker's Max Norman wrote in March 2023 that the Times has shaped mainstream English usage. In a January 2018 article for The Washington Post, Margaret Sullivan stated that The New York Times affects the "whole media and political ecosystem". The New York Times's recent success has led to concerns over media consolidation, particularly amid the decline of newspapers. In 2006, economists Lisa George and Joel Waldfogel examined the effect of the Times's national distribution strategy on the circulation of local newspapers, finding that local circulation decreased among college-educated readers. The effect of The New York Times in this manner was observed at The Forum of Fargo-Moorhead, the newspaper of record for Fargo, North Dakota. Axios founder Jim VandeHei opined that the Times is "going to basically be a monopoly" in an opinion piece written by then-media columnist and former BuzzFeed News editor-in-chief Ben Smith; in the article, Smith cites the strength of The New York Times's journalistic workforce, its broadening content, and its poaching of Gawker editor-in-chief Choire Sicha, Recode editor-in-chief Kara Swisher, and Quartz editor-in-chief Kevin Delaney. Smith compared the Times to the New York Yankees during their 1927 Murderers' Row season. Controversies Since 2003, studies analyzing coverage of the Israeli–Palestinian conflict in The New York Times have found a bias against Palestinians and in favor of Israel.[m] The New York Times has received criticism for its coverage of the Gaza war and genocide. In April 2024, The Intercept reported that a November 2023 internal memorandum by Susan Wessling and Philip Pan instructed journalists to limit use of the terms "genocide" and "ethnic cleansing" and to avoid the phrase "occupied territory" in the context of Palestinian land, the word "Palestine" except in rare circumstances, and the term "refugee camps" for areas of Gaza despite their recognition as such by the United Nations. A spokesperson for the Times stated that issuing such guidance was standard practice. An analysis by The Intercept noted that The New York Times had described Israeli deaths as a massacre nearly sixty times but had described Palestinian deaths as a massacre only once. Writers and editors have left the newspaper over its coverage of events in Gaza, including Jazmine Hughes and Jamie Lauren Keiles. In December 2023, The New York Times published an investigation titled "'Screams Without Words': How Hamas Weaponized Sexual Violence on Oct. 7", alleging that Hamas weaponized sexual and gender-based violence during its armed incursion into Israel.
The investigation was the subject of an article from The Intercept that questioned the journalistic acumen of Anat Schwartz, a filmmaker involved in the inquiry who had no prior reporting experience and who had endorsed a social media post stating that Israel should "violate any norm, on the way to victory"; the article also doubted the veracity of the report's opening claim that Gal Abdush was raped, citing a timespan disputed by her family, and alleged that the Times had been pressured by the Committee for Accuracy in Middle East Reporting in America. The New York Times initiated an inquiry into the leaking of confidential information about the report to other outlets, which drew criticism from NewsGuild of New York president Susan DeCarava for purported racial targeting; the Times's investigation was inconclusive but found gaps in the way proprietary journalistic material is handled. The New York Times Building has been a site of protest action during the Gaza war and genocide, including a November 2023 sit-in demanding that the Times's editorial board publicly call for a ceasefire and accusing the media company of "complicity in laundering genocide"; a February 29, 2024, protest and press conference following the release of The Intercept's critical investigation into the "Screams Without Words" exposé; and an action on July 30, 2025, in which protesters spray-painted "NYT Lies, Gaza dies" on the building's glass facade. In addition, protesters blocked The New York Times's distribution center on March 14, 2024, and executive editor Joseph Kahn's residence was splattered with red paint on August 25, 2025. The collective Writers Against the War on Gaza, which publishes the mock publication The New York War Crimes, has been associated with protests against The New York Times. On October 27, 2025, 300 writers—including scholars, journalists, and public intellectuals—pledged to boycott The New York Times and withhold contributions to the paper in protest of what they described as its complicity in the Gaza genocide, demanding (1) a review of anti-Palestinian bias in the newsroom, (2) a retraction of "Screams Without Words", and (3) a call from the editorial board for a US arms embargo on Israel. Among the initial signatories, about 150 had previously contributed to the Times. The New York Times has received criticism regarding its coverage of transgender people. When it published an August 2015 opinion piece by Weill Cornell Medicine professor Richard A. Friedman called "How Changeable Is Gender?", Vox's German Lopez criticized Friedman for suggesting that parents and doctors might be right to let children suffer from severe dysphoria in case something changes down the line, and for implying that conversion therapy may work for transgender children. In February 2023, nearly one thousand current and former Times writers and contributors wrote an open letter addressed to standards editor Philip B. Corbett, criticizing the paper's coverage of transgender, non-binary, and gender-nonconforming people; some of the Times's articles have been cited in state legislatures attempting to justify criminalizing gender-affirming care.
Contributors wrote in the open letter that "the Times has in recent years treated gender diversity with an eerily familiar mix of pseudoscience and euphemistic, charged language, while publishing reporting on trans children that omits relevant information about its sources."[n] According to former Times journalist Billie Jean Sweeney, a push for writers to challenge "every aspect of being trans", ranging from gender-inclusive language to access to medical care, came from the top in 2022 after leadership was handed to A. G. Sulzberger, Joe Kahn, and Carolyn Ryan, as part of an effort to win goodwill with the Trump campaign without incurring backlash from the general public. The Times has continually denied any bias in its reporting, insisting that its coverage of "fiercely contested medical and legal debates" is fair and balanced, and that it would not tolerate journalists protesting its transgender coverage.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_ref-Usborne-2018_8-0] | [TOKENS: 10515] |
Contents Elon Musk Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026, Forbes estimates his net worth to be around US$852 billion. Born into a wealthy family in Pretoria, South Africa, Musk emigrated to Canada in 1989; he has held Canadian citizenship from birth, as his mother was born there. He received bachelor's degrees from the University of Pennsylvania in 1997 before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002. Musk also became an American citizen in 2002. In 2002, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research, but later left; growing discontent with the organization's direction and with its leadership of the AI boom in the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, a Tesla pay package worth $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Musk was the largest donor in the 2024 U.S. presidential election, where he supported Donald Trump. After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for spreading COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published between 2025 and 2026 and became a topic of worldwide debate. Early life Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa. Musk therefore holds both South African and Canadian citizenship from birth.
His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer who partly owned a rental lodge at Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator, and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa. Although both Elon and Errol previously stated that Errol was a part owner of a Zambian emerald mine, in 2023 Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party and has said that his children shared his dislike of apartheid. After his parents divorced in 1979, Elon, aged around 9, chose to live with his father because Errol had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies", where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten so severely that he was hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?" Elon was an enthusiastic reader of books and has attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself how to program from the VIC-20 user manual. At age twelve, Elon sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, from which he graduated. Musk was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification. Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, as well as to ease his path to immigration to the United States. While waiting for his application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill. In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997: a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School.
He reportedly hosted large, ticketed house parties to help pay for tuition and wrote a business plan for an electronic book-scanning service similar to Google Books. In 1994, Musk held two internships in Silicon Valley: one at energy storage startup Pinnacle Research Institute, which investigated electrolytic supercapacitors for energy storage, and another at Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided instead to join the Internet boom of the 1990s, applying for a job at Netscape, to which he reportedly never received a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa transitioned to an H-1B. According to numerous former business associates and shareholders, however, Musk said at the time that he was on a student visa. Business career In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors, housing the venture in a small rented office in Palo Alto. Speaking to Rolling Stone, Musk denounced the notion that they had started the company with funds borrowed from Errol Musk, though in a tweet he acknowledged that his father contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590,000,000 in 2025), and Musk received $22 million (equivalent to $43,000,000 in 2025) for his 7-percent share. In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and over 200,000 customers joined the service in its initial months of operation. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign. Due to the resulting technological issues and the lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000.[b] Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2,700,000,000 in 2025) in stock, of which Musk—the largest shareholder with 11.72% of shares—received $175.8 million (equivalent to $320,000,000 in 2025).
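The parenthetical "equivalent to" figures used throughout this article are consumer-price-index adjustments of the kind produced by Wikipedia's inflation templates. The Python lines below are a rough, hand-computed illustration only: the CPI-U index values are approximate, and the choice of index series and reference year are assumptions rather than something the article states.

# Approximate CPI-based inflation adjustment. The index values are
# rounded annual averages entered by hand, not an authoritative series.
CPI_U = {1999: 166.6, 2002: 179.9, 2025: 320.0}

def adjust(amount: float, from_year: int, to_year: int = 2025) -> float:
    # Scale a dollar amount by the ratio of the two index values.
    return amount * CPI_U[to_year] / CPI_U[from_year]

print(round(adjust(307e6, 1999) / 1e6))     # ~590 (million), cf. the Zip2 sale
print(round(adjust(1.5e9, 2002) / 1e9, 1))  # ~2.7 (billion), cf. the PayPal sale

The two printed values line up with the "(equivalent to $590,000,000 in 2025)" and "(equivalent to $2,700,000,000 in 2025)" figures quoted above, which is the sanity check this sketch is meant to provide.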
In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value. In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from the Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune (equivalent to $180,000,000 in 2025), Musk founded SpaceX in May 2002 and became the company's CEO and chief engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, the company was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2,400,000,000 in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, SpaceX successfully landed the first stage of a Falcon 9 on a land platform in 2015. Later landings were achieved on autonomous spaceport drone ships, ocean-based recovery platforms. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million contract (equivalent to $865,000,000 in 2025) to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan. In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large constellation was deployed in May 2019. As of May 2025, over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 at $10 billion (equivalent to $12,000,000,000 in 2025).[c] During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440,000,000 in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response. Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement.
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11,000,000 in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm. Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass-produced all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several well-selling electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In November 2018, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over his tweet that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in the summer of 2020 and entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1,200,000,000,000 in 2025), the sixth company in U.S. history to do so. Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second-largest provider of solar power systems in the United States. In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2,700,000,000 in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla's directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor. In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence (AI) by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions such as spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials.
Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trials—which have caused the deaths of some monkeys—have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink. In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018. It used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021. Local officials have approved further expansions of the tunnel system. In early 2017, Musk expressed interest in buying Twitter and had questioned the platform's commitment to freedom of speech. By 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder.[d] Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter; by the end of April 2022, he had successfully concluded his bid for approximately $44 billion, including approximately $12.5 billion in loans and $21 billion in equity financing. After attempting to back out of the deal, Musk completed the purchase of the company on October 27, 2022. Immediately after the acquisition, Musk fired several top Twitter executives, including CEO Parag Agrawal, and became CEO himself. Under Musk, Twitter instituted monthly subscriptions for a "blue check" and laid off a significant portion of the company's staff. Musk lessened content moderation, and hate speech increased on the platform after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of the Hunter Biden laptop controversy in the lead-up to the 2020 presidential election. Musk promised to step down as CEO after a Twitter poll favored his departure, and five months later he did so, transitioning to the roles of executive chairman and chief technology officer (CTO). Despite Musk stepping down as CEO, X continues to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence critics, such as Twitch streamer Asmongold, who criticized him during one of his streams, by removing their accounts' blue checkmarks—which hinders visibility and is considered a form of shadow banning—or by suspending their accounts without justification.
Other activities In August 2013, Musk announced plans for a version of a vactrain and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances. In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company and initially gave $50 million. In 2018, Musk left the OpenAI board. Since 2018, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings such as OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI. Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets, and the consequent fossil fuel usage, have received criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter and temporarily banned the accounts of journalists who posted stories about the ban, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept. In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content but framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian, and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies. Politics Musk is an outlier among business leaders, most of whom avoid partisan political advocacy. Musk was a registered independent voter when he lived in California. Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 Texas's 34th congressional district special election. In 2024, he started supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income; he also endorsed Kanye West's 2020 presidential campaign.
In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign and hosting DeSantis's campaign announcement in a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips. In October 2025, former vice president Kamala Harris commented that it was a mistake on the Democratic side not to invite Musk to a White House electric vehicle event organized in August 2021 and featuring executives from General Motors, Ford, and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space". Fortune remarked that the exclusion was a nod to the United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation, and suggested that the non-invitation affected Musk's perspective. Fortune noted that, at the time, Musk said, "Yeah, seems odd that Tesla wasn't invited." A month later, he criticized the Biden administration as "not the friendliest". Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen, and other capitalists actually flourished under Biden, but that the tech leaders chose Trump for their common ground on cultural issues. By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying, "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a campaign rally and promoted conspiracy theories and falsehoods about Democrats, election fraud, and immigration in support of Trump. Musk was the largest individual donor of the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers. In 2023, Musk said he shunned the World Economic Forum because it was boring; the organization commented that it had not invited him since 2015. He has, however, participated in Dialog, a gathering dubbed "Tech Bilderberg" and organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain, and the United Kingdom, particularly due to his position in the U.S. government as well as his ownership of X. An NBC News analysis found he had boosted far-right political movements seeking to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023.
During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or fascist Roman salute.[e] He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided along partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries. The concept of DOGE emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, with Musk accepting the offer. In November and December 2024, Musk suggested that the organization could help to cut the U.S. federal budget, consolidate the number of federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role was not clear. In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE. A federal judge has ruled that Musk acted as the de facto leader of DOGE. Musk's role in the second Trump administration, particularly his work through DOGE, attracted public backlash. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He prioritized secrecy within the organization and accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by this time, most of them of children. By November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when his special-government-employee status reached its 130-day limit, with a White House official confirming that Musk's offboarding from the Trump administration was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025.
After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit. A feud began between Musk and Trump, its most notable event being Musk posting on X (formerly Twitter) on June 5, 2025, that "@realDonaldTrump is in the Epstein files. That is the real reason they have not been made public", alleging Trump had ties to sex offender Jeffrey Epstein. Trump responded on Truth Social, stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away, and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk and added that Musk would face "very serious consequences" if he funded Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have received criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates up until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth taxes, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration, and he regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and he identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially the colonization of Mars, repeatedly pushing for humanity to colonize Mars in order to become an interplanetary species and lower the risk of human extinction. Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. While describing himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdown restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and has been described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024.
Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was subsequently criticized for spreading misinformation and amplifying the far-right. He has also voiced his support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S. Securities and Exchange Commission (SEC) for a tweet stating that funding had been secured for potentially taking Tesla private.[f] The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were fined $20 million each, and Musk was forced to step down as Tesla chairman for three years but was able to remain CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the previous agreement's details, including a list of topics on which Musk needed preclearance before tweeting. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla's stock price ("too high imo") violated the agreement. Records released under the Freedom of Information Act (FOIA) showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting regarding "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded. McCormick called the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger's syndrome at the TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ...
but then that turned out to be wrong — [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated that he uses doctor-prescribed ketamine for occasional depression and that he doses "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine, and other drugs. Musk at first refused to comment on his alleged drug use, before responding that he had not tested positive for drugs and that if drugs somehow improved his productivity, "I would definitely take them!" The New York Times's investigations revealed Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concerns from close associates who have become troubled by his public behavior as he became more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label, Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he has said have a "restoring effect" that helps his "mental calibration". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings. Musk has justified the boosting by claiming that all top accounts do it, so he has to as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its creator Ubisoft for "woke" content. Musk posted to X that "DEI kills art" and singled out the inclusion of the historical figure Yasuke in the Assassin's Creed game as offensive; he also called the game "terrible". Ubisoft responded by saying that Musk's comments were "just feeding hatred" and that it was focused on producing a game, not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000. In 2002, their first child, Nevada Musk, died of sudden infant death syndrome at the age of 10 weeks. After Nevada's death, the couple used in vitro fertilization (IVF) to continue their family; they had twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and have shared custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year.
After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters that are not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the pregnancy, Musk confirmed reports that the couple were "semi-separated" in September 2021; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Musk has taken X Æ A-Xii to multiple official events in Washington, D.C., during Trump's second term in office. In July 2022, The Wall Street Journal reported that Musk allegedly had an affair in 2021 with Nicole Shanahan, the wife of Google co-founder Sergey Brin, leading to the couple's divorce the following year. Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported that Musk had bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born in 2024 via surrogacy, and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, which media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St. Clair had filed for sole custody of her five-month-old son and for Musk to be recognized as the child's father. On March 31, 2025, Musk wrote that, while he was unsure whether he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] Later reporting from The Wall Street Journal indicated that $1 million of these payments to St. Clair was structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island. In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place after Epstein later cancelled the plans.[k] On Christmas Day 2012, Musk emailed Epstein asking, "Do you have any parties planned?
I’ve been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I’m looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable, to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein sent an email asking Musk if he had any plans to come to New York for the opening of the United Nations General Assembly, where many "interesting people" would be coming to his house, to which Musk responded that "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein responded by stating "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 if he introduced Epstein to Mark Zuckerberg, Musk responded: "I don’t recall introducing Epstein to anyone, as I don’t know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred. Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l] Wealth Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. Musk was first listed on the Forbes Billionaires List in 2012; in November 2020, around 75% of his wealth derived from Tesla stock, although he described himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021; $400 billion in December 2024; $500 billion in October 2025; $600 billion in mid-December 2025; $700 billion later that month; and $800 billion in February 2026. In November 2025, Tesla shareholders approved a pay package for Musk potentially worth $1 trillion, which he is to receive over 10 years if he meets specific goals. Public image Although his ventures have been highly influential within their separate industries starting in the 2000s, Musk became a public figure only in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions and often makes controversial statements, unlike other billionaires, who prefer reclusiveness to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance described people's opinions of Musk as polarized due to his "part philosopher, part troll" persona on Twitter. He has drawn condemnation for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to issues such as the British grooming-gangs scandal.
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president" or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021. Musk was selected as Time's "Person of the Year" for 2021. Edward Felsenthal, then Time's editor-in-chief, wrote that "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too." Notes References Works cited Further reading External links |
======================================== |
[SOURCE: https://www.ynet.co.il/vacation] | [TOKENS: 228] |
Green hills and blue lupines: this is the perfect time to hike Nahal Tavor | the complete guide "They think I'm a flight attendant": the women who refuse to sit in the back Bombers and passenger planes: how to track the dramas playing out in the skies "It took a while to understand the source of the calm": we vacationed on an island with no motor vehicles |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Meta_Platforms#cite_ref-fb2017annuall_220-0] | [TOKENS: 8626] |
Contents Meta Platforms Meta Platforms, Inc. (doing business as Meta) is an American multinational technology company headquartered in Menlo Park, California. Meta owns and operates several prominent social media platforms and communication services, including Facebook, Instagram, WhatsApp, Messenger, Threads and Manus. The company also operates an advertising network for its own sites and third parties; as of 2023[update], advertising accounted for 97.8 percent of its total revenue. Meta has been described as part of Big Tech, which refers to the six largest tech companies in the United States: Alphabet (Google), Amazon, Apple, Meta (Facebook), Microsoft, and Nvidia, which are also among the largest companies in the world by market capitalization. The company was originally established in 2004 as TheFacebook, Inc., and was renamed Facebook, Inc. in 2005. In 2021, it rebranded as Meta Platforms, Inc. to reflect a strategic shift toward developing the metaverse—an interconnected digital ecosystem spanning virtual and augmented reality technologies. In 2023, Meta was ranked 31st on the Forbes Global 2000 list of the world's largest public companies. As of 2022, it was the world's third-largest spender on research and development, with R&D expenses totaling US$35.3 billion. History Facebook filed for an initial public offering (IPO) on February 1, 2012. The preliminary prospectus stated that the company sought to raise $5 billion, had 845 million monthly active users, and a website accruing 2.7 billion likes and comments daily. After the IPO, Zuckerberg would retain 22% of the total shares and 57% of the total voting power in Facebook. Underwriters priced the shares at $38 each, valuing the company at $104 billion, the largest valuation yet for a newly public company. On May 16, one day before the IPO, Facebook announced it would sell 25% more shares than originally planned due to high demand. The IPO raised $16 billion, making it the third-largest in US history (slightly ahead of AT&T Mobility and behind only General Motors and Visa). The stock price left the company with a higher market capitalization than all but a few U.S. corporations—surpassing heavyweights such as Amazon, McDonald's, Disney, and Kraft Foods—and made Zuckerberg's stock worth $19 billion. The New York Times stated that the offering overcame questions about Facebook's difficulties in attracting advertisers to transform the company into a "must-own stock". Jimmy Lee of JPMorgan Chase described it as "the next great blue-chip". Writers at TechCrunch, on the other hand, expressed skepticism, stating, "That's a big multiple to live up to, and Facebook will likely need to add bold new revenue streams to justify the mammoth valuation." Trading in the stock, which began on May 18, was delayed that day due to technical problems with the Nasdaq exchange. The stock struggled to stay above the IPO price for most of the day, forcing underwriters to buy back shares to support the price. At the closing bell, shares were valued at $38.23, only $0.23 above the IPO price and down $3.82 from the opening bell value. The opening was widely described by the financial press as a disappointment. The stock set a new record for trading volume of an IPO. On May 25, 2012, the stock ended its first full week of trading at $31.91, a 16.5% decline.
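A quick arithmetic note on the first-week figures above: the quoted 16.5% decline corresponds to measuring from the first-day close ($38.23) rather than from the $38 offer price; that baseline choice is an inference, not stated in the text. A minimal Python check:

# Percent declines in Facebook's first week of trading, using the prices
# quoted above. Which baseline yields the stated 16.5% is inferred here.
ipo_price = 38.00        # offer price
first_day_close = 38.23  # close on May 18, 2012
week_one_close = 31.91   # close on May 25, 2012

def pct_decline(start: float, end: float) -> float:
    """Percentage drop from start to end."""
    return (start - end) / start * 100

print(f"vs. IPO price:       {pct_decline(ipo_price, week_one_close):.1f}%")        # 16.0%
print(f"vs. first-day close: {pct_decline(first_day_close, week_one_close):.1f}%")  # 16.5%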
On May 22, 2012, regulators from Wall Street's Financial Industry Regulatory Authority announced that they had begun to investigate whether banks underwriting Facebook had improperly shared information only with select clients rather than the general public. Massachusetts Secretary of State William F. Galvin subpoenaed Morgan Stanley over the same issue. The allegations sparked "fury" among some investors and led to the immediate filing of several lawsuits, one of them a class action suit claiming more than $2.5 billion in losses due to the IPO. Bloomberg estimated that retail investors may have lost approximately $630 million on Facebook stock since its debut. Facebook was added to the S&P 500 index on December 21, 2013. On May 2, 2014, Zuckerberg announced that the company would be changing its internal motto from "Move fast and break things" to "Move fast with stable infrastructure". The earlier motto had been described as Zuckerberg's "prime directive to his developers and team" in a 2009 interview in Business Insider, in which he also said, "Unless you are breaking stuff, you are not moving fast enough." In November 2016, Facebook announced the Microsoft Windows client of its gaming service Facebook Gameroom, formerly Facebook Games Arcade, at the Unity Technologies developers conference. The client allows Facebook users to play "native" games in addition to its web games. The service was closed in June 2021. Lasso was a short-video sharing app from Facebook similar to TikTok that was launched on iOS and Android in 2018 and was aimed at teenagers. On July 2, 2020, Facebook announced that Lasso would be shutting down on July 10. In 2018, Oculus lead Jason Rubin sent a 50-page vision document titled "The Metaverse" to Facebook's leadership. In the document, Rubin acknowledged that Facebook's virtual reality business had not caught on as expected, despite the hundreds of millions of dollars spent on content for early adopters. He also urged the company to execute fast and invest heavily in the vision, to shut out HTC, Apple, Google and other competitors in the VR space. Regarding other players' participation in the metaverse vision, he called for the company to build the "metaverse" to prevent its competitors from "being in the VR business in a meaningful way at all". In May 2019, Facebook founded Libra Networks, reportedly to develop its own stablecoin cryptocurrency. Later, it was reported that Libra was being supported by financial companies such as Visa, Mastercard, PayPal and Uber. Each company in the consortium was expected to contribute $10 million to fund the launch of the cryptocurrency, named Libra. Depending on when it received approval from the Swiss Financial Market Supervisory Authority to operate as a payments service, the Libra Association had planned to launch a limited-format cryptocurrency in 2021. Libra was renamed Diem, before being shut down and sold in January 2022 after backlash from Swiss government regulators and the public. During the COVID-19 pandemic, the use of online services, including Facebook, grew globally. Zuckerberg predicted this would be a "permanent acceleration" that would continue after the pandemic. Facebook hired aggressively, growing from 48,268 employees in March 2020 to more than 87,000 by September 2022. Following a period of intense scrutiny and damaging whistleblower leaks, news started to emerge on October 21, 2021, about Facebook's plan to rebrand the company and change its name.
In the Q3 2021 earnings call on October 25, Mark Zuckerberg discussed the ongoing criticism of the company's social services and the way it operates, and pointed to the company's pivot toward building the metaverse, without mentioning the rebranding and the name change. The metaverse vision and the name change from Facebook, Inc. to Meta Platforms was introduced at Facebook Connect on October 28, 2021. According to Facebook's PR campaign, the name change reflected the company's shifting long-term focus toward building the metaverse, a digital extension of the physical world spanning social media, virtual reality and augmented reality. "Meta" had been registered as a trademark in the United States in 2018 (after an initial filing in 2015) for marketing, advertising, and computer services, by a Canadian company that provided big data analysis of scientific literature. This company was acquired in 2017 by the Chan Zuckerberg Initiative (CZI), a foundation established by Zuckerberg and his wife, Priscilla Chan, and became one of their projects. Following the rebranding announcement, CZI announced that it had already decided to deprioritize the earlier Meta project and would therefore transfer its rights to the name to Meta Platforms; the previous project would end in 2022. Soon after the rebranding, in early February 2022, Meta reported a greater-than-expected decline in profits in the fourth quarter of 2021. It reported no growth in monthly users and indicated it expected revenue growth to stall. It also expected measures taken by Apple Inc. to protect user privacy to cost it some $10 billion in advertisement revenue, an amount equal to roughly 8% of its revenue for 2021. In a meeting with Meta staff the day after earnings were reported, Zuckerberg blamed competition for user attention, particularly from video-based apps such as TikTok. The 27% drop in the company's share price in reaction to the news eliminated some $230 billion of value from Meta's market capitalization. Bloomberg described the decline as "an epic rout that, in its sheer scale, is unlike anything Wall Street or Silicon Valley has ever seen". Zuckerberg's net worth fell by as much as $31 billion. Zuckerberg owns 13% of Meta, and the holding makes up the bulk of his wealth. According to reports published by Bloomberg on March 30, 2022, Meta turned over data such as phone numbers, physical addresses, and IP addresses to hackers posing as law enforcement officials using forged documents. The law enforcement requests sometimes included forged signatures of real or fictional officials. When asked about the allegations, a Meta representative said, "We review every data request for legal sufficiency and use advanced systems and processes to validate law enforcement requests and detect abuse." In June 2022, Sheryl Sandberg, the chief operating officer of 14 years, announced she would step down that year. Zuckerberg said that Javier Olivan would replace Sandberg, though in a "more traditional" role. In March 2022, Facebook and Instagram (though not Meta-owned WhatsApp) were banned in Russia and added to the Russian list of terrorist and extremist organizations for alleged Russophobia and hate speech (including allegedly genocidal calls) amid the ongoing Russian invasion of Ukraine. Meta appealed against the ban, but it was upheld by a Moscow court in June of the same year. Also in March 2022, Meta and Italian eyewear giant Luxottica released Ray-Ban Stories, a series of smartglasses which could play music and take pictures.
Meta and Luxottica parent company EssilorLuxottica declined to disclose sales figures for the product line as of September 2022, though Meta expressed satisfaction with its customer feedback. In July 2022, Meta saw its first year-on-year revenue decline when its total revenue slipped by 1% to $28.8 billion. Analysts and journalists attributed the decline to its advertising business, which has been limited by Apple's App Tracking Transparency feature and the number of people who have opted not to be tracked by Meta apps. Zuckerberg also attributed the decline to increasing competition from TikTok. On October 27, 2022, Meta's market value dropped to $268 billion, a loss of around $700 billion compared to 2021, and its shares fell by 24%. It lost its spot among the top 20 US companies by market cap, despite reaching the top 5 in the previous year. In November 2022, Meta laid off 11,000 employees, 13% of its workforce. Zuckerberg said the decision to aggressively increase Meta's investments had been a mistake, as he had wrongly predicted that the surge in e-commerce would last beyond the COVID-19 pandemic. He also attributed the decline to increased competition, a global economic downturn and "ads signal loss". Plans to lay off a further 10,000 employees began in April 2023. The layoffs were part of a general downturn in the technology industry, alongside layoffs by companies including Google, Amazon, Tesla, Snap, Twitter and Lyft. Starting in 2022, Meta scrambled to catch up to other tech companies in adopting specialized artificial intelligence hardware and software. It had been using less expensive CPUs instead of GPUs for AI work, but that approach turned out to be less efficient. The company gifted the Inter-university Consortium for Political and Social Research $1.3 million to finance the Social Media Archive, which aims to make its data available for social science research. In 2023, Ireland's Data Protection Commissioner imposed a record EUR 1.2 billion fine on Meta for transferring data from Europe to the United States without adequate protections for EU citizens.: 250 In March 2023, Meta announced a new round of layoffs that would cut 10,000 employees and close 5,000 open positions to make the company more efficient. Meta's revenue surpassed analyst expectations for the first quarter of 2023 after it announced that it was increasing its focus on AI. On July 6, Meta launched a new app, Threads, a competitor to Twitter. Meta announced its artificial intelligence model Llama 2 in July 2023, available for commercial use via partnerships with major cloud providers like Microsoft. It was the first project to be unveiled out of Meta's generative AI group after it was set up in February. Meta would not charge for access or usage, instead operating under an open-source model that allows it to ascertain what improvements need to be made. Prior to this announcement, Meta had said it had no plans to release Llama 2 for commercial use. An earlier version of Llama was released to academics. In August 2023, Meta announced its permanent removal of news content from Facebook and Instagram in Canada due to the Online News Act, which requires Canadian news outlets to be compensated for content shared on its platform. The Online News Act was in effect by year-end, but Meta did not participate in the regulatory process. In October 2023, Zuckerberg said that AI would be Meta's biggest investment area in 2024. Meta finished 2023 as one of the best-performing technology stocks of the year, with its share price up 150 percent.
Its stock reached an all-time high in January 2024, bringing Meta within 2% of a $1 trillion market capitalization. In November 2023, Meta Platforms launched an ad-free subscription service in Europe, allowing subscribers to opt out of having their personal data collected for targeted advertising. A group of 28 European organizations, including Max Schrems' advocacy group NOYB, the Irish Council for Civil Liberties, Wikimedia Europe, and the Electronic Privacy Information Center, signed a 2024 letter to the European Data Protection Board (EDPB) expressing concern that this subscription model would undermine privacy protections, specifically GDPR data protection standards. Meta removed the Facebook and Instagram accounts of Iran's Supreme Leader Ali Khamenei in February 2024, citing repeated violations of its Dangerous Organizations & Individuals policy. As of March 2024, Meta was under investigation by the FDA for alleged use of its social media platforms to sell illegal drugs. On May 16, 2024, the European Commission began an investigation into Meta over concerns related to child safety. In May 2023, Iraqi social media influencer Esaa Ahmed-Adnan encountered a troubling issue when Instagram removed his posts, citing copyright violations even though his content was original and free from copyrighted material. He discovered that extortionists were behind these takedowns, offering to restore his content for $3,000 or provide ongoing protection for $1,000 per month. This scam, exploiting Meta's rights management tools, became widespread in the Middle East, revealing a gap in Meta's enforcement in developing regions. Aws al-Saadi, founder of the Iraqi nonprofit Tech4Peace, helped Ahmed-Adnan and others, but the restoration process was slow, leading to significant financial losses for many victims, including prominent figures like Ammar al-Hakim. This situation highlighted Meta's challenges in balancing global growth with effective content moderation and protection. On September 16, 2024, Meta announced it had banned Russian state media outlets from its platforms worldwide due to concerns about "foreign interference activity". This decision followed allegations that RT and its employees funneled $10 million through shell companies to secretly fund influence campaigns on various social media channels. Meta's actions were part of a broader effort to counter Russian covert influence operations, which had intensified since the invasion of Ukraine. At its 2024 Connect conference, Meta presented Orion, its first pair of augmented reality glasses. Though Orion was originally intended to be sold to consumers, the manufacturing process turned out to be too complex and expensive. Instead, the company pivoted to producing a small number of the glasses to be used internally. On October 4, 2024, Meta announced a new AI model called Movie Gen, capable of generating realistic video and audio clips based on user prompts. Meta stated it would not release Movie Gen for open development, preferring to collaborate directly with content creators and integrate it into its products by the following year. The model was built using a combination of licensed and publicly available datasets. On October 31, 2024, ProPublica published an investigation into deceptive political advertisement scams that sometimes use hundreds of hijacked profiles and Facebook pages run by organized networks of scammers. The authors cited spotty enforcement by Meta as a major reason for the extent of the issue.
In November 2024, TechCrunch reported that Meta was considering building a $10 billion global underwater cable spanning 25,000 miles. In the same month, Meta closed down 2 million accounts on Facebook and Instagram that were linked to scam centers in Myanmar, Laos, Cambodia, the Philippines, and the United Arab Emirates running pig-butchering scams. In December 2024, Meta announced that, beginning February 2025, it would require advertisers running ads about financial services in Australia to verify information about the beneficiary and the payer, in a bid to curb scams. On December 4, 2024, Meta announced it would invest US$10 billion in its largest AI data center, in northeast Louisiana, powered by natural gas facilities. On the 11th of that month, Meta experienced a global outage impacting accounts on all of its social media and messaging applications. Outage reports on DownDetector reached 70,000+ and 100,000+ within minutes for Instagram and Facebook, respectively. In January 2025, Meta announced plans to roll back its diversity, equity, and inclusion (DEI) initiatives, citing shifts in the "legal and policy landscape" in the United States following the 2024 presidential election. The decision followed reports that CEO Mark Zuckerberg sought to align the company more closely with the incoming Trump administration, including changes to content moderation policies and executive leadership. The new content moderation policies continued to bar insults about a person's intellect or mental illness, but made an exception to allow calling LGBTQ people mentally ill because they are gay or transgender. Later that month, Meta agreed to pay $25 million to settle a 2021 lawsuit brought by Donald Trump over the suspension of his social media accounts after the January 6 riots. Changes to Meta's moderation policies were controversial among its oversight board, with a significant divide in opinion between the board's US conservatives and its global members. In June 2025, Meta decided to make a multibillion-dollar investment in the artificial intelligence startup Scale AI. The financing could exceed $10 billion in value, which would make it one of the largest private-company funding events of all time. In October 2025, Meta announced it would lay off 600 employees in its artificial intelligence unit, which it described as "bloated", in an effort to streamline the department. The layoffs affect Meta's AI infrastructure units, the Fundamental Artificial Intelligence Research (FAIR) unit and other product-related positions. Mergers and acquisitions Meta has acquired multiple companies (often identified as talent acquisitions). One of its first major acquisitions was in April 2012, when it acquired Instagram for approximately US$1 billion in cash and stock. In October 2013, Facebook, Inc. acquired Onavo, an Israeli mobile web analytics company. In February 2014, Facebook, Inc. announced it would buy mobile messaging company WhatsApp for US$19 billion in cash and stock. The acquisition was completed on October 6. Later that year, Facebook bought Oculus VR, which released its first consumer virtual reality headset in 2016, for $2.3 billion in cash and stock. In late November 2019, Facebook, Inc. announced the acquisition of the game developer Beat Games, responsible for developing one of that year's most popular VR games, Beat Saber.
In late 2022, after Facebook, Inc. rebranded to Meta Platforms, Inc., Oculus was rebranded to Meta Quest. In May 2020, Facebook, Inc. announced it had acquired Giphy for a reported cash price of $400 million, to be integrated with the Instagram team. However, in August 2021, the UK's Competition and Markets Authority (CMA) stated that Facebook, Inc. might have to sell Giphy, after an investigation found that the deal between the two companies would harm competition in the display advertising market. Facebook, Inc. was fined $70 million by the CMA for deliberately failing to report all information regarding the acquisition and the ongoing antitrust investigation. In October 2022, the CMA ruled for a second time that Meta be required to divest Giphy, stating that Meta already controls half of the advertising in the UK. Meta agreed to the sale, though it stated that it disagreed with the decision itself. In May 2023, Giphy was divested to Shutterstock for $53 million. In November 2020, Facebook, Inc. announced that it planned to purchase the customer-service platform and chatbot specialist startup Kustomer to encourage companies to use its platform for business. Kustomer was reportedly valued at slightly over $1 billion. The deal was closed in February 2022 after regulatory approval. In September 2022, Meta acquired Lofelt, a Berlin-based haptic tech startup. In December 2025, it was announced Meta had acquired the AI-wearables startup Limitless. In the same month, it also acquired another AI startup, Manus AI, for $2 billion. Manus announced in December that its platform had achieved $100 million in recurring revenue just eight months after its launch, and Meta said it would scale the platform to many other businesses. In January 2026, it was announced that Meta's proposed acquisition of Manus was undergoing preliminary scrutiny by Chinese regulators; the examination concerns the cross-border transfer of artificial intelligence technology developed in China. Lobbying In 2020, Facebook, Inc. spent $19.7 million on lobbying, hiring 79 lobbyists. In 2019, it had spent $16.7 million on lobbying and had a team of 71 lobbyists, up from $12.6 million and 51 lobbyists in 2018. Facebook was the largest spender of lobbying money among the Big Tech companies in 2020. The lobbying team includes top congressional aide John Branscome, hired in September 2021 to help the company fend off threats from Democratic lawmakers and the Biden administration. In December 2024, Meta donated $1 million to the inauguration fund for then-President-elect Donald Trump. In 2025, Meta was listed among the donors funding the construction of the White House State Ballroom. Partnerships In February 2026, Meta announced a long-term partnership with Nvidia. Censorship In August 2024, Mark Zuckerberg sent a letter to Jim Jordan indicating that during the COVID-19 pandemic the Biden administration repeatedly asked Meta to limit certain COVID-19 content, including humor and satire, on Facebook and Instagram. In 2016, Meta hired Jordana Cutler, formerly an employee at the Israeli Embassy to the United States, as its policy chief for Israel and the Jewish Diaspora. In this role, Cutler pushed for the censorship of accounts belonging to Students for Justice in Palestine chapters in the United States. Critics have said that Cutler's position gives the Israeli government an undue influence over Meta policy, and that few countries have such high levels of contact with Meta policymakers.
Following Donald Trump's return to office in 2025, various sources noted possible censorship related to the Democratic Party on Instagram and other Meta platforms. In February 2025, a Meta representative flagged journalist Gil Duran's article and other "critiques of tech industry figures" as spam or sensitive content, limiting their reach. In March 2025, Meta attempted to block former employee Sarah Wynn-Williams from promoting or further distributing her memoir, Careless People, which includes allegations of unaddressed sexual harassment in the workplace by senior executives. The New York Times reported that the arbitration was among Meta's most forceful attempts to suppress a former employee's account of workplace dynamics. Publisher Macmillan reacted to the ruling by the Emergency International Arbitral Tribunal by stating that it would ignore its provisions. As of 15 March 2025[update], hardback and digital versions of Careless People were being offered for sale by major online retailers. From October 2025, Meta began removing and restricting access to accounts and pages related to LGBTQ issues, reproductive health and abortion information on its platforms. Martha Dimitratou, executive director of Repro Uncensored, called Meta's shadow-banning of these issues "One of the biggest waves of censorship we are seeing". Disinformation concerns Since its inception, Meta has been accused of being a host for fake news and misinformation. In the wake of the 2016 United States presidential election, Zuckerberg began to take steps to reduce the prevalence of fake news, as the platform had been criticized for its potential influence on the outcome of the election. The company initially partnered with ABC News, the Associated Press, FactCheck.org, Snopes and PolitiFact for its fact-checking initiative; as of 2018, it had over 40 fact-checking partners across the world, including The Weekly Standard. A May 2017 review by The Guardian found that the platform's fact-checking initiatives (partnering with third-party fact-checkers and publicly flagging fake news) were regularly ineffective and appeared to have minimal impact in some cases. In 2018, journalists working as fact-checkers for the company criticized the partnership, stating that it had produced minimal results and that the company had ignored their concerns. In 2024, Meta's decision to continue disseminating a falsified video of US president Joe Biden, even after it had been proven to be fake, attracted criticism and concern. In January 2025, Meta ended its use of third-party fact-checkers in favor of a user-run community notes system similar to the one used on X. While Zuckerberg supported these changes, saying that the amount of censorship on the platform was excessive, the decision received criticism from fact-checking institutions, which stated that the changes would make it more difficult for users to identify misinformation. Meta also faced criticism for weakening its policies on hate speech that were designed to protect minorities and LGBTQ+ individuals from bullying and discrimination. While moving its content review teams from California to Texas, Meta changed its hateful conduct policy to eliminate restrictions on anti-LGBT and anti-immigrant hate speech, as well as explicitly allowing users to accuse LGBT people of being mentally ill or abnormal based on their sexual orientation or gender identity.
In January 2025, Meta faced significant criticism for its role in removing LGBTQ+ content from its platforms, amid its broader efforts to address anti-LGBTQ+ hate speech. The removal of LGBTQ+ themes was noted as part of the wider crackdown on content deemed to violate its community guidelines. Meta's content moderation policies, which were designed to combat harmful speech and protect users from discrimination, inadvertently led to the removal or restriction of LGBTQ+ content, particularly posts highlighting LGBTQ+ identities, support, or political issues. According to reports, LGBTQ+ posts, including those that simply celebrated pride or advocated for LGBTQ+ rights, were flagged and removed for reasons that some critics argue were vague or inconsistently applied. Many LGBTQ+ activists and users on Meta's platforms expressed concern that such actions stifled visibility and expression, potentially isolating LGBTQ+ individuals and communities, especially in spaces that were historically important for outreach and support. Lawsuits Numerous lawsuits have been filed against the company, both when it was known as Facebook, Inc., and as Meta Platforms. In March 2020, the Office of the Australian Information Commissioner (OAIC) sued Facebook for serious and repeated breaches of privacy law in connection with the Cambridge Analytica scandal. Each violation of the Privacy Act carries a theoretical penalty of up to $1.7 million. The OAIC estimated that a total of 311,127 Australians had been exposed. On December 8, 2020, the U.S. Federal Trade Commission and 46 states (excluding Alabama, Georgia, South Carolina, and South Dakota), the District of Columbia and the territory of Guam launched Federal Trade Commission v. Facebook, an antitrust lawsuit against Facebook. The lawsuit concerns Facebook's acquisition of two competitors, Instagram and WhatsApp, and the ensuing monopolistic situation. The FTC alleged that Facebook held monopolistic power in the U.S. social networking market and sought to force the company to divest Instagram and WhatsApp to break up the conglomerate. William Kovacic, a former chairman of the Federal Trade Commission, argued the case would be difficult to win, as it would require the government to create a counterfactual argument of an internet where the Facebook-WhatsApp-Instagram entity did not exist, and prove that this harmed competition or consumers. In November 2025, a court ruled that Meta did not violate antitrust laws and holds no monopoly in the market. On December 24, 2021, a court in Russia fined Meta $27 million after the company declined to remove unspecified banned content. The fine was reportedly tied to the company's annual revenue in the country. In May 2022, a lawsuit was filed in Kenya against Meta and its local outsourcing company Sama, alleging poor working conditions in Kenya for outsourced workers moderating Facebook posts. According to the lawsuit, 260 screeners were declared redundant with confusing reasoning. The lawsuit seeks financial compensation and an order that outsourced moderators be given the same health benefits and pay scale as Meta employees. In June 2022, eight lawsuits were filed across the U.S. alleging that excessive exposure to platforms including Facebook and Instagram had led to attempted or actual suicides, eating disorders and sleeplessness, among other issues. The litigation followed a former Facebook employee's testimony in Congress that the company refused to take responsibility.
The company noted that tools have been developed for parents to keep track of their children's activity on Instagram and set time limits, in addition to Meta's "Take a break" reminders. In addition, the company is providing resources specific to eating disorders as well as developing AI to prevent children under the age of 13 from signing up for Facebook or Instagram. In June 2022, Meta settled a lawsuit with the US Department of Justice. The lawsuit, which was filed in 2019, alleged that the company enabled housing discrimination through targeted advertising, as it allowed homeowners and landlords to run housing ads excluding people based on sex, race, religion, and other characteristics. The U.S. Department of Justice stated that this was in violation of the Fair Housing Act. Meta was handed a penalty of $115,054 and given until December 31, 2022, to stop using the ad-targeting tool at issue. In January 2023, Meta was fined €390 million for violations of the European Union General Data Protection Regulation. In May 2023, the European Data Protection Board fined Meta a record €1.2 billion for breaching European Union data privacy laws by transferring personal data of Facebook users to servers in the U.S. In July 2024, Meta agreed to pay the state of Texas US$1.4 billion to settle a lawsuit brought by Texas Attorney General Ken Paxton accusing the company of collecting users' biometric data without consent, setting a record for the largest privacy-related settlement ever obtained by a state attorney general. In October 2024, Meta Platforms faced lawsuits in Japan from 30 plaintiffs who claimed they were defrauded by fake investment ads on Facebook and Instagram featuring false celebrity endorsements. The plaintiffs are seeking approximately $2.8 million in damages. In April 2025, the Kenyan High Court ruled that a US$2.4 billion lawsuit, in which three plaintiffs claim that Facebook inflamed civil violence in Ethiopia in 2021, could proceed. Also in April 2025, Meta was fined €200 million ($230 million) for breaking the Digital Markets Act by imposing a "consent or pay" system that forced users to either allow their personal data to be used to target advertisements or pay a subscription fee for advertising-free versions of Facebook and Instagram. In late April 2025, a case was filed against Meta in Ghana over the alleged psychological distress experienced by content moderators employed to take down disturbing social media content, including depictions of murders, extreme violence and child sexual abuse. Meta moved the moderation service to the Ghanaian capital of Accra after legal issues in Kenya, the previous location. The new moderation company is Teleperformance, a multinational corporation with a history of workers' rights violations. Reports suggest conditions are worse than in the previous Kenyan location, with many workers afraid to speak out for fear of being returned to conflict zones. Workers reported developing mental illnesses, attempting suicide, and receiving low pay. On January 26, 2026, a case filed in a New Mexico state court alleged that Mark Zuckerberg approved allowing minors to access artificial intelligence chatbot companions that safety staffers warned were capable of sexual interactions. In 2020, the company UReputation, which had been involved in several cases concerning the management of digital armies[clarification needed], filed a lawsuit against Facebook, accusing it of unlawfully transmitting personal data to third parties.
Legal actions were initiated in Tunisia, France, and the United States. In 2025, the United States District Court for the Northern District of Georgia approved a discovery procedure, allowing UReputation to access documents and evidence held by Meta. Structure As of October 2022[update], Meta had 83,553 employees worldwide. Meta Platforms is mainly owned by institutional investors, who hold around 80% of all shares, while insiders control the majority of voting shares. The three largest individual investors in 2024 were Mark Zuckerberg, Sheryl Sandberg and Christopher K. Cox. Roger McNamee, an early Facebook investor and Zuckerberg's former mentor, said Facebook had "the most centralized decision-making structure I have ever encountered in a large company". Facebook co-founder Chris Hughes has stated that chief executive officer Mark Zuckerberg has too much power, that the company is now a monopoly, and that, as a result, it should be split into multiple smaller companies. In an op-ed in The New York Times, Hughes said he was concerned that Zuckerberg had surrounded himself with a team that did not challenge him, and that it is the U.S. government's job to hold him accountable and curb his "unchecked power". He also said that "Mark's power is unprecedented and un-American." Several U.S. politicians agreed with Hughes. European Union Commissioner for Competition Margrethe Vestager stated that splitting Facebook should be done only as "a remedy of the very last resort", and that it would not solve Facebook's underlying problems. Revenue Facebook ranked No. 34 in the 2020 Fortune 500 list of the largest United States corporations by revenue, with almost $86 billion in revenue, most of it coming from advertising. One analysis of 2017 data determined that the company earned US$20.21 per user from advertising. According to New York magazine, since its rebranding Meta has reportedly lost $500 billion as a result of new privacy measures put in place by companies such as Apple and Google, which prevent Meta from gathering users' data. In February 2015, Facebook announced it had reached two million active advertisers, with most of the gain coming from small businesses. An active advertiser was defined as an entity that had advertised on the Facebook platform in the last 28 days. In March 2016, Facebook announced it had reached three million active advertisers, with more than 70% from outside the United States. Prices for advertising follow a variable pricing model based on auctioning ad placements and the potential engagement levels of the advertisement itself. Similar to other online advertising platforms like Google and Twitter, targeting of advertisements is one of the chief merits of digital advertising compared to traditional media. Marketing on Meta employs two methods based on the audience's viewing habits, likes and shares, and purchasing data: targeted audiences and "lookalike" audiences. The U.S. IRS challenged the valuation Facebook used when it transferred IP from the U.S. to Facebook Ireland (now Meta Platforms Ireland) in 2010 (which Facebook Ireland then revalued higher before charging out), as it was building its double Irish tax structure. The case is ongoing, and Meta faces a potential fine of $3–5 billion. The U.S. Tax Cuts and Jobs Act of 2017 changed Facebook's global tax calculations.
Meta Platforms Ireland is subject to the U.S. GILTI tax of 10.5% on global intangible profits (i.e. Irish profits). On the basis that Meta Platforms Ireland Limited is paying some tax, the effective minimum US tax for Facebook Ireland will be circa 11%. In contrast, Meta Platforms Inc. would incur a special IP tax rate of 13.125% (the FDII rate) if its Irish business relocated to the U.S. Tax relief in the U.S. (21% vs. Irish at the GILTI rate) and accelerated capital expensing would make this effective U.S. rate around 12%. The insignificance of the U.S./Irish tax difference was demonstrated when Facebook moved 1.5 billion non-EU accounts to the U.S. to limit exposure to GDPR. Facilities Users outside of the U.S. and Canada contract with Meta's Irish subsidiary, Meta Platforms Ireland Limited (formerly Facebook Ireland Limited), allowing Meta to avoid US taxes for all users in Europe, Asia, Australia, Africa and South America. Meta makes use of the Double Irish arrangement, which allows it to pay 2–3% corporation tax on all international revenue. In 2010, Facebook opened its fourth office, in Hyderabad, India, which houses online advertising and developer support teams and provides support to users and advertisers. In India, Meta is registered as Facebook India Online Services Pvt Ltd. It also has offices or planned sites in Chittagong, Bangladesh; Dublin, Ireland; and Austin, Texas, among other cities. Facebook opened its London headquarters in 2017 in Fitzrovia in central London. Facebook opened an office in Cambridge, Massachusetts, in 2018. The offices were initially home to the "Connectivity Lab", a group focused on bringing Internet access to those who do not have access to the Internet. In April 2019, Facebook opened its Taiwan headquarters in Taipei. In March 2022, Meta opened new regional headquarters in Dubai. In September 2023, it was reported that Meta had paid £149 million to British Land to break the lease on its Triton Square office in London; Meta reportedly had another 18 years left on the lease. As of 2023, Facebook operated 21 data centers. It committed to purchasing 100% renewable energy and to reducing its greenhouse gas emissions by 75% by 2020. Its data center technologies include Fabric Aggregator, a distributed network system that accommodates larger regions and varied traffic patterns. Reception US Representative Alexandria Ocasio-Cortez responded in a tweet to Zuckerberg's announcement about Meta, saying: "Meta as in 'we are a cancer to democracy metastasizing into a global surveillance and propaganda machine for boosting authoritarian regimes and destroying civil society ... for profit!'" Ex-Facebook employee Frances Haugen, the whistleblower behind the Facebook Papers, responded to the rebranding efforts by expressing doubts about the company's ability to improve while led by Mark Zuckerberg, and urged the chief executive officer to resign. In November 2021, a video published by Inspired by Iceland went viral, in which a Zuckerberg look-alike promoted the Icelandverse, a place of "enhanced actual reality without silly looking headsets". In a December 2021 interview, SpaceX and Tesla chief executive officer Elon Musk said he could not see a compelling use-case for the VR-driven metaverse, adding: "I don't see someone strapping a frigging screen to their face all day." In January 2022, Louise Eccles of The Sunday Times logged into the metaverse with the intention of making a video guide. She wrote: Initially, my experience with the Oculus went well.
I attended work meetings as an avatar and tried an exercise class set in the streets of Paris. The headset enabled me to feel the thrill of carving down mountains on a snowboard and the adrenaline rush of climbing a mountain without ropes. Yet switching to the social apps, where you mingle with strangers also using VR headsets, it was at times predatory and vile. Eccles described being sexually harassed by another user, as well as "accents from all over the world, American, Indian, English, Australian, using racist, sexist, homophobic and transphobic language". She also encountered users as young as 7 years old on the platform, despite Oculus headsets being intended for users over 13. See also References External links 37°29′06″N 122°08′54″W / 37.48500°N 122.14833°W / 37.48500; -122.14833 |
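The coordinate pair at the end of the article above appears in both degrees-minutes-seconds and decimal form; the two agree under the standard conversion. A generic illustration in Python (not taken from the source page):

# Convert degrees/minutes/seconds plus a hemisphere letter into signed
# decimal degrees, then check against the decimal values quoted above.
def dms_to_decimal(degrees: int, minutes: int, seconds: float, hemisphere: str) -> float:
    """Southern and western hemispheres map to negative decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(37, 29, 6, "N"))   # 37.485
print(dms_to_decimal(122, 8, 54, "W"))  # -122.14833...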
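The effective-rate comparison in the tax passage earlier in this article can also be sketched numerically. In the minimal sketch below, the statutory rates (10.5% GILTI, 13.125% FDII) come from the text, while the small Irish tax payment and the value of accelerated capital expensing are illustrative assumptions chosen to land on the article's "circa 11%" and "around 12%" figures; they are not Meta's actual tax position:

# Hedged numerical sketch of the Ireland-vs-U.S. effective tax comparison.
# Statutory rates are from the text; the two "assumed_" values are
# illustrative assumptions, not Meta's actual figures.
GILTI_RATE = 0.105    # U.S. tax on global intangible (here, Irish) profits
FDII_RATE = 0.13125   # U.S. rate on IP income if the business moved to the U.S.

assumed_irish_tax = 0.005          # assumption: small Irish tax actually paid (~0.5%)
assumed_expensing_relief = 0.011   # assumption: effect of accelerated capital expensing

# Ireland route: the GILTI floor plus the Irish tax paid on top -> "circa 11%".
effective_ireland = GILTI_RATE + assumed_irish_tax

# U.S. route: the FDII rate reduced by expensing relief -> "around 12%".
effective_us = FDII_RATE - assumed_expensing_relief

print(f"Ireland route: ~{effective_ireland:.1%}")  # ~11.0%
print(f"U.S. route:    ~{effective_us:.1%}")       # ~12.0%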
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/History_of_Israel#Hellenistic_period_(333–64_BCE)] | [TOKENS: 14912] |
Contents History of Israel The history of Israel covers the Southern Levant region also known as Canaan, Palestine, or the Holy Land, which is the location of Israel and Palestine. From prehistory, as part of the Levantine corridor, the area witnessed waves of early humans from Africa, then the emergence of the Natufian culture c. 10,000 BCE. In the Bronze Age, Canaanite civilization developed in the region. In the Iron Age, the kingdoms of Israel and Judah were established, entities central to the origins of the Abrahamic religions; this heritage gave rise to Judaism, Samaritanism, Christianity, Islam, the Druze faith, and the Baháʼí Faith. The Land of Israel has seen many conflicts, been controlled by various polities, and hosted various ethnic groups. In the following centuries, the Assyrian, Babylonian, Achaemenid, and Macedonian empires conquered the region. Ptolemies and Seleucids vied for control during the Hellenistic period. Through the Hasmonean dynasty, the Jews maintained independence for a century before incorporation into the Roman Republic. As a result of the Jewish–Roman wars in the 1st and 2nd centuries CE, many Jews were killed or sold into slavery. Following the advent of Christianity, demographics shifted toward the newly converted Christian population, which replaced Jews as the majority by the 4th century. In the 7th century, Byzantine Christian rule over Israel was superseded in the Muslim conquest of the Levant by the Rashidun Caliphate; the region was later ruled by the Umayyad, Abbasid, and Fatimid caliphates, before being conquered by the Seljuks in the 1070s. Throughout the 12th and 13th centuries, the Land of Israel saw wars between Christians and Muslims as part of the Crusades, with the Kingdom of Jerusalem overrun by Saladin's Ayyubids in the 12th century. The Crusaders held on to shrinking territories for another century. In the 13th century, the Land of Israel became subject to Mongol conquest, though this was stopped by the Mamluk Sultanate, under whose rule it remained until the 16th century. The Mamluks were defeated by the Ottoman Empire, and the region became an Ottoman province until the early 20th century. The 19th century saw the rise of a Jewish nationalist movement in Europe known as Zionism; aliyah, Jewish immigration to the Land of Israel from the diaspora, increased. During World War I, the Sinai and Palestine campaign of the Allies led to the partition of the Ottoman Empire. Britain was granted control of the region by a League of Nations mandate, known as Mandatory Palestine. The British had committed to the creation of a Jewish homeland in the 1917 Balfour Declaration. Palestinian Arabs sought to prevent Jewish immigration, and tensions grew during the British administration. In 1947, the UN voted for the partition of Mandate Palestine and the creation of a Jewish and an Arab state. The Jews accepted the plan, while the Arabs rejected it. A civil war ensued, won by the Jews. In May 1948, the Israeli Declaration of Independence sparked the 1948 War, in which Israel repelled the armies of the neighbouring states. The war resulted in the 1948 Palestinian expulsion and flight and led to Jewish emigration from other parts of the Middle East. Today, about 40% of the global Jewish population resides in Israel. In 1979, the Egypt–Israel peace treaty was signed. In 1993, Israel signed the Oslo I Accord with the Palestine Liberation Organization, which was followed by the establishment of the Palestinian Authority. In 1994, the Israel–Jordan peace treaty was signed.
Despite a long-running Israeli–Palestinian peace process, the conflict continues. Prehistory The oldest evidence of early humans in the territory of modern Israel, dating to 1.5 million years ago, was found in Ubeidiya near the Sea of Galilee. Flint tool artefacts have been discovered at Yiron, the oldest stone tools found anywhere outside Africa.[dubious – discuss] The Daughters of Jacob Bridge over the Jordan River provides evidence of the control of fire by early humans around 780,000 years ago, one of the oldest known examples. In the Mount Carmel area, at el-Tabun and Es Skhul, Neanderthal and early modern human remains were found, showing the longest stratigraphic record in the region, spanning 600,000 years of human activity from the Lower Paleolithic to the present day and representing roughly a million years of human evolution. Other significant Paleolithic sites include Qesem cave. A 200,000-year-old fossil from Misliya Cave is the second-oldest evidence of anatomically modern humans found outside Africa. Other notable finds include the Skhul and Qafzeh hominins, as well as Manot 1. Around the 10th millennium BCE, the Natufian culture existed in the area. The beginning of agriculture in the region during the Neolithic Revolution is evidenced by sites such as Nahal Oren and Gesher. Bronze Age Canaan The Canaanites are archaeologically attested in the Middle Bronze Age (2100–1550 BCE). They were probably organized as independent or semi-independent city-states. Cities were often surrounded by massive earthworks, resulting in the archaeological mounds, or "tells", common in the region today. In the late Middle Bronze Age, the Nile Delta in Egypt was settled by Canaanites who maintained close connections with Canaan. During that period, the Hyksos, dynasties of Canaanite/Asiatic origin, ruled much of Lower Egypt before being overthrown in the 16th century BCE. During the Late Bronze Age (1550–1200 BCE), there were Canaanite vassal states paying tribute to the New Kingdom of Egypt, which governed from Gaza. In 1457 BCE, Egyptian forces under the command of Pharaoh Thutmose III defeated a rebellious coalition of Canaanite vassal states led by Kadesh's king at the Battle of Megiddo. In the Late Bronze Age there was a period of civilizational collapse in the Middle East; Canaan fell into chaos, and Egyptian control ended. There is evidence that urban centers such as Hazor, Beit She'an, Megiddo, Ekron, Isdud and Ascalon were damaged or destroyed. Two groups appear at this time and are associated with the transition to the Iron Age (their iron weapons and tools were superior to the earlier bronze ones): the Sea Peoples, particularly the Philistines, who migrated from the Aegean world and settled on the southern coast, and the Israelites, whose settlements dotted the highlands. Some 2nd-millennium inscriptions about the semi-nomadic Habiru people are believed to be connected to the Hebrews, a term generally synonymous with the Biblical Israelites. Many scholars regard this connection as plausible since the two ethnonyms have similar etymologies, although others argue that Habiru refers to a social class found in every Near Eastern society, including Hebrew societies. Ancient Israel and Judah: Iron Age to Babylonian period The earliest recorded evidence of a people by the name of Israel (as ysrỉꜣr) occurs in the Egyptian Merneptah Stele, erected for Pharaoh Merneptah c. 1209 BCE.
Archeological evidence indicates that during the early Iron Age I, hundreds of small villages were established on the highlands of Canaan on both sides of the Jordan River, primarily in Samaria, north of Jerusalem. These villages had populations of up to 400, were largely self-sufficient, and lived from herding, grain cultivation, and growing vines and olives, with some economic interchange. The pottery was plain and undecorated. Writing was known and available for recording, even in small sites. William G. Dever sees this "Israel" in the central highlands as a cultural and probably political entity, more an ethnic group than an organized state. Modern scholars believe that the Israelites and their culture branched out of the Canaanite peoples and their cultures through the development of a distinct monolatristic—and later monotheistic—religion centred on a national god Yahweh. According to McNutt, "It is probably safe to assume that sometime during Iron Age I a population began to identify itself as 'Israelite'", differentiating itself from the Canaanites through such markers as the prohibition of intermarriage, an emphasis on family history and genealogy, and religion. Philistine cooking tools, the prevalence of pork in their diets, and locally made Mycenaean pottery—which later evolved into bichrome Philistine pottery—all support the Philistines' foreign origin. Their cities were large and elaborate, which—together with the findings—point to a complex, hierarchical society. Israel Finkelstein believes that the oldest Abraham traditions, which focus on the themes of land and offspring and possibly on his altars in Hebron, originated in the Iron Age. Abraham's Mesopotamian heritage is not discussed. In the 10th century BCE, the Israelite kingdoms of Judah and Israel emerged. The Hebrew Bible states that these were preceded by a single kingdom ruled by Saul, David and Solomon, who is said to have built the First Temple. Archaeologists have debated whether the united monarchy ever existed,[Notes 1] with those in favor of such a polity existing further divided between maximalists who support the Biblical accounts, and minimalists who argue that any such polity was likely smaller than suggested. Historians and archaeologists agree that the northern Kingdom of Israel existed by ca. 900 BCE and the Kingdom of Judah existed by ca. 850 BCE. The Kingdom of Israel was the more prosperous of the two kingdoms and soon developed into a regional power; during the days of the Omride dynasty, it controlled Samaria, Galilee, the upper Jordan Valley, the Sharon and large parts of the Transjordan. Samaria, the capital, was home to one of the largest Iron Age structures in the Levant. The Kingdom of Israel's capital moved between Shechem, Penuel and Tirzah before Omri settled it in Samaria, and the royal succession was often settled by a military coup d'état. The Kingdom of Judah was smaller but more stable; the Davidic dynasty ruled the kingdom for the four centuries of its existence, with the capital always in Jerusalem, controlling the Judaean Mountains, most of the Shephelah and the Beersheba valley in the northern Negev. In 854 BCE, according to the Kurkh Monoliths, an alliance between Ahab of Israel and Ben Hadad II of Aram-Damascus managed to repulse the incursions of the Assyrians, with a victory at the Battle of Qarqar.
Another important discovery of the period is the Mesha Stele, a Moabite stele found in Dhiban when Emir Sattam Al-Fayez led Henry Tristram to it as they toured the lands of the vassals of the Bani Sakher. The stele is now in the Louvre. In the stele, Mesha, king of Moab, tells how Chemosh, the god of Moab, had been angry with his people and had allowed them to be subjugated to the Kingdom of Israel, but at length Chemosh returned and assisted Mesha to throw off the yoke of Israel and restore the lands of Moab. It refers to Omri, king of Israel, and to the god Yahweh, and may contain another early reference to the House of David. The Kingdom of Israel fell to the Assyrians following a long siege of the capital Samaria around 720 BCE. The records of Sargon II indicate that he captured Samaria and deported 27,290 inhabitants to Mesopotamia; however, it is likely that Shalmaneser V captured the city, since both the Babylonian Chronicles and the Hebrew Bible viewed the fall of Israel as the signature event of his reign. The Assyrian deportations became the basis for the Jewish idea of the Ten Lost Tribes. Foreign groups were settled by the Assyrians in the territories of the fallen kingdom. The Samaritans claim to be descended from Israelites of ancient Samaria who were not expelled by the Assyrians. It is believed that refugees from the destruction of Israel moved to Judah, massively expanding Jerusalem and leading to the construction of the Siloam Tunnel during the rule of King Hezekiah (ruled 715–686 BCE). The Siloam inscription, a plaque written in Hebrew left by the construction team, was discovered in the tunnel in the 1880s and is today held by the Istanbul Archaeology Museum. During Hezekiah's rule, Sennacherib, the son of Sargon, attempted but failed to capture Judah. Assyrian records say that Sennacherib levelled 46 walled cities and besieged Jerusalem, leaving after receiving extensive tribute. Sennacherib erected the Lachish reliefs in Nineveh to commemorate his victory at Lachish. The writings of four "prophets" are believed to date from this period: Hosea and Amos in Israel, and Micah and Isaiah in Judah. These men were mostly social critics who warned of the Assyrian threat and acted as religious spokesmen. They exercised some form of free speech and may have played a significant social and political role in Israel and Judah. They urged rulers and the general populace to adhere to god-conscious ethical ideals, seeing the Assyrian invasions as a divine punishment of the collective resulting from ethical failures. Under King Josiah (ruled c. 640–609 BCE), the Book of Deuteronomy was either rediscovered or written. The Book of Joshua and the accounts of the kingship of David and Solomon in the Book of Kings are believed to have the same author. These books are known as Deuteronomistic and are considered a key step in the emergence of monotheism in Judah. They emerged at a time when Assyria was weakened by the rise of Babylon, and may represent a committing to text of pre-writing verbal traditions. During the late 7th century BCE, Judah became a vassal state of the Neo-Babylonian Empire. In 601 BCE, Jehoiakim of Judah allied with Babylon's principal rival, Egypt, despite the strong remonstrances of the prophet Jeremiah. As a punishment, the Babylonians besieged Jerusalem in 597 BCE, and the city surrendered. The defeat was recorded by the Babylonians. 
Nebuchadnezzar pillaged Jerusalem and deported King Jehoiachin (Jeconiah), along with other prominent citizens, to Babylon; Zedekiah, his uncle, was installed as king. A few years later, Zedekiah launched another revolt against Babylon, and an army was sent to conquer Jerusalem. In 587 or 586 BCE, King Nebuchadnezzar II of Babylon conquered Jerusalem, destroyed the First Temple and razed the city. The Kingdom of Judah was abolished, and many of its citizens were exiled to Babylon. The former territory of Judah became a Babylonian province called Yehud, with its center in Mizpah, north of the destroyed Jerusalem. Tablets that describe King Jehoiachin's rations were found in the ruins of Babylon. He was eventually released by the Babylonians. According to both the Bible and the Talmud, the Davidic dynasty continued as head of Babylonian Jewry, called the "Rosh Galut" (exilarch or head of exile). Arab and Jewish sources show that the Rosh Galut continued to exist for another 1,500 years in what is now Iraq, ending in the eleventh century. Second Temple period In 538 BCE, Cyrus the Great of the Achaemenid Empire conquered Babylon and took over its empire. Cyrus issued a proclamation granting religious freedom to all peoples subjugated by the Babylonians (see the Cyrus Cylinder). According to the Bible, Jewish exiles in Babylon, including 50,000 Judeans led by Zerubbabel, returned to Judah to rebuild the Temple in Jerusalem. The Second Temple was subsequently completed c. 515 BCE. A second group of 5,000, led by Ezra and Nehemiah, returned to Judah in 456 BCE. The first was empowered by the Persian king to enforce religious rules; the second had the status of governor and a royal mission to restore the walls of the city. The country remained a province of the Achaemenid empire, called Yehud, until 332 BCE. The final text of the Torah is thought to have been written during the Persian period (probably 450–350 BCE). The text was formed by editing and unifying earlier texts. The returning Israelites adopted an Aramaic script (also known as the Ashuri alphabet), which they brought back from Babylon; this is the current Hebrew script. The Hebrew calendar closely resembles the Babylonian calendar and probably dates from this period. The Bible describes tension between the returnees, the elite of the First Temple period, and those who had remained in Judah. It is possible that the returnees, supported by the Persian monarchy, became large landholders at the expense of the people who had remained to work the land in Judah, whose opposition to the Second Temple would have reflected a fear that exclusion from the cult would deprive them of land rights. Judah had become in practice a theocracy, ruled by hereditary High Priests and a Persian-appointed governor, frequently Jewish, charged with keeping order and seeing that tribute was paid. A Judean military garrison was placed by the Persians on Elephantine Island near Aswan in Egypt. In the early 20th century, 175 papyrus documents recording activity in this community were discovered, including the "Passover Papyrus", a letter instructing the garrison on how to correctly conduct the Passover feast. In 332 BCE, Alexander the Great of Macedon conquered the region as part of his campaign against the Achaemenid Empire. After his death in 323 BCE, his generals divided the empire, and Judea became a frontier region between the Seleucid Empire and the Ptolemaic Kingdom in Egypt. 
Following a century of Ptolemaic rule, Judea was conquered by the Seleucid Empire in 200 BCE at the battle of Panium. Hellenistic rulers generally respected Jewish culture and protected Jewish institutions. Judea was ruled by the hereditary office of the High Priest of Israel as a Hellenistic vassal. Nevertheless, the region underwent a process of Hellenization, which heightened tensions between Greeks, Hellenized Jews, and observant Jews. These tensions escalated into clashes involving a power struggle over the position of high priest and the character of the holy city of Jerusalem. When Antiochus IV Epiphanes desecrated the temple, forbade Jewish practices, and forcibly imposed Hellenistic norms on the Jews, several centuries of religious tolerance under Hellenistic control came to an end. In 167 BCE, the Maccabean revolt erupted after Mattathias, a Jewish priest of the Hasmonean lineage, killed a Hellenized Jew and a Seleucid official who participated in sacrifice to the Greek gods in Modi'in. His son Judas Maccabeus defeated the Seleucids in several battles, and in 164 BCE he captured Jerusalem and restored temple worship, an event commemorated by the Jewish festival of Hanukkah. After Judas' death, his brothers Jonathan Apphus and Simon Thassi were able to establish and consolidate a vassal Hasmonean state in Judea, capitalizing on the Seleucid Empire's decline as a result of internal instability and wars with the Parthians, and by forging ties with the rising Roman Republic. The Hasmonean leader John Hyrcanus was able to gain independence, doubling Judea's territories. He took control of Idumaea, where he converted the Edomites to Judaism, and invaded Scythopolis and Samaria, where he demolished the Samaritan Temple. Hyrcanus was also the first Hasmonean leader to mint coins. Under his sons, kings Aristobulus I and Alexander Jannaeus, Hasmonean Judea became a kingdom, and its territories continued to expand, now also covering the coastal plain, Galilee and parts of the Transjordan. Some scholars argue that the Hasmonean dynasty also institutionalized the final Jewish biblical canon. Under Hasmonean rule, the Pharisees, Sadducees and the mystic Essenes emerged as the principal Jewish social movements. The Pharisee sage Simeon ben Shetach is credited with establishing the first schools based around meeting houses; this was a key step in the emergence of Rabbinical Judaism. After Jannaeus' widow, Queen Salome Alexandra, died in 67 BCE, her sons Hyrcanus II and Aristobulus II engaged in a civil war over the succession. Both parties appealed to Pompey for assistance, which paved the way for a Roman takeover of the kingdom. In 63 BCE, the Roman Republic conquered Judaea, ending Jewish independence under the Hasmoneans. The Roman general Pompey intervened in the dynastic civil war and, after capturing Jerusalem, reinstated Hyrcanus II as high priest but denied him the title of king. Rome soon installed the Herodian dynasty—of Idumean descent but Jewish by conversion—as a loyal replacement for the nationalist Hasmoneans. In 37 BCE, Herod the Great, the first client king of this line, took power after defeating the restored Hasmonean king Antigonus II Mattathias. Herod imposed heavy taxes, suppressed opposition, and centralized authority, which fostered widespread resentment. 
Herod also carried out major monumental construction projects throughout his kingdom and significantly expanded the Second Temple, which he transformed into one of the largest religious structures in the ancient world. After his death in 4 BCE, his kingdom was divided among his sons into a tetrarchy under continued Roman oversight. In 6 CE, the Roman emperor Augustus transformed Judaea into a Roman province, deposing its last Jewish ruler, Herod Archelaus, and appointing a Roman governor in his place. That same year, a census triggered a small uprising by Judas of Galilee, the founder of a movement that rejected foreign authority and recognized only God as king. Over the next six decades, with the brief exception of a short period of Jewish autonomy under the client king Herod Agrippa I, the province remained under direct Roman administration. Some governors ruled with brutality and showed little regard for Jewish religious sensitivities, deepening resentment among the local population. This discontent was also fueled by poor governance, corruption, and growing economic inequality, along with rising tensions between Jews and neighboring populations over ethnic, religious, and territorial disputes. At the same time, collective memory of the Maccabean revolt and the period of Hasmonean independence continued to inspire hopes for national liberation from Roman control. In 64 CE, the Temple High Priest Joshua ben Gamla introduced a religious requirement for Jewish boys to learn to read from the age of six. Over the next few hundred years this requirement became steadily more ingrained in Jewish tradition. The Jewish–Roman wars were a series of large-scale revolts by Jewish subjects against the Roman Empire between 66 and 135 CE. The term primarily applies to the First Jewish–Roman War (66–73 CE) and the Bar Kokhba revolt (132–136 CE), both nationalist rebellions aimed at restoring Jewish independence in Judea. Some sources also include the Diaspora Revolt (115–117 CE), an ethno-religious conflict fought across the Eastern Mediterranean and including the Kitos War in Judaea. The Jewish–Roman wars had a devastating impact on the Jewish people, transforming them from a major population in the Eastern Mediterranean into a dispersed and persecuted minority. The First Jewish–Roman War culminated in the destruction of Jerusalem and other towns and villages in Judaea, resulting in significant loss of life and a considerable segment of the population being uprooted or displaced. Those who remained were stripped of any form of political autonomy. Subsequently, the brutal suppression of the Bar Kokhba revolt had even more severe consequences: Judea witnessed a significant depopulation, as many Jews were killed, expelled, or sold into slavery. The outcome of the conflict marked the end of efforts to reestablish a Jewish state until the modern era. Jews were banned from residing in the vicinity of Jerusalem, which the Romans rebuilt into the pagan colony of Aelia Capitolina, and the province of Judaea was renamed Syria Palaestina. Collectively, these events enhanced the role of the Jewish diaspora, relocating the Jewish demographic and cultural center to Galilee and eventually to Babylonia, with smaller communities across the Mediterranean, the Middle East, and beyond. The Jewish–Roman wars also had a major impact on Judaism, after the central worship site of Second Temple Judaism, the Second Temple in Jerusalem, was destroyed by Titus's troops in 70 CE. 
The destruction of the Temple led to a transformation in Jewish religious practices, emphasizing prayer, Torah study, and communal gatherings in synagogues. This pivotal shift laid the foundation for the emergence of Rabbinic Judaism, which has been the dominant form of Judaism since late antiquity, after the codification of the Babylonian Talmud. Late Roman and Byzantine periods As a result of the disastrous effects of the Bar Kokhba revolt, the Jewish presence in the region significantly dwindled. Over the next centuries, more Jews left for communities in the Diaspora, especially the large, rapidly growing Jewish communities in Babylonia and Arabia. Others remained in the Land of Israel, where the spiritual and demographic center shifted from the depopulated Judea to Galilee. Jewish presence also continued in the southern Hebron Hills, in Ein Gedi, and on the coastal plain. The Mishnah and the Jerusalem Talmud, huge compendiums of Rabbinical discussions, were compiled during the 2nd to 4th centuries CE in Tiberias and Jerusalem. Following the revolt, Judea's countryside was penetrated by pagan populations, including migrants from the nearby provinces of Syria, Phoenicia, and Arabia, whereas Aelia Capitolina, its immediate vicinity, and the administrative centers were now inhabited by Roman veterans and settlers from the western parts of the empire. The Romans permitted a hereditary Rabbinical Patriarch from the House of Hillel, called the "Nasi", to represent the Jews in dealings with the Romans. One prominent figure was Judah ha-Nasi, credited with compiling the final version of the Mishnah, a vast collection of Jewish oral traditions. He also emphasized the importance of education in Judaism, leading to requirements that illiterate Jews be treated as outcasts; this might have contributed to some illiterate Jews converting to Christianity. Jewish seminaries, such as those at Shefaram and Beit She'arim, continued to produce scholars. The best of these became members of the Sanhedrin, which was located first at Sepphoris and later at Tiberias. In the Galilee, many synagogues dating from this period have been found, and the burial site of the Sanhedrin leaders was discovered at Beit She'arim. In the 3rd century, the Roman Empire faced an economic crisis and imposed heavy taxation to fund wars of imperial succession. This situation prompted additional Jewish migration from Syria Palaestina to the Sasanian Empire, known for its more tolerant environment; there, a flourishing Jewish community with important Talmudic academies thrived in Babylonia, engaging in a notable rivalry with the Talmudic academies of Palaestina. Early in the 4th century, the Emperor Constantine made Constantinople the capital of the East Roman Empire and made Christianity an accepted religion. His mother Helena made a pilgrimage to Jerusalem (326–328) and led the construction of the Church of the Nativity (birthplace of Jesus in Bethlehem), the Church of the Holy Sepulchre (burial site of Jesus in Jerusalem) and other key churches that still exist. The name Jerusalem was restored to Aelia Capitolina, which became a Christian city. Jews were still banned from living in Jerusalem, but were allowed to visit and worship at the site of the ruined temple. Over the course of the next century, Christians worked to eradicate "paganism", leading to the suppression of classical Roman traditions and the destruction of pagan temples. In 351–352, another Jewish revolt erupted in the Galilee against a corrupt Roman governor. 
The Roman Empire was divided in 395 CE, and the region became part of the Eastern Roman Empire, known as the Byzantine Empire. Under Byzantine rule, much of the region's non-Jewish population was won over by Christianity, which eventually became the dominant religion in the region. The presence of holy sites drew Christian pilgrims, some of whom chose to settle, contributing to the rise of a Christian majority. Christian authorities encouraged this pilgrimage movement and appropriated lands, constructing magnificent churches at locations linked to biblical narratives. Additionally, monks established monasteries near pagan settlements, encouraging the conversion of local pagans. During the Byzantine period, the Jewish presence in the region declined, and it is believed that Jews lost their majority status in Palestine in the fourth century. While Judaism remained the sole non-Christian religion tolerated, restrictions on Jews gradually increased, prohibiting them from constructing new places of worship, holding public office, or owning Christian slaves. In 425, after the death of the last Nasi, Gamliel VI, the office of Nasi and the Sanhedrin were officially abolished, and the standing of the yeshivot weakened. The leadership void was gradually filled by the Jewish center in Babylonia, which would assume a leading role in the Jewish world for generations after the Byzantine period. During the 5th and 6th centuries CE, the region witnessed a series of Samaritan revolts against Byzantine rule. Their suppression resulted in the decline of Samaritan presence and influence, and further consolidated Christian domination. Though it is acknowledged that some Jews and Samaritans converted to Christianity during the Byzantine period, the reliable historical records are limited, and they pertain to individual conversions rather than entire communities. In 611, Khosrow II, ruler of Sassanid Persia, invaded the Byzantine Empire. Helped by Jewish fighters recruited by Benjamin of Tiberias, he captured Jerusalem in 614. The "True Cross" was captured by the Persians. The Jewish Himyarite Kingdom in Yemen may also have provided support. Nehemiah ben Hushiel was made governor of Jerusalem. Christian historians of the period claimed the Jews massacred Christians in the city, but there is no archeological evidence of destruction, leading modern historians to question their accounts. In 628, Kavad II (son of Khosrow) returned Palestine and the True Cross to the Byzantines and signed a peace treaty with them. Following the Byzantine re-entry, Heraclius massacred the Jewish population of Galilee and Jerusalem, while renewing the ban on Jews entering the latter. Early Muslim period The Levant was conquered by an Arab army during the caliphate of ʿUmar ibn al-Khaṭṭāb in 635, and became the province of Bilad al-Sham of the Rashidun Caliphate. Two military districts—Jund Filastin and Jund al-Urdunn—were established in Palestine. A new city called Ramlah was built as the Muslim capital of Jund Filastin, while Tiberias served as the capital of Jund al-Urdunn. The Byzantine ban on Jews living in Jerusalem came to an end. In 661, Mu'awiya I was crowned Caliph in Jerusalem, becoming the first of the (Damascus-based) Umayyad dynasty. In 691, the Umayyad Caliph Abd al-Malik (685–705) constructed the Dome of the Rock shrine on the Temple Mount, where the two Jewish temples had been located. A second building, the Al-Aqsa Mosque, was also erected on the Temple Mount in 705. 
Both buildings were rebuilt in the 10th century following a series of earthquakes. In 750, Arab discrimination against non-Arab Muslims led to the Abbasid Revolution, and the Umayyads were replaced by the Abbasid Caliphs, who built a new city, Baghdad, to be their capital. This period is known as the Islamic Golden Age: the Arab Empire was the largest in the world, and Baghdad was its largest and richest city. Both Arabs and minorities prospered across the region, and much scientific progress was made. There were, however, setbacks. During the 8th century, the Caliph Umar II introduced a law requiring Jews and Christians to wear identifying clothing: Jews were required to wear yellow stars around their necks and on their hats, while Christians had to wear blue. Clothing regulations arose during repressive periods of Arab rule and were designed more to humiliate than to persecute non-Muslims. A poll tax was imposed on all non-Muslims by the Islamic rulers, and failure to pay could result in imprisonment or worse. In 982, Caliph Al-Aziz Billah of the Cairo-based Fatimid dynasty conquered the region. The Fatimids were followers of Isma'ilism, a branch of Shia Islam, and claimed descent from Fatima, Mohammed's daughter. Around the year 1010, the Church of the Holy Sepulchre (believed to be the burial site of Jesus) was destroyed by the Fatimid Caliph al-Hakim, who relented ten years later and paid for it to be rebuilt. In 1020, al-Hakim claimed divine status, and the newly formed Druze religion gave him the status of a messiah. Although the Arab conquest was relatively peaceful and did not cause widespread destruction, it did alter the country's demographics significantly. Over the ensuing several centuries, the region experienced a drastic decline in its population, from an estimated 1 million during Roman and Byzantine times to some 300,000 by the early Ottoman period. This demographic collapse was accompanied by a slow process of Islamization that resulted from the flight of non-Muslim populations, the immigration of Muslims, and local conversion. The majority of the remaining populace belonged to the lowest classes. While the Arab conquerors themselves left the area after the conquest and moved on to other places, the settlement of Arab tribes in the area both before and after the conquest also contributed to the Islamization. As a result, the Muslim population steadily grew, and the area became gradually dominated by Muslims on a political and social level. During the early Islamic period, many Christians and Samaritans belonging to the Byzantine upper class migrated from the coastal cities to northern Syria and Cyprus, which were still under Byzantine control, while others fled to the central highlands and the Transjordan. As a result, the coastal towns, formerly important economic centers connected with the rest of the Byzantine world, were emptied of most of their residents. Some of these cities—namely Ashkelon, Acre, Arsuf, and Gaza—now fortified border towns, were resettled by Muslim populations, who developed them into significant Muslim centers. The region of Samaria also underwent a process of Islamization as a result of waves of conversion among the Samaritan population and the influx of Muslims into the area. The predominantly Jacobite Monophysite Christian population had been hostile to Byzantine orthodoxy, and at times for that reason welcomed Muslim rule. There is no strong evidence for forced conversion, or that the jizya tax significantly affected such changes. 
The demographic situation in Palestine was further altered by urban decline under the Abbasids, and it is thought that the 749 earthquake hastened this process, increasing the number of Jews, Christians, and Samaritans who emigrated to diaspora communities, while others remained in the devastated cities and poor villages until they converted to Islam. Historical records and archeological evidence suggest that many Samaritans converted under Abbasid and Tulunid rule, after suffering severe difficulties such as droughts, earthquakes, religious persecution, heavy taxes and anarchy. The same region also saw the settlement of Arabs. Over the period, the Samaritan population drastically decreased, with the rural Samaritan population converting to Islam and small urban communities remaining in Nablus and Caesarea, as well as in Cairo, Damascus, Aleppo and Sarepta. Nevertheless, the Muslim population remained a minority in a predominantly Christian area, and it is likely that this status persisted until the Crusader period. Crusades and Mongols In 1095, Pope Urban II called upon Christians to wage a holy war and recapture Jerusalem from Muslim rule. Responding to this call, Christians launched the First Crusade in 1096, a military campaign aimed at retaking the Holy Land that ultimately resulted in the successful siege and conquest of Jerusalem in 1099. In the same year, the Crusaders conquered Beit She'an and Tiberias, and in the following decade they captured the coastal cities with the support of Italian city-state fleets, establishing these coastal ports as crucial strongholds for Crusader rule in the region. Following the First Crusade, several Crusader states were established in the Levant, with the Kingdom of Jerusalem (Regnum Hierosolymitanum) assuming a preeminent position and enjoying special status among them. The local population consisted predominantly of Muslims, along with Christians, Jews, and Samaritans, while the Crusaders remained a minority that relied on the local population to work the soil. The region saw the construction of numerous robust castles and fortresses, yet efforts to establish permanent European villages proved unsuccessful. Around 1180, Raynald of Châtillon, ruler of Transjordan, provoked increasing conflict with the Ayyubid Sultan Saladin (Salah-al-Din), leading to the defeat of the Crusaders in the 1187 Battle of Hattin (above Tiberias). Saladin was able to take Jerusalem peacefully and conquered most of the former Kingdom of Jerusalem. Saladin's court physician was Maimonides, a refugee from Almohad (Muslim) persecution in Córdoba, Spain, where all non-Muslim religions had been banned. The Christian world's response to the loss of Jerusalem came in the Third Crusade of 1190. After lengthy battles and negotiations, Richard the Lionheart and Saladin concluded the Treaty of Jaffa in 1192, whereby Christians were granted free passage to make pilgrimages to the holy sites, while Jerusalem remained under Muslim rule. In 1229, Jerusalem peacefully reverted to Christian control as part of a treaty between the Holy Roman Emperor Frederick II and the Ayyubid sultan al-Kamil that ended the Sixth Crusade. In 1244, Jerusalem was sacked by the Khwarezmian Tatars, who decimated the city's Christian population, drove out the Jews and razed the city. The Khwarezmians were driven out by the Ayyubids in 1247. Mamluk period Between 1258 and 1291, the area was the frontier between Mongol invaders (occasional Crusader allies) and the Mamluks of Egypt. 
The conflict impoverished the country and severely reduced the population. In Egypt, a caste of warrior slaves known as the Mamluks gradually took control of the kingdom. The Mamluks were mostly of Turkish origin, bought as children and then trained in warfare. They were highly prized warriors, who gave rulers independence from the native aristocracy. They took control of Egypt following a failed invasion by the Crusaders (the Seventh Crusade). The first Mamluk Sultan, Qutuz of Egypt, defeated the Mongols at the Battle of Ain Jalut ("Goliath's spring", near Ein Harod), ending the Mongol advances. He was assassinated by one of his generals, Baibars, who went on to eliminate most of the Crusader outposts. The Mamluks ruled Palestine until 1516, regarding it as part of Syria. In Hebron, Jews were banned from worshipping at the Cave of the Patriarchs (the second-holiest site in Judaism); they were allowed only as far as the seventh step at the site's entrance, and the ban remained in place until Israel assumed control of the West Bank in the Six-Day War. The Egyptian Mamluk sultan Al-Ashraf Khalil conquered the last outpost of Crusader rule in 1291. The Mamluks, continuing the policy of the Ayyubids, made the strategic decision to destroy the coastal area and to bring desolation to many of its cities, from Tyre in the north to Gaza in the south. Ports were destroyed and various materials were dumped to make them inoperable. The goal was to prevent attacks from the sea, given the fear of the return of the Crusaders. This had a long-term effect on those areas, which remained sparsely populated for centuries; activity in this period was concentrated further inland. With the 1492 expulsion of the Jews from Spain and the 1497 persecution of Jews and Muslims by Manuel I of Portugal, many Jews moved eastward, with some deciding to settle in Mamluk Palestine. As a consequence, the local Jewish community underwent a significant rejuvenation. The influx of Sephardic Jews began under Mamluk rule in the 15th century and continued throughout the 16th century, especially after the Ottoman conquest. As city-dwellers, the majority of Sephardic Jews preferred to settle in urban areas, mainly in Safed but also in Jerusalem, while the Musta'arbi community made up the majority of the Jewish villagers. Ottoman period Under the Mamluks, the area was a province of Bilad al-Sham (Syria). It was conquered by the Turkish Sultan Selim I in 1516–17, becoming a part of the province of Ottoman Syria for the next four centuries, first as the Damascus Eyalet and later as the Syria Vilayet (following the Tanzimat reorganization of 1864). With the more favorable conditions that followed the Ottoman conquest, the immigration of Jews fleeing Catholic Europe, which had already begun under Mamluk rule, continued, and soon an influx of exiled Sephardic Jews came to dominate the Jewish community in the area. In 1558, Selim II (1566–1574), successor to Suleiman, whose wife Nurbanu Sultan was Jewish, gave control of Tiberias to Doña Gracia Mendes Nasi, one of the richest women in Europe and an escapee from the Inquisition. She encouraged Jewish refugees to settle in the area and established a Hebrew printing press. Safed became a centre for study of the Kabbalah and other Jewish religious studies, culminating in Joseph Karo's writing of the Shulchan Aruch – published in 1565 in Venice – which became the near-universal standard of Jewish religious law. 
Doña Nasi's nephew, Joseph Nasi, was made governor of Tiberias, and he encouraged Jewish settlement from Italy. In 1660, a Druze power struggle led to the destruction of Safed and Tiberias. In the late 18th century, a local Arab sheikh, Zahir al-Umar, created a de facto independent emirate in the Galilee. Ottoman attempts to subdue the sheikh failed, but after Zahir's death the Ottomans restored their rule in the area. In 1799, Napoleon briefly occupied the country and planned a proclamation inviting Jews to create a state; the proclamation was shelved following his defeat at Acre. In 1831, Muhammad Ali of Egypt, an Ottoman ruler who had broken away from the Empire and sought to modernize Egypt, conquered Ottoman Syria and imposed conscription, leading to an Arab revolt. In 1838, there was another Druze revolt. In 1839, Moses Montefiore met with Muhammad Ali Pasha in Egypt and signed an agreement to establish 100–200 Jewish villages in the Damascus Eyalet of Ottoman Syria, but in 1840 the Egyptians withdrew before the deal was implemented, returning the area to Ottoman governorship. In 1844, Jews constituted the largest population group in Jerusalem. By 1896, Jews constituted an absolute majority in Jerusalem, but the overall population of Palestine was 88% Muslim and 9% Christian. Between 1882 and 1903, approximately 35,000 Jews moved to Palestine in what became known as the First Aliyah. In the Russian Empire, Jews faced growing persecution and legal restrictions. Half the world's Jews lived in the Russian Empire, where they were restricted to living in the Pale of Settlement. Severe pogroms in the early 1880s and legal repression led to 2 million Jews emigrating from the Russian Empire; 1.5 million went to the United States, and other popular destinations were Germany, France, the United Kingdom, the Netherlands, Argentina and Palestine. The Zionist movement began in earnest in 1882 with Leon Pinsker's pamphlet Auto-Emancipation, which argued for the creation of a Jewish national homeland as a means to avoid the violence plaguing Jewish communities in Eastern Europe. At the 1884 Katowice Conference, Russian Jews established the Bilu and Hovevei Zion ("Lovers of Zion") movements with the aim of settling in Palestine. In 1878, Russian Jewish emigrants had established the village of Petah Tikva ("The Beginning of Hope"), followed by Rishon LeZion ("First to Zion") in 1882. The existing Ashkenazi communities, concentrated in the Four Holy Cities, were extremely poor and relied on donations (halukka) from groups abroad, while the new settlements were small farming communities that still relied on funding from the French Baron Edmond James de Rothschild, who sought to establish profitable enterprises. Many early migrants could not find work and left, but despite the problems, more settlements arose and the community grew. After the Ottoman conquest of Yemen in 1881, a large number of Yemenite Jews also emigrated to Palestine, often driven by Messianism. In 1896, Theodor Herzl published Der Judenstaat (The Jewish State), in which he asserted that the solution to growing antisemitism in Europe (the so-called "Jewish Question") was to establish a Jewish state. In 1897, the World Zionist Organization was founded, and the First Zionist Congress proclaimed its aim "to establish a home for the Jewish people in Palestine secured under public law." The Congress chose Hatikvah ("The Hope") as its anthem. Between 1904 and 1914, around 40,000 Jews settled in the area now known as Israel (the Second Aliyah). 
In 1908, the World Zionist Organization set up the Palestine Bureau (also known as the "Eretz Israel Office") in Jaffa and began to adopt a systematic Jewish settlement policy. In 1909, residents of Jaffa bought land outside the city walls and built the first entirely Hebrew-speaking town, Ahuzat Bayit (later renamed Tel Aviv). In 1915–1916, Talaat Pasha of the Young Turks forced around a million Armenian Christians from their homes in Eastern Turkey, marching them south through Syria, in what is now known as the Armenian genocide. The number of dead is thought to be around 700,000, and hundreds of thousands were forcibly converted to Islam. A community of survivors settled in Jerusalem, one of whom developed the now iconic Armenian pottery. During World War I, most Jews supported the Germans because they were fighting the Russians, who were regarded as the Jews' main enemy. In Britain, the government sought Jewish support for the war effort for a variety of reasons, including an antisemitic perception of "Jewish power" in the Ottoman Empire's Young Turks movement, which was based in Thessaloniki, the most Jewish city in Europe (40% of its 160,000 inhabitants were Jewish). The British also hoped to secure American Jewish support for US intervention on Britain's behalf, and there was already sympathy for the aims of Zionism in the British government, including from the Prime Minister, Lloyd George. Over 14,000 Jews were expelled from the Jaffa area by the Ottoman military commander in 1914–1915, owing to suspicions that they were subjects of Russia, an enemy power, or Zionists wishing to detach Palestine from the Ottoman Empire. When the entire population of both Jaffa and Tel Aviv, including Muslims, was subjected to an expulsion order in April 1917, the affected Jews could not return until the British conquest, which drove the Turks out of Southern Syria, was completed in 1918. A year earlier, in 1917, the British Foreign Secretary, Arthur Balfour, had sent a public letter to the British Lord Rothschild, a leading member of his party and a leader of the Jewish community. The letter subsequently became known as the Balfour Declaration. It stated that the British Government "view[ed] with favour the establishment in Palestine of a national home for the Jewish people". The declaration provided the British government with a pretext for claiming and governing the country. New Middle Eastern boundaries were decided by an agreement between British and French bureaucrats. A Jewish Legion composed largely of Zionist volunteers organized by Ze'ev Jabotinsky and Joseph Trumpeldor participated in the British invasion; its forerunner, the Zion Mule Corps, had taken part in the failed Gallipoli Campaign. The Nili Zionist spy network provided the British with details of Ottoman plans and troop concentrations. The Ottoman Empire had chosen to ally itself with Germany when the First World War began. Arab leaders dreamed of freeing themselves from Ottoman rule and establishing self-government or forming an independent Arab state, so Britain contacted Hussein bin Ali of the Kingdom of Hejaz and proposed cooperation. Together they organized the Arab revolt, which Britain supplied with very large quantities of rifles and ammunition. In an operation combining British artillery and Arab infantry, the city of Aqaba on the Red Sea was taken. The Arab army then continued north while Britain attacked the Ottomans from the sea, and in 1917–1918 Jerusalem and Damascus were captured from the Ottomans. Britain then broke off cooperation with the Arab army. 
It turned out that Britain had already entered into the secret Sykes–Picot Agreement, under which only Britain and France would be allowed to administer the land conquered from the Ottoman Empire. After the Ottomans were pushed out, Palestine came under martial law: the British, French and Arab Occupied Enemy Territory Administration governed the area from shortly before the armistice with the Ottomans until the promulgation of the Mandate in 1920. Mandatory Palestine The British Mandate (in effect, British rule) over Palestine, including the Balfour Declaration, was confirmed by the League of Nations in 1922 and came into effect in 1923. The territory of Transjordan was also covered by the Mandate, but under separate rules that excluded it from the Balfour Declaration. Britain signed a treaty with the United States (which did not join the League of Nations) in which the United States endorsed the terms of the Mandate; the treaty was approved unanimously by both the U.S. Senate and House of Representatives. The Balfour Declaration had been published on 2 November 1917, and the Bolsheviks seized control of Russia a week later, leading to civil war in the Russian Empire. Between 1918 and 1921, a series of pogroms led to the deaths of at least 100,000 Jews (mainly in what is now Ukraine) and the displacement as refugees of a further 600,000. This led to further migration to Palestine. Between 1919 and 1923, some 40,000 Jews arrived in Palestine in what is known as the Third Aliyah. Many of the Jewish immigrants of this period were Socialist Zionists and supported the Bolsheviks. The migrants, known as pioneers (halutzim), were experienced or trained in agriculture and established self-sustaining communes called kibbutzim. Malarial marshes in the Jezreel Valley and Hefer Plain were drained and converted to agricultural use. Land was bought by the Jewish National Fund, a Zionist charity that collected money abroad for that purpose. After the French victory over the Arab Kingdom of Syria ended hopes of Arab independence, there were clashes between Arabs and Jews in Jerusalem during the 1920 Nebi Musa riots and in Jaffa the following year, leading to the establishment of the Haganah, an underground Jewish militia. A Jewish Agency was created, which issued the entry permits granted by the British and distributed funds donated by Jews abroad. Between 1924 and 1929, over 80,000 Jews arrived in the Fourth Aliyah, fleeing antisemitism and the heavy tax burdens imposed on trade in Poland and Hungary, inspired by Zionism, and prompted by the closure of United States borders under the Immigration Act of 1924, which severely limited immigration from Eastern and Southern Europe. Pinhas Rutenberg, a former Commissar of St Petersburg in Russia's pre-Bolshevik Kerensky Government, built the first electricity generators in Palestine. In 1925, the Jewish Agency established the Hebrew University in Jerusalem and the Technion (technological university) in Haifa. British authorities introduced the Palestine pound (divided into 1,000 "mils") in 1927, replacing the Egyptian pound as the unit of currency in the Mandate. From 1928, the democratically elected Va'ad Leumi (Jewish National Council, or JNC) became the main administrative institution of the Palestine Jewish community (Yishuv) and included non-Zionist Jews. As the Yishuv grew, the JNC adopted more government-type functions, such as education, health care, and security. With British permission, the Va'ad Leumi raised its own taxes and ran independent services for the Jewish population. 
In 1929, tensions grew over the Kotel (Wailing Wall), the holiest site in modern Judaism, which was then a narrow alleyway where the British banned Jews from using chairs or curtains: many of the worshippers were elderly and needed seats, and they also wanted to separate women from men. The Mufti of Jerusalem claimed the site was Muslim property and deliberately had cattle driven through the alley. He alleged that the Jews were seeking control of the Temple Mount. This provided the spark for the August 1929 Palestine riots. The main victims were the (non-Zionist) ancient Jewish community of Hebron, who were massacred. The riots led to right-wing Zionists establishing their own militia in 1931, the Irgun Tzvai Leumi (National Military Organization, known in Hebrew by its acronym "Etzel"), which was committed to a more aggressive policy towards the Arab population. During the interwar period, the perception grew that there was an irreconcilable tension between the Mandate's two functions: providing for a Jewish homeland in Palestine and preparing the country for self-determination. The British rejected the principle of majority rule or any other measure that would give the Arab population, who formed the majority, control over Palestinian territory. Between 1929 and 1938, 250,000 Jews arrived in Palestine (the Fifth Aliyah). In 1933, the Jewish Agency and the Nazis negotiated the Ha'avara Agreement (transfer agreement), under which 50,000 German Jews would be transferred to Palestine. The Jews' possessions were confiscated, and in return the Nazis allowed the Ha'avara organization to purchase 14 million pounds' worth of German goods for export to Palestine and use them to compensate the immigrants. Although many Jews wanted to leave Nazi Germany, the Nazis prevented Jews from taking any money and restricted them to two suitcases, so few could pay the British entry tax. The agreement was controversial, and the Labour Zionist leader who negotiated it, Haim Arlosoroff, was assassinated in Tel Aviv in 1933. The assassination was used by the British to create tension between the Zionist left and the Zionist right. Arlosoroff had been the boyfriend of Magda Ritschel some years before she married Joseph Goebbels; there has been speculation that he was assassinated by the Nazis to hide the connection, but there is no evidence for this. Between 1933 and 1936, 174,000 Jews arrived despite the large sums the British demanded for immigration permits: families with capital had to prove they had 1,000 pounds (equivalent to £85,824 in 2023), professionals 500 pounds, and skilled labourers 250 pounds. Jewish immigration and Nazi propaganda contributed to the large-scale 1936–1939 Arab revolt in Palestine, a largely nationalist uprising directed at ending British rule. The head of the Jewish Agency, Ben-Gurion, responded to the Arab revolt with a policy of "Havlagah"—self-restraint and a refusal to be provoked by Arab attacks, in order to prevent polarization. The Etzel broke away from the Haganah in opposition to this policy. The British responded to the revolt with the Peel Commission (1936–37), a public inquiry that recommended that an exclusively Jewish territory be created in the Galilee and on the western coast (including the population transfer of 225,000 Arabs), with the rest becoming an exclusively Arab area. 
The two main Jewish leaders, Chaim Weizmann and David Ben-Gurion, had convinced the Zionist Congress to give equivocal approval to the Peel recommendations as a basis for further negotiation. The plan was rejected outright by the Palestinian Arab leadership, who renewed the revolt, which caused the British to abandon the plan as unworkable. Testifying before the Peel Commission, Weizmann said, "There are in Europe 6,000,000 people ... for whom the world is divided into places where they cannot live and places where they cannot enter." In 1938, the US called an international conference to address the question of the vast numbers of Jews trying to escape Europe. Britain made its attendance contingent on Palestine being kept out of the discussion. No Jewish representatives were invited. The Nazis proposed their own solution: that the Jews of Europe be shipped to Madagascar (the Madagascar Plan). The conference proved fruitless, and the Jews remained stuck in Europe. With millions of Jews trying to leave Europe and every country closed to Jewish migration, the British decided to close Palestine. The White Paper of 1939 recommended that an independent Palestine, governed jointly by Arabs and Jews, be established within 10 years. It agreed to allow 75,000 Jewish immigrants into Palestine over the period 1940–44, after which further migration would require Arab approval. Both the Arab and Jewish leaderships rejected the White Paper. In March 1940, the British High Commissioner for Palestine issued an edict banning Jews from purchasing land in 95% of Palestine. Jews now resorted to illegal immigration (Aliyah Bet, or "Ha'apalah"), often organized by the Mossad Le'aliyah Bet and the Irgun. With no outside help and no countries ready to admit them, very few Jews managed to escape Europe between 1939 and 1945; those caught by the British were mostly imprisoned in Mauritius. During the Second World War, the Jewish Agency worked to establish a Jewish army that would fight alongside the British forces. Churchill supported the plan, but British military and government opposition led to its rejection; the British demanded that the number of Jewish recruits match the number of Arab recruits. In June 1940, Italy declared war on the British Commonwealth and sided with Germany. Within a month, Italian planes bombed Tel Aviv and Haifa, inflicting multiple casualties. In May 1941, the Palmach was established to defend the Yishuv against the planned Axis invasion through North Africa. The British refusal to provide arms to the Jews, even when Rommel's forces were advancing through Egypt in June 1942 (intent on occupying Palestine), together with the 1939 White Paper, led to the emergence of a Zionist leadership in Palestine that believed conflict with Britain was inevitable. Despite this, the Jewish Agency called on Palestine's Jewish youth to volunteer for the British Army; 30,000 Palestinian Jews and 12,000 Palestinian Arabs enlisted in the British armed forces during the war. In June 1944, the British agreed to create a Jewish Brigade that would fight in Italy. Approximately 1.5 million Jews around the world served in every branch of the Allied armies, mainly in the Soviet and US armies; 200,000 Jews died serving in the Soviet army alone. A small group of about 200 activists dedicated to resisting the British administration in Palestine broke away from the Etzel (which advocated support for Britain during the war) and formed the "Lehi" (Stern Gang), led by Avraham Stern. 
In 1942, the USSR released the Revisionist Zionist leader Menachem Begin from the Gulag, and he went to Palestine, taking command of the Etzel organization with a policy of intensified conflict against the British. At about the same time, Yitzhak Shamir escaped from the camp in Eritrea where the British were holding Lehi activists without trial, and took command of the Lehi (Stern Gang). Jews in the Middle East were also affected by the war: most of North Africa came under Nazi control and many Jews there were used as slave labour, and the 1941 pro-Axis coup in Iraq was accompanied by massacres of Jews. The Jewish Agency put together plans for a last stand in the event of Rommel invading Palestine (the Nazis planned to exterminate Palestine's Jews). Between 1939 and 1945, the Nazis, aided by local forces, led systematic efforts to kill every person of Jewish extraction in Europe (the Holocaust), causing the deaths of approximately 6 million Jews; a quarter of those killed were children. The Polish and German Jewish communities, which had played an important role in defining the pre-1945 Jewish world, mostly ceased to exist. In the United States and Palestine, Jews of European origin became disconnected from their families and roots. As the Holocaust mainly affected Ashkenazi Jews, the Sephardi and Mizrahi Jews, who had been a minority, became a much more significant factor in the Jewish world. Those Jews who survived in central Europe were displaced persons (refugees); an Anglo-American Committee of Inquiry, established to examine the Palestine issue, surveyed their ambitions and found that over 95% wanted to migrate to Palestine. In the Zionist movement, the moderate pro-British (and British citizen) Weizmann, whose son had died flying in the RAF, was undermined by Britain's anti-Zionist policies, and leadership of the movement passed to the Jewish Agency in Palestine, now dominated by the anti-British Socialist-Zionist party (Mapai) under David Ben-Gurion. The British Empire had been severely weakened by the war, and in the Middle East the war had made Britain conscious of its dependence on Arab oil. Shortly after VE Day, the Labour Party won the general election in Britain. Although Labour Party conferences had for years called for the establishment of a Jewish state in Palestine, the Labour government now decided to maintain the 1939 White Paper policies. Illegal migration (Aliyah Bet) became the main form of Jewish entry into Palestine. Across Europe, Bricha ("flight"), an organization of former partisans and ghetto fighters, smuggled Holocaust survivors from Eastern Europe to Mediterranean ports, where small boats tried to breach the British blockade of Palestine. Meanwhile, Jews from Arab countries began moving into Palestine overland. Despite British efforts to curb immigration, over 110,000 Jews entered Palestine during the 14 years of the Aliyah Bet. By the end of World War II, the Jewish population of Palestine had increased to 33% of the total population. In an effort to win independence, Zionists now waged a guerrilla war against the British. The main underground Jewish militia, the Haganah, formed an alliance called the Jewish Resistance Movement with the Etzel and the Stern Gang to fight the British. In June 1946, following instances of Jewish sabotage such as the Night of the Bridges, the British launched Operation Agatha, arresting 2,700 Jews, including the leadership of the Jewish Agency, whose headquarters were raided. Those arrested were held without trial. 
On 4 July 1946, a massive pogrom in Poland led to a wave of Holocaust survivors fleeing Europe for Palestine. Three weeks later, the Irgun bombed the British military headquarters in the King David Hotel in Jerusalem, killing 91 people. In the days following the bombing, Tel Aviv was placed under curfew and over 120,000 Jews, nearly 20% of the Jewish population of Palestine, were questioned by the police. In the US, Congress criticized British handling of the situation and considered delaying loans that were vital to British post-war recovery. The alliance between the Haganah and the Etzel was dissolved after the King David bombing. Between 1945 and 1948, 100,000–120,000 Jews left Poland. Their departure was largely organized by Zionist activists under the umbrella of the semi-clandestine organization Berihah ("Flight"). Berihah was also responsible for the organized emigration of Jews from Romania, Hungary, Czechoslovakia and Yugoslavia, bringing the total (including Poland) to 250,000 Holocaust survivors. The British imprisoned the Jews trying to enter Palestine in the Atlit detainee camp and the Cyprus internment camps. Those held were mainly Holocaust survivors, including large numbers of children and orphans. In response to Cypriot fears that the Jews would never leave, and because the 75,000 quota established by the 1939 White Paper had never been filled, the British allowed the refugees to enter Palestine at a rate of 750 per month. On 2 April 1947, the United Kingdom requested that the question of Palestine be handled by the General Assembly. The General Assembly created a committee, the United Nations Special Committee on Palestine (UNSCOP), to report on "the question of Palestine". In July 1947, UNSCOP visited Palestine and met with Jewish and Zionist delegations; the Arab Higher Committee boycotted the meetings. During the visit, the British Foreign Secretary, Ernest Bevin, ordered that passengers from an Aliyah Bet ship, SS Exodus 1947, be sent back to Europe. The Holocaust survivors aboard the ship were forcibly removed by British troops at Hamburg, Germany. The principal non-Zionist Orthodox Jewish (or Haredi) party, Agudat Israel, recommended to UNSCOP that a Jewish state be set up, after reaching a religious status quo agreement with Ben-Gurion. The agreement granted an exemption from military service to a quota of yeshiva (religious seminary) students and to all Orthodox women, made the Sabbath the national weekend, guaranteed kosher food in government institutions, and allowed Orthodox Jews to maintain a separate education system. The majority report of UNSCOP proposed "an independent Arab State, an independent Jewish State, and the City of Jerusalem", the last to be under "an International Trusteeship System". On 29 November 1947, in Resolution 181 (II), the General Assembly adopted the majority report of UNSCOP, with slight modifications. The Plan also called for the British to allow "substantial" Jewish migration by 1 February 1948. Neither Britain nor the UN Security Council took any action to implement the resolution's recommendation, and Britain continued detaining Jews attempting to enter Palestine. Concerned that partition would severely damage Anglo-Arab relations, Britain denied UN representatives access to Palestine during the period between the adoption of Resolution 181 (II) and the termination of the British Mandate. The British withdrawal was completed in May 1948. 
However, Britain continued to hold Jewish immigrants of "fighting age" and their families on Cyprus until March 1949. The General Assembly's vote caused joy in the Jewish community and anger in the Arab community. Violence broke out between the sides, escalating into civil war. From January 1948, operations became increasingly militarized, with the intervention of a number of Arab Liberation Army regiments inside Palestine, each active in a variety of distinct sectors around the different coastal towns. They consolidated their presence in Galilee and Samaria. Abd al-Qadir al-Husayni came from Egypt with several hundred men of the Army of the Holy War. Having recruited a few thousand volunteers, he organized the blockade of the 100,000 Jewish residents of Jerusalem. The Yishuv tried to supply the city using convoys of up to 100 armoured vehicles, but largely failed. By March, almost all of the Haganah's armoured vehicles had been destroyed, the blockade was in full operation, and hundreds of Haganah members who had tried to bring supplies into the city had been killed. Up to 100,000 Arabs from the urban upper and middle classes of Haifa, Jaffa and Jerusalem, or from Jewish-dominated areas, evacuated abroad or to Arab centres to the east. This situation caused the US to withdraw its support for the partition plan, encouraging the Arab League to believe that the Palestinian Arabs, reinforced by the Arab Liberation Army, could put an end to the plan for partition. The British, on the other hand, decided on 7 February 1948 to support the annexation of the Arab part of Palestine by Transjordan; the Jordanian army was commanded by the British. David Ben-Gurion reorganized the Haganah and made conscription obligatory: every Jewish man and woman in the country had to receive military training. Thanks to funds raised by Golda Meir from sympathisers in the United States, and to Stalin's decision to support the Zionist cause, the Jewish representatives of Palestine were able to purchase significant quantities of arms in Eastern Europe. Ben-Gurion gave Yigael Yadin responsibility for planning for the announced intervention of the Arab states. The result of his analysis was Plan Dalet, under which the Haganah passed from the defensive to the offensive. The plan sought to establish Jewish territorial continuity by conquering mixed zones. Tiberias, Haifa, Safed, Beisan, Jaffa and Acre fell, resulting in the flight of more than 250,000 Palestinian Arabs. On 14 May 1948, the day the last British forces left Haifa, the Jewish People's Council gathered at the Tel Aviv Museum and proclaimed the establishment of a Jewish state, to be known as the State of Israel. State of Israel In 1948, following the 1947–1948 war in Mandatory Palestine, the Israeli Declaration of Independence sparked the 1948 Arab–Israeli War. This resulted in the 1948 Palestinian expulsion and flight from the land that the State of Israel came to control, and led to waves of Jewish immigration from other parts of the Middle East. The latter half of the 20th century saw further conflicts between Israel and its neighbouring Arab nations. In 1967, the Six-Day War erupted; in its aftermath, Israel captured and occupied the Golan Heights from Syria, the West Bank from Jordan, and the Gaza Strip and the Sinai Peninsula from Egypt. In 1973, the Yom Kippur War began with an attack by Egypt on the Israeli-occupied Sinai Peninsula. In 1979, the Egypt–Israel peace treaty was signed, based on the Camp David Accords. 
In 1993, Israel signed the Oslo I Accord with the Palestine Liberation Organization, which was followed by the establishment of the Palestinian National Authority. In 1994, the Israel–Jordan peace treaty was signed. Despite efforts to finalize the peace agreement, the conflict continues.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Military_budget_of_the_United_States] | [TOKENS: 7080] |
Contents Military budget of the United States The military budget of the United States is the largest portion of the discretionary federal budget allocated to the Department of Defense (DoD), or more broadly, the portion of the budget that goes to any military-related expenditures. It pays the salaries, training, and health care of uniformed and civilian personnel, maintains arms, equipment and facilities, funds operations, and develops and buys new items. The budget funds six branches of the US military: the Army, Navy, Marine Corps, Coast Guard, Air Force, and Space Force. Critics contend that recent U.S. defense budgets have disproportionately invested in long-term developmental programs instead of producing weapons systems needed in the near term. Budget for FY2026 As of May 2, 2025, the U.S. Department of Defense's (DoD) fiscal year 2026 (FY2026) budget request was $892.6 billion, maintaining near-flat nominal growth compared to FY2025 levels. On 11 June 2025, the House Armed Services Committee advanced the FY2026 National Defense Authorization Act (NDAA), proposing a total authorization of $925 billion, including $878.7 billion specifically for the Department of Defense. Debate in Congress focused on aligning resources to counter rising threats, particularly in the Indo-Pacific region, while balancing domestic fiscal pressures. The overall budget requested across the military departments was $961.6 billion, which included $197.4 billion for the Department of the Army, $292.2 billion for the Department of the Navy, $301.1 billion for the Department of the Air Force and $170.9 billion for defense-wide expenditure (these departmental lines sum to the stated topline, as the sketch at the end of this passage checks). Budget for FY2025 As of 11 March 2024 the US Department of Defense fiscal year 2025 (FY2025) budget request was $849.8 billion. On 20 December 2024 the House approved a Continuing Resolution to fund DoD and DoE operations at the FY2024 levels until 14 March 2025, at which time the Appropriations process for the NDAA was to be revisited by the 119th Congress. On 21 December 2024 the Senate approved the Continuing Resolution for President Biden's signature into law. Budget for FY2024 As of 10 March 2023 the fiscal year 2024 (FY2024) presidential budget request was $842 billion. In January 2023 Treasury Secretary Janet Yellen announced the US government would hit its $31.4 trillion debt ceiling on 19 January 2023; the date on which the US government would no longer be able to use extraordinary measures, such as issuance of Treasury securities, was estimated to be in June 2023. On 3 June 2023, the debt ceiling was suspended until 2025. After passing both houses on 27 July 2023, the $886 billion National Defense Authorization Act faced reconciliation of the House and Senate bills; the conferees had yet to be chosen. As of September 2023, a continuing resolution was needed to prevent a government shutdown; a shutdown was avoided on 30 September for 45 days (until 17 November 2023), and the NDAA passed on 14 December 2023. The Senate would next undertake negotiations on supplemental spending for 2024. A government shutdown was averted on 23 March 2024 with the signing of a $1.2 trillion bill to cover FY2024. 
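The FY2026 departmental figures quoted above lend themselves to a quick arithmetic check. A minimal sketch in Python (amounts are exactly as quoted, in billions of dollars; the dictionary labels are ours for illustration, not an official data structure):

```python
# FY2026 request by military department, in $ billions, as quoted above.
fy2026_request = {
    "Department of the Army": 197.4,
    "Department of the Navy": 292.2,
    "Department of the Air Force": 301.1,
    "Defense-wide": 170.9,
}

# The components should sum to the stated $961.6 billion topline.
topline = sum(fy2026_request.values())
print(f"Sum of departmental requests: ${topline:.1f}B")  # -> $961.6B

# Each department's share of the topline.
for dept, amount in fy2026_request.items():
    print(f"{dept}: {amount / topline:.1%}")
```

On these figures the Department of the Air Force carries the largest single share, at roughly 31 percent of the departmental topline.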
Budget for FY2023 As of March 2022, the defense department was operating under a continuing resolution, which constrains spending even though DoD has to respond to world events, such as the 2022 Russian invasion of Ukraine; the FY2023 defense budget request was expected to exceed $773 billion, according to the chairman of the House Armed Services Committee. By 9 March 2022 a bipartisan agreement on a $782 billion defense budget had been reached (as part of an overall $1.5 trillion budget for FY2022 – thus avoiding a government shutdown). As of 4 April 2022 the FY2023 presidential budget request of $773 billion included $177.5 billion for the Army, $194 billion for the Air Force and Space Force, and $230.8 billion for the Navy and Marine Corps (up 4.1% from the FY2022 request). As of 12 December 2022 the House and Senate versions of the FY2023 National Defense Authorization Act (FY2023 NDAA) stood at $839 billion and $847 billion for the HASC and SASC respectively, with a compromise $857.9 billion topline. The budget extension resolution then in force was due to expire on 16 December 2022. The President signed the FY2023 Appropriations bill on 23 December 2022. US military spending in 2021 reached $801 billion, according to the Stockholm International Peace Research Institute. Budget for FY2022 In May 2021, the President's defense budget request for FY2022 was $715 billion, up $10 billion from the $705 billion FY2021 request. The total FY2022 defense budget request, including the Department of Energy, was $753 billion, up $12 billion from FY2021's request. On 22 July 2021 the Senate Armed Services Committee approved a budget $25 billion greater than the President's request. The National Defense Authorization Act, budgeting $740 billion for defense, was signed 27 December 2021. By military department, the Army's portion of the budget request, $173 billion, dropped $3.6 billion from the enacted FY2021 budget; the Department of the Navy's portion, $211.7 billion, rose 1.8% from the enacted FY2021 budget, largely due to a 6% increase for the Marine Corps' restructuring into a littoral combat force (Navy request: $163.9 billion, or just 0.6% over FY2021; Marine Corps request: $47.9 billion, a 6.2% increase over FY2021); the Air Force's $156.3 billion request for FY2022 was a 2.3% increase over the FY2021 enacted budget; the Space Force budget of $17.4 billion was a 13.1% increase over the FY2021 enacted budget. Overseas contingency operations (OCOs) have been replaced by "direct war and enduring costs", which are migrated into the base budget. After the release of the FY2022 budget requests to Congress, the military departments also posted their Unfunded priorities/requirements lists for the Congressional Armed Services Committees. Budget for FY2021 For FY2021, the Department of Defense's discretionary budget authority was approximately $705.39 billion ($705,390,000,000). Mandatory spending of $10.77 billion plus Department of Energy and defense-related spending of $37.335 billion brought the total FY2021 defense budget to $753.5 billion. FY2021 was the last year for OCOs, reflected in the troop withdrawal from Afghanistan. Research, Development, Test, and Evaluation (RDT&E) investments for the future are offset by the OCO cuts, and by reduced procurement of legacy materiel. Budget for FY2020 For fiscal year 2020 (FY2020), the Department of Defense's budget authority was approximately $721.5 billion ($721,531,000,000). 
Approximately $712.6 billion is discretionary spending with approximately $8.9 billion in mandatory spending. The Department of Defense estimates that $689.6 billion ($689,585,000,000) will actually be spent (outlays). Both centrist and right-wing commentators have advocated cutting military spending. Budget for FY2019 For FY2019, the Department of Defense's budget authority was $693,058,000,000 (including discretionary and mandatory budget authority). In February 2018, the Pentagon requested $686 billion for FY2019. The John S. McCain National Defense Authorization Act authorized Department of Defense appropriations for 2019 and established policies, but it did not contain the budget itself. On 26 July, this bill passed in the House of Representatives by 359–54. On 1 August, the US Senate passed it by 87–10. The bill was presented to President Trump two days later. He signed it on 13 August. On 28 September 2018, Trump signed the Department of Defense appropriations bill. The approved 2019 Department of Defense discretionary budget was $686.1 billion. It has also been described as "$617 billion for the base budget and another $69 billion for war funding." Personnel payment and benefits take up approximately 39.14% of the total budget of $686,074,048,000. Overseas contingency operations (OCO) funds are sometimes called war funds. The Military Health System (MHS) offers, but does not always provide, a health care benefit to 9.5 million eligible beneficiaries, which includes active military members and their families, military retirees and their families, dependent survivors, and certain eligible reserve component members and their families. The unified medical budget (UMB), which comprises the funding and personnel needed to support the MHS' mission, consumes nearly 9% of the department's topline budget authority. Thus, it is a significant line item in the department's financial portfolio. Several recurring budget terms are worth distinguishing:
Budget authority: the authority to legally incur binding obligations (such as signing contracts and placing orders) that will result in current and future outlays. When the "military budget" is mentioned, people are generally referring to discretionary budget authority.
Outlays: also known as expenditures or disbursements; the liquidation of obligations, which generally represents cash payments.
Total obligational authority: a DoD financial term expressing the value of the direct defense program for a given fiscal year, exclusive of the obligation authority from other sources (such as reimbursable orders accepted).
Discretionary: annually appropriated by Congress, subject to budget caps.
Mandatory: budget authority authorized by permanent law.
Previous budgets As of 2013, the Department of Defense was the third largest executive branch department and utilized 20% of the federal budget. For the 2011 fiscal year, the president's base budget for the Department of Defense and spending on overseas contingency operations totaled $664.84 billion. When the budget was signed into law on 28 October 2009, the final size of the Department of Defense's budget was $680 billion, $16 billion more than President Obama had requested. An additional $37 billion supplemental bill to support the wars in Iraq and Afghanistan was expected to pass in the spring of 2010, but was delayed by the House of Representatives after passing the Senate. The military operations in Iraq and Afghanistan were largely funded through supplementary spending bills that supplemented the annual military budget requests for each fiscal year. 
However, the wars in Iraq and Afghanistan were categorized as overseas contingency operations beginning in fiscal year 2010, and the budget is included in the federal budget. By the end of 2008, the US had spent approximately $900 billion in direct costs on the wars in Iraq and Afghanistan. The government also incurred indirect costs, which include interest on additional debt and incremental costs, financed by the Veterans Affairs Department, of caring for more than 33,000 wounded. Some experts estimate the indirect costs will eventually exceed the direct costs. As of June 2011, the total cost of the wars was approximately $1.3 trillion. The federally budgeted (see below) military expenditure of the Department of Defense for fiscal year 2013 is as follows. Although figures are provided in the 2015 budget, the data for 2014 and 2015 are estimates, so figures are shown for the last year with definite data (2013). The Department of Defense's FY2011 $137.5 billion procurement and $77.2 billion RDT&E budget requests included several programs worth more than $1.5 billion. This does not include many military-related items that are outside of the Defense Department budget, such as nuclear weapons research, maintenance, cleanup, and production, which are in the Atomic Energy Defense Activities section, Veterans Affairs, the Treasury Department's payments in pensions to military retirees and widows and their families, interest on debt incurred in past wars, or State Department financing of foreign arms sales and militarily-related development assistance. Neither does it include defense spending that is domestic rather than international in nature, such as the Department of Homeland Security, counter-terrorism spending by the Federal Bureau of Investigation, and intelligence-gathering spending by the NSA, although these programs contain certain weapons, military and security components. Accounting for non-DoD military-related expenditure gives a total budget in excess of $1.4 trillion. On 16 March 2017 President Trump submitted his request to Congress for $639 billion in military spending: an increase of $54 billion, or 10%, for FY2018, as well as $30 billion for FY2017, which ended in September. With a total federal budget of $3.9 trillion for FY2018, the increase in military spending would result in deep cuts to many other federal agencies and domestic programs, as well as the State Department. Trump had pledged to "rebuild" the military as part of his 2016 presidential campaign. In April 2017, journalist Scot J. Paltrow raised concerns about the increase in spending given the Pentagon's history of "faulty accounting". On 14 July, the National Defense Authorization Act 2018 was passed by the US House of Representatives 344–81, with 8 not voting. 60% of Democrats voted for the bill, which represented an 18% increase in defense spending. Congress increased the budget to total $696 billion. The currently available budget request for 2017 was filed on 9 February 2016, under then-President Barack Obama. The press release of the proposal specifies the structure and goals for the FY2017 budget: The FY2017 budget reflects recent strategic threats and changes that have taken place in Asia, the Middle East and Europe. Russian aggression, terrorism by the Islamic State of Iraq and the Levant (ISIL) and others, and China's island building and claims of sovereignty in international waters all necessitate changes in our strategic outlook and in our operational commitments. 
Threats and actions originating in Iran and North Korea negatively affect our interests and our allies. These challenges have sharpened the focus of our planning and budgeting. The proposal also includes a comparison of the 2016 and proposed 2017 request amounts, a summary of acquisitions requested for 2017 and enacted in 2016, and a detailed breakdown of specific programs to be funded, with amounts in billions of dollars. The top 25 DoD weapon programs are described in detail, with quantity referring to the number of items requested. This program's purpose is to "invest in and develop capabilities that advance the technical superiority of the US military to counter new and emerging threats." It has a budget of $12.5 billion, but is separate from the overall Research, Development, Test, and Evaluation portfolio, which comprises $71.8 billion. Efforts funded apply to the Obama administration's refocusing of the US military to Asia, identifying investments to "sustain and advance [the] DoD's military dominance for the 21st century", counter the "technological advances of US foes", and support Manufacturing Initiative institutes. A breakdown of the amounts, by tier of research, is provided in thousands of dollars. This portion of the military budget comprises roughly one third to one half of the total defense budget, considering only military personnel or additionally including civilian personnel, respectively. These expenditures are typically the single largest expense category for the department. Since 2001, military pay and benefits have increased by 85%, but remained roughly one third of the total budget due to an overall increased budget. Military pay remains at about the 70th percentile compared to the private sector, to attract sufficient numbers of qualified personnel. The request for 2017 amounts to $48.8 billion. The system has 9.4 million beneficiaries, including active, retired, and eligible reserve component military personnel and their families, and dependent survivors. On 9 February 2016, the Department of Defense under President Obama released a statement outlining the proposed 2016 and 2017 defense spending budgets that "[reflect] the priorities necessary for our force today and in the future to best serve and protect our nation in a rapidly changing security environment." Again in 2011, the Government Accountability Office (GAO) could not "render an opinion on the 2011 consolidated financial statements of the federal government", with a major obstacle again being "serious financial management problems at the Department of Defense (DOD) that made its financial statements unauditable". In December 2011, the GAO found that "neither the Navy nor the Marine Corps have implemented effective processes for reconciling their FBWT", that is, their Fund Balance with Treasury. According to the GAO, "An agency's FBWT account is similar in concept to a corporate bank account. The difference is that instead of a cash balance, FBWT represents unexpended spending authority in appropriations." In addition, "As of April 2011, there were more than $22 billion unmatched disbursements and collections affecting more than 10,000 lines of accounting." The GAO was unable to provide an audit opinion on the 2010 financial statements of the US Government due to "widespread material internal control weaknesses, significant uncertainties, and other limitations." 
The GAO cited as the principal obstacle to its provision of an audit opinion "serious financial management problems at the Department of Defense that made its financial statements unauditable". In FY2010, six out of thirty-three DoD reporting entities received unqualified audit opinions. Robert F. Hale, Chief Financial Officer and Under Secretary of Defense, acknowledged enterprise-wide problems with systems and processes, while the DoD's Inspector General reported "material internal control weaknesses ... that affect the safeguarding of assets, proper use of funds, and impair the prevention and identification of fraud, waste, and abuse". Further management discussion in the FY2010 DoD Financial Report states "it is not feasible to deploy a vast number of accountants to manually reconcile our books" and concludes that "although the financial statements are not auditable for FY2010, the Department's financial managers are meeting warfighter needs". The accompanying graphs show that US military spending as a percent of gross domestic product (GDP) peaked during World War II. The table shows historical spending on defense from 1993 to 2025: the defense budget in billions of dollars, the total budget in trillions of dollars, the percentage of the total US federal budget spent on defense in the third row, and the change in defense spending from the previous year in the final row. Support service contractors The role of support service contractors has increased since 2001, and in 2007 payments for contractor services exceeded investments in equipment for the armed forces for the first time. Under the 2010 budget, support service contractors were to be reduced from 39 percent of the workforce to the pre-2001 level of 26 percent. In a Pentagon review of January 2011, service contractors were found to be "increasingly unaffordable." Military budget and total federal spending The Department of Defense budget accounted in FY2017 for about 14.8% of federal budgeted expenditures. According to the Congressional Budget Office, defense spending grew 9% annually on average in fiscal years 2000–2009. Because of constitutional limitations, military funding is appropriated in a discretionary spending account. (Such accounts permit government planners to have more flexibility to change spending each year, as opposed to mandatory spending accounts that mandate spending on programs in accordance with the law, outside of the budgetary process.) In recent years, discretionary spending as a whole has amounted to about one-third of total federal outlays. The Department of Defense's share of discretionary spending was 50.5% in 2003, and has risen to between 53% and 54% in recent years. For FY2017, Department of Defense spending amounts to 3.42% of GDP. Because the US GDP has grown over time, the military budget can rise in absolute terms while shrinking as a percentage of the GDP. For example, the Department of Defense budget was slated to be $664 billion in 2010 (including the cost of operations in Iraq and Afghanistan previously funded through supplementary budget legislation), higher than at any other point in American history, but still 1.1–1.4 percentage points lower as a share of GDP than military spending at the peak of the Cold War in the late 1980s. Admiral Mike Mullen, former Chairman of the Joint Chiefs of Staff, has called four percent an "absolute floor". 
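The share figures in this passage are simple ratios, and the point about a budget growing in dollars while shrinking relative to GDP is easy to see numerically. A minimal sketch (the discretionary-share inputs are the approximate figures quoted above; the GDP and outlay numbers in the second half are illustrative assumptions, not data from this article):

```python
def pct(part: float, whole: float) -> float:
    """Return part as a percentage of whole."""
    return 100.0 * part / whole

# DoD's share of total spending via its share of discretionary spending:
# roughly 53-54% of discretionary, with discretionary about one third of
# total outlays. The product lands near the quoted overall share.
print(f"{0.535 * (1 / 3) * 100:.1f}%")  # ~17.8%, same ballpark as the 14.8%
# (Budget authority vs. outlays and year-to-year drift explain the gap.)

# Rising dollars, falling share of GDP -- illustrative numbers only:
print(f"{pct(600, 17_500):.2f}% of GDP")  # $600B on a $17.5T economy: ~3.43%
print(f"{pct(650, 21_000):.2f}% of GDP")  # $650B on a $21T economy:  ~3.10%
```

The second pair of lines shows how a budget can add $50 billion yet fall by a third of a percentage point as a share of GDP once the economy grows faster than the budget does.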
This calculation does not take into account some other military-related non-DoD spending, such as Veterans Affairs, Homeland Security, and interest paid on debt incurred in past wars, which has increased even as a percentage of the national GDP. In 2015, Pentagon and related spending totaled $598 billion. In addition, the US will spend at least $179 billion over the fiscal years 2010–2018 on its nuclear arsenal, averaging $20 billion per year. Despite President Barack Obama's publicly stated aim of reducing the scope of the nuclear arms race, the US intends to spend an additional $1 trillion over the next 30 years modernizing its nuclear arsenal. In September 2017 the Senate followed President Donald Trump's plan to expand military spending, boosting spending to $700 billion, about 91.4% of which is to be spent on maintaining the armed forces and primary Pentagon costs. Military spending is increasing regularly, and more money is being spent every year on employee pay, operation and maintenance, and benefits including health benefits. Methods to counteract rapidly increasing spending include closing bases, but base closures were barred by the Bipartisan Budget Act of 2013. The War Resisters League (WRL) has been publishing yearly federal budget breakdowns (since 2001 or earlier) which show that military-related spending is a much larger part of the US federal budget than typically reported by official sources. For example, for FY2024, WRL claims that military-related spending makes up 43% of the US budget. Federal waste As of September 2014, the Department of Defense was estimated to have "$857 million in excess parts and supplies". This figure has risen over the past years, and of the Pentagon waste that has been calculated, two figures are especially worth mentioning: the expenditure of "$150 million on private villas for a handful of Pentagon employees in Afghanistan and the procurement of the JLENS air-defense balloon" which, throughout the program's development over the past two decades, is estimated to have cost $2.7 billion. Comparison with other countries The US spends more on national defense than China, India, Russia, Saudi Arabia, France, Germany, United Kingdom, Japan, South Korea, and Brazil combined. The 2018 US military budget accounts for approximately 36% of global arms spending (for comparison, US GDP is only 24% of global GDP). The 2018 budget is approximately 2.5 times larger than the $250 billion military budget of China. The US and its close allies are responsible for two-thirds to three-quarters of the world's military spending (of which, in turn, the US is responsible for the majority). The US also maintains the largest number of military bases on foreign soil in the world. While there are no freestanding foreign bases permanently located in the US, there are now around 800 US bases in foreign countries. Military spending makes up nearly 16% of entire federal spending and approximately half of discretionary spending. In a general sense, discretionary spending (defense and non-defense) makes up one-third of the annual federal budget. In 2016, the US spent 3.29% of its GDP on its military (considering only basic Department of Defense budget spending), more than France's 2.26% and less than Saudi Arabia's 9.85%. This is historically low for the US: the share peaked in 1944 at 37.8% of GDP and reached its lowest point of 3.0% in 1999–2001. Even during the peak of the Vietnam War the percentage reached a high of 9.4% in 1968. 
In 2018, the US spent 3.2% of its GDP on its military, while Saudi Arabia spent 8.8%, Israel spent 4.3%, Pakistan spent 4.0%, Russia spent 3.9%, South Korea spent 2.6%, China spent 1.9%, the United Kingdom spent 1.8%, and Germany spent 1.2% of its GDP on defense. The US military budget plateaued around 2009, but is still considerably larger than that of any other military power. Past commentary on military budget In 2009 Robert Gates, then Secretary of Defense, wrote that the US should adjust its priorities and spending to address the changing nature of threats in the world: "What all these potential adversaries—from terrorist cells to rogue nations to rising powers—have in common is that they have learned that it is unwise to confront the United States directly on conventional military terms. The United States cannot take its current dominance for granted and needs to invest in the programs, platforms, and personnel that will ensure that dominance's persistence. But it is also important to keep some perspective. As much as the US Navy has shrunk since the end of the Cold War, for example, in terms of tonnage, its battle fleet is still larger than the next 13 navies combined—and 11 of those 13 navies are US allies or partners." Secretary Gates announced some of his budget recommendations in April 2009. According to a 2009 Congressional Research Service report, the budget was declining as a percentage of GDP even though the responsibilities of the DoD had not decreased, and additional pressures on the military budget had arisen from broader missions in the post-9/11 world, dramatic increases in personnel and operating costs, and new requirements resulting from wartime lessons in the Iraq War and Operation Enduring Freedom. Expenses for fiscal years 2001 through 2010 were analyzed by Russell Rumbaugh, a retired Army officer and ex-CIA military analyst, in a report for the Stimson Center. Rumbaugh wrote: "Between 1981 and 1990, the Air Force bought 2,063 fighters. In contrast, between 2001 and 2010, it bought only 220. Yet between 2001 and 2010 the Air Force spent $38B of procurement funding just on fighter aircraft in inflation-adjusted dollars, compared with the $68B it spent between 1981 and 1990. In other words, the Air Force spent 55 percent as much money to get 10 percent as many fighters." (The arithmetic behind those percentages is checked in the sketch at the end of this passage.) As Adam Weinstein summarized one of the report's findings: "Of the roughly $1 trillion spent on gadgetry since 9/11, 22 percent of it came from supplemental war funding – annual outlays that are voted on separately from the regular defense budget." Most of the $5 billion in budget cuts for 2013 that were mandated by Congress in 2012 merely shifted expenses from the general military budget to the Afghanistan war budget. Declaring that nearly 65,000 troops were temporary rather than part of the permanent forces resulted in the reallocation of $4 billion in existing expenses to this different budget. In May 2012, as part of Obama's East Asia "pivot", his 2013 national military request moved funding from the Army and Marines to favor the Navy, but Congress resisted this. Reports emerged in February 2014 that Secretary of Defense Chuck Hagel was planning to trim the defense budget by billions of dollars. In his first defense budget, the secretary planned to limit pay rises, increase fees for healthcare benefits, freeze the pay of senior officers, reduce military housing allowances, and reduce the size of the force. 
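Here is the worked check of the Stimson-report comparison quoted above, a minimal sketch using the figures exactly as cited (inflation-adjusted billions of dollars and aircraft counts):

```python
# Fighter procurement, as cited in the Stimson report quoted above.
spend_1981_1990, fighters_1981_1990 = 68.0, 2063   # $68B, 2,063 fighters
spend_2001_2010, fighters_2001_2010 = 38.0, 220    # $38B, 220 fighters

money_ratio = spend_2001_2010 / spend_1981_1990          # ~0.56
fighter_ratio = fighters_2001_2010 / fighters_1981_1990  # ~0.11

print(f"Spending ratio: {money_ratio:.0%}")    # 56% -- the report says "55 percent"
print(f"Fighter ratio:  {fighter_ratio:.0%}")  # 11% -- the report says "10 percent"

# Implied average procurement cost per fighter, in $ millions:
print(f"1981-1990: ${spend_1981_1990 / fighters_1981_1990 * 1000:.0f}M")  # ~$33M
print(f"2001-2010: ${spend_2001_2010 / fighters_2001_2010 * 1000:.0f}M")  # ~$173M
```

On the cited figures, the implied unit cost rose roughly fivefold between the two decades, which is the substance of Rumbaugh's point.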
In July 2014, American Enterprise Institute scholar Michael Auslin opined in the National Review that the Air Force needs to be fully funded as a priority, due to the air superiority, global airlift, and long-range strike capabilities it provides. In January 2015, the Defense Department published its internal study on how to save $125 billion on its military budget from 2016 to 2020 by renegotiating vendor contracts and pushing for stronger deals, and by offering workers early retirement and retraining. On 5 December 2012, the Department of Defense announced it was planning for the automatic spending cuts of the fiscal cliff, which would add $500 billion in reductions to the $487 billion already required by the 2011 Budget Control Act. According to Politico, the Department of Defense declined to explain to the House of Representatives Appropriations Committee, which controls federal spending, what its plans were regarding the fiscal cliff planning. This came after half a dozen members of Congress with deep experience in military matters either left Congress or lost their reelection fights, including Joe Lieberman (I-CT). Lawrence Korb has noted that, given recent trends, military entitlements and personnel costs would take up the entire defense budget by 2039. GAO audits The GAO was unable to provide an audit opinion on the 2010 financial statements of the US government due to "widespread material internal control weaknesses, significant uncertainties, and other limitations." The GAO cited as the principal obstacle to its provision of an audit opinion "serious financial management problems at the Department of Defense that made its financial statements unauditable." In FY2011, seven out of 33 DoD reporting entities received unqualified audit opinions. Under Secretary of Defense Robert F. Hale acknowledged enterprise-wide weaknesses with controls and systems. Further management discussion in the FY2011 DoD Financial Report states "we are not able to deploy the vast numbers of accountants that would be required to reconcile our books manually". Congress has established a deadline of FY2017 for the DoD to achieve audit readiness. For FYs 1998–2010 the Department of Defense's financial statements were either unauditable or such that no audit opinion could be expressed. Several years behind other government agencies, the first results from an army of about 2,400 contracted DoD auditors were expected on 15 November 2018. Post–World War II overview and reform The conclusion of World War II and the start of the Cold War prompted the rapid expansion of an arms race. Subsequently, budget reallocations prompted by several wars and proxy wars forced the Department of Defense to increase research and development of new military systems and equipment, produced on a mass scale to compete with the Soviet Union. On 17 January 1961, then-President Dwight D. Eisenhower, in a farewell address, warned the people and government about the creation of a "military-industrial complex". In Eisenhower's telling, war had arguably become an industry, and he speculated that the arms industry would bring war-like industrial influence into the various sectors of government. He stated: "In the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex. The potential for the disastrous rise of misplaced power exists and will persist." 
Following the departure of President Eisenhower, the expenditures and budgets of the US military grew rapidly. The Cold War (1947–1991) saw the largest proliferation of nuclear arsenals to date. New defense contractors stood up to supply the demand from the military and its various conflicts across the globe. In addition, the Vietnam War was the largest expenditure during the Cold War, at approximately $168 billion, or about $1 trillion in inflation-adjusted terms. In a statement of 6 January 2011, Defense Secretary Robert M. Gates stated: "This department simply cannot risk continuing down the same path – where our investment priorities, bureaucratic habits and lax attitude towards costs are increasingly divorced from the real threats of today, the growing perils of tomorrow and the nation's grim financial outlook." Gates proposed a budget that, if approved by Congress, would reduce the costs of many DoD programs and policies, including reports, the IT infrastructure, fuel, weapon programs, DoD bureaucracies, and personnel. The projected FY2015 expenditure for Army research, development and acquisition fell from $32 billion (as projected in 2012) to $21 billion (as expected in 2014). In 2018, it was announced that the Department of Defense was the subject of a comprehensive budgetary audit, conducted by private, third-party accounting consultants. The audit was ultimately deemed incomplete due to deficient accounting practices in the department. In FY2022, the US had the largest defense budget and expenditures of any country in the world, totaling around $777.1 billion. The rise in the military budget over the last decade can be traced to the production of new technologies, such as 5th generation fighter aircraft, to meet the increase in demand for new combat capabilities. Many of these costs were the result of research and development (R&D), one of the primary focuses of the US defense budget. Opponents of growing military budgets have long argued that the US should refocus and reallocate them to promote social welfare. However, the projections for the near future are that the defense budget and its expenditures will only continue to grow. In the published FY2022 budget report, authority was given to increase the defense budget by about $17 billion from FY2021 ($535 billion of which is part of contract obligations). Conversely, proponents of increasing US defense budgets have long argued that adversaries of the US, such as China, must be kept in check from a military standpoint.
======================================== |
[SOURCE: https://www.theverge.com/streaming] | [TOKENS: 1723] |
Streaming Established streaming industry leaders like Netflix and Amazon are facing more competition than ever. Now legacy entertainment giants are in the game with their own subscription services, like Peacock, HBO Max, Paramount Plus, and the Disney Plus / Hulu / ESPN Plus bundle, while Apple TV Plus attacks around the edges. Meanwhile, the rise of ad-supported free platforms like Roku Channel and Pluto TV has attracted enough attention that Plex, YouTube, and Amazon’s Freevee are trying to get a chunk of the action too. Now that Andor has come to an end, series creator Tony Gilroy is free to speak more openly about what it was like working for Disney, and in a recent interview with The Hollywood Reporter he says that the studio was very insistent on him not using the word “fascism” while talking about his show focused on fighting fascism. [The Hollywood Reporter] I Am Frankelda — co-writer / directors Arturo and Roy Ambriz’s stop motion dark fantasy film about a girl with a strange connection to another dimension — has been acquired by Netflix and is slated to debut on the streamer sometime later this year. HBO’s medical drama has been teasing out a smart story about what makes gen AI so tempting and concerning. YouTube is starting to test its conversational AI tool with a “small group of users” on smart TVs, gaming consoles, and streaming devices. The tool, first introduced in 2023, lets you ask questions about the videos you’re watching. [YouTube Help] Though HBO still hasn’t announced a firm release date for House of the Dragon’s upcoming third season, there’s a new trailer teasing out Rhaenyra’s plan to make her enemies pay using her squad of newly-tamed dragons. The new season drops some time in June. A partial YouTube outage knocked out access to Google’s video service on Tuesday night. The outage appears to have started just before 8PM ET, but at least on the homepage, it appears to be resolved now. A note on YouTube’s support page says it went down due to problems with the recommendations system: “The issue with our recommendations system has been resolved and all of our platforms (YouTube.com, the YouTube app, YouTube Music, Kids, and TV) are back to normal!” Update: The service is back online. Starting on February 18th, Dropout fans will be able to drop into a new 24/7, ad-free livestream channel on the comedy-focused streaming service. “Dropout 24/7” will run all of its content, including the TTRPG series Dimension 20, improv comedy shows Game Changer and Make Some Noise, and more, starting with a Dimension 20 marathon. Just after we entered the courtroom, we learned that a juror has been hospitalized. The parties decided to postpone today’s testimony from former Meta employees to see if the juror can return. Regardless, Meta CEO Mark Zuckerberg is expected to testify tomorrow — either before the original juror, or an alternate. The latest trailer for The Mandalorian and Grogu really highlights how very, very young Grogu still is for one of Yoda’s species, which makes it seem that much more absurd that Din Djarin still hasn’t found his son a helmet to protect that (presumably) soft head. One night in the audience of Netflix’s most ambitious live show yet. During the company’s Q4 earnings call, Gustav Söderström revealed that Spotify had fully embraced vibe coding. AI is coming for a lot of jobs, and software developer is high on the list of those in danger. Still, it’s shocking that the top devs at Spotify haven’t written any code in 2026. 
Per Business Insider: “When I speak to my most senior engineers — the best developers we have — they actually say that they haven’t written a single line of code since December… They actually only generate code and supervise it.” - Spotify CEO Gustav Söderström In a cease and desist letter, Disney includes examples of the new Seedance 2.0 model making videos featuring characters like Spider-Man and Darth Vader, according to Axios. “ByteDance is hijacking Disney’s characters by reproducing, distributing, and creating derivative works featuring those characters,” Disney’s attorney said. [Axios] The platform isn’t the first service to try them. Now, Paramount is offering to cover the $2.8 billion termination fee that Warner Bros. Discovery would owe Netflix for abandoning the $82.7 billion merger agreement. It’s also tossing in a $0.25 per share “ticking fee” that it would pay shareholders for every quarter its deal hasn’t closed beyond December 31st, 2026. [The Hollywood Reporter] There’s a lot to see in Netflix’s new trailer for the live-action One Piece’s upcoming second season, but the most surprising reveal here is a fresh look at Tony Tony Chopper’s (Mikaela Hoover) Walk Point form that turns him into a much more normal-looking reindeer. The show’s out March 10th. Similar to the Prompted Playlists that Spotify launched in December, YouTube Music premium subscribers on iOS and Android can now use voice or text descriptions to turn ideas, genres, or vibes into personalized playlists. The next four episodes of Sesame Street are dropping soon on Netflix and will include a cameo from Miley Cyrus. Will they also go back to having a Letter and Number of the Day and a proper ending song? I won’t hold my breath, but I’ll still be bitter. Disney Plus subscribers in a few European countries have spotted that Dolby Vision, HDR10+, and 3D content have disappeared from the service. A Disney statement blames it on “technical challenges,” but FlatpanelsHD points out it may be linked to a patent dispute in Germany. [FlatpanelsHD] Just before kickoff, Disney and Pixar previewed their next CG family flick, Hoppers. As we’ve heard and seen in a teaser trailer, this one follows a world where people put their minds into 3D-printed versions of animals to try and live among them. It’s coming to theaters on March 6th, although the YouTube description mentions an early viewing window on February 28th. The $82.7 billion deal will force Paramount Plus, Disney Plus, Peacock, Apple TV, and other rivals to make some changes. Now that Boys Go to Jupiter is streaming on HBO Max, you should check out our interview with writer / director Julian Glander about the movie and why he feels that open-source software programs like Blender are doing way more to “democratize” art than generative AI. Planet Money TikToks inspired one of the year’s most brilliant animated movies Todd Harthan and Todd Helbing have joined the series as co-showrunners and executive producers, along with Marc Webb and Rachel Moore as executive producers, as Variety reports. Author Christopher Paolini will also be executive producing the adaptation of his YA fantasy series at Disney Plus, which is also home to the hit adaptation of Percy Jackson. [Variety] As expensive bundled packages, premium live sports pricing, and restrictive walled gardens proliferate, customers have turned to TV piracy instead. Now why do I feel like I’ve seen this one before? 
NotSoSavvy: Hey, the exact same problem that drove people to Torrenting comes up again! It’s weird that as soon as watching things legally becomes more of a hassle than watching them illegally, people start watching illegally again. It’s almost like people hate walled gardens and it’s generally in a company’s best interests to have wider, easier distribution they get a smaller cut from than try and make their own Netflix or something. Get the day’s best comment and more in my free newsletter, The Verge Daily. While Top Gear has spent years trying to replace Jeremy Clarkson, James May, and Richard Hammond with mainstream celebrities, Amazon is instead appeasing British broadcasting execs’ obsession with online content creators. The Grand Tour season 7 presenters are viral trainspotter Francis Bourgeois, alongside James Engelsman and Thomas Holland, who run the Throttle House YouTube channel. [Deadline]
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/BBC_News#Israeli–Palestinian_conflict] | [TOKENS: 8810] |
Contents BBC News BBC News is an operational business division of the British Broadcasting Corporation (BBC) responsible for the gathering and broadcasting of news and current affairs in the UK and around the world. The department is the world's largest broadcast news organisation and generates about 120 hours of radio and television output each day, as well as online news coverage. The service has over 5,500 journalists working across its output, including in 50 foreign news bureaus where more than 250 foreign correspondents are stationed. Deborah Turness has been the CEO of news and current affairs since September 2022. A 2019 Ofcom report found that the BBC spent £136m on news during the period April 2018 to March 2019. BBC News' domestic, global and online news divisions are housed within the largest live newsroom in Europe, in Broadcasting House in central London. Parliamentary coverage is produced and broadcast from studios in London. Through BBC English Regions, the BBC also has regional centres across England and national news centres in Northern Ireland, Scotland and Wales. All nations and English regions produce their own local news programmes and other current affairs and sport programmes. The BBC is a quasi-autonomous corporation authorised by royal charter, making it operationally independent of the government. As of 2024, the BBC reaches an average of 450 million people per week, with the BBC World Service accounting for 320 million people. History This is London calling – 2LO calling. Here is the first general news bulletin, copyright by Reuters, Press Association, Exchange Telegraph and Central News. — BBC news programme opening during the 1920s The British Broadcasting Company broadcast its first radio bulletin from radio station 2LO on 14 November 1922. Wishing to avoid competition, newspaper publishers persuaded the government to ban the BBC from broadcasting news before 7 pm, and to force it to use wire service copy instead of reporting on its own. The BBC gradually gained the right to edit the copy and, in 1934, created its own news operation. However, it could not broadcast news before 6 pm until World War II. In addition to news, Gaumont British and Movietone cinema newsreels had been broadcast on the TV service since 1936, with the BBC producing its own equivalent Television Newsreel programme from January 1948. A weekly Children's Newsreel was inaugurated on 23 April 1950, to around 350,000 receivers. The network began simulcasting its radio news on television in 1946, with a still picture of Big Ben. Televised bulletins began on 5 July 1954, broadcast from leased studios within Alexandra Palace in London. The public's interest in television and live events was stimulated by Elizabeth II's coronation in 1953. It is estimated that up to 27 million people viewed the programme in the UK, overtaking radio's audience of 12 million for the first time. Those live pictures were fed from 21 cameras in central London to Alexandra Palace for transmission, and then on to other UK transmitters opened in time for the event. That year, there were around two million TV licences held in the UK, rising to over three million the following year, and four and a half million by 1955. Television news, although physically separate from its radio counterpart, was still firmly under radio news' control in the 1950s. 
Correspondents provided reports for both outlets, and the first televised bulletin, shown on 5 July 1954 on the then BBC television service and presented by Richard Baker, had him providing narration off-screen while stills were shown. This was then followed by the customary Television Newsreel with a recorded commentary by John Snagge (and on other occasions by Andrew Timothy). On-screen newsreaders were introduced a year later in 1955 – Kenneth Kendall (the first to appear in vision), Robert Dougall, and Richard Baker – three weeks before ITN's launch on 21 September 1955. Mainstream television production had started to move out of Alexandra Palace in 1950 to larger premises – mainly at Lime Grove Studios in Shepherd's Bush, west London – taking Current Affairs (then known as the Talks Department) with it. It was from here that the first Panorama, a new documentary programme, was transmitted on 11 November 1953, with Richard Dimbleby becoming anchor in 1955. In 1958, Hugh Carleton Greene became head of News and Current Affairs. On 1 January 1960, Greene became Director-General. Greene made changes aimed at making BBC reporting more similar to its competitor ITN, which had been highly rated by study groups held by Greene. A newsroom was created at Alexandra Palace, and television reporters were recruited and given the opportunity to write and voice their own scripts, without having to cover stories for radio too. On 20 June 1960, Nan Winton, the first female BBC network newsreader, appeared in vision. 19 September 1960 saw the start of the radio news and current affairs programme The Ten O'Clock News. BBC2 started transmission on 20 April 1964 and began broadcasting a new show, Newsroom. The World at One, a lunchtime news programme, began on 4 October 1965 on the then Home Service; News Review had started on television the year before. News Review was a summary of the week's news, first broadcast on Sunday, 26 April 1964 on BBC 2 and harking back to the weekly Newsreel Review of the Week, produced from 1951 to open programming on Sunday evenings – the difference being that this incarnation had subtitles for the deaf and hard-of-hearing. As this was the decade before electronic caption generation, each superimposition ("super") had to be produced on paper or card, synchronised manually to studio and news footage, committed to tape during the afternoon, and broadcast early evening. Thus Sundays were no longer a quiet day for news at Alexandra Palace. On Sunday 17 September 1967, The World This Weekend, a weekly news and current affairs programme, launched on what was then the Home Service, but soon-to-be Radio 4. Preparations for colour began in the autumn of 1967 and on Thursday 7 March 1968 Newsroom on BBC2 moved to an early evening slot, becoming the first UK news programme to be transmitted in colour – from Studio A at Alexandra Palace. News Review and Westminster (the latter a weekly review of Parliamentary happenings) were "colourised" shortly after. However, much of the insert material was still in black and white, as initially only a part of the film coverage shot in and around London was on colour reversal film stock, and all regional and many international contributions were still in black and white. 
Colour facilities at Alexandra Palace were technically very limited for the next eighteen months, as it had only one RCA colour Quadruplex videotape machine and, eventually, two Pye plumbicon colour telecines – although the news colour service started with just one. Black and white national bulletins on BBC 1 continued to originate from Studio B on weekdays, along with Town and Around, the London regional "opt out" programme broadcast throughout the 1960s (and the BBC's first regional news programme for the South East), until it started to be replaced by Nationwide on Tuesday to Thursday from Lime Grove Studios early in September 1969. Town and Around was never to make the move to Television Centre – instead it became London This Week, which aired on Mondays and Fridays only, from the new TVC studios. The BBC moved news production out of Alexandra Palace in 1969, and BBC Television News resumed operations the following day with a lunchtime bulletin on BBC1 – in black and white – from Television Centre, where it remained until March 2013. This move to a smaller studio with better technical facilities allowed Newsroom and News Review to replace back projection with colour-separation overlay. During the 1960s, satellite communication had become possible; however, it was some years before digital line-store conversion was able to undertake the process seamlessly. On 14 September 1970, the first Nine O'Clock News was broadcast on television. Robert Dougall presented the first week from studio N1 – described by The Guardian as "a sort of polystyrene padded cell" – the bulletin having been moved from the earlier time of 20.50 as a response to the ratings achieved by ITN's News at Ten, introduced three years earlier on the rival ITV. Richard Baker and Kenneth Kendall presented subsequent weeks, thus echoing those first television bulletins of the mid-1950s. Angela Rippon became the first female news presenter of the Nine O'Clock News in 1975. Her work outside the news was controversial at the time, appearing on The Morecambe and Wise Christmas Show in 1976 singing and dancing. The first edition of John Craven's Newsround, initially intended only as a short series and later renamed just Newsround, came from studio N3 on 4 April 1972. Afternoon television news bulletins during the mid to late 1970s were broadcast from the BBC newsroom itself, rather than one of the three news studios. The newsreader would present to camera while sitting on the edge of a desk; behind him staff would be seen working busily at their desks. This period coincided with the Nine O'Clock News getting its next makeover, which used a CSO background of the newsroom from that very same camera each weekday evening. Also in the mid-1970s, the late night news on BBC2 was briefly renamed Newsnight, but this was not to last, nor was it the same programme as we know today – that would be launched in 1980 – and it soon reverted to being just a news summary, with the early evening BBC2 news expanded to become Newsday. News on radio also changed in the 1970s, on Radio 4 in particular, brought about by the arrival of new editor Peter Woon from television news and the implementation of the Broadcasting in the Seventies report. The changes included the introduction of correspondents into news bulletins, where previously only a newsreader would present, as well as the inclusion of content gathered in the preparation process. 
New programmes were also added to the daily schedule, PM and The World Tonight, as part of the plan for the station to become a "wholly speech network". Newsbeat launched as the news service on Radio 1 on 10 September 1973. On 23 September 1974, a teletext system was launched to bring text-only news content to television screens. Engineers had originally begun developing such a system to bring news to deaf viewers, but it was expanded. The Ceefax service became much more diverse before it ceased on 23 October 2012: it not only had subtitling for all channels, it also gave information such as weather, flight times and film reviews. By the end of the decade, the practice of shooting on film for inserts in news broadcasts was declining with the introduction of ENG technology into the UK. The equipment would gradually become less cumbersome – the BBC's first attempts had used a Philips colour camera with a backpack base station and a separate portable Sony U-matic recorder in the latter half of the decade. In 1980, the Iranian Embassy Siege had been shot electronically by the BBC Television News outside broadcasting team, and the work of reporter Kate Adie, broadcasting live from Prince's Gate, was nominated for the BAFTA for actuality coverage, but was beaten by ITN for the 1980 award. Newsnight, the news and current affairs programme, was due to go on air on 23 January 1980, although trade union disagreements meant that its launch from Lime Grove was postponed by a week. On 27 August 1981 Moira Stuart became the first African Caribbean female newsreader to appear on British television. By 1982, ENG technology had become sufficiently reliable for Bernard Hesketh to use an Ikegami camera to cover the Falklands War, coverage for which he won the "Royal Television Society Cameraman of the Year" award and a BAFTA nomination – the first time that BBC News had relied upon an electronic camera, rather than film, in a conflict zone. BBC News won the BAFTA for its actuality coverage; however, the event is remembered in television terms for Brian Hanrahan's reporting, in which he coined the phrase "I'm not allowed to say how many planes joined the raid, but I counted them all out and I counted them all back" to circumvent restrictions, and which has been cited as an example of good reporting under pressure. The first BBC breakfast television programme, Breakfast Time, launched on 17 January 1983 from Lime Grove Studio E, two weeks before its ITV rival TV-am. Frank Bough, Selina Scott, and Nick Ross helped to wake viewers with a relaxed style of presenting. The Six O'Clock News first aired on 3 September 1984, eventually becoming the most watched news programme in the UK (though since 2006 it has been overtaken by the BBC News at Ten). In October 1984, images of millions of people starving to death in the Ethiopian famine were shown in Michael Buerk's Six O'Clock News reports. The BBC News crew were the first to document the famine, with Buerk's report on 23 October describing it as "a biblical famine in the 20th century" and "the closest thing to hell on Earth". The BBC News report shocked Britain, motivating its citizens to inundate relief agencies, such as Save the Children, with donations, and to bring global attention to the crisis in Ethiopia. The news report was also watched by Bob Geldof, who would organise the charity single "Do They Know It's Christmas?" to raise money for famine relief, followed by the Live Aid concert in July 1985. 
Starting in 1981, the BBC gave a common theme to its main news bulletins with new electronic titles – a set of computer-animated "stripes" forming a circle on a red background, with a "BBC News" typescript appearing below the circle graphics, and a theme tune consisting of brass and keyboards. The Nine O'Clock News used a similar (striped) number 9. The red background was replaced by a blue one from 1985 until 1987. By 1987, the BBC had decided to re-brand its bulletins and established individual styles again for each one, with differing titles and music; the weekend and holiday bulletins were branded in a similar style to the Nine, although the "stripes" introduction continued to be used until 1989 on occasions where a news bulletin was screened out of the running order of the schedule. In 1987, John Birt resurrected the practice of correspondents working for both TV and radio with the introduction of bi-media journalism.

During the 1990s, a wider range of services began to be offered by BBC News, with the split of BBC World Service Television into BBC World (news and current affairs) and BBC Prime (light entertainment). Content for a 24-hour news channel was thus required; the domestic equivalent, BBC News 24, followed in 1997. Rather than set bulletins, ongoing reports and coverage were needed to keep both channels functioning, which meant greater emphasis on budgeting for both was necessary. In 1998, after 66 years at Broadcasting House, the BBC Radio News operation moved to BBC Television Centre. New technology, provided by Silicon Graphics, came into use in 1993 for a re-launch of the main BBC 1 bulletins, creating a virtual set which appeared to be much larger than it was physically. The relaunch also brought all bulletins into the same style of set, with only small changes in colouring, titles, and music to differentiate each. A computer-generated cut-glass sculpture of the BBC coat of arms was the centrepiece of the programme titles until the large-scale corporate rebranding of news services in 1999. In November 1997, BBC News Online was launched, following individual webpages for major news events such as the 1996 Olympic Games, the 1997 general election, and the death of Princess Diana.

In 1999, the biggest relaunch occurred, with BBC One bulletins, BBC World, BBC News 24, and BBC News Online all adopting a common style. One of the most significant changes was the gradual adoption of the corporate image by the BBC regional news programmes, giving a common style across local, national and international BBC television news. This also included Newyddion, the main news programme of the Welsh-language channel S4C, produced by BBC News Wales. Following the relaunch of BBC News in 1999, regional headlines were included at the start of the BBC One news bulletins in 2000. The English regions did, however, lose five minutes at the end of their bulletins, due to a new headline round-up at 18:55. 2000 also saw the Nine O'Clock News moved to the later time of 22:00. This was in response to ITN, which had just moved its popular News at Ten programme to 23:00. ITN briefly brought back News at Ten, but following poor ratings head-to-head against the BBC's Ten O'Clock News, the ITN bulletin was moved to 22:30, where it remained until 14 January 2008. The retirement of Peter Sissons and the departure of Michael Buerk from the Ten O'Clock News led to changes in the BBC One bulletin presenting team on 20 January 2003.
The Six O'Clock News became double-headed, with George Alagiah and Sophie Raworth, after Huw Edwards and Fiona Bruce moved to present the Ten. A new set design featuring a projected fictional newsroom backdrop was introduced, followed on 16 February 2004 by new programme titles to match those of BBC News 24. BBC News 24 and BBC World introduced a new style of presentation in December 2003, which was slightly altered on 5 July 2004 to mark 50 years of BBC Television News.

On 7 March 2005, director-general Mark Thompson launched the "Creative Futures" project to restructure the organisation. The individual positions of editor of the One and Six O'Clock News were replaced by a new daytime position in November 2005. Kevin Bakhurst became the first Controller of BBC News 24, replacing the position of editor. Amanda Farnsworth became daytime editor, while Craig Oliver was later named editor of the Ten O'Clock News. Bulletins received new titles and a new set design in May 2006, to allow Breakfast to move into the main studio for the first time since 1997. The new set featured Barco videowall screens with a background of the London skyline used for main bulletins and, originally, an image of cirrus clouds against a blue sky for Breakfast. This was later replaced following viewer criticism. The studio bore similarities to that of the ITN-produced ITV News in 2004, though ITN uses a CSO virtual studio rather than the actual screens at BBC News.

BBC News became part of a new BBC Journalism group in November 2006 as part of a restructuring of the BBC. The then-Director of BBC News, Helen Boaden, reported to the then-Deputy Director-General and head of the journalism group, Mark Byford, until he was made redundant in 2010. On 18 October 2007, Director-General Mark Thompson announced a six-year plan, "Delivering Creative Futures" (based on his project begun in March 2005), merging the television current affairs department into a new "News Programmes" division. Thompson's announcement, in response to a £2 billion shortfall in funding, would, he said, deliver "a smaller but fitter BBC" in the digital age, by cutting its payroll and, in 2013, selling Television Centre. The various separate newsrooms for television, radio and online operations were merged into a single multimedia newsroom. Programme making within the newsrooms was brought together to form a multimedia programme-making department. BBC World Service director Peter Horrocks said that the changes would achieve efficiency at a time of cost-cutting at the BBC. In his blog, he wrote that using the same resources across the various broadcast media meant that fewer stories could be covered, or that, if more stories were followed, there would be fewer ways to broadcast them.

A new graphics and video playout system was introduced for production of television bulletins in January 2007. This coincided with a new structure for BBC World News bulletins, with editors favouring a section devoted to analysing the news stories reported on. The first new BBC News bulletin since the Six O'Clock News was announced in July 2007, following a successful trial in the Midlands. The summary, lasting 90 seconds, has been broadcast at 20:00 on weekdays since December 2007 and bears similarities to 60 Seconds on BBC Three, but also includes headlines from the various BBC regions and a weather summary.
As part of a long-term cost-cutting programme, bulletins were renamed the BBC News at One, Six and Ten respectively in April 2008, while BBC News 24 was renamed BBC News and moved into the same studio as the bulletins at BBC Television Centre. BBC World was renamed BBC World News, and regional news programmes were also updated with the new presentation style, designed by Lambie-Nairn. 2008 also saw tri-media working introduced across TV, radio, and online. The studio moves also meant that Studio N9, previously used for BBC World, was closed, and operations moved to the previous studio of BBC News 24. Studio N9 was later refitted to match the new branding and was used for the BBC's UK local elections and European elections coverage in early June 2009. A strategy review of the BBC in March 2010 confirmed that having "the best journalism in the world" would form one of five key editorial policies, as part of changes subject to public consultation and BBC Trust approval.

After a period of suspension in late 2012, Helen Boaden ceased to be the Director of BBC News. On 16 April 2013, incoming BBC Director-General Tony Hall named James Harding, a former editor of The Times of London, as Director of News and Current Affairs. From August 2012 to March 2013, all news operations moved from Television Centre to new facilities in the refurbished and extended Broadcasting House, in Portland Place. The move also included the BBC World Service, which moved from Bush House following the expiry of the BBC's lease. The new extension to the north and east, referred to as "New Broadcasting House", includes several new state-of-the-art radio and television studios centred around an 11-storey atrium. The move began with the domestic programme The Andrew Marr Show on 2 September 2012 and concluded with the move of the BBC News channel and domestic news bulletins on 18 March 2013. The newsroom houses all domestic bulletins and programmes on both television and radio, as well as the BBC World Service international radio networks and the BBC World News international television channel.

BBC News and CBS News established an editorial and newsgathering partnership in 2017, replacing an earlier long-standing partnership between BBC News and ABC News. In an October 2018 Simmons Research survey of 38 news organisations, BBC News was ranked the fourth most trusted news organisation by Americans, behind CBS News, ABC News and The Wall Street Journal. In January 2020, the BBC announced a BBC News savings target of £80 million per year by 2022, involving about 450 staff reductions from the then-current 6,000. BBC director of news and current affairs Fran Unsworth said there would be further moves toward digital broadcasting, in part to attract back a youth audience, and more pooling of reporters to stop separate teams covering the same news. A further 70 staff reductions were announced in July 2020. BBC Three began airing the news programme The Catch Up in February 2022. It is presented by Levi Jouavel, Kirsty Grant, and Callum Tulley and aims to help the channel's target audience (16- to 34-year-olds) make sense of the world around them while also highlighting optimistic stories. Compared to its predecessor 60 Seconds, The Catch Up is three times longer, running for about three minutes, and does not air at weekends. According to its annual report, as of December 2021 India had the largest number of people using BBC services in the world.
In May 2025, following the earthquake that hit Myanmar and Thailand, the Burmese service began broadcasting a television news bulletin, BBC News Myanmar, using a vacated Voice of America satellite frequency.

Programming and reporting

In November 2023, BBC News joined with the International Consortium of Investigative Journalists, Paper Trail Media and 69 media partners, including Distributed Denial of Secrets and the Organised Crime and Corruption Reporting Project (OCCRP), and more than 270 journalists in 55 countries and territories to produce the 'Cyprus Confidential' report on the financial network which supports the regime of Vladimir Putin, mostly through connections to Cyprus; the report showed Cyprus to have strong links with high-ranking figures in the Kremlin, some of whom have been sanctioned. Government officials, including Cyprus president Nikos Christodoulides and European lawmakers, began responding to the investigation's findings in less than 24 hours, calling for reforms and launching probes.

BBC News is responsible for the news programmes and documentary content on the BBC's general television channels, as well as the news coverage on the BBC News Channel in the UK, and 22 hours of programming for the corporation's international BBC World News channel. Coverage for BBC Parliament is carried out on behalf of the BBC at Millbank Studios, though BBC News provides editorial and journalistic content. BBC News content is also output onto the BBC's digital interactive television services under the BBC Red Button brand and, until 2012, was carried on the Ceefax teletext system. The music on all BBC television news programmes, composed by David Lowe, was introduced as part of the re-branding which commenced in 1999 and features the 'BBC Pips'. The general theme was used on bulletins on BBC One, News 24, BBC World and local news programmes in the BBC's Nations and Regions. Lowe was also responsible for the music on Radio One's Newsbeat. The theme has had several changes since 1999, the latest in March 2013.

The BBC Arabic Television news channel launched on 11 March 2008, and a Persian-language channel followed on 14 January 2009, broadcasting from the Peel wing of Broadcasting House; both include news, analysis, interviews, sport and cultural programmes, and are run by the BBC World Service and funded from a grant-in-aid from the British Foreign Office (and not the television licence). The BBC Verify service was launched in 2023 to fact-check news stories, followed by BBC Verify Live in 2025. BBC Radio News produces bulletins for the BBC's national radio stations and provides content for local BBC radio stations via the General News Service (GNS), a BBC-internal news distribution service. BBC News does not produce the BBC's regional news bulletins, which are produced individually by the BBC nations and regions themselves. The BBC World Service broadcasts to some 150 million people in English as well as 27 other languages across the globe. BBC Radio News is a patron of the Radio Academy.

BBC News Online is the BBC's news website. Launched in November 1997, it is one of the most popular news websites, with 1.2 billion website visits in April 2021, as well as being used by 60% of the UK's internet users for news. The website contains international news coverage as well as entertainment, sport, science, and political news. Mobile apps for Android, iOS and Windows Phone systems have been provided since 2010.
Many television and radio programmes are also available to view on the BBC iPlayer and BBC Sounds services. The BBC News channel is also available to view 24 hours a day, and video and radio clips are available within online news articles. In October 2019, BBC News Online launched a mirror on the Tor anonymity network (the dark web) in an effort to circumvent censorship.

Criticism

The BBC is required by its charter to be free from both political and commercial influence and answers only to its viewers and listeners. This political objectivity is sometimes questioned. For instance, The Daily Telegraph (3 August 2005) carried a letter from the KGB defector Oleg Gordievsky referring to the BBC as "The Red Service". Books have been written on the subject, including anti-BBC works such as Truth Betrayed by W. J. West and The Truth Twisters by Richard Deacon. The BBC has been accused of bias by Conservative MPs. The BBC's Editorial Guidelines on Politics and Public Policy state that while "the voices and opinions of opposition parties must be routinely aired and challenged", "the government of the day will often be the primary source of news". The BBC is regularly accused by the government of the day of bias in favour of the opposition and, by the opposition, of bias in favour of the government. Similarly, during times of war, the BBC is often accused by the UK government, or by strong supporters of British military campaigns, of being overly sympathetic to the view of the enemy. An edition of Newsnight at the start of the Falklands War in 1982 was described as "almost treasonable" by John Page, MP, who objected to Peter Snow saying "if we believe the British". During the first Gulf War, critics of the BBC took to using the satirical name "Baghdad Broadcasting Corporation". During the Kosovo War, the BBC was labelled the "Belgrade Broadcasting Corporation" (suggesting favouritism towards the FR Yugoslavia government over ethnic Albanian rebels) by British ministers, although Slobodan Milošević (then FRY president) claimed that the BBC's coverage had been biased against his nation.

Conversely, some of those who style themselves anti-establishment in the United Kingdom or who oppose foreign wars have accused the BBC of pro-establishment bias or of refusing to give an outlet to "anti-war" voices. Following the 2003 invasion of Iraq, a study by the Cardiff University School of Journalism of the reporting of the war found that nine out of 10 references to weapons of mass destruction during the war assumed that Iraq possessed them, and only one in 10 questioned this assumption. It also found that, of the main British broadcasters covering the war, the BBC was the most likely to use the British government and military as its source. It was also the least likely to use independent sources, such as the Red Cross, which were more critical of the war. When it came to reporting Iraqi casualties, the study found fewer reports on the BBC than on the other three main channels. The report's author, Justin Lewis, wrote: "Far from revealing an anti-war BBC, our findings tend to give credence to those who criticised the BBC for being too sympathetic to the government in its war coverage. Either way, it is clear that the accusation of BBC anti-war bias fails to stand up to any serious or sustained analysis." Prominent BBC appointments are constantly assessed by the British media and political establishment for signs of political bias.
The appointment of Greg Dyke as Director-General was highlighted by press sources because Dyke was a Labour Party member and former activist, as well as a friend of Tony Blair. The BBC's former Political Editor, Nick Robinson, had some years earlier been a chairman of the Young Conservatives and, as a result, attracted informal criticism from the former Labour government, but his predecessor Andrew Marr faced similar claims from the right because he had been editor of The Independent, a liberal-leaning newspaper, before his appointment in 2000. Mark Thompson, former Director-General of the BBC, admitted the organisation had been biased "towards the left" in the past. He said, "In the BBC I joined 30 years ago, there was, in much of current affairs, in terms of people's personal politics, which were quite vocal, a massive bias to the left". He then added, "The organization did struggle then with impartiality. Now it is a completely different generation. There is much less overt tribalism among the young journalists who work for the BBC."

Following the EU referendum in 2016, some critics suggested that the BBC was biased in favour of leaving the EU. For instance, in 2018, the BBC received complaints from people who took issue with the BBC not sufficiently covering anti-Brexit marches while giving smaller-scale events hosted by former UKIP leader Nigel Farage more airtime. On the other hand, a poll released by YouGov showed that 45% of people who voted to leave the EU thought that the BBC was "actively anti-Brexit", compared to 13% of the same kinds of voters who thought the BBC was pro-Brexit.

In 2008, BBC Hindi was criticised by some Indian outlets for referring to the terrorists who carried out the 2008 Mumbai attacks as "gunmen". The response to this added to prior criticism from some Indian commentators suggesting that the BBC may have an Indophobic bias. In March 2015, the BBC was criticised for a BBC Storyville documentary, India's Daughter, which interviewed one of the men convicted of the 2012 Delhi gang rape. In spite of a ban ordered by the Indian High Court, the BBC still aired the documentary outside India.

BBC News was at the centre of a political controversy following the 2003 invasion of Iraq. Three BBC News reports (Andrew Gilligan's on Today, Gavin Hewitt's on The Ten O'Clock News and another on Newsnight) quoted an anonymous source who stated that the British government (particularly the Prime Minister's office) had embellished the September Dossier with misleading exaggerations of Iraq's weapons of mass destruction capabilities. The government denounced the reports and accused the corporation of poor journalism. In subsequent weeks the corporation stood by the reports, saying that it had a reliable source. Following intense media speculation, David Kelly was named in the press as the source for Gilligan's story on 9 July 2003. Kelly was found dead, by suicide, in a field close to his home early on 18 July. An inquiry led by Lord Hutton, announced by the British government the following day to investigate the circumstances leading to Kelly's death, concluded that "Dr. Kelly took his own life." In his report, published on 28 January 2004, Lord Hutton concluded that Gilligan's original accusation was "unfounded" and the BBC's editorial and management processes were "defective". In particular, the report specifically criticised the chain of management that caused the BBC to defend its story.
The BBC Director of News, Richard Sambrook, the report said, had accepted Gilligan's word that his story was accurate in spite of his notes being incomplete. Gavyn Davies, the chairman of the Board of Governors, had then told the Board that he was happy with the story and told the Prime Minister that a satisfactory internal inquiry had taken place. The Board of Governors, under Davies's guidance, accepted that further investigation of the Government's complaints was unnecessary. Because of the criticism in the Hutton report, Davies resigned on the day of publication. BBC News faced an important test in reporting on itself with the publication of the report, but by common consent (of the Board of Governors) managed this "independently, impartially and honestly". Davies's resignation was followed by that of Director-General Greg Dyke the following day, and by Gilligan's resignation on 30 January. While the affair was undoubtedly a traumatic experience for the corporation, an ICM poll in April 2003 indicated that the BBC had sustained its position as the best and most trusted provider of news.

The BBC has faced accusations of holding both anti-Israel and anti-Palestine bias. Douglas Davis, the London correspondent of The Jerusalem Post, has described the BBC's coverage of the Arab–Israeli conflict as "a relentless, one-dimensional portrayal of Israel as a demonic, criminal state and Israelis as brutal oppressors [which] bears all the hallmarks of a concerted campaign of vilification that, wittingly or not, has the effect of delegitimising the Jewish state and pumping oxygen into a dark old European hatred that dared not speak its name for the past half-century." However, two large independent studies, one conducted by Loughborough University and the other by Glasgow University's Media Group, concluded that Israeli perspectives are given greater coverage. Critics of the BBC argue that the Balen Report proves systematic bias against Israel in headline news programming. The Daily Mail and The Daily Telegraph criticised the BBC for spending hundreds of thousands of British taxpayers' pounds on preventing the report from being released to the public. Jeremy Bowen, the BBC's Middle East editor, was singled out specifically for bias by the BBC Trust, which concluded that he had violated "BBC guidelines on accuracy and impartiality."

An independent panel appointed by the BBC Trust was set up in 2006 to review the impartiality of the BBC's coverage of the Israeli–Palestinian conflict. The panel's assessment was that "apart from individual lapses, there was little to suggest deliberate or systematic bias." While noting a "commitment to be fair, accurate and impartial" and praising much of the BBC's coverage, the independent panel concluded "that BBC output does not consistently give a full and fair account of the conflict. In some ways the picture is incomplete and, in that sense, misleading." It notes that "the failure to convey adequately the disparity in the Israeli and Palestinian experience, [reflects] the fact that one side is in control and the other lives under occupation". Writing in the Financial Times, Philip Stephens, one of the panellists, later accused the BBC's director-general, Mark Thompson, of misrepresenting the panel's conclusions. He further opined: "My sense is that BBC news reporting has also lost a once iron-clad commitment to objectivity and a necessary respect for the democratic process. If I am right, the BBC, too, is lost". Mark Thompson published a rebuttal in the FT the next day.
A description by one BBC correspondent reporting on the funeral of Yasser Arafat, that she had been left with tears in her eyes, led to further questions of impartiality, particularly from Martin Walker in a guest opinion piece in The Times, who picked out the apparent case of Fayad Abu Shamala, the BBC Arabic Service correspondent, who told a Hamas rally on 6 May 2001 that journalists in Gaza were "waging the campaign shoulder to shoulder together with the Palestinian people". Walker argues that the independent inquiry was flawed for two reasons. Firstly, the time period over which it was conducted (August 2005 to January 2006) surrounded the Israeli withdrawal from Gaza and Ariel Sharon's stroke, which produced more positive coverage than usual. Furthermore, he wrote, the inquiry only looked at the BBC's domestic coverage, and excluded output on the BBC World Service and BBC World. Tom Gross accused the BBC of glorifying Hamas suicide bombers, and condemned its policy of inviting guests such as Jenny Tonge and Tom Paulin, who have compared Israeli soldiers to Nazis. Writing for the BBC, Paulin said Israeli soldiers should be "shot dead" like Hitler's S.S., and said he could "understand how suicide bombers feel". The BBC also faced criticism for not airing a Disasters Emergency Committee aid appeal for Palestinians who suffered in Gaza during the 22-day war there between late 2008 and early 2009. Most other major UK broadcasters did air this appeal, though rival Sky News did not. British journalist Julie Burchill has accused the BBC of creating a "climate of fear" for British Jews through its "excessive coverage" of Israel compared to other nations. In light of the Gaza war, the BBC suspended seven Arab journalists over allegations of expressing support for Hamas via social media.

BBC and ABC shared video segments and reporters as needed in producing their newscasts, with the BBC showing ABC World News Tonight with David Muir in the UK. However, in July 2017, the BBC announced a new partnership with CBS News allowing both organisations to share video, editorial content, and additional newsgathering resources in New York, London, Washington and around the world. BBC News subscribes to wire services from leading international agencies including PA Media (formerly Press Association), Reuters, and Agence France-Presse. In April 2017, the BBC dropped the Associated Press in favour of an enhanced service from AFP.

BBC News reporters and broadcasts have, both now and in the past, been banned in several countries, primarily for reporting that has been unfavourable to the ruling government. For example, correspondents were banned by the former apartheid regime of South Africa. The BBC was banned in Zimbabwe under Robert Mugabe for eight years as a terrorist organisation until being allowed to operate again over a year after the 2008 elections. The BBC was banned in Burma (officially Myanmar) after its coverage of and commentary on anti-government protests there in September 2007. The ban was lifted four years later, in September 2011. Other cases have included Uzbekistan, China, and Pakistan. BBC Persian, the BBC's Persian-language news site, was blocked in Iran in 2006. The BBC News website was made available in China again in March 2008 but, as of October 2014, was blocked again.
In June 2015, the Rwandan government placed an indefinite ban on BBC broadcasts following the airing of Rwanda's Untold Story, a controversial documentary regarding the 1994 Rwandan genocide, broadcast on BBC2 on 1 October 2014. The UK's Foreign Office recognised "the hurt caused in Rwanda by some parts of the documentary". In February 2017, reporters from the BBC (as well as the Daily Mail, The New York Times, Politico, CNN, and others) were denied access to a United States White House briefing. In 2017, BBC India was banned for a period of five years from covering all national parks and sanctuaries in India. Following the withdrawal of CGTN's UK broadcast licence on 4 February 2021 by Ofcom, China banned BBC News from airing in China. See also References External links
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_ref-Usborne-2018_8-1] | [TOKENS: 10515] |
Contents Elon Musk Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026, Forbes estimates his net worth to be around US$852 billion.

Born into a wealthy family in Pretoria, South Africa, Musk emigrated to Canada in 1989; he holds Canadian citizenship through his mother, who was born there. He received bachelor's degrees in 1997 from the University of Pennsylvania before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002. Musk also became an American citizen in 2002. In 2002, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research, but later left; growing discontent with the organization's direction and its leadership during the AI boom of the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, Tesla shareholders approved a pay package for Musk worth $1 trillion, which he is to receive over 10 years if he meets specific goals.

Musk was the largest donor in the 2024 U.S. presidential election, in which he supported Donald Trump. After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for spreading COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published between 2025 and 2026 and became a topic of worldwide debate.

Early life

Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa. Musk therefore holds both South African and Canadian citizenship from birth.
His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer, who partly owned a rental lodge at the Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa. Despite both Elon and Errol previously stating that Errol was a part owner of a Zambian emerald mine, in 2023 Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party and has said that his children shared his dislike of apartheid.

After his parents divorced in 1979, Elon, aged around 9, chose to live with his father because Errol had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies", where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten severely, leading to him being hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?"

Elon was an enthusiastic reader of books and has attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself how to program from the VIC-20 user manual. At age twelve, he sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, where he graduated. He was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification.

Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, as well as to ease his path to immigration to the United States. While waiting for his application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill. In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997 – a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School.
He reportedly hosted large, ticketed house parties to help pay for tuition, and wrote a business plan for an electronic book-scanning service similar to Google Books. In 1994, Musk held two internships in Silicon Valley: one at energy storage startup Pinnacle Research Institute, which investigated electrolytic supercapacitors for energy storage, and another at the Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided instead to join the Internet boom of the 1990s, applying for a job at Netscape, to which he reportedly never received a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa transitioned to an H-1B. According to numerous former business associates and shareholders, however, Musk had said he was on a student visa at the time.

Business career

In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors. They housed the venture in a small rented office in Palo Alto. Replying to Rolling Stone, Musk denounced the notion that they had started the company with funds borrowed from Errol Musk, but in a tweet he acknowledged that his father contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune, and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590,000,000 in 2025), and Musk received $22 million (equivalent to $43,000,000 in 2025) for his 7-percent share.

In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and, in its initial months of operation, over 200,000 customers joined the service. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign. Due to resulting technological issues and the lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000.[b] Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2,700,000,000 in 2025) in stock, of which Musk—the largest shareholder with 11.72% of shares—received $175.8 million (equivalent to $320,000,000 in 2025).
In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value.

In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth-chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from the Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune (equivalent to $180,000,000 in 2025), Musk founded SpaceX in May 2002 and became the company's CEO and chief engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, the company was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2,400,000,000 in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, in 2015 SpaceX successfully landed the first stage of a Falcon 9 on a land platform. Later landings were achieved on autonomous spaceport drone ships, ocean-based recovery platforms. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million (equivalent to $865,000,000 in 2025) contract to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan.

In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large constellation was deployed in May 2019. As of May 2025, over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 to be $10 billion (equivalent to $12,000,000,000 in 2025).[c] During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440,000,000 in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response.

Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement.
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11,000,000 in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm. Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass-production all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several strong-selling electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In 2018, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over his tweet that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in summer 2020, and it entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1,200,000,000,000 in 2025), the sixth company in U.S. history to do so.

Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second largest provider of solar power systems in the United States. In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2,700,000,000 in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla's directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor.

In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence (AI) by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions like spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials.
Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trials—which have caused the deaths of some monkeys—have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink.

In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel at up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018. It used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021. Local officials have approved further expansions of the tunnel system.

In early 2017, Musk expressed interest in buying Twitter and had questioned the platform's commitment to freedom of speech. By 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder.[d] Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter. By the end of April, Musk had successfully concluded a deal to buy the company for approximately $44 billion, including approximately $12.5 billion in loans and $21 billion in equity financing. After attempting to back out of the deal, Musk completed the purchase on October 27, 2022. Immediately after the acquisition, Musk fired several top Twitter executives, including CEO Parag Agrawal, and became CEO himself. Under Musk, Twitter instituted monthly subscriptions for a "blue check" and laid off a significant portion of the company's staff. Musk lessened content moderation, and hate speech increased on the platform after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of the Hunter Biden laptop controversy in the lead-up to the 2020 presidential election. Musk promised to step down as CEO after a Twitter poll; five months later, he stepped down and transitioned to the roles of executive chairman and chief technology officer (CTO). Despite Musk stepping down as CEO, X continues to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence some of his critics, such as Twitch streamer Asmongold, who criticized him during one of his streams, by removing their accounts' blue checkmarks, which hinders visibility and is considered a form of shadow banning, or by suspending their accounts without justification.
Other activities

In August 2013, Musk announced plans for a version of a vactrain and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances.

In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company, and initially gave $50 million. In 2018, Musk left the OpenAI board. Since 2018, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings like OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI.

Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets and the consequent fossil fuel usage have received criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter and temporarily banned the accounts of journalists who posted stories regarding the incident, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept.

In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content while framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies.

Politics

Musk is an outlier among business leaders, who typically avoid partisan political advocacy. Musk was a registered independent voter when he lived in California. Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 Texas's 34th congressional district special election. In 2024, he started supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income; he also endorsed Kanye West's 2020 presidential campaign.
In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign and hosting DeSantis's campaign announcement on a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips.

In October 2025, former vice president Kamala Harris commented that it had been a mistake on the Democratic side not to invite Musk to a White House electric vehicle event organized in August 2021, which featured executives from General Motors, Ford and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space." Fortune remarked that the non-invitation was a nod to the United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation, and guessed that the non-invitation affected Musk's perspective. Fortune noted that, at the time, Musk had said, "Yeah, seems odd that Tesla wasn't invited." A month later, he criticized Biden as "not the friendliest administration." Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen and other capitalists had actually flourished under Biden, but that the tech leaders chose Trump for their common ground on cultural issues.

By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying: "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a campaign rally, and promoted conspiracy theories and falsehoods about Democrats, election fraud and immigration in support of Trump. Musk was the largest individual donor of the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers.

In 2023, Musk said he shunned the World Economic Forum because it was boring; the organization commented that it had not invited him since 2015. He has, however, participated in Dialog, dubbed "Tech Bilderberg" and organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain and the United Kingdom, particularly due to his position in the U.S. government as well as his ownership of X. An NBC News analysis found he had boosted far-right political movements to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023.
During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or fascist Roman salute.[e] He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided on partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries.

The concept of DOGE emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, with Musk accepting the offer. In November and December 2024, Musk suggested that the organization could help to cut the U.S. federal budget, consolidate the number of federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role was not clear. In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE. A federal judge has ruled that Musk acted as the de facto leader of DOGE.

Musk's role in the second Trump administration, particularly in response to DOGE, attracted public backlash. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He prioritized secrecy within the organization and accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by that time, most of them of children. By November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when the special government employee's 130-day deadline expired, with a White House official confirming that Musk's offboarding from the Trump administration was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025.
After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit. A feud began between Musk and Trump, most notably when Musk alleged on X (formerly Twitter) on June 5, 2025, that Trump had ties to sex offender Jeffrey Epstein, posting: "@realDonaldTrump is in the Epstein files. That is the real reason they have not been made public." Trump responded on Truth Social, stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away, and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk, and added that Musk would face "very serious consequences" if he funded Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have drawn criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates up until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth taxes, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration and regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially of Mars, repeatedly pushing for humanity to colonize the planet in order to become an interplanetary species and lower the risk of human extinction. Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. While describing himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdown restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and has been described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024.
Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was in turn accused of spreading misinformation and amplifying the far-right. He has also voiced support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S. Securities and Exchange Commission (SEC) over a tweet stating that funding had been secured for potentially taking Tesla private.[f] The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were fined $20 million each, and Musk was forced to step down as Tesla chairman for three years but was able to remain CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the details of the previous agreement, including a list of topics about which Musk needed preclearance. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla's stock price ("too high imo") violated the agreement. Records released under the Freedom of Information Act (FOIA) showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting about "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded, calling the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger's syndrome at a TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ...
but then that turned out to be wrong — [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated he uses doctor-prescribed ketamine for occasional depression, dosing "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine, and other drugs. Musk at first refused to comment on his alleged drug use, before responding that he had not tested positive for drugs and that if drugs somehow improved his productivity, "I would definitely take them!" Investigations by The New York Times revealed Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concern among close associates troubled by his public behavior as he became more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label, Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he has said have a "restoring effect" that helps his "mental calibration". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings; he has justified the boosting by claiming that all top accounts do it, so he must as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its creator Ubisoft for "woke" content. Musk posted to X that "DEI kills art", singled out the inclusion of the historical figure Yasuke in the game as offensive, and called the game "terrible". Ubisoft responded that Musk's comments were "just feeding hatred" and that it was focused on producing a game, not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000. In 2002, their first child, Nevada Musk, died of sudden infant death syndrome at the age of 10 weeks. After Nevada's death, the couple used in vitro fertilization (IVF) to continue their family; they had twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and share custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year.
After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the pregnancy, Musk confirmed reports that the couple were "semi-separated" in September 2021; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Musk has taken X Æ A-Xii to multiple official events in Washington, D.C. during Trump's second term in office. In July 2022, The Wall Street Journal reported that Musk allegedly had an affair in 2021 with Nicole Shanahan, the wife of Google co-founder Sergey Brin, leading to the couple's divorce the following year. Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported that Musk had bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born in 2024 via surrogacy, and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, whom media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St. Clair had filed for sole custody of her five-month-old son and for Musk to be recognized as the child's father. On March 31, 2025, Musk wrote that, while he was unsure whether he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] Later reporting from The Wall Street Journal indicated that $1 million of these payments was structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island. In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place after Epstein later cancelled the plans.[k] On Christmas Day 2012, Musk emailed Epstein asking "Do you have any parties planned?
I’ve been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I’m looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable, to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein sent an email asking Musk if he had any plans to come to New York for the opening of the United Nations General Assembly, where many "interesting people" would be coming to his house, to which Musk responded: "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein replied, "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 whether he had introduced Epstein to Mark Zuckerberg, Musk responded: "I don’t recall introducing Epstein to anyone, as I don’t know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred. Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l] Wealth Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. Musk was first listed on the Forbes Billionaires List in 2012; in November 2020, around 75% of his wealth derived from Tesla stock, although he described himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021; $400 billion in December 2024; $500 billion in October 2025; $600 billion in mid-December 2025; $700 billion later that month; and $800 billion in February 2026. In November 2025, a Tesla pay package for Musk worth potentially $1 trillion was approved, which he is to receive over 10 years if he meets specific goals. Public image Although his ventures have been highly influential within their separate industries starting in the 2000s, Musk only became a public figure in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions while often making controversial statements, in contrast to other billionaires who prefer reclusiveness to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance described people's opinions of Musk as polarized due to his "part philosopher, part troll" persona on Twitter. He has drawn condemnation for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to matters like British survivors of grooming gangs.
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president", or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021, and selected him as its "Person of the Year" for 2021. Then-Time editor-in-chief Edward Felsenthal wrote: "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too." Notes References Works cited Further reading External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Meta_Platforms#cite_ref-autogenerated1_221-0] | [TOKENS: 8626] |
Contents Meta Platforms Meta Platforms, Inc. (doing business as Meta) is an American multinational technology company headquartered in Menlo Park, California. Meta owns and operates several prominent social media platforms and communication services, including Facebook, Instagram, WhatsApp, Messenger, Threads, and Manus. The company also operates an advertising network for its own sites and third parties; as of 2023, advertising accounted for 97.8 percent of its total revenue. Meta has been described as part of Big Tech, a term referring to the six largest tech companies in the United States: Alphabet (Google), Amazon, Apple, Meta (Facebook), Microsoft, and Nvidia, which are also among the largest companies in the world by market capitalization. The company was originally established in 2004 as TheFacebook, Inc., and was renamed Facebook, Inc. in 2005. In 2021, it rebranded as Meta Platforms, Inc. to reflect a strategic shift toward developing the metaverse, an interconnected digital ecosystem spanning virtual and augmented reality technologies. In 2023, Meta was ranked 31st on the Forbes Global 2000 list of the world's largest public companies. As of 2022, it was the world's third-largest spender on research and development, with R&D expenses totaling US$35.3 billion. History Facebook filed for an initial public offering (IPO) on February 1, 2012. The preliminary prospectus stated that the company sought to raise $5 billion, had 845 million monthly active users, and ran a website accruing 2.7 billion likes and comments daily. After the IPO, Zuckerberg would retain 22% of the total shares and 57% of the total voting power in Facebook. Underwriters valued the shares at $38 each, valuing the company at $104 billion, the largest valuation yet for a newly public company. On May 16, one day before the IPO, Facebook announced it would sell 25% more shares than originally planned due to high demand. The IPO raised $16 billion, making it the third-largest in US history (slightly ahead of AT&T Mobility and behind only General Motors and Visa). The stock price left the company with a higher market capitalization than all but a few U.S. corporations, surpassing heavyweights such as Amazon, McDonald's, Disney, and Kraft Foods, and made Zuckerberg's stock worth $19 billion. The New York Times stated that the offering overcame questions about Facebook's difficulties in attracting advertisers to transform the company into a "must-own stock". Jimmy Lee of JPMorgan Chase described it as "the next great blue-chip". Writers at TechCrunch, on the other hand, expressed skepticism, stating, "That's a big multiple to live up to, and Facebook will likely need to add bold new revenue streams to justify the mammoth valuation." Trading in the stock, which began on May 18, was delayed that day due to technical problems with the Nasdaq exchange. The stock struggled to stay above the IPO price for most of the day, forcing underwriters to buy back shares to support the price. At the closing bell, shares were valued at $38.23, only $0.23 above the IPO price and down $3.82 from the opening bell value. The opening was widely described by the financial press as a disappointment. The stock set a new record for trading volume of an IPO. On May 25, 2012, the stock ended its first full week of trading at $31.91, a 16.5% decline.
On May 22, 2012, regulators from Wall Street's Financial Industry Regulatory Authority announced that they had begun to investigate whether banks underwriting Facebook had improperly shared information only with select clients rather than the general public. Massachusetts Secretary of State William F. Galvin subpoenaed Morgan Stanley over the same issue. The allegations sparked "fury" among some investors and led to the immediate filing of several lawsuits, one of them a class action suit claiming more than $2.5 billion in losses due to the IPO. Bloomberg estimated that retail investors may have lost approximately $630 million on Facebook stock since its debut. Standard & Poor's added Facebook to its S&P 500 index on December 21, 2013. On May 2, 2014, Zuckerberg announced that the company would be changing its internal motto from "Move fast and break things" to "Move fast with stable infrastructure". The earlier motto had been described as Zuckerberg's "prime directive to his developers and team" in a 2009 interview in Business Insider, in which he also said, "Unless you are breaking stuff, you are not moving fast enough." In November 2016, Facebook announced the Microsoft Windows client of its gaming service Facebook Gameroom, formerly Facebook Games Arcade, at the Unity Technologies developers conference. The client allowed Facebook users to play "native" games in addition to its web games. The service was closed in June 2021. Lasso was a short-video sharing app from Facebook, similar to TikTok, that was launched on iOS and Android in 2018 and aimed at teenagers. On July 2, 2020, Facebook announced that Lasso would shut down on July 10. In 2018, Oculus lead Jason Rubin sent his 50-page vision document, titled "The Metaverse", to Facebook's leadership. In the document, Rubin acknowledged that Facebook's virtual reality business had not caught on as expected, despite the hundreds of millions of dollars spent on content for early adopters. He urged the company to execute fast and invest heavily in the vision, to shut out HTC, Apple, Google, and other competitors in the VR space. Regarding other players' participation in the metaverse vision, he called for the company to build the "metaverse" to prevent its competitors from "being in the VR business in a meaningful way at all". In May 2019, Facebook founded Libra Networks, reportedly to develop its own stablecoin cryptocurrency. Later, it was reported that Libra was being supported by financial companies such as Visa, Mastercard, PayPal, and Uber. The consortium of companies was expected to contribute $10 million each to fund the launch of the cryptocurrency, named Libra. Depending on when it would receive approval from the Swiss Financial Market Supervisory Authority to operate as a payments service, the Libra Association had planned to launch a limited-format cryptocurrency in 2021. Libra was renamed Diem before being shut down and sold in January 2022 after backlash from Swiss government regulators and the public. During the COVID-19 pandemic, the use of online services, including Facebook, grew globally. Zuckerberg predicted this would be a "permanent acceleration" that would continue after the pandemic. Facebook hired aggressively, growing from 48,268 employees in March 2020 to more than 87,000 by September 2022. Following a period of intense scrutiny and damaging whistleblower leaks, news started to emerge on October 21, 2021, about Facebook's plan to rebrand the company and change its name.
In the Q3 2021 earnings call on October 25, Mark Zuckerberg discussed the ongoing criticism of the company's social services and the way it operates, and pointed to the company's pivot toward building the metaverse, without mentioning the rebranding or the name change. The metaverse vision and the name change from Facebook, Inc. to Meta Platforms were introduced at Facebook Connect on October 28, 2021. According to Facebook's PR campaign, the name change reflected the company's shifting long-term focus on building the metaverse, a digital extension of the physical world through social media, virtual reality, and augmented reality features. "Meta" had been registered as a trademark in the United States in 2018 (after an initial filing in 2015) for marketing, advertising, and computer services by a Canadian company that provided big data analysis of scientific literature. This company was acquired in 2017 by the Chan Zuckerberg Initiative (CZI), a foundation established by Zuckerberg and his wife, Priscilla Chan, and became one of their projects. Following the rebranding announcement, CZI announced that it had already decided to deprioritize the earlier Meta project and would therefore transfer its rights to the name to Meta Platforms; the previous project would end in 2022. Soon after the rebranding, in early February 2022, Meta reported a greater-than-expected decline in profits in the fourth quarter of 2021. It reported no growth in monthly users and indicated it expected revenue growth to stall. It also expected measures taken by Apple Inc. to protect user privacy to cost it some $10 billion in advertising revenue, an amount equal to roughly 8% of its revenue for 2021. In a meeting with Meta staff the day after earnings were reported, Zuckerberg blamed competition for user attention, particularly from video-based apps such as TikTok. The 27% drop in the company's share price in reaction to the news eliminated some $230 billion of Meta's market capitalization. Bloomberg described the decline as "an epic rout that, in its sheer scale, is unlike anything Wall Street or Silicon Valley has ever seen". Zuckerberg's net worth fell by as much as $31 billion; he owns 13% of Meta, and the holding makes up the bulk of his wealth. According to reports published by Bloomberg on March 30, 2022, Meta turned over data such as phone numbers, physical addresses, and IP addresses to hackers posing as law enforcement officials using forged documents. The law enforcement requests sometimes included forged signatures of real or fictional officials. When asked about the allegations, a Meta representative said, "We review every data request for legal sufficiency and use advanced systems and processes to validate law enforcement requests and detect abuse." In June 2022, Sheryl Sandberg, the chief operating officer of 14 years, announced she would step down that year. Zuckerberg said that Javier Olivan would replace Sandberg, though in a "more traditional" role. In March 2022, Facebook and Instagram (though not Meta-owned WhatsApp) were banned in Russia, and Meta was added to the Russian list of terrorist and extremist organizations for alleged Russophobia and hate speech, including alleged calls for genocide, amid the ongoing Russian invasion of Ukraine. Meta appealed against the ban, but it was upheld by a Moscow court in June of the same year. Also in March 2022, Meta and Italian eyewear giant Luxottica released Ray-Ban Stories, a series of smartglasses that could play music and take pictures.
Meta and Luxottica parent company EssilorLuxottica declined to disclose sales figures for the product line as of September 2022, though Meta expressed satisfaction with customer feedback. In July 2022, Meta saw its first year-on-year revenue decline when its total revenue slipped by 1% to $28.8bn. Analysts and journalists attributed the loss to its advertising business, which had been limited by Apple's app tracking transparency feature and the number of people who opted not to be tracked by Meta apps. Zuckerberg also attributed the decline to increasing competition from TikTok. On October 27, 2022, Meta's market value dropped to $268 billion, a loss of around $700 billion compared to 2021, and its shares fell by 24%. It lost its spot among the top 20 US companies by market cap, despite having reached the top 5 the previous year. In November 2022, Meta laid off 11,000 employees, 13% of its workforce. Zuckerberg said the decision to aggressively increase Meta's investments had been a mistake, as he had wrongly predicted that the surge in e-commerce would last beyond the COVID-19 pandemic. He also attributed the decline to increased competition, a global economic downturn, and "ads signal loss". In March 2023, Meta announced a further round of layoffs that would cut 10,000 employees and close 5,000 open positions to make the company more efficient; the layoffs began the following month. The layoffs were part of a general downturn in the technology industry, alongside layoffs by companies including Google, Amazon, Tesla, Snap, Twitter, and Lyft. Starting in 2022, Meta scrambled to catch up to other tech companies in adopting specialized artificial intelligence hardware and software. It had been using less expensive CPUs instead of GPUs for AI work, but that approach turned out to be less efficient. The company gifted the Inter-university Consortium for Political and Social Research $1.3 million to finance the Social Media Archive's aim of making its data available to social science research. In 2023, Ireland's Data Protection Commissioner imposed a record EUR 1.2 billion fine on Meta for transferring data from Europe to the United States without adequate protections for EU citizens.: 250 Meta's revenue surpassed analyst expectations for the first quarter of 2023 after it announced that it was increasing its focus on AI. On July 6, Meta launched a new app, Threads, a competitor to Twitter. Meta announced its artificial intelligence model Llama 2 in July 2023, available for commercial use via partnerships with major cloud providers like Microsoft. It was the first project to be unveiled from Meta's generative AI group after it was set up in February. Meta would not charge for access or usage; instead, Llama 2 would operate under an open-source model, allowing Meta to learn what improvements needed to be made. Prior to this announcement, Meta had said it had no plans to release Llama 2 for commercial use; an earlier version of Llama had been released to academics. In August 2023, Meta announced the permanent removal of news content from Facebook and Instagram in Canada due to the Online News Act, which requires Canadian news outlets to be compensated for content shared on its platforms. The Online News Act was in effect by year-end, but Meta declined to participate in the regulatory process. In October 2023, Zuckerberg said that AI would be Meta's biggest investment area in 2024. Meta finished 2023 as one of the year's best-performing technology stocks, with its share price up 150 percent.
Its stock reached an all-time high in January 2024, bringing Meta within 2% of a $1 trillion market capitalization. In November 2023, Meta launched an ad-free subscription service in Europe, allowing subscribers to opt out of having their personal data collected for targeted advertising. A group of 28 European organizations, including Max Schrems' advocacy group NOYB, the Irish Council for Civil Liberties, Wikimedia Europe, and the Electronic Privacy Information Center, signed a 2024 letter to the European Data Protection Board (EDPB) expressing concern that this subscription model would undermine privacy protections, specifically GDPR data protection standards. Meta removed the Facebook and Instagram accounts of Iran's Supreme Leader Ali Khamenei in February 2024, citing repeated violations of its Dangerous Organizations & Individuals policy. As of March 2024, Meta was under investigation by the FDA for alleged use of its social media platforms to sell illegal drugs. On 16 May 2024, the European Commission began an investigation into Meta over concerns related to child safety. In May 2023, Iraqi social media influencer Esaa Ahmed-Adnan found that Instagram had removed his posts, citing copyright violations, despite his content being original and free of copyrighted material. He discovered that extortionists were behind the takedowns, offering to restore his content for $3,000 or provide ongoing protection for $1,000 per month. This scam, which exploited Meta's rights management tools, became widespread in the Middle East, revealing a gap in Meta's enforcement in developing regions. Aws al-Saadi, founder of the Iraqi nonprofit Tech4Peace, helped Ahmed-Adnan and others, but the restoration process was slow, leading to significant financial losses for many victims, including prominent figures like Ammar al-Hakim. The situation highlighted Meta's challenges in balancing global growth with effective content moderation and protection. On 16 September 2024, Meta announced it had banned Russian state media outlets from its platforms worldwide due to concerns about "foreign interference activity." The decision followed allegations that RT and its employees funneled $10 million through shell companies to secretly fund influence campaigns on various social media channels. Meta's actions were part of a broader effort to counter Russian covert influence operations, which had intensified since the invasion of Ukraine. At its 2024 Connect conference, Meta presented Orion, its first pair of augmented reality glasses. Though Orion was originally intended to be sold to consumers, the manufacturing process turned out to be too complex and expensive; instead, the company pivoted to producing a small number of the glasses for internal use. On 4 October 2024, Meta announced its new AI model, Movie Gen, capable of generating realistic video and audio clips based on user prompts. Meta stated it would not release Movie Gen for open development, preferring to collaborate directly with content creators and integrate it into its products the following year. The model was built using a combination of licensed and publicly available datasets. On October 31, 2024, ProPublica published an investigation into deceptive political advertisement scams that sometimes use hundreds of hijacked profiles and Facebook pages run by organized networks of scammers. The authors cited spotty enforcement by Meta as a major reason for the extent of the issue.
In November 2024, TechCrunch reported that Meta was considering building a $10bn global underwater cable spanning 25,000 miles. In the same month, Meta closed down two million accounts on Facebook and Instagram that were linked to scam centers in Myanmar, Laos, Cambodia, the Philippines, and the United Arab Emirates running "pig butchering" scams. In December 2024, Meta announced that, beginning February 2025, advertisers running ads about financial services in Australia would be required to verify beneficiary and payer information, in a bid to curb scams. On December 4, 2024, Meta announced it would invest US$10 billion in its largest AI data center, in northeast Louisiana, powered by natural gas facilities. On the 11th of that month, Meta experienced a global outage impacting accounts on all of its social media and messaging applications. Outage reports on DownDetector reached 70,000+ and 100,000+ within minutes for Instagram and Facebook, respectively. In January 2025, Meta announced plans to roll back its diversity, equity, and inclusion (DEI) initiatives, citing shifts in the "legal and policy landscape" in the United States following the 2024 presidential election. The decision followed reports that CEO Mark Zuckerberg sought to align the company more closely with the incoming Trump administration, including through changes to content moderation policies and executive leadership. The new content moderation policies continued to bar insults about a person's intellect or mental illness, but made an exception to allow calling LGBTQ people mentally ill because they are gay or transgender. Later that month, Meta agreed to pay $25 million to settle a 2021 lawsuit brought by Donald Trump over the suspension of his social media accounts after the January 6 riots. Changes to Meta's moderation policies were controversial among its oversight board, with a significant divide in opinion between the board's US conservatives and its global members. In June 2025, Meta decided to make a multibillion-dollar investment in the artificial intelligence startup Scale AI. The financing could exceed $10 billion, which would make it one of the largest private-company funding events of all time. In October 2025, Meta announced it would lay off 600 employees in its artificial intelligence unit, which it described as "bloated", in an effort to make the unit leaner and simpler. The layoffs affected Meta's AI infrastructure teams, its Fundamental Artificial Intelligence Research (FAIR) unit, and other product-related positions. Mergers and acquisitions Meta has acquired multiple companies (often identified as talent acquisitions). One of its first major acquisitions was in April 2012, when it acquired Instagram for approximately US$1 billion in cash and stock. In October 2013, Facebook, Inc. acquired Onavo, an Israeli mobile web analytics company. In February 2014, Facebook, Inc. announced it would buy mobile messaging company WhatsApp for US$19 billion in cash and stock; the acquisition was completed on October 6. Later that year, Facebook bought Oculus VR for $2.3 billion in cash and stock; Oculus released its first consumer virtual reality headset in 2016. In late November 2019, Facebook, Inc. announced the acquisition of the game developer Beat Games, responsible for developing one of that year's most popular VR games, Beat Saber.
In late 2022, after Facebook, Inc. rebranded to Meta Platforms, Inc., Oculus was rebranded to Meta Quest. In May 2020, Facebook, Inc. announced it had acquired Giphy for a reported cash price of $400 million, to be integrated with the Instagram team. However, in August 2021, the UK's Competition and Markets Authority (CMA) stated that Facebook, Inc. might have to sell Giphy, after an investigation found that the deal between the two companies would harm competition in the display advertising market. Facebook, Inc. was fined $70 million by the CMA for deliberately failing to report all information regarding the acquisition and the ongoing antitrust investigation. In October 2022, the CMA ruled for a second time that Meta be required to divest Giphy, stating that Meta already controlled half of the advertising in the UK. Meta agreed to the sale, though it stated that it disagreed with the decision itself. In May 2023, Giphy was divested to Shutterstock for $53 million. In November 2020, Facebook, Inc. announced that it planned to purchase the customer-service platform and chatbot specialist startup Kustomer to encourage companies to use its platform for business. The deal reportedly valued Kustomer at slightly over $1 billion and was closed in February 2022 after regulatory approval. In September 2022, Meta acquired Lofelt, a Berlin-based haptic tech startup. In December 2025, it was announced that Meta had acquired the AI-wearables startup Limitless. In the same month, it also acquired another AI startup, Manus AI, for $2 billion. Manus announced in December that its platform had achieved $100 million in recurring revenue just eight months after its launch, and Meta said it would scale the platform to many other businesses. In January 2026, it was announced that Meta's proposed acquisition of Manus was undergoing preliminary scrutiny by Chinese regulators; the examination concerns the cross-border transfer of artificial intelligence technology developed in China. Lobbying In 2020, Facebook, Inc. spent $19.7 million on lobbying, hiring 79 lobbyists. In 2019, it had spent $16.7 million on lobbying and had a team of 71 lobbyists, up from $12.6 million and 51 lobbyists in 2018. Facebook was the largest spender of lobbying money among the Big Tech companies in 2020. The lobbying team includes top congressional aide John Branscome, hired in September 2021 to help the company fend off threats from Democratic lawmakers and the Biden administration. In December 2024, Meta donated $1 million to the inauguration fund for then-President-elect Donald Trump. In 2025, Meta was listed among the donors funding the construction of the White House State Ballroom. Partnerships In February 2026, Meta announced a long-term partnership with Nvidia. Censorship In August 2024, Mark Zuckerberg sent a letter to Jim Jordan indicating that during the COVID-19 pandemic the Biden administration had repeatedly asked Meta to limit certain COVID-19 content, including humor and satire, on Facebook and Instagram. In 2016, Meta hired Jordana Cutler, a former employee of the Israeli Embassy to the United States, as its policy chief for Israel and the Jewish Diaspora. In this role, Cutler pushed for the censorship of accounts belonging to Students for Justice in Palestine chapters in the United States. Critics have said that Cutler's position gives the Israeli government undue influence over Meta policy, and that few countries have such high levels of contact with Meta policymakers.
Following Donald Trump's return to office in 2025, various sources noted possible censorship of content related to the Democratic Party on Instagram and other Meta platforms. In February 2025, a Meta representative flagged journalist Gil Duran's article and other "critiques of tech industry figures" as spam or sensitive content, limiting their reach. In March 2025, Meta attempted to block former employee Sarah Wynn-Williams from promoting or further distributing her memoir, Careless People, which includes allegations of unaddressed sexual harassment in the workplace by senior executives. The New York Times reported that the arbitration was among Meta's most forceful attempts to repudiate a former employee's account of workplace dynamics. Publisher Macmillan reacted to the ruling by the Emergency International Arbitral Tribunal by stating that it would ignore its provisions. As of 15 March 2025, hardback and digital versions of Careless People were being offered for sale by major online retailers. From October 2025, Meta began removing and restricting access to accounts and pages related to LGBTQ issues, reproductive health, and abortion information on its platforms. Martha Dimitratou, executive director of Repro Uncensored, called Meta's shadow-banning of these issues "one of the biggest waves of censorship we are seeing". Disinformation concerns Since its inception, Meta has been accused of hosting fake news and misinformation. In the wake of the 2016 United States presidential election, Zuckerberg began to take steps to reduce the prevalence of fake news, as the platform had been criticized for its potential influence on the outcome of the election. The company initially partnered with ABC News, the Associated Press, FactCheck.org, Snopes, and PolitiFact for its fact-checking initiative; as of 2018, it had over 40 fact-checking partners across the world, including The Weekly Standard. A May 2017 review by The Guardian found that the platform's fact-checking initiatives of partnering with third-party fact-checkers and publicly flagging fake news were regularly ineffective, and appeared to be having minimal impact in some cases. In 2018, journalists working as fact-checkers for the company criticized the partnership, stating that it had produced minimal results and that the company had ignored their concerns. In 2024, Meta's decision to continue disseminating a falsified video of US president Joe Biden, even after it had been proven fake, attracted criticism and concern. In January 2025, Meta ended its use of third-party fact-checkers in favor of a user-run community notes system similar to the one used on X. While Zuckerberg supported these changes, saying that the amount of censorship on the platform had been excessive, the decision drew criticism from fact-checking institutions, which stated that the changes would make it more difficult for users to identify misinformation. Meta also faced criticism for weakening its policies on hate speech that were designed to protect minorities and LGBTQ+ individuals from bullying and discrimination. While moving its content review teams from California to Texas, Meta changed its hateful conduct policy to eliminate restrictions on anti-LGBT and anti-immigrant hate speech, as well as explicitly allowing users to accuse LGBT people of being mentally ill or abnormal based on their sexual orientation or gender identity.
In January 2025, Meta faced significant criticism for its role in removing LGBTQ+ content from its platforms, amid its broader efforts to address anti-LGBTQ+ hate speech. The removal of LGBTQ+ themes was noted as part of the wider crackdown on content deemed to violate its community guidelines. Meta's content moderation policies, which were designed to combat harmful speech and protect users from discrimination, inadvertently led to the removal or restriction of LGBTQ+ content, particularly posts highlighting LGBTQ+ identities, support, or political issues. According to reports, LGBTQ+ posts, including those that simply celebrated pride or advocated for LGBTQ+ rights, were flagged and removed for reasons that critics argued were vague or inconsistently applied. Many LGBTQ+ activists and users on Meta's platforms expressed concern that such actions stifled visibility and expression, potentially isolating LGBTQ+ individuals and communities, especially in spaces that had historically been important for outreach and support. Lawsuits Numerous lawsuits have been filed against the company, both when it was known as Facebook, Inc. and as Meta Platforms. In March 2020, the Office of the Australian Information Commissioner (OAIC) sued Facebook for serious and repeated privacy breaches connected to the Cambridge Analytica scandal. Each violation of the Privacy Act carries a theoretical cumulative liability of $1.7 million. The OAIC estimated that a total of 311,127 Australians had been exposed. On December 8, 2020, the U.S. Federal Trade Commission and 46 states (excluding Alabama, Georgia, South Carolina, and South Dakota), the District of Columbia, and the territory of Guam launched Federal Trade Commission v. Facebook as an antitrust lawsuit against Facebook. The lawsuit concerned Facebook's acquisition of two competitors, Instagram and WhatsApp, and the ensuing monopolistic situation. The FTC alleged that Facebook held monopoly power in the U.S. social networking market and sought to force the company to divest Instagram and WhatsApp to break up the conglomerate. William Kovacic, a former chairman of the Federal Trade Commission, argued the case would be difficult to win, as it would require the government to construct a counterfactual internet in which the Facebook-WhatsApp-Instagram entity did not exist and prove that this harmed competition or consumers. In November 2025, it was ruled that Meta did not violate antitrust laws and held no monopoly in the market. On December 24, 2021, a court in Russia fined Meta $27 million after the company declined to remove unspecified banned content. The fine was reportedly tied to the company's annual revenue in the country. In May 2022, a lawsuit was filed in Kenya against Meta and its local outsourcing company Sama, alleging that Meta maintains poor working conditions in Kenya for workers moderating Facebook posts. According to the lawsuit, 260 screeners were declared redundant with confusing reasoning. The lawsuit seeks financial compensation and an order that outsourced moderators be given the same health benefits and pay scale as Meta employees. In June 2022, eight lawsuits were filed across the U.S. alleging that excessive exposure to platforms including Facebook and Instagram has led to attempted or actual suicides, eating disorders, and sleeplessness, among other issues. The litigation followed a former Facebook employee's testimony in Congress that the company refused to take responsibility.
The company noted that it had developed tools for parents to keep track of their children's activity on Instagram and set time limits, in addition to Meta's "Take a break" reminders. The company also provides resources specific to eating disorders and is developing AI to prevent children under the age of 13 from signing up for Facebook or Instagram. In June 2022, Meta settled a lawsuit with the US Department of Justice. The lawsuit, filed in 2019, alleged that the company enabled housing discrimination through targeted advertising, as it allowed homeowners and landlords to run housing ads excluding people based on sex, race, religion, and other characteristics, which the U.S. Department of Justice stated was in violation of the Fair Housing Act. Meta was handed a penalty of $115,054 and given until December 31, 2022, to stop using the discriminatory ad-targeting tool. In January 2023, Meta was fined €390 million for violations of the European Union General Data Protection Regulation. In May 2023, the European Data Protection Board fined Meta a record €1.2 billion for breaching European Union data privacy laws by transferring personal data of Facebook users to servers in the U.S. In July 2024, Meta agreed to pay the state of Texas US$1.4 billion to settle a lawsuit brought by Texas Attorney General Ken Paxton accusing the company of collecting users' biometric data without consent, setting a record for the largest privacy-related settlement ever obtained by a state attorney general. In October 2024, Meta Platforms faced lawsuits in Japan from 30 plaintiffs who claimed they were defrauded by fake investment ads on Facebook and Instagram featuring false celebrity endorsements; the plaintiffs are seeking approximately $2.8 million in damages. In April 2025, the Kenyan High Court ruled that a US$2.4 billion lawsuit, in which three plaintiffs claim that Facebook inflamed civil violence in Ethiopia in 2021, could proceed. In April 2025, Meta was fined €200 million ($230 million) for breaching the Digital Markets Act by imposing a "consent or pay" system that forced users either to allow their personal data to be used to target advertisements or to pay a subscription fee for advertising-free versions of Facebook and Instagram. In late April 2025, a case was filed against Meta in Ghana over the alleged psychological distress experienced by content moderators employed to take down disturbing social media content, including depictions of murders, extreme violence, and child sexual abuse. Meta had moved the moderation service to the Ghanaian capital of Accra after legal issues in the previous location, Kenya. The new moderation company is Teleperformance, a multinational corporation with a history of workers' rights violations. Reports suggest conditions there are worse than at the previous Kenyan location, with many workers afraid to speak out for fear of returning to conflict zones. Workers reported developing mental illnesses, attempted suicides, and low pay. On 26 January 2026, a case was filed in a New Mexico state court alleging that Mark Zuckerberg approved allowing minors to access artificial intelligence chatbot companions that safety staffers had warned were capable of sexual interactions. In 2020, the company UReputation, which had been involved in several cases concerning the management of "digital armies", filed a lawsuit against Facebook, accusing it of unlawfully transmitting personal data to third parties.
Legal actions were initiated in Tunisia, France, and the United States. In 2025, the United States District Court for the Northern District of Georgia approved a discovery procedure, allowing UReputation to access documents and evidence held by Meta. Structure As of October 2022, Meta had 83,553 employees worldwide. Meta Platforms is mainly owned by institutional investors, who hold around 80% of all shares, while insiders control the majority of voting shares. The three largest individual investors in 2024 were Mark Zuckerberg, Sheryl Sandberg, and Christopher K. Cox. Roger McNamee, an early Facebook investor and Zuckerberg's former mentor, said Facebook had "the most centralized decision-making structure I have ever encountered in a large company". Facebook co-founder Chris Hughes has stated that chief executive officer Mark Zuckerberg has too much power, that the company is now a monopoly, and that, as a result, it should be split into multiple smaller companies. In an op-ed in The New York Times, Hughes said he was concerned that Zuckerberg had surrounded himself with a team that did not challenge him, and that it was the U.S. government's job to hold him accountable and curb his "unchecked power". He also said that "Mark's power is unprecedented and un-American." Several U.S. politicians agreed with Hughes. European Union Commissioner for Competition Margrethe Vestager stated that splitting Facebook should be done only as "a remedy of the very last resort", and that it would not solve Facebook's underlying problems. Revenue Facebook ranked No. 34 on the 2020 Fortune 500 list of the largest United States corporations by revenue, with almost $86 billion in revenue, most of it coming from advertising. One analysis of 2017 data determined that the company earned US$20.21 per user from advertising. According to New York magazine, since its rebranding Meta has reportedly lost $500 billion as a result of new privacy measures put in place by companies such as Apple and Google that prevent Meta from gathering users' data. In February 2015, Facebook announced it had reached two million active advertisers, with most of the gain coming from small businesses; an active advertiser was defined as an entity that had advertised on the Facebook platform in the previous 28 days. In March 2016, Facebook announced it had reached three million active advertisers, with more than 70% from outside the United States. Prices for advertising follow a variable pricing model based on auctioned ad placements and the potential engagement level of the advertisement itself. As with other online advertising platforms like Google and Twitter, targeting of advertisements is one of the chief merits of digital advertising compared to traditional media. Marketing on Meta is employed through two methods based on the audience's viewing habits, likes and shares, and purchasing data: targeted audiences and "lookalike" audiences. The U.S. IRS challenged the valuation Facebook used when it transferred IP from the U.S. to Facebook Ireland (now Meta Platforms Ireland) in 2010 (which Facebook Ireland then revalued higher before charging out), as it was building its double Irish tax structure. The case is ongoing, and Meta faces a potential fine of $3–5bn. The U.S. Tax Cuts and Jobs Act of 2017 changed Facebook's global tax calculations.
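The effective-rate comparison laid out in the passage that follows reduces to simple percentage arithmetic. As a rough aid, here is a minimal Python sketch of that arithmetic; the statutory rates match those named in the text (GILTI 10.5%, FDII 13.125%), but the profit figure and the offset amounts are purely hypothetical placeholders chosen to reproduce the circa-11% and circa-12% effective rates described, not figures from Meta's or the IRS's actual calculations.

# Hypothetical sketch of the GILTI vs. FDII effective-rate comparison
# described in the passage below. Statutory rates are those named in
# the text; profit and offset figures are illustrative assumptions only.

GILTI_RATE = 0.105    # US minimum tax on global intangible (Irish) profits
FDII_RATE = 0.13125   # special US rate on foreign-derived IP income

irish_profit = 100.0  # hypothetical profit booked in Ireland

# Route 1: profits stay in Ireland; GILTI plus some Irish tax paid.
# The text puts the combined effective minimum at circa 11%.
assumed_irish_tax = 0.5
gilti_total = irish_profit * GILTI_RATE + assumed_irish_tax

# Route 2: the Irish business relocates to the US and pays the FDII
# rate, trimmed by tax relief and accelerated capital expensing to ~12%.
assumed_relief = 1.125
fdii_total = irish_profit * FDII_RATE - assumed_relief

print(f"GILTI route effective rate: {gilti_total / irish_profit:.1%}")  # -> 11.0%
print(f"FDII route effective rate:  {fdii_total / irish_profit:.1%}")   # -> 12.0%

The point of the sketch is only that the two routes land within about a percentage point of each other, which is why the passage calls the U.S./Irish tax difference insignificant.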
Meta Platforms Ireland is subject to the U.S. GILTI tax of 10.5% on global intangible profits (i.e., Irish profits). On the basis that Meta Platforms Ireland Limited is already paying some Irish tax, the effective minimum U.S. tax for the subsidiary would be circa 11%. In contrast, Meta Platforms Inc. would incur a special IP tax rate of 13.125% (the FDII rate) if its Irish business relocated to the U.S. Tax relief in the U.S. (21% vs. the Irish GILTI rate) and accelerated capital expensing would make this effective U.S. rate around 12%. The insignificance of the U.S./Irish tax difference was demonstrated when Facebook moved 1.5 billion non-EU accounts to the U.S. to limit exposure to GDPR. Facilities Users outside of the U.S. and Canada contract with Meta's Irish subsidiary, Meta Platforms Ireland Limited (formerly Facebook Ireland Limited), allowing Meta to avoid US taxes for all users in Europe, Asia, Australia, Africa and South America. Meta makes use of the Double Irish arrangement, which allows it to pay 2–3% corporation tax on all international revenue. In 2010, Facebook opened its fourth office, in Hyderabad, India, which houses online advertising and developer support teams and provides support to users and advertisers. In India, Meta is registered as Facebook India Online Services Pvt Ltd. It also has offices or planned sites in Chittagong, Bangladesh; Dublin, Ireland; and Austin, Texas, among other cities. Facebook opened its London headquarters in 2017 in Fitzrovia in central London. Facebook opened an office in Cambridge, Massachusetts in 2018. The offices were initially home to the "Connectivity Lab", a group focused on bringing Internet access to those who do not have it. In April 2019, Facebook opened its Taiwan headquarters in Taipei. In March 2022, Meta opened new regional headquarters in Dubai. In September 2023, it was reported that Meta had paid £149 million to British Land to break the lease on its Triton Square office in London, with another 18 years reportedly left on the lease. As of 2023, Facebook operated 21 data centers. It committed to purchasing 100% renewable energy and to reducing its greenhouse gas emissions by 75% by 2020. Its data center technologies include Fabric Aggregator, a distributed network system that accommodates larger regions and varied traffic patterns. Reception US Representative Alexandria Ocasio-Cortez responded in a tweet to Zuckerberg's announcement about Meta, saying: "Meta as in 'we are a cancer to democracy metastasizing into a global surveillance and propaganda machine for boosting authoritarian regimes and destroying civil society ... for profit!'" Frances Haugen, the ex-Facebook employee and whistleblower behind the Facebook Papers, responded to the rebranding efforts by expressing doubts about the company's ability to improve while led by Mark Zuckerberg, and urged the chief executive officer to resign. In November 2021, a video published by Inspired by Iceland went viral, in which a Zuckerberg look-alike promoted the Icelandverse, a place of "enhanced actual reality without silly looking headsets". In a December 2021 interview, SpaceX and Tesla chief executive officer Elon Musk said he could not see a compelling use-case for the VR-driven metaverse, adding: "I don't see someone strapping a frigging screen to their face all day." In January 2022, Louise Eccles of The Sunday Times logged into the metaverse with the intention of making a video guide. She wrote: Initially, my experience with the Oculus went well. 
I attended work meetings as an avatar and tried an exercise class set in the streets of Paris. The headset enabled me to feel the thrill of carving down mountains on a snowboard and the adrenaline rush of climbing a mountain without ropes. Yet switching to the social apps, where you mingle with strangers also using VR headsets, it was at times predatory and vile. Eccles described being sexually harassed by another user, as well as "accents from all over the world, American, Indian, English, Australian, using racist, sexist, homophobic and transphobic language". She also encountered users as young as 7 years old on the platform, despite Oculus headsets being intended for users over 13. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Friendica] | [TOKENS: 1245] |
Contents Friendica Friendica (formerly Friendika, originally Mistpark) is a free and open-source software distributed social network. It forms one part of the Fediverse, an interconnected and decentralized network of independently operated servers. Features Friendica users can connect with others via their own Friendica server, but may also fully integrate contacts from other platforms including Diaspora, Pump.io, GNU social, email, Discourse and, more recently, ActivityPub (including Mastodon, Pleroma and Pixelfed) and Bluesky into their 'newsfeed'. In addition to these two-way connections, users can also use Friendica as a publishing platform to post content to WordPress, Tumblr, Insanejournal and Libertree. Posting to Google+ was also supported until that service was shut down. In addition, RSS feeds can be ingested. Because users are distributed across many servers, their "addresses" consist of a username, the "@" symbol, and the domain name of the Friendica instance, in the same manner as email addresses are formed. Twitter support was available but was deprecated after API changes under Elon Musk's leadership rendered it unusable. Most of the functionality of major microblogging and social networking platforms is available in Friendica; for example, tagging users and groups via "@ mentions"; direct messages; hashtags; photo albums; "likes"; "dislikes"; comments; and re-shares of publicly visible posts. Published items can be edited and updated across the network. Comprehensive settings for privacy and the public visibility of posts allow users to regulate who can read which contributions, or see specific information about the user. Users can also create multiple profiles, allowing different groups of people (such as friends, or work mates) to see a different profile entirely when viewing the same page. User accounts can be downloaded or deleted, and can be imported to a different Friendica server if so required. Public forums can be created under different accounts, which can be switched between if the accounts are registered with the same email address. Development There is no corporation behind Friendica. The developers work on a voluntary basis and the project is run informally; the platform itself is used for communication between the developers. There are different forums within Friendica, such as "Friendica Developers" and "Friendica Support". The source code of Friendica is hosted on GitHub. Installation The developers aim to make installation of the software as simple as possible for technical laymen. They argue that decentralization on small servers is a key condition for the freedom of users and their self-determination. The difficulty level is similar to that of a WordPress installation. However, installing on shared hosting is sometimes difficult because of missing PHP5 modules. Some volunteers also run public servers so that newcomers can avoid installing the software themselves. List of clients Friendica implements multiple client-server API variants simultaneously. Along with endpoints needed to use enhanced Friendica features, it also implements the APIs used by GNU social and Twitter and, since version 2021.06, the one used by Mastodon. As a result, most GNU social and Mastodon clients can be used with Friendica. 
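As a rough illustration of this API compatibility, the short Python sketch below reads a Friendica server's public timeline through the Mastodon-style endpoint mentioned above. The hostname friendica.example is a placeholder rather than a real instance, and the exact response fields may vary between server versions.

# Minimal sketch: reading a Friendica public timeline via the
# Mastodon-compatible client API (exposed since Friendica 2021.06).
# "friendica.example" is a placeholder hostname, not a real server.
import json
import urllib.request

def fetch_public_timeline(host: str, limit: int = 5):
    # Mastodon-style REST endpoint; the public timeline needs no authentication.
    url = f"https://{host}/api/v1/timelines/public?limit={limit}"
    with urllib.request.urlopen(url) as response:
        return json.load(response)

for status in fetch_public_timeline("friendica.example"):
    # Each status carries the author's Fediverse address in the same
    # username@domain form described above.
    print(status["account"]["acct"], "->", status.get("url"))

Because Mastodon, Pleroma and Friendica expose the same endpoint shape, a client written this way does not need to know which server software it is talking to.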
Examples of Friendica-compatible clients include: Raccoon for Friendica, Friendiqa, Fedilab, AndStatus, Twidere and DiCa for Android; friendly for Sailfish OS; friclicli (a CLI client); choqok and Friendiqa for Linux; and Friendica Mobile for Windows 10. Reception Friendica was cited in January 2012 by Infoshop News as an "alternative to Google+ and Facebook" to be used in the Occupy Nigeria movement. In January 2012, Free Software Foundation Europe's blog cited Friendica as a reasonable alternative to centralized and controlled social networks such as Facebook or Google+. Biblical Notes writer J. Randal Matheny wrote of Friendica in January 2012: "One social networking option flying under the radar until recently deserves consideration as an already stable platform with a wide range of options, applications, plug-ins, and possibilities for opening up the Internet." In February 2012, the German computer magazine c't wrote: "Friendica demonstrates how decentralized social networks can become widely accepted." Another German publication, the professional magazine t3n, listed Friendica as a Facebook rival in an online article in March 2012 about Facebook alternatives. It compared Friendica with similar social networks like Diaspora and identi.ca. MSN Tech & Gadgets contributor Emma Boyes wrote about Friendica in May 2012: "why you'll love it: you can use it to access all the other social networks and get recommendations of new friends and groups to join. Friendica is open source and decentralised. There's no corporation behind it and there are extensive privacy settings. You can choose from a variety of user interfaces and it boasts some cool features—for instance, being able to key in a list of your interests and use the 'profile match' feature to recommend other users who share them with you. A word of warning, though, the site is not as user-friendly as the others on this list, so it may be this one is one for the geeks." Later reviews The acquisition of Twitter by Elon Musk revitalized public interest in Fediverse technologies in April 2022. Friendica received favorable reviews, with a PCMag article describing it as "mostly comparable to Facebook", drawing a parallel to Google+ and highlighting its use "for planning events, and its multiple profile feature means you can show a different face to your friends, coworkers, and family". The September 2022 issue of Linux Magazine contains a detailed comparison and walk-through of registering to and using the basic functions of Diaspora, Friendica and Mastodon. It describes Friendica as "intuitive" and highlights the "huge choice of account settings" and that "Friendica does not require any specific hardware, so you can use an old computer system as a server." Vulnerabilities In September 2020, a hotfix was released to patch a security vulnerability, present since the versions released in April 2019 (develop branch) and June 2019 (stable), that could leak sensitive information from the server environment. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/History_of_Israel#Early_Roman_period_(64_BCE–2nd_century_CE)] | [TOKENS: 14912] |
Contents History of Israel The history of Israel covers the Southern Levant region also known as Canaan, Palestine, or the Holy Land, which is the location of Israel and Palestine. From prehistory, as part of the Levantine corridor, the area witnessed waves of early humans from Africa, then the emergence of the Natufian culture c. 10,000 BCE. The region entered the Bronze Age c. 2,000 BCE with the development of Canaanite civilization. In the Iron Age, the kingdoms of Israel and Judah were established, entities central to the origins of the Abrahamic religions; the region has given rise to Judaism, Samaritanism, Christianity, Islam, Druzism, and Baha'ism. The Land of Israel has seen many conflicts, been controlled by various polities, and hosted various ethnic groups. In the following centuries, the Assyrian, Babylonian, Achaemenid, and Macedonian empires conquered the region. The Ptolemies and Seleucids vied for control during the Hellenistic period. Through the Hasmonean dynasty, the Jews maintained independence for a century before incorporation into the Roman Republic. As a result of the Jewish–Roman wars in the 1st and 2nd centuries CE, many Jews were killed or sold into slavery. Following the advent of Christianity, demographics shifted towards the newly Christian population, which replaced Jews as the majority by the 4th century. In the 7th century, Byzantine Christian rule was superseded in the Muslim conquest of the Levant by the Rashidun Caliphate; the region was later ruled by the Umayyad, Abbasid, and Fatimid caliphates before being conquered by the Seljuks in the 1070s. Throughout the 12th and 13th centuries, the Land of Israel saw wars between Christians and Muslims as part of the Crusades, with the Kingdom of Jerusalem overrun by Saladin's Ayyubids in the 12th century. The Crusaders held on to shrinking territories for another century. In the 13th century, the Land of Israel became subject to Mongol conquest, though this was stopped by the Mamluk Sultanate, under whose rule it remained until the 16th century. The Mamluks were defeated by the Ottoman Empire, and the region remained an Ottoman province until the early 20th century. The 19th century saw the rise of a Jewish nationalist movement in Europe known as Zionism; aliyah, Jewish immigration to Israel from the diaspora, increased. During World War I, the Sinai and Palestine campaign of the Allies led to the partition of the Ottoman Empire. Britain was granted control of the region by a League of Nations mandate, known as Mandatory Palestine, having committed to the creation of a Jewish homeland in the 1917 Balfour Declaration. Palestinian Arabs sought to prevent Jewish immigration, and tensions grew during the British administration. In 1947, the UN voted for the partition of Mandate Palestine and the creation of Jewish and Arab states. The Jews accepted the plan, while the Arabs rejected it. A civil war ensued, won by the Jews. In May 1948, the Israeli Declaration of Independence sparked the 1948 War, in which Israel repelled the armies of the neighbouring states. The war resulted in the 1948 Palestinian expulsion and flight and led to Jewish emigration from other parts of the Middle East; about 40% of the global Jewish population now resides in Israel. In 1979, the Egypt–Israel peace treaty was signed. In 1993, Israel signed the Oslo I Accord with the Palestine Liberation Organization, which was followed by the establishment of the Palestinian Authority. In 1994, the Israel–Jordan peace treaty was signed. 
Despite a long-running Israeli–Palestinian peace process, the conflict continues. Prehistory The oldest evidence of early humans in the territory of modern Israel, dating to 1.5 million years ago, was found in Ubeidiya near the Sea of Galilee. Flint tool artefacts have been discovered at Yiron, the oldest stone tools found anywhere outside Africa. The Daughters of Jacob Bridge over the Jordan River provides evidence of the control of fire by early humans around 780,000 years ago, one of the oldest known examples. In the Mount Carmel area, at el-Tabun and Es Skhul, Neanderthal and early modern human remains were found, showing the longest stratigraphic record in the region: 600,000 years of human activity, from the Lower Paleolithic to the present day, representing roughly a million years of human evolution. Other significant Paleolithic sites include Qesem cave. A 200,000-year-old fossil from Misliya Cave is the second-oldest evidence of anatomically modern humans found outside Africa. Other notable finds include the Skhul and Qafzeh hominins, as well as Manot 1. Around the 10th millennium BCE, the Natufian culture existed in the area. The beginning of agriculture in the region during the Neolithic Revolution is evidenced by sites such as Nahal Oren and Gesher. Bronze Age Canaan The Canaanites are archaeologically attested in the Middle Bronze Age (2100–1550 BCE), probably organized as independent or semi-independent city-states. Cities were often surrounded by massive earthworks, resulting in the archaeological mounds, or 'tells', common in the region today. In the late Middle Bronze Age, the Nile Delta in Egypt was settled by Canaanites who maintained close connections with Canaan. During that period, the Hyksos, dynasties of Canaanite/Asiatic origin, ruled much of Lower Egypt before being overthrown in the 16th century BCE. During the Late Bronze Age (1550–1200 BCE), there were Canaanite vassal states paying tribute to the New Kingdom of Egypt, which governed from Gaza. In 1457 BCE, Egyptian forces under the command of Pharaoh Thutmose III defeated a rebellious coalition of Canaanite vassal states led by Kadesh's king at the Battle of Megiddo. In the Late Bronze Age there was a period of civilizational collapse in the Middle East: Canaan fell into chaos, and Egyptian control ended. There is evidence that urban centers such as Hazor, Beit She'an, Megiddo, Ekron, Isdud and Ascalon were damaged or destroyed. Two groups appear at this time and are associated with the transition to the Iron Age (their iron weapons and tools were superior to the earlier bronze ones): the Sea Peoples, particularly the Philistines, who migrated from the Aegean world and settled on the southern coast, and the Israelites, whose settlements dotted the highlands. Some 2nd-millennium inscriptions about the semi-nomadic Habiru people are believed to be connected to the Hebrews, who were generally synonymous with the Biblical Israelites. Many scholars regard this connection as plausible since the two ethnonyms have similar etymologies, although others argue that Habiru refers to a social class found in every Near Eastern society, including Hebrew societies. Ancient Israel and Judah: Iron Age to Babylonian period The earliest recorded evidence of a people by the name of Israel (as ysrỉꜣr) occurs in the Egyptian Merneptah Stele, erected for Pharaoh Merneptah c. 1209 BCE. 
Archeological evidence indicates that during the early Iron Age I, hundreds of small villages were established on the highlands of Canaan on both sides of the Jordan River, primarily in Samaria, north of Jerusalem. These villages had populations of up to 400, were largely self-sufficient, and lived from herding, grain cultivation, and growing vines and olives, with some economic interchange. The pottery was plain and undecorated. Writing was known and available for recording, even in small sites. William G. Dever sees this "Israel" in the central highlands as a cultural and probably political entity, more an ethnic group than an organized state. Modern scholars believe that the Israelites and their culture branched out of the Canaanite peoples and their cultures through the development of a distinct monolatristic—and later monotheistic—religion centred on a national god Yahweh. According to McNutt, "It is probably safe to assume that sometime during Iron Age I a population began to identify itself as 'Israelite'", differentiating itself from the Canaanites through such markers as the prohibition of intermarriage, an emphasis on family history and genealogy, and religion. Philistine cooking tools, the prevalence of pork in Philistine diets, and locally made Mycenaean pottery—which later evolved into bichrome Philistine pottery—all support the Philistines' foreign origin. Their cities were large and elaborate, which—together with the findings—points to a complex, hierarchical society. Israel Finkelstein believes that the oldest Abraham traditions, which focus on the themes of land and offspring and possibly his altars in Hebron, originated in the Iron Age; Abraham's Mesopotamian heritage is not discussed in them. In the 10th century BCE, the Israelite kingdoms of Judah and Israel emerged. The Hebrew Bible states that these were preceded by a single kingdom ruled by Saul, David and Solomon, who is said to have built the First Temple. Archaeologists have debated whether the united monarchy ever existed, with those in favor of such a polity existing further divided between maximalists who support the Biblical accounts, and minimalists who argue that any such polity was likely smaller than suggested. Historians and archaeologists agree that the northern Kingdom of Israel existed by c. 900 BCE and the Kingdom of Judah existed by c. 850 BCE. The Kingdom of Israel was the more prosperous of the two kingdoms and soon developed into a regional power; during the days of the Omride dynasty, it controlled Samaria, Galilee, the upper Jordan Valley, the Sharon and large parts of the Transjordan. Samaria, the capital, was home to one of the largest Iron Age structures in the Levant. The Kingdom of Israel's capital moved between Shechem, Penuel and Tirzah before Omri settled it in Samaria, and the royal succession was often settled by a military coup d'état. The Kingdom of Judah was smaller but more stable; the Davidic dynasty ruled the kingdom for the four centuries of its existence, with the capital always in Jerusalem, controlling the Judaean Mountains, most of the Shephelah and the Beersheba valley in the northern Negev. In 854 BCE, according to the Kurkh Monoliths, an alliance between Ahab of Israel and Ben Hadad II of Aram-Damascus managed to repulse the incursions of the Assyrians, with a victory at the Battle of Qarqar. 
Another important discovery of the period is the Mesha Stele, a Moabite stele found in Dhiban when Emir Sattam Al-Fayez led Henry Tristram to it as they toured the lands of the vassals of the Bani Sakher. The stele is now in the Louvre. In the stele, Mesha, king of Moab, tells how Chemosh, the god of Moab, had been angry with his people and had allowed them to be subjugated to the Kingdom of Israel, but at length, Chemosh returned and assisted Mesha to throw off the yoke of Israel and restore the lands of Moab. It refers to Omri, king of Israel, and to the god Yahweh, and may contain another early reference to the House of David. The Kingdom of Israel fell to the Assyrians following a long siege of the capital Samaria around 720 BCE. The records of Sargon II indicate that he captured Samaria and deported 27,290 inhabitants to Mesopotamia, although it is likely that Shalmaneser captured the city, since both the Babylonian Chronicles and the Hebrew Bible viewed the fall of Israel as the signature event of his reign. The Assyrian deportations became the basis for the Jewish idea of the Ten Lost Tribes. Foreign groups were settled by the Assyrians in the territories of the fallen kingdom. The Samaritans claim to be descended from Israelites of ancient Samaria who were not expelled by the Assyrians. It is believed that refugees from the destruction of Israel moved to Judah, massively expanding Jerusalem and leading to the construction of the Siloam Tunnel during the rule of King Hezekiah (ruled 715–686 BCE). The Siloam inscription, a plaque written in Hebrew left by the construction team, was discovered in the tunnel in the 1880s and is today held by the Istanbul Archaeology Museum. During Hezekiah's rule, Sennacherib, the son of Sargon, attempted but failed to capture Judah. Assyrian records say that Sennacherib levelled 46 walled cities and besieged Jerusalem, leaving after receiving extensive tribute. Sennacherib erected the Lachish reliefs in Nineveh to commemorate a second victory at Lachish. The writings of four different "prophets" are believed to date from this period: Hosea and Amos in Israel, and Micah and Isaiah in Judah. These men were mostly social critics who warned of the Assyrian threat and acted as religious spokesmen. They exercised some form of free speech and may have played a significant social and political role in Israel and Judah. They urged rulers and the general populace to adhere to god-conscious ethical ideals, seeing the Assyrian invasions as a divine punishment of the collective resulting from ethical failures. Under King Josiah (ruler from 641 to 619 BCE), the Book of Deuteronomy was either rediscovered or written. The Book of Joshua and the accounts of the kingship of David and Solomon in the Book of Kings are believed to have the same author. The books are known as Deuteronomist and are considered a key step in the emergence of monotheism in Judah. They emerged at a time when Assyria was weakened by the rise of Babylon, and may represent a committing to text of pre-literate oral traditions. During the late 7th century BCE, Judah became a vassal state of the Neo-Babylonian Empire. In 601 BCE, Jehoiakim of Judah allied with Babylon's principal rival, Egypt, despite the strong remonstrances of the prophet Jeremiah. As a punishment, the Babylonians besieged Jerusalem in 597 BCE, and the city surrendered. The defeat was recorded by the Babylonians. 
Nebuchadnezzar pillaged Jerusalem and deported king Jehoiachin (Jeconiah), along with other prominent citizens, to Babylon; Zedekiah, his uncle, was installed as king. A few years later, Zedekiah launched another revolt against Babylon, and an army was sent to conquer Jerusalem. In 587 or 586 BCE, King Nebuchadnezzar II of Babylon conquered Jerusalem, destroyed the First Temple and razed the city. The Kingdom of Judah was abolished, and many of its citizens were exiled to Babylon. The former territory of Judah became a Babylonian province called Yehud, with its center in Mizpah, north of the destroyed Jerusalem. Tablets that describe King Jehoiachin's rations were found in the ruins of Babylon. He was eventually released by the Babylonians. According to both the Bible and the Talmud, the Davidic dynasty continued as head of Babylonian Jewry, called the "Rosh Galut" (exilarch or head of exile). Arab and Jewish sources show that the Rosh Galut continued to exist for another 1,500 years in what is now Iraq, ending in the eleventh century. Second Temple period In 538 BCE, Cyrus the Great of the Achaemenid Empire conquered Babylon and took over its empire. Cyrus issued a proclamation granting religious freedom to all peoples subjugated by the Babylonians (see the Cyrus Cylinder). According to the Bible, Jewish exiles in Babylon, including 50,000 Judeans led by Zerubbabel, returned to Judah to rebuild the Temple in Jerusalem. The Second Temple was subsequently completed c. 515 BCE. A second group of 5,000, led by Ezra and Nehemiah, returned to Judah in 456 BCE. The first group was empowered by the Persian king to enforce religious rules; the second had the status of governor and a royal mission to restore the walls of the city. The country remained a province of the Achaemenid empire, called Yehud, until 332 BCE. The final text of the Torah is thought to have been written during the Persian period (probably 450–350 BCE); the text was formed by editing and unifying earlier texts. The returning Israelites adopted an Aramaic script (also known as the Ashuri alphabet), which they brought back from Babylon; this is the current Hebrew script. The Hebrew calendar closely resembles the Babylonian calendar and probably dates from this period. The Bible describes tension between the returnees, the elite of the First Temple period, and those who had remained in Judah. It is possible that the returnees, supported by the Persian monarchy, became large landholders at the expense of the people who had remained to work the land in Judah, whose opposition to the Second Temple would have reflected a fear that exclusion from the cult would deprive them of land rights. Judah had become in practice a theocracy, ruled by hereditary High Priests and a Persian-appointed governor, frequently Jewish, charged with keeping order and seeing that tribute was paid. A Judean military garrison was placed by the Persians on Elephantine Island near Aswan in Egypt. In the early 20th century, 175 papyrus documents recording activity in this community were discovered, including the "Passover Papyrus", a letter instructing the garrison on how to correctly conduct the Passover feast. In 332 BCE, Alexander the Great of Macedon conquered the region as part of his campaign against the Achaemenid Empire. After his death in 323 BCE, his generals divided the empire, and Judea became a frontier region between the Seleucid Empire and the Ptolemaic Kingdom in Egypt. 
Following a century of Ptolemaic rule, Judea was conquered by the Seleucid Empire in 200 BCE at the battle of Panium. Hellenistic rulers generally respected Jewish culture and protected Jewish institutions. Judea was ruled by the hereditary office of the High Priest of Israel as a Hellenistic vassal. Nevertheless, the region underwent a process of Hellenization, which heightened tensions between Greeks, Hellenized Jews, and observant Jews. These tensions escalated into clashes involving a power struggle for the position of high priest and the character of the holy city of Jerusalem. When Antiochus IV Epiphanes desecrated the temple, forbade Jewish practices, and forcibly imposed Hellenistic norms on the Jews, several centuries of religious tolerance under Hellenistic control came to an end. In 167 BCE, the Maccabean revolt erupted after Mattathias, a Jewish priest of the Hasmonean lineage, killed a Hellenized Jew and a Seleucid official who participated in sacrifice to the Greek gods in Modi'in. His son Judas Maccabeus defeated the Seleucids in several battles, and in 164 BCE, he captured Jerusalem and restored temple worship, an event commemorated by the Jewish festival of Hanukkah. After Judas' death, his brothers Jonathan Apphus and Simon Thassi were able to establish and consolidate a vassal Hasmonean state in Judea, capitalizing on the Seleucid Empire's decline as a result of internal instability and wars with the Parthians, and by forging ties with the rising Roman Republic. Hasmonean leader John Hyrcanus was able to gain independence, doubling Judea's territories. He took control of Idumaea, where he converted the Edomites to Judaism, and invaded Scythopolis and Samaria, where he demolished the Samaritan Temple. Hyrcanus was also the first Hasmonean leader to mint coins. Under his sons, kings Aristobulus I and Alexander Jannaeus, Hasmonean Judea became a kingdom, and its territories continued to expand, now also covering the coastal plain, Galilee and parts of the Transjordan. Some scholars argue that the Hasmonean dynasty also institutionalized the final Jewish biblical canon. Under Hasmonean rule, the Pharisees, Sadducees and the mystic Essenes emerged as the principal Jewish social movements. The Pharisee sage Simeon ben Shetach is credited with establishing the first schools based around meeting houses. This was a key step in the emergence of Rabbinical Judaism. After Jannaeus' widow, queen Salome Alexandra, died in 67 BCE, her sons Hyrcanus II and Aristobulus II engaged in a civil war over the succession. Both parties requested Pompey's assistance, which paved the way for a Roman takeover of the kingdom. In 63 BCE, the Roman Republic conquered Judaea, ending Jewish independence under the Hasmoneans. Roman general Pompey intervened in the dynastic civil war and, after capturing Jerusalem, reinstated Hyrcanus II as high priest but denied him the title of king. Rome soon installed the Herodian dynasty—of Idumean descent but Jewish by conversion—as a loyal replacement for the nationalist Hasmoneans. In 37 BCE, Herod the Great, the first client king of this line, took power after defeating the restored Hasmonean king Antigonus II Mattathias. Herod imposed heavy taxes, suppressed opposition, and centralized authority, which fostered widespread resentment. 
Herod also carried out major monumental construction projects throughout his kingdom, and significantly expanded the Second Temple, which he transformed into one of the largest religious structures in the ancient world. After his death in 4 BCE, his kingdom was divided among his sons into a tetrarchy under continued Roman oversight. In 6 CE, Roman emperor Augustus transformed Judaea into a Roman province, deposing its last Jewish ruler, Herod Archelaus, and appointing a Roman governor in his place. That same year, a census triggered a small uprising by Judas of Galilee, the founder of a movement that rejected foreign authority and recognized only God as king. Over the next six decades, with the exception of a short period of Jewish autonomy under the client king Herod Agrippa I, the province remained under direct Roman administration. Some governors ruled with brutality and showed little regard for Jewish religious sensitivities, deepening resentment among the local population. This discontent was also fueled by poor governance, corruption, and growing economic inequality, along with rising tensions between Jews and neighboring populations over ethnic, religious, and territorial disputes. At the same time, collective memory of the Maccabean revolt and the period of Hasmonean independence continued to inspire hopes for national liberation from Roman control. In 64 CE, the Temple High Priest Joshua ben Gamla introduced a religious requirement for Jewish boys to learn to read from the age of six. Over the next few hundred years this requirement became steadily more ingrained in Jewish tradition. The Jewish–Roman wars were a series of large-scale revolts by Jewish subjects against the Roman Empire between 66 and 135 CE. The term primarily applies to the First Jewish–Roman War (66–73 CE) and the Bar Kokhba revolt (132–136 CE), both nationalist rebellions aimed at restoring Jewish independence in Judea. Some sources also include the Diaspora Revolt (115–117 CE), an ethno-religious conflict fought across the Eastern Mediterranean and including the Kitos War in Judaea. The Jewish–Roman wars had a devastating impact on the Jewish people, transforming them from a major population in the Eastern Mediterranean into a dispersed and persecuted minority. The First Jewish–Roman War culminated in the destruction of Jerusalem and other towns and villages in Judaea, resulting in significant loss of life and a considerable segment of the population being uprooted or displaced. Those who remained were stripped of any form of political autonomy. Subsequently, the brutal suppression of the Bar Kokhba revolt had even more severe consequences. Judea witnessed a significant depopulation, as many Jews were killed, expelled, or sold into slavery. The outcome of the conflict marked the end of efforts to reestablish a Jewish state until the modern era. Jews were banned from residing in the vicinity of Jerusalem, which the Romans rebuilt into the pagan colony of Aelia Capitolina, and the province of Judaea was renamed Syria Palaestina. Collectively, these events enhanced the role of the Jewish diaspora, relocating the Jewish demographic and cultural center to Galilee and eventually to Babylonia, with smaller communities across the Mediterranean, the Middle East, and beyond. The Jewish–Roman wars also had a major impact on Judaism, after the central worship site of Second Temple Judaism, the Second Temple in Jerusalem, was destroyed by Titus's troops in 70 CE. 
The destruction of the Temple led to a transformation in Jewish religious practices, emphasizing prayer, Torah study, and communal gatherings in synagogues. This pivotal shift laid the foundation for the emergence of Rabbinic Judaism, which has been the dominant form of Judaism since late antiquity, after the codification of the Babylonian Talmud. Late Roman and Byzantine periods As a result of the disastrous effects of the Bar Kokhba revolt, the Jewish presence in the region significantly dwindled. Over the next centuries, more Jews left for communities in the Diaspora, especially the large, rapidly growing Jewish communities in Babylonia and Arabia. Others remained in the Land of Israel, where the spiritual and demographic center shifted from the depopulated Judea to Galilee. Jewish presence also continued in the southern Hebron Hills, in Ein Gedi, and on the coastal plain. The Mishnah and the Jerusalem Talmud, huge compendiums of Rabbinical discussions, were compiled during the 2nd to 4th centuries CE in Tiberias and Jerusalem. Following the revolt, Judea's countryside was settled by pagan populations, including migrants from the nearby provinces of Syria, Phoenicia, and Arabia, whereas Aelia Capitolina, its immediate vicinity, and administrative centers were now inhabited by Roman veterans and settlers from the western parts of the empire. The Romans permitted a hereditary Rabbinical Patriarch from the House of Hillel, called the "Nasi", to represent the Jews in dealings with the Romans. One prominent figure was Judah ha-Nasi, credited with compiling the final version of the Mishnah, a vast collection of Jewish oral traditions. He also emphasized the importance of education in Judaism, leading to requirements that illiterate Jews be treated as outcasts; this might have contributed to some illiterate Jews converting to Christianity. Jewish seminaries, such as those at Shefaram and Bet Shearim, continued to produce scholars. The best of these became members of the Sanhedrin, which was located first at Sepphoris and later at Tiberias. In the Galilee, many synagogues dating from this period have been found, and the burial site of the Sanhedrin leaders was discovered in Beit She'arim. In the 3rd century, the Roman Empire faced an economic crisis and imposed heavy taxation to fund wars of imperial succession. This situation prompted additional Jewish migration from Syria Palaestina to the Sasanian Empire, known for its more tolerant environment; there, a flourishing Jewish community with important Talmudic academies thrived in Babylonia, engaging in a notable rivalry with the Talmudic academies of Palaestina. Early in the 4th century, the Emperor Constantine made Constantinople the capital of the East Roman Empire and made Christianity an accepted religion. His mother Helena made a pilgrimage to Jerusalem (326–328) and led the construction of the Church of the Nativity (birthplace of Jesus in Bethlehem), the Church of the Holy Sepulchre (burial site of Jesus in Jerusalem) and other key churches that still exist. The name Jerusalem was restored to Aelia Capitolina, which became a Christian city. Jews were still banned from living in Jerusalem, but were allowed to visit and worship at the site of the ruined temple. Over the course of the next century, Christians worked to eradicate "paganism", leading to the destruction of classical Roman traditions and the eradication of their temples. In 351–2, another Jewish revolt in the Galilee erupted against a corrupt Roman governor. 
The Roman Empire split in 390 CE, and the region became part of the Eastern Roman Empire, known as the Byzantine Empire. Under Byzantine rule, much of the region and its non-Jewish population were won over by Christianity, which eventually became the dominant religion in the region. The presence of holy sites drew Christian pilgrims, some of whom chose to settle, contributing to the rise of a Christian majority. Christian authorities encouraged this pilgrimage movement and appropriated lands, constructing magnificent churches at locations linked to biblical narratives. Additionally, monks established monasteries near pagan settlements, encouraging the conversion of local pagans. During the Byzantine period, the Jewish presence in the region declined, and it is believed that Jews lost their majority status in Palestine in the fourth century. While Judaism remained the sole non-Christian religion tolerated, restrictions on Jews gradually increased, prohibiting them from constructing new places of worship, holding public office, or owning Christian slaves. In 425, after the death of the last Nasi, Gamliel VI, the office of Nasi and the Sanhedrin were officially abolished, and the standing of the yeshivot weakened. The leadership void was gradually filled by the Jewish center in Babylonia, which would assume a leading role in the Jewish world for generations after the Byzantine period. During the 5th and 6th centuries CE, the region witnessed a series of Samaritan revolts against Byzantine rule. Their suppression resulted in the decline of Samaritan presence and influence, and further consolidated Christian domination. Though it is acknowledged that some Jews and Samaritans converted to Christianity during the Byzantine period, reliable historical records are limited, and they pertain to individual conversions rather than entire communities. In 611, Khosrow II, ruler of Sassanid Persia, invaded the Byzantine Empire. Helped by Jewish fighters recruited by Benjamin of Tiberias, he captured Jerusalem in 614, and the "True Cross" was carried off by the Persians. The Jewish Himyarite Kingdom in Yemen may also have provided support. Nehemiah ben Hushiel was made governor of Jerusalem. Christian historians of the period claimed the Jews massacred Christians in the city, but there is no archeological evidence of destruction, leading modern historians to question their accounts. In 628, Kavad II (son of Khosrow) returned Palestine and the True Cross to the Byzantines and signed a peace treaty with them. Following the Byzantine re-entry, Heraclius massacred the Jewish population of Galilee and Jerusalem, while renewing the ban on Jews entering the latter. Early Muslim period The Levant was conquered by an Arab army under the command of ʿUmar ibn al-Khaṭṭāb in 635 and became the province of Bilad al-Sham of the Rashidun Caliphate. Two military districts—Jund Filastin and Jund al-Urdunn—were established in Palestine. A new city called Ramlah was built as the Muslim capital of Jund Filastin, while Tiberias served as the capital of Jund al-Urdunn. The Byzantine ban on Jews living in Jerusalem came to an end. In 661, Mu'awiya I was crowned Caliph in Jerusalem, becoming the first of the (Damascus-based) Umayyad dynasty. In 691, Umayyad Caliph Abd al-Malik (685–705) constructed the Dome of the Rock shrine on the Temple Mount, where the two Jewish temples had been located. A second building, the Al-Aqsa Mosque, was also erected on the Temple Mount in 705. 
Both buildings were rebuilt in the 10th century following a series of earthquakes. In 750, Arab discrimination against non-Arab Muslims led to the Abbasid Revolution, and the Umayyads were replaced by the Abbasid Caliphs, who built a new city, Baghdad, to be their capital. This period is known as the Islamic Golden Age; the Arab Empire was the largest in the world, and Baghdad the largest and richest city. Both Arabs and minorities prospered across the region and much scientific progress was made. There were, however, setbacks: during the 8th century, the Caliph Umar II introduced a law requiring Jews and Christians to wear identifying clothing. Jews were required to wear yellow stars around their necks and on their hats, while Christians had to wear blue. Clothing regulations arose during repressive periods of Arab rule and were designed more to humiliate than to persecute non-Muslims. A poll tax was imposed on all non-Muslims by Islamic rulers, and failure to pay could result in imprisonment or worse. In 982, Caliph Al-Aziz Billah of the Cairo-based Fatimid dynasty conquered the region. The Fatimids were followers of Isma'ilism, a branch of Shia Islam, and claimed descent from Fatima, Mohammed's daughter. Around the year 1010, the Church of the Holy Sepulchre (believed to be Jesus' burial site) was destroyed by the Fatimid Caliph al-Hakim, who relented ten years later and paid for it to be rebuilt. In 1020, al-Hakim claimed divine status, and the newly formed Druze religion gave him the status of a messiah. Although the Arab conquest was relatively peaceful and did not cause widespread destruction, it did alter the country's demographics significantly. Over the ensuing several centuries, the region experienced a drastic decline in its population, from an estimated 1 million during Roman and Byzantine times to some 300,000 by the early Ottoman period. This demographic collapse was accompanied by a slow process of Islamization that resulted from the flight of non-Muslim populations, the immigration of Muslims, and local conversion. The majority of the remaining populace belonged to the lowest classes. While the Arab conquerors themselves left the area after the conquest and moved on to other places, the settlement of Arab tribes in the area both before and after the conquest also contributed to the Islamization. As a result, the Muslim population steadily grew and the area became gradually dominated by Muslims on a political and social level. During the early Islamic period, many Christians and Samaritans, belonging to the Byzantine upper class, migrated from the coastal cities to northern Syria and Cyprus, which were still under Byzantine control, while others fled to the central highlands and the Transjordan. As a result, the coastal towns, formerly important economic centers connected with the rest of the Byzantine world, were emptied of most of their residents. Some of these cities—namely Ashkelon, Acre, Arsuf, and Gaza—now fortified border towns, were resettled by Muslim populations, who developed them into significant Muslim centers. The region of Samaria also underwent a process of Islamization as a result of waves of conversion among the Samaritan population and the influx of Muslims into the area. The predominantly Jacobite Monophysitic Christian population had been hostile to Byzantine orthodoxy, and at times for that reason welcomed Muslim rule. There is no strong evidence for forced conversion, or that the jizya tax significantly affected such changes. 
The demographic situation in Palestine was further altered by urban decline under the Abbasids, and it is thought that the 749 earthquake hastened this process by increasing the number of Jews, Christians, and Samaritans who emigrated to diaspora communities, while leaving behind others who remained in the devastated cities and poor villages until they converted to Islam. Historical records and archeological evidence suggest that many Samaritans converted under Abbasid and Tulunid rule, after suffering through severe difficulties such as droughts, earthquakes, religious persecution, heavy taxes and anarchy. The same region also saw the settlement of Arabs. Over the period, the Samaritan population drastically decreased, with the rural Samaritan population converting to Islam, and small urban communities remaining in Nablus and Caesarea, as well as in Cairo, Damascus, Aleppo and Sarepta. Nevertheless, the Muslim population remained a minority in a predominantly Christian area, and it is likely that this status persisted until the Crusader period. Crusades and Mongols In 1095, Pope Urban II called upon Christians to wage a holy war and recapture Jerusalem from Muslim rule. Responding to this call, Christians launched the First Crusade in the same year, a military campaign aimed at retaking the Holy Land, ultimately resulting in the successful siege and conquest of Jerusalem in 1099. In the same year, the Crusaders conquered Beit She'an and Tiberias, and in the following decade, they captured the coastal cities with the support of Italian city-state fleets, establishing these coastal ports as crucial strongholds for Crusader rule in the region. Following the First Crusade, several Crusader states were established in the Levant, with the Kingdom of Jerusalem (Regnum Hierosolymitanum) assuming a preeminent position and enjoying special status among them. The population consisted predominantly of Muslims, Christians, Jews, and Samaritans, while the Crusaders remained a minority that relied on the local population to work the soil. The region saw the construction of numerous robust castles and fortresses, yet efforts to establish permanent European villages proved unsuccessful. Around 1180, Raynald of Châtillon, ruler of Transjordan, provoked increasing conflict with the Ayyubid Sultan Saladin (Salah-al-Din), leading to the defeat of the Crusaders at the 1187 Battle of Hattin (above Tiberias). Saladin was able to take Jerusalem peacefully and conquered most of the former Kingdom of Jerusalem. Saladin's court physician was Maimonides, a refugee from Almohad (Muslim) persecution in Córdoba, Spain, where all non-Muslim religions had been banned. The Christian world's response to the loss of Jerusalem came in the Third Crusade of 1190. After lengthy battles and negotiations, Richard the Lionheart and Saladin concluded the Treaty of Jaffa in 1192, whereby Christians were granted free passage to make pilgrimages to the holy sites, while Jerusalem remained under Muslim rule. In 1229, Jerusalem peacefully reverted to Christian control as part of a treaty between Holy Roman Emperor Frederick II and Ayyubid sultan al-Kamil that ended the Sixth Crusade. In 1244, Jerusalem was sacked by the Khwarezmian Tatars, who decimated the city's Christian population, drove out the Jews and razed the city. The Khwarezmians were driven out by the Ayyubids in 1247. Mamluk period Between 1258 and 1291, the area was the frontier between Mongol invaders (occasional Crusader allies) and the Mamluks of Egypt. 
The conflict impoverished the country and severely reduced the population. In Egypt, a caste of warrior slaves known as the Mamluks had gradually taken control of the kingdom. The Mamluks were mostly of Turkish origin; they were bought as children and then trained in warfare. They were highly prized warriors, who gave rulers independence from the native aristocracy. In Egypt, they took control of the kingdom following a failed invasion by the Crusaders (the Seventh Crusade). The first Mamluk Sultan, Qutuz of Egypt, defeated the Mongols at the Battle of Ain Jalut ("Goliath's spring", near Ein Harod), ending the Mongol advances. He was assassinated by one of his generals, Baibars, who went on to eliminate most of the Crusader outposts. The Mamluks ruled Palestine until 1516, regarding it as part of Syria. In Hebron, Jews were banned from worshipping at the Cave of the Patriarchs (the second-holiest site in Judaism); they were allowed only seven steps inside the site, and the ban remained in place until Israel assumed control of the West Bank in the Six-Day War. The Egyptian Mamluk sultan Al-Ashraf Khalil conquered the last outpost of Crusader rule in 1291. The Mamluks, continuing the policy of the Ayyubids, made the strategic decision to destroy the coastal area and bring desolation to many of its cities, from Tyre in the north to Gaza in the south. Ports were destroyed and various materials were dumped to make them inoperable. The goal was to prevent attacks from the sea, given the fear of the return of the Crusaders. This had a long-term effect on those areas, which remained sparsely populated for centuries. Activity in that period was concentrated further inland. With the 1492 expulsion of the Jews from Spain and the 1497 persecution of Jews and Muslims by Manuel I of Portugal, many Jews moved eastward, with some deciding to settle in Mamluk Palestine. As a consequence, the local Jewish community underwent a significant rejuvenation. The influx of Sephardic Jews began under Mamluk rule in the 15th century and continued throughout the 16th century, especially after the Ottoman conquest. As city-dwellers, the majority of Sephardic Jews preferred to settle in urban areas, mainly in Safed but also in Jerusalem, while the Musta'arbi community comprised the majority of the village Jews. Ottoman period Under the Mamluks, the area had been a province of Bilad al-Sham (Syria). It was conquered by Turkish Sultan Selim I in 1516–17, becoming a part of the province of Ottoman Syria for the next four centuries, first as the Damascus Eyalet and later as the Syria Vilayet (following the Tanzimat reorganization of 1864). With the more favorable conditions that followed the Ottoman conquest, the immigration of Jews fleeing Catholic Europe, which had already begun under Mamluk rule, continued, and soon an influx of exiled Sephardic Jews came to dominate the Jewish community in the area. In 1558, Selim II (1566–1574), successor to Suleiman, whose wife Nurbanu Sultan was Jewish, gave control of Tiberias to Doña Gracia Mendes Nasi, one of the richest women in Europe and an escapee from the Inquisition. She encouraged Jewish refugees to settle in the area and established a Hebrew printing press. Safed became a centre for the study of the Kabbalah and other Jewish religious studies, culminating with Joseph Karo's writing of the Shulchan Aruch – published in 1565 in Venice – which became the near-universal standard of Jewish religious law. 
Doña Nasi's nephew, Joseph Nasi, was made governor of Tiberias, and he encouraged Jewish settlement from Italy. In 1660, a Druze power struggle led to the destruction of Safed and Tiberias. In the late 18th century, a local Arab sheikh, Zahir al-Umar, created a de facto independent emirate in the Galilee. Ottoman attempts to subdue the sheikh failed, but after Zahir's death the Ottomans restored their rule in the area. In 1799, Napoleon briefly occupied the country and planned a proclamation inviting Jews to create a state. The proclamation was shelved following his defeat at Acre. In 1831, Muhammad Ali of Egypt, an Ottoman ruler who left the Empire and tried to modernize Egypt, conquered Ottoman Syria and imposed conscription, leading to the Arab revolt. In 1838, there was another Druze revolt. In 1839, Moses Montefiore met with Muhammed Pasha in Egypt and signed an agreement to establish 100–200 Jewish villages in the Damascus Eyalet of Ottoman Syria, but in 1840 the Egyptians withdrew before the deal was implemented, returning the area to Ottoman governorship. In 1844, Jews constituted the largest population group in Jerusalem, and by 1896 Jews constituted an absolute majority there, but the overall population of Palestine was 88% Muslim and 9% Christian. Between 1882 and 1903, approximately 35,000 Jews moved to Palestine in what is known as the First Aliyah. In the Russian Empire, Jews faced growing persecution and legal restrictions. Half the world's Jews lived in the Russian Empire, where they were restricted to living in the Pale of Settlement. Severe pogroms in the early 1880s and legal repression led to 2 million Jews emigrating from the Russian Empire; 1.5 million went to the United States, and other popular destinations were Germany, France, the United Kingdom, the Netherlands, Argentina and Palestine. The Zionist movement began in earnest in 1882 with Leon Pinsker's pamphlet Auto-Emancipation, which argued for the creation of a Jewish national homeland as a means to avoid the violence plaguing Jewish communities in Eastern Europe. At the 1884 Katowice Conference, Russian Jews established the Bilu and Hovevei Zion ("Lovers of Zion") movements with the aim of settling in Palestine. In 1878, Russian Jewish emigrants established the village of Petah Tikva ("The Beginning of Hope"), followed by Rishon LeZion ("First to Zion") in 1882. The existing Ashkenazi communities were concentrated in the Four Holy Cities, extremely poor, and reliant on donations (halukka) from groups abroad, while the new settlements were small farming communities that still relied on funding from the French baron Edmond James de Rothschild, who sought to establish profitable enterprises. Many early migrants could not find work and left, but despite the problems, more settlements arose and the community grew. After the Ottoman conquest of Yemen in 1881, a large number of Yemenite Jews also emigrated to Palestine, often driven by Messianism. In 1896, Theodor Herzl published Der Judenstaat (The Jewish State), in which he asserted that the solution to growing antisemitism in Europe (the so-called "Jewish Question") was to establish a Jewish state. In 1897, the World Zionist Organization was founded, and the First Zionist Congress proclaimed its aim "to establish a home for the Jewish people in Palestine secured under public law." The Congress chose Hatikvah ("The Hope") as its anthem. Between 1904 and 1914, around 40,000 Jews settled in the area now known as Israel (the Second Aliyah). 
In 1908, the World Zionist Organization set up the Palestine Bureau (also known as the "Eretz Israel Office") in Jaffa and began to adopt a systematic Jewish settlement policy. In 1909, residents of Jaffa bought land outside the city walls and built the first entirely Hebrew-speaking town, Ahuzat Bayit (later renamed Tel Aviv). In 1915–1916, Talaat Pasha of the Young Turks forced around a million Armenian Christians from their homes in Eastern Turkey, marching them south through Syria, in what is now known as the Armenian genocide. The number of dead is thought to be around 700,000, and hundreds of thousands more were forcibly converted to Islam. A community of survivors settled in Jerusalem, one of whom developed the now iconic Armenian pottery. During World War I, most Jews supported the Germans because they were fighting the Russians, who were regarded as the Jews' main enemy. In Britain, the government sought Jewish support for the war effort for a variety of reasons, including an antisemitic perception of "Jewish power" in the Ottoman Empire's Young Turks movement, which was based in Thessaloniki, the most Jewish city in Europe (40% of its 160,000 inhabitants were Jewish). The British also hoped to secure American Jewish support for US intervention on Britain's behalf. There was already sympathy for the aims of Zionism in the British government, including the Prime Minister, Lloyd George. Over 14,000 Jews were expelled from the Jaffa area by the Ottoman military commander in 1914–1915, due to suspicions that they were subjects of Russia, an enemy power, or Zionists wishing to detach Palestine from the Ottoman Empire. When the entire population of both Jaffa and Tel Aviv, including Muslims, was subjected to an expulsion order in April 1917, the affected Jews could not return until the British conquest, which drove the Turks out of Southern Syria by 1918. In 1917, the British foreign minister, Arthur Balfour, sent a public letter to the British Lord Rothschild, a leading member of his party and leader of the Jewish community. The letter subsequently became known as the Balfour Declaration. It stated that the British Government "view[ed] with favour the establishment in Palestine of a national home for the Jewish people". The declaration provided the British government with a pretext for claiming and governing the country. New Middle Eastern boundaries were decided by an agreement between British and French bureaucrats. A Jewish Legion composed largely of Zionist volunteers organized by Ze'ev Jabotinsky and Joseph Trumpeldor participated in the British invasion; it had also participated in the failed Gallipoli Campaign. The Nili Zionist spy network provided the British with details of Ottoman plans and troop concentrations. The Ottoman Empire had chosen to ally itself with Germany when the war began. Arab leaders dreamed of freeing themselves from Ottoman rule and establishing self-government or forming an independent Arab state, so Britain contacted Hussein bin Ali of the Kingdom of Hejaz and proposed cooperation. Together they organized the Arab revolt, which Britain supplied with very large quantities of rifles and ammunition. In cooperation between British artillery and Arab infantry, the city of Aqaba on the Red Sea was captured. The Arab army then continued north while Britain attacked the Ottomans from the sea. In 1917–1918, Jerusalem and Damascus were taken from the Ottomans. Britain then broke off cooperation with the Arab army. 
It turned out that Britain had already entered into the secret Sykes–Picot Agreement, under which only Britain and France would be allowed to administer the land conquered from the Ottoman Empire. After the Ottomans were pushed out, Palestine came under martial law. The British, French and Arab Occupied Enemy Territory Administration governed the area from shortly before the armistice with the Ottomans until the promulgation of the Mandate in 1920. Mandatory Palestine The British Mandate (in effect, British rule) of Palestine, including the Balfour Declaration, was confirmed by the League of Nations in 1922 and came into effect in 1923. The territory of Transjordan was also covered by the Mandate, but under separate rules that excluded it from the Balfour Declaration. Britain signed a treaty with the United States (which did not join the League of Nations) in which the United States endorsed the terms of the Mandate; the treaty was approved unanimously by both the U.S. Senate and House of Representatives. The Balfour Declaration was published on 2 November 1917, and the Bolsheviks seized control of Russia a week later, leading to civil war in the Russian Empire. Between 1918 and 1921, a series of pogroms led to the death of at least 100,000 Jews (mainly in what is now Ukraine) and the displacement of a further 600,000 as refugees. This led to further migration to Palestine. Between 1919 and 1923, some 40,000 Jews arrived in Palestine in what is known as the Third Aliyah. Many of the Jewish immigrants of this period were Socialist Zionists and supported the Bolsheviks. The migrants, known as pioneers (halutzim), were experienced or trained in agriculture and established self-sustaining communes called kibbutzim. Malarial marshes in the Jezreel Valley and Hefer Plain were drained and converted to agricultural use. Land was bought by the Jewish National Fund, a Zionist charity that collected money abroad for that purpose. After the French victory over the Arab Kingdom of Syria ended hopes of Arab independence, there were clashes between Arabs and Jews in Jerusalem during the 1920 Nebi Musa riots and in Jaffa the following year, leading to the establishment of the Haganah underground Jewish militia. A Jewish Agency was created, which issued the entry permits granted by the British and distributed funds donated by Jews abroad. Between 1924 and 1929, over 80,000 Jews arrived in the Fourth Aliyah. They were fleeing antisemitism and heavy tax burdens imposed on trade in Poland and Hungary, inspired by Zionism, and motivated by the closure of United States borders under the Immigration Act of 1924, which severely limited immigration from Eastern and Southern Europe. Pinhas Rutenberg, a former Commissar of St Petersburg in Russia's pre-Bolshevik Kerensky Government, built the first electricity generators in Palestine. In 1925, the Jewish Agency established the Hebrew University in Jerusalem and the Technion (technological university) in Haifa. British authorities introduced the Palestine pound (worth 1,000 "mils") in 1927, replacing the Egyptian pound as the unit of currency in the Mandate. From 1928, the democratically elected Va'ad Leumi (Jewish National Council or JNC) became the main administrative institution of the Palestine Jewish community (Yishuv) and included non-Zionist Jews. As the Yishuv grew, the JNC adopted more government-type functions, such as education, health care, and security. With British permission, the Va'ad Leumi raised its own taxes and ran independent services for the Jewish population. 
In 1929, tensions grew over the Kotel (Wailing Wall), the holiest site of modern Judaism, which was then a narrow alleyway where the British banned Jews from using chairs or curtains; many of the worshippers were elderly and needed seats, and curtains were wanted to separate women from men. The Mufti of Jerusalem claimed the wall was Muslim property and deliberately had cattle driven through the alley, alleging that the Jews were seeking control of the Temple Mount. This provided the spark for the August 1929 Palestine riots. The main victims were the (non-Zionist) ancient Jewish community at Hebron, who were massacred. The riots led to right-wing Zionists establishing their own militia in 1931, the Irgun Tzvai Leumi (National Military Organization, known in Hebrew by its acronym "Etzel"), which was committed to a more aggressive policy towards the Arab population. During the interwar period, the perception grew that there was an irreconcilable tension between the two Mandatory functions of providing for a Jewish homeland in Palestine and preparing the country for self-determination. The British rejected the principle of majority rule or any other measure that would give the Arab population, who formed the majority, control over Palestinian territory. Between 1929 and 1938, 250,000 Jews arrived in Palestine (the Fifth Aliyah). In 1933, the Jewish Agency and the Nazis negotiated the Ha'avara Agreement (transfer agreement), under which 50,000 German Jews would be transferred to Palestine. The Jews' possessions were confiscated and, in return, the Nazis allowed the Ha'avara organization to purchase 14 million pounds' worth of German goods for export to Palestine and use them to compensate the immigrants. Although many Jews wanted to leave Nazi Germany, the Nazis prevented Jews from taking any money and restricted them to two suitcases, so few could pay the British entry tax. The agreement was controversial, and the Labour Zionist leader who negotiated it, Haim Arlosoroff, was assassinated in Tel Aviv in 1933. The assassination was used by the British to create tension between the Zionist left and the Zionist right. Arlosoroff had been the boyfriend of Magda Ritschel some years before she married Joseph Goebbels. There has been speculation that he was assassinated by the Nazis to hide the connection, but there is no evidence for it. Between 1933 and 1936, 174,000 Jews arrived despite the large sums the British demanded for immigration permits: families with capital had to prove they had 1,000 pounds (equivalent to £85,824 in 2023), professionals 500 pounds, and skilled labourers 250 pounds. Jewish immigration and Nazi propaganda contributed to the large-scale 1936–1939 Arab revolt in Palestine, a largely nationalist uprising directed at ending British rule. The head of the Jewish Agency, Ben-Gurion, responded to the Arab Revolt with a policy of "Havlagah", self-restraint and a refusal to be provoked by Arab attacks, in order to prevent polarization. The Etzel group broke off from the Haganah in opposition to this policy. The British responded to the revolt with the Peel Commission (1936–37), a public inquiry that recommended that an exclusively Jewish territory be created in the Galilee and western coast (including the population transfer of 225,000 Arabs), with the rest becoming an exclusively Arab area. 
The two main Jewish leaders, Chaim Weizmann and David Ben-Gurion, had convinced the Zionist Congress to give equivocal approval to the Peel recommendations as a basis for further negotiation. The plan was rejected outright by the Palestinian Arab leadership, who renewed the revolt, which caused the British to abandon the plan as unworkable. Testifying before the Peel Commission, Weizmann said "There are in Europe 6,000,000 people ... for whom the world is divided into places where they cannot live and places where they cannot enter." In 1938, the US called an international conference (the Évian Conference) to address the question of the vast numbers of Jews trying to escape Europe. Britain made its attendance contingent on Palestine being kept out of the discussion, and no Jewish representatives were invited. The Nazis proposed their own solution: that the Jews of Europe be shipped to Madagascar (the Madagascar Plan). The conference proved fruitless, and the Jews remained trapped in Europe. With millions of Jews trying to leave Europe and every country closed to Jewish migration, the British decided to close Palestine. The White Paper of 1939 recommended that an independent Palestine, governed jointly by Arabs and Jews, be established within 10 years. The White Paper agreed to allow 75,000 Jewish immigrants into Palestine over the period 1940–44, after which migration would require Arab approval. Both the Arab and Jewish leadership rejected the White Paper. In March 1940 the British High Commissioner for Palestine issued an edict banning Jews from purchasing land in 95% of Palestine. Jews now resorted to illegal immigration (Aliyah Bet or "Ha'apalah"), often organized by the Mossad Le'aliyah Bet and the Irgun. With no outside help and no countries ready to admit them, very few Jews managed to escape Europe between 1939 and 1945. Those caught by the British were mostly imprisoned in Mauritius. During the Second World War, the Jewish Agency worked to establish a Jewish army that would fight alongside the British forces. Churchill supported the plan, but British military and government opposition led to its rejection; the British demanded that the number of Jewish recruits match the number of Arab recruits. In June 1940, Italy declared war on the British Commonwealth and sided with Germany. Within a month, Italian planes bombed Tel Aviv and Haifa, inflicting multiple casualties. In May 1941, the Palmach was established to defend the Yishuv against the planned Axis invasion through North Africa. The British refusal to provide arms to the Jews, even when Rommel's forces were advancing through Egypt in June 1942 (intent on occupying Palestine), and the 1939 White Paper led to the emergence of a Zionist leadership in Palestine that believed conflict with Britain was inevitable. Despite this, the Jewish Agency called on Palestine's Jewish youth to volunteer for the British Army; 30,000 Palestinian Jews and 12,000 Palestinian Arabs enlisted in the British armed forces during the war. In June 1944 the British agreed to create a Jewish Brigade that would fight in Italy. Approximately 1.5 million Jews around the world served in every branch of the Allied armies, mainly in the Soviet and US forces; 200,000 Jews died serving in the Soviet army alone. A small group (about 200 activists), dedicated to resisting the British administration in Palestine, broke away from the Etzel (which advocated support for Britain during the war) and formed the "Lehi" (Stern Gang), led by Avraham Stern. 
In 1942, the USSR released the Revisionist Zionist leader Menachem Begin from the Gulag and he went to Palestine, taking command of the Etzel organization with a policy of increased conflict against the British. At about the same time, Yitzhak Shamir escaped from the camp in Eritrea where the British were holding Lehi activists without trial and took command of the Lehi (Stern Gang). Jews in the Middle East were also affected by the war. Most of North Africa came under Nazi control and many Jews were used as slave labour. The 1941 pro-Axis coup in Iraq was accompanied by massacres of Jews. The Jewish Agency put together plans for a last stand in the event of Rommel invading Palestine (the Nazis planned to exterminate Palestine's Jews). Between 1939 and 1945, the Nazis, aided by local forces, led systematic efforts to kill every person of Jewish extraction in Europe (the Holocaust), causing the deaths of approximately 6 million Jews, a quarter of them children. The Polish and German Jewish communities, which had played an important role in defining the pre-1945 Jewish world, mostly ceased to exist. In the United States and Palestine, Jews of European origin became disconnected from their families and roots. As the Holocaust mainly affected Ashkenazi Jews, Sephardi and Mizrahi Jews, who had been a minority, became a much more significant factor in the Jewish world. Those Jews who survived in central Europe were displaced persons (refugees); an Anglo-American Committee of Inquiry, established to examine the Palestine issue, surveyed their ambitions and found that over 95% wanted to migrate to Palestine. In the Zionist movement, the moderate pro-British (and British citizen) Weizmann, whose son died flying in the RAF, was undermined by Britain's anti-Zionist policies. Leadership of the movement passed to the Jewish Agency in Palestine, now dominated by the anti-British Socialist-Zionist party (Mapai) led by David Ben-Gurion. The British Empire was severely weakened by the war, and in the Middle East the war had made Britain conscious of its dependence on Arab oil. Shortly after VE Day, the Labour Party won the general election in Britain. Although Labour Party conferences had for years called for the establishment of a Jewish state in Palestine, the Labour government now decided to maintain the 1939 White Paper policies. Illegal migration (Aliyah Bet) became the main form of Jewish entry into Palestine. Across Europe, Bricha ("flight"), an organization of former partisans and ghetto fighters, smuggled Holocaust survivors from Eastern Europe to Mediterranean ports, where small boats tried to breach the British blockade of Palestine. Meanwhile, Jews from Arab countries began moving into Palestine overland. Despite British efforts to curb immigration, over 110,000 Jews entered Palestine during the 14 years of the Aliyah Bet. By the end of World War II, the Jewish population of Palestine had increased to 33% of the total population. In an effort to win independence, Zionists now waged a guerrilla war against the British. The main underground Jewish militia, the Haganah, formed an alliance called the Jewish Resistance Movement with the Etzel and Stern Gang to fight the British. In June 1946, following instances of Jewish sabotage, such as the Night of the Bridges, the British launched Operation Agatha, arresting 2,700 Jews, including the leadership of the Jewish Agency, whose headquarters were raided. Those arrested were held without trial. 
On 4 July 1946 a massive pogrom in Kielce, Poland, led to a wave of Holocaust survivors fleeing Europe for Palestine. Three weeks later, the Irgun bombed the British military headquarters in the King David Hotel in Jerusalem, killing 91 people. In the days following the bombing, Tel Aviv was placed under curfew and over 120,000 Jews, nearly 20% of the Jewish population of Palestine, were questioned by the police. In the US, Congress criticized British handling of the situation and considered delaying loans that were vital to British post-war recovery. The alliance between the Haganah and Etzel was dissolved after the King David bombing. Between 1945 and 1948, 100,000–120,000 Jews left Poland. Their departure was largely organized by Zionist activists under the umbrella of the semi-clandestine organization Bricha ("flight"), which was also responsible for the organized emigration of Jews from Romania, Hungary, Czechoslovakia and Yugoslavia, bringing the total (including Poland) to 250,000 Holocaust survivors. The British imprisoned the Jews trying to enter Palestine in the Atlit detainee camp and Cyprus internment camps. Those held were mainly Holocaust survivors, including large numbers of children and orphans. In response to Cypriot fears that the Jews would never leave, and because the 75,000-person quota established by the 1939 White Paper had never been filled, the British allowed the refugees to enter Palestine at a rate of 750 per month. On 2 April 1947, the United Kingdom requested that the question of Palestine be handled by the General Assembly. The General Assembly created a committee, the United Nations Special Committee on Palestine (UNSCOP), to report on "the question of Palestine". In July 1947, UNSCOP visited Palestine and met with Jewish and Zionist delegations; the Arab Higher Committee boycotted the meetings. During the visit, the British Foreign Secretary, Ernest Bevin, ordered that passengers from an Aliyah Bet ship, SS Exodus 1947, be sent back to Europe. The migrants, Holocaust survivors, were forcibly removed from the ship by British troops at Hamburg, Germany. The principal non-Zionist Orthodox Jewish (or Haredi) party, Agudat Israel, recommended to UNSCOP that a Jewish state be set up, after reaching a religious status quo agreement with Ben-Gurion. The agreement granted an exemption from military service to a quota of yeshiva (religious seminary) students and to all Orthodox women, made the Sabbath the national weekend, guaranteed kosher food in government institutions and allowed Orthodox Jews to maintain a separate education system. The majority report of UNSCOP proposed "an independent Arab State, an independent Jewish State, and the City of Jerusalem", the last to be under "an International Trusteeship System". On 29 November 1947, in Resolution 181 (II), the General Assembly adopted the majority report of UNSCOP with slight modifications. The Plan also called for the British to allow "substantial" Jewish migration by 1 February 1948. Neither Britain nor the UN Security Council took any action to implement the recommendation, and Britain continued detaining Jews attempting to enter Palestine. Concerned that partition would severely damage Anglo-Arab relations, Britain denied UN representatives access to Palestine during the period between the adoption of Resolution 181 (II) and the termination of the British Mandate. The British withdrawal was completed in May 1948. 
However, Britain continued to hold Jewish immigrants of "fighting age" and their families on Cyprus until March 1949. The General Assembly's vote caused joy in the Jewish community and anger in the Arab community. Violence broke out between the sides, escalating into civil war. From January 1948, operations became increasingly militarized, with the intervention of a number of Arab Liberation Army regiments inside Palestine, each active in a variety of distinct sectors around the different coastal towns. They consolidated their presence in Galilee and Samaria. Abd al-Qadir al-Husayni came from Egypt with several hundred men of the Army of the Holy War. Having recruited a few thousand volunteers, he organized the blockade of the 100,000 Jewish residents of Jerusalem. The Yishuv tried to supply the city using convoys of up to 100 armoured vehicles, but largely failed. By March, almost all of the Haganah's armoured vehicles had been destroyed, the blockade was in full operation, and hundreds of Haganah members who had tried to bring supplies into the city had been killed. Up to 100,000 Arabs from the urban upper and middle classes of Haifa, Jaffa and Jerusalem, or from Jewish-dominated areas, evacuated abroad or to Arab centres to the east. This situation caused the US to withdraw its support for the partition plan, encouraging the Arab League to believe that the Palestinian Arabs, reinforced by the Arab Liberation Army, could put an end to the plan for partition. The British, on the other hand, decided on 7 February 1948 to support the annexation of the Arab part of Palestine by Transjordan, whose army was commanded by British officers. David Ben-Gurion reorganized the Haganah and made conscription obligatory: every Jewish man and woman in the country had to receive military training. Thanks to funds raised by Golda Meir from sympathisers in the United States, and Stalin's decision to support the Zionist cause, the Jewish representatives of Palestine were able to purchase significant quantities of arms in Eastern Europe. Ben-Gurion gave Yigael Yadin responsibility for planning against the announced intervention of the Arab states. The result of his analysis was Plan Dalet, under which the Haganah passed from the defensive to the offensive. The plan sought to establish Jewish territorial continuity by conquering mixed zones. Tiberias, Haifa, Safed, Beisan, Jaffa and Acre fell, resulting in the flight of more than 250,000 Palestinian Arabs. On 14 May 1948, the day the last British forces left Haifa, the Jewish People's Council gathered at the Tel Aviv Museum and proclaimed the establishment of a Jewish state, to be known as the State of Israel. State of Israel In 1948, following the 1947–1948 civil war in Mandatory Palestine, the Israeli Declaration of Independence sparked the 1948 Arab–Israeli War. This resulted in the 1948 Palestinian expulsion and flight from the land that the State of Israel came to control, and led to waves of Jewish immigration from other parts of the Middle East. The latter half of the 20th century saw further conflicts between Israel and its neighbouring Arab nations. In 1967, the Six-Day War erupted; in its aftermath, Israel captured and occupied the Golan Heights from Syria, the West Bank from Jordan, and the Gaza Strip and the Sinai Peninsula from Egypt. In 1973, the Yom Kippur War began with an attack by Egypt on the Israeli-occupied Sinai Peninsula. In 1979, the Egypt–Israel peace treaty was signed, based on the Camp David Accords. 
In 1993, Israel signed the Oslo I Accord with the Palestine Liberation Organization, which was followed by the establishment of the Palestinian National Authority. In 1994, the Israel–Jordan peace treaty was signed. Despite efforts to finalize the peace agreement, the conflict continues. Demographics See also Notes References Further reading External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Black_hole#cite_note-266] | [TOKENS: 13839] |
Contents Black hole A black hole is an astronomical body so compact that its gravity prevents anything, including light, from escaping. Albert Einstein's theory of general relativity predicts that a sufficiently compact mass will form a black hole. The boundary of no escape is called the event horizon. In general relativity, a black hole's event horizon seals an object's fate but produces no locally detectable change when crossed. General relativity also predicts that every black hole should have a central singularity, where the curvature of spacetime is infinite. In many ways, a black hole acts like an ideal black body, as it reflects no light. Quantum field theory in curved spacetime predicts that event horizons emit Hawking radiation, with the same spectrum as a black body of a temperature inversely proportional to the black hole's mass. This temperature is of the order of billionths of a kelvin for stellar black holes, making it essentially impossible to observe directly. Objects whose gravitational fields are too strong for light to escape were first considered in the 18th century by John Michell and Pierre-Simon Laplace. In 1916, Karl Schwarzschild found the first modern solution of general relativity that would characterise a black hole; the Schwarzschild metric is named after him. In 1958, David Finkelstein first interpreted Schwarzschild's model as a region of space from which nothing can escape. Black holes were long considered a mathematical curiosity; it was not until the 1960s that theoretical work showed they were a generic prediction of general relativity. The first known black hole was Cygnus X-1, identified independently by several researchers in 1971. Black holes typically form when massive stars collapse at the end of their life cycle. After a black hole has formed, it can grow by absorbing mass from its surroundings. Supermassive black holes of millions of solar masses may form by absorbing other stars and merging with other black holes, or via direct collapse of gas clouds. There is consensus that supermassive black holes exist in the centres of most galaxies. The presence of a black hole can be inferred through its interaction with other matter and with electromagnetic radiation such as visible light. Matter falling toward a black hole can form an accretion disk of infalling plasma, heated by friction and emitting light. In extreme cases, this creates a quasar, among the brightest objects in the universe. Merging black holes can also be detected by observing the gravitational waves they emit. If other stars are orbiting a black hole, their orbits can be used to determine the black hole's mass and location; such observations can exclude possible alternatives such as neutron stars. In this way, astronomers have identified numerous stellar black hole candidates in binary systems and established that the radio source known as Sagittarius A*, at the core of the Milky Way galaxy, contains a supermassive black hole of about 4.3 million solar masses. History The idea of a body so massive that even light could not escape was first proposed in the late 18th century by English astronomer and clergyman John Michell and independently by French scientist Pierre-Simon Laplace. Both scholars proposed very large stars, in contrast to the modern concept of an extremely dense object. 
Michell's idea, in a short part of a letter published in 1784, calculated that a star with the same density as the Sun but 500 times its radius would not let any emitted light escape; the surface escape velocity would exceed the speed of light (a short numerical check of this claim is sketched below).: 122 Michell correctly hypothesized that such supermassive but non-radiating bodies might be detectable through their gravitational effects on nearby visible bodies. In 1796, Laplace mentioned that a star could be invisible if it were sufficiently large while speculating on the origin of the Solar System in his book Exposition du Système du Monde. Franz Xaver von Zach asked Laplace for a mathematical analysis, which Laplace provided and published in a journal edited by von Zach. In 1905, Albert Einstein showed that the laws of electromagnetism are invariant under a Lorentz transformation: they are identical for observers travelling at different velocities relative to each other. This discovery became known as the principle of special relativity. Although the laws of mechanics had already been shown to be invariant, gravity had yet to be included.: 19 In 1907, Einstein published a paper proposing his equivalence principle, the hypothesis that inertial mass and gravitational mass have a common cause. Using the principle, Einstein predicted the gravitational redshift and half of the lensing effect of gravity on light; the full prediction of gravitational lensing required the development of general relativity.: 19 By 1915, Einstein had refined these ideas into his general theory of relativity, which explains how matter affects spacetime, which in turn affects the motion of other matter. This formed the basis for black hole physics. Only a few months after Einstein published the field equations describing general relativity, astrophysicist Karl Schwarzschild set out to apply the idea to stars. He assumed spherical symmetry with no spin and found a solution to Einstein's equations.: 124 A few months after Schwarzschild, Johannes Droste, a student of Hendrik Lorentz, independently gave the same solution. At a certain radius from the center of the mass, the Schwarzschild solution became singular, meaning that some of the terms in the Einstein equations became infinite. The nature of this radius, which later became known as the Schwarzschild radius, was not understood at the time. Many physicists of the early 20th century were skeptical of the existence of black holes. In a 1926 popular science book, Arthur Eddington critiqued the idea of a star with mass compressed to its Schwarzschild radius as a flaw in the then-poorly-understood theory of general relativity.: 134 In 1939, Einstein himself used his theory of general relativity in an attempt to prove that black holes were impossible. His work relied on increasing pressure or increasing centrifugal force balancing the force of gravity so that the object would not collapse beyond its Schwarzschild radius; he missed the possibility that implosion would drive the system below this critical value.: 135 By the 1920s, astronomers had classified a number of white dwarf stars as too cool and dense to be explained by the gradual cooling of ordinary stars. 
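Michell's claim can be reproduced with Newtonian mechanics alone: at fixed mean density the mass grows as the cube of the radius, so the escape velocity $v=\sqrt{2GM/R}$ grows linearly with the radius. A minimal sketch in Python follows; the constants and helper function are illustrative, not part of the original text.

import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg
R_sun = 6.957e8    # solar radius, m

def escape_velocity(mass, radius):
    # Newtonian escape velocity v = sqrt(2 G M / R)
    return math.sqrt(2 * G * mass / radius)

factor = 500
M_star = factor**3 * M_sun   # same mean density as the Sun, so M scales as R^3
R_star = factor * R_sun
print(escape_velocity(M_star, R_star) / c)   # ~1.03: just above the speed of light

The ratio comes out just above 1, matching Michell's figure: on this Newtonian reasoning, a star of solar density but 500 times the Sun's radius would not let light escape.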
In 1926, Ralph Fowler showed that quantum-mechanical degeneracy pressure was larger than thermal pressure at white-dwarf densities.: 145 In 1931, Subrahmanyan Chandrasekhar calculated that a non-rotating body of electron-degenerate matter below a certain limiting mass is stable, and by 1934 he showed that this explained the catalog of white dwarf stars.: 151 When Chandrasekhar announced his results, Eddington pointed out that stars above this limit would radiate until they were sufficiently dense to prevent light from exiting, a conclusion he considered absurd. Eddington and, later, Lev Landau argued that some yet unknown mechanism would stop the collapse. In the 1930s, Fritz Zwicky and Walter Baade studied stellar novae, focusing on exceptionally bright ones they called supernovae. Zwicky promoted the idea that supernovae produced stars with the density of atomic nuclei (neutron stars), but this idea was largely ignored.: 171 In 1939, based on Chandrasekhar's reasoning, J. Robert Oppenheimer and George Volkoff predicted that neutron stars below a certain mass limit, later called the Tolman–Oppenheimer–Volkoff limit, would be stable due to neutron degeneracy pressure. Above that limit, they reasoned that either their model would not apply or that gravitational contraction would not stop.: 380 John Archibald Wheeler and two of his students resolved questions about the model behind the Tolman–Oppenheimer–Volkoff (TOV) limit. Harrison and Wheeler developed the equations of state relating density to pressure for cold matter all the way through electron degeneracy and neutron degeneracy. Masami Wakano and Wheeler then used the equations to compute the equilibrium curve for stars, relating mass to circumference. They found no additional features that would invalidate the TOV limit. This meant that the only thing that could prevent black holes from forming was a dynamic process ejecting sufficient mass from a star as it cooled.: 205 The modern concept of black holes was formulated by Robert Oppenheimer and his student Hartland Snyder in 1939.: 80 In the paper, Oppenheimer and Snyder solved Einstein's equations of general relativity for an idealized imploding star, in a model later called the Oppenheimer–Snyder model, then described the results from far outside the star. The implosion starts as one might expect: the star material rapidly collapses inward. However, as the density of the star increases, gravitational time dilation increases and the collapse, viewed from afar, seems to slow down further and further until the star reaches its Schwarzschild radius, where it appears frozen in time.: 217 In 1958, David Finkelstein identified the Schwarzschild surface as an event horizon, calling it "a perfect unidirectional membrane: causal influences can cross it in only one direction". In this sense, events that occur inside of the black hole cannot affect events that occur outside of the black hole. Finkelstein created a new reference frame to include the point of view of infalling observers.: 103 Finkelstein's new frame of reference allowed events at the surface of an imploding star to be related to events far away. By 1962 the two points of view were reconciled, convincing many skeptics that implosion into a black hole made physical sense.: 226 The era from the mid-1960s to the mid-1970s was the "golden age of black hole research", when general relativity and black holes became mainstream subjects of research.: 258 In this period, more general black hole solutions were found. 
In 1963, Roy Kerr found the exact solution for a rotating black hole. Two years later, Ezra Newman found the axially symmetric solution for a black hole that is both rotating and electrically charged. In 1967, Werner Israel showed that the Schwarzschild solution was the only possible solution for a nonspinning, uncharged black hole, meaning that a Schwarzschild black hole is defined by its mass alone. Similar uniqueness results were later found for Reissner–Nordström and Kerr black holes, defined only by their mass and their charge or spin respectively. Together, these findings became known as the no-hair theorem, which states that a stationary black hole is completely described by the three parameters of the Kerr–Newman metric: mass, angular momentum, and electric charge. At first, it was suspected that the strange mathematical singularities found in each of the black hole solutions only appeared due to the assumption that a black hole would be perfectly spherically symmetric, and that the singularities would therefore not appear in generic situations where black holes are not exactly symmetric. This view was held in particular by Vladimir Belinski, Isaak Khalatnikov, and Evgeny Lifshitz, who tried to prove that no singularities appear in generic solutions, although they would later reverse their positions. However, in 1965, Roger Penrose proved that general relativity without quantum mechanics requires that singularities appear in all black holes. Astronomical observations also made great strides during this era. In 1967, Antony Hewish and Jocelyn Bell Burnell discovered pulsars, and by 1969 these were shown to be rapidly rotating neutron stars. Until that time, neutron stars, like black holes, were regarded as just theoretical curiosities, but the discovery of pulsars showed their physical relevance and spurred a further interest in all types of compact objects that might be formed by gravitational collapse. Based on observations in Greenwich and Toronto in the early 1970s, Cygnus X-1, a galactic X-ray source discovered in 1964, became the first astronomical object commonly accepted to be a black hole. Work by James Bardeen, Jacob Bekenstein, Carter, and Hawking in the early 1970s led to the formulation of black hole thermodynamics. These laws describe the behaviour of a black hole in close analogy to the laws of thermodynamics by relating mass to energy, area to entropy, and surface gravity to temperature. The analogy was completed: 442 when Hawking, in 1974, showed that quantum field theory implies that black holes should radiate like a black body with a temperature proportional to the surface gravity of the black hole, predicting the effect now known as Hawking radiation. While Cygnus X-1, a stellar-mass black hole, was generally accepted by the scientific community as a black hole by the end of 1973, it would be decades before a supermassive black hole would gain the same broad recognition. Although physicists such as Donald Lynden-Bell and Martin Rees had suggested as early as the 1960s that powerful quasars in the centers of galaxies were powered by accreting supermassive black holes, little observational evidence existed at the time. However, the Hubble Space Telescope, launched decades later, found that supermassive black holes were not only present in these active galactic nuclei, but were ubiquitous in the centers of galaxies: almost every galaxy had a supermassive black hole at its center, many of which were quiescent. 
In 1999, David Merritt proposed the M–sigma relation, which relates the velocity dispersion of matter in the central bulge of a galaxy to the mass of the supermassive black hole at its core. Subsequent studies confirmed this correlation. Around the same time, based on telescope observations of the velocities of stars at the center of the Milky Way galaxy, independent groups led by Andrea Ghez and Reinhard Genzel concluded that the compact radio source in the center of the galaxy, Sagittarius A*, was likely a supermassive black hole. On 11 February 2016, the LIGO Scientific Collaboration and Virgo Collaboration announced the first direct detection of gravitational waves, named GW150914, representing the first observation of a black hole merger. At the time of the merger, the black holes were approximately 1.4 billion light-years away from Earth and had masses of 30 and 35 solar masses.: 6 In 2017, Rainer Weiss, Kip Thorne, and Barry Barish, who had spearheaded the project, were awarded the Nobel Prize in Physics for their work. Since the initial discovery in 2015, hundreds more gravitational waves have been observed by LIGO and another interferometer, Virgo. On 10 April 2019, the first direct image of a black hole and its vicinity was published, following observations made by the Event Horizon Telescope (EHT) in 2017 of the supermassive black hole in Messier 87's galactic centre. In 2022, the Event Horizon Telescope collaboration released an image of the black hole in the center of the Milky Way galaxy, Sagittarius A*; the data had been collected in 2017. In 2020, the Nobel Prize in Physics was awarded for work on black holes: Andrea Ghez and Reinhard Genzel shared one-half for their discovery that Sagittarius A* is a supermassive black hole, and Penrose received the other half for his work showing that the mathematics of general relativity requires the formation of black holes. Cosmologists lamented that Hawking's extensive theoretical work on black holes could not be honored, since he died in 2018. In December 1967, a student reportedly suggested the phrase "black hole" at a lecture by John Wheeler; Wheeler adopted the term for its brevity and "advertising value", and Wheeler's stature in the field ensured it quickly caught on, leading some to credit Wheeler with coining the phrase. However, the term was used by others around that time. Science writer Marcia Bartusiak traces the term to physicist Robert H. Dicke, who in the early 1960s reportedly compared the phenomenon to the Black Hole of Calcutta, notorious as a prison where people entered but never left alive. The term was used in print by Life and Science News magazines in 1963, and by science journalist Ann Ewing in her article "'Black Holes' in Space", dated 18 January 1964, a report on a meeting of the American Association for the Advancement of Science held in Cleveland, Ohio. Definition A black hole is generally defined as a region of spacetime from which no information-carrying signals or objects can escape. However, verifying that an object is a black hole by this definition would require waiting an infinite time, at an infinite distance from the black hole, to confirm that nothing ever escapes; the definition therefore cannot be used to identify a physical black hole. More broadly, physicists do not have a precisely agreed-upon definition of a black hole. Among astrophysicists, a black hole is commonly taken to be a compact object with a mass larger than about four solar masses. 
A black hole may also be defined as a reservoir of information: 142 or a region where space is falling inwards faster than the speed of light. Properties The no-hair theorem postulates that, once it achieves a stable condition after formation, a black hole has only three independent physical properties: mass, electric charge, and angular momentum; the black hole is otherwise featureless. If the conjecture is true, any two black holes that share the same values for these properties, or parameters, are indistinguishable from one another. The degree to which the conjecture is true for real black holes is currently an unsolved problem. The simplest static black holes have mass but neither electric charge nor angular momentum. According to Birkhoff's theorem, these Schwarzschild black holes are the only vacuum solution that is spherically symmetric. Solutions describing more general black holes also exist. Non-rotating charged black holes are described by the Reissner–Nordström metric, while the Kerr metric describes a non-charged rotating black hole. The most general stationary black hole solution known is the Kerr–Newman metric, which describes a black hole with both charge and angular momentum. Contrary to the popular notion of a black hole "sucking in everything" in its surroundings, from far away the external gravitational field of a black hole is identical to that of any other body of the same mass. While a black hole can theoretically have any positive mass, the charge and angular momentum are constrained by the mass. The total electric charge $Q$ and the total angular momentum $J$ are expected to satisfy the inequality $\frac{Q^{2}}{4\pi \epsilon_{0}} + \frac{c^{2}J^{2}}{GM^{2}} \leq GM^{2}$ for a black hole of mass $M$. Black holes with the maximum possible charge or spin satisfying this inequality are called extremal black holes. Solutions of Einstein's equations that violate this inequality exist, but they do not possess an event horizon. These are so-called naked singularities that can be observed from the outside. Because these singularities would make the universe inherently unpredictable, many physicists believe they cannot exist. The weak cosmic censorship hypothesis, proposed by Roger Penrose, rules out the formation of such singularities when they would be created through the gravitational collapse of realistic matter. However, this hypothesis has not been proven, and some physicists believe that naked singularities could exist. It is also unknown whether black holes could even become extremal, forming naked singularities, since natural processes counteract increasing spin and charge as a black hole approaches extremality. The total mass of a black hole can be estimated by analyzing the motion of objects near the black hole, such as stars or gas. All black holes spin, often rapidly; the stellar black hole GRS 1915+105, for example, has been estimated to spin at over 1,000 revolutions per second. The Milky Way's central black hole, Sagittarius A*, rotates at about 90% of the maximum rate. The spin rate can be inferred from measurements of atomic spectral lines in the X-ray range. As gas near the black hole plunges inward, high-energy X-ray emission from electron-positron pairs illuminates the gas further out, appearing red-shifted due to relativistic effects. 
Depending on the spin of the black hole, this plunge happens at different radii from the hole, with different degrees of redshift. Astronomers can use the gap between the X-ray emission of the outer disk and the redshifted emission from plunging material to determine the spin of the black hole. A newer way to estimate spin is based on the temperature of gases accreting onto the black hole. The method requires an independent measurement of the black hole mass and of the inclination angle of the accretion disk, followed by computer modeling. Gravitational waves from coalescing binary black holes can also provide the spin of both progenitor black holes and the merged hole, but such events are rare. A spinning black hole has angular momentum. The supermassive black hole in the center of the Messier 87 (M87) galaxy appears to have an angular momentum very close to the maximum theoretical value. That uncharged limit is $J \leq \frac{GM^{2}}{c}$, allowing definition of a dimensionless spin magnitude such that $0 \leq \frac{cJ}{GM^{2}} \leq 1$. Most black holes are believed to have an approximately neutral charge. For example, Michal Zajaček, Arman Tursunov, Andreas Eckart, and Silke Britzen found the electric charge of Sagittarius A* to be at least ten orders of magnitude below the theoretical maximum. A charged black hole repels other like charges just as any other charged object does. If a black hole were to become charged, particles with the opposite sign of charge would be pulled in by the extra electromagnetic force, while particles with the same sign of charge would be repelled, neutralizing the black hole. This effect may be weaker if the black hole is also spinning. The presence of charge can reduce the diameter of the black hole by up to 38%. The charge $Q$ of a nonspinning black hole is bounded by $Q \leq \sqrt{G}\,M$, where $G$ is the gravitational constant and $M$ is the black hole's mass. Classification Black holes can have a wide range of masses. The minimum mass of a black hole formed by stellar gravitational collapse is governed by the maximum mass of a neutron star and is believed to be approximately two to four solar masses. However, theoretical primordial black holes, believed to have formed soon after the Big Bang, could be far smaller, with masses as little as 10⁻⁵ grams at formation. These very small black holes are sometimes called micro black holes. Black holes formed by stellar collapse are called stellar black holes. Estimates of their maximum mass at formation vary, but generally range from 10 to 100 solar masses, with higher estimates for black holes produced by low-metallicity progenitor stars. The mass of a black hole formed via a supernova has a lower bound: if the progenitor star is too small, the collapse may be stopped by the degeneracy pressure of the star's constituents, allowing the condensation of matter into an exotic denser state. Degeneracy pressure arises from the Pauli exclusion principle, by which identical particles resist occupying the same state. Smaller progenitor stars, with masses less than about 8 M☉, will be held together by the degeneracy pressure of electrons and will become white dwarfs. For more massive progenitor stars, electron degeneracy pressure is no longer strong enough to resist the force of gravity, and the star will instead be held together by neutron degeneracy pressure, which can occur at much higher densities, forming a neutron star. 
If the star is still too massive, even neutron degeneracy pressure will not be able to resist the force of gravity and the star will collapse into a black hole.: 5.8 Stellar black holes can also gain mass via accretion of nearby matter, often from a companion object such as a star. Black holes that are larger than stellar black holes but smaller than supermassive black holes are called intermediate-mass black holes, with masses of approximately 10² to 10⁵ solar masses. These black holes seem to be rarer than their stellar and supermassive counterparts, with relatively few candidates having been observed. Physicists have speculated that such black holes may form from collisions in globular and star clusters or at the centers of low-mass galaxies. They may also form as the result of mergers of smaller black holes, with several LIGO observations finding merged black holes in the 110–350 solar mass range. The black holes with the largest masses are called supermassive black holes, with masses more than 10⁶ times that of the Sun. These black holes are believed to exist at the centers of almost every large galaxy, including the Milky Way. Some scientists have proposed a subcategory of even larger black holes, called ultramassive black holes, with masses greater than 10⁹–10¹⁰ solar masses. Theoretical models predict that the accretion disc that feeds a black hole becomes unstable once the hole reaches 50–100 billion times the mass of the Sun, setting a rough upper limit to black hole mass. Structure While black holes are conceptually invisible sinks of all matter and light, in astronomical settings their enormous gravity alters the motion of surrounding objects and pulls nearby gas inwards at near-light speed, making the regions around some black holes among the brightest objects in the universe. Some black holes have relativistic jets: thin streams of plasma travelling away from the black hole at more than one-tenth of the speed of light. A small fraction of the matter falling towards the black hole gets accelerated away along the hole's rotation axis. These jets can extend as far as millions of parsecs from the black hole itself. Black holes of any mass can have jets; however, they are typically observed around spinning black holes with strongly magnetized accretion disks. Relativistic jets were more common in the early universe, when galaxies and their corresponding supermassive black holes were rapidly gaining mass. All black holes with jets also have an accretion disk, but the jets are usually brighter than the disk. Quasars, typically found in other galaxies, are believed to be supermassive black holes with jets; microquasars are believed to be stellar-mass objects with jets, typically observed in the Milky Way. The mechanism by which jets form is not yet known, but several options have been proposed. One proposed way to fuel these jets is the Blandford–Znajek process, in which the dragging of magnetic field lines by a black hole's rotation could launch jets of matter into space. The Penrose process, which involves extraction of a black hole's rotational energy, has also been proposed as a potential mechanism of jet propulsion. 
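The spin and charge bounds quoted earlier lend themselves to a direct numerical check. Below is a minimal Python sketch, working in SI units, that evaluates the dimensionless spin $cJ/(GM^{2})$ and tests the combined extremality inequality; the constants and function names are illustrative assumptions, not from the article.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
k_e = 8.988e9      # Coulomb constant 1/(4 pi eps0), N m^2 C^-2
M_sun = 1.989e30   # solar mass, kg

def spin_parameter(M, J):
    # Dimensionless spin a* = c J / (G M^2); 0 <= a* <= 1 for a black hole
    return c * J / (G * M**2)

def is_subextremal(M, J=0.0, Q=0.0):
    # Combined bound: Q^2/(4 pi eps0) + c^2 J^2 / (G M^2) <= G M^2
    return k_e * Q**2 + (c * J)**2 / (G * M**2) <= G * M**2

# Sagittarius A* at ~4.3e6 solar masses, spinning at ~90% of the maximum rate:
M = 4.3e6 * M_sun
J = 0.9 * G * M**2 / c          # angular momentum giving a* = 0.9
print(spin_parameter(M, J))     # 0.9
print(is_subextremal(M, J))     # True: below the extremal bound

With no charge, the test reduces to the dimensionless spin condition $a^{*} \leq 1$ defined above.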
Due to conservation of angular momentum, gas falling into the gravitational well created by a massive object will typically form a disk-like structure around the object.: 242 As the disk's angular momentum is transferred outward by internal processes, its matter falls farther inward, converting gravitational energy into heat and releasing a large flux of X-rays. The temperature of these disks can range from thousands to millions of kelvins, and temperatures can differ throughout a single accretion disk. Accretion disks can also emit in other parts of the electromagnetic spectrum, depending on the disk's turbulence and magnetization and the black hole's mass and angular momentum. Accretion disks can be classified as geometrically thin or geometrically thick. Geometrically thin disks are mostly confined to the black hole's equatorial plane and have a well-defined edge at the innermost stable circular orbit (ISCO), while geometrically thick disks are supported by internal pressure and temperature and can extend inside the ISCO. Disks with high rates of electron scattering and absorption, which appear bright and opaque, are called optically thick; optically thin disks are more translucent and produce fainter images when viewed from afar. Accretion disks of black holes accreting beyond the Eddington limit are often referred to as Polish doughnuts, due to their thick, toroidal shape resembling that of a doughnut. Quasar accretion disks are expected to usually appear blue in color. The disk of a stellar black hole, on the other hand, would likely look orange, yellow, or red, with its inner regions the brightest. Theoretical research suggests that the hotter a disk is, the bluer it should be, although this is not always supported by observations of real astronomical objects. Accretion disk colors may also be altered by the Doppler effect, with the part of the disk travelling towards an observer appearing bluer and brighter and the part travelling away appearing redder and dimmer. In Newtonian gravity, test particles can stably orbit at arbitrary distances from a central object. In general relativity, however, there exists a smallest possible radius at which a massive particle can orbit stably. Any infinitesimal inward perturbation to this orbit will lead to the particle spiraling into the black hole, and any outward perturbation will, depending on the energy, cause the particle to spiral in, move to a stable orbit further from the black hole, or escape to infinity. This orbit is called the innermost stable circular orbit, or ISCO. The location of the ISCO depends on the spin of the black hole and the spin of the particle itself. In the case of a Schwarzschild black hole (spin zero) and a particle without spin, the location of the ISCO is $r_{\mathrm{ISCO}} = 3\,r_{\mathrm{s}} = \frac{6\,GM}{c^{2}}$, where $r_{\mathrm{s}}$ is the Schwarzschild radius of the black hole, $G$ is the gravitational constant, and $c$ is the speed of light. The radius of this orbit changes slightly with particle spin. For charged black holes, the ISCO moves inwards. For spinning black holes, the ISCO is moved inwards for particles orbiting in the same direction that the black hole is spinning (prograde) and outwards for particles orbiting in the opposite direction (retrograde). 
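The Schwarzschild ISCO formula above is straightforward to evaluate. A minimal sketch in Python; the constants and helper name are illustrative, not from the article.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg

def r_isco_schwarzschild(M):
    # ISCO for a non-spinning, uncharged black hole: r_ISCO = 3 r_s = 6 G M / c^2
    return 6 * G * M / c**2

# A 10-solar-mass black hole has its ISCO at roughly 89 km:
print(r_isco_schwarzschild(10 * M_sun) / 1e3)   # ~88.6 (km)

For a spinning black hole the ISCO instead depends on the spin parameter and on the direction of the orbit, as described next.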
For example, the ISCO for a particle orbiting retrograde can be as far out as about $4.5\,r_{\mathrm{s}}$ (that is, $9\,GM/c^{2}$), while the ISCO for a particle orbiting prograde can be as close as the event horizon itself. The photon sphere is a spherical boundary at which photons moving on tangents to the sphere are bent completely around the black hole, possibly orbiting multiple times. Light rays with impact parameters less than the radius of the photon sphere enter the black hole. For Schwarzschild black holes, the photon sphere has a radius 1.5 times the Schwarzschild radius; for non-Schwarzschild black holes, the radius is at least 1.5 times the radius of the event horizon. When viewed from a great distance, the photon sphere creates an observable black hole shadow. Since no light emerges from within the black hole, this shadow is the limit for possible observations.: 152 The shadow of colliding black holes should have characteristic warped shapes, allowing scientists to detect black holes that are about to merge. While light can still escape from the photon sphere, any light that crosses the photon sphere on an inbound trajectory will be captured by the black hole. Therefore, any light that reaches an outside observer from the photon sphere must have been emitted by objects between the photon sphere and the event horizon. Light emitted towards the photon sphere may also curve around the black hole and return to the emitter. For a rotating, uncharged black hole, the radius of the photon sphere depends on the spin parameter and on whether the photon is orbiting prograde or retrograde. For a photon orbiting prograde, the photon sphere lies between 1 and 3 Schwarzschild radii from the center of the black hole, while for a photon orbiting retrograde it lies between 3 and 5 Schwarzschild radii; the exact location depends on the magnitude of the black hole's rotation. For a charged, nonrotating black hole, there is only one photon sphere, whose radius decreases with increasing black hole charge. For non-extremal, charged, rotating black holes, there are always two photon spheres, with the exact radii depending on the parameters of the black hole. Near a rotating black hole, spacetime rotates like a vortex. The rotating spacetime will drag any matter and light into rotation around the spinning black hole. This effect of general relativity, called frame dragging, gets stronger closer to the spinning mass. The region of spacetime in which it is impossible to stay still is called the ergosphere. The ergosphere of a black hole is a volume bounded by the black hole's event horizon and the ergosurface, which coincides with the event horizon at the poles but bulges out from it around the equator. Matter and radiation can escape from the ergosphere. Through the Penrose process, objects can emerge from the ergosphere with more energy than they entered with. The extra energy is taken from the rotational energy of the black hole, slowing down its rotation.: 268 A variation of the Penrose process in the presence of strong magnetic fields, the Blandford–Znajek process, is considered a likely mechanism for the enormous luminosity and relativistic jets of quasars and other active galactic nuclei. The observable region of spacetime around a black hole closest to its event horizon is called the plunging region. 
In this area it is no longer possible for free-falling matter to follow circular orbits or to stop its final descent into the black hole. Instead it will rapidly plunge toward the black hole at close to the speed of light, growing increasingly hot and producing a characteristic, detectable thermal emission. However, light and radiation emitted from this region can still escape from the black hole's gravitational pull. For a nonspinning, uncharged black hole, the radius of the event horizon, or Schwarzschild radius, is proportional to the mass $M$ through $r_{\mathrm{s}} = \frac{2GM}{c^{2}} \approx 2.95\,\frac{M}{M_{\odot}}~\mathrm{km}$, where $M_{\odot}$ is the mass of the Sun.: 124 For a black hole with nonzero spin or electric charge, the radius is smaller, until an extremal black hole could have an event horizon close to $r_{+} = \frac{GM}{c^{2}}$, half the radius of a nonspinning, uncharged black hole of the same mass. Since the volume within the Schwarzschild radius increases with the cube of the radius, the average density of a black hole inside its Schwarzschild radius is inversely proportional to the square of its mass: supermassive black holes are much less dense than stellar black holes. The average density of a 10⁸ M☉ black hole is comparable to that of water. The defining feature of a black hole is the existence of an event horizon, a boundary in spacetime through which matter and light can pass only inward, towards the center of the black hole. Nothing, not even light, can escape from inside the event horizon. The event horizon is referred to as such because if an event occurs within the boundary, information from that event cannot reach or affect an outside observer, making it impossible to determine whether such an event occurred.: 179 For non-rotating black holes, the geometry of the event horizon is precisely spherical, while for rotating black holes the event horizon is oblate. To a distant observer, a clock near a black hole would appear to tick more slowly than one further from the black hole.: 217 This effect, known as gravitational time dilation, would also cause an object falling into a black hole to appear to slow as it approached the event horizon, never quite reaching the horizon from the perspective of an outside observer.: 218 All processes on this object would appear to slow down, and any light emitted by the object would appear redder and dimmer, an effect known as gravitational redshift. An object falling from half a Schwarzschild radius above the event horizon would fade away, disappearing from view within one hundredth of a second. It would also appear to flatten onto the black hole, joining all other material that had ever fallen into the hole. On the other hand, an observer falling into a black hole would not notice any of these effects as they cross the event horizon. Their own clock appears to them to tick normally, and they cross the event horizon after a finite time without noting any singular behaviour. In general relativity, it is impossible to determine the location of the event horizon from local observations, due to Einstein's equivalence principle.: 222 Black holes that are rotating and/or charged have an inner horizon, often called the Cauchy horizon, inside the black hole. The inner horizon is divided into two segments: an ingoing section and an outgoing section. 
At the ingoing section of the Cauchy horizon, radiation and matter that fall into the black hole would build up at the horizon, causing the curvature of spacetime there to grow without bound. This would cause an observer falling in to experience tidal forces. The phenomenon is often called mass inflation, since it is associated with an exponentially growing parameter describing the black hole's internal mass, and the buildup of tidal forces is called the mass-inflation singularity or Cauchy horizon singularity. Some physicists have argued that in realistic black holes, accretion and Hawking radiation would stop mass inflation from occurring. At the outgoing section of the inner horizon, infalling radiation would backscatter off the black hole's spacetime curvature and travel outward, building up at the outgoing Cauchy horizon. This would cause an infalling observer to experience a gravitational shock wave and tidal forces as the spacetime curvature at the horizon grew to infinity. This buildup of tidal forces is called the shock singularity. Both of these singularities are weak, meaning that an object crossing them would only be deformed a finite amount by tidal forces, even though the spacetime curvature would still be infinite at the singularity. This contrasts with a strong singularity, where an object hitting the singularity would be stretched and squeezed by an infinite amount. They are also null singularities, meaning that a photon could travel parallel to them without ever being intercepted. Ignoring quantum effects, every black hole contains a singularity: points where the curvature of spacetime becomes infinite and geodesics terminate within a finite proper time.: 205 For a non-rotating black hole, this region takes the shape of a single point; for a rotating black hole it is smeared out to form a ring singularity that lies in the plane of rotation.: 264 In both cases, the singular region has zero volume. All of the mass of the black hole ends up in the singularity.: 252 Since the singularity has nonzero mass in an infinitely small space, it can be thought of as having infinite density. Observers falling into a Schwarzschild black hole (i.e., non-rotating and not charged) cannot avoid being carried into the singularity once they cross the event horizon. As they fall further into the black hole, they will be torn apart by the growing tidal forces in a process sometimes referred to as spaghettification or the noodle effect. Eventually, they will reach the singularity and be crushed into an infinitely small point.: 182 However, any perturbations, such as those caused by matter or radiation falling in, would cause space to oscillate chaotically near the singularity. Any matter falling in would experience intense tidal forces rapidly changing in direction, all while being compressed into an increasingly small volume. Alternative formulations of general relativity, including some that incorporate quantum effects, can lead to regular, or nonsingular, black holes without singularities. For example, the fuzzball model, based on string theory, posits that black holes are actually made up of quantum microstates and need not have a singularity or an event horizon. The theory of loop quantum gravity proposes that the curvature and density at the center of a black hole are large, but not infinite. Formation Black holes are formed by gravitational collapse of massive stars, either by direct collapse or during a supernova explosion in a process called fallback.
Black holes can result from the merger of two neutron stars or of a neutron star and a black hole. Other, more speculative mechanisms include primordial black holes created from density fluctuations in the early universe, the collapse of dark stars (hypothetical objects powered by the annihilation of dark matter), or the collapse of hypothetical self-interacting dark matter. Gravitational collapse occurs when an object's internal pressure is insufficient to resist the object's own gravity. At the end of a star's life, it will run out of hydrogen to fuse, and will start fusing progressively heavier elements, until it reaches iron. Since the fusion of elements heavier than iron would require more energy than it would release, nuclear fusion ceases. If the iron core of the star is too massive, the star will no longer be able to support itself and will undergo gravitational collapse. While most of the energy released during gravitational collapse is emitted very quickly, an outside observer does not actually see the end of this process. Even though the collapse takes a finite amount of time in the reference frame of infalling matter, a distant observer would see the infalling material slow and halt just above the event horizon, due to gravitational time dilation. Light from the collapsing material takes longer and longer to reach the observer, with the delay growing to infinity as the emitting material reaches the event horizon. Thus the external observer never sees the formation of the event horizon; instead, the collapsing material seems to become dimmer and increasingly red-shifted, eventually fading away. Observations of quasars at redshift $z \sim 7$, less than a billion years after the Big Bang, have led to investigations of other ways to form black holes. The accretion process that builds supermassive black holes has a limiting rate of mass accumulation, and a billion years is not enough time for a stellar-mass seed to reach quasar masses. One suggestion is the direct collapse of nearly pure hydrogen gas (low-metallicity) clouds characteristic of the young universe, forming a supermassive star that collapses into a black hole. It has been suggested that seed black holes with typical masses of ~10⁵ M☉ could have formed in this way and then grown to ~10⁹ M☉. However, the very large amount of gas required for direct collapse is typically unstable against fragmentation into multiple stars. Thus another approach suggests massive star formation followed by collisions that seed massive black holes, which ultimately merge to create a quasar.: 85 A neutron star in a common envelope with a regular star can accrete sufficient material to collapse to a black hole, or two neutron stars can merge. These avenues for the formation of black holes are considered relatively rare. In the current epoch of the universe, the conditions needed to form black holes are rare and are mostly found only in stars. However, in the early universe, conditions may have allowed for black hole formation via other means. Fluctuations of spacetime soon after the Big Bang may have formed regions that were denser than their surroundings. Initially, these regions would not have been compact enough to form black holes, but the curvature of spacetime in them could eventually have become large enough to cause them to collapse. Different models for the early universe vary widely in their predictions of the scale of these fluctuations.
Various models predict the creation of primordial black holes ranging from a Planck mass (~2.2×10⁻⁸ kg) to hundreds of thousands of solar masses. Primordial black holes with masses less than 10¹⁵ g would have evaporated by now due to Hawking radiation. Despite the early universe being extremely dense, it did not re-collapse into a black hole during the Big Bang, since the universe was expanding rapidly and did not have the gravitational differential necessary for black hole formation. Models for the gravitational collapse of objects of relatively constant size, such as stars, do not necessarily apply in the same way to rapidly expanding space such as the Big Bang. In principle, black holes could be formed in high-energy particle collisions that achieve sufficient density, although no such events have been detected. These hypothetical micro black holes, which could form from the collision of cosmic rays with Earth's atmosphere or in particle accelerators like the Large Hadron Collider, would not be able to aggregate additional mass. Instead, they would evaporate in about 10⁻²⁵ seconds, posing no threat to the Earth. Evolution Black holes can also merge with other objects such as stars or even other black holes. This is thought to have been important, especially in the early growth of supermassive black holes, which could have formed from the aggregation of many smaller objects. The process has also been proposed as the origin of some intermediate-mass black holes. Mergers of supermassive black holes may take a long time: as the members of a supermassive black hole binary approach each other, most nearby stars are ejected, leaving little for the black holes to interact with gravitationally and thus little means of shedding the orbital energy that keeps them apart. This phenomenon has been called the final parsec problem, as the distance at which it occurs is usually around one parsec. When a black hole accretes matter, the gas in the inner accretion disk orbits at very high speeds because of its proximity to the black hole. The resulting friction heats the inner disk to temperatures at which it emits vast amounts of electromagnetic radiation (mainly X-rays) detectable by telescopes. By the time the matter of the disk reaches the ISCO, between 5.7% and 42% of its mass will have been converted to energy, depending on the black hole's spin. About 90% of this energy is released within about 20 black hole radii. In many cases, accretion disks are accompanied by relativistic jets that are emitted along the black hole's poles, which carry away much of the energy. The mechanism for the creation of these jets is currently not well understood, in part due to insufficient data. Many of the universe's most energetic phenomena have been attributed to the accretion of matter onto black holes. Active galactic nuclei and quasars are believed to be the accretion disks of supermassive black holes. X-ray binaries are generally accepted to be binary systems in which one of the two objects is a compact object accreting matter from its companion. Ultraluminous X-ray sources may be the accretion disks of intermediate-mass black holes. At a certain rate of accretion, the outward radiation pressure becomes as strong as the inward gravitational force, and the black hole should be unable to accrete any faster. This limit is called the Eddington limit. However, many black holes accrete beyond this rate due to their non-spherical geometry or instabilities in the accretion disk.
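The Eddington limit can be estimated from first principles. Below is a minimal sketch, not taken from the article, of the classical Eddington luminosity for pure ionized hydrogen, $L_\mathrm{Edd} = 4\pi G M m_p c/\sigma_T$, a standard textbook expression evaluated with rounded constants:

```python
# Sketch: classical Eddington luminosity for pure ionized hydrogen,
# L_Edd = 4*pi*G*M*m_p*c / sigma_T (standard textbook expression).
import math

G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8           # speed of light, m/s
M_P = 1.673e-27       # proton mass, kg
SIGMA_T = 6.652e-29   # Thomson scattering cross-section, m^2
M_SUN = 1.989e30      # solar mass, kg
L_SUN = 3.828e26      # solar luminosity, W

def eddington_luminosity(mass_kg: float) -> float:
    """Luminosity at which radiation pressure balances gravity, in watts."""
    return 4 * math.pi * G * mass_kg * M_P * c / SIGMA_T

for solar_masses in (1, 1e8):
    L = eddington_luminosity(solar_masses * M_SUN)
    print(f"{solar_masses:>6g} M_sun: L_Edd = {L:.3g} W = {L / L_SUN:.3g} L_sun")

# Expected output (approximately):
#      1 M_sun: L_Edd = 1.26e+31 W = 3.29e+04 L_sun
#  1e+08 M_sun: L_Edd = 1.26e+39 W = 3.29e+12 L_sun
```

Dividing $L_\mathrm{Edd}$ by the radiative efficiency and by $c^{2}$ gives the corresponding limiting mass-accretion rate mentioned above.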
Accretion beyond the limit is called super-Eddington accretion and may have been commonplace in the early universe. Stars have been observed to be torn apart by tidal forces in the immediate vicinity of supermassive black holes in galaxy nuclei, in what is known as a tidal disruption event (TDE). Some of the material from the disrupted star forms an accretion disk around the black hole, which emits observable electromagnetic radiation. The correlation between the masses of supermassive black holes in the centres of galaxies and the velocity dispersion and mass of stars in their host bulges suggests that the formation of galaxies and the formation of their central black holes are related. Black hole winds from rapid accretion, particularly when the galaxy itself is still accreting matter, can compress nearby gas, accelerating star formation. However, if the winds become too strong, the black hole may blow nearly all of the gas out of the galaxy, quenching star formation. Black hole jets may also energize nearby cavities of plasma and eject low-entropy gas out of the galactic core, causing gas in galactic centres to be hotter than expected. If Hawking's theory of black hole radiation is correct, then black holes are expected to shrink and evaporate over time as they lose mass by the emission of photons and other particles. The temperature of this thermal spectrum (the Hawking temperature) is proportional to the surface gravity of the black hole, which is inversely proportional to the mass. Hence, large black holes emit less radiation than small black holes.: Ch. 9.6 A stellar black hole of 1 M☉ has a Hawking temperature of 62 nanokelvins. This is far less than the 2.7 K temperature of the cosmic microwave background radiation. Stellar-mass or larger black holes receive more mass from the cosmic microwave background than they emit through Hawking radiation and thus will grow instead of shrinking. To have a Hawking temperature larger than 2.7 K (and be able to evaporate), a black hole would need a mass less than that of the Moon. Such a black hole would have a diameter of less than a tenth of a millimetre. The Hawking radiation for an astrophysical black hole is predicted to be very weak and would thus be exceedingly difficult to detect from Earth. A possible exception is the burst of gamma rays emitted in the last stage of the evaporation of primordial black holes. Searches for such flashes have proven unsuccessful and provide stringent limits on the possible existence of low-mass primordial black holes, with modern research predicting that primordial black holes must make up less than a fraction of about 10⁻⁷ of the universe's total mass. NASA's Fermi Gamma-ray Space Telescope, launched in 2008, has searched for these flashes, but has not yet found any. The properties of a black hole are constrained and interrelated by the theories that predict them. When based on general relativity, these relationships are called the laws of black hole mechanics. For a black hole that is not still forming or accreting matter, the zeroth law of black hole mechanics states that the black hole's surface gravity is constant across the event horizon. The first law relates changes in the black hole's surface area, angular momentum, and charge to changes in its energy. The second law says the surface area of a black hole never decreases on its own. Finally, the third law says that the surface gravity of a black hole is never zero. These laws are mathematical analogs of the laws of thermodynamics.
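The Hawking-temperature figures quoted above follow from the standard expression $T_\mathrm{H} = \hbar c^{3}/(8\pi G M k_\mathrm{B})$. The sketch below, an illustration rather than part of the article, evaluates it for one solar mass and inverts it at the 2.7 K background temperature:

```python
# Sketch: Hawking temperature T_H = hbar*c^3 / (8*pi*G*M*k_B),
# and the mass at which T_H equals the 2.7 K microwave background.
import math

HBAR = 1.055e-34   # reduced Planck constant, J s
C = 2.998e8        # speed of light, m/s
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
K_B = 1.381e-23    # Boltzmann constant, J/K
M_SUN = 1.989e30   # solar mass, kg
M_MOON = 7.35e22   # lunar mass, kg

def hawking_temperature(mass_kg: float) -> float:
    """Black-body temperature of Hawking radiation, in kelvins."""
    return HBAR * C**3 / (8 * math.pi * G * mass_kg * K_B)

def mass_for_temperature(temp_k: float) -> float:
    """Invert T_H(M) to find the mass with the given Hawking temperature."""
    return HBAR * C**3 / (8 * math.pi * G * temp_k * K_B)

print(f"T_H(1 M_sun)  = {hawking_temperature(M_SUN) * 1e9:.1f} nK")   # ~62 nK
m_cmb = mass_for_temperature(2.7)
print(f"M(T_H=2.7 K) = {m_cmb:.3g} kg = {m_cmb / M_MOON:.2f} lunar masses")
```

Both results agree with the passage: about 62 nK for a solar-mass black hole, and an evaporation threshold somewhat below the mass of the Moon.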
The laws of black hole mechanics are not equivalent to the laws of thermodynamics, however, because, according to general relativity without quantum mechanics, a black hole can never emit radiation, and thus its temperature must always be zero.: 11 Quantum mechanics predicts that a black hole will continuously emit thermal Hawking radiation, and therefore must always have a nonzero temperature. It also predicts that all black holes have entropy that scales with their surface area. When quantum mechanics is accounted for, the laws of black hole mechanics become equivalent to the classical laws of thermodynamics. However, these conclusions are derived without a complete theory of quantum gravity, although many candidate theories do predict that black holes have entropy and temperature. Thus, the true quantum nature of black hole thermodynamics continues to be debated.: 29 Observational evidence Millions of black holes of around 30 solar masses, formed by stellar collapse, are expected to exist in the Milky Way. Even a dwarf galaxy like Draco should have hundreds. Only a few of these have been detected. By nature, black holes do not themselves emit any electromagnetic radiation other than the hypothetical Hawking radiation, so astrophysicists searching for black holes must generally rely on indirect observations. The defining characteristic of a black hole is its event horizon. The horizon itself cannot be imaged, so all other possible explanations for these indirect observations must be considered and eliminated before concluding that a black hole has been observed.: 11 The Event Horizon Telescope (EHT) is a global system of radio telescopes capable of directly observing a black hole shadow. The angular resolution of a telescope depends on its aperture and the wavelengths it observes. Because the angular diameters of Sagittarius A* and Messier 87* in the sky are very small, a single telescope would need to be about the size of the Earth to clearly distinguish their horizons using radio wavelengths. By combining data from several radio telescopes around the world, the Event Horizon Telescope creates an effective aperture with a diameter comparable to that of the Earth. The EHT team used imaging algorithms to compute the most probable image from the data in its observations of Sagittarius A* and M87*. Gravitational-wave interferometry can be used to detect merging black holes and other compact objects. In this method, a laser beam is split and sent down two long tunnel arms. The laser beams reflect off mirrors in the tunnels and converge at the intersection of the arms, cancelling each other out. However, when a gravitational wave passes, it warps spacetime, changing the lengths of the arms themselves. Since each laser beam then travels a slightly different distance, the beams no longer cancel out and produce a recognizable signal. Analysis of the signal can give scientists information about what caused the gravitational waves. Since gravitational waves are very weak, gravitational-wave observatories such as LIGO must have arms several kilometers long and carefully control for terrestrial noise to be able to detect them. Since the first detection in 2015, multiple gravitational waves from black holes have been detected and analyzed. The proper motions of stars near the centre of the Milky Way provide strong observational evidence that these stars are orbiting a supermassive black hole. Since 1995, astronomers have tracked the motions of 90 stars orbiting an invisible object coincident with the radio source Sagittarius A*.
In 1998, by fitting the motions of the stars to Keplerian orbits, astronomers were able to infer that a 2.6×10⁶ M☉ object must be contained within a radius of 0.02 light-years at the position of Sagittarius A*. Since then, one of the stars, called S2, has completed a full orbit. From the orbital data, astronomers were able to refine the calculated mass of Sagittarius A* to 4.3×10⁶ M☉, within a radius of less than 0.002 light-years. This upper limit on the radius is still larger than the Schwarzschild radius for the estimated mass, so the combination does not prove Sagittarius A* is a black hole. Nevertheless, these observations strongly suggest that the central object is a supermassive black hole, as there are no other plausible scenarios for confining so much invisible mass into such a small volume. Additionally, there is some observational evidence that this object might possess an event horizon, a feature unique to black holes. The Event Horizon Telescope image of Sagittarius A*, released in 2022, provided further confirmation that it is indeed a black hole. X-ray binaries are binary systems that emit a majority of their radiation in the X-ray part of the electromagnetic spectrum. These X-ray emissions result when a compact object accretes matter from an ordinary star. The presence of an ordinary star in such a system provides an opportunity to study the central object and to determine whether it might be a black hole. By measuring the orbital period of the binary, the distance to the binary from Earth, and the mass of the companion star, scientists can estimate the mass of the compact object. The Tolman–Oppenheimer–Volkoff (TOV) limit dictates the largest mass a nonrotating neutron star can have, estimated at about two solar masses. While a rotating neutron star can be slightly more massive, if the compact object is much more massive than the TOV limit, it cannot be a neutron star and is generally expected to be a black hole. The first strong candidate for a black hole, Cygnus X-1, was discovered in this way by Charles Thomas Bolton, Louise Webster, and Paul Murdin in 1972. Observations of the rotational broadening of the optical star, reported in 1986, led to a compact-object mass estimate of 16 solar masses, with 7 solar masses as the lower bound. In 2011, this estimate was updated to 14.1±1.0 M☉ for the black hole and 19.2±1.9 M☉ for the optical stellar companion. X-ray binaries can be categorized as either low-mass or high-mass; this classification is based on the mass of the companion star, not the compact object itself. In a class of X-ray binaries called soft X-ray transients, the companion star is of relatively low mass, allowing for more accurate estimates of the black hole mass. These systems actively emit X-rays for only several months once every 10–50 years. During the period of low X-ray emission, called quiescence, the accretion disk is extremely faint, allowing detailed observation of the companion star. Numerous black hole candidates have been measured by this method. Black holes are also sometimes found in binaries with other compact objects, such as white dwarfs, neutron stars, and other black holes. The centre of nearly every galaxy contains a supermassive black hole. The close observational correlation between the mass of this hole and the velocity dispersion of the host galaxy's bulge, known as the M–sigma relation, strongly suggests a connection between the formation of the black hole and that of the galaxy itself.
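The Keplerian fit described above amounts, in round numbers, to a one-line calculation: in solar units, Kepler's third law gives the enclosed mass as $M/M_\odot = a^{3}/T^{2}$, with the semi-major axis $a$ in AU and the period $T$ in years. The sketch below uses rough published values for S2 (period about 16 years, semi-major axis about 1000 AU); these are illustrative assumptions rather than figures from this article:

```python
# Sketch: enclosed mass from a stellar orbit via Kepler's third law.
# In solar units, M/M_sun = a^3 / T^2 with a in AU and T in years.
# The S2 values below are rough published estimates (period ~16 yr,
# semi-major axis ~1000 AU), not figures taken from this article.

def kepler_mass_solar(semi_major_axis_au: float, period_yr: float) -> float:
    """Central mass (in solar masses) implied by a Keplerian orbit."""
    return semi_major_axis_au**3 / period_yr**2

m = kepler_mass_solar(semi_major_axis_au=1000.0, period_yr=16.0)
print(f"Sgr A* mass = {m:.2e} M_sun")   # ~3.9e+06 M_sun
```

The result, roughly 4×10⁶ M☉, is consistent with the masses quoted above; refined modelling of S2's full orbit tightens the estimate to 4.3×10⁶ M☉.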
Astronomers use the term active galaxy to describe galaxies with unusual characteristics, such as peculiar spectral line emission and very strong radio emission. Theoretical and observational studies have shown that the high levels of activity in the centres of these galaxies, regions called active galactic nuclei (AGN), may be explained by accretion onto supermassive black holes. These AGN consist of a central black hole that may be millions or billions of times more massive than the Sun, a disk of interstellar gas and dust called an accretion disk, and two jets perpendicular to the accretion disk. Although supermassive black holes are expected to be found in most AGN, only some galaxies' nuclei have been studied carefully enough to both identify and measure the actual masses of the central supermassive black hole candidates. Some of the most notable galaxies with supermassive black hole candidates include the Andromeda Galaxy, Messier 32, Messier 87, the Sombrero Galaxy, and the Milky Way itself. Another way black holes can be detected is through observation of effects caused by their strong gravitational field. One such effect is gravitational lensing: the deformation of spacetime around a massive object causes light rays to be deflected, making objects behind it appear distorted. When the lensing object is a black hole, this effect can be strong enough to create multiple images of a star or other luminous source. However, the distance between the lensed images may be too small for contemporary telescopes to resolve; this phenomenon is called microlensing. Instead of seeing two images of a lensed star, astronomers see the star brighten slightly as the black hole moves towards the line of sight between the star and Earth, then return to its normal luminosity as the black hole moves away. The turn of the millennium saw the first three candidate detections of black holes in this way, and in January 2022, astronomers reported the first confirmed detection of a microlensing event from an isolated black hole. This was also the first determination of an isolated black hole's mass, 7.1±1.3 M☉. Alternatives While there is a strong case for supermassive black holes, the model for stellar-mass black holes assumes an upper limit on the mass of a neutron star: objects observed to have more mass are assumed to be black holes. However, the properties of extremely dense matter are poorly understood. New exotic phases of matter could allow other kinds of massive objects. Quark stars would be made up of quark matter and supported by quark degeneracy pressure, a form of degeneracy pressure even stronger than neutron degeneracy pressure. This would halt gravitational collapse at a higher mass than for a neutron star. Still more exotic objects called electroweak stars would convert quarks in their cores into leptons, providing additional pressure to stop the star from collapsing. If, as some extensions of the Standard Model posit, quarks and leptons are made up of even smaller fundamental particles called preons, a very compact star could be supported by preon degeneracy pressure.
While none of these hypothetical models can explain all of the observations of stellar black hole candidates, a Q star is the only alternative that could significantly exceed the mass limit for neutron stars and thus provide an alternative for supermassive black holes.: 12 A few theoretical objects have been conjectured to match observations of astronomical black hole candidates identically or near-identically, but which function via a different mechanism. A dark energy star would convert infalling matter into vacuum energy; this vacuum energy would be much larger than the vacuum energy of outside space, exerting outward pressure and preventing a singularity from forming. A black star would be gravitationally collapsing slowly enough that quantum effects would keep it just on the cusp of fully collapsing into a black hole. A gravastar would consist of a very thin shell and a dark-energy interior providing outward pressure to stop the collapse into a black hole or the formation of a singularity; it could even have another gravastar inside, called a 'nestar'. Open questions According to the no-hair theorem, a black hole is defined by only three parameters: its mass, charge, and angular momentum. This seems to mean that all other information about the matter that went into forming the black hole is lost, as there is no way to determine anything about the black hole from outside other than those three parameters. When black holes were thought to persist forever, this information loss was not problematic, as the information could be thought of as existing inside the black hole. However, black holes slowly evaporate by emitting Hawking radiation. This radiation does not appear to carry any additional information about the matter that formed the black hole, meaning that this information is seemingly gone forever. This is called the black hole information paradox. Theoretical studies analyzing the paradox have led to both further paradoxes and new ideas about the intersection of quantum mechanics and general relativity. While there is no consensus on the resolution of the paradox, work on the problem is expected to be important for a theory of quantum gravity.: 126 Observations of faraway galaxies have found that ultraluminous quasars, powered by supermassive black holes, existed in the early universe as far back as redshift $z \geq 7$. These black holes have been assumed to be the products of the gravitational collapse of large Population III stars. However, these stellar remnants were not massive enough to produce the quasars observed at early times without accreting beyond the Eddington limit, the theoretical maximum rate of black hole accretion. Physicists have suggested a variety of mechanisms by which these supermassive black holes may have formed. It has been proposed that smaller black holes may have undergone mergers to produce the observed supermassive black holes. It is also possible that they were seeded by direct-collapse black holes, in which a large cloud of hot gas, due to low angular momentum or heating from a nearby galaxy, avoids the fragmentation that would otherwise produce multiple stars. Given the right circumstances, a single supermassive star forms and collapses directly into a black hole without undergoing typical stellar evolution. Additionally, these supermassive black holes in the early universe may be high-mass primordial black holes, which could have accreted further matter in the centres of galaxies.
Finally, certain mechanisms allow black holes to grow faster than the theoretical Eddington limit, such as dense gas in the accretion disk trapping the outward radiation pressure that would otherwise limit accretion. However, the formation of bipolar jets may prevent super-Eddington rates. In fiction Black holes have been portrayed in science fiction in a variety of ways. Even before the advent of the term itself, objects with characteristics of black holes appeared in stories such as the 1928 novel The Skylark of Space with its "black Sun" and the "hole in space" in the 1935 short story Starship Invincible. As black holes grew in public recognition in the 1960s and 1970s, they began to be featured in films as well as novels, such as Disney's 1979 film The Black Hole. Black holes have also been used in works of the 21st century, such as Christopher Nolan's science fiction epic Interstellar. Authors and screenwriters have exploited the relativistic effects of black holes, particularly gravitational time dilation. For example, Interstellar features a planet near a black hole with a time dilation factor of over 60,000:1, while the 1977 novel Gateway depicts a spaceship that, from the perspective of an outside observer, approaches but never crosses the event horizon of a black hole due to time dilation. Black holes have also been appropriated as wormholes or other methods of faster-than-light travel, such as in the 1974 novel The Forever War, where a network of black holes is used for interstellar travel. Additionally, black holes can feature as hazards to spacefarers and planets: a black hole threatens a deep-space outpost in the 1978 short story The Black Hole Passes, and a binary black hole dangerously alters the orbit of a planet in the 2018 Netflix reboot of Lost in Space. Notes References Further reading External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/United_States#cite_note-SIPRI-20202-321] | [TOKENS: 17273] |
Contents United States The United States of America (USA), also known as the United States (U.S.) or America, is a country primarily located in North America. It is a federal republic of 50 states and a federal capital district, Washington, D.C. The 48 contiguous states border Canada to the north and Mexico to the south, with the semi-exclave of Alaska in the northwest and the archipelago of Hawaii in the Pacific Ocean. The United States also asserts sovereignty over five major island territories and various uninhabited islands in Oceania and the Caribbean.[j] It is a megadiverse country, with the world's third-largest land area[c] and third-largest population, exceeding 341 million.[k] Paleo-Indians first migrated from North Asia to North America at least 15,000 years ago, and formed various civilizations. Spanish colonization established Spanish Florida in 1513, the first European colony in what is now the continental United States. British colonization followed with the 1607 settlement of Virginia, the first of the Thirteen Colonies. Enslavement of Africans was practiced in all colonies by 1770 and supplied most of the labor for the Southern Colonies' plantation economy. Clashes with the British Crown began as a civil protest over the illegality of taxation without representation in Parliament and the denial of other English rights. They evolved into the American Revolution, which led to the Declaration of Independence and a society based on universal rights. Victory in the 1775–1783 Revolutionary War brought international recognition of U.S. sovereignty and fueled westward expansion, further dispossessing native inhabitants. As more states were admitted, a North–South division over slavery led the Confederate States of America to declare secession and fight the Union in the 1861–1865 American Civil War. With the United States' victory and reunification, slavery was abolished nationally. By the late 19th century, the U.S. economy outpaced the French, German and British economies combined. As of 1900, the country had established itself as a great power, a status solidified after its involvement in World War I. Following Japan's attack on Pearl Harbor in 1941, the U.S. entered World War II. Its aftermath left the U.S. and the Soviet Union as rival superpowers, competing for ideological dominance and international influence during the Cold War. The Soviet Union's collapse in 1991 ended the Cold War, leaving the U.S. as the world's sole superpower. The U.S. federal government is a representative democracy with a president and a constitution that grants separation of powers under three branches: legislative, executive, and judicial. The United States Congress is a bicameral national legislature composed of the House of Representatives (a lower house based on population) and the Senate (an upper house based on equal representation for each state). Federalism grants substantial autonomy to the 50 states. In addition, 574 Native American tribes have sovereignty rights, and there are 326 Native American reservations. Since the 1850s, the Democratic and Republican parties have dominated American politics. American ideals and values are based on a democratic tradition inspired by the American Enlightenment movement. A developed country, the U.S. ranks high in economic competitiveness, innovation, and higher education. Accounting for over a quarter of nominal global GDP, its economy has been the world's largest since about 1890. 
It is the wealthiest country, with the highest disposable household income per capita among OECD members, though its wealth inequality is highly pronounced. Shaped by centuries of immigration, the culture of the U.S. is diverse and globally influential. Making up more than a third of global military spending, the country has one of the strongest armed forces and is a designated nuclear state. A member of numerous international organizations, the U.S. plays a major role in global political, cultural, economic, and military affairs. Etymology Documented use of the phrase "United States of America" dates back to January 2, 1776. On that day, Stephen Moylan, a Continental Army aide to General George Washington, wrote a letter to Joseph Reed, Washington's aide-de-camp, seeking to go "with full and ample powers from the United States of America to Spain" to seek assistance in the Revolutionary War effort. The first known public usage is an anonymous essay published in the Williamsburg newspaper The Virginia Gazette on April 6, 1776. Sometime on or after June 11, 1776, Thomas Jefferson wrote "United States of America" in a rough draft of the Declaration of Independence, which was adopted by the Second Continental Congress on July 4, 1776. The term "United States" and its initialism "U.S.", used as nouns or as adjectives in English, are common short names for the country. The initialism "USA", a noun, is also common. "United States" and "U.S." are the established terms throughout the U.S. federal government, with prescribed rules.[l] "The States" is an established colloquial shortening of the name, used particularly from abroad; "stateside" is the corresponding adjective or adverb. "America" is the feminine form of the first word of Americus Vesputius, the Latinized name of Italian explorer Amerigo Vespucci (1454–1512);[m] it was first used as a place name by the German cartographers Martin Waldseemüller and Matthias Ringmann in 1507.[n] Vespucci first proposed that the West Indies discovered by Christopher Columbus in 1492 were part of a previously unknown landmass and not among the Indies at the eastern limit of Asia. In English, the term "America" usually does not refer to topics unrelated to the United States, despite the usage of "the Americas" to describe the totality of the continents of North and South America. History The first inhabitants of North America migrated from Siberia approximately 15,000 years ago, either across the Bering land bridge or along the now-submerged Ice Age coastline. Small isolated groups of hunter-gatherers are said to have migrated alongside herds of large herbivores far into Alaska, with ice-free corridors developing along the Pacific coast and valleys of North America in c. 16,500 – c. 13,500 BCE (c. 18,500 – c. 15,500 BP). The Clovis culture, which appeared around 11,000 BCE, is believed to be the first widespread culture in the Americas. Over time, Indigenous North American cultures grew increasingly sophisticated, and some, such as the Mississippian culture, developed agriculture, architecture, and complex societies. In the post-archaic period, the Mississippian cultures were located in the midwestern, eastern, and southern regions, and the Algonquian in the Great Lakes region and along the Eastern Seaboard, while the Hohokam culture and Ancestral Puebloans inhabited the Southwest. Native population estimates of what is now the United States before the arrival of European colonizers range from around 500,000 to nearly 10 million. 
Christopher Columbus began exploring the Caribbean for Spain in 1492, leading to Spanish-speaking settlements and missions from what are now Puerto Rico and Florida to New Mexico and California. The first Spanish colony in the present-day continental United States was Spanish Florida, chartered in 1513. After several settlements failed there due to starvation and disease, Spain's first permanent town, Saint Augustine, was founded in 1565. France established its own settlements in French Florida in 1562, but they were either abandoned (Charlesfort, 1578) or destroyed by Spanish raids (Fort Caroline, 1565). Permanent French settlements were founded much later along the Great Lakes (Fort Detroit, 1701), the Mississippi River (Saint Louis, 1764) and especially the Gulf of Mexico (New Orleans, 1718). Early European colonies also included the thriving Dutch colony of New Netherland (settled 1626, present-day New York) and the small Swedish colony of New Sweden (settled 1638 in what became Delaware). British colonization of the East Coast began with the Virginia Colony (1607) and the Plymouth Colony (Massachusetts, 1620). The Mayflower Compact in Massachusetts and the Fundamental Orders of Connecticut established precedents for local representative self-governance and constitutionalism that would develop throughout the American colonies. While European settlers in what is now the United States experienced conflicts with Native Americans, they also engaged in trade, exchanging European tools for food and animal pelts.[o] Relations ranged from close cooperation to warfare and massacres. The colonial authorities often pursued policies that forced Native Americans to adopt European lifestyles, including conversion to Christianity. Along the eastern seaboard, settlers trafficked Africans through the Atlantic slave trade, largely to provide manual labor on plantations. The original Thirteen Colonies[p] that would later found the United States were administered as possessions of the British Empire by Crown-appointed governors, though local governments held elections open to most white male property owners. The colonial population grew rapidly from Maine to Georgia, eclipsing Native American populations; by the 1770s, the natural increase of the population was such that only a small minority of Americans had been born overseas. The colonies' distance from Britain facilitated the entrenchment of self-governance, and the First Great Awakening, a series of Christian revivals, fueled colonial interest in guaranteed religious liberty. Following its victory in the French and Indian War, Britain began to assert greater control over local affairs in the Thirteen Colonies, resulting in growing political resistance. One of the primary grievances of the colonists was the denial of their rights as Englishmen, particularly the right to representation in the British government that taxed them. To demonstrate their dissatisfaction and resolve, the First Continental Congress met in 1774 and passed the Continental Association, a colonial boycott of British goods enforced by local "committees of safety" that proved effective. The subsequent British attempt to disarm the colonists resulted in the 1775 Battles of Lexington and Concord, igniting the American Revolutionary War. At the Second Continental Congress, the colonies appointed George Washington commander-in-chief of the Continental Army, and created a committee that named Thomas Jefferson to draft the Declaration of Independence.
Two days after the Second Continental Congress passed the Lee Resolution to create an independent, sovereign nation, the Declaration was adopted on July 4, 1776. The political values of the American Revolution evolved from an armed rebellion demanding reform within an empire to a revolution that created a new social and governing system founded on the defense of liberty and the protection of inalienable natural rights; sovereignty of the people; republicanism over monarchy, aristocracy, and other hereditary political power; civic virtue; and an intolerance of political corruption. The Founding Fathers of the United States, who included Washington, Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, James Madison, Thomas Paine, and many others, were inspired by Classical, Renaissance, and Enlightenment philosophies and ideas. Though in practical effect since its drafting in 1777, the Articles of Confederation was ratified in 1781 and formally established a decentralized government that operated until 1789. After the British surrender at the siege of Yorktown in 1781, American sovereignty was internationally recognized by the Treaty of Paris (1783), through which the U.S. gained territory stretching west to the Mississippi River, north to present-day Canada, and south to Spanish Florida. The Northwest Ordinance (1787) established the precedent by which the country's territory would expand with the admission of new states, rather than the expansion of existing states. The U.S. Constitution was drafted at the 1787 Constitutional Convention to overcome the limitations of the Articles. It went into effect in 1789, creating a federal republic governed by three separate branches that together formed a system of checks and balances. George Washington was elected the country's first president under the Constitution, and the Bill of Rights was adopted in 1791 to allay skeptics' concerns about the power of the more centralized government. Washington's resignation as commander-in-chief after the Revolutionary War and his later refusal to run for a third term as president established a precedent for the supremacy of civil authority in the United States and the peaceful transfer of power. In the late 18th century, American settlers began to expand westward in larger numbers, many with a sense of manifest destiny. The Louisiana Purchase of 1803 from France nearly doubled the territory of the United States. Lingering issues with Britain remained, leading to the War of 1812, which was fought to a draw. Spain ceded Florida and its Gulf Coast territory in 1819. The Missouri Compromise of 1820, which admitted Missouri as a slave state and Maine as a free state, attempted to balance the desire of northern states to prevent the expansion of slavery into new territories with that of southern states to extend it there. Primarily, the compromise prohibited slavery in all other lands of the Louisiana Purchase north of the 36°30′ parallel. As Americans expanded further into territory inhabited by Native Americans, the federal government implemented policies of Indian removal or assimilation. The most significant such legislation was the Indian Removal Act of 1830, a key policy of President Andrew Jackson. It resulted in the Trail of Tears (1830–1850), in which an estimated 60,000 Native Americans living east of the Mississippi River were forcibly removed and displaced to lands far to the west, causing 13,200 to 16,700 deaths along the forced march.
Settler expansion as well as this influx of Indigenous peoples from the East resulted in the American Indian Wars west of the Mississippi. During the colonial period, slavery became legal in all the Thirteen Colonies, but by 1770 it provided the main labor force in the large-scale, agriculture-dependent economies of the Southern Colonies from Maryland to Georgia. The practice began to be significantly questioned during the American Revolution, and spurred by an active abolitionist movement that had reemerged in the 1830s, states in the North enacted laws to prohibit slavery within their boundaries. At the same time, support for slavery had strengthened in Southern states, with widespread use of inventions such as the cotton gin (1793) having made slavery immensely profitable for Southern elites. The United States annexed the Republic of Texas in 1845, and the 1846 Oregon Treaty led to U.S. control of the present-day American Northwest. A dispute with Mexico over Texas led to the Mexican–American War (1846–1848). After the victory of the U.S., Mexico recognized U.S. sovereignty over Texas, New Mexico, and California in the 1848 Mexican Cession; the cession's lands also included the future states of Nevada, Colorado, and Utah. The California gold rush of 1848–1849 spurred a huge migration of white settlers to the Pacific coast, leading to even more confrontations with Native populations. One of the most violent, the California genocide of thousands of Native inhabitants, lasted into the mid-1870s. Additional western territories and states were created. Throughout the 1850s, the sectional conflict regarding slavery was further inflamed by national legislation in the U.S. Congress and decisions of the Supreme Court. In Congress, the Fugitive Slave Act of 1850 mandated the forcible return to their owners in the South of slaves taking refuge in non-slave states, while the Kansas–Nebraska Act of 1854 effectively gutted the anti-slavery requirements of the Missouri Compromise. In its Dred Scott decision of 1857, the Supreme Court ruled against a slave brought into non-slave territory, simultaneously declaring the entire Missouri Compromise to be unconstitutional. These and other events exacerbated tensions between North and South that would culminate in the American Civil War (1861–1865). Beginning with South Carolina, 11 slave-state governments voted to secede from the United States in 1860–1861, joining to create the Confederate States of America. All other state governments remained loyal to the Union.[q] War broke out in April 1861 after the Confederacy bombarded Fort Sumter. Following the Emancipation Proclamation on January 1, 1863, many freed slaves joined the Union army. The war began to turn in the Union's favor following the 1863 Siege of Vicksburg and Battle of Gettysburg, and the Confederates surrendered in 1865 after the Union's victory in the Battle of Appomattox Court House. Efforts toward reconstruction in the secessionist South had begun as early as 1862, but it was only after President Lincoln's assassination that the three Reconstruction Amendments to the Constitution were ratified to protect civil rights. The amendments codified nationally the abolition of slavery and involuntary servitude except as punishment for crimes, promised equal protection under the law for all persons, and prohibited discrimination on the basis of race or previous enslavement. As a result, African Americans took an active political role in ex-Confederate states in the decade following the Civil War.
The former Confederate states were readmitted to the Union, beginning with Tennessee in 1866 and ending with Georgia in 1870. National infrastructure, including transcontinental telegraph and railroads, spurred growth in the American frontier. This was accelerated by the Homestead Acts, through which nearly 10 percent of the total land area of the United States was given away free to some 1.6 million homesteaders. From 1865 through 1917, an unprecedented stream of immigrants arrived in the United States, including 24.4 million from Europe. Most came through the Port of New York, as New York City and other large cities on the East Coast became home to large Jewish, Irish, and Italian populations. Many Northern Europeans as well as significant numbers of Germans and other Central Europeans moved to the Midwest. At the same time, about one million French Canadians migrated from Quebec to New England. During the Great Migration, millions of African Americans left the rural South for urban areas in the North. Alaska was purchased from Russia in 1867. The Compromise of 1877 is generally considered the end of the Reconstruction era, as it resolved the electoral crisis following the 1876 presidential election and led President Rutherford B. Hayes to reduce the role of federal troops in the South. Immediately, the Redeemers began evicting the Carpetbaggers and quickly regained local control of Southern politics in the name of white supremacy. African Americans endured a period of heightened, overt racism following Reconstruction, a time often considered the nadir of American race relations. A series of Supreme Court decisions, including Plessy v. Ferguson, emptied the Fourteenth and Fifteenth Amendments of their force, allowing Jim Crow laws in the South to remain unchecked, sundown towns in the Midwest, and segregation in communities across the country, which would be reinforced in part by the policy of redlining later adopted by the federal Home Owners' Loan Corporation. An explosion of technological advancement, accompanied by the exploitation of cheap immigrant labor, led to rapid economic expansion during the Gilded Age of the late 19th century. It continued into the early 20th century, when the United States already outpaced the economies of Britain, France, and Germany combined. This fostered the amassing of power by a few prominent industrialists, largely by their formation of trusts and monopolies to prevent competition. Tycoons led the nation's expansion in the railroad, petroleum, and steel industries. The United States emerged as a pioneer of the automotive industry. These changes resulted in significant increases in economic inequality, slum conditions, and social unrest, creating the environment for labor unions and socialist movements to begin to flourish. This period eventually ended with the advent of the Progressive Era, which was characterized by significant economic and social reforms. Pro-American elements in Hawaii overthrew the Hawaiian monarchy; the islands were annexed in 1898. That same year, Puerto Rico, the Philippines, and Guam were ceded to the U.S. by Spain after the latter's defeat in the Spanish–American War. (The Philippines was granted full independence from the U.S. on July 4, 1946, following World War II. Puerto Rico and Guam have remained U.S. territories.) American Samoa was acquired by the United States in 1900 after the Second Samoan Civil War. The U.S. Virgin Islands were purchased from Denmark in 1917.
The United States entered World War I alongside the Allies in 1917, helping to turn the tide against the Central Powers. In 1920, a constitutional amendment granted nationwide women's suffrage. During the 1920s and 1930s, radio as a medium of mass communication and early television transformed communications nationwide. The Wall Street Crash of 1929 triggered the Great Depression, to which President Franklin D. Roosevelt responded with the New Deal plan of "reform, recovery and relief", a series of unprecedented and sweeping recovery programs and employment relief projects combined with financial reforms and regulations. Initially neutral during World War II, the U.S. began supplying war materiel to the Allies in March 1941 and entered the war in December after Japan's attack on Pearl Harbor. Agreeing to a "Europe first" policy, the U.S. concentrated its wartime efforts on Japan's allies Italy and Germany until Germany's final defeat in May 1945. The U.S. developed the first nuclear weapons and used them against the Japanese cities of Hiroshima and Nagasaki in August 1945, ending the war. The United States was one of the "Four Policemen" who met to plan the post-war world, alongside the United Kingdom, the Soviet Union, and China. The U.S. emerged relatively unscathed from the war, with even greater economic power and international political influence. The end of World War II in 1945 left the U.S. and the Soviet Union as superpowers, each with its own political, military, and economic sphere of influence. Geopolitical tensions between the two superpowers soon led to the Cold War. The U.S. implemented a policy of containment intended to limit the Soviet Union's sphere of influence; engaged in regime change against governments perceived to be aligned with the Soviets; and prevailed in the Space Race, which culminated with the first crewed Moon landing in 1969. Domestically, the U.S. experienced economic growth, urbanization, and population growth following World War II. The civil rights movement emerged, with Martin Luther King Jr. becoming a prominent leader in the early 1960s. The Great Society plan of President Lyndon B. Johnson's administration resulted in groundbreaking and broad-reaching laws, policies and a constitutional amendment to counteract some of the worst effects of lingering institutional racism. The counterculture movement in the U.S. brought significant social changes, including the liberalization of attitudes toward recreational drug use and sexuality. It also encouraged open defiance of the military draft (leading to the end of conscription in 1973) and wide opposition to U.S. intervention in Vietnam, with U.S. forces withdrawing completely in 1975. A societal shift in the roles of women was significantly responsible for the large increase in female paid labor participation starting in the 1970s, and by 1985 the majority of American women aged 16 and older were employed. The fall of communism and the dissolution of the Soviet Union from 1989 to 1991 marked the end of the Cold War and left the United States as the world's sole superpower. This cemented the United States' global influence, reinforcing the concept of the "American Century" as the U.S. dominated international political, cultural, economic, and military affairs. The 1990s saw the longest recorded economic expansion in American history, a dramatic decline in U.S. crime rates, and advances in technology.
Throughout this decade, technological innovations such as the World Wide Web, the evolution of the Pentium microprocessor in accordance with Moore's law, rechargeable lithium-ion batteries, the first gene therapy trial, and cloning either emerged in the U.S. or were improved upon there. The Human Genome Project was formally launched in 1990, while Nasdaq became the first stock market in the United States to trade online in 1998. In the Gulf War of 1991, an American-led international coalition of states expelled an Iraqi invasion force that had occupied neighboring Kuwait. The September 11 attacks on the United States in 2001 by the pan-Islamist militant organization al-Qaeda led to the war on terror and subsequent military interventions in Afghanistan and in Iraq. The U.S. housing bubble culminated in 2007 with the Great Recession, the largest economic contraction since the Great Depression. In the 2010s and early 2020s, the United States has experienced increased political polarization and democratic backsliding. The country's polarization was violently reflected in the January 2021 Capitol attack, when a mob of insurrectionists entered the U.S. Capitol and sought to prevent the peaceful transfer of power in an attempted self-coup d'état. Geography The United States is the world's third-largest country by total area behind Russia and Canada.[c] The 48 contiguous states and the District of Columbia have a combined area of 3,119,885 square miles (8,080,470 km2). In 2021, the United States had 8% of the Earth's permanent meadows and pastures and 10% of its cropland. Starting in the east, the coastal plain of the Atlantic seaboard gives way to inland forests and rolling hills in the Piedmont plateau region. The Appalachian Mountains and the Adirondack Massif separate the East Coast from the Great Lakes and the grasslands of the Midwest. The Mississippi River System, the world's fourth-longest river system, runs predominantly north–south through the center of the country. The flat and fertile prairie of the Great Plains stretches to the west, interrupted by a highland region in the southeast. The Rocky Mountains, west of the Great Plains, extend north to south across the country, peaking at over 14,000 feet (4,300 m) in Colorado. The supervolcano underlying Yellowstone National Park in the Rocky Mountains, the Yellowstone Caldera, is the continent's largest volcanic feature. Farther west are the rocky Great Basin and the Chihuahuan, Sonoran, and Mojave deserts. In the northwest corner of Arizona, carved by the Colorado River, is the Grand Canyon, a steep-sided canyon and popular tourist destination known for its overwhelming visual size and intricate, colorful landscape. The Cascade and Sierra Nevada mountain ranges run close to the Pacific coast. The lowest and highest points in the contiguous United States are in the State of California, about 84 miles (135 km) apart. At an elevation of 20,310 feet (6,190.5 m), Alaska's Denali (also called Mount McKinley) is the highest peak in the country and on the continent. Active volcanoes in the U.S. are common throughout Alaska's Alexander and Aleutian Islands. Located entirely outside North America, the archipelago of Hawaii consists of volcanic islands, physiographically and ethnologically part of the Polynesian subregion of Oceania. In addition to its total land area, the United States has one of the world's largest marine exclusive economic zones spanning approximately 4.5 million square miles (11.7 million km2) of ocean. 
With its large size and geographic variety, the United States includes most climate types. East of the 100th meridian, the climate ranges from humid continental in the north to humid subtropical in the south. The western Great Plains are semi-arid. Many mountainous areas of the American West have an alpine climate. The climate is arid in the Southwest, Mediterranean in coastal California, and oceanic in coastal Oregon, Washington, and southern Alaska. Most of Alaska is subarctic or polar. Hawaii, the southern tip of Florida, and U.S. territories in the Caribbean and Pacific are tropical. The United States receives more high-impact extreme weather incidents than any other country. States bordering the Gulf of Mexico are prone to hurricanes, and most of the world's tornadoes occur in the country, mainly in Tornado Alley. Due to climate change, extreme weather has become more frequent in the U.S. in the 21st century, with three times the number of reported heat waves compared to the 1960s. Since the 1990s, droughts in the American Southwest have become more persistent and more severe. The regions most attractive to new residents are often also the most vulnerable to such extreme weather. The U.S. is one of 17 megadiverse countries containing large numbers of endemic species: about 17,000 species of vascular plants occur in the contiguous United States and Alaska, and over 1,800 species of flowering plants are found in Hawaii, few of which occur on the mainland. The United States is home to 428 mammal species, 784 birds, 311 reptiles, 295 amphibians, and around 91,000 insect species. There are 63 national parks and hundreds of other federally managed monuments, forests, and wilderness areas, administered by the National Park Service and other agencies. About 28% of the country's land is publicly owned and federally managed, primarily in the Western States. Most of this land is protected, though some is leased for commercial use, and less than one percent is used for military purposes. Environmental issues in the United States include debates on non-renewable resources and nuclear energy, air and water pollution, biodiversity, logging and deforestation, and climate change. The U.S. Environmental Protection Agency (EPA) is the federal agency charged with addressing most environment-related issues. The idea of wilderness has shaped the management of public lands since the passage of the Wilderness Act in 1964. The Endangered Species Act of 1973 provides a way to protect threatened and endangered species and their habitats; the United States Fish and Wildlife Service implements and enforces it. In 2024, the U.S. ranked 35th among 180 countries in the Environmental Performance Index. Government and politics The United States is a federal republic of 50 states and a federal capital district, Washington, D.C. The U.S. asserts sovereignty over five unincorporated territories and several uninhabited island possessions. It is the world's oldest surviving federation, and its presidential system of federal government has been adopted, in whole or in part, by many newly independent states worldwide following their decolonization. The Constitution of the United States serves as the country's supreme legal document. Most scholars describe the United States as a liberal democracy.[r] Composed of three branches, all headquartered in Washington, D.C., the federal government is the national government of the United States. The U.S. 
Constitution establishes a separation of powers intended to provide a system of checks and balances to prevent any of the three branches from becoming supreme. The three-branch system is known as the presidential system, in contrast to the parliamentary system, where the executive is part of the legislative body. Many countries around the world adopted this aspect of the 1789 Constitution of the United States, especially in the postcolonial Americas. In the U.S. federal system, sovereign powers are shared among three levels of government specified in the Constitution: the federal government, the states, and Indian tribes. The U.S. also asserts sovereignty over five permanently inhabited territories: American Samoa, Guam, the Northern Mariana Islands, Puerto Rico, and the U.S. Virgin Islands. Residents of the 50 states are governed by their elected state government, under state constitutions compatible with the national constitution, and by elected local governments that are administrative divisions of a state. States are subdivided into counties or county equivalents, and (except for Hawaii) further divided into municipalities, each administered by elected representatives. The District of Columbia is a federal district containing the U.S. capital, Washington, D.C. The federal district is an administrative division of the federal government. Indian country is made up of 574 federally recognized tribes and 326 Indian reservations. They hold a government-to-government relationship with the U.S. federal government in Washington and are legally defined as domestic dependent nations with inherent tribal sovereignty rights. In addition to the five major territories, the U.S. also asserts sovereignty over the United States Minor Outlying Islands in the Pacific Ocean and the Caribbean. The seven undisputed islands without permanent populations are Baker Island, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Atoll, and Palmyra Atoll. U.S. sovereignty over the unpopulated Bajo Nuevo Bank, Navassa Island, Serranilla Bank, and Wake Island is disputed. The Constitution is silent on political parties. However, they developed independently in the 18th century, beginning with the Federalist and Anti-Federalist parties. Since then, the United States has operated as a de facto two-party system, though the parties have changed over time. Since the mid-19th century, the two main national parties have been the Democratic Party and the Republican Party. The former is perceived as relatively liberal in its political platform, while the latter is perceived as relatively conservative. The United States has an established structure of foreign relations, with the world's second-largest diplomatic corps as of 2024. It is a permanent member of the United Nations Security Council and home to the United Nations headquarters. The United States is a member of the G7, G20, and OECD intergovernmental organizations. Almost all countries have embassies and many have consulates (official representatives) in the country. Likewise, nearly all countries maintain formal diplomatic missions in the United States, the exceptions being Iran, North Korea, and Bhutan. Though Taiwan does not have formal diplomatic relations with the U.S., it maintains close unofficial relations. The United States regularly supplies Taiwan with military equipment to deter potential Chinese aggression. 
U.S. geopolitical attention also turned to the Indo-Pacific when it joined the Quadrilateral Security Dialogue with Australia, India, and Japan. The United States has a "Special Relationship" with the United Kingdom and strong ties with Canada, Australia, New Zealand, the Philippines, Japan, South Korea, Israel, and several European Union countries such as France, Italy, Germany, Spain, and Poland. The U.S. works closely with its NATO allies on military and national security issues, and with countries in the Americas through the Organization of American States and the United States–Mexico–Canada Agreement (USMCA). The U.S. exercises full international defense authority and responsibility for Micronesia, the Marshall Islands, and Palau through the Compact of Free Association. It has increasingly conducted strategic cooperation with India, while its ties with China have steadily deteriorated. Beginning in 2014, the U.S. became a key ally of Ukraine. After Donald Trump was elected U.S. president in 2024, he sought to negotiate an end to the Russo-Ukrainian War. He paused all military aid to Ukraine in March 2025, although the aid resumed later. Trump also ended U.S. intelligence sharing with the country, but this too was eventually restored. The president is the commander-in-chief of the United States Armed Forces and appoints its leaders, the secretary of defense and the Joint Chiefs of Staff. The Department of Defense, headquartered at the Pentagon near Washington, D.C., administers five of the six service branches: the U.S. Army, Marine Corps, Navy, Air Force, and Space Force. The Coast Guard is administered by the Department of Homeland Security in peacetime and can be transferred to the Department of the Navy in wartime. The total strength of the military is about 1.3 million active-duty personnel, with an additional 400,000 in reserve. The United States spent $997 billion on its military in 2024, by far the largest amount of any country, making up 37% of global military spending and 3.4% of the country's GDP. The U.S. possesses 42% of the world's nuclear weapons—the second-largest stockpile after that of Russia. The U.S. military is widely regarded as the most powerful and advanced in the world. The United States has the third-largest combined armed forces in the world, behind the Chinese People's Liberation Army and the Indian Armed Forces. The U.S. military operates about 800 bases and facilities abroad and maintains deployments of more than 100 active-duty personnel in 25 foreign countries. The United States has engaged in over 400 military interventions since its founding in 1776, with over half of these occurring between 1950 and 2019 and 25% occurring in the post-Cold War era. State defense forces (SDFs) are military units that operate under the sole authority of a state government. SDFs are authorized by state and federal law but are under the command of the state's governor. By contrast, the 54 U.S. National Guard organizations[t] fall under the dual control of state or territorial governments and the federal government; their units can also become federalized entities, but SDFs cannot be federalized. The National Guard personnel of a state or territory can be federalized by the president under the National Defense Act Amendments of 1933; this legislation created the National Guard of the United States as a reserve component and provides for the integration of Army National Guard and Air National Guard units and personnel into the U.S. Army and (since 1947) the U.S. Air Force. 
The total number of National Guard members is about 430,000, while the estimated combined strength of SDFs is less than 10,000. There are about 18,000 police agencies in the United States, ranging from the local to the national level. Law in the United States is mainly enforced by local police departments and sheriffs' departments in their municipal or county jurisdictions. State police departments have authority in their respective states, and federal agencies such as the Federal Bureau of Investigation (FBI) and the U.S. Marshals Service have national jurisdiction and specialized duties, such as protecting civil rights and national security, enforcing the rulings of U.S. federal courts and federal laws, and investigating interstate criminal activity. State courts conduct almost all civil and criminal trials, while federal courts adjudicate the much smaller number of civil and criminal cases that relate to federal law. There is no unified "criminal justice system" in the United States. The American prison system is largely heterogeneous, with thousands of relatively independent systems operating across federal, state, local, and tribal levels. In 2025, "these systems hold nearly 2 million people in 1,566 state prisons, 98 federal prisons, 3,116 local jails, 1,277 juvenile correctional facilities, 133 immigration detention facilities, and 80 Indian country jails, as well as in military prisons, civil commitment centers, state psychiatric hospitals, and prisons in the U.S. territories." Despite disparate systems of confinement, four main institutions dominate: federal prisons, state prisons, local jails, and juvenile correctional facilities. Federal prisons are run by the Federal Bureau of Prisons and hold pretrial detainees as well as people who have been convicted of federal crimes. State prisons, run by the department of corrections of each state, hold people sentenced and serving prison time (usually longer than one year) for felony offenses. Local jails are county or municipal facilities that incarcerate defendants prior to trial; they also hold those serving short sentences (typically under a year). Juvenile correctional facilities are operated by local or state governments and serve as longer-term placements for any minor adjudicated as delinquent and ordered by a judge to be confined. In January 2023, the United States had the sixth-highest per capita incarceration rate in the world—531 people per 100,000 inhabitants—and the largest prison and jail population in the world, with more than 1.9 million people incarcerated. An analysis of the World Health Organization Mortality Database from 2010 showed U.S. homicide rates "were 7 times higher than in other high-income countries, driven by a gun homicide rate that was 25 times higher". Economy The U.S. has a highly developed mixed economy that has been the world's largest nominally since about 1890. Its 2024 gross domestic product (GDP)[e] of more than $29 trillion constituted over 25% of nominal global economic output, or 15% at purchasing power parity (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks first in the world by nominal GDP, second when adjusted for PPP, and ninth by PPP-adjusted GDP per capita. In February 2024, the total U.S. federal government debt was $34.4 trillion. Of the world's 500 largest companies by revenue, 138 were headquartered in the U.S. in 2025, the highest number of any country. The U.S. 
dollar is the currency most used in international transactions and the world's foremost reserve currency, backed by the country's dominant economy, its military, the petrodollar system, the large market for U.S. Treasury securities, and the linked eurodollar market. Several countries use it as their official currency, and in others it is the de facto currency. The U.S. has free trade agreements with several countries, including the USMCA with Canada and Mexico. Although the United States has reached a post-industrial level of economic development and is often described as having a service economy, it remains a major industrial power; in 2024, the U.S. manufacturing sector was the world's second-largest by value output after China's. New York City is the world's principal financial center, and its metropolitan area is the world's largest metropolitan economy. The New York Stock Exchange and Nasdaq, both located in New York City, are the world's two largest stock exchanges by market capitalization and trade volume. The United States is at the forefront of technological advancement and innovation in many economic fields, especially in artificial intelligence; electronics and computers; pharmaceuticals; and medical, aerospace and military equipment. The country's economy is fueled by abundant natural resources, a well-developed infrastructure, and high productivity. The largest trading partners of the United States are the European Union, Mexico, Canada, China, Japan, South Korea, the United Kingdom, Vietnam, India, and Taiwan. The United States is the world's largest importer and second-largest exporter.[u] It is by far the world's largest exporter of services. Americans have the highest average household and employee income among OECD member states, and in 2023 had the fourth-highest median household income, up from sixth-highest in 2013. With personal consumption expenditures of over $18.5 trillion in 2023, the U.S. has a heavily consumer-driven economy and is the world's largest consumer market. The U.S. ranked first in the number of dollar billionaires and millionaires in 2023, with 735 billionaires and nearly 22 million millionaires. Wealth in the United States is highly concentrated; in 2011, the richest 10% of the adult population owned 72% of the country's household wealth, while the bottom 50% owned just 2%. U.S. wealth inequality has increased substantially since the late 1980s, and income inequality in the U.S. reached a record high in 2019. In 2024, the country had some of the highest wealth and income inequality levels among OECD countries. Since the 1970s, there has been a decoupling of U.S. wage gains from worker productivity. In 2016, the top fifth of earners took home more than half of all income, giving the U.S. one of the widest income distributions among OECD countries. There were about 771,480 homeless persons in the U.S. in 2024. In 2022, 6.4 million children experienced food insecurity. Feeding America estimates that around one in five children, or approximately 13 million, experience hunger in the U.S. and do not know where or when they will get their next meal. Also in 2022, about 37.9 million people, or 11.5% of the U.S. population, were living in poverty. The United States has a smaller welfare state and redistributes less income through government action than most other high-income countries. It is the only advanced economy that does not guarantee its workers paid vacation nationally and one of a few countries in the world without federal paid family leave as a legal right. 
The United States has a higher percentage of low-income workers than almost any other developed country, largely because of a weak collective bargaining system and a lack of government support for at-risk workers. The United States has been a leader in technological innovation since the late 19th century and scientific research since the mid-20th century. Methods for producing interchangeable parts and the establishment of a machine tool industry enabled the large-scale manufacturing of U.S. consumer products in the late 19th century. By the early 20th century, factory electrification, the introduction of the assembly line, and other labor-saving techniques created the system of mass production. In the 21st century, the United States continues to be one of the world's foremost scientific powers, though China has emerged as a major competitor in many fields. The U.S. has the highest research and development expenditures of any country and ranks ninth in R&D spending as a percentage of GDP. In 2022, the United States was (after China) the country with the second-highest number of published scientific papers. In 2021, the U.S. ranked second (also after China) by the number of patent applications, and third by trademark and industrial design applications (after China and Germany), according to World Intellectual Property Indicators. In 2025, the United States ranked third (after Switzerland and Sweden) in the Global Innovation Index. The United States is considered to be a world leader in the development of artificial intelligence technology. In 2023, the United States was ranked the second most technologically advanced country in the world (after South Korea) by Global Finance magazine. The United States has maintained a space program since the late 1950s, beginning with the establishment of the National Aeronautics and Space Administration (NASA) in 1958. NASA's Apollo program (1961–1972) achieved the first crewed Moon landing with the 1969 Apollo 11 mission; it remains one of the agency's most significant milestones. Other major endeavors by NASA include the Space Shuttle program (1981–2011), the Voyager program (1972–present), the Hubble and James Webb space telescopes (launched in 1990 and 2021, respectively), and the multi-mission Mars Exploration Program (Spirit and Opportunity, Curiosity, and Perseverance). NASA is one of five agencies collaborating on the International Space Station (ISS); U.S. contributions to the ISS include several modules, among them Destiny (2001), Harmony (2007), and Tranquility (2010), as well as ongoing logistical and operational support. The United States private sector dominates the global commercial spaceflight industry. Prominent American spaceflight contractors include Blue Origin, Boeing, Lockheed Martin, Northrop Grumman, and SpaceX. NASA programs such as the Commercial Crew Program, Commercial Resupply Services, Commercial Lunar Payload Services, and NextSTEP have facilitated growing private-sector involvement in American spaceflight. In 2023, the United States received approximately 84% of its energy from fossil fuels, and its largest source of energy was petroleum (38%), followed by natural gas (36%), renewable sources (9%), coal (9%), and nuclear power (9%). In 2022, the United States constituted about 4% of the world's population but consumed around 16% of the world's energy. The U.S. ranks as the second-highest emitter of greenhouse gases, behind China. The U.S. is the world's largest producer of nuclear power, generating around 30% of the world's nuclear electricity. 
It also has the highest number of nuclear power reactors of any country. As of 2024, the U.S. plans to triple its nuclear power capacity by 2050. The United States' road network, at 4 million miles (6.4 million kilometers) and owned almost entirely by state and local governments, is the longest in the world. The extensive Interstate Highway System that connects all major U.S. cities is funded mostly by the federal government but maintained by state departments of transportation. The system is further extended by state highways and some private toll roads. The U.S. was among the top ten countries in vehicle ownership per capita in 2022, with 850 vehicles per 1,000 people. A 2022 study found that 76% of U.S. commuters drive alone and 14% ride a bicycle, including bike owners and users of bike-sharing networks. About 11% use some form of public transportation. Public transportation in the United States is well developed in the largest urban areas, notably New York City, Washington, D.C., Boston, Philadelphia, Chicago, and San Francisco; otherwise, coverage is generally less extensive than in most other developed countries. The U.S. also has many relatively car-dependent localities. Long-distance intercity travel is provided primarily by airlines, but travel by rail is more common along the Northeast Corridor, home to the only high-speed rail service in the U.S. that meets international standards. Amtrak, the country's government-sponsored national passenger rail company, has a relatively sparse network compared to those of Western European countries. Service is concentrated in the Northeast, California, the Midwest, the Pacific Northwest, and Virginia/Southeast. The United States has an extensive air transportation network. U.S. civilian airlines are all privately owned. The three largest airlines in the world by total number of passengers carried are U.S.-based; American Airlines became the global leader after its 2013 merger with US Airways. Sixteen of the world's 50 busiest airports, including five of the top ten, are in the United States. The world's busiest airport by passenger volume is Hartsfield–Jackson Atlanta International in Atlanta, Georgia. In 2022, most of the 19,969 U.S. airports were owned and operated by local government authorities, though some were privately owned. Some 5,193 were designated as "public use", including for general aviation. The Transportation Security Administration (TSA) has provided security at most major airports since 2001. The country's rail transport network, the longest in the world at 182,412.3 mi (293,564.2 km), handles mostly freight, in contrast to the more passenger-centered rail systems of Europe. Because they are mostly privately owned, U.S. railroads lag behind those of the rest of the world in electrification. The country's inland waterways are the world's fifth-longest, totaling 25,482 mi (41,009 km). They are used extensively for freight, recreation, and a small amount of passenger traffic. Of the world's 50 busiest container ports, four are located in the United States, the busiest of which is the Port of Los Angeles. Demographics The U.S. Census Bureau reported 331,449,281 residents on April 1, 2020,[v] making the United States the third-most-populous country in the world, after India and China. The Census Bureau's official 2025 population estimate was 341,784,857, an increase of 3.1% since the 2020 census. According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S. 
population had a net gain of one person every 16 seconds, or about 5,400 people per day. In 2023, 51% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 34% had never been married. In 2023, the total fertility rate for the U.S. stood at 1.6 children per woman, and in 2019 the country had the world's highest rate of children living in single-parent households, at 23%. Most Americans live in the suburbs of major metropolitan areas. The United States has a diverse population; 37 ancestry groups have more than one million members. White Americans with ancestry from Europe, the Middle East, or North Africa form the largest racial and ethnic group, at 57.8% of the United States population. Hispanic and Latino Americans form the second-largest group, at 18.7% of the United States population. African Americans constitute the country's third-largest ancestry group, at 12.1% of the total U.S. population. Asian Americans are the country's fourth-largest group, composing 5.9% of the United States population. The country's 3.7 million Native Americans account for about 1%, and some 574 native tribes are recognized by the federal government. In 2024, the median age of the United States population was 39.1 years. While many languages and dialects are spoken in the United States, English is by far the most commonly spoken and written. English has long been the country's de facto official language, and in 2025 Executive Order 14224 declared it official. However, the U.S. has never had a de jure official language, as Congress has never passed a law designating English as official for all three federal branches. Some laws, such as U.S. naturalization requirements, nonetheless standardize English. Twenty-eight states and the United States Virgin Islands have laws that designate English as the sole official language; 19 states and the District of Columbia have no official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English: Hawaii (Hawaiian), Alaska (twenty Native languages),[w] South Dakota (Sioux), American Samoa (Samoan), Puerto Rico (Spanish), Guam (Chamorro), and the Northern Mariana Islands (Carolinian and Chamorro). In total, 169 Native American languages are spoken in the United States. In Puerto Rico, Spanish is more widely spoken than English. According to the American Community Survey (2020), some 245.4 million people in the U.S. aged five and older spoke only English at home. About 41.2 million spoke Spanish at home, making it the second most commonly used language. Other languages spoken at home by one million people or more include Chinese (3.40 million), Tagalog (1.71 million), Vietnamese (1.52 million), Arabic (1.39 million), French (1.18 million), Korean (1.07 million), and Russian (1.04 million). German, spoken at home by 1 million people in 2010, fell to 857,000 home speakers in 2020. America's immigrant population is by far the world's largest in absolute terms. In 2022, there were 87.7 million immigrants and U.S.-born children of immigrants in the United States, accounting for nearly 27% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 6% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants. 
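The Population Clock figures quoted earlier in this section are easy to sanity-check with back-of-the-envelope arithmetic; the short Python sketch below is illustrative only (it is not Census Bureau code) and assumes the quoted rate of one net new resident every 16 seconds holds steady:

    # Sanity check of the Census Bureau Population Clock figures quoted above.
    # Assumption: a steady net gain of one resident every 16 seconds.
    SECONDS_PER_DAY = 24 * 60 * 60        # 86,400 seconds in a day
    seconds_per_person = 16

    people_per_day = SECONDS_PER_DAY / seconds_per_person
    people_per_year = people_per_day * 365

    print(f"{people_per_day:,.0f} people per day")    # 5,400
    print(f"{people_per_year:,.0f} people per year")  # ~1,971,000

At that rate, the implied gain of roughly two million people per year is broadly consistent with the reported 3.1% increase (about 10.3 million people) between the April 2020 census count and the 2025 estimate. 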
In 2019, the top countries of origin for immigrants were Mexico (24% of immigrants), India (6%), China (5%), the Philippines (4.5%), and El Salvador (3%). In fiscal year 2022, over one million immigrants (most of whom entered through family reunification) were granted legal residence. The undocumented immigrant population in the U.S. reached a record high of 14 million in 2023. The First Amendment guarantees the free exercise of religion in the country and forbids Congress from passing laws respecting its establishment. Religious practice is widespread, among the most diverse in the world, and profoundly vibrant. The country has the world's largest Christian population, which includes the fourth-largest population of Catholics. Other notable faiths include Judaism, Buddhism, Hinduism, Islam, New Age, and Native American religions. Religious practice varies significantly by region. "Ceremonial deism" is common in American culture. The overwhelming majority of Americans believe in a higher power or spiritual force, engage in spiritual practices such as prayer, and consider themselves religious or spiritual. In the Southern United States' "Bible Belt", evangelical Protestantism plays a significant role culturally; New England and the Western United States tend to be more secular. Mormonism, a Restorationist movement founded in the U.S. in 1830, is the predominant religion in Utah and a major religion in Idaho. About 82% of Americans live in metropolitan areas, particularly in suburbs; about half of those reside in cities with populations over 50,000. In 2022, 333 incorporated municipalities had populations over 100,000, nine cities had more than one million residents, and four cities—New York City, Los Angeles, Chicago, and Houston—had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West. According to the Centers for Disease Control and Prevention (CDC), average U.S. life expectancy at birth reached 79.0 years in 2024, its highest recorded level and an increase of 0.6 years over 2023. The CDC attributed the improvement to a significant fall in the number of fatal drug overdoses in the country, noting that "heart disease continues to be the leading cause of death in the United States, followed by cancer and unintentional injuries." In 2024, life expectancy at birth for American men rose to 76.5 years (+0.7 years compared to 2023), while life expectancy for women was 81.4 years (+0.3 years). Starting in 1998, life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been increasing ever since. The Commonwealth Fund reported in 2020 that the U.S. had the highest suicide rate among high-income countries. Approximately one-third of the U.S. adult population is obese and another third is overweight. The U.S. healthcare system far outspends that of any other country, measured both in per capita spending and as a percentage of GDP, but attains worse healthcare outcomes when compared to peer countries, for reasons that are debated. The United States is the only developed country without a system of universal healthcare, and a significant proportion of its population does not carry health insurance. Government-funded healthcare coverage for the poor (Medicaid) and for those age 65 and older (Medicare) is available to Americans who meet the programs' income or age qualifications. 
In 2010, Congress passed the Patient Protection and Affordable Care Act, which then-President Obama signed into law.[x] Abortion in the United States is not federally protected, and is illegal or restricted in 17 states. American primary and secondary education, known in the U.S. as K–12 ("kindergarten through 12th grade"), is decentralized. School systems are operated by state, territorial, and sometimes municipal governments and regulated by the U.S. Department of Education. In general, children are required to attend school or an approved homeschool from the age of five or six (kindergarten or first grade) until they are 18 years old. This often brings students through the 12th grade, the final year of a U.S. high school, but some states and territories allow them to leave school earlier, at age 16 or 17. The U.S. spends more on education per student than any other country, an average of $18,614 per public elementary and secondary school student in the 2020–2021 school year. Among Americans age 25 and older, 92.2% graduated from high school, 62.7% attended some college, 37.7% earned a bachelor's degree, and 14.2% earned a graduate degree. The U.S. literacy rate is near-universal. The U.S. has produced the most Nobel Prize winners of any country, with 411 laureates (having won 413 awards). U.S. tertiary or higher education has earned a global reputation. Many of the world's top universities, as listed by various ranking organizations, are in the United States, including 19 of the top 25. American higher education is dominated by state university systems, although the country's many private universities and colleges enroll about 20% of all American students. Local community colleges generally offer open admissions, lower tuition, and coursework leading to a two-year associate degree or a non-degree certificate. In public expenditure on higher education, the U.S. spends more per student than the OECD average, and Americans outspend all other nations in combined public and private spending. Colleges and universities directly funded by the federal government do not charge tuition and are limited to military personnel and government employees; they include the U.S. service academies, the Naval Postgraduate School, and military staff colleges. Despite student loan forgiveness programs, student loan debt increased by 102% between 2010 and 2020 and exceeded $1.7 trillion in 2022. Culture and society The United States is home to a wide variety of ethnic groups, traditions, and customs. The country has been described as having the values of individualism and personal autonomy, as well as a strong work ethic and competitiveness. Voluntary altruism toward others also plays a major role; according to a 2016 study by the Charities Aid Foundation, Americans donated 1.44% of total GDP to charity—the highest rate in the world by a large margin. Americans have traditionally been characterized by a unifying political belief in an "American Creed" emphasizing consent of the governed, liberty, equality under the law, democracy, social equality, property rights, and a preference for limited government. The U.S. has acquired significant hard and soft power through its diplomatic influence, economic power, military alliances, and cultural exports such as American movies, music, video games, sports, and food. The influence that the United States exerts on other countries through soft power is referred to as Americanization. 
Nearly all present Americans or their ancestors came from Europe, Africa, or Asia (the "Old World") within the past five centuries. Mainstream American culture is a Western culture largely derived from the traditions of European immigrants with influences from many other sources, such as traditions brought by slaves from Africa. More recent immigration from Asia and especially Latin America has added to a cultural mix that has been described both as a homogenizing melting pot and as a heterogeneous salad bowl, with immigrants contributing to, and often assimilating into, mainstream American culture. Under the First Amendment to the Constitution, the United States is considered to have the strongest protections of free speech of any country. Flag desecration, hate speech, blasphemy, and lese-majesty are all forms of protected expression. A 2016 Pew Research Center poll found that Americans were the most supportive of free expression of any polity measured. Additionally, they are the "most supportive of freedom of the press and the right to use the Internet without government censorship". The U.S. is a socially progressive country with permissive attitudes surrounding human sexuality. LGBTQ rights in the United States are among the most advanced by global standards. The American Dream, or the perception that Americans enjoy high levels of social mobility, plays a key role in attracting immigrants. Whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a classless society, scholars identify significant differences between the country's social classes, affecting socialization, language, and values. Americans tend to greatly value socioeconomic achievement, but being ordinary or average is promoted by some as a noble condition as well. The National Foundation on the Arts and the Humanities is an agency of the United States federal government that was established in 1965 with the purpose to "develop and promote a broadly conceived national policy of support for the humanities and the arts in the United States, and for institutions which preserve the cultural heritage of the United States." It is composed of four sub-agencies: the National Endowment for the Arts, the National Endowment for the Humanities, the Federal Council on the Arts and the Humanities, and the Institute of Museum and Library Services. Colonial American authors were influenced by John Locke and other Enlightenment philosophers. The American Revolutionary Period (1765–1783) is notable for the political writings of Benjamin Franklin, Alexander Hamilton, Thomas Paine, and Thomas Jefferson. Shortly before and after the Revolutionary War, the newspaper rose to prominence, filling a demand for anti-British national literature. An early novel is William Hill Brown's The Power of Sympathy, published in 1789. In the early to mid-19th century, writer and critic John Neal helped advance America toward a unique literature and culture by criticizing predecessors such as Washington Irving for imitating their British counterparts and by influencing writers such as Edgar Allan Poe, who took American poetry and short fiction in new directions. Ralph Waldo Emerson and Margaret Fuller pioneered the influential Transcendentalist movement; Henry David Thoreau, author of Walden, was influenced by it. The conflict surrounding abolitionism inspired writers like Harriet Beecher Stowe and authors of slave narratives such as Frederick Douglass. Nathaniel Hawthorne's The Scarlet Letter (1850) explored the dark side of American history, as did Herman Melville's Moby-Dick (1851). 
Major American poets of the 19th-century American Renaissance include Walt Whitman, Melville, and Emily Dickinson. Mark Twain was the first major American writer to be born in the West. Henry James achieved international recognition with novels like The Portrait of a Lady (1881). As literacy rates rose, periodicals published more stories centered on industrial workers, women, and the rural poor. Naturalism, regionalism, and realism were the major literary movements of the period. While modernism generally took on an international character, modernist authors working within the United States more often rooted their work in specific regions, peoples, and cultures. Following the Great Migration to northern cities, African-American and black West Indian authors of the Harlem Renaissance developed an independent tradition of literature that rebuked a history of inequality and celebrated black culture. An important cultural export during the Jazz Age, these writings were a key influence on Négritude, a philosophy emerging in the 1930s among francophone writers of the African diaspora. In the 1950s, an ideal of homogeneity led many authors to attempt to write the Great American Novel, while the Beat Generation rejected this conformity, using styles that elevated the impact of the spoken word over mechanics to describe drug use, sexuality, and the failings of society. Contemporary literature is more pluralistic than in previous eras, with the closest thing to a unifying feature being a trend toward self-conscious experiments with language. Twelve American laureates have won the Nobel Prize in Literature. Media in the United States is broadly uncensored, with the First Amendment providing significant protections, as reiterated in New York Times Co. v. United States. The four major broadcasters in the U.S. are the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), American Broadcasting Company (ABC), and Fox Broadcasting Company (Fox). The four major broadcast television networks are all commercial entities. The U.S. cable television system offers hundreds of channels catering to a variety of niches. In 2021, about 83% of Americans over age 12 listened to broadcast radio, while about 40% listened to podcasts. In 2020, there were 15,460 licensed full-power radio stations in the U.S., according to the Federal Communications Commission (FCC). Much of the public radio broadcasting is supplied by National Public Radio (NPR), incorporated in February 1970 under the Public Broadcasting Act of 1967. U.S. newspapers with a global reach and reputation include The Wall Street Journal, The New York Times, The Washington Post, and USA Today. About 800 publications are produced in Spanish. With few exceptions, newspapers are privately owned, either by large chains such as Gannett or McClatchy, which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in an increasingly rare situation, by individuals or families. Major cities often have alternative newspapers to complement the mainstream daily papers, such as The Village Voice in New York City and LA Weekly in Los Angeles. The five most-visited websites in the world are Google, YouTube, Facebook, Instagram, and ChatGPT—all of them American-owned. Other popular platforms include X (formerly Twitter) and Amazon. In 2025, the U.S. was the world's second-largest video game market by revenue (after China). In 2015, the U.S. 
video game industry consisted of 2,457 companies that supported around 220,000 jobs and generated $30.4 billion in revenue. There are 444 game publishers, developers, and hardware companies in California alone. According to the Game Developers Conference (GDC), the U.S. is the top location for video game development, with 58% of the world's game developers based there in 2025. The United States is well known for its theater. Mainstream theater in the United States derives from the old European theatrical tradition and has been heavily influenced by the British theater. By the middle of the 19th century, America had created new distinct dramatic forms in the Tom Shows, the showboat theater, and the minstrel show. The central hub of the American theater scene is the Theater District in Manhattan, with its divisions of Broadway, off-Broadway, and off-off-Broadway. Many movie and television celebrities have gotten their big break working in New York productions. Outside New York City, many cities have professional regional or resident theater companies that produce their own seasons. The biggest-budget theatrical productions are musicals. U.S. theater also has an active community theater culture. The Tony Awards recognize excellence in live Broadway theater and are presented at an annual ceremony in Manhattan. The awards are given for Broadway productions and performances, and one is also given for regional theater. Several discretionary non-competitive awards are given as well, including a Special Tony Award, the Tony Honors for Excellence in Theatre, and the Isabelle Stevenson Award. Folk art in colonial America grew out of artisanal craftsmanship in communities that allowed commonly trained people to express themselves individually. It was distinct from Europe's tradition of high art, which was less accessible and generally less relevant to early American settlers. Cultural movements in art and craftsmanship in colonial America generally lagged behind those of Western Europe. For example, the prevailing medieval style of woodworking and primitive sculpture became integral to early American folk art, despite the emergence of Renaissance styles in England in the late 16th and early 17th centuries. The new English styles arrived early enough to have made a considerable impact on American folk art, but American styles and forms had already been firmly adopted. Not only did styles change slowly in early America, but there was a tendency for rural artisans there to continue their traditional forms longer than their urban counterparts did—and far longer than those in Western Europe. The Hudson River School was a mid-19th-century movement in the visual arts tradition of European naturalism. The 1913 Armory Show in New York City, an exhibition of European modernist art, shocked the public and transformed the U.S. art scene. American Realism and American Regionalism sought to reflect America and give it new ways of looking at itself. Georgia O'Keeffe, Marsden Hartley, and others experimented with new and individualistic styles, which would become known as American modernism. Major artistic movements such as the abstract expressionism of Jackson Pollock and Willem de Kooning and the pop art of Andy Warhol and Roy Lichtenstein developed largely in the United States. Major photographers include Alfred Stieglitz, Edward Steichen, Dorothea Lange, Edward Weston, James Van Der Zee, Ansel Adams, and Gordon Parks. 
The tide of modernism and then postmodernism has brought global fame to American architects, including Frank Lloyd Wright, Philip Johnson, and Frank Gehry. The Metropolitan Museum of Art in Manhattan is the largest art museum in the United States and the fourth-largest in the world. American folk music encompasses numerous music genres, variously known as traditional music, traditional folk music, contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and sometimes trace back to such origins as the British Isles, mainland Europe, or Africa. The rhythmic and lyrical styles of African-American music in particular have influenced American music. Banjos were brought to America through the slave trade. Minstrel shows incorporating the instrument into their acts led to its increased popularity and widespread production in the 19th century. The electric guitar, first invented in the 1930s and mass-produced by the 1940s, had an enormous influence on popular music, in particular through the development of rock and roll. The synthesizer, turntablism, and electronic music were also largely developed in the U.S. Elements from folk idioms such as the blues and old-time music were adopted and transformed into popular genres with global audiences. Jazz grew from blues and ragtime in the early 20th century, developing from the innovations and recordings of composers such as W.C. Handy and Jelly Roll Morton. Louis Armstrong and Duke Ellington increased its popularity early in the 20th century. Country music developed in the 1920s, bluegrass and rhythm and blues in the 1940s, and rock and roll in the 1950s. In the 1960s, Bob Dylan emerged from the folk revival to become one of the country's most celebrated songwriters. The musical forms of punk and hip hop both originated in the United States in the 1970s. The United States has the world's largest music market, with a total retail value of $15.9 billion in 2022. Most of the world's major record companies are based in the U.S.; they are represented by the Recording Industry Association of America (RIAA). Mid-20th-century American pop stars such as Frank Sinatra and Elvis Presley became global celebrities and best-selling music artists, as have artists of the late 20th century, such as Michael Jackson, Madonna, Whitney Houston, and Mariah Carey, and of the early 21st century, such as Eminem, Britney Spears, Lady Gaga, Katy Perry, Taylor Swift, and Beyoncé. The United States has the world's largest apparel market by revenue. Apart from professional business attire, American fashion is eclectic and predominantly informal. Americans' diverse cultural roots are reflected in their clothing; however, sneakers, jeans, T-shirts, and baseball caps are emblematic of American styles. New York, with its Fashion Week, is considered to be one of the "Big Four" global fashion capitals, along with Paris, Milan, and London. One study found that general proximity to Manhattan's Garment District has been synonymous with American fashion since the industry's inception there in the early 20th century. A number of well-known designer labels, among them Tommy Hilfiger, Ralph Lauren, Tom Ford, and Calvin Klein, are headquartered in Manhattan. Labels cater to niche markets, such as preteens. New York Fashion Week is one of the most influential fashion shows in the world and is held twice each year in Manhattan; the annual Met Gala, also in Manhattan, has been called the fashion world's "biggest night". The U.S. 
film industry has a worldwide influence and following. Hollywood, a district in central Los Angeles, the nation's second-most populous city, is also metonymous for the American filmmaking industry. The major film studios of the United States are the primary source of the world's most commercially successful and most-attended films. Largely centered in the New York City region from its beginnings in the late 19th century through the first decades of the 20th century, the U.S. film industry has since been primarily based in and around Hollywood. Nonetheless, American film companies have been subject to the forces of globalization in the 21st century, and an increasing number of films are made elsewhere. The Academy Awards, popularly known as "the Oscars", have been held annually by the Academy of Motion Picture Arts and Sciences since 1929, and the Golden Globe Awards have been held annually since January 1944. The industry peaked in what is commonly referred to as the "Golden Age of Hollywood", from the early sound period until the early 1960s, with screen actors such as John Wayne and Marilyn Monroe becoming iconic figures. In the 1970s, "New Hollywood", or the "Hollywood Renaissance", was defined by grittier films influenced by French and Italian realist pictures of the post-war period. The 21st century has been marked by the rise of American streaming platforms, which came to rival traditional cinema. Early settlers were introduced by Native Americans to foods such as turkey, sweet potatoes, corn, squash, and maple syrup. Among the most enduring and pervasive examples are variations of the native dish succotash. Early settlers and later immigrants combined these with foods they were familiar with, such as wheat flour, beef, and milk, to create a distinctive American cuisine. New World foods, especially pumpkin, corn, potatoes, and turkey as the main course, are part of a shared national menu on Thanksgiving, when many Americans prepare or purchase traditional dishes to celebrate the occasion. Characteristic American dishes such as apple pie, fried chicken, doughnuts, french fries, macaroni and cheese, ice cream, hamburgers, hot dogs, and American pizza derive from the recipes of various immigrant groups. Mexican dishes such as burritos and tacos preexisted the United States in areas later annexed from Mexico, and adaptations of Chinese cuisine as well as pasta dishes freely adapted from Italian sources are all widely consumed. American chefs have had a significant impact on society both domestically and internationally. In 1946, the Culinary Institute of America was founded by Katharine Angell and Frances Roth. It would become the United States' most prestigious culinary school, where many of the most talented American chefs would study prior to successful careers. The United States restaurant industry was projected at $899 billion in sales for 2020 and employed more than 15 million people, representing 10% of the nation's workforce directly. It is the country's second-largest private employer and the third-largest employer overall. The United States is home to over 220 Michelin star-rated restaurants, 70 of which are in New York City. Wine has been produced in what is now the United States since the 1500s, with the first widespread production beginning in what is now New Mexico in 1628. In the modern U.S., wine production is undertaken in all fifty states, with California producing 84 percent of all U.S. wine. 
With more than 1,100,000 acres (4,500 km2) under vine, the United States is the fourth-largest wine-producing country in the world, after Italy, Spain, and France. The classic American diner, a casual restaurant type originally intended for the working class, emerged during the 19th century from converted railroad dining cars made stationary. The diner soon evolved into purpose-built structures whose number expanded greatly in the 20th century. The American fast-food industry developed alongside the nation's car culture. American restaurants developed the drive-in format in the 1920s, which they began to replace with the drive-through format by the 1940s. American fast-food restaurant chains, such as McDonald's, Burger King, Chick-fil-A, Kentucky Fried Chicken, Dunkin' Donuts, and many others, have numerous outlets around the world. The most popular spectator sports in the U.S. are American football, basketball, baseball, soccer, and ice hockey. Their premier leagues are, respectively, the National Football League, the National Basketball Association, Major League Baseball, Major League Soccer, and the National Hockey League. All of these leagues enjoy wide-ranging domestic media coverage and, except for MLS, all are considered the preeminent leagues in their respective sports in the world. While most major U.S. sports such as baseball and American football have evolved out of European practices, basketball, volleyball, skateboarding, and snowboarding are American inventions, many of which have become popular worldwide. Lacrosse and surfing arose from Native American and Native Hawaiian activities that predate European contact. The market for professional sports in the United States was approximately $69 billion in July 2013, roughly 50% larger than that of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States. Although American football does not have a substantial following in other nations, the NFL has the highest average attendance (67,254) of any professional sports league in the world. In 2024, the NFL generated over $23 billion in revenue, making it the most valuable professional sports league in the United States and the world. Baseball has been regarded as the U.S. "national sport" since the late 19th century. The most-watched individual sports in the U.S. are golf and auto racing, particularly NASCAR and IndyCar. On the collegiate level, earnings for the member institutions exceed $1 billion annually, and college football and basketball attract large audiences; the NCAA March Madness tournament and the College Football Playoff are among the most watched national sporting events. In the U.S., intercollegiate sports serve as the main feeder system for professional and Olympic sports, with significant exceptions such as Minor League Baseball. This differs greatly from practices in nearly all other countries, where publicly and privately funded sports organizations serve this function. Eight Olympic Games have taken place in the United States. The 1904 Summer Olympics in St. Louis, Missouri, were the first-ever Olympic Games held outside of Europe. The Olympic Games will be held in the U.S. for a ninth time when Los Angeles hosts the 2028 Summer Olympics. U.S. athletes have won a total of 2,968 medals (1,179 gold) at the Olympic Games, the most of any country. 
In other international competition, the United States hosts a number of prestigious events, including the America's Cup, the World Baseball Classic, the U.S. Open, and the Masters Tournament. The U.S. men's national soccer team has qualified for eleven World Cups, while the women's national team has won the FIFA Women's World Cup four times and the Olympic soccer tournament five times. The 1999 FIFA Women's World Cup was hosted by the United States; its final match was attended by 90,185 spectators, then a world-record crowd for a women's sporting event. The United States hosted the 1994 FIFA World Cup and will co-host, along with Canada and Mexico, the 2026 FIFA World Cup. See also Notes References This article incorporates text from a free content work. Licensed under CC BY-SA IGO 3.0 (license statement/permission). Text taken from World Food and Agriculture – Statistical Yearbook 2023, FAO. External links 40°N 100°W / 40°N 100°W / 40; -100 (United States of America) |
======================================== |
[SOURCE: https://www.theverge.com/electric-cars] | [TOKENS: 1532] |
Electric Cars The future of transportation is electric. Tesla proved with the Model S that customers would want to buy luxury vehicles powered by lithium-ion batteries. Other EV startups like Faraday Future, Byton, Lucid Motors, and SF Motors are chasing after Elon Musk. And major automakers like Jaguar, Audi, and Mercedes-Benz have each released their own Tesla challengers. There are obstacles, such as the need for a more robust charging network. But battery-powered cars are here to stay.

The EV company says the staff cuts are intended to "improve operational effectiveness and optimize our resources," TechCrunch reports. An internal memo added that the company is still focused on "further expansion into the robotaxi market," following the launch of a robotaxi collaboration with Nuro and Uber last year. [TechCrunch]

The EV tech startup rocked the auto industry with its CES announcement of a production-ready solid-state battery. Since then, there's been a lot of skepticism and some outright denials that the battery is even real. Now, Donut Labs is pushing back with a cleverly titled new video series, "I Donut Believe," and independent test results that verify its claims. The first report is expected to drop next week.

A federal jury in Florida last year found Tesla partly liable for a deadly 2019 crash involving the company's Autopilot driver-assist software, and ordered the company to pay the families $243 million. Tesla appealed the ruling, but now a judge has dismissed that effort. In her ruling, US District Court Judge Beth Bloom stated that Tesla's arguments "were already considered and rejected" and that the evidence at trial "more than supports the jury verdict"; she found no error in the proceedings. [Reuters]

An update to the Rivian mobile app released today introduces a companion app for the Apple Watch. From your wrist you can lock and unlock doors, vent windows, activate the alarm, adjust the cabin temperature using the Apple Watch's crown dial, and monitor your vehicle's battery status from your watch face.

The SUV pioneer owned by Volkswagen won't start production on its first EVs, the Terra truck and the Traveler SUV, until 2028, not 2027 as originally planned, German publication Der Spiegel reports (as noted by The Drive). Given the dour mood around EVs these days, a one-year production delay isn't the worst news. [The Drive]

We can't really tell from the photo whether it has a steering wheel, which was probably a deliberate choice. Elon Musk has said that the fully driverless vehicle will go into volume production in April.

That works out to one crash for every 57,000 miles, according to Electrek, which has been tracking robotaxi crashes reported to the National Highway Traffic Safety Administration. Tesla also updated a July 2025 crash report to include information about someone being hospitalized, but since Tesla heavily redacts its crash reports, we have no more information about who was injured. The lack of transparency from Tesla also means we have no information about the cause or circumstances of any of those 14 crashes. [Electrek]

The automaker's EV skunkworks team is using 'bounties' to guide engineering decisions that track gains in battery range and reductions in cost.

Last month, a federal judge ordered the US Department of Transportation to unfreeze $5 billion from the federal program dedicated to building more EV chargers. But today, the Transportation Secretary announced a new requirement that all federally funded EV chargers be "100 percent" built in America.
Since most EV chargers are sourced from China, this will essentially refreeze the funds and indefinitely delay the installation of more chargers. [USDOT]

A handful of journalists and YouTubers got to drive a pre-production version of Rivian's upcoming $45,000 EV, and the reviews are now live. Doug DeMuro called it "awesome." MKBHD thinks it will be Rivian's answer to the Model Y. JerryRigEverything took it off-roading. And Patrick George from InsideEVs found some of the software choices frustrating. They all agree that Rivian can't afford to screw this up. Car and Driver is doing the important tests of the new EV's Jony Ive-designed interior.

David Stern suggested Epstein invest in multiple EV startups, including Lucid Motors, Faraday Future, and Canoo, TechCrunch reports. Stern was also an advisor to Andrew Mountbatten-Windsor and worked with Epstein for nearly a decade, calling him his "mentor." He also pitched Epstein on buying farmland in Russia and the news organization Al-Jazeera. [TechCrunch]

EV adoption was tied to a decrease in smog-forming nitrogen dioxide pollution in California, the biggest market for electric cars in the US, a recent study confirms. [Fast Company]

"The charges announced today largely reflect the cost of overestimating the pace of the energy transition that distanced us from many car buyers' real-world needs, means and desires," CEO Antonio Filosa said in a statement. The automaker is the latest to record a massive charge on its EV investment, as sales growth slows amid vanishing government incentives. Ford reported a $19.5 billion write-down, while GM said it would take a more modest $6 billion hit. [Reuters]

Geely may build cars in the US, but its software still has to follow cybersecurity restrictions. The Geely, Lynk & Co, and Zeekr cars we drove were all ready for US primetime.

There are many important safety reasons to support China's move to ban hidden, electric door handles from EVs, but also a pettier one: they're just bad, unintuitive, and inconvenient handles. verge_user_m65nybmy: Rejoice! Concealed handles are so dumb. What do you mean I have to press one side then pull the other? Just give me a handle ffs

Four months after launching "Standard" versions of the Model 3 and Model Y, Tesla is dropping the trim description. The move comes as the automaker introduces a more affordable all-wheel-drive version of the Model Y in the US. [InsideEVs]

SpaceX is profitable, while xAI is burning about $1 billion a month. Is this another case of Musk bailing himself out?

The luxury EV made Tesla the world's most interesting car company, for a while.

California will outline its own $200 million EV incentive program next week, after officials meet with Detroit automakers to discuss the next phases of its plan to regulate tailpipe emissions. The Trump administration has eliminated incentives for EV purchases, stymied energy efficiency policies, and gutted pollution regulations in general. California is challenging those efforts in court. [Reuters]

In its earnings report today, Tesla disclosed a $2 billion investment in xAI "as part of their recent publicly-disclosed financing round." Bloomberg reported earlier this year that xAI, which also owns X.com and the Grok AI chatbot, burned about $7.8 billion in the first nine months of 2025.
Tesla claims that the investment in xAI is "intended to enhance Tesla's ability to develop and deploy AI products and services into the physical world at scale." [Tesla]
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/BBC_News#Partners] | [TOKENS: 8810] |
Contents BBC News BBC News is an operational business division of the British Broadcasting Corporation (BBC) responsible for the gathering and broadcasting of news and current affairs in the UK and around the world. The department is the world's largest broadcast news organisation and generates about 120 hours of radio and television output each day, as well as online news coverage. The service has over 5,500 journalists working across its output, including in 50 foreign news bureaus where more than 250 foreign correspondents are stationed. Deborah Turness has been the CEO of news and current affairs since September 2022. A 2019 Ofcom report stated that the BBC spent £136m on news during the period April 2018 to March 2019.

BBC News' domestic, global and online news divisions are housed within the largest live newsroom in Europe, in Broadcasting House in central London. Parliamentary coverage is produced and broadcast from studios in London. Through BBC English Regions, the BBC also has regional centres across England and national news centres in Northern Ireland, Scotland and Wales. All nations and English regions produce their own local news programmes and other current affairs and sport programmes. The BBC is a quasi-autonomous corporation authorised by royal charter, making it operationally independent of the government. As of 2024, the BBC reaches an average of 450 million people per week, with the BBC World Service accounting for 320 million people.

History This is London calling – 2LO calling. Here is the first general news bulletin, copyright by Reuters, Press Association, Exchange Telegraph and Central News. – BBC news programme opening during the 1920s

The British Broadcasting Company broadcast its first radio bulletin from radio station 2LO on 14 November 1922. Wishing to avoid competition, newspaper publishers persuaded the government to ban the BBC from broadcasting news before 7 pm, and to force it to use wire service copy instead of reporting on its own. The BBC gradually gained the right to edit the copy and, in 1934, created its own news operation. However, it could not broadcast news before 6 pm until World War II.

In addition to news, Gaumont British and Movietone cinema newsreels had been broadcast on the TV service since 1936, with the BBC producing its own equivalent Television Newsreel programme from January 1948. A weekly Children's Newsreel was inaugurated on 23 April 1950, broadcast to around 350,000 receivers. The network began simulcasting its radio news on television in 1946, with a still picture of Big Ben. Televised bulletins began on 5 July 1954, broadcast from leased studios within Alexandra Palace in London.

The public's interest in television and live events was stimulated by Elizabeth II's coronation in 1953. It is estimated that up to 27 million people viewed the programme in the UK, overtaking radio's audience of 12 million for the first time. Those live pictures were fed from 21 cameras in central London to Alexandra Palace for transmission, and then on to other UK transmitters opened in time for the event. That year, there were around two million TV licences held in the UK, rising to over three million the following year, and four and a half million by 1955. Television news, although physically separate from its radio counterpart, was still firmly under radio news' control in the 1950s.
Correspondents provided reports for both outlets, and the first televised bulletin, shown on 5 July 1954 on the then BBC television service and presented by Richard Baker, had him providing narration off-screen while stills were shown. This was then followed by the customary Television Newsreel with a recorded commentary by John Snagge (and on other occasions by Andrew Timothy). On-screen newsreaders were introduced a year later, in 1955 – Kenneth Kendall (the first to appear in vision), Robert Dougall, and Richard Baker – three weeks before ITN's launch on 21 September 1955.

Mainstream television production had started to move out of Alexandra Palace in 1950 to larger premises – mainly at Lime Grove Studios in Shepherd's Bush, west London – taking Current Affairs (then known as the Talks Department) with it. It was from here that the first Panorama, a new documentary programme, was transmitted on 11 November 1953, with Richard Dimbleby becoming anchor in 1955.

In 1958, Hugh Carleton Greene became head of News and Current Affairs; on 1 January 1960, he became Director-General. Greene made changes aimed at making BBC reporting more similar to that of its competitor ITN, which had been highly rated by study groups held by Greene. A newsroom was created at Alexandra Palace, and television reporters were recruited and given the opportunity to write and voice their own scripts, without having to cover stories for radio too. On 20 June 1960, Nan Winton, the first female BBC network newsreader, appeared in vision. 19 September 1960 saw the start of the radio news and current affairs programme The Ten O'Clock News.

BBC2 started transmission on 20 April 1964 and began broadcasting a new show, Newsroom. The World at One, a lunchtime news programme, began on 4 October 1965 on the then Home Service; News Review had started on television the year before. News Review was a summary of the week's news, first broadcast on Sunday, 26 April 1964 on BBC 2, harking back to the weekly Newsreel Review of the Week, produced from 1951 to open programming on Sunday evenings – the difference being that this incarnation had subtitles for the deaf and hard of hearing. As this was the decade before electronic caption generation, each superimposition ("super") had to be produced on paper or card, synchronised manually to studio and news footage, committed to tape during the afternoon, and broadcast in the early evening. Thus Sundays were no longer a quiet day for news at Alexandra Palace. The programme ran until the 1980s – by then using electronic captions, known as Anchor – to be superseded by Ceefax subtitling (a similar teletext format) and the signing of such programmes as See Hear (from 1981).

On Sunday 17 September 1967, The World This Weekend, a weekly news and current affairs programme, launched on what was then the Home Service, but soon to be Radio 4. Preparations for colour began in the autumn of 1967, and on Thursday 7 March 1968 Newsroom on BBC2 moved to an early evening slot, becoming the first UK news programme to be transmitted in colour – from Studio A at Alexandra Palace. News Review and Westminster (the latter a weekly review of parliamentary happenings) were "colourised" shortly after. However, much of the insert material was still in black and white, as initially only a part of the film coverage shot in and around London was on colour reversal film stock, and all regional and many international contributions were still in black and white.
Colour facilities at Alexandra Palace were technically very limited for the next eighteen months, as it had only one RCA colour Quadruplex videotape machine and, eventually, two Pye plumbicon colour telecines – although the colour news service started with just one. Black and white national bulletins on BBC 1 continued to originate from Studio B on weekdays, along with Town and Around, the London regional "opt out" programme broadcast throughout the 1960s (and the BBC's first regional news programme for the South East), until it started to be replaced by Nationwide on Tuesday to Thursday from Lime Grove Studios early in September 1969. Town and Around was never to make the move to Television Centre – instead it became London This Week, which aired on Mondays and Fridays only, from the new TVC studios.

The BBC moved news production out of Alexandra Palace in 1969, and BBC Television News resumed operations the following day with a lunchtime bulletin on BBC1 – in black and white – from Television Centre, where it remained until March 2013. This move to a smaller studio with better technical facilities allowed Newsroom and News Review to replace back projection with colour-separation overlay. During the 1960s, satellite communication had become possible; however, it was some years before digital line-store conversion was able to undertake the process seamlessly.

On 14 September 1970, the first Nine O'Clock News was broadcast on television. Robert Dougall presented the first week from studio N1 – described by The Guardian as "a sort of polystyrene padded cell" – the bulletin having been moved from the earlier time of 20.50 in response to the ratings achieved by ITN's News at Ten, introduced three years earlier on the rival ITV. Richard Baker and Kenneth Kendall presented subsequent weeks, thus echoing those first television bulletins of the mid-1950s. Angela Rippon became the first female news presenter of the Nine O'Clock News in 1975. Her work outside the news was controversial at the time; she appeared on The Morecambe and Wise Christmas Show in 1976, singing and dancing.

The first edition of John Craven's Newsround, initially intended only as a short series and later renamed simply Newsround, came from studio N3 on 4 April 1972. Afternoon television news bulletins during the mid-to-late 1970s were broadcast from the BBC newsroom itself, rather than one of the three news studios. The newsreader would present to camera while sitting on the edge of a desk; behind him, staff would be seen working busily at their desks. This period coincided with the Nine O'Clock News's next makeover, which used a CSO background of the newsroom from that very same camera each weekday evening. Also in the mid-1970s, the late-night news on BBC2 was briefly renamed Newsnight, but this was not to last, nor was it the same programme as we know today – that would launch in 1980 – and it soon reverted to being just a news summary, with the early evening BBC2 news expanded to become Newsday.

News on radio also changed in the 1970s, on Radio 4 in particular, brought about by the arrival of new editor Peter Woon from television news and the implementation of the Broadcasting in the Seventies report. The changes included the introduction of correspondents into news bulletins, where previously only a newsreader would present, as well as the inclusion of content gathered in the preparation process.
New programmes, PM and The World Tonight, were also added to the daily schedule as part of the plan for the station to become a "wholly speech network". Newsbeat launched as the news service on Radio 1 on 10 September 1973.

On 23 September 1974, a teletext system was launched to bring text-only news content to television screens. Engineers had originally begun developing such a system to bring news to deaf viewers, but the system was expanded. The Ceefax service became much more diverse before it ceased on 23 October 2012: it not only offered subtitling for all channels, it also gave information such as weather, flight times and film reviews.

By the end of the decade, the practice of shooting on film for inserts in news broadcasts was declining, with the introduction of electronic news gathering (ENG) technology into the UK. The equipment would gradually become less cumbersome – the BBC's first attempts, in the latter half of the decade, used a Philips colour camera with a backpack base station and a separate portable Sony U-matic recorder. In 1980, the Iranian Embassy siege was shot electronically by the BBC Television News outside broadcast team, and the work of reporter Kate Adie, broadcasting live from Prince's Gate, was nominated for the BAFTA for actuality coverage, though this time ITN took the 1980 award. Newsnight, the news and current affairs programme, was due to go on air on 23 January 1980, although trade union disagreements meant that its launch from Lime Grove was postponed by a week. On 27 August 1981, Moira Stuart became the first African-Caribbean female newsreader to appear on British television.

By 1982, ENG technology had become sufficiently reliable for Bernard Hesketh to use an Ikegami camera to cover the Falklands War, coverage for which he won the "Royal Television Society Cameraman of the Year" award and a BAFTA nomination – the first time that BBC News had relied upon an electronic camera, rather than film, in a conflict zone. BBC News won the BAFTA for its actuality coverage; however, the event is remembered in television terms for Brian Hanrahan's reporting, in which he coined the phrase "I'm not allowed to say how many planes joined the raid, but I counted them all out and I counted them all back" to circumvent restrictions – a line since cited as an example of good reporting under pressure.

The first BBC breakfast television programme, Breakfast Time, launched on 17 January 1983 from Lime Grove Studio E, two weeks before its ITV rival TV-am. Frank Bough, Selina Scott, and Nick Ross helped to wake viewers with a relaxed style of presenting.

The Six O'Clock News first aired on 3 September 1984, eventually becoming the most watched news programme in the UK (though since 2006 it has been overtaken by the BBC News at Ten). In October 1984, images of millions of people starving to death in the Ethiopian famine were shown in Michael Buerk's Six O'Clock News reports. The BBC News crew were the first to document the famine, with Buerk's report on 23 October describing it as "a biblical famine in the 20th century" and "the closest thing to hell on Earth". The BBC News report shocked Britain, motivating its citizens to inundate relief agencies, such as Save the Children, with donations, and brought global attention to the crisis in Ethiopia. The news report was also watched by Bob Geldof, who would organise the charity single "Do They Know It's Christmas?" to raise money for famine relief, followed by the Live Aid concert in July 1985.
Starting in 1981, the BBC gave a common theme to its main news bulletins with new electronic titles – a set of computer-animated "stripes" forming a circle on a red background, with a "BBC News" typescript appearing below the circle graphics, and a theme tune consisting of brass and keyboards. The Nine used a similar (striped) number 9. The red background was replaced by blue from 1985 until 1987. By 1987, the BBC had decided to re-brand its bulletins and established individual styles again for each one, with differing titles and music; the weekend and holiday bulletins were branded in a similar style to the Nine, although the "stripes" introduction continued to be used until 1989 on occasions where a news bulletin was screened out of the running order of the schedule. In 1987, John Birt resurrected the practice of correspondents working for both TV and radio with the introduction of bi-media journalism.

During the 1990s, a wider range of services began to be offered by BBC News, with the split of BBC World Service Television into BBC World (news and current affairs) and BBC Prime (light entertainment). Content for a 24-hour news channel was thus required, and in 1997 the domestic equivalent, BBC News 24, launched. Rather than set bulletins, ongoing reports and coverage were needed to keep both channels functioning, which meant a greater emphasis on budgeting for both was necessary. In 1998, after 66 years at Broadcasting House, the BBC Radio News operation moved to BBC Television Centre.

New technology, provided by Silicon Graphics, came into use in 1993 for a re-launch of the main BBC 1 bulletins, creating a virtual set which appeared to be much larger than it was physically. The relaunch also brought all bulletins into the same style of set, with only small changes in colouring, titles, and music to differentiate each. A computer-generated cut-glass sculpture of the BBC coat of arms was the centrepiece of the programme titles until the large-scale corporate rebranding of news services in 1999.

In November 1997, BBC News Online was launched, following individual webpages for major news events such as the 1996 Olympic Games, the 1997 general election, and the death of Princess Diana. In 1999, the biggest relaunch occurred, with BBC One bulletins, BBC World, BBC News 24, and BBC News Online all adopting a common style. One of the most significant changes was the gradual adoption of the corporate image by the BBC regional news programmes, giving a common style across local, national and international BBC television news. This also included Newyddion, the main news programme of the Welsh-language channel S4C, produced by BBC News Wales.

Following the relaunch of BBC News in 1999, regional headlines were included at the start of the BBC One news bulletins in 2000. The English regions did, however, lose five minutes at the end of their bulletins, due to a new headline round-up at 18:55. 2000 also saw the Nine O'Clock News moved to the later time of 22:00, in response to ITN, which had just moved its popular News at Ten programme to 23:00. ITN briefly returned News at Ten, but following poor ratings when head-to-head against the BBC's Ten O'Clock News, the ITN bulletin was moved to 22:30, where it remained until 14 January 2008. The retirement of Peter Sissons and the departure of Michael Buerk from the Ten O'Clock News led to changes in the BBC One bulletin presenting team on 20 January 2003.
The Six O'Clock News became double-headed, with George Alagiah and Sophie Raworth, after Huw Edwards and Fiona Bruce moved to present the Ten. A new set design featuring a projected fictional newsroom backdrop was introduced, followed on 16 February 2004 by new programme titles to match those of BBC News 24. BBC News 24 and BBC World introduced a new style of presentation in December 2003, which was slightly altered on 5 July 2004 to mark 50 years of BBC Television News.

On 7 March 2005, Director-General Mark Thompson launched the "Creative Futures" project to restructure the organisation. The individual positions of editor of the One and Six O'Clock News were replaced by a new daytime position in November 2005. Kevin Bakhurst became the first Controller of BBC News 24, replacing the position of editor. Amanda Farnsworth became daytime editor, while Craig Oliver was later named editor of the Ten O'Clock News.

Bulletins received new titles and a new set design in May 2006, to allow Breakfast to move into the main studio for the first time since 1997. The new set featured Barco videowall screens with a background of the London skyline used for main bulletins, and originally an image of cirrus clouds against a blue sky for Breakfast; this was later replaced following viewer criticism. The studio bore similarities to that of the ITN-produced ITV News from 2004, though ITN used a CSO virtual studio rather than the actual screens at BBC News.

BBC News became part of a new BBC Journalism group in November 2006 as part of a restructuring of the BBC. The then-Director of BBC News, Helen Boaden, reported to the then-Deputy Director-General and head of the journalism group, Mark Byford, until he was made redundant in 2010. On 18 October 2007, Thompson announced a six-year plan, "Delivering Creative Futures" (based on his project begun in March 2005), merging the television current affairs department into a new "News Programmes" division. Thompson's announcement, in response to a £2 billion shortfall in funding, would, he said, deliver "a smaller but fitter BBC" in the digital age, by cutting its payroll and, in 2013, selling Television Centre. The various separate newsrooms for television, radio and online operations were merged into a single multimedia newsroom. Programme making within the newsrooms was brought together to form a multimedia programme-making department. BBC World Service director Peter Horrocks said that the changes would achieve efficiency at a time of cost-cutting at the BBC. In his blog, he wrote that using the same resources across the various broadcast media meant either that fewer stories could be covered or that, by following more stories, there would be fewer ways to broadcast them.

A new graphics and video playout system was introduced for the production of television bulletins in January 2007. This coincided with a new structure for BBC World News bulletins, with editors favouring a section devoted to analysing the news stories reported on. The first new BBC News bulletin since the Six O'Clock News was announced in July 2007, following a successful trial in the Midlands. The summary, lasting 90 seconds, has been broadcast at 20:00 on weekdays since December 2007 and bears similarities to 60 Seconds on BBC Three, but also includes headlines from the various BBC regions and a weather summary.
As part of a long-term cost-cutting programme, bulletins were renamed the BBC News at One, Six and Ten respectively in April 2008, while BBC News 24 was renamed BBC News and moved into the same studio as the bulletins at BBC Television Centre. BBC World was renamed BBC World News, and regional news programmes were also updated with the new presentation style, designed by Lambie-Nairn. 2008 also saw tri-media journalism introduced across TV, radio, and online. The studio moves also meant that Studio N9, previously used for BBC World, was closed, and operations moved to the previous studio of BBC News 24. Studio N9 was later refitted to match the new branding, and was used for the BBC's UK local elections and European elections coverage in early June 2009.

A strategy review of the BBC in March 2010 confirmed that having "the best journalism in the world" would form one of five key editorial policies, as part of changes subject to public consultation and BBC Trust approval. After a period of suspension in late 2012, Helen Boaden ceased to be the Director of BBC News. On 16 April 2013, incoming BBC Director-General Tony Hall named James Harding, a former editor of The Times of London, as Director of News and Current Affairs.

From August 2012 to March 2013, all news operations moved from Television Centre to new facilities in the refurbished and extended Broadcasting House, in Portland Place. The move also included the BBC World Service, which moved from Bush House following the expiry of the BBC's lease. This new extension to the north and east, referred to as "New Broadcasting House", includes several new state-of-the-art radio and television studios centred around an 11-storey atrium. The move began with the domestic programme The Andrew Marr Show on 2 September 2012, and concluded with the move of the BBC News channel and domestic news bulletins on 18 March 2013. The newsroom houses all domestic bulletins and programmes on both television and radio, as well as the BBC World Service international radio networks and the BBC World News international television channel.

BBC News and CBS News established an editorial and newsgathering partnership in 2017, replacing an earlier long-standing partnership between BBC News and ABC News. In an October 2018 Simmons Research survey of 38 news organisations, BBC News was ranked the fourth most trusted news organisation by Americans, behind CBS News, ABC News and The Wall Street Journal.

In January 2020, the BBC announced a BBC News savings target of £80 million per year by 2022, involving about 450 staff reductions from its then roughly 6,000 posts. BBC director of news and current affairs Fran Unsworth said there would be further moves toward digital broadcasting, in part to attract back a youth audience, and more pooling of reporters to stop separate teams covering the same news. A further 70 staff reductions were announced in July 2020.

BBC Three began airing the news programme The Catch Up in February 2022. It is presented by Levi Jouavel, Kirsty Grant, and Callum Tulley and aims to help the channel's target audience (16-to-34-year-olds) make sense of the world around them while also highlighting optimistic stories. Compared to its predecessor 60 Seconds, The Catch Up is three times longer, running for about three minutes, and does not air at weekends. According to the BBC's annual report, as of December 2021 India has the largest number of people using BBC services of any country in the world.
In May 2025, following the earthquake that hit Myanmar and Thailand, the Burmese service began broadcasting a television news bulletin (BBC News Myanmar) on a vacated Voice of America satellite frequency.

Programming and reporting In November 2023, BBC News joined with the International Consortium of Investigative Journalists, Paper Trail Media and 69 media partners, including Distributed Denial of Secrets and the Organised Crime and Corruption Reporting Project (OCCRP), and more than 270 journalists in 55 countries and territories to produce the 'Cyprus Confidential' report on the financial network that supports the regime of Vladimir Putin, mostly with connections to Cyprus. The report showed Cyprus to have strong links with high-up figures in the Kremlin, some of whom have been sanctioned. Government officials, including Cyprus president Nikos Christodoulides and European lawmakers, began responding to the investigation's findings in less than 24 hours, calling for reforms and launching probes.

BBC News is responsible for the news programmes and documentary content on the BBC's general television channels, as well as the news coverage on the BBC News Channel in the UK, and 22 hours of programming for the corporation's international BBC World News channel. Coverage for BBC Parliament is carried out on behalf of the BBC at Millbank Studios, though BBC News provides editorial and journalistic content. BBC News content is also output onto the BBC's digital interactive television services under the BBC Red Button brand, and, until 2012, on the Ceefax teletext system.

The music on all BBC television news programmes was composed by David Lowe and introduced as part of the re-branding which commenced in 1999; it features the 'BBC pips'. The general theme was used on bulletins on BBC One, News 24, BBC World and local news programmes in the BBC's nations and regions. Lowe was also responsible for the music on Radio 1's Newsbeat. The theme has had several changes since 1999, the latest in March 2013.

The BBC Arabic Television news channel launched on 11 March 2008, and a Persian-language channel followed on 14 January 2009, broadcasting from the Peel wing of Broadcasting House; both include news, analysis, interviews, sports and cultural programmes, and are run by the BBC World Service and funded from a grant-in-aid from the British Foreign Office (and not the television licence). The BBC Verify service was launched in 2023 to fact-check news stories, followed by BBC Verify Live in 2025.

BBC Radio News produces bulletins for the BBC's national radio stations and provides content for local BBC radio stations via the General News Service (GNS), a BBC-internal news distribution service. BBC News does not produce the BBC's regional news bulletins, which are produced individually by the BBC nations and regions themselves. The BBC World Service broadcasts to some 150 million people in English as well as in 27 other languages across the globe. BBC Radio News is a patron of the Radio Academy.

BBC News Online is the BBC's news website. Launched in November 1997, it is one of the most popular news websites, with 1.2 billion website visits in April 2021, and is used by 60% of the UK's internet users for news. The website contains international news coverage as well as entertainment, sport, science, and political news. Mobile apps for Android, iOS and Windows Phone systems have been provided since 2010.
Many television and radio programmes are also available to view on the BBC iPlayer and BBC Sounds services. The BBC News channel is also available to view 24 hours a day, while video and radio clips are also available within online news articles. In October 2019, BBC News Online launched a mirror on the dark web anonymity network Tor in an effort to circumvent censorship.

Criticism The BBC is required by its charter to be free from both political and commercial influence and answers only to its viewers and listeners. This political objectivity is sometimes questioned. For instance, The Daily Telegraph (3 August 2005) carried a letter from the KGB defector Oleg Gordievsky referring to the BBC as "The Red Service". Books have been written on the subject, including anti-BBC works like Truth Betrayed by W J West and The Truth Twisters by Richard Deacon.

The BBC has been accused of bias by Conservative MPs. The BBC's Editorial Guidelines on Politics and Public Policy state that while "the voices and opinions of opposition parties must be routinely aired and challenged", "the government of the day will often be the primary source of news". The BBC is regularly accused by the government of the day of bias in favour of the opposition and, by the opposition, of bias in favour of the government. Similarly, during times of war, the BBC is often accused by the UK government, or by strong supporters of British military campaigns, of being overly sympathetic to the view of the enemy. An edition of Newsnight at the start of the Falklands War in 1982 was described as "almost treasonable" by John Page, MP, who objected to Peter Snow saying "if we believe the British". During the first Gulf War, critics of the BBC took to using the satirical name "Baghdad Broadcasting Corporation". During the Kosovo War, the BBC was labelled the "Belgrade Broadcasting Corporation" (suggesting favouritism towards the FR Yugoslavia government over ethnic Albanian rebels) by British ministers, although Slobodan Milošević (then FRY president) claimed that the BBC's coverage had been biased against his nation. Conversely, some of those who style themselves anti-establishment in the United Kingdom, or who oppose foreign wars, have accused the BBC of pro-establishment bias or of refusing to give an outlet to "anti-war" voices.

Following the 2003 invasion of Iraq, a study by the Cardiff University School of Journalism of the reporting of the war found that nine out of ten references to weapons of mass destruction during the war assumed that Iraq possessed them, and only one in ten questioned this assumption. It also found that, of the main British broadcasters covering the war, the BBC was the most likely to use the British government and military as its source. It was also the least likely to use independent sources, like the Red Cross, which were more critical of the war. When it came to reporting Iraqi casualties, the study found fewer reports on the BBC than on the other three main channels. The report's author, Justin Lewis, wrote: "Far from revealing an anti-war BBC, our findings tend to give credence to those who criticised the BBC for being too sympathetic to the government in its war coverage. Either way, it is clear that the accusation of BBC anti-war bias fails to stand up to any serious or sustained analysis."

Prominent BBC appointments are constantly assessed by the British media and political establishment for signs of political bias.
The appointment of Greg Dyke as Director-General was highlighted by press sources because Dyke was a Labour Party member and former activist, as well as a friend of Tony Blair. The BBC's former political editor Nick Robinson had, some years earlier, been a chairman of the Young Conservatives and, as a result, attracted informal criticism from the former Labour government; his predecessor Andrew Marr faced similar claims from the right because he had been editor of The Independent, a liberal-leaning newspaper, before his appointment in 2000. Mark Thompson, former Director-General of the BBC, admitted the organisation had been biased "towards the left" in the past. He said, "In the BBC I joined 30 years ago, there was, in much of current affairs, in terms of people's personal politics, which were quite vocal, a massive bias to the left". He then added, "The organization did struggle then with impartiality. Now it is a completely different generation. There is much less overt tribalism among the young journalists who work for the BBC."

Following the EU referendum in 2016, some critics suggested that the BBC was biased in favour of leaving the EU. For instance, in 2018, the BBC received complaints from people who took issue with the BBC not sufficiently covering anti-Brexit marches while giving smaller-scale events hosted by former UKIP leader Nigel Farage more airtime. On the other hand, a poll released by YouGov showed that 45% of people who voted to leave the EU thought that the BBC was 'actively anti-Brexit', compared with 13% of the same kinds of voters who thought the BBC was pro-Brexit.

In 2008, BBC Hindi was criticised by some Indian outlets for referring to the terrorists who carried out the 2008 Mumbai attacks as "gunmen". The response to this added to prior criticism from some Indian commentators suggesting that the BBC may have an Indophobic bias. In March 2015, the BBC was criticised for India's Daughter, a BBC Storyville documentary that interviewed one of the men convicted of the 2012 Delhi gang rape. In spite of a ban ordered by an Indian court, the BBC still aired the documentary outside India.

BBC News was at the centre of a political controversy following the 2003 invasion of Iraq. Three BBC News reports (Andrew Gilligan's on Today, Gavin Hewitt's on The Ten O'Clock News and another on Newsnight) quoted an anonymous source who stated that the British government (particularly the Prime Minister's office) had embellished the September Dossier with misleading exaggerations of Iraq's weapons of mass destruction capabilities. The government denounced the reports and accused the corporation of poor journalism. In subsequent weeks the corporation stood by the report, saying that it had a reliable source. Following intense media speculation, David Kelly was named in the press as the source for Gilligan's story on 9 July 2003. Kelly was found dead, by suicide, in a field close to his home early on 18 July. An inquiry led by Lord Hutton, announced by the British government the following day to investigate the circumstances leading to Kelly's death, concluded that "Dr. Kelly took his own life." In his report on 28 January 2004, Lord Hutton also concluded that Gilligan's original accusation was "unfounded" and the BBC's editorial and management processes were "defective"; in particular, the report criticised the chain of management that caused the BBC to defend its story.
The BBC Director of News, Richard Sambrook, the report said, had accepted Gilligan's word that his story was accurate in spite of his notes being incomplete. Davies had then told the BBC Board of Governors that he was happy with the story and told the Prime Minister that a satisfactory internal inquiry had taken place. The Board of Governors, under the guidance of its chairman, Gavyn Davies, accepted that further investigation of the Government's complaints was unnecessary. Because of the criticism in the Hutton report, Davies resigned on the day of publication. BBC News faced an important test with the publication of the report, reporting on itself, but by common consent (of the Board of Governors) managed this "independently, impartially and honestly". Davies' resignation was followed by that of the Director-General, Greg Dyke, the following day, and by Gilligan's resignation on 30 January. While the episode was undoubtedly a traumatic experience for the corporation, an ICM poll in April 2003 indicated that it had sustained its position as the best and most trusted provider of news.

The BBC has faced accusations of holding both anti-Israel and anti-Palestine bias. Douglas Davis, the London correspondent of The Jerusalem Post, has described the BBC's coverage of the Arab–Israeli conflict as "a relentless, one-dimensional portrayal of Israel as a demonic, criminal state and Israelis as brutal oppressors [which] bears all the hallmarks of a concerted campaign of vilification that, wittingly or not, has the effect of delegitimising the Jewish state and pumping oxygen into a dark old European hatred that dared not speak its name for the past half-century." However, two large independent studies, one conducted by Loughborough University and the other by Glasgow University's Media Group, concluded that Israeli perspectives are given greater coverage. Critics of the BBC argue that the Balen Report proves systematic bias against Israel in headline news programming. The Daily Mail and The Daily Telegraph criticised the BBC for spending hundreds of thousands of British taxpayers' pounds to prevent the report from being released to the public. Jeremy Bowen, the BBC's Middle East editor, was singled out specifically for bias by the BBC Trust, which concluded that he had violated "BBC guidelines on accuracy and impartiality."

An independent panel appointed by the BBC Trust was set up in 2006 to review the impartiality of the BBC's coverage of the Israeli–Palestinian conflict. The panel's assessment was that "apart from individual lapses, there was little to suggest deliberate or systematic bias." While noting a "commitment to be fair, accurate and impartial" and praising much of the BBC's coverage, the independent panel concluded "that BBC output does not consistently give a full and fair account of the conflict. In some ways the picture is incomplete and, in that sense, misleading." It noted that "the failure to convey adequately the disparity in the Israeli and Palestinian experience, [reflects] the fact that one side is in control and the other lives under occupation". Writing in the Financial Times, Philip Stephens, one of the panellists, later accused the BBC's director-general, Mark Thompson, of misrepresenting the panel's conclusions. He further opined: "My sense is that BBC news reporting has also lost a once iron-clad commitment to objectivity and a necessary respect for the democratic process. If I am right, the BBC, too, is lost". Mark Thompson published a rebuttal in the FT the next day.
One BBC correspondent's remark, while reporting on the funeral of Yasser Arafat, that she had been left with tears in her eyes led to further questions of impartiality, particularly from Martin Walker in a guest opinion piece in The Times. Walker picked out the apparent case of Fayad Abu Shamala, the BBC Arabic Service correspondent who told a Hamas rally on 6 May 2001 that journalists in Gaza were "waging the campaign shoulder to shoulder together with the Palestinian people". Walker argued that the independent inquiry was flawed for two reasons: firstly, the period over which it was conducted (August 2005 to January 2006) surrounded the Israeli withdrawal from Gaza and Ariel Sharon's stroke, which produced more positive coverage than usual; furthermore, the inquiry only looked at the BBC's domestic coverage, and excluded output on the BBC World Service and BBC World.

Tom Gross accused the BBC of glorifying Hamas suicide bombers, and condemned its policy of inviting guests such as Jenny Tonge and Tom Paulin, who have compared Israeli soldiers to Nazis. Paulin had said Israeli soldiers should be "shot dead" like Hitler's SS, and said he could "understand how suicide bombers feel". The BBC also faced criticism for not airing a Disasters Emergency Committee aid appeal for Palestinians who suffered in Gaza during the 22-day war there between late 2008 and early 2009; most other major UK broadcasters did air this appeal, though rival Sky News did not. British journalist Julie Burchill has accused the BBC of creating a "climate of fear" for British Jews through its "excessive coverage" of Israel compared with other nations. In light of the Gaza war, the BBC suspended seven Arab journalists over allegations of expressing support for Hamas via social media.

BBC News and ABC News formerly shared video segments and reporters as needed in producing their newscasts, with the BBC showing ABC World News Tonight with David Muir in the UK. In July 2017, however, the BBC announced a new partnership with CBS News that allows both organisations to share video, editorial content, and additional newsgathering resources in New York, London, Washington and around the world. BBC News subscribes to wire services from leading international agencies, including PA Media (formerly the Press Association), Reuters, and Agence France-Presse. In April 2017, the BBC dropped the Associated Press in favour of an enhanced service from AFP.

BBC News reporters and broadcasts have been banned, now and in the past, in several countries, primarily for reporting which has been unfavourable to the ruling government. For example, correspondents were banned by the former apartheid regime of South Africa. The BBC was banned in Zimbabwe under Mugabe for eight years as a terrorist organisation, until being allowed to operate again over a year after the 2008 elections. The BBC was banned in Burma (officially Myanmar) after its coverage of and commentary on anti-government protests there in September 2007; the ban was lifted four years later, in September 2011. Other cases have included Uzbekistan, China, and Pakistan. BBC Persian, the BBC's Persian-language news site, was blocked from the Iranian internet in 2006. The BBC News website was made available in China again in March 2008 but, as of October 2014, was blocked again.
In June 2015, the Rwandan government placed an indefinite ban on BBC broadcasts following the airing of a controversial documentary regarding the 1994 Rwandan genocide, Rwanda's Untold Story, broadcast on BBC2 on 1 October 2014. The UK's Foreign Office recognised "the hurt caused in Rwanda by some parts of the documentary". In February 2017, reporters from the BBC (as well as the Daily Mail, The New York Times, Politico, CNN, and others) were denied access to a United States White House briefing. In 2017, BBC India was banned for five years from covering all national parks and sanctuaries in India. Following Ofcom's withdrawal of CGTN's UK broadcast licence on 4 February 2021, China banned BBC News from airing in China.
======================================== |
[SOURCE: https://www.wired.com/story/samsung-s90f-deal-226/] | [TOKENS: 1149] |
Brad Bourque, Gear, Feb 20, 2026 3:46 PM

Our Favorite Gaming TV Is $100 Off

This Samsung OLED comes loaded with features specifically for gamers. Courtesy of Samsung.

While there are a ton of OLED screens to choose from, gamers have a different set of needs when it comes to their TVs. For those weeknight warriors, we recommend the Samsung S90F QD-OLED, and the 65-inch model is currently marked down to $1,298 at Amazon. That's a big discount from its list price and about $100 below the price where it's been sitting for a few months.

Samsung S90F QD-OLED TV: $1,698 $1,298 (24% off) at Amazon; also at Walmart; $2,500 $1,400 (44% off) at Samsung.

What the Samsung S90F lacks in top-level brightness it more than makes up for with its impressive detail and clear image quality. The colors are bold and flashy while also staying nice and accurate across the panel, and the contrast is incredible, thanks to the OLED panel's ability to reach basically blacked-out levels in the darkest corners of your favorite RPG. It has excellent glare-reduction in bright rooms, which is perfect for those Saturday-afternoon gaming sessions. Add in support for HDR10/10+, and you've got a screen that's just as good for movies as it is for first-person shooters.

Gamers can look forward to the Samsung Gaming Hub, which has a great set of features for both cloud and local gaming. Unlike some of our other favorite TVs that aren't as considerate towards gaming, the S90F also boasts a full set of four HDMI 2.1 ports, so you won't have to pick a favorite console or swap cables every time you switch games. The gaming bar lets you quickly adjust settings to suit your favorite genre, and it has super-fast input response times to help keep your character alive in-game. While capable consoles will happily run the 4K S90F at 120 Hz, PC gamers with compatible GPUs can push that all the way to 144 Hz at 4K.

While the 65-inch model for just under $1,300 is certainly the more reasonable offering, I also spotted the 77-inch model from a third-party seller for just a few dollars under $2,220. As always, the best place to find your next screen is our guide to the best televisions, with real, hands-on experiences from WIRED writers.
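As a quick arithmetic check of the two quoted discounts (a back-of-envelope verification using only the prices listed above; retailer prices may have moved since publication):

\[
\frac{1698 - 1298}{1698} = \frac{400}{1698} \approx 0.236 \approx 24\%,
\qquad
\frac{2500 - 1400}{2500} = \frac{1100}{2500} = 0.44 = 44\%.
\]

Note that the "$100 off" of the headline is measured against the roughly $1,400 price the set has sat at in recent months, while the 24% figure is measured against the $1,698 list price.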
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Computer#cite_ref-LTQ1914es_27-0] | [TOKENS: 10628] |
Contents Computer A computer is a machine that can be programmed to automatically carry out sequences of arithmetic or logical operations (computation). Modern digital electronic computers can perform generic sets of operations known as programs, which enable computers to perform a wide range of tasks. The term computer system may refer to a nominally complete computer that includes the hardware, operating system, software, and peripheral equipment needed and used for full operation, or to a group of computers that are linked and function together, such as a computer network or computer cluster. A broad range of industrial and consumer products use computers as control systems, including simple special-purpose devices like microwave ovens and remote controls, and factory devices like industrial robots. Computers are at the core of general-purpose devices such as personal computers and mobile devices such as smartphones. Computers power the Internet, which links billions of computers and users. Early computers were meant to be used only for calculations. Simple manual instruments like the abacus have aided people in doing calculations since ancient times. Early in the Industrial Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding patterns for looms. More sophisticated electrical machines did specialized analog calculations in the early 20th century. The first digital electronic calculating machines were developed during World War II, both electromechanical and using thermionic valves. The first semiconductor transistors in the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and monolithic integrated circuit chip technologies in the late 1950s, leading to the microprocessor and the microcomputer revolution in the 1970s. The speed, power, and versatility of computers have been increasing dramatically ever since then, with transistor counts increasing at a rapid pace (Moore's law noted that counts doubled every two years), leading to the Digital Revolution during the late 20th and early 21st centuries. Conventionally, a modern computer consists of at least one processing element, typically a central processing unit (CPU) in the form of a microprocessor, together with some type of computer memory, typically semiconductor memory chips. The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information. Peripheral devices include input devices (keyboards, mice, joysticks, etc.), output devices (monitors, printers, etc.), and input/output devices that perform both functions (e.g. touchscreens). Peripheral devices allow information to be retrieved from an external source, and they enable the results of operations to be saved and retrieved. Etymology It was not until the mid-20th century that the word acquired its modern definition; according to the Oxford English Dictionary, the first known use of the word computer was in a different sense, in a 1613 book called The Yong Mans Gleanings by the English writer Richard Brathwait: "I haue [sic] read the truest computer of Times, and the best Arithmetician that euer [sic] breathed, and he reduceth thy dayes into a short number." This usage of the term referred to a human computer, a person who carried out calculations or computations. The word continued to have the same meaning until the middle of the 20th century. 
During the latter part of this period, women were often hired as computers because they could be paid less than their male counterparts. By 1943, most human computers were women. The Online Etymology Dictionary gives the first attested use of computer in the 1640s, meaning 'one who calculates'; this is an "agent noun from compute (v.)". The same dictionary states that the use of the term to mean "'calculating machine' (of any type) is from 1897", and that the "modern use" of the term, to mean 'programmable digital electronic computer', dates from "1945 under this name; [in a] theoretical [sense] from 1937, as Turing machine". The name has remained, although modern computers are capable of many higher-level functions. History Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was most likely a form of tally stick. Later record keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented counts of items, likely livestock or grains, sealed in hollow unbaked clay containers.[a] The use of counting rods is one example. The abacus was initially used for arithmetic tasks. The Roman abacus was developed from devices used in Babylonia as early as 2400 BCE. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money. The Antikythera mechanism is believed to be the earliest known mechanical analog computer, according to Derek J. de Solla Price. It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to c. 100 BCE. Devices of comparable complexity to the Antikythera mechanism would not reappear until the fourteenth century. Many mechanical aids to calculation and measurement were constructed for astronomical and navigation use. The planisphere was a star chart invented by Abū Rayhān al-Bīrūnī in the early 11th century. The astrolabe was invented in the Hellenistic world in either the 1st or 2nd century BCE and is often attributed to Hipparchus. A combination of the planisphere and dioptra, the astrolabe was effectively an analog computer capable of working out several different kinds of problems in spherical astronomy. An astrolabe incorporating a mechanical calendar computer and gear-wheels was invented by Abi Bakr of Isfahan, Persia, in 1235. Abū Rayhān al-Bīrūnī had invented the first mechanical geared lunisolar calendar astrolabe, an early fixed-wired knowledge processing machine with a gear train and gear-wheels, c. 1000 AD. The sector, a calculating instrument used for solving problems in proportion, trigonometry, multiplication and division, and for various functions, such as squares and cube roots, was developed in the late 16th century and found application in gunnery, surveying and navigation. The planimeter was a manual instrument to calculate the area of a closed figure by tracing over it with a mechanical linkage. The slide rule was invented around 1620–1630 by the English clergyman William Oughtred, shortly after the publication of the concept of the logarithm. It is a hand-operated analog computer for doing multiplication and division.
As slide rule development progressed, added scales provided reciprocals, squares and square roots, cubes and cube roots, as well as transcendental functions such as logarithms and exponentials, circular and hyperbolic trigonometry and other functions. Slide rules with special scales are still used for quick performance of routine calculations, such as the E6B circular slide rule used for time and distance calculations on light aircraft. In the 1770s, Pierre Jaquet-Droz, a Swiss watchmaker, built a mechanical doll (automaton) that could write holding a quill pen. By switching the number and order of its internal wheels, different letters, and hence different messages, could be produced. In effect, it could be mechanically "programmed" to read instructions. Along with two other complex machines, the doll is at the Musée d'Art et d'Histoire of Neuchâtel, Switzerland, and still operates. In 1831–1835, mathematician and engineer Giovanni Plana devised a Perpetual Calendar machine, which through a system of pulleys and cylinders could predict the perpetual calendar for every year from 0 CE (that is, 1 BCE) to 4000 CE, keeping track of leap years and varying day length. The tide-predicting machine invented by the Scottish scientist Sir William Thomson in 1872 was of great utility to navigation in shallow waters. It used a system of pulleys and wires to automatically calculate predicted tide levels for a set period at a particular location. The differential analyser, a mechanical analog computer designed to solve differential equations by integration, used wheel-and-disc mechanisms to perform the integration. In 1876, Sir William Thomson had already discussed the possible construction of such calculators, but he had been stymied by the limited output torque of the ball-and-disk integrators. In a differential analyzer, the output of one integrator drove the input of the next integrator, or a graphing output. The torque amplifier was the advance that allowed these machines to work. Starting in the 1920s, Vannevar Bush and others developed mechanical differential analyzers. In the 1890s, the Spanish engineer Leonardo Torres Quevedo began to develop a series of advanced analog machines that could solve real and complex roots of polynomials, work he published in 1901 through the Paris Academy of Sciences. Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer. Considered the "father of the computer", he conceptualized and invented the first mechanical computer in the early 19th century. After working on his difference engine, he announced his invention in 1822 in a paper to the Royal Astronomical Society, titled "Note on the application of machinery to the computation of astronomical and mathematical tables". The difference engine was designed to aid in navigational calculations; in 1833 Babbage realized that a much more general design, an analytical engine, was possible. The input of programs and data was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. The engine would incorporate an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.
The machine was about a century ahead of its time. All the parts for his machine had to be made by hand – this was a major problem for a device with thousands of parts. Eventually, the project was dissolved with the decision of the British Government to cease funding. Babbage's failure to complete the analytical engine can be chiefly attributed to political and financial difficulties as well as his desire to develop an increasingly sophisticated computer and to move ahead faster than anyone else could follow. Nevertheless, his son, Henry Babbage, completed a simplified version of the analytical engine's computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in 1906. In his work Essays on Automatics published in 1914, Leonardo Torres Quevedo wrote a brief history of Babbage's efforts at constructing a mechanical Difference Engine and Analytical Engine. The paper contains a design of a machine capable of calculating formulas like $a^{x}(y-z)^{2}$, for a sequence of sets of values. The whole machine was to be controlled by a read-only program, which was complete with provisions for conditional branching. He also introduced the idea of floating-point arithmetic. In 1920, to celebrate the 100th anniversary of the invention of the arithmometer, Torres presented in Paris the Electromechanical Arithmometer, which allowed a user to input arithmetic problems through a keyboard, and computed and printed the results, demonstrating the feasibility of an electromechanical analytical engine. During the first half of the 20th century, many scientific computing needs were met by increasingly sophisticated analog computers, which used a direct mechanical or electrical model of the problem as a basis for computation. However, these were not programmable and generally lacked the versatility and accuracy of modern digital computers. The first modern analog computer was a tide-predicting machine, invented by Sir William Thomson (later to become Lord Kelvin) in 1872. The differential analyser, a mechanical analog computer designed to solve differential equations by integration using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson, the elder brother of the more famous Sir William Thomson. The art of mechanical analog computing reached its zenith with the differential analyzer, completed in 1931 by Vannevar Bush at MIT. By the 1950s, the success of digital electronic computers had spelled the end for most analog computing machines, but analog computers remained in use during the 1950s in some specialized applications such as education (slide rule) and aircraft (control systems). Claude Shannon's 1937 master's thesis laid the foundations of digital computing, with his insight of applying Boolean algebra to the analysis and synthesis of switching circuits being the basic concept which underlies all electronic digital computers. By 1938, the United States Navy had developed the Torpedo Data Computer, an electromechanical analog computer for submarines that used trigonometry to solve the problem of firing a torpedo at a moving target. During World War II, similar devices were developed in other countries. Early digital computers were electromechanical; electric switches drove mechanical relays to perform the calculation. These devices had a low operating speed and were eventually superseded by much faster all-electric computers, originally using vacuum tubes.
The Z2, created by German engineer Konrad Zuse in 1939 in Berlin, was one of the earliest examples of an electromechanical relay computer. In 1941, Zuse followed up his earlier machine with the Z3, the world's first working electromechanical programmable, fully automatic digital computer. The Z3 was built with 2000 relays, implementing a 22-bit word length that operated at a clock frequency of about 5–10 Hz. Program code was supplied on punched film while data could be stored in 64 words of memory or supplied from the keyboard. It was quite similar to modern machines in some respects, pioneering numerous advances such as floating-point numbers. Because they used a binary system rather than the harder-to-implement decimal system (used in Charles Babbage's earlier design), Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time. The Z3 was not itself a universal computer but could be extended to be Turing complete. Zuse's next computer, the Z4, became the world's first commercial computer; after initial delay due to the Second World War, it was completed in 1950 and delivered to the ETH Zurich. The computer was manufactured by Zuse's own company, Zuse KG, founded in Berlin in 1941 as the first company whose sole purpose was developing computers. The Z4 served as the inspiration for the construction of the ERMETH, the first Swiss computer and one of the first in Europe. Purely electronic circuit elements soon replaced their mechanical and electromechanical equivalents, at the same time that digital calculation replaced analog. The engineer Tommy Flowers, working at the Post Office Research Station in London in the 1930s, began to explore the possible use of electronics for the telephone exchange. Experimental equipment that he built in 1934 went into operation five years later, converting a portion of the telephone exchange network into an electronic data processing system, using thousands of vacuum tubes. In the US, John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed and tested the Atanasoff–Berry Computer (ABC) in 1942, the first "automatic electronic digital computer". This design was also all-electronic and used about 300 vacuum tubes, with capacitors fixed in a mechanically rotating drum for memory. During World War II, the British code-breakers at Bletchley Park achieved a number of successes at breaking encrypted German military communications. The German encryption machine, Enigma, was first attacked with the help of the electro-mechanical bombes, which were often run by women. To crack the more sophisticated German Lorenz SZ 40/42 machine, used for high-level Army communications, Max Newman and his colleagues commissioned Flowers to build the Colossus. He spent eleven months from early February 1943 designing and building the first Colossus. After a functional test in December 1943, Colossus was shipped to Bletchley Park, where it was delivered on 18 January 1944 and attacked its first message on 5 February. Colossus was the world's first electronic digital programmable computer. It used a large number of valves (vacuum tubes). It had paper-tape input and was capable of being configured to perform a variety of boolean logical operations on its data, but it was not Turing-complete. Nine Mk II Colossi were built (the Mk I was converted to a Mk II, making ten machines in total).
Colossus Mark I contained 1,500 thermionic valves (tubes), but Mark II, with 2,400 valves, was both five times faster and simpler to operate than Mark I, greatly speeding the decoding process. The ENIAC (Electronic Numerical Integrator and Computer) was the first electronic programmable computer built in the U.S. Although the ENIAC was similar to the Colossus, it was much faster, more flexible, and Turing-complete. Like the Colossus, a "program" on the ENIAC was defined by the states of its patch cables and switches, a far cry from the stored program electronic machines that came later. Once a program was written, it had to be mechanically set into the machine with manual resetting of plugs and switches. The programmers of the ENIAC were six women, often known collectively as the "ENIAC girls". It combined the high speed of electronics with the ability to be programmed for many complex problems. It could add or subtract 5000 times a second, a thousand times faster than any other machine. It also had modules for multiplication, division, and square roots. High speed memory was limited to 20 words (about 80 bytes). Built under the direction of John Mauchly and J. Presper Eckert at the University of Pennsylvania, ENIAC's development and construction lasted from 1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons, using 200 kilowatts of electric power, and containing over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors. The principle of the modern computer was proposed by Alan Turing in his seminal 1936 paper, On Computable Numbers. Turing proposed a simple device that he called the "Universal Computing machine" and that is now known as a universal Turing machine. He proved that such a machine is capable of computing anything that is computable by executing instructions (a program) stored on tape, allowing the machine to be programmable. The fundamental concept of Turing's design is the stored program, where all the instructions for computing are stored in memory. Von Neumann acknowledged that the central concept of the modern computer was due to this paper. Turing machines are to this day a central object of study in the theory of computation. Except for the limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to say, they have algorithm execution capability equivalent to a universal Turing machine. Early computing machines had fixed programs. Changing their function required the re-wiring and re-structuring of the machine. This changed with the proposal of the stored-program computer. A stored-program computer includes by design an instruction set and can store in memory a set of instructions (a program) that details the computation. The theoretical basis for the stored-program computer was laid out by Alan Turing in his 1936 paper. In 1945, Turing joined the National Physical Laboratory and began work on developing an electronic stored-program digital computer. His 1945 report "Proposed Electronic Calculator" was the first specification for such a device. John von Neumann at the University of Pennsylvania also circulated his First Draft of a Report on the EDVAC in 1945. The Manchester Baby was the world's first stored-program computer. It was built at the University of Manchester in England by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948.
It was designed as a testbed for the Williams tube, the first random-access digital storage device. Although the computer was described as "small and primitive" by a 1998 retrospective, it was the first working machine to contain all of the elements essential to a modern electronic computer. As soon as the Baby had demonstrated the feasibility of its design, a project began at the university to develop it into a practically useful computer, the Manchester Mark 1. The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the world's first commercially available general-purpose computer. Built by Ferranti, it was delivered to the University of Manchester in February 1951. At least seven of these later machines were delivered between 1953 and 1957, one of them to Shell labs in Amsterdam. In October 1947 the directors of British catering company J. Lyons & Company decided to take an active role in promoting the commercial development of computers. Lyons's LEO I computer, modelled closely on the Cambridge EDSAC of 1949, became operational in April 1951 and ran the world's first routine office computer job. The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John Bardeen and Walter Brattain, while working under William Shockley at Bell Labs, built the first working transistor, the point-contact transistor, in 1947, which was followed by Shockley's bipolar junction transistor in 1948. From 1955 onwards, transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Compared to vacuum tubes, transistors have many advantages: they are smaller and require less power, so they give off less heat. Junction transistors were much more reliable than vacuum tubes and had a longer, effectively indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialized applications. At the University of Manchester, a team under the leadership of Tom Kilburn designed and built a machine using the newly developed transistors instead of valves. Their first transistorized computer, and the first in the world, was operational by 1953, and a second version was completed there in April 1955. However, the machine did make use of valves to generate its 125 kHz clock waveforms and in the circuitry to read and write on its magnetic drum memory, so it was not the first completely transistorized computer. That distinction goes to the Harwell CADET of 1955, built by the electronics division of the Atomic Energy Research Establishment at Harwell. The metal–oxide–semiconductor field-effect transistor (MOSFET), also known as the MOS transistor, was invented at Bell Labs between 1955 and 1960 and was the first truly compact transistor that could be miniaturized and mass-produced for a wide range of uses. With its high scalability, much lower power consumption, and higher density compared with bipolar junction transistors, the MOSFET made it possible to build high-density integrated circuits. In addition to data processing, it also enabled the practical use of MOS transistors as memory cell storage elements, leading to the development of MOS semiconductor memory, which replaced earlier magnetic-core memory in computers.
The MOSFET led to the microcomputer revolution, and became the driving force behind the computer revolution. The MOSFET is the most widely used transistor in computers, and is the fundamental building block of digital electronics. The next great advance in computing power came with the advent of the integrated circuit (IC). The idea of the integrated circuit was first conceived by a radar scientist working for the Royal Radar Establishment of the Ministry of Defence, Geoffrey W.A. Dummer. Dummer presented the first public description of an integrated circuit at the Symposium on Progress in Quality Electronic Components in Washington, D.C., on 7 May 1952. The first working ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor. Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958. In his patent application of 6 February 1959, Kilby described his new device as "a body of semiconductor material ... wherein all the components of the electronic circuit are completely integrated". However, Kilby's invention was a hybrid integrated circuit (hybrid IC), rather than a monolithic integrated circuit (IC) chip. Kilby's IC had external wire connections, which made it difficult to mass-produce. Noyce came up with his own idea of an integrated circuit half a year later than Kilby. Noyce's invention was the first true monolithic IC chip. His chip solved many practical problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium. Noyce's monolithic IC was fabricated using the planar process, developed by his colleague Jean Hoerni in early 1959. In turn, the planar process was based on the work of Carl Frosch and Lincoln Derick on semiconductor surface passivation by silicon dioxide. Modern monolithic ICs are predominantly MOS (metal–oxide–semiconductor) integrated circuits, built from MOSFETs (MOS transistors). The earliest experimental MOS IC to be fabricated was a 16-transistor chip built by Fred Heiman and Steven Hofstein at RCA in 1962. General Microelectronics later introduced the first commercial MOS IC in 1964, developed by Robert Norman. Following the development of the self-aligned gate (silicon-gate) MOS transistor by Robert Kerwin, Donald Klein and John Sarace at Bell Labs in 1967, the first silicon-gate MOS IC with self-aligned gates was developed by Federico Faggin at Fairchild Semiconductor in 1968. The MOSFET has since become the most critical device component in modern ICs. The development of the MOS integrated circuit led to the invention of the microprocessor, and heralded an explosion in the commercial and personal use of computers. While the subject of exactly which device was the first microprocessor is contentious, partly due to lack of agreement on the exact definition of the term "microprocessor", it is largely undisputed that the first single-chip microprocessor was the Intel 4004, designed and realized by Federico Faggin with his silicon-gate MOS IC technology, along with Ted Hoff, Masatoshi Shima and Stanley Mazor at Intel.[b] In the early 1970s, MOS IC technology enabled the integration of more than 10,000 transistors on a single chip. Systems on a chip (SoCs) are complete computers on a single microchip (or chip) the size of a coin. They may or may not have integrated RAM and flash memory.
If not integrated, the RAM is usually placed directly above (known as Package on package) or below (on the opposite side of the circuit board) the SoC, and the flash memory is usually placed right next to the SoC. This is done to improve data transfer speeds, as the data signals do not have to travel long distances. Since ENIAC in 1945, computers have advanced enormously, with modern SoCs (such as the Snapdragon 865) being the size of a coin while also being hundreds of thousands of times more powerful than ENIAC, integrating billions of transistors, and consuming only a few watts of power. The first mobile computers were heavy and ran from mains power. The 50 lb (23 kg) IBM 5100 was an early example. Later portables such as the Osborne 1 and Compaq Portable were considerably lighter but still needed to be plugged in. The first laptops, such as the Grid Compass, removed this requirement by incorporating batteries – and with the continued miniaturization of computing resources and advancements in portable battery life, portable computers grew in popularity in the 2000s. The same developments allowed manufacturers to integrate computing resources into cellular mobile phones by the early 2000s. These smartphones and tablets run on a variety of operating systems and recently became the dominant computing devices on the market. These are powered by systems on a chip (SoCs), complete computers on a microchip the size of a coin. Types Computers can be classified in a number of different ways. A computer does not need to be electronic, nor even have a processor, nor RAM, nor even a hard disk. While popular usage of the word "computer" is synonymous with a personal electronic computer,[c] a typical modern definition of a computer is: "A device that computes, especially a programmable [usually] electronic machine that performs high-speed mathematical or logical operations or that assembles, stores, correlates, or otherwise processes information." According to this definition, any device that processes information qualifies as a computer. Hardware The term hardware covers all of those parts of a computer that are tangible physical objects. Circuits, computer chips, graphics cards, sound cards, memory (RAM), motherboards, displays, power supplies, cables, keyboards, printers and "mice" input devices are all hardware. A general-purpose computer has four main components: the arithmetic logic unit (ALU), the control unit, the memory, and the input and output devices (collectively termed I/O). These parts are interconnected by buses, often made of groups of wires. Inside each of these parts are thousands to trillions of small electrical circuits which can be turned off or on by means of an electronic switch. Each circuit represents a bit (binary digit) of information so that when the circuit is on it represents a "1", and when off it represents a "0" (in positive logic representation). The circuits are arranged in logic gates so that one or more of the circuits may control the state of one or more of the other circuits, as the short sketch below illustrates. Input devices are the means by which the operations of a computer are controlled and it is provided with data. Output devices are the means by which a computer provides the results of its calculations in a human-accessible form.
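As a toy illustration of how circuits arranged as logic gates let one circuit control others, the sketch below (Python; purely illustrative, with function names of my own choosing) models each gate as a function on bits and composes NAND, which is universal, into NOT, AND, and OR:

    # Model each logic gate as a function from input bits to an output bit.
    def nand(a: int, b: int) -> int:
        return 0 if (a == 1 and b == 1) else 1   # output is 0 only when both inputs are 1

    def not_(a: int) -> int:
        return nand(a, a)

    def and_(a: int, b: int) -> int:
        return not_(nand(a, b))

    def or_(a: int, b: int) -> int:
        return nand(not_(a), not_(b))

    # The output of one gate feeds the input of another, just as the state
    # of one circuit controls other circuits in hardware.
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", "AND:", and_(a, b), "OR:", or_(a, b))

In real hardware the same compositions are built from transistor switches rather than function calls, but the controlling relationship between circuits is the same.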
The control unit (often called a control system or central controller) manages the computer's various components; it reads and interprets (decodes) the program instructions, transforming them into control signals that activate other parts of the computer.[e] Control systems in advanced computers may change the order of execution of some instructions to improve performance. A key component common to all CPUs is the program counter, a special memory cell (a register) that keeps track of which location in memory the next instruction is to be read from.[f] The control system's function is as follows (this is a simplified description, and some of these steps may be performed concurrently or in a different order depending on the type of CPU): read the next instruction from the cell indicated by the program counter; decode its numerical code into commands or signals for the other systems; increment the program counter; read any data the instruction requires from memory; supply that data to the ALU or registers; have the ALU or other hardware carry out the operation; write the result back to memory; and repeat. Since the program counter is (conceptually) just another set of memory cells, it can be changed by calculations done in the ALU. Adding 100 to the program counter would cause the next instruction to be read from a place 100 locations further down the program. Instructions that modify the program counter are often known as "jumps" and allow for loops (instructions that are repeated by the computer) and often conditional instruction execution (both examples of control flow). The sequence of operations that the control unit goes through to process an instruction is in itself like a short computer program, and indeed, in some more complex CPU designs, there is yet another smaller computer called a microsequencer, which runs a microcode program that causes all of these events to happen. The control unit, ALU, and registers are collectively known as a central processing unit (CPU). Early CPUs were composed of many separate components. Since the 1970s, CPUs have typically been constructed on a single MOS integrated circuit chip called a microprocessor. The ALU is capable of performing two classes of operations: arithmetic and logic. The set of arithmetic operations that a particular ALU supports may be limited to addition and subtraction, or might include multiplication, division, trigonometric functions such as sine, cosine, etc., and square roots. Some can operate only on whole numbers (integers) while others use floating point to represent real numbers, albeit with limited precision. However, any computer that is capable of performing just the simplest operations can be programmed to break down the more complex operations into simple steps that it can perform. Therefore, any computer can be programmed to perform any arithmetic operation—although it will take more time to do so if its ALU does not directly support the operation. An ALU may also compare numbers and return Boolean truth values (true or false) depending on whether one is equal to, greater than or less than the other ("is 64 greater than 65?"). Logic operations involve Boolean logic: AND, OR, XOR, and NOT. These can be useful for creating complicated conditional statements and processing Boolean logic. Superscalar computers may contain multiple ALUs, allowing them to process several instructions simultaneously. Graphics processors and computers with SIMD and MIMD features often contain ALUs that can perform arithmetic on vectors and matrices. A computer's memory can be viewed as a list of cells into which numbers can be placed or read. Each cell has a numbered "address" and can store a single number.
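These pieces can be tied together in a minimal sketch (Python; the three-instruction set and its numeric opcodes are invented for illustration and do not correspond to any real architecture): memory is one list of numbered cells holding both instructions and data, the program counter selects the next instruction, and a jump is nothing more than a write to the program counter:

    ADD, JUMP_IF_ZERO, HALT = 1, 2, 3            # invented opcodes

    def run(memory):
        pc = 0                                   # program counter
        while True:
            opcode = memory[pc]                  # fetch
            if opcode == ADD:                    # ADD src1 src2 dst
                a, b, dst = memory[pc + 1:pc + 4]        # decode operands
                memory[dst] = memory[a] + memory[b]      # execute
                pc += 4
            elif opcode == JUMP_IF_ZERO:         # JUMP_IF_ZERO addr target
                addr, target = memory[pc + 1:pc + 3]
                pc = target if memory[addr] == 0 else pc + 3   # a jump is a write to pc
            elif opcode == HALT:
                return memory

    # Program: add the contents of cell 8 and cell 9 into cell 10, then halt.
    prog = [ADD, 8, 9, 10,    # cells 0-3: one ADD instruction
            HALT,             # cell 4
            0, 0, 0,          # cells 5-7: unused padding
            2, 40, 0]         # cells 8-10: data
    print(run(prog)[10])      # prints 42

Because instructions and data share the same list of cells, the program itself is just numbers, which anticipates the stored-program idea discussed later in this article.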
The computer can be instructed to "put the number 123 into the cell numbered 1357" or to "add the number that is in cell 1357 to the number that is in cell 2468 and put the answer into cell 1595." The information stored in memory may represent practically anything. Letters, numbers, even computer instructions can be placed into memory with equal ease. Since the CPU does not differentiate between different types of information, it is the software's responsibility to give significance to what the memory sees as nothing but a series of numbers. In almost all modern computers, each memory cell is set up to store binary numbers in groups of eight bits (called a byte). Each byte is able to represent 256 different numbers (2^8 = 256), either from 0 to 255 or −128 to +127. To store larger numbers, several consecutive bytes may be used (typically, two, four or eight). When negative numbers are required, they are usually stored in two's complement notation (sketched briefly below). Other arrangements are possible, but are usually not seen outside of specialized applications or historical contexts. A computer can store any kind of information in memory if it can be represented numerically. Modern computers have billions or even trillions of bytes of memory. The CPU contains a special set of memory cells called registers that can be read and written to much more rapidly than the main memory area. There are typically between two and one hundred registers depending on the type of CPU. Registers are used for the most frequently needed data items to avoid having to access main memory every time data is needed. As data is constantly being worked on, reducing the need to access main memory (which is often slow compared to the ALU and control units) greatly increases the computer's speed. Computer main memory comes in two principal varieties: random-access memory (RAM) and read-only memory (ROM). RAM can be read and written to whenever the CPU commands it, but ROM is preloaded with data and software that never changes, so the CPU can only read from it. ROM is typically used to store the computer's initial start-up instructions. In general, the contents of RAM are erased when the power to the computer is turned off, but ROM retains its data indefinitely. In a PC, the ROM contains a specialized program called the BIOS that orchestrates loading the computer's operating system from the hard disk drive into RAM whenever the computer is turned on or reset. In embedded computers, which frequently do not have disk drives, all of the required software may be stored in ROM. Software stored in ROM is often called firmware, because it is notionally more like hardware than software. Flash memory blurs the distinction between ROM and RAM, as it retains its data when turned off but is also rewritable. It is typically much slower than conventional ROM and RAM, however, so its use is restricted to applications where high speed is unnecessary.[g] In more sophisticated computers there may be one or more RAM cache memories, which are slower than registers but faster than main memory. Generally computers with this sort of cache are designed to move frequently needed data into the cache automatically, often without the need for any intervention on the programmer's part. I/O is the means by which a computer exchanges information with the outside world. Devices that provide input or output to the computer are called peripherals. On a typical personal computer, peripherals include input devices like the keyboard and mouse, and output devices such as the display and printer.
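Returning to number representation, the byte ranges mentioned above can be made concrete with a short sketch (Python; illustrative only), reading the same eight-bit pattern either as an unsigned value from 0 to 255 or, in two's complement, as a signed value from −128 to +127:

    def as_signed(byte: int) -> int:
        # Two's complement: patterns 128..255 wrap around to -128..-1.
        return byte - 256 if byte >= 128 else byte

    for byte in (0, 1, 127, 128, 255):
        print(f"{byte:08b}  unsigned={byte:3d}  signed={as_signed(byte):4d}")

    assert as_signed(0b11111111) == -1           # -1 is stored as all ones

The same 256 bit patterns are stored either way; only the interpretation applied by the software differs, which is exactly the point made above about memory holding nothing but numbers.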
Hard disk drives, floppy disk drives and optical disc drives serve as both input and output devices. Computer networking is another form of I/O. I/O devices are often complex computers in their own right, with their own CPU and memory. A graphics processing unit might contain fifty or more tiny computers that perform the calculations necessary to display 3D graphics. Modern desktop computers contain many smaller computers that assist the main CPU in performing I/O. A 2016-era flat screen display contains its own computer circuitry. While a computer may be viewed as running one gigantic program stored in its main memory, in some systems it is necessary to give the appearance of running several programs simultaneously. This is achieved by multitasking, i.e. having the computer switch rapidly between running each program in turn. One means by which this is done is with a special signal called an interrupt, which can periodically cause the computer to stop executing instructions where it was and do something else instead. By remembering where it was executing prior to the interrupt, the computer can return to that task later. If several programs are running "at the same time", then the interrupt generator might be causing several hundred interrupts per second, causing a program switch each time. Since modern computers typically execute instructions several orders of magnitude faster than human perception, it may appear that many programs are running at the same time, even though only one is ever executing in any given instant. This method of multitasking is sometimes termed "time-sharing" since each program is allocated a "slice" of time in turn (a toy sketch appears at the end of this section). Before the era of inexpensive computers, the principal use for multitasking was to allow many people to share the same computer. Seemingly, multitasking would cause a computer that is switching between several programs to run more slowly, in direct proportion to the number of programs it is running, but most programs spend much of their time waiting for slow input/output devices to complete their tasks. If a program is waiting for the user to click on the mouse or press a key on the keyboard, then it will not take a "time slice" until the event it is waiting for has occurred. This frees up time for other programs to execute so that many programs may be run simultaneously without unacceptable speed loss. Some computers are designed to distribute their work across several CPUs in a multiprocessing configuration, a technique once employed in only large and powerful machines such as supercomputers, mainframe computers and servers. Multiprocessor and multi-core (multiple CPUs on a single integrated circuit) personal and laptop computers are now widely available, and are being increasingly used in lower-end markets as a result. Supercomputers in particular often have highly distinctive architectures that differ significantly from the basic stored-program architecture and from general-purpose computers.[h] They often feature thousands of CPUs, customized high-speed interconnects, and specialized computing hardware. Such designs tend to be useful for only specialized tasks due to the large scale of program organization required to use most of the available resources at once. Supercomputers usually see usage in large-scale simulation, graphics rendering, and cryptography applications, as well as with other so-called "embarrassingly parallel" tasks.
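The time-sharing idea noted above can be caricatured in a few lines (Python; generator functions stand in for programs and a plain loop plays the role of the interrupt-driven switcher, so this is a sketch of the scheduling idea, not of a real operating system):

    from collections import deque

    def program(name, steps):
        for i in range(steps):
            print(f"{name}: step {i}")
            yield                     # suspended here; resumes on the next slice

    ready = deque([program("A", 3), program("B", 2)])
    while ready:
        task = ready.popleft()
        try:
            next(task)                # run one time slice
            ready.append(task)        # not finished: back of the queue
        except StopIteration:
            pass                      # program finished; drop it

Each generator remembers exactly where it was executing when suspended, just as a real computer remembers its place when an interrupt arrives, and the output interleaves the two "programs" one slice at a time.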
Software Software is the part of a computer system that consists of the encoded information that determines the computer's operation, such as data or instructions on how to process the data. In contrast to the physical hardware from which the system is built, software is immaterial. Software includes computer programs, libraries and related non-executable data, such as online documentation or digital media. It is often divided into system software and application software. Computer hardware and software require each other and neither is useful on its own. When software is stored in hardware that cannot easily be modified, such as with BIOS ROM in an IBM PC compatible computer, it is sometimes called "firmware". The defining feature of modern computers that distinguishes them from all other machines is that they can be programmed. That is to say that some type of instructions (the program) can be given to the computer, and it will process them. Modern computers based on the von Neumann architecture often have machine code in the form of an imperative programming language. In practical terms, a computer program may be just a few instructions or extend to many millions of instructions, as do the programs for word processors and web browsers, for example. A typical modern computer can execute billions of instructions per second (gigaflops) and rarely makes a mistake over many years of operation. Large computer programs consisting of several million instructions may take teams of programmers years to write, and due to the complexity of the task almost certainly contain errors. This section applies to most common RAM machine–based computers. In most cases, computer instructions are simple: add one number to another, move some data from one location to another, send a message to some external device, etc. These instructions are read from the computer's memory and are generally carried out (executed) in the order they were given. However, there are usually specialized instructions to tell the computer to jump ahead or backwards to some other place in the program and to carry on executing from there. These are called "jump" instructions (or branches). Furthermore, jump instructions may be made to happen conditionally so that different sequences of instructions may be used depending on the result of some previous calculation or some external event. Many computers directly support subroutines by providing a type of jump that "remembers" the location it jumped from and another instruction to return to the instruction following that jump instruction. Program execution might be likened to reading a book. While a person will normally read each word and line in sequence, they may at times jump back to an earlier place in the text or skip sections that are not of interest. Similarly, a computer may sometimes go back and repeat the instructions in some section of the program over and over again until some internal condition is met. This is called the flow of control within the program and it is what allows the computer to perform tasks repeatedly without human intervention. Comparatively, a person using a pocket calculator can perform a basic arithmetic operation such as adding two numbers with just a few button presses. But to add together all of the numbers from 1 to 1,000 would take thousands of button presses and a lot of time, with a near certainty of making a mistake. On the other hand, a computer may be programmed to do this with just a few simple instructions.
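Such a program can be written in the MIPS assembly language. The following is a minimal sketch (the register choices and labels are illustrative) that sums the integers from 1 to 1,000:

            .text
    main:   addi $t0, $zero, 0      # running sum = 0
            addi $t1, $zero, 1      # counter = 1
    loop:   slti $t2, $t1, 1001     # $t2 = 1 while counter <= 1000
            beq  $t2, $zero, done   # leave the loop once counter > 1000
            add  $t0, $t0, $t1      # sum = sum + counter
            addi $t1, $t1, 1        # counter = counter + 1
            j    loop               # jump back and repeat
    done:   add  $v0, $t0, $zero    # result (500500) left in $v0

For contrast, the same computation in a high-level language is a single statement (for example, print(sum(range(1, 1001))) in Python), which a compiler or interpreter expands into many machine instructions like those above.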
Once told to run this program, the computer will perform the repetitive addition task without further human intervention. It will almost never make a mistake and a modern PC can complete the task in a fraction of a second. In most computers, individual instructions are stored as machine code with each instruction being given a unique number (its operation code or opcode for short). The command to add two numbers together would have one opcode; the command to multiply them would have a different opcode, and so on. The simplest computers are able to perform any of a handful of different instructions; the more complex computers have several hundred to choose from, each with a unique numerical code. Since the computer's memory is able to store numbers, it can also store the instruction codes. This leads to the important fact that entire programs (which are just lists of these instructions) can be represented as lists of numbers and can themselves be manipulated inside the computer in the same way as numeric data, as the toy machine sketched earlier demonstrates in miniature. The fundamental concept of storing programs in the computer's memory alongside the data they operate on is the crux of the von Neumann, or stored program, architecture. In some cases, a computer might store some or all of its program in memory that is kept separate from the data it operates on. This is called the Harvard architecture after the Harvard Mark I computer. Modern von Neumann computers display some traits of the Harvard architecture in their designs, such as in CPU caches. While it is possible to write computer programs as long lists of numbers (machine language) and while this technique was used with many early computers,[i] it is extremely tedious and potentially error-prone to do so in practice, especially for complicated programs. Instead, each basic instruction can be given a short name that is indicative of its function and easy to remember – a mnemonic such as ADD, SUB, MULT or JUMP. These mnemonics are collectively known as a computer's assembly language. Converting programs written in assembly language into something the computer can actually understand (machine language) is usually done by a computer program called an assembler. A programming language is a notation system for writing the source code from which a computer program is produced. Programming languages provide various ways of specifying programs for computers to run. Unlike natural languages, programming languages are designed to permit no ambiguity and to be concise. They are purely written languages and are often difficult to read aloud. They are generally either translated into machine code by a compiler or an assembler before being run, or translated directly at run time by an interpreter. Sometimes programs are executed by a hybrid method of the two techniques. There are thousands of programming languages—some intended for general purpose programming, others useful for only highly specialized applications. Machine languages and the assembly languages that represent them (collectively termed low-level programming languages) are generally unique to the particular architecture of a computer's central processing unit (CPU).
For instance, an ARM architecture CPU (such as may be found in a smartphone or a hand-held video game console) cannot understand the machine language of an x86 CPU that might be in a PC.[j] Historically, a significant number of other CPU architectures were created and saw extensive use, notably including the MOS Technology 6502 and 6510 in addition to the Zilog Z80. Although considerably easier than in machine language, writing long programs in assembly language is often difficult and is also error prone. Therefore, most practical programs are written in more abstract high-level programming languages that are able to express the needs of the programmer more conveniently (and thereby help reduce programmer error). High-level languages are usually "compiled" into machine language (or sometimes into assembly language and then into machine language) using another computer program called a compiler.[k] High-level languages are less related to the workings of the target computer than assembly language, and more related to the language and structure of the problem(s) to be solved by the final program. It is therefore often possible to use different compilers to translate the same high-level language program into the machine language of many different types of computer. This is part of the means by which software like video games may be made available for different computer architectures such as personal computers and various video game consoles. Program design of small programs is relatively simple and involves the analysis of the problem, collection of inputs, using the programming constructs within languages, devising or using established procedures and algorithms, providing data for output devices, and applying solutions to the problem as applicable. As problems become larger and more complex, features such as subprograms, modules, formal documentation, and new paradigms such as object-oriented programming are encountered. Large programs involving thousands of lines of code and more require formal software methodologies. The task of developing large software systems presents a significant intellectual challenge. Producing software with an acceptably high reliability within a predictable schedule and budget has historically been difficult; the academic and professional discipline of software engineering concentrates specifically on this challenge. Errors in computer programs are called "bugs". They may be benign and not affect the usefulness of the program, or have only subtle effects. However, in some cases they may cause the program or the entire system to "hang", becoming unresponsive to input such as mouse clicks or keystrokes, to fail completely, or to crash. Otherwise benign bugs may sometimes be harnessed for malicious intent by an unscrupulous user writing an exploit, code designed to take advantage of a bug and disrupt a computer's proper execution. Bugs are usually not the fault of the computer. Since computers merely execute the instructions they are given, bugs are nearly always the result of programmer error or an oversight made in the program's design.[l] Admiral Grace Hopper, an American computer scientist and developer of the first compiler, is credited with having first used the term "bugs" in computing after a dead moth was found shorting a relay in the Harvard Mark II computer in September 1947. Networking and the Internet Computers have been used to coordinate information between multiple physical locations since the 1950s. The U.S.
military's SAGE system was the first large-scale example of such a system, which led to a number of special-purpose commercial systems such as Sabre. In the 1970s, computer engineers at research institutions throughout the United States began to link their computers together using telecommunications technology. The effort was funded by ARPA (now DARPA), and the computer network that resulted was called the ARPANET. Logic gates are a common abstraction which can apply to most digital or analog computing paradigms. The ability to store and execute lists of instructions called programs makes computers extremely versatile, distinguishing them from calculators. The Church–Turing thesis is a mathematical statement of this versatility: any computer with a minimum capability (being Turing-complete) is, in principle, capable of performing the same tasks that any other computer can perform. Therefore, any type of computer (netbook, supercomputer, cellular automaton, etc.) is able to perform the same computational tasks, given enough time and storage capacity. In the 20th century, artificial intelligence systems were predominantly symbolic: they executed code that was explicitly programmed by software developers. Machine learning models, however, have a set of parameters that are adjusted throughout training, so that the model learns to accomplish a task based on the provided data. The efficiency of machine learning (and in particular of neural networks) has rapidly improved with progress in hardware for parallel computing, mainly graphics processing units (GPUs). Some large language models are able to control computers or robots. AI progress may lead to the creation of artificial general intelligence (AGI), a type of AI that could accomplish virtually any intellectual task at least as well as humans. Professions and organizations As the use of computers has spread throughout society, there are an increasing number of careers involving computers. The need for computers to work well together and to be able to exchange information has spawned the need for many standards organizations, clubs and societies of both a formal and informal nature. See also Notes References Sources External links |
======================================== |
[SOURCE: https://en.wikipedia.org/w/index.php?title=Special:CiteThisPage&page=Fast.ai&id=1329908597&wpFormIdentifier=titleform] | [TOKENS: 561] |
Contents Cite This Page IMPORTANT NOTE: Most educators and professionals do not consider it appropriate to use tertiary sources such as encyclopedias as a sole source for any information—citing an encyclopedia as an important reference in footnotes or bibliographies may result in censure or a failing grade. Wikipedia articles should be used for background information, as a reference for correct terminology and search terms, and as a starting point for further research. As with any community-built reference, there is a possibility for error in Wikipedia's content—please check your facts against multiple sources and read our disclaimers for more information. Bibliographic details for "Fast.ai" Please remember to check your manual of style, standards guide or instructor's guidelines for the exact syntax to suit your needs. For more detailed advice, see Citing Wikipedia. Citation styles for "Fast.ai" Wikipedia contributors. (2025, December 28). Fast.ai. In Wikipedia, The Free Encyclopedia. Retrieved 10:52, February 21, 2026, from https://en.wikipedia.org/w/index.php?title=Fast.ai&oldid=1329908597 Wikipedia contributors. "Fast.ai." Wikipedia, The Free Encyclopedia. Wikipedia, The Free Encyclopedia, 28 Dec. 2025. Web. 21 Feb. 2026. Wikipedia contributors, 'Fast.ai', Wikipedia, The Free Encyclopedia, 28 December 2025, 15:24 UTC, <https://en.wikipedia.org/w/index.php?title=Fast.ai&oldid=1329908597> [accessed 21 February 2026] Wikipedia contributors, "Fast.ai," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Fast.ai&oldid=1329908597 (accessed February 21, 2026). Wikipedia contributors. Fast.ai [Internet]. Wikipedia, The Free Encyclopedia; 2025 Dec 28, 15:24 UTC [cited 2026 Feb 21]. Available from: https://en.wikipedia.org/w/index.php?title=Fast.ai&oldid=1329908597. Fast.ai, https://en.wikipedia.org/w/index.php?title=Fast.ai&oldid=1329908597 (last visited Feb. 21, 2026). Wikipedia contributors. Fast.ai. Wikipedia, The Free Encyclopedia. December 28, 2025, 15:24 UTC. Available at: https://en.wikipedia.org/w/index.php?title=Fast.ai&oldid=1329908597. Accessed February 21, 2026. When using the LaTeX package url (\usepackage{url} somewhere in the preamble), which tends to give much more nicely formatted web addresses, the following may be preferred: |
======================================== |