[SOURCE: https://en.wikipedia.org/wiki/Al-Burj,_Ramle]
Al-Burj, Ramle

Al-Burj (Arabic: البرج) was a Palestinian Arab village 14 km east of Ramle, close to the highway to Ramallah, which was depopulated in 1948. Its name, "the tower", is believed to derive from the Crusader castle, Castle Arnold, built on the site. Victorian visitors in the 19th century recorded seeing Crusader ruins close to the village.

Etymology

The name "al-Burj" is of Arabic origin and means "the tower". The name refers to the site's Crusader keep.

History

A Byzantine lintel was found in the village in the 1870s, with "a Greek cross inscribed in a circle, and having its four arms ornamented with curious facet-work." Charles Clermont-Ganneau suggested al-Burj as the site of the Castellum Arnoldi, near Beit Nuba, 'in primes auspices campestrum,' built in 1131 A.D. by the Patriarch of Jerusalem to protect the approach to that city (William of Tyre). Just west of Al-Burj is Kŭlảt et Tantûrah, "the castle of the peak", the remains of a tower with 5-meter-thick walls and a door to the east. It is possibly the Crusader castle called Tharenta, which had been under Muslim rule since 1187. While nearby Bayt Jiz has often been identified as the Crusader village of Gith, some scholars (Schmitt, 1980; Fischer, Isaac and Roll, 1996) have suggested that Gith was actually at Kŭlảt et Tantûrah.

In 1838 el-Burj was noted as a Muslim village in the Ibn Humar area of the District of Er-Ramleh. It was also noted as being small, "situated on an isolated hill surrounded by open vallies and plains", and that "there are here evident traces of an ancient site, apparently once fortified." In 1863 Victor Guérin found the village to have no more than 200 inhabitants, and noted that the Crusader fortress was in ruins. An Ottoman village list from about 1870 showed that Al-Burj had a population of 139 in a total of 31 houses, though that population count included only men. It was further noted that the village was located one hour from Beit Ur al-Tahta.
In 1873–74 Clermont-Ganneau noted that the village was closely connected with Bir Ma'in. In 1883, the PEF's Survey of Western Palestine (SWP) described Al-Burj as "a small village on a hill-top, with open ground beneath on all sides. There are remains of a Crusading fortress (Kulat et Tanturah), and the position is a strong one, near the main road to Lydda". By the beginning of the 20th century, former Bedouins from the 'Arab al-Jaramina tribe had settled in the village and in neighboring Bir Ma'in.

In the 1922 census of Palestine conducted by the British Mandate authorities, Al-Burj had a population of 344, all Muslims, increasing in the 1931 census to 370, still all Muslims, in a total of 92 houses. In the 1945 statistics, the village had a population of 480 Muslims, with a total land area of 4,708 dunams. Of these, 6 dunams were irrigated or used for orchards, 2,631 were used for cereals, and 12 dunams were built-up (urban) areas. An elementary school for boys was completed in 1947 with around 35 pupils.

Al-Burj was occupied by the Israeli Army on 15 July 1948, during the second phase of Operation Dani. The Arab Legion counterattacked the following day with two infantry platoons and ten armoured cars, but was forced to retreat. According to the Haganah, 30 Arabs were killed and four armoured vehicles captured, with 3 Israelis killed; Aref al-Aref records around 13 Legionnaires killed. Two elderly women and two elderly men remained in the village. On 23 July, one of them, a military cook, was sent out to pick vegetables. In his absence the other three were led to a house which, after an antitank shell missed it, was blown up with six grenades. Two died; the surviving woman was then executed, and the bodies were torched. The cook, on returning, did not believe the story that the others had been sent to a hospital in Ramallah, and some time later was executed with four bullets.

In 1992 the village site was described: "Only one crumbled house remains on the hilltop. Cactuses and wild plants grow on the site. The nearby settlements use the village for hothouse agriculture."

In 2002, Kawthar al-Amir published a 64-page book about Al-Burj. According to Rochelle Davis, the book is "innovatively styled for children, the descendants of the village who do not know about the village," and is written in "question and answer format, as a conversation between her and her granddaughter Bahiyya."
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/PlayStation_(console)#cite_ref-56]
PlayStation (console)

The PlayStation[a] (codenamed PSX, abbreviated as PS, and retroactively PS1 or PS one) is a home video game console developed and marketed by Sony Computer Entertainment. It was released in Japan on 3 December 1994, followed by North America on 9 September 1995, Europe on 29 September 1995, and other regions thereafter. As a fifth-generation console, the PlayStation primarily competed with the Nintendo 64 and the Sega Saturn.

Sony began developing the PlayStation after a failed venture with Nintendo to create a CD-ROM peripheral for the Super Nintendo Entertainment System in the early 1990s. The console was primarily designed by Ken Kutaragi and Sony Computer Entertainment in Japan, while additional development was outsourced to the United Kingdom. An emphasis on 3D polygon graphics was placed at the forefront of the console's design. PlayStation game production was designed to be streamlined and inclusive, enticing the support of many third-party developers. The console proved popular for its extensive game library, popular franchises, low retail price, and aggressive youth marketing which advertised it as the preferable console for adolescents and adults. Critically acclaimed games that defined the console include Gran Turismo, Crash Bandicoot, Spyro the Dragon, Tomb Raider, Resident Evil, Metal Gear Solid, Tekken 3, and Final Fantasy VII. Sony ceased production of the PlayStation on 23 March 2006, over eleven years after it had been released and in the same year that the PlayStation 3 debuted. More than 4,000 PlayStation games were released, with cumulative sales of 962 million units.

The PlayStation signaled Sony's rise to power in the video game industry. It received acclaim and sold strongly; in less than a decade, it became the first computer entertainment platform to ship over 100 million units. Its use of compact discs heralded the game industry's transition from cartridges.
The PlayStation's success led to a line of successors, beginning with the PlayStation 2 in 2000. In the same year, Sony released a smaller and cheaper model, the PS one.

History

The PlayStation was conceived by Ken Kutaragi, a Sony executive who managed a hardware engineering division and was later dubbed "the Father of the PlayStation". Kutaragi's interest in working with video games stemmed from seeing his daughter play games on Nintendo's Famicom. Kutaragi convinced Nintendo to use his SPC-700 sound processor in the Super Nintendo Entertainment System (SNES) through a demonstration of the processor's capabilities. His willingness to work with Nintendo derived both from his admiration of the Famicom and from his conviction that video game consoles would become the main home-use entertainment systems. Although Kutaragi was nearly fired because he had worked with Nintendo without Sony's knowledge, president Norio Ohga recognised the potential in Kutaragi's chip and decided to keep him as a protégé.

The inception of the PlayStation dates back to a 1988 joint venture between Nintendo and Sony. Nintendo had produced floppy disk technology to complement cartridges in the form of the Family Computer Disk System, and wanted to continue this complementary storage strategy for the SNES. Since Sony was already contracted to produce the SPC-700 sound processor for the SNES, Nintendo contracted Sony to develop a CD-ROM add-on, tentatively titled the "Play Station" or "SNES-CD". The PlayStation name had already been trademarked by Yamaha, but Nobuyuki Idei liked it so much that he agreed to acquire it for an undisclosed sum rather than search for an alternative. Sony was keen to obtain a foothold in the rapidly expanding video game market. Having been the primary manufacturer of the MSX home computer format, Sony wanted to use their experience in consumer electronics to produce their own video game hardware.
Although the initial agreement between Nintendo and Sony concerned a CD-ROM drive add-on, Sony had also planned to develop a SNES-compatible, Sony-branded console. This iteration was intended to be more of a home entertainment system, playing both SNES cartridges and a new CD format named the "Super Disc", which Sony would design. Under the agreement, Sony would retain sole international rights to every Super Disc game, giving them a large degree of control despite Nintendo's leading position in the video game market. Furthermore, Sony would also be the sole beneficiary of licensing related to music and film software, which it had been aggressively pursuing as a secondary application.

The Play Station was to be announced at the 1991 Consumer Electronics Show (CES) in Las Vegas. However, Nintendo president Hiroshi Yamauchi was wary of Sony's increasing leverage at this point and deemed the original 1988 contract unacceptable upon realising it essentially handed Sony control over all games written on the SNES CD-ROM format. Although Nintendo was dominant in the video game market, Sony possessed a superior research and development department. Wanting to protect Nintendo's existing licensing structure, Yamauchi cancelled all plans for the joint Nintendo–Sony SNES CD attachment without telling Sony. He sent Nintendo of America president Minoru Arakawa (his son-in-law) and chairman Howard Lincoln to Amsterdam to form a more favourable contract with the Dutch conglomerate Philips, Sony's rival. This contract would give Nintendo total control over their licences on all Philips-produced machines. Kutaragi and Nobuyuki Idei, Sony's director of public relations at the time, learned of Nintendo's actions two days before the CES was due to begin. Kutaragi telephoned numerous contacts, including Philips, to no avail. On the first day of the CES, Sony announced their partnership with Nintendo and their new console, the Play Station.
At 9 am the next day, in what has been called "the greatest ever betrayal" in the industry, Howard Lincoln stepped onto the stage and revealed that Nintendo was now allied with Philips and would abandon its work with Sony. Incensed by Nintendo's renouncement, Ohga and Kutaragi decided that Sony would develop their own console. Nintendo's contract-breaking was met with consternation in the Japanese business community, as the company had broken an "unwritten law" that native companies do not turn against each other in favour of foreign ones. Sony's American branch considered allying with Sega to produce a CD-ROM-based machine called the Sega Multimedia Entertainment System, but the Sega board of directors in Tokyo vetoed the idea when Sega of America CEO Tom Kalinske presented them the proposal. Kalinske recalled them saying: "That's a stupid idea, Sony doesn't know how to make hardware. They don't know how to make software either. Why would we want to do this?" Sony briefly halted its research, but then decided to develop what it had created with Nintendo and Sega into a console of its own based on the SNES.

Despite the tumultuous events at the 1991 CES, negotiations between Nintendo and Sony were still ongoing. A deal was proposed: the Play Station would still have a port for SNES games, on the condition that it would still use Kutaragi's audio chip and that Nintendo would own the rights and receive the bulk of the profits. Roughly two hundred prototype machines were created, and some software entered development. Many within Sony were still opposed to involvement in the video game industry, with some resenting Kutaragi for jeopardising the company. Kutaragi remained adamant that Sony not retreat from the growing industry and that a deal with Nintendo would never work. Knowing that decisive action was needed, Sony severed all ties with Nintendo on 4 May 1992.
To determine the fate of the PlayStation project, Ohga chaired a meeting in June 1992, consisting of Kutaragi and several senior Sony board members. Kutaragi unveiled a proprietary CD-ROM-based system he had been secretly working on, which played games with immersive 3D graphics. Kutaragi was confident that his LSI chip could accommodate one million logic gates, which exceeded the capabilities of Sony's semiconductor division at the time. Despite gaining Ohga's enthusiasm, the project still faced opposition from a majority of those present at the meeting. Older Sony executives, who saw Nintendo and Sega as "toy" manufacturers, also opposed it. The opposers felt the game industry was too culturally offbeat and asserted that Sony should remain a central player in the audiovisual industry, where companies were familiar with one another and could conduct "civili[s]ed" business negotiations. After Kutaragi reminded him of the humiliation he had suffered from Nintendo, Ohga retained the project and became one of Kutaragi's most staunch supporters.

Ohga shifted Kutaragi and nine of his team from Sony's main headquarters to Sony Music Entertainment Japan (SMEJ), a subsidiary of the main Sony group, so as to retain the project and maintain relationships with Philips for the MMCD development project. The involvement of SMEJ proved crucial to the PlayStation's early development, as the process of manufacturing games on CD-ROM was similar to that used for audio CDs, with which Sony's music division had considerable experience. While at SMEJ, Kutaragi worked with Epic/Sony Records founder Shigeo Maruyama and Akira Sato; both later became vice-presidents of the division that ran the PlayStation business. Sony Computer Entertainment (SCE) was jointly established by Sony and SMEJ to handle the company's ventures into the video game industry. On 27 October 1993, Sony publicly announced that it was entering the game console market with the PlayStation.
According to Maruyama, there was uncertainty over whether the console should primarily focus on 2D, sprite-based graphics or 3D polygon graphics. After Sony witnessed the success of Sega's Virtua Fighter (1993) in Japanese arcades, the direction of the PlayStation became "instantly clear" and 3D polygon graphics became the console's primary focus. SCE president Teruhisa Tokunaka expressed gratitude for Sega's timely release of Virtua Fighter, as it proved "just at the right time" that making games with 3D imagery was possible. Maruyama claimed that Sony further wanted to emphasise the new console's ability to utilise Red Book audio from the CD-ROM format in its games, alongside high-quality visuals and gameplay.

Wishing to distance the project from the failed enterprise with Nintendo, Sony initially branded the PlayStation the "PlayStation X" (PSX). Sony formed their European and North American divisions, Sony Computer Entertainment Europe (SCEE) and Sony Computer Entertainment America (SCEA), in January and May 1995 respectively. The divisions planned to market the new console under the alternative branding "PSX" following negative feedback regarding "PlayStation" in focus group studies. Early advertising prior to the console's launch in North America referenced PSX, but the term was scrapped before launch. In contrast to Nintendo's consoles, the console was not marketed under Sony's name. According to Phil Harrison, much of Sony's upper management feared that the Sony brand would be tarnished if associated with the console, which they considered a "toy".

Since Sony had no experience in game development, it had to rely on the support of third-party game developers. This was in contrast to Sega and Nintendo, which had versatile and well-equipped in-house software divisions for their arcade games and could easily port successful games to their home consoles.
Recent consoles like the Atari Jaguar and 3DO had suffered low sales due to a lack of developer support, prompting Sony to redouble its efforts to gain the endorsement of arcade-savvy developers. A team from Epic/Sony visited more than a hundred companies throughout Japan in May 1993 in hopes of attracting game creators with the PlayStation's technological appeal. Sony found that many disliked Nintendo's practices, such as favouring its own games over others. Through a series of negotiations, Sony acquired initial support from Namco, Konami, and Williams Entertainment, as well as 250 other development teams in Japan alone. Namco in particular was interested in developing for the PlayStation, since Namco rivalled Sega in the arcade market. Signing these companies secured influential games such as Ridge Racer (1993) and Mortal Kombat 3 (1995). Ridge Racer was one of the most popular arcade games at the time, and despite Namco being a longstanding Nintendo developer, it had already been confirmed behind closed doors by December 1993 that Ridge Racer would be the PlayStation's first game. Namco's research managing director Shigeichi Nakamura met with Kutaragi in 1993 to discuss the preliminary PlayStation specifications, with Namco subsequently basing the Namco System 11 arcade board on PlayStation hardware and developing Tekken to compete with Virtua Fighter. The System 11 launched in arcades several months before the PlayStation's release, with the arcade release of Tekken following in September 1994.

Despite securing the support of various Japanese studios, Sony had no developers of its own while the PlayStation was in development. This changed in 1993 when Sony acquired the Liverpudlian company Psygnosis (later renamed SCE Liverpool) for US$48 million, securing its first in-house development team. The acquisition meant that Sony could have more launch games ready for the PlayStation's release in Europe and North America.
Ian Hetherington, Psygnosis' co-founder, was disappointed after receiving early builds of the PlayStation and recalled that the console "was not fit for purpose" until his team got involved with it. Hetherington frequently clashed with Sony executives over broader ideas; at one point it was suggested that a television with a built-in PlayStation be produced. In the months leading up to the PlayStation's launch, Psygnosis had around 500 full-time staff working on games and assisting with software development.

The purchase of Psygnosis marked another turning point for the PlayStation, as it played a vital role in creating the console's development kits. While Sony had provided MIPS R4000-based Sony NEWS workstations for PlayStation development, Psygnosis employees disliked the thought of developing on these expensive workstations and asked Bristol-based SN Systems to create an alternative PC-based development system. Andy Beveridge and Martin Day, owners of SN Systems, had previously supplied development hardware for other systems such as the Mega Drive, Atari ST, and the SNES. When Psygnosis arranged an audience for SN Systems with Sony's Japanese executives at the January 1994 CES in Las Vegas, Beveridge and Day presented their prototype of a condensed development kit, which could run on an ordinary personal computer with two extension boards. Impressed, Sony decided to abandon its plans for a workstation-based development system in favour of SN Systems', thus securing a cheaper and more efficient method for designing software. An order of over 600 systems followed, and SN Systems supplied Sony with additional software such as an assembler, linker, and debugger. SN Systems produced development kits for future PlayStation systems, including the PlayStation 2, and was bought out by Sony in 2005.

Sony strived to make game production as streamlined and inclusive as possible, in contrast to the relatively isolated approach of Sega and Nintendo.
Phil Harrison, representative director of SCEE, believed that Sony's emphasis on developer assistance reduced the most time-consuming aspects of development. As well as providing programming libraries, SCE headquarters in London, California, and Tokyo housed technical support teams that could work closely with third-party developers if needed. Unlike Nintendo, Sony did not favour its own products over non-Sony ones; Peter Molyneux of Bullfrog Productions admired Sony's open-handed approach to software developers and lauded its decision to use PCs as a development platform, remarking that "[it was] like being released from jail in terms of the freedom you have". Another strategy that helped attract software developers was the PlayStation's use of the CD-ROM format instead of traditional cartridges. Nintendo cartridges were expensive to manufacture, and the company controlled all production, prioritising its own games, whereas inexpensive compact disc manufacturing was available at dozens of locations around the world.

The PlayStation's architecture and interconnectability with PCs was beneficial to many software developers. The use of the C programming language proved useful, as it safeguarded the future compatibility of the machine should Sony decide to make further hardware revisions. Despite this inherent flexibility, some developers found themselves restricted by the console's lack of RAM. While working on beta builds of the PlayStation, Molyneux observed that its MIPS processor was not "quite as bullish" as that of a fast PC, and said that it took his team two weeks to port their PC code to the PlayStation development kits and another fortnight to achieve a four-fold speed increase. An engineer from Ocean Software, one of Europe's largest game developers at the time, thought that allocating RAM was a challenging aspect given the 3.5 megabyte restriction.
Kutaragi said that while it would have been easy to double the amount of RAM in the PlayStation, the development team refrained from doing so to keep the retail cost down. Kutaragi saw the biggest challenge in developing the system as balancing the conflicting goals of high performance, low cost, and ease of programming, and felt that he and his team were successful in this regard. The console's technical specifications were finalised in 1993 and its design during 1994. The PlayStation name and final design were confirmed during a press conference on 10 May 1994, although the price and release dates had not yet been disclosed.

Sony released the PlayStation in Japan on 3 December 1994, a week after the release of the Sega Saturn, at a price of ¥39,800. Sales in Japan began with a "stunning" success, with long queues in shops. Ohga later recalled that he realised how important the PlayStation had become for Sony when friends and relatives begged him for consoles for their children. The PlayStation sold 100,000 units on the first day and two million units within six months, although the Saturn outsold the PlayStation in the first few weeks due to the success of Virtua Fighter. By the end of 1994, 300,000 PlayStation units had been sold in Japan compared to 500,000 Saturn units. A grey market emerged for PlayStations shipped from Japan to North America and Europe, with buyers of such consoles paying up to £700.

One retailer later recalled: "When September 1995 arrived and Sony's Playstation roared out of the gate, things immediately felt different than [sic] they did with the Saturn launch earlier that year. Sega dropped the Saturn $100 to match the Playstation's $299 debut price, but sales weren't even close—Playstations flew out the door as fast as we could get them in stock."

Before the release in North America, Sega and Sony presented their consoles at the first Electronic Entertainment Expo (E3) in Los Angeles on 11 May 1995.
At their keynote presentation, Sega of America CEO Tom Kalinske revealed that the Saturn would be released immediately to select retailers at a price of $399. Next came Sony's turn: Olaf Olafsson, the head of SCEA, summoned Steve Race, the head of development, to the conference stage, who simply said "$299" and left the stage to a round of applause. Attention on the Sony conference was further bolstered by the surprise appearance of Michael Jackson and the showcase of highly anticipated games, including Wipeout (1995), Ridge Racer and Tekken (1994). In addition, Sony announced that no games would be bundled with the console. Although the Saturn had been released early in the United States to gain an advantage over the PlayStation, the surprise launch upset many retailers who were not informed in time, harming sales. Some retailers, such as KB Toys, responded by dropping the Saturn entirely.

The PlayStation went on sale in North America on 9 September 1995. It sold more units within two days than the Saturn had in five months, with almost all of the initial shipment of 100,000 units sold in advance and shops across the country running out of consoles and accessories. The well-received Ridge Racer, which some critics considered superior to its Sega arcade counterpart Daytona USA (1994), contributed to the PlayStation's early success, as did Battle Arena Toshinden (1995). There were over 100,000 pre-orders placed and 17 games available on the market by the time of the PlayStation's American launch, compared to the Saturn's six launch games. The PlayStation was released in Europe on 29 September 1995 and in Australia on 15 November 1995. By November it had already outsold the Saturn by three to one in the United Kingdom, where Sony had allocated a £20 million marketing budget for the Christmas season compared to Sega's £4 million.
Sony found early success in the United Kingdom by securing listings with independent shop owners as well as prominent high street chains such as Comet and Argos. Within its first year, the PlayStation secured over 20% of the entire American video game market. From September to the end of 1995, sales in the United States amounted to 800,000 units, giving the PlayStation a commanding lead over the other fifth-generation consoles,[b] though the SNES and Mega Drive from the fourth generation still outsold it. Sony reported an attach rate of four games sold per console. To meet increasing demand, Sony chartered jumbo jets and ramped up production in Europe and North America. By early 1996, the PlayStation had grossed $2 billion (equivalent to $4.1 billion in 2025) from worldwide hardware and software sales. By late 1996, sales in Europe totalled 2.2 million units, including 700,000 in the UK. Approximately 400 PlayStation games were in development, compared to around 200 games for the Saturn and 60 for the Nintendo 64.

In India, the PlayStation was launched as a test market during 1999–2000 through Sony showrooms, selling 100 units. Sony finally launched the console (the PS One model) countrywide on 24 January 2002 at a price of Rs 7,990, with 26 games available from the start. The PlayStation also did well in markets where it was never officially released. In Brazil, a third company's registration of the trademark meant the console could not be released officially, so the market was initially taken over by the officially distributed Sega Saturn; as the Sega console was withdrawn, however, PlayStation imports and large-scale piracy increased. In China, the most popular 32-bit console had been the Sega Saturn, but after it left the market the PlayStation grew to a base of 300,000 users by January 2000, although Sony China had no plans to release it.
The PlayStation was backed by a successful marketing campaign, allowing Sony to gain an early foothold in Europe and North America. Initially, PlayStation demographics were skewed towards adults, but the audience broadened after the first price drop. While the Saturn was positioned towards 18- to 34-year-olds, the PlayStation was initially marketed exclusively towards teenagers. Executives from both Sony and Sega reasoned that because younger players typically looked up to older, more experienced players, advertising targeted at teens and adults would draw them in too. Additionally, Sony found that adults reacted best to advertising aimed at teenagers; Lee Clow surmised that people entering adulthood regressed and became "17 again" when they played video games. The console was marketed with advertising slogans such as "Live in Your World. Play in Ours." and "U R NOT E" (with a red E), stylised with geometric symbols replacing certain letters. The four geometric shapes were derived from the symbols for the four buttons on the controller. Clow thought that by invoking such provocative statements, gamers would respond to the contrary and say "'Bullshit. Let me show you how ready I am.'"

As the console's appeal broadened, Sony's marketing efforts expanded from their earlier focus on mature players to specifically target younger children as well. Shortly after the PlayStation's release in Europe, Sony tasked marketing manager Geoff Glendenning with assessing the desires of a new target audience. Sceptical of Nintendo and Sega's reliance on television campaigns, Glendenning theorised that young adults transitioning from fourth-generation consoles would feel neglected by marketing directed at children and teenagers. Recognising the influence early 1990s underground clubbing and rave culture had on young people, especially in the United Kingdom, Glendenning felt that the culture had become mainstream enough to help cultivate the PlayStation's emerging identity.
Sony partnered with prominent nightclubs such as Ministry of Sound and with festival promoters to organise dedicated PlayStation areas where demonstrations of select games could be played. Sheffield-based graphic design studio The Designers Republic was contracted by Sony to produce promotional materials aimed at a fashionable, club-going audience. Psygnosis' Wipeout in particular became associated with nightclub culture, as it was widely featured in venues. By 1997, there were 52 nightclubs in the United Kingdom with dedicated PlayStation rooms. Glendenning recalled that he had discreetly used at least £100,000 a year in slush fund money to invest in impromptu marketing.

In 1996, Sony expanded their CD production facilities in the United States due to the high demand for PlayStation games, increasing monthly output from 4 million discs to 6.5 million discs. This was necessary because PlayStation sales were running at twice the rate of Saturn sales, and its lead dramatically increased when both consoles dropped in price to $199 that year. The PlayStation also outsold the Saturn at a similar ratio in Europe during 1996, with 2.2 million consoles sold in the region by the end of the year. Sales figures for PlayStation hardware and software only increased following the launch of the Nintendo 64. Tokunaka speculated that the Nintendo 64 launch had actually helped PlayStation sales by raising public awareness of the gaming market through Nintendo's added marketing efforts. Despite this, the PlayStation took longer to achieve dominance in Japan. Tokunaka said that, even after the PlayStation and Saturn had been on the market for nearly two years, the competition between them was still "very close", with neither console having led in sales for any meaningful length of time. By 1998, Sega, spurred by its declining market share and significant financial losses, launched the Dreamcast in a last-ditch attempt to stay in the industry.
Although its launch was successful, the technically superior 128-bit console was unable to subdue Sony's dominance in the industry; Sony still held 60% of the overall video game market share in North America at the end of 1999. Sega's initial confidence in the new console was undermined when Japanese sales were lower than expected, with disgruntled Japanese consumers reportedly returning their Dreamcasts in exchange for PlayStation software. On 2 March 1999, Sony officially revealed details of the PlayStation 2, which Kutaragi announced would feature a graphics processor designed to push more raw polygons than any console in history, effectively rivalling most supercomputers. The PlayStation continued to sell strongly at the turn of the millennium: in July 2000, Sony released the PS one, a smaller, redesigned variant which went on to outsell all other consoles that year, including the PlayStation 2. In 2005, the PlayStation became the first console to ship 100 million units, with the PlayStation 2 later achieving this milestone faster than its predecessor. The combined successes of both PlayStation consoles led Sega to retire the Dreamcast in 2001 and abandon the console business entirely. The PlayStation was eventually discontinued on 23 March 2006, over eleven years after its release and less than a year before the debut of the PlayStation 3.

Hardware

The main microprocessor is an R3000 CPU made by LSI Logic, operating at a clock rate of 33.8688 MHz and delivering 30 MIPS. This 32-bit CPU relies heavily on the "cop2" 3D and matrix math coprocessor on the same die to provide the speed necessary to render complex 3D graphics. The role of the separate GPU chip is to draw 2D polygons and apply shading and textures to them: the rasterisation stage of the graphics pipeline. Sony's custom 16-bit sound chip supports ADPCM sources with up to 24 sound channels, and offers a sampling rate of up to 44.1 kHz and music sequencing.
It features 2 MB of main RAM, with an additional 1 MB of video RAM. The PlayStation has a maximum colour depth of 16.7 million true colours with 32 levels of transparency and unlimited colour look-up tables. The PlayStation can output composite, S-Video or RGB video signals through its AV Multi connector (with older models also having RCA connectors for composite), displaying resolutions from 256×224 to 640×480 pixels; different games can use different resolutions. Earlier models also had proprietary parallel and serial ports that could be used to connect accessories or multiple consoles together; these were later removed due to a lack of usage. The PlayStation uses a proprietary video compression unit, the MDEC, which is integrated into the CPU and allows for the presentation of full-motion video at a higher quality than other consoles of its generation. Unusually for the time, the PlayStation lacks a dedicated 2D graphics processor; 2D elements are instead calculated as polygons by the Geometry Transformation Engine (GTE) so that they can be processed and displayed on screen by the GPU. While running, the GPU can draw up to 4,000 sprites and 180,000 texture-mapped polygons per second, or 360,000 flat-shaded polygons per second. The PlayStation went through a number of variants during its production run. Externally, the most notable change was the gradual reduction in the number of external connectors on the rear of the unit. This started with the original Japanese launch units; the SCPH-1000, released on 3 December 1994, was the only model that had an S-Video port, as it was removed from the next model. Subsequent models saw a reduction in the number of parallel ports, with the final version retaining only one serial port. Sony marketed a development kit for amateur developers known as the Net Yaroze (meaning "Let's do it together" in Japanese). It was launched in June 1996 in Japan and, following public interest, was released the next year in other countries. 
The Net Yaroze allowed hobbyists to create their own games and upload them via an online forum run by Sony. The console was only available through an ordering service and came with the documentation, software, and C compilers necessary to program PlayStation games and applications. On 7 July 2000, Sony released the PS One (stylised as "PS one" or "PSone"), a smaller, redesigned version of the original PlayStation. It was the highest-selling console through the end of the year, outselling all other consoles—including the PlayStation 2. In 2002, Sony released a 5-inch (130 mm) LCD screen add-on for the PS One, referred to as the "Combo pack". It also included a car cigarette lighter adaptor, adding an extra layer of portability. Production of the LCD "Combo Pack" ceased in 2004, when the popularity of the PlayStation began to wane in markets outside Japan. A total of 28.15 million PS One units had been sold by the time it was discontinued in March 2006. Three iterations of the PlayStation's controller were released over the console's lifespan. The first controller, the PlayStation controller, was released alongside the PlayStation in December 1994. It features four individual directional buttons (as opposed to a conventional D-pad), a pair of shoulder buttons on both sides, Start and Select buttons in the centre, and four face buttons consisting of simple geometric shapes: a green triangle, red circle, blue cross, and pink square (△, ○, ✕, □). Rather than labelling its buttons with the letters or numbers traditionally used, the PlayStation controller established a set of symbols that would be incorporated heavily into the PlayStation brand. Teiyu Goto, the designer of the original PlayStation controller, said that the circle and cross represent "yes" and "no", respectively (though this layout is reversed in Western versions); the triangle symbolises a point of view and the square is equated to a sheet of paper to be used to access menus. 
The European and North American models of the original PlayStation controller are roughly 10% larger than the Japanese variant, to account for the fact that the average person in those regions has larger hands than the average Japanese person. Sony's first analogue gamepad, the PlayStation Analog Joystick (often erroneously referred to as the "Sony Flightstick"), was first released in Japan in April 1996. Featuring two parallel joysticks, it uses potentiometer technology previously used on consoles such as the Vectrex; instead of relying on binary eight-way switches, the controller detects minute angular changes through the entire range of motion. The stick also features a thumb-operated digital hat switch on the right joystick, corresponding to the traditional D-pad and used for instances when simple digital movements were necessary. The Analog Joystick sold poorly in Japan due to its high cost and cumbersome size. The increasing popularity of 3D games prompted Sony to add analogue sticks to its controller design to give users more freedom over their movements in virtual 3D environments. The first official analogue controller, the Dual Analog Controller, was revealed to the public in a small glass booth at the 1996 PlayStation Expo in Japan, and released in April 1997 to coincide with the Japanese releases of the analogue-capable games Tobal 2 and Bushido Blade. In addition to the two analogue sticks (which also introduced two new buttons mapped to clicking in the analogue sticks), the Dual Analog controller features an "Analog" button and LED beneath the "Start" and "Select" buttons which toggles analogue functionality on or off. The controller also features rumble support, though Sony decided that haptic feedback would be removed from all overseas iterations before the United States release. 
A Sony spokesman stated that the feature was removed for "manufacturing reasons", although rumours circulated that Nintendo had attempted to legally block the release of the controller outside Japan due to similarities with the Nintendo 64 controller's Rumble Pak. However, a Nintendo spokesman denied that Nintendo took legal action. Next Generation's Chris Charla theorised that Sony dropped vibration feedback to keep the price of the controller down. In November 1997, Sony introduced the DualShock controller. Its name derives from its use of two (dual) vibration motors (shock). Unlike its predecessor, the DualShock features textured rubber grips on its analogue sticks, longer handles, slightly different shoulder buttons, and rumble feedback included as standard on all versions. The DualShock later replaced its predecessors as the default controller. Sony released a series of peripherals to add extra layers of functionality to the PlayStation. Such peripherals include memory cards, the PlayStation Mouse, the PlayStation Link Cable, the Multiplayer Adapter (a four-player multitap), the Memory Drive (a disk drive for 3.5-inch floppy disks), the GunCon (a light gun), and the Glasstron (a monoscopic head-mounted display). Released exclusively in Japan, the PocketStation is a memory card peripheral which acts as a miniature personal digital assistant. The device features a monochrome liquid crystal display (LCD), infrared communication capability, a real-time clock, built-in flash memory, and sound capability. Sharing similarities with the Dreamcast's VMU peripheral, the PocketStation was typically distributed with certain PlayStation games, enhancing them with added features. The PocketStation proved popular in Japan, selling over five million units. Sony planned to release the peripheral outside Japan but the release was cancelled, despite receiving promotion in Europe and North America. In addition to playing games, most PlayStation models are equipped to play CD-Audio. 
The Asian model SCPH-5903 can also play Video CDs. Like most CD players, the PlayStation can play songs in a programmed order, shuffle the playback order of the disc and repeat one song or the entire disc. Later PlayStation models use a music visualisation function called SoundScope. This function, as well as a memory card manager, is accessed by starting the console without either inserting a game or closing the CD tray, thereby accessing a graphical user interface (GUI) for the PlayStation BIOS. The GUI for the PS One and PlayStation differs depending on the firmware version: the original PlayStation GUI had a dark blue background with rainbow graffiti used as buttons, while the early PAL PlayStation and PS One GUI had a grey blocked background with two icons in the middle. PlayStation emulation is versatile and can be run on numerous modern devices. Bleem! was a commercial emulator released for IBM-compatible PCs and the Dreamcast in 1999. It was notable for being aggressively marketed during the PlayStation's lifetime, and was the centre of multiple controversial lawsuits filed by Sony. Bleem! was programmed in assembly language, which allowed it to emulate PlayStation games with improved visual fidelity, enhanced resolutions, and filtered textures that were not possible on the original hardware. Sony sued Bleem! two days after its release, citing copyright infringement and accusing the company of engaging in unfair competition and patent infringement by allowing use of PlayStation BIOSes on a Sega console. Bleem! was subsequently forced to shut down in November 2001. Sony was aware that using CDs for game distribution could leave games vulnerable to piracy, due to the growing popularity of CD-R discs and optical disc drives with burning capability. 
To preclude illegal copying, a proprietary process for PlayStation disc manufacturing was developed that, in conjunction with an augmented optical drive assembly, prevented burned copies of games from booting on an unmodified console. Specifically, all genuine PlayStation discs were printed with a small section of deliberately irregular data, which the PlayStation's optical pick-up was capable of detecting and decoding. Consoles would not boot game discs without a specific wobble frequency contained in the data of the disc's pregap sector (the same system was also used to encode discs' regional lockouts). This signal was within Red Book CD tolerances, so PlayStation discs' actual content could still be read by a conventional disc drive; however, the disc drive could not detect the wobble frequency (and therefore duplicated discs omitted it), since the laser pick-up system of any optical disc drive would interpret the wobble as an oscillation of the disc surface and compensate for it in the reading process. Early PlayStations, particularly early 1000 models, can exhibit skipping full-motion video or physical "ticking" noises from the unit. The problems stem from poorly placed vents leading to overheating in some environments, causing the plastic mouldings inside the console to warp slightly and create knock-on effects with the laser assembly. The solution is to sit the console on a surface which dissipates heat efficiently in a well-ventilated area, or to raise the unit slightly from its resting surface. Sony representatives also recommended unplugging the PlayStation when it is not in use, as the system draws a small amount of power (and therefore generates heat) even when turned off. The first batch of PlayStations use a KSM-440AAM laser unit, whose case and movable parts are all built out of plastic. Over time, the plastic lens sled rail wears out—usually unevenly—due to friction. 
The placement of the laser unit close to the power supply accelerates wear, due to the additional heat, which makes the plastic more vulnerable to friction. Eventually, one side of the lens sled becomes so worn that the laser can tilt, no longer pointing directly at the CD; after this, games will no longer load due to data read errors. Sony fixed the problem by making the sled out of die-cast metal and placing the laser unit further away from the power supply on later PlayStation models. Due to an engineering oversight, the PlayStation does not produce a proper signal on several older models of televisions, causing the display to flicker or bounce around the screen. Sony decided not to change the console design, since only a small percentage of PlayStation owners used such televisions, and instead gave consumers the option of sending their PlayStation unit to a Sony service centre to have an official modchip installed, allowing play on older televisions. Game library The PlayStation featured a diverse game library which grew to appeal to all types of players. Critically acclaimed PlayStation games included Final Fantasy VII (1997), Crash Bandicoot (1996), Spyro the Dragon (1998), and Metal Gear Solid (1998), all of which became established franchises. Final Fantasy VII is credited with allowing role-playing games to gain mass-market appeal outside Japan, and is considered one of the most influential and greatest video games ever made. The PlayStation's bestselling game is Gran Turismo (1997), which sold 10.85 million units. After the PlayStation's discontinuation in 2006, the cumulative software shipment was 962 million units. Following its 1994 launch in Japan, early games included Ridge Racer, Crime Crackers, King's Field, Motor Toon Grand Prix, Toh Shin Den (i.e. Battle Arena Toshinden), and Kileak: The Blood. The first two games available at its later North American launch were Jumping Flash! (1995) and Ridge Racer, with Jumping Flash! 
heralded as a precursor of 3D graphics in console gaming. Wipeout, Air Combat, Twisted Metal, Warhawk and Destruction Derby were among the popular first-year games, and the first to be reissued as part of Sony's Greatest Hits or Platinum range. At the time of the PlayStation's first Christmas season, Psygnosis had produced around 70% of its launch catalogue; their breakthrough racing game Wipeout was acclaimed for its techno soundtrack and helped raise awareness of Britain's underground music community. Eidos Interactive's action-adventure game Tomb Raider contributed substantially to the success of the console in 1996, with its main protagonist Lara Croft becoming an early gaming icon and garnering unprecedented media promotion. Licensed tie-in video games of popular films were also prevalent; Argonaut Games' 2001 adaptation of Harry Potter and the Philosopher's Stone went on to sell over eight million copies late in the console's lifespan. Third-party developers committed largely to the console's wide-ranging game catalogue even after the launch of the PlayStation 2; some of the notable exclusives in this era include Harry Potter and the Philosopher's Stone, Fear Effect 2: Retro Helix, Syphon Filter 3, C-12: Final Resistance, Dance Dance Revolution Konamix and Digimon World 3.[c] Sony assisted with game reprints as late as 2008 with Metal Gear Solid: The Essential Collection, the last PlayStation game officially released and licensed by Sony. Initially, in the United States, PlayStation games were packaged in long cardboard boxes, similar to non-Japanese 3DO and Saturn games. Sony later switched to the jewel case format typically used for audio CDs and Japanese video games, as this format took up less retailer shelf space (which was at a premium due to the large number of PlayStation games being released), and focus testing showed that most consumers preferred this format. Reception The PlayStation was mostly well received upon release. 
Critics in the West generally welcomed the new console; the staff of Next Generation reviewed the PlayStation a few weeks after its North American launch, commenting that, while the CPU is "fairly average", the supplementary custom hardware, such as the GPU and sound processor, is stunningly powerful. They praised the PlayStation's focus on 3D, and complimented the comfort of its controller and the convenience of its memory cards. Giving the system 4½ out of 5 stars, they concluded, "To succeed in this extremely cut-throat market, you need a combination of great hardware, great games, and great marketing. Whether by skill, luck, or just deep pockets, Sony has scored three out of three in the first salvo of this war." Albert Kim from Entertainment Weekly praised the PlayStation as a technological marvel rivalling the offerings of Sega and Nintendo. Famicom Tsūshin scored the console a 19 out of 40, lower than the Saturn's 24 out of 40, in May 1995. In a 1997 year-end review, a team of five Electronic Gaming Monthly editors gave the PlayStation scores of 9.5, 8.5, 9.0, 9.0, and 9.5—for all five editors, the highest score they gave to any of the five consoles reviewed in the issue. They lauded the breadth and quality of the games library, saying it had vastly improved over previous years due to developers mastering the system's capabilities in addition to Sony revising their stance on 2D and role-playing games. They also complimented the low price point of the games compared to the Nintendo 64's, and noted that it was the only console on the market that could be relied upon to deliver a solid stream of games for the coming year, primarily due to third-party developers almost unanimously favouring it over its competitors. Legacy SCE was an upstart in the video game industry in late 1994, as the video game market in the early 1990s was dominated by Nintendo and Sega. 
Nintendo had been the clear leader in the industry since the introduction of the Nintendo Entertainment System in 1985, and the Nintendo 64 was initially expected to maintain this position. The PlayStation's target audience included the generation which was the first to grow up with mainstream video games, along with 18- to 29-year-olds who were not the primary focus of Nintendo. By the late 1990s, Sony had become a highly regarded console brand due to the PlayStation, with a significant lead over second-place Nintendo, while Sega was relegated to a distant third. The PlayStation became the first "computer entertainment platform" to ship over 100 million units worldwide, with many critics attributing the console's success to third-party developers. It remains the sixth best-selling console of all time as of 2025[update], with a total of 102.49 million units sold. Around 7,900 individual games were published for the console during its 11-year life span, the second-most games ever produced for a console. Its success resulted in a significant financial boon for Sony, with profits from the video game division contributing 23% of the company's total. Sony's next-generation PlayStation 2, which is backward compatible with the PlayStation's DualShock controller and games, was announced in 1999 and launched in 2000. The PlayStation's lead in installed base and developer support paved the way for the success of its successor, which overcame the earlier launch of Sega's Dreamcast and then fended off competition from Microsoft's newcomer Xbox and Nintendo's GameCube. The PlayStation 2's immense success and the failure of the Dreamcast were among the main factors which led to Sega abandoning the console market. To date, five PlayStation home consoles have been released, which have continued the same numbering scheme, as well as two portable systems. The PlayStation 3 also maintained backward compatibility with original PlayStation discs. 
Hundreds of PlayStation games have been digitally re-released on the PlayStation Portable, PlayStation 3, PlayStation Vita, PlayStation 4, and PlayStation 5. The PlayStation has often ranked among the best video game consoles. In 2018, Retro Gamer named it the third best console, crediting its sophisticated 3D capabilities as one of the key factors in its mass success, and lauding it as a "game-changer in every sense possible". In 2009, IGN ranked the PlayStation the seventh best console in their list, noting its appeal to older audiences as a crucial factor in propelling the video game industry, as well as its role in transitioning the game industry to the CD-ROM format. Keith Stuart from The Guardian likewise named it the seventh best console in 2020, declaring that its success was so profound it "ruled the 1990s". In January 2025, Lorentio Brodesco announced the nsOne project, an attempt to reverse engineer the PlayStation's motherboard. Brodesco stated that "detailed documentation on the original motherboard was either incomplete or entirely unavailable". The project was successfully crowdfunded via Kickstarter. In June, Brodesco manufactured the first working motherboard, promising a fully routed version with multilayer routing as well as documentation and design files in the near future. The success of the PlayStation contributed to the demise of cartridge-based home consoles. While not the first system to use an optical disc format, it was the first highly successful one, and ended up going head-to-head with the cartridge-based Nintendo 64,[d] which the industry had expected to use CDs like the PlayStation. After the demise of the Sega Saturn, Nintendo was left as Sony's main competitor in Western markets. 
Nintendo chose not to use CDs for the Nintendo 64; they were likely concerned with the proprietary cartridge format's ability to help enforce copy protection, given their substantial reliance on licensing and exclusive games for their revenue. Besides their larger capacity, CD-ROMs could be produced in bulk quantities at a much faster rate than ROM cartridges: a week compared to two to three months. Further, the cost of production per unit was far cheaper, allowing Sony to offer games at about 40% lower cost to the user than ROM cartridges while still making the same amount of net revenue. In Japan, Sony published fewer copies of a wide variety of games for the PlayStation as a risk-limiting step, a model that had been used by Sony Music for CD audio discs. The production flexibility of CD-ROMs meant that Sony could produce larger volumes of popular games and get them onto the market quickly, something that could not be done with cartridges due to their manufacturing lead time. The lower production costs of CD-ROMs also allowed publishers an additional source of profit: budget-priced reissues of games which had already recouped their development costs. Tokunaka remarked in 1996: Choosing CD-ROM is one of the most important decisions that we made. As I'm sure you understand, PlayStation could just as easily have worked with masked ROM [cartridges]. The 3D engine and everything—the whole PlayStation format—is independent of the media. But for various reasons (including the economies for the consumer, the ease of the manufacturing, inventory control for the trade, and also the software publishers) we deduced that CD-ROM would be the best media for PlayStation. The increasing complexity of developing games pushed cartridges to their storage limits and gradually discouraged some third-party developers. Part of the CD format's appeal to publishers was that discs could be produced at a significantly lower cost and offered more production flexibility to meet demand. 
As a result, some third-party developers switched to the PlayStation, including Square and Enix, whose Final Fantasy VII and Dragon Quest VII respectively had been planned for the Nintendo 64 (both companies later merged to form Square Enix). Other developers released fewer games for the Nintendo 64; Konami, for example, released only thirteen N64 games but over fifty on the PlayStation. Nintendo 64 game releases were less frequent than the PlayStation's, with many being developed by either Nintendo themselves or second parties such as Rare. The PlayStation Classic is a dedicated video game console made by Sony Interactive Entertainment that emulates PlayStation games. It was announced in September 2018 at the Tokyo Game Show, and released on 3 December 2018, the 24th anniversary of the release of the original console. As a dedicated console, the PlayStation Classic features 20 pre-installed games, which run on the open-source emulator PCSX. The console is bundled with two replica wired PlayStation controllers (those without analogue sticks), an HDMI cable, and a USB Type-A cable. Internally, the console uses a MediaTek MT8167a Quad A35 system on a chip with four central processing cores clocked at 1.5 GHz and a PowerVR GE8300 graphics processing unit. It includes 16 GB of eMMC flash storage and 1 GB of DDR3 SDRAM. The PlayStation Classic is 45% smaller than the original console. The PlayStation Classic received negative reviews from critics and was compared unfavourably to Nintendo's rival offerings, the Nintendo Entertainment System Classic Edition and the Super Nintendo Entertainment System Classic Edition. Criticism was directed at its meagre game library, user interface, emulation quality, use of PAL versions for certain games, use of the original controller, and high retail price, though the console's design received praise. The console sold poorly. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Robur_Carolinum] | [TOKENS: 193] |
Contents Robur Carolinum Robur Carolinum (Latin for Charles' oak) is a former constellation created by the English astronomer Edmond Halley in 1679. The name refers to the Royal Oak where Halley's patron, King Charles II of Britain, was said to have hidden from the troops of Oliver Cromwell after the Battle of Worcester. It was located in the southern skies, between Centaurus and Carina, extending into half of Vela. Robur Carolinum was included in some star atlases for over a century, but it was eventually retired. Nicolas Louis de Lacaille complained that it took some of the finest stars from Argo Navis. Its brightest star was Beta Carinae (β Car) or Miaplacidus, which was known as α Roburis or α Roburis Carolii. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Innaba] | [TOKENS: 1134] |
Contents Innaba 'Innaba (Arabic: عنابة), also spelled 'Annaba, was a Palestinian village in the Ramle Subdistrict of Mandatory Palestine. It was depopulated during the 1948 Arab–Israeli War on July 10, 1948, by the Yiftach and Eighth Brigades during Operation Dani. It was located 7 km east of Ramla. Etymology In Roman times, the village was called "Betoannaba" (Bετοάνναβα), meaning "House of the Grape". History Ceramics from the Roman and Byzantine periods have been found here. Al-Muqaddasi (c. 945/946 – 991), in his description of Ramla, noted that it had a gate called "The Gate of the Innaba Mosque". 'Innaba, like the rest of Palestine, was incorporated into the Ottoman Empire in 1517. In 1552, 'Innaba was an inhabited village. Haseki Hürrem Sultan, the favourite wife of Suleiman the Magnificent, endowed the tax revenues of 'Annaba to her Haseki Sultan Imaret in Jerusalem. Administratively, the village belonged to the Sub-district of Ramla in the District of Gaza. During that time, the villagers drank from an artesian well called Bayyarat 'Annaba. In the tax records of 1596 it was a village in the nahiya ("subdistrict") of Ramla, part of Gaza Sanjak, with a population of 30 households, an estimated 165 people, all Muslims. The villagers paid a fixed tax rate of 25% on agricultural products, which included wheat, barley, summer crops, olive trees, sesame, vineyards, fruit trees, goats and beehives, in addition to occasional revenues; a total of 4,200 akçe. All of the revenues went to a waqf. In 1838, it was noted as a Muslim village, 'Anabeh, in the District of Lydda. In 1863, Victor Guérin found that it had 900 inhabitants, while an Ottoman village list from about 1870 noted it as having a population of 250, in 79 houses, though the population count included men only. In 1883, the PEF's Survey of Western Palestine described it as "A village of moderate size, on high ground, surrounded with olives, with a well to the south. The houses are of mud. 
It is mentioned by Jerome [.. ] as 4 Roman miles east of Lydda, and as called Betho Annaba. The distance fits almost exactly." In the 1922 census of Palestine, conducted by the British Mandate authorities, Ennabeh had a population of 863: 862 Muslims and one Orthodox Christian. In the 1931 census, Innaba was counted with Al-Kunayyisa; together they had 1,135 Muslim inhabitants, in 288 houses. An elementary school for boys was founded in 1920, and in 1945 it had an enrollment of 168 students. Innaba also had a mosque, which was dedicated to al-Shaykh 'Abd Allah and had a shrine for him. In the 1945 statistics it had a population of 1,420 Muslims, while the total land area was 12,857 dunams, according to an official land and population survey. Of this, a total of 111 dunams were used for citrus and bananas, 511 were plantations and irrigable land, and 10,626 were used for cereals, while 54 dunams were classified as built-up public areas. During the British Mandate period, 'Innaba was one of the key areas of lime production for the developing urban centers along Palestine's coastal plain. The village was depopulated on July 10, 1948, after a military assault by the Israeli army. On the same day, the Operation Dani headquarters ordered the Yiftach Brigade to blow up most of Innaba and Al-Tira, leaving only enough houses for a small garrison. The Israeli settlement of Kefar Shemu'el was established on Innaba land in 1950. Innaba was described in 1992: "The site, which overlooks the Jerusalem-Tel Aviv highway a few km from al-Latrun and its abbey, is fenced off and is difficult to enter. It is covered with heaps of rubble and overgrown with vegetation, including cactuses and stunted olive and Christ's-thorn trees from the pre-1948 period. In addition to the rubble of houses, the debris from the school and the local headquarters of the Arab Palestine Party are visible. In the cemetery, the tombs of Hasan Badwan and Ayish Badwan are prominent because of their stone superstructures. 
A Christ's-thorn tree rises amidst the rubble of the former house of Muhammad Tummalay, and a denuded mulberry tree stands amid the rubble of Muhammad 'Abd Allah's house. The surrounding land is cultivated but remnants of the old agriculture remain, such as the groves of 'Ali al-Kasji, with their olive and pomegranate trees and cactus clusters, and the olive trees on the land of Abu Rummana. A deserted well with a heap of stones around its mouth lies in the section of the village formerly referred to as al-'Attan." |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/University_of_Chicago_Press] | [TOKENS: 1957] |
Contents University of Chicago Press The University of Chicago Press is the university press of the University of Chicago, a private research university in Chicago, Illinois. It is the largest and one of the oldest university presses in North America. It publishes a wide range of academic titles, including The Chicago Manual of Style, numerous academic journals, and advanced monographs in academic fields. The press is located just south of the Midway Plaisance on the University of Chicago campus. One of its quasi-independent projects is the BiblioVault, a digital repository for scholarly books. History The University of Chicago Press was founded in 1890, making it one of the oldest continuously operating university presses in the United States. Its first published book was Robert F. Harper's Assyrian and Babylonian Letters Belonging to the Kouyunjik Collections of the British Museum. The book sold five copies during its first two years, but by 1900 the University of Chicago Press had published 127 books and pamphlets and 11 scholarly journals, including the current Journal of Political Economy, Journal of Near Eastern Studies, and American Journal of Sociology. For its first three years, the press was an entity separate from the university; it was operated by the Boston publishing house D. C. Heath in conjunction with the Chicago printer R. R. Donnelley. This arrangement proved unworkable, however, and in 1894 the university officially assumed responsibility for the press. In 1902, as part of the university, the press started working on the Decennial Publications. Composed of articles and monographs by scholars and administrators on the state of the university and its faculty's research, the Decennial Publications prompted a radical reorganization of the press. This allowed the press, by 1905, to begin publishing books by scholars not affiliated with the University of Chicago. 
A manuscript editing and proofreading department was added to the existing staff of printers and typesetters, leading, in 1906, to the first edition of The Chicago Manual of Style. By 1931, the press was an established, leading academic publisher. Notable books of that era include Edgar J. Goodspeed's The New Testament: An American Translation (the press's first nationally successful title) and its successor, Goodspeed and J. M. Powis Smith's The Complete Bible: An American Translation; Sir William Alexander Craigie's A Dictionary of American English on Historical Principles, published in four volumes in 1943; John Manly and Edith Rickert's The Canterbury Tales, published in 1940; and Kate Turabian's A Manual for Writers of Term Papers, Theses, and Dissertations. In 1956, the press first published paperback-bound books (including the Phoenix Books series) under its imprint. Many of the press's best-known books date from the 1950s, including translations of the Complete Greek Tragedies and Richmond Lattimore's The Iliad of Homer. That decade also saw the first edition of A Greek-English Lexicon of the New Testament and Other Early Christian Literature, which has since been used by students of Biblical Greek worldwide. In 1966, Morris Philipson began his 34-year tenure as director of the University of Chicago Press. He committed time and resources to lengthening the backlist, becoming known for taking on ambitious scholarly projects, among the largest of which was The Lisle Letters, a vast collection of 16th-century correspondence of Arthur Plantagenet, 1st Viscount Lisle, offering a wealth of information about every aspect of 16th-century life. As its scholarly output expanded, the press also advanced as a trade publisher. In 1992, Norman Maclean's books A River Runs Through It and Young Men and Fire were national best sellers, and A River Runs Through It was made into a film directed by and starring Robert Redford. 
In 1982, Philipson was the first director of an academic press to win the Publisher Citation, one of PEN's most prestigious awards. Shortly before he retired in June 2000, Philipson received the Association of American Publishers' Curtis Benjamin Award for Creative Publishing, awarded to the person whose "creativity and leadership have left a lasting mark on American publishing." Paula Barker Duffy served as director of the press from 2000 to 2007. Under her administration, the press expanded its distribution operations and created the Chicago Digital Distribution Center and BiblioVault. Editorial depth in reference and regional books increased with titles such as The Encyclopedia of Chicago, Timothy J. Gilfoyle's Millennium Park, and new editions of The Chicago Manual of Style, the Turabian Manual, and The University of Chicago Spanish Dictionary. The press also launched an electronic reference work, The Chicago Manual of Style Online. In 2014, the press received The International Academic and Professional Publisher Award for excellence at the London Book Fair. Current status Garrett P. Kiely became the 15th director of the University of Chicago Press on September 1, 2007. He heads one of academic publishing's largest operations, employing more than 300 people across three divisions—books, journals, and distribution—and publishing 92 journal titles and approximately 280 new books and 70 paperback reprints each year. The press publishes over 50 new trade titles per year, across many subject areas. It also publishes regional titles, such as The Encyclopedia of Chicago (2004), edited by James R. Grossman, Ann Durkin Keating, and Janice Reiff; The Chicagoan: A Lost Magazine of the Jazz Age (2008) by Neil Harris; One More Time: The Best of Mike Royko (1999), a collection of columns by Pulitzer Prize-winning newspaperman Mike Royko of the Chicago Sun-Times and the Chicago Tribune; and many other books about the art, architecture, and nature of Chicago and the Midwest. 
The press has recently expanded its digital offerings to include most newly published books as well as key backlist titles. In 2013, Chicago Journals began offering e-book editions of each new issue of each journal, for use on devices such as smartphones, iPads, and Amazon Kindles. The contents of The Chicago Manual of Style are available online to paid subscribers. The Chicago Distribution Center is recognized as a leading distributor of scholarly works, with over 100 client presses. The Books Division of the University of Chicago Press has been publishing books for scholars, students, and general readers since 1892 and has published over 11,000 books since its founding. The Books Division has more than 6,000 books in print, including such well-known works as The Chicago Manual of Style (1906); The Structure of Scientific Revolutions (1962), by Thomas Kuhn; A River Runs Through It (1976), by Norman Maclean; and The Road to Serfdom (1944), by F. A. Hayek. In July 2009, the press announced the Chicago Digital Editions program, which made many of the press's titles available in e-book form for sale to individuals. As of August 2016, more than 3,500 titles are available in this format. In August 2010, the press published the 16th edition of The Chicago Manual of Style simultaneously in print and online editions. The Books Division offers a Free E-book of the Month program, through which site visitors may provide their e-mail address and receive a link to that month's free, downloadable e-book selection. The University of Chicago Press joined the Association of American Publishers trade organization in the Hachette v. Internet Archive lawsuit, which resulted in the removal of access to over 500,000 books for readers worldwide. The Journals Division of the University of Chicago Press publishes and distributes influential scholarly publications on behalf of learned and professional societies and associations, foundations, museums, and other not-for-profit organizations. 
As of 2016, it publishes 81 titles in a wide range of academic disciplines including the biological and medical sciences, education, the humanities, the physical sciences, and the social sciences. All are peer-reviewed journals of original scholarship, with readerships that include scholars, scientists, and medical practitioners as well as interested, educated laypeople. Since 1974, the press has published the prestigious humanities journal Critical Inquiry. The Journals Division has been a pioneer in making scholarly and scientific journals available in electronic form in conjunction with their print editions. Electronic publishing efforts were launched in 1995; by 2004, all the journals published by the University of Chicago Press were available online. In 2013, all new journal issues were also made available to subscribers in e-book format. The Distribution Services Division provides the University of Chicago Press's customer service, warehousing, and related services. The Chicago Distribution Center (CDC) began providing distribution services in 1991, when the University of Tennessee Press became its first client. The CDC serves nearly 100 publishers including Northwestern University Press, Stanford University Press, Temple University Press, University of Iowa Press, University of Minnesota Press, and many others. Since 2001, with development funding from the Mellon Foundation, the Chicago Digital Distribution Center (CDDC) has been offering digital printing services and the BiblioVault digital repository services to book publishers. In 2009, the CDC enabled the sales of electronic books directly to individuals and provided digital delivery services for the University of Michigan Press among others. The Chicago Distribution Center has also partnered with an additional 15 presses, including the University of Missouri Press, West Virginia University Press, and publications of the Getty Foundation. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Extraterrestrial_life#cite_note-156] | [TOKENS: 11349] |
Extraterrestrial life Extraterrestrial life, or alien life (colloquially aliens), is life that originates from another world rather than on Earth. No extraterrestrial life has yet been scientifically or conclusively detected. Such life might range from simple forms such as prokaryotes to intelligent beings, possibly bringing forth civilizations that might be far more, or far less, advanced than humans. The Drake equation speculates about the existence of sapient life elsewhere in the universe. The science of extraterrestrial life is known as astrobiology. Speculation about inhabited worlds beyond Earth dates back to antiquity. Early Christian writers, including Augustine, discussed ideas from thinkers like Democritus and Epicurus about countless worlds in the vast universe. Pre-modern writers typically assumed extraterrestrial "worlds" were inhabited by living beings. William Vorilong, in the 15th century, acknowledged the possibility that Jesus could have visited extraterrestrial worlds to redeem their inhabitants. In 1440, Nicholas of Cusa suggested Earth is a "brilliant star"; he theorized that all celestial bodies, even the Sun, could host life. Descartes wrote that there were no means to prove the stars were not inhabited by "intelligent creatures", but their existence was a matter of speculation. In comparison to the life-abundant Earth, the vast majority of intrasolar and extrasolar planets and moons have harsh surface conditions and disparate atmospheric chemistry, or lack an atmosphere. However, there are many extreme and chemically harsh ecosystems on Earth that do support forms of life and are often hypothesized to be where life on Earth originated. Examples include life surrounding hydrothermal vents, acidic hot springs, and volcanic lakes, as well as halophiles and the deep biosphere. Since the mid-20th century, researchers have searched for extraterrestrial life and intelligence. 
Solar system studies focus on Venus, Mars, Europa, and Titan, while exoplanet discoveries now total 6,022 confirmed planets in 4,490 systems as of October 2025. Depending on the category of search, methods range from the analysis of telescope and specimen data to radios used to detect and transmit interstellar communications. Interstellar travel remains largely hypothetical, with only the Voyager 1 and Voyager 2 probes confirmed to have entered the interstellar medium. The concept of extraterrestrial life, especially intelligent life, has greatly influenced culture and fiction. A key debate centers on contacting extraterrestrial intelligence: some advocate active attempts, while others warn it could be risky, given the human history of exploiting other societies. Context Initially, after the Big Bang, the universe was too hot to allow life. It is estimated that the temperature of the universe was around 10 billion kelvin at the one-second mark. Roughly 15 million years later, it cooled to temperate levels, though the elements needed for organic life did not yet exist. The only freely available elements at that point were hydrogen and helium. Carbon and oxygen (and later, water) would not appear until 50 million years later, created through stellar fusion. At that point, the difficulty for life to appear was not the temperature, but the scarcity of free heavy elements. Planetary systems emerged, and the first organic compounds may have formed in the protoplanetary disk of dust grains that would eventually create rocky planets like Earth. Although Earth was in a molten state after its birth and may have burned any organics that fell on it, it would have been more receptive once it cooled down. Once the right conditions on Earth were met, life started by a chemical process known as abiogenesis. Alternatively, life may have formed less frequently, then spread, by meteoroids for example, between habitable planets in a process called panspermia. 
For most of their stellar evolution, stars fuse hydrogen nuclei into helium nuclei, and the slightly lower mass of the helium produced is released as energy. The process continues until the star uses all of its available fuel, with the speed of consumption being related to the size of the star. In their last stages, stars start combining helium nuclei to form carbon nuclei. The larger stars can further combine carbon nuclei to create oxygen and silicon, oxygen into neon and sulfur, and so on up to iron. Ultimately, the star blows much of its content back into the interstellar medium, where it joins clouds that eventually become new generations of stars and planets. Many of those materials are the raw components of life on Earth. Because this process takes place throughout the universe, these materials are ubiquitous in the cosmos and not unique to the Solar System. Earth is a planet in the Solar System, a planetary system formed by a star at the center, the Sun, and the objects that orbit it: other planets, moons, asteroids, and comets. The Sun is part of the Milky Way, a galaxy. The Milky Way is part of the Local Group, a galaxy group that is in turn part of the Laniakea Supercluster. The universe is composed of all similar structures in existence. The immense distances between celestial objects are a difficulty for studying extraterrestrial life. So far, humans have only set foot on the Moon and sent robotic probes to other planets and moons in the Solar System. Although probes can withstand conditions that may be lethal to humans, the distances cause time delays: New Horizons took nine years after launch to reach Pluto. No probe has ever reached extrasolar planetary systems. Voyager 2 left the Solar System at a speed of 50,000 kilometers per hour; if it headed towards the Alpha Centauri system, the closest one to Earth at 4.4 light years, it would reach it in about 100,000 years. 
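The travel-time figure above can be checked with quick arithmetic; a minimal sketch in Python, using the rough speed and distance quoted in the text (variable names are mine):

```python
# Quick check of the interstellar travel time quoted above (rough figures).
LIGHT_YEAR_KM = 9.4607e12           # kilometers in one light-year
distance_km = 4.4 * LIGHT_YEAR_KM   # Earth to the Alpha Centauri system
speed_kmh = 50_000                  # Voyager 2's approximate speed

years = distance_km / speed_kmh / (24 * 365.25)
print(f"{years:,.0f} years")  # → 94,974 years, i.e. on the order of 100,000
```

The result, roughly 95,000 years, is consistent with the "about 100,000 years" order of magnitude quoted above.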
Under current technology, such systems can only be studied by telescopes, which have limitations. Dark matter is estimated to account for more combined mass than stars and gas clouds, but as it plays no role in the evolution of stars and planets, it is usually not considered in astrobiology. There is an area around a star, the circumstellar habitable zone or "Goldilocks zone", wherein water may be at the right temperature to exist in liquid form at a planetary surface. This area is neither too close to the star, where water would become steam, nor too far away, where water would be frozen as ice. However, although useful as an approximation, planetary habitability is complex and defined by several factors. Being in the habitable zone is not enough for a planet to be habitable, or even to actually hold liquid water. Venus is located in the Solar System's habitable zone, but does not have liquid water because of the conditions of its atmosphere. Jovian planets or gas giants are not considered habitable even if they orbit close enough to their stars as hot Jupiters, due to crushing atmospheric pressures. The actual distances of the habitable zones vary according to the type of star, and even the stellar activity of each specific star influences the local habitability. The type of star also defines how long the habitable zone will exist, as its position and limits change as the star evolves. The Big Bang occurred 13.8 billion years ago, the Solar System was formed 4.6 billion years ago, and the first hominids appeared 6 million years ago. Life on other planets may have started, evolved, given birth to extraterrestrial intelligences, and perhaps even faced a planetary extinction event millions or billions of years ago. When considered from a cosmic perspective, the brief existence of Earth's species suggests that extraterrestrial life may be equally fleeting on such a scale. 
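The dependence of habitable-zone distances on the type of star can be sketched with a standard first-order approximation not given in the article: the zone's edges scale with the square root of the star's luminosity relative to the Sun. The flux limits below are illustrative assumptions, not values from this text.

```python
import math

def habitable_zone_au(luminosity_solar, inner_flux=1.1, outer_flux=0.53):
    """Rough inner/outer habitable-zone edges in AU.
    inner_flux and outer_flux are illustrative stellar-flux limits
    (in solar units at Earth's distance), not definitive values."""
    inner = math.sqrt(luminosity_solar / inner_flux)
    outer = math.sqrt(luminosity_solar / outer_flux)
    return inner, outer

inner, outer = habitable_zone_au(1.0)       # a Sun-like star
print(round(inner, 2), round(outer, 2))     # → 0.95 1.37 (AU)
print(habitable_zone_au(0.04))              # a dim red dwarf: the zone shrinks far inward
```

Under these assumed limits a Sun-like star's zone spans roughly 0.95 to 1.37 AU, while a faint red dwarf's zone lies well inside Mercury's orbit, illustrating how strongly the star type sets the zone's location.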
During a period of about 7 million years, from about 10 to 17 million years after the Big Bang, the background temperature was between 373 and 273 K (100 and 0 °C; 212 and 32 °F), allowing the possibility of liquid water if any planets existed. Avi Loeb (2014) speculated that primitive life might in principle have appeared during this window, which he called "the Habitable Epoch of the Early Universe". Life is ubiquitous across Earth and has adapted over time to almost all the available environments on it; extremophiles and the deep biosphere thrive in even the most hostile ones. As a result, it is inferred that life on other celestial bodies may be equally adaptive. However, the origin of life is unrelated to its ease of adaptation and may have stricter requirements. A celestial body may not have any life on it, even if it were habitable. Likelihood of existence No life in the cosmos beyond Earth has been observed. The hypothesis of ubiquitous extraterrestrial life relies on three main ideas. The first is that the size of the universe allows for plenty of planets with a habitability similar to Earth's, and that the age of the universe gives enough time for a long process analogous to the history of life on Earth to happen there. The second is that the substances that make life, such as carbon and water, are ubiquitous in the universe. The third is that the physical laws are universal, which means that the forces that would facilitate or prevent the existence of life would be the same ones as on Earth. According to this argument, made by scientists such as Carl Sagan and Stephen Hawking, it would be improbable for life not to exist somewhere else other than Earth. This argument is embodied in the Copernican principle, which states that Earth does not occupy a unique position in the Universe, and the mediocrity principle, which states that there is nothing special about life on Earth. 
Other authors consider instead that life in the cosmos, or at least multicellular life, may actually be rare. The Rare Earth hypothesis maintains that life on Earth is possible because of a series of factors that range from the location in the galaxy and the configuration of the Solar System to local characteristics of the planet, and that it is unlikely that another planet simultaneously meets all such requirements. The proponents of this hypothesis consider that very little evidence suggests the existence of extraterrestrial life and that, at this point, it is just a desired result and not a reasonable scientific explanation for any gathered data. In 1961, astronomer and astrophysicist Frank Drake devised the Drake equation as a way to stimulate scientific dialogue at a meeting on the search for extraterrestrial intelligence (SETI). The Drake equation is a probabilistic argument used to estimate the number of active, communicative extraterrestrial civilizations in the Milky Way galaxy. The Drake equation is N = R* · fp · ne · fl · fi · fc · L, where N is the number of civilizations in the Milky Way whose signals are detectable; R* is the rate of formation of suitable stars; fp is the fraction of those stars with planetary systems; ne is the number of planets per such system with an environment suitable for life; fl is the fraction of suitable planets on which life actually appears; fi is the fraction of life-bearing planets on which intelligent life emerges; fc is the fraction of civilizations that develop technology releasing detectable signs of their existence; and L is the length of time such civilizations release detectable signals. Drake's proposed estimates are as follows, but the numbers on the right side of the equation are agreed to be speculative and open to substitution: 10,000 = 5 · 0.5 · 2 · 1 · 0.2 · 1 · 10,000. The Drake equation has proved controversial since, although it is written as a math equation, none of its values were known at the time. Although some values may eventually be measured, others are based on social sciences and are not knowable by their very nature. This does not allow one to make noteworthy conclusions from the equation. Based on observations from the Hubble Space Telescope, there are nearly 2 trillion galaxies in the observable universe. It is estimated that at least ten percent of all Sun-like stars have a system of planets. In other words, there are 6.25×10^18 stars with planets orbiting them in the observable universe. 
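Drake's illustrative numbers above can be multiplied out directly; a minimal sketch in Python (the variable names are mine; the values are Drake's speculative 1961 estimates as given in the text):

```python
# Drake equation, N = R* · fp · ne · fl · fi · fc · L,
# with Drake's illustrative values (all highly speculative):
R_star = 5      # average rate of suitable star formation per year in the Milky Way
f_p = 0.5       # fraction of stars with planetary systems
n_e = 2         # planets per system that could support life
f_l = 1         # fraction of those planets where life appears
f_i = 0.2       # fraction of life-bearing planets that develop intelligence
f_c = 1         # fraction of intelligent species that emit detectable signals
L = 10_000      # years a civilization remains detectable

N = R_star * f_p * n_e * f_l * f_i * f_c * L
print(N)  # → 10000.0
```

The product reproduces the 10,000 figure quoted above; changing any single factor rescales N proportionally, which is why the equation is more useful for framing the debate than for producing firm numbers.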
Even if it is assumed that only one out of a billion of these stars has planets supporting life, there would be some 6.25 billion life-supporting planetary systems in the observable universe. A 2013 study based on results from the Kepler spacecraft estimated that the Milky Way contains at least as many planets as it does stars, resulting in 100–400 billion exoplanets. The nebular hypothesis, which explains the formation of the Solar System and other planetary systems, suggests that planetary systems can have many configurations, and not all of them will have rocky planets within the habitable zone. The apparent contradiction between high estimates of the probability of the existence of extraterrestrial civilisations and the lack of evidence for such civilisations is known as the Fermi paradox. Dennis W. Sciama claimed that life's existence in the universe depends on various fundamental constants. Zhi-Wei Wang and Samuel L. Braunstein suggest that a random universe capable of supporting life is likely to be just barely able to do so, offering a potential explanation of the Fermi paradox. Biochemical basis If extraterrestrial life exists, it could range from simple microorganisms and multicellular organisms similar to animals or plants, to complex alien intelligences akin to humans. When scientists talk about extraterrestrial life, they consider all those types. Although it is possible that extraterrestrial life may have other configurations, scientists use the hierarchy of lifeforms from Earth for simplicity, as it is the only one known to exist. The first basic requirement for life is an environment with non-equilibrium thermodynamics, which means that the thermodynamic equilibrium must be broken by a source of energy. The traditional sources of energy in the cosmos are the stars, as for life on Earth, which depends on the energy of the Sun. However, there are other alternative energy sources, such as volcanoes, plate tectonics, and hydrothermal vents. 
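The one-in-a-billion back-of-envelope estimate above is a single multiplication; a quick check in Python (both inputs are the rough figures from the text):

```python
# Reproducing the back-of-envelope estimate from the text (all inputs are rough).
stars_with_planets = 6.25e18   # stars with planetary systems in the observable universe
life_fraction = 1e-9           # assumed: one in a billion hosts life

life_supporting = stars_with_planets * life_fraction
print(f"{life_supporting:.2e}")  # → 6.25e+09, i.e. some 6.25 billion systems
```

Even under this deliberately pessimistic fraction, the sheer number of stars keeps the estimate in the billions, which is the core of the size-of-the-universe argument.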
There are ecosystems on Earth in deep areas of the ocean that do not receive sunlight, and take energy from black smokers instead. Magnetic fields and radioactivity have also been proposed as sources of energy, although they would be less efficient. Life on Earth requires water in a liquid state as a solvent in which biochemical reactions take place. It is highly unlikely that an abiogenesis process can start within a gaseous or solid medium: the atoms move too fast in a gas and too slowly in a solid for specific ones to meet and start chemical reactions. A liquid medium also allows the transport of nutrients and substances required for metabolism. Sufficient quantities of carbon and other elements, along with water, might enable the formation of living organisms on terrestrial planets with a chemical make-up and temperature range similar to that of Earth. Life based on ammonia rather than water has been suggested as an alternative, though this solvent appears less suitable than water. It is also conceivable that there are forms of life whose solvent is a liquid hydrocarbon, such as methane, ethane or propane. Another unknown aspect of potential extraterrestrial life would be the chemical elements that would compose it. Life on Earth is largely composed of carbon, but there could be other hypothetical types of biochemistry. A replacement for carbon would need to be able to create complex molecules, store information required for evolution, and be freely available in the medium. To create DNA, RNA, or a close analog, such an element should be able to bind its atoms with many others, creating complex and stable molecules. It should be able to create at least three covalent bonds: two for making long strings and at least a third to add new links and allow for diverse information. Only nine elements meet this requirement: boron, nitrogen, phosphorus, arsenic, antimony (three bonds), carbon, silicon, germanium and tin (four bonds). 
As for abundance, carbon, nitrogen, and silicon are the most abundant of these elements in the universe, far more so than the others. In Earth's crust the most abundant of these is silicon; in the hydrosphere it is carbon, and in the atmosphere, carbon and nitrogen. Silicon, however, has disadvantages compared with carbon. The molecules formed with silicon atoms are less stable and more vulnerable to acids, oxygen, and light. An ecosystem of silicon-based lifeforms would require very low temperatures, high atmospheric pressure, an atmosphere devoid of oxygen, and a solvent other than water. The low temperatures required would add an extra problem: the difficulty of kick-starting a process of abiogenesis to create life in the first place. Norman Horowitz, head of the Jet Propulsion Laboratory bioscience section for the Mariner and Viking missions from 1965 to 1976, considered that the great versatility of the carbon atom makes it the element most likely to provide solutions, even exotic solutions, to the problems of survival of life on other planets. However, he also considered that the conditions found on Mars were incompatible with carbon-based life. Even if extraterrestrial life is based on carbon and uses water as a solvent, like Earth life, it may still have a radically different biochemistry. Life is generally considered to be a product of natural selection. It has been proposed that to undergo natural selection a living entity must have the capacity to replicate itself, the capacity to avoid damage and decay, and the capacity to acquire and process resources in support of the first two capacities. Life on Earth may have started with an RNA world and later evolved to its current form, where some of the RNA tasks were transferred to DNA and proteins. Extraterrestrial life may still use RNA, or may have evolved into other configurations. It is unclear whether our biochemistry is the most efficient one that could be generated, or which elements would follow a similar pattern. 
However, it is likely that, even if cells had a different composition from those on Earth, they would still have a cell membrane. Life on Earth jumped from prokaryotes to eukaryotes and from unicellular organisms to multicellular organisms through evolution. So far no alternative process to achieve such a result has been conceived, even hypothetically. Evolution requires life to be divided into individual organisms, and no alternative organisation has been satisfactorily proposed either. At the basic level, membranes define the limit of a cell, between it and its environment, while remaining partially open to exchange energy and resources with it. The evolution from simple cells to eukaryotes, and from them to multicellular lifeforms, is not guaranteed. The Cambrian explosion took place roughly three billion years after the origin of life, and its causes are not fully known yet. On the other hand, the jump to multicellularity took place several times, which suggests that it could be a case of convergent evolution, and so likely to take place on other planets as well. Palaeontologist Simon Conway Morris considers that convergent evolution would lead to kingdoms similar to our plants and animals, and that many features are likely to develop in alien animals as well, such as bilateral symmetry, limbs, digestive systems and heads with sensory organs. Scientists from the University of Oxford analysed the question from the perspective of evolutionary theory and wrote in a study in the International Journal of Astrobiology that aliens may be similar to humans. The planetary context would also have an influence: a planet with higher gravity would have smaller animals, and other types of stars can lead to non-green photosynthesizers. 
The amount of energy available would also affect biodiversity, as an ecosystem sustained by black smokers or hydrothermal vents would have less energy available than those sustained by a star's light and heat, and so its lifeforms would not grow beyond a certain complexity. There is also research in assessing the capacity of life for developing intelligence. It has been suggested that this capacity arises with the number of potential niches a planet contains, and that the complexity of life itself is reflected in the information density of planetary environments, which in turn can be computed from its niches. Conditions on other planets in the Solar System, and on worlds in galaxies beyond the Milky Way, are generally harsh and seem too extreme to harbor any life. These environments can combine intense UV radiation with extreme temperatures and a lack of water, conditions that do not seem to favor the creation or maintenance of extraterrestrial life. However, considerable evidence suggests that some of the earliest and most basic forms of life on Earth originated in extreme environments that would seem unlikely to have harbored life. Fossil evidence and long-standing theories, backed by years of research, mark environments such as hydrothermal vents and acidic hot springs as some of the first places where life could have originated on Earth. These environments are extreme compared with the typical ecosystems that most life on Earth now inhabits; hydrothermal vents, for example, are scorching hot where magma escaping from Earth's mantle meets much colder ocean water. 
Even today, diverse populations of bacteria inhabit the areas surrounding hydrothermal vents, suggesting that some form of life can be supported even in environments as harsh as those on other planets in the Solar System. What makes these harsh environments plausible sites for the origin of life, on Earth or elsewhere, is that chemical reactions there can form spontaneously. For example, hydrothermal vents on the ocean floor support many chemosynthetic processes that let organisms draw energy from reduced chemical compounds while fixing carbon. These reactions allow organisms to live in poorly oxygenated environments while still obtaining enough energy to sustain themselves. The early Earth environment was reducing, so such carbon-fixing chemistry was necessary for the survival, and possibly the origin, of life on Earth. From the limited information scientists have about the atmospheres of planets in the Milky Way galaxy and beyond, those atmospheres are most likely reducing, or at least very low in oxygen, especially when compared with Earth's atmosphere. If the necessary elements and ions were present on these planets, the same carbon-fixing chemistry seen around hydrothermal vents could also occur on their surfaces and possibly result in the origin of extraterrestrial life. Planetary habitability in the Solar System The Solar System has a wide variety of planets, dwarf planets, and moons, and each one is studied for its potential to host life. Each one has its own specific conditions that may benefit or harm life. So far, the only lifeforms found are those from Earth, and there is no evidence that extraterrestrial intelligence exists or has ever existed within the Solar System. 
Astrobiologist Mary Voytek points out that it would be unlikely to find large ecosystems, as they would have already been detected by now. The inner Solar System is likely devoid of life. However, Venus is still of interest to astrobiologists, as it is a terrestrial planet that was likely similar to Earth in its early stages and developed in a different way. A runaway greenhouse effect makes its surface the hottest in the Solar System; it has clouds of sulfuric acid, has lost all surface liquid water, and retains a thick carbon-dioxide atmosphere under enormous pressure. Comparing the two planets helps clarify the precise differences that lead to conditions beneficial or harmful to life. Despite the conditions against life on Venus, there are suspicions that microbial life-forms may still survive in its high-altitude clouds. Mars is a cold and almost airless desert, inhospitable to life. However, recent studies revealed that water on Mars used to be quite abundant, forming rivers, lakes, and perhaps even oceans. Mars may have been habitable back then, and life on Mars may have been possible. But when the planetary core ceased to generate a magnetic field, solar winds removed the atmosphere and the planet became vulnerable to solar radiation. Ancient life-forms may still have left fossilised remains, and microbes may still survive deep underground. As mentioned, the gas giants and ice giants are unlikely to contain life. The most distant Solar System bodies, found in the Kuiper Belt and outwards, are locked in permanent deep-freeze, but cannot be ruled out completely. Although the giant planets themselves are highly unlikely to have life, there is much hope of finding it on the moons orbiting these planets. Europa, in the Jovian system, has a subsurface ocean below a thick layer of ice. Ganymede and Callisto also have subsurface oceans, but life is less likely in them because the water is sandwiched between layers of solid ice. 
Europa's ocean would be in contact with its rocky seafloor, which favours chemical reactions. It may be difficult, though, to dig deep enough to study that ocean. Enceladus, a tiny moon of Saturn with another subsurface ocean, may not require digging at all, as it vents water into space in eruption columns. The space probe Cassini flew through one of these plumes, but could not make a full study because NASA had not anticipated the phenomenon and had not equipped the probe to analyse ocean water. Still, Cassini detected complex organic molecules, salts, evidence of hydrothermal activity, hydrogen, and methane. Titan is the only celestial body in the Solar System besides Earth that has liquid bodies on its surface. It has rivers, lakes, and rain of hydrocarbons such as methane and ethane, and even a cycle similar to Earth's water cycle. This special context encourages speculation about lifeforms with a different biochemistry, though the cold temperatures would make such chemistry proceed very slowly. Water is rock-solid on the surface, but Titan does have a subsurface water ocean like several other moons. However, it lies at such great depth that it would be very difficult to access for study. Scientific search The science that searches for and studies life in the universe, both on Earth and elsewhere, is called astrobiology. By studying Earth's life, the only known form of life, astrobiology seeks to understand how life starts and evolves and what it requires for its continued existence. This helps to determine what to look for when searching for life on other celestial bodies. This is a complex area of study that combines the perspectives of several scientific disciplines, such as astronomy, biology, chemistry, geology, oceanography, and atmospheric sciences. The scientific search for extraterrestrial life is being carried out both directly and indirectly. 
As of September 2017, 3,667 exoplanets in 2,747 systems had been identified, and other planets and moons in the Solar System hold the potential for hosting primitive life such as microorganisms. As of 8 February 2021, an updated status of studies considering the possible detection of lifeforms on Venus (via phosphine) and Mars (via methane) was reported. Scientists search for biosignatures within the Solar System by studying planetary surfaces and examining meteorites. Some claim to have identified evidence that microbial life has existed on Mars. In 1996, a controversial report stated that structures resembling nanobacteria had been discovered in ALH84001, a meteorite formed of rock ejected from Mars. Although all the unusual properties of the meteorite were eventually explained as the result of inorganic processes, the controversy over the discovery laid the groundwork for the development of astrobiology. An experiment on the two Viking Mars landers reported gas emissions from heated Martian soil samples that some scientists argue are consistent with the presence of living microorganisms. The lack of corroborating evidence from other experiments on the same samples suggests that a non-biological reaction is the more likely hypothesis. In February 2005, NASA scientists reported that they may have found some evidence of extraterrestrial life on Mars. The two scientists, Carol Stoker and Larry Lemke of NASA's Ames Research Center, based their claim on methane signatures found in Mars's atmosphere resembling the methane production of some forms of primitive life on Earth, as well as on their own study of primitive life near the Rio Tinto river in Spain. NASA officials soon distanced the agency from the scientists' claims, and Stoker herself backed off from her initial assertions. In November 2011, NASA launched the Mars Science Laboratory mission, which landed the Curiosity rover on Mars. 
It is designed to assess the past and present habitability of Mars using a variety of scientific instruments. The rover landed in Gale Crater in August 2012. A group of scientists at Cornell University started a catalog of microorganisms, recording how each one reflects sunlight. The goal is to help the search for similar organisms on exoplanets, as starlight reflected by planets rich in such organisms would have a specific spectrum, unlike that of starlight reflected from lifeless planets. If Earth were studied from afar with this system, it would reveal a shade of green, as a result of the abundance of photosynthesising plants. In August 2011, NASA studied meteorites found in Antarctica and detected adenine, guanine, hypoxanthine, and xanthine. Adenine and guanine are components of DNA, and the others are used in other biological processes. The studies ruled out contamination of the meteorites on Earth, as those components would not be freely available in the form in which they were found in the samples. This discovery suggests that several organic molecules that serve as building blocks of life may be generated within asteroids and comets. In October 2011, scientists reported that cosmic dust contains complex organic compounds ("amorphous organic solids with a mixed aromatic-aliphatic structure") that could be created naturally, and rapidly, by stars. It is still unclear whether those compounds played a role in the creation of life on Earth, but Sun Kwok, of the University of Hong Kong, thinks so: "If this is the case, life on Earth may have had an easier time getting started as these organics can serve as basic ingredients for life." In August 2012, in a world first, astronomers at Copenhagen University reported the detection of a specific sugar molecule, glycolaldehyde, in a distant star system. The molecule was found around the protostellar binary IRAS 16293-2422, located 400 light-years from Earth. 
Glycolaldehyde is needed to form ribonucleic acid (RNA), which is similar in function to DNA. This finding suggests that complex organic molecules may form in stellar systems prior to the formation of planets, eventually arriving on young planets early in their formation. In December 2023, astronomers reported the first detection, in the plumes of Saturn's moon Enceladus, of hydrogen cyanide, a chemical possibly essential for life as we know it, as well as other organic molecules, some of which are yet to be fully identified and understood. According to the researchers, "these [newly discovered] compounds could potentially support extant microbial communities or drive complex organic synthesis leading to the origin of life." Although most searches focus on the biology of extraterrestrial life, an extraterrestrial intelligence capable of developing a civilization may be detectable by other means as well. Technology may generate technosignatures: effects on the native planet that cannot be attributed to natural causes. Three main types of technosignatures are considered: interstellar communications, effects on the atmosphere, and planetary-sized structures such as Dyson spheres. Organizations such as the SETI Institute search the cosmos for potential forms of communication. They started with radio waves and now search for laser pulses as well. The challenge for this search is that there are natural sources of such signals, such as gamma-ray bursts and supernovae, and the difference between a natural signal and an artificial one would lie in its specific patterns. Astronomers intend to use artificial intelligence for this, as it can manage large amounts of data and is free of human biases and preconceptions. Besides, even if there is an advanced extraterrestrial civilization, there is no guarantee that it is transmitting radio communications in the direction of Earth. 
The length of time required for a signal to travel across space means that a potential answer may arrive decades or centuries after the initial message. The atmosphere of Earth is rich in nitrogen dioxide as a result of air pollution, which can be detectable. The natural abundance of carbon, which is also relatively reactive, makes it likely to be a basic component of the development of a potential extraterrestrial technological civilization, as it is on Earth. Fossil fuels would likely be generated and used on such worlds as well. The abundance of chlorofluorocarbons in an atmosphere can also be a clear technosignature, considering their role in ozone depletion. Light pollution may be another technosignature, as multiple lights on the night side of a rocky planet can be a sign of advanced technological development. However, modern telescopes are not powerful enough to study exoplanets in the level of detail required to perceive it. The Kardashev scale proposes that a civilization may eventually start consuming energy directly from its local star. This would require giant structures built next to the star, called Dyson spheres. Such speculative structures would cause an excess of infrared radiation that telescopes could notice. Excess infrared radiation is typical of young stars, which are surrounded by dusty protoplanetary disks that will eventually form planets; an older star such as the Sun would have no natural reason to emit it. The presence of heavy elements in a star's light spectrum is another potential technosignature: such elements would, in theory, be found if the star were being used as an incinerator or repository for nuclear waste products. Some astronomers search for extrasolar planets that may be conducive to life, narrowing the search to terrestrial planets within the habitable zones of their stars. 
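The signal-delay point above is simple arithmetic: a radio signal covers one light-year per year, so a reply can arrive no sooner than twice the target's distance in years. A minimal sketch, using approximate published distances chosen here purely as examples:

```python
# A radio signal travels one light-year per year, so the soonest a
# reply can arrive is twice the target's distance in years.
# The distances below are approximate published values, used only
# as illustrative examples.
def min_round_trip_years(distance_ly: float) -> float:
    """Minimum wait for an answer: signal out plus signal back."""
    return 2.0 * distance_ly

targets_ly = {
    "Proxima Centauri": 4.2,   # nearest star system to the Sun
    "Tau Ceti": 11.9,
    "Kepler-186": 580.0,
}

for star, dist in targets_ly.items():
    print(f"{star}: reply no sooner than {min_round_trip_years(dist):.1f} years")
```

Even for the very nearest star, a round trip takes over eight years; for most stars in the galaxy the wait is measured in centuries or millennia.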
Since 1992, thousands of exoplanets have been discovered (6,128 planets in 4,584 planetary systems, including 1,017 multiple planetary systems, as of 30 October 2025). The extrasolar planets discovered so far range in size from terrestrial planets similar to Earth to gas giants larger than Jupiter. The number of observed exoplanets is expected to increase greatly in the coming years. The Kepler space telescope has also detected a few thousand candidate planets, of which about 11% may be false positives. There is at least one planet on average per star. About 1 in 5 Sun-like stars[a] have an "Earth-sized"[b] planet in the habitable zone,[c] with the nearest expected to be within 12 light-years of Earth. Assuming 200 billion stars in the Milky Way,[d] that would be 11 billion potentially habitable Earth-sized planets in the Milky Way, rising to 40 billion if red dwarfs are included. The rogue planets in the Milky Way possibly number in the trillions. The nearest known exoplanet is Proxima Centauri b, located 4.2 light-years (1.3 pc) from Earth in the southern constellation of Centaurus. As of March 2014, the least massive exoplanet known is PSR B1257+12 A, which is about twice the mass of the Moon. The most massive planet listed on the NASA Exoplanet Archive is DENIS-P J082303.1−491201 b, about 29 times the mass of Jupiter; according to most definitions, however, it is too massive to be a planet and may be a brown dwarf instead. Almost all of the planets detected so far are within the Milky Way, but there have also been a few possible detections of extragalactic planets. The study of planetary habitability also considers a wide range of other factors in determining the suitability of a planet for hosting life. 
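The habitable-planet figures quoted above follow from straightforward multiplication. A minimal sketch, assuming the 1-in-5 occurrence rate applies uniformly; note the 55-billion Sun-like star count is inferred here from the quoted numbers (11 billion / 0.2) and is not stated in the text:

```python
# Back-of-the-envelope reproduction of the habitable-planet estimate.
FRACTION_WITH_HZ_EARTH = 1 / 5   # ~1 in 5 Sun-like stars has an
                                 # Earth-sized planet in the habitable zone
SUNLIKE_STARS = 55e9             # inferred assumption (11e9 / 0.2),
                                 # not stated in the source text
TOTAL_STARS = 200e9              # assumed total stars in the Milky Way,
                                 # including red dwarfs

# Applying the same occurrence rate to each population:
habitable_sunlike = FRACTION_WITH_HZ_EARTH * SUNLIKE_STARS
habitable_all = FRACTION_WITH_HZ_EARTH * TOTAL_STARS

print(f"Around Sun-like stars: {habitable_sunlike / 1e9:.0f} billion")
print(f"Including red dwarfs:  {habitable_all / 1e9:.0f} billion")
```

The two products recover the 11-billion and 40-billion figures; the real uncertainty lies in the occurrence rate and star counts, each of which carries large error bars.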
One sign that a planet probably already hosts life is the presence of an atmosphere with significant amounts of oxygen, since that gas is highly reactive and generally would not last long without constant replenishment. On Earth, this replenishment comes from photosynthetic organisms. One way to analyse the atmosphere of an exoplanet is through spectroscopy as it transits its star, though this might only be feasible with dim stars like white dwarfs. History and cultural impact The modern concept of extraterrestrial life is based on assumptions that were not commonplace during the early days of astronomy. The first explanations for the celestial objects seen in the night sky were based on mythology. Scholars in Ancient Greece were the first to consider the universe inherently understandable, rejecting explanations based on incomprehensible supernatural forces, such as the myth of the Sun being pulled across the sky by the chariot of Apollo. They had not yet developed the scientific method and based their ideas on pure thought and speculation, but they developed precursors to it, such as the principle that explanations must be discarded if they contradict observable facts. The discussions of those Greek scholars established many of the pillars that would eventually lead to the idea of extraterrestrial life, such as the recognition that Earth is round, not flat. The cosmos was first structured in a geocentric model, which held that the Sun and all other celestial bodies revolve around Earth. However, the Greeks did not consider those bodies to be worlds: in their understanding, the world comprised Earth together with the celestial objects with noticeable movements. Anaximander thought that the cosmos was made from apeiron, a substance that created the world and to which the world would eventually return. 
Eventually two groups emerged: the atomists, who thought that matter on Earth and in the cosmos alike was made of small atoms of the classical elements (earth, water, fire, and air), and the Aristotelians, who thought that those elements were exclusive to Earth and that the cosmos was made of a fifth one, the aether. The atomist Epicurus reasoned that the processes that created the world, its animals, and its plants should have created other worlds elsewhere, along with their own animals and plants. Aristotle thought instead that all the earth element naturally fell towards the center of the universe, which would make it impossible for other planets to exist elsewhere. Under that reasoning, Earth was not only at the center of the universe but also the only planet in it. Cosmic pluralism, the plurality of worlds, or simply pluralism, describes the philosophical belief in numerous "worlds" in addition to Earth, which might harbor extraterrestrial life. The earliest recorded assertion of extraterrestrial human life is found in the ancient scriptures of Jainism, which mention multiple "worlds" that support human life, including, among others, Bharat Kshetra, Mahavideh Kshetra, Airavat Kshetra, and Hari Kshetra. Medieval Muslim writers such as Fakhr al-Din al-Razi and Muhammad al-Baqir supported cosmic pluralism on the basis of the Qur'an. Chaucer's poem The House of Fame engaged in medieval thought experiments that postulated the plurality of worlds. However, those ideas about other worlds differed from current knowledge about the structure of the universe and did not postulate the existence of planetary systems other than the Solar System. When those authors spoke of other worlds, they meant places located at the center of their own systems, each with its own stellar vault and cosmos surrounding it. The Greek ideas and the disputes between atomists and Aristotelians outlived ancient Greek civilization itself. 
The Great Library of Alexandria compiled information about these doctrines, part of which was translated by Islamic scholars and thus survived the end of the Library. Baghdad combined the knowledge of the Greeks, the Indians, the Chinese, and its own scholars, and this knowledge spread through the Byzantine Empire, from where it eventually returned to Europe by the time of the Middle Ages. However, as the Greek atomist doctrine held that the world was created by random movements of atoms, with no need for a creator deity, it became associated with atheism, and the dispute became intertwined with religious ones. Still, the Church did not react to these topics in a homogeneous way; there were stricter and more permissive views within the Church itself. The first known mention of the term 'panspermia' was in the writings of the 5th-century BC Greek philosopher Anaxagoras, who proposed the idea that life exists everywhere. By the late Middle Ages there were many known inaccuracies in the geocentric model, but it was kept in use because naked-eye observations provided limited data. Nicolaus Copernicus started the Copernican Revolution by proposing that the planets revolve around the Sun rather than Earth. His proposal had little acceptance at first because, as he kept the assumption that orbits were perfect circles, his model led to as many inaccuracies as the geocentric one. Tycho Brahe improved the available data with naked-eye observatories that used highly precise sextants and quadrants. Tycho could not make sense of his observations, but Johannes Kepler did: orbits were not perfect circles but ellipses. This insight benefited the Copernican model, which now worked almost perfectly. The invention of the telescope a short time later, perfected by Galileo Galilei, dispelled the final doubts, and the paradigm shift was complete. 
Under this new understanding, the notion of extraterrestrial life became feasible: if Earth is but a planet orbiting a star, there may be planets similar to Earth elsewhere. The astronomical study of distant bodies also showed that physical laws are the same elsewhere in the universe as on Earth, with nothing making our planet truly special. The new ideas were met with resistance from the Catholic Church. Galileo was tried for advocating the heliocentric model, which was considered heretical, and was forced to recant it. The best-known early-modern proponent of extraterrestrial life was the Italian philosopher Giordano Bruno, who argued in the 16th century for an infinite universe in which every star is surrounded by its own planetary system. Bruno wrote that other worlds "have no less virtue nor a nature different to that of our earth" and, like Earth, "contain animals and inhabitants". Bruno's belief in the plurality of worlds was one of the charges leveled against him by the Inquisition, which tried and executed him. The heliocentric model was further strengthened by Isaac Newton's theory of gravity, which provided the mathematics that explains the motions of all things in the universe, including planetary orbits. By this point, the geocentric model was definitively discarded. By this time, the use of the scientific method had become standard, and new discoveries were expected to provide evidence and rigorous mathematical explanations. Science also took a deeper interest in the mechanics of natural phenomena, trying to explain not just how nature works but also why it works that way. There had been very little actual discussion of extraterrestrial life before this point, as Aristotelian ideas remained influential while geocentrism was still accepted. 
When geocentrism was finally proved wrong, it meant not only that Earth was not the center of the universe, but also that the lights seen in the sky were not mere lights but physical objects. The notion that life may exist on them as well soon became an ongoing topic of discussion, although one with no practical means of investigation. The possibility of extraterrestrials remained a widespread speculation as scientific discovery accelerated. William Herschel, the discoverer of Uranus, was one of many 18th–19th-century astronomers who believed that the Solar System is populated by alien life. Other scholars of the period who championed "cosmic pluralism" included Immanuel Kant and Benjamin Franklin. At the height of the Enlightenment, even the Sun and Moon were considered candidates for hosting extraterrestrial inhabitants. Speculation about life on Mars increased in the late 19th century, following telescopic observation of apparent Martian canals, which, however, soon turned out to be optical illusions. Despite this, in 1895, American astronomer Percival Lowell published his book Mars, followed by Mars and its Canals in 1906, proposing that the canals were the work of a long-gone civilisation. Spectroscopic analysis of Mars's atmosphere began in earnest in 1894, when U.S. astronomer William Wallace Campbell showed that neither water nor oxygen was present in the Martian atmosphere. By 1909, better telescopes and the best perihelic opposition of Mars since 1877 conclusively put an end to the canal hypothesis. As a consequence of the belief in spontaneous generation, there was little thought about the conditions on each celestial body: it was simply assumed that life would thrive anywhere. Spontaneous generation was disproved by Louis Pasteur in the 19th century. 
Popular belief in thriving alien civilisations elsewhere in the Solar System remained strong until Mariner 4 and Mariner 9 returned close-up images of Mars, which definitively debunked the idea of Martians and lowered the general expectations of finding alien life. The end of the belief in spontaneous generation forced investigation into the origin of life. Although abiogenesis is the more widely accepted theory, a number of authors reclaimed the term "panspermia" and proposed that life was brought to Earth from elsewhere, among them Jöns Jacob Berzelius (1834), Kelvin (1871), Hermann von Helmholtz (1879) and, somewhat later, Svante Arrhenius (1903). The science fiction genre, although not yet so named, developed during the late 19th century. The expansion of extraterrestrials as a theme in fiction influenced popular perception of the real-life topic, making people eager to jump to conclusions about the discovery of aliens. Science marched at a slower pace: some discoveries fueled expectations, while others dashed excessive hopes. For example, with the advent of telescopes, most structures seen on the Moon or Mars were immediately attributed to Selenites or Martians, until more powerful telescopes revealed that all such discoveries were natural features. A famous case is the Cydonia region of Mars, first imaged by the Viking 1 orbiter: the low-resolution photos showed a rock formation resembling a human face, but later spacecraft took higher-resolution photos showing that there was nothing special about the site. The search for and study of extraterrestrial life became a science of its own, astrobiology. Also known as exobiology, this discipline is pursued by NASA, ESA, INAF, and other agencies. Astrobiology studies life from Earth as well, but with a cosmic perspective. 
For example, abiogenesis is of interest to astrobiology not only because of the origin of life on Earth, but because a similar process might take place on other celestial bodies. Many aspects of life, from its definition to its chemistry, are analyzed as being either likely common to all forms of life across the cosmos or unique to Earth. Astrobiology, however, remains constrained by the current lack of extraterrestrial life-forms to study: all life on Earth descends from the same ancestor, and it is hard to infer general characteristics from a group with only a single example. The 20th century brought great technological advances, speculation about future hypothetical technologies, and an increased basic knowledge of science among the general population thanks to science popularization through the mass media. Public interest in extraterrestrial life and the lack of discoveries by mainstream science led to the emergence of pseudosciences that provided affirmative, if questionable, answers to the question of the existence of aliens. Ufology claims that many unidentified flying objects (UFOs) are spaceships of alien species, and the ancient astronauts hypothesis claims that aliens visited Earth in antiquity and prehistoric times but that people failed to understand it at the time. Most UFO sightings can be readily explained as sightings of Earth-based aircraft (including top-secret aircraft), known astronomical objects, weather phenomena, or hoaxes. Looking beyond the pseudosciences, Lewis White Beck strove to elevate the level of public discourse on the topic of extraterrestrial life by tracing the evolution of philosophical thought over the centuries from ancient times into the modern era. 
His review of the contributions made by Lucretius, Plutarch, Aristotle, Copernicus, Immanuel Kant, John Wilkins, Charles Darwin, and Karl Marx demonstrated that even in modern times, humanity could be profoundly influenced in its search for extraterrestrial life by subtle and comforting archetypal ideas largely derived from firmly held religious, philosophical, and existential belief systems. On a positive note, however, Beck further argued that even if the search for extraterrestrial life proves unsuccessful, the endeavor itself could have beneficial consequences by assisting humanity in its attempt to actualize superior ways of living here on Earth. By the 21st century it was accepted that, within the Solar System, multicellular life can exist only on Earth, but interest in extraterrestrial life increased regardless. This is a result of advances in several sciences. Knowledge of planetary habitability makes it possible to consider, on scientific terms, the likelihood of finding life at each specific celestial body, as it is known which features are beneficial and which harmful for life. Astronomy and telescopes have also improved to the point that exoplanets can be confirmed and even studied, increasing the number of places to search. Life may still exist elsewhere in the Solar System in unicellular form, and advances in spacecraft make it possible to send robots to study samples in situ, with tools of growing complexity and reliability. Although no extraterrestrial life has been found, and life may yet prove to be unique to Earth, there are scientific reasons to suspect that it can exist elsewhere, and technological advances that may detect it if it does. Many scientists are optimistic about the chances of finding alien life. In the words of SETI's Frank Drake, "All we know for sure is that the sky is not littered with powerful microwave transmitters". 
Drake noted that it is entirely possible that advanced technology results in communication being carried out in some way other than conventional radio transmission. At the same time, the data returned by space probes, and giant strides in detection methods, have allowed science to begin delineating habitability criteria on other worlds and to confirm that planets, at least, are plentiful, though aliens remain a question mark. The Wow! signal, detected in 1977 by a SETI project, remains a subject of speculative debate. Other scientists, on the other hand, are pessimistic. Jacques Monod wrote that "Man knows at last that he is alone in the indifferent immensity of the universe, from which he has emerged by chance". In 2000, geologist and paleontologist Peter Ward and astrobiologist Donald Brownlee published a book entitled Rare Earth: Why Complex Life is Uncommon in the Universe. In it, they discussed the Rare Earth hypothesis, which claims that Earth-like life is rare in the universe, whereas microbial life is common. Ward and Brownlee are open to the idea of evolution on other planets that is not based on essential Earth-like characteristics such as DNA and carbon. As for the possible risks, theoretical physicist Stephen Hawking warned in 2010 that humans should not try to contact alien life forms, as aliens might pillage Earth for resources. "If aliens visit us, the outcome would be much as when Columbus landed in America, which didn't turn out well for the Native Americans", he said. Jared Diamond had earlier expressed similar concerns. On 20 July 2015, Hawking and Russian billionaire Yuri Milner, along with the SETI Institute, announced a well-funded effort, called the Breakthrough Initiatives, to expand the search for extraterrestrial life. The group contracted the services of the 100-meter Robert C. 
Byrd Green Bank Telescope in West Virginia, United States, and the 64-meter Parkes Telescope in New South Wales, Australia. On 13 February 2015, scientists (including Geoffrey Marcy, Seth Shostak, Frank Drake, and David Brin) at a convention of the American Association for the Advancement of Science discussed Active SETI and whether transmitting a message to possible intelligent extraterrestrials in the cosmos was a good idea; one result was a statement, signed by many, that a "worldwide scientific, political and humanitarian discussion must occur before any message is sent". Government responses The 1967 Outer Space Treaty and the 1979 Moon Agreement define rules of planetary protection against potentially hazardous extraterrestrial life, and COSPAR also provides guidelines for planetary protection. In 1977, a committee of the United Nations Office for Outer Space Affairs spent a year discussing strategies for interacting with extraterrestrial life or intelligence, but the discussion ended without any conclusions. As of 2010, the UN lacks response mechanisms for the case of an extraterrestrial contact. One of NASA's divisions is the Office of Safety and Mission Assurance (OSMA), also known as the Planetary Protection Office; part of its mission is to "rigorously preclude backward contamination of Earth by extraterrestrial life." In 2016, the Chinese government released a white paper detailing its space program. According to the document, one of the research objectives of the program is the search for extraterrestrial life, which is also one of the objectives of China's Five-hundred-meter Aperture Spherical Telescope (FAST). In 2020, Dmitry Rogozin, then head of the Russian space agency, said the search for extraterrestrial life is one of the main goals of deep-space research, and acknowledged the possibility of primitive life existing on other planets of the Solar System. 
The French space agency has an office for the study of "unidentified aerospace phenomena" and maintains a publicly accessible database of such phenomena with over 1,600 detailed entries. According to the head of the office, the vast majority of entries have a mundane explanation, but for about 25% of them an extraterrestrial origin can be neither confirmed nor denied. In 2020, Isaac Ben-Israel, chairman of the Israel Space Agency, stated that the probability of detecting life in outer space is "quite large". However, he disagrees with his former colleague Haim Eshed, who stated that there are contacts between an advanced alien civilisation and some of Earth's governments. In fiction Although the idea of extraterrestrial peoples became feasible once astronomy developed enough to understand the nature of planets, they were not at first thought of as being any different from humans. With no scientific explanation for the origin of mankind and its relation to other species, there was no reason to expect them to be any other way. This changed with the 1859 book On the Origin of Species by Charles Darwin, which proposed the theory of evolution. With the notion that evolution on other planets might take other directions, science fiction authors created bizarre aliens, clearly distinct from humans; a usual way to do so was to add body features from other animals, such as insects or octopuses. Costuming and special-effects feasibility, alongside budget considerations, forced films and TV series to tone down the fantasy, but these limitations have lessened since the 1990s with the advent of computer-generated imagery (CGI), and further as CGI became more effective and less expensive. Real-life events sometimes captivate people's imagination, and this influences works of fiction. 
For example, during the Barney and Betty Hill incident, the first recorded claim of an alien abduction, the couple reported that they were abducted and experimented on by aliens with oversized heads, big eyes, pale grey skin, and small noses, a description that eventually became the grey alien archetype widely used in works of fiction. See also Notes References Further reading External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/United_States#cite_note-275]
Contents United States The United States of America (USA), also known as the United States (U.S.) or America, is a country primarily located in North America. It is a federal republic of 50 states and a federal capital district, Washington, D.C. The 48 contiguous states border Canada to the north and Mexico to the south, with the semi-exclave of Alaska in the northwest and the archipelago of Hawaii in the Pacific Ocean. The United States also asserts sovereignty over five major island territories and various uninhabited islands in Oceania and the Caribbean.[j] It is a megadiverse country, with the world's third-largest land area[c] and third-largest population, exceeding 341 million.[k] Paleo-Indians first migrated from North Asia to North America at least 15,000 years ago, and formed various civilizations. Spanish colonization established Spanish Florida in 1513, the first European colony in what is now the continental United States. British colonization followed with the 1607 settlement of Virginia, the first of the Thirteen Colonies. Enslavement of Africans was practiced in all colonies by 1770 and supplied most of the labor for the Southern Colonies' plantation economy. Clashes with the British Crown began as a civil protest over the illegality of taxation without representation in Parliament and the denial of other English rights. They evolved into the American Revolution, which led to the Declaration of Independence and a society based on universal rights. Victory in the 1775–1783 Revolutionary War brought international recognition of U.S. sovereignty and fueled westward expansion, further dispossessing native inhabitants. As more states were admitted, a North–South division over slavery led the Confederate States of America to declare secession and fight the Union in the 1861–1865 American Civil War. With the United States' victory and reunification, slavery was abolished nationally. By the late 19th century, the U.S. 
economy outpaced the French, German and British economies combined. By 1900, the country had established itself as a great power, a status solidified after its involvement in World War I. Following Japan's attack on Pearl Harbor in 1941, the U.S. entered World War II. Its aftermath left the U.S. and the Soviet Union as rival superpowers, competing for ideological dominance and international influence during the Cold War. The Soviet Union's collapse in 1991 ended the Cold War, leaving the U.S. as the world's sole superpower. The U.S. federal government is a representative democracy with a president and a constitution that grants separation of powers under three branches: legislative, executive, and judicial. The United States Congress is a bicameral national legislature composed of the House of Representatives (a lower house based on population) and the Senate (an upper house based on equal representation for each state). Federalism grants substantial autonomy to the 50 states. In addition, 574 Native American tribes have sovereignty rights, and there are 326 Native American reservations. Since the 1850s, the Democratic and Republican parties have dominated American politics. American ideals and values are based on a democratic tradition inspired by the American Enlightenment movement. A developed country, the U.S. ranks high in economic competitiveness, innovation, and higher education. Accounting for over a quarter of nominal global GDP, its economy has been the world's largest since about 1890. It is the wealthiest country, with the highest disposable household income per capita among OECD members, though its wealth inequality is highly pronounced. Shaped by centuries of immigration, the culture of the U.S. is diverse and globally influential. Making up more than a third of global military spending, the country has one of the strongest armed forces and is an official nuclear-weapon state. A member of numerous international organizations, the U.S.
plays a major role in global political, cultural, economic, and military affairs. Etymology Documented use of the phrase "United States of America" dates back to January 2, 1776. On that day, Stephen Moylan, a Continental Army aide to General George Washington, wrote a letter to Joseph Reed, Washington's aide-de-camp, seeking to go "with full and ample powers from the United States of America to Spain" to seek assistance in the Revolutionary War effort. The first known public usage is an anonymous essay published in the Williamsburg newspaper The Virginia Gazette on April 6, 1776. Sometime on or after June 11, 1776, Thomas Jefferson wrote "United States of America" in a rough draft of the Declaration of Independence, which was adopted by the Second Continental Congress on July 4, 1776. The term "United States" and its initialism "U.S.", used as nouns or as adjectives in English, are common short names for the country. The initialism "USA", a noun, is also common. "United States" and "U.S." are the established terms throughout the U.S. federal government, with prescribed rules.[l] "The States" is an established colloquial shortening of the name, used particularly from abroad; "stateside" is the corresponding adjective or adverb. "America" is the feminine form of Americus, the first word of "Americus Vespucius", the Latinized name of the Italian explorer Amerigo Vespucci (1454–1512);[m] it was first used as a place name by the German cartographers Martin Waldseemüller and Matthias Ringmann in 1507.[n] Vespucci first proposed that the West Indies discovered by Christopher Columbus in 1492 were part of a previously unknown landmass and not among the Indies at the eastern limit of Asia. In English, the term "America" rarely refers to topics unrelated to the United States, despite the usage of "the Americas" to describe the totality of the continents of North and South America.
History The first inhabitants of North America migrated from Siberia approximately 15,000 years ago, either across the Bering land bridge or along the now-submerged Ice Age coastline. Small isolated groups of hunter-gatherers are said to have migrated alongside herds of large herbivores far into Alaska, with ice-free corridors developing along the Pacific coast and valleys of North America in c. 16,500 – c. 13,500 BCE (c. 18,500 – c. 15,500 BP). The Clovis culture, which appeared around 11,000 BCE, is believed to be the first widespread culture in the Americas. Over time, Indigenous North American cultures grew increasingly sophisticated, and some, such as the Mississippian culture, developed agriculture, architecture, and complex societies. In the post-archaic period, the Mississippian cultures were located in the midwestern, eastern, and southern regions, and the Algonquian in the Great Lakes region and along the Eastern Seaboard, while the Hohokam culture and Ancestral Puebloans inhabited the Southwest. Native population estimates of what is now the United States before the arrival of European colonizers range from around 500,000 to nearly 10 million. Christopher Columbus began exploring the Caribbean for Spain in 1492, leading to Spanish-speaking settlements and missions from what are now Puerto Rico and Florida to New Mexico and California. The first Spanish colony in the present-day continental United States was Spanish Florida, chartered in 1513. After several settlements failed there due to starvation and disease, Spain's first permanent town, Saint Augustine, was founded in 1565. France established its own settlements in French Florida in 1562, but they were either abandoned (Charlesfort, 1578) or destroyed by Spanish raids (Fort Caroline, 1565). Permanent French settlements were founded much later along the Great Lakes (Fort Detroit, 1701), the Mississippi River (Saint Louis, 1764) and especially the Gulf of Mexico (New Orleans, 1718). 
Early European colonies also included the thriving Dutch colony of New Netherland (settled 1626, present-day New York) and the small Swedish colony of New Sweden (settled 1638 in what became Delaware). British colonization of the East Coast began with the Virginia Colony (1607) and the Plymouth Colony (Massachusetts, 1620). The Mayflower Compact in Massachusetts and the Fundamental Orders of Connecticut established precedents for local representative self-governance and constitutionalism that would develop throughout the American colonies. While European settlers in what is now the United States experienced conflicts with Native Americans, they also engaged in trade, exchanging European tools for food and animal pelts.[o] Relations ranged from close cooperation to warfare and massacres. The colonial authorities often pursued policies that forced Native Americans to adopt European lifestyles, including conversion to Christianity. Along the eastern seaboard, settlers trafficked Africans through the Atlantic slave trade, largely to provide manual labor on plantations. The original Thirteen Colonies[p] that would later found the United States were administered as possessions of the British Empire by Crown-appointed governors, though local governments held elections open to most white male property owners. The colonial population grew rapidly from Maine to Georgia, eclipsing Native American populations; by the 1770s, the natural increase of the population was such that only a small minority of Americans had been born overseas. The colonies' distance from Britain facilitated the entrenchment of self-governance, and the First Great Awakening, a series of Christian revivals, fueled colonial interest in guaranteed religious liberty. Following its victory in the French and Indian War, Britain began to assert greater control over local affairs in the Thirteen Colonies, resulting in growing political resistance.
One of the primary grievances of the colonists was the denial of their rights as Englishmen, particularly the right to representation in the British government that taxed them. To demonstrate their dissatisfaction and resolve, the First Continental Congress met in 1774 and passed the Continental Association, a colonial boycott of British goods enforced by local "committees of safety" that proved effective. The subsequent British attempt to disarm the colonists resulted in the 1775 Battles of Lexington and Concord, igniting the American Revolutionary War. At the Second Continental Congress, the colonies appointed George Washington commander-in-chief of the Continental Army, and created a committee that named Thomas Jefferson to draft the Declaration of Independence. Two days after the Second Continental Congress passed the Lee Resolution to create an independent, sovereign nation, the Declaration was adopted on July 4, 1776. The political values of the American Revolution evolved from an armed rebellion demanding reform within an empire to a revolution that created a new social and governing system founded on the defense of liberty and the protection of inalienable natural rights; sovereignty of the people; republicanism over monarchy, aristocracy, and other hereditary political power; civic virtue; and an intolerance of political corruption. The Founding Fathers of the United States, who included Washington, Jefferson, John Adams, Benjamin Franklin, Alexander Hamilton, John Jay, James Madison, Thomas Paine, and many others, were inspired by Classical, Renaissance, and Enlightenment philosophies and ideas. Though in practical effect from their drafting in 1777, the Articles of Confederation were ratified in 1781 and formally established a decentralized government that operated until 1789. After the British surrender at the siege of Yorktown in 1781, American sovereignty was internationally recognized by the Treaty of Paris (1783), through which the U.S.
gained territory stretching west to the Mississippi River, north to present-day Canada, and south to Spanish Florida. The Northwest Ordinance (1787) established the precedent by which the country's territory would expand with the admission of new states, rather than the expansion of existing states. The U.S. Constitution was drafted at the 1787 Constitutional Convention to overcome the limitations of the Articles. It went into effect in 1789, creating a federal republic governed by three separate branches that together formed a system of checks and balances. George Washington was elected the country's first president under the Constitution, and the Bill of Rights was adopted in 1791 to allay skeptics' concerns about the power of the more centralized government. Washington's resignation as commander-in-chief after the Revolutionary War and his later refusal to run for a third term as president established a precedent for the supremacy of civil authority in the United States and the peaceful transfer of power. In the late 18th century, American settlers began to expand westward in larger numbers, many with a sense of manifest destiny. The Louisiana Purchase of 1803 from France nearly doubled the territory of the United States. Lingering issues with Britain remained, leading to the War of 1812, which was fought to a draw. Spain ceded Florida and its Gulf Coast territory in 1819. The Missouri Compromise of 1820, which admitted Missouri as a slave state and Maine as a free state, attempted to balance the desire of northern states to prevent the expansion of slavery into new territories with that of southern states to extend it there. Primarily, the compromise prohibited slavery in all other lands of the Louisiana Purchase north of the 36°30′ parallel. As Americans expanded further into territory inhabited by Native Americans, the federal government implemented policies of Indian removal or assimilation.
The most significant such legislation was the Indian Removal Act of 1830, a key policy of President Andrew Jackson. It resulted in the Trail of Tears (1830–1850), in which an estimated 60,000 Native Americans living east of the Mississippi River were forcibly removed and displaced to lands far to the west, causing 13,200 to 16,700 deaths along the forced march. Settler expansion as well as this influx of Indigenous peoples from the East resulted in the American Indian Wars west of the Mississippi. During the colonial period, slavery became legal in all of the Thirteen Colonies, but by 1770 it provided the main labor force in the large-scale, agriculture-dependent economies of the Southern Colonies from Maryland to Georgia. The practice began to be significantly questioned during the American Revolution, and spurred by an active abolitionist movement that had reemerged in the 1830s, states in the North enacted laws to prohibit slavery within their boundaries. At the same time, support for slavery had strengthened in Southern states, with widespread use of inventions such as the cotton gin (1793) having made slavery immensely profitable for Southern elites. The United States annexed the Republic of Texas in 1845, and the 1846 Oregon Treaty led to U.S. control of the present-day American Northwest. A dispute with Mexico over Texas led to the Mexican–American War (1846–1848). After the victory of the U.S., Mexico recognized U.S. sovereignty over Texas, New Mexico, and California in the 1848 Mexican Cession; the cession's lands also included the future states of Nevada, Colorado, and Utah. The California gold rush of 1848–1849 spurred a huge migration of white settlers to the Pacific coast, leading to even more confrontations with Native populations. One of the most violent, the California genocide of thousands of Native inhabitants, lasted into the mid-1870s. Additional western territories and states were created.
Throughout the 1850s, the sectional conflict regarding slavery was further inflamed by national legislation in the U.S. Congress and decisions of the Supreme Court. In Congress, the Fugitive Slave Act of 1850 mandated the forcible return to their owners in the South of slaves taking refuge in non-slave states, while the Kansas–Nebraska Act of 1854 effectively gutted the anti-slavery requirements of the Missouri Compromise. In its Dred Scott decision of 1857, the Supreme Court ruled against a slave brought into non-slave territory, simultaneously declaring the entire Missouri Compromise to be unconstitutional. These and other events exacerbated tensions between North and South that would culminate in the American Civil War (1861–1865). Beginning with South Carolina, 11 slave-state governments voted to secede from the United States in 1861, joining to create the Confederate States of America. All other state governments remained loyal to the Union.[q] War broke out in April 1861 after the Confederacy bombarded Fort Sumter. Following the Emancipation Proclamation on January 1, 1863, many freed slaves joined the Union army. The war began to turn in the Union's favor following the 1863 Siege of Vicksburg and Battle of Gettysburg, and the Confederates surrendered in 1865 after the Union's victory in the Battle of Appomattox Court House. Efforts toward reconstruction in the secessionist South had begun as early as 1862, but it was only after President Lincoln's assassination that the three Reconstruction Amendments to the Constitution were ratified to protect civil rights. The amendments codified nationally the abolition of slavery and involuntary servitude except as punishment for crimes, promised equal protection under the law for all persons, and prohibited discrimination on the basis of race or previous enslavement. As a result, African Americans took an active political role in ex-Confederate states in the decade following the Civil War. 
The former Confederate states were readmitted to the Union, beginning with Tennessee in 1866 and ending with Georgia in 1870. National infrastructure, including transcontinental telegraph and railroads, spurred growth in the American frontier. This was accelerated by the Homestead Acts, through which nearly 10 percent of the total land area of the United States was given away free to some 1.6 million homesteaders. From 1865 through 1917, an unprecedented stream of immigrants arrived in the United States, including 24.4 million from Europe. Most came through the Port of New York, as New York City and other large cities on the East Coast became home to large Jewish, Irish, and Italian populations. Many Northern Europeans as well as significant numbers of Germans and other Central Europeans moved to the Midwest. At the same time, about one million French Canadians migrated from Quebec to New England. During the Great Migration, millions of African Americans left the rural South for urban areas in the North. Alaska was purchased from Russia in 1867. The Compromise of 1877 is generally considered the end of the Reconstruction era, as it resolved the electoral crisis following the 1876 presidential election and led President Rutherford B. Hayes to reduce the role of federal troops in the South. Immediately, the Redeemers began evicting the Carpetbaggers and quickly regained local control of Southern politics in the name of white supremacy. African Americans endured a period of heightened, overt racism following Reconstruction, a time often considered the nadir of American race relations. A series of Supreme Court decisions, including Plessy v. Ferguson, emptied the Fourteenth and Fifteenth Amendments of their force, allowing Jim Crow laws in the South to remain unchecked, sundown towns in the Midwest, and segregation in communities across the country, which would be reinforced in part by the policy of redlining later adopted by the federal Home Owners' Loan Corporation. 
An explosion of technological advancement, accompanied by the exploitation of cheap immigrant labor, led to rapid economic expansion during the Gilded Age of the late 19th century. It continued into the early 20th century, by which time the United States had outpaced the economies of Britain, France, and Germany combined. This fostered the amassing of power by a few prominent industrialists, largely by their formation of trusts and monopolies to prevent competition. Tycoons led the nation's expansion in the railroad, petroleum, and steel industries. The United States emerged as a pioneer of the automotive industry. These changes resulted in significant increases in economic inequality, slum conditions, and social unrest, creating the environment for labor unions and socialist movements to begin to flourish. This period eventually ended with the advent of the Progressive Era, which was characterized by significant economic and social reforms. Pro-American elements in Hawaii overthrew the Hawaiian monarchy; the islands were annexed in 1898. That same year, Puerto Rico, the Philippines, and Guam were ceded to the U.S. by Spain after the latter's defeat in the Spanish–American War. (The Philippines was granted full independence from the U.S. on July 4, 1946, following World War II. Puerto Rico and Guam have remained U.S. territories.) American Samoa was acquired by the United States in 1900 after the Second Samoan Civil War. The U.S. Virgin Islands were purchased from Denmark in 1917. The United States entered World War I alongside the Allies in 1917, helping to turn the tide against the Central Powers. In 1920, a constitutional amendment granted nationwide women's suffrage. During the 1920s and 1930s, radio for mass communication and early television transformed communications nationwide. The Wall Street Crash of 1929 triggered the Great Depression, to which President Franklin D.
Roosevelt responded with the New Deal plan of "reform, recovery and relief", a series of unprecedented and sweeping recovery programs and employment relief projects combined with financial reforms and regulations. Initially neutral during World War II, the U.S. began supplying war materiel to the Allies of World War II in March 1941 and entered the war in December after Japan's attack on Pearl Harbor. Agreeing to a "Europe first" policy, the U.S. concentrated its wartime efforts on Japan's allies Italy and Germany until their final defeat in May 1945. The U.S. developed the first nuclear weapons and used them against the Japanese cities of Hiroshima and Nagasaki in August 1945, ending the war. The United States was one of the "Four Policemen" who met to plan the post-war world, alongside the United Kingdom, the Soviet Union, and China. The U.S. emerged relatively unscathed from the war, with even greater economic power and international political influence. The end of World War II in 1945 left the U.S. and the Soviet Union as superpowers, each with its own political, military, and economic sphere of influence. Geopolitical tensions between the two superpowers soon led to the Cold War. The U.S. implemented a policy of containment intended to limit the Soviet Union's sphere of influence; engaged in regime change against governments perceived to be aligned with the Soviets; and prevailed in the Space Race, which culminated with the first crewed Moon landing in 1969. Domestically, the U.S. experienced economic growth, urbanization, and population growth following World War II. The civil rights movement emerged, with Martin Luther King Jr. becoming a prominent leader in the early 1960s. The Great Society plan of President Lyndon B. Johnson's administration resulted in groundbreaking and broad-reaching laws, policies and a constitutional amendment to counteract some of the worst effects of lingering institutional racism. The counterculture movement in the U.S. 
brought significant social changes, including the liberalization of attitudes toward recreational drug use and sexuality. It also encouraged open defiance of the military draft (leading to the end of conscription in 1973) and wide opposition to U.S. intervention in Vietnam, with the U.S. withdrawing completely in 1975. A societal shift in the roles of women was significantly responsible for the large increase in female paid labor participation starting in the 1970s, and by 1985 the majority of American women aged 16 and older were employed. The fall of communism and the dissolution of the Soviet Union from 1989 to 1991 marked the end of the Cold War and left the United States as the world's sole superpower. This cemented the United States' global influence, reinforcing the concept of the "American Century" as the U.S. dominated international political, cultural, economic, and military affairs. The 1990s saw the longest recorded economic expansion in American history, a dramatic decline in U.S. crime rates, and advances in technology. Throughout this decade, technological innovations such as the World Wide Web, the evolution of the Pentium microprocessor in accordance with Moore's law, rechargeable lithium-ion batteries, the first gene therapy trial, and cloning either emerged in the U.S. or were improved upon there. The Human Genome Project was formally launched in 1990, while Nasdaq became the first stock market in the United States to trade online in 1998. In the Gulf War of 1991, an American-led international coalition of states expelled an Iraqi invasion force that had occupied neighboring Kuwait. The September 11 attacks on the United States in 2001 by the pan-Islamist militant organization al-Qaeda led to the war on terror and subsequent military interventions in Afghanistan and in Iraq. The U.S. housing bubble culminated in 2007 with the Great Recession, the largest economic contraction since the Great Depression.
In the 2010s and early 2020s, the United States has experienced increased political polarization and democratic backsliding. The country's polarization was violently reflected in the January 2021 Capitol attack, when a mob of insurrectionists entered the U.S. Capitol and sought to prevent the peaceful transfer of power in an attempted self-coup d'état. Geography The United States is the world's third-largest country by total area behind Russia and Canada.[c] The 48 contiguous states and the District of Columbia have a combined area of 3,119,885 square miles (8,080,470 km2). In 2021, the United States had 8% of the Earth's permanent meadows and pastures and 10% of its cropland. Starting in the east, the coastal plain of the Atlantic seaboard gives way to inland forests and rolling hills in the Piedmont plateau region. The Appalachian Mountains and the Adirondack Massif separate the East Coast from the Great Lakes and the grasslands of the Midwest. The Mississippi River System, the world's fourth-longest river system, runs predominantly north–south through the center of the country. The flat and fertile prairie of the Great Plains stretches to the west, interrupted by a highland region in the southeast. The Rocky Mountains, west of the Great Plains, extend north to south across the country, peaking at over 14,000 feet (4,300 m) in Colorado. The supervolcano underlying Yellowstone National Park in the Rocky Mountains, the Yellowstone Caldera, is the continent's largest volcanic feature. Farther west are the rocky Great Basin and the Chihuahuan, Sonoran, and Mojave deserts. In the northwest corner of Arizona, carved by the Colorado River, is the Grand Canyon, a steep-sided canyon and popular tourist destination known for its overwhelming visual size and intricate, colorful landscape. The Cascade and Sierra Nevada mountain ranges run close to the Pacific coast. 
The lowest and highest points in the contiguous United States are in the State of California, about 84 miles (135 km) apart. At an elevation of 20,310 feet (6,190.5 m), Alaska's Denali (also called Mount McKinley) is the highest peak in the country and on the continent. Active volcanoes are common throughout Alaska's Alexander and Aleutian Islands. Located entirely outside North America, the archipelago of Hawaii consists of volcanic islands, physiographically and ethnologically part of the Polynesian subregion of Oceania. In addition to its total land area, the United States has one of the world's largest marine exclusive economic zones, spanning approximately 4.5 million square miles (11.7 million km2) of ocean. With its large size and geographic variety, the United States includes most climate types. East of the 100th meridian, the climate ranges from humid continental in the north to humid subtropical in the south. The western Great Plains are semi-arid. Many mountainous areas of the American West have an alpine climate. The climate is arid in the Southwest, Mediterranean in coastal California, and oceanic in coastal Oregon, Washington, and southern Alaska. Most of Alaska is subarctic or polar. Hawaii, the southern tip of Florida, and U.S. territories in the Caribbean and Pacific are tropical. The United States experiences more high-impact extreme weather incidents than any other country. States bordering the Gulf of Mexico are prone to hurricanes, and most of the world's tornadoes occur in the country, mainly in Tornado Alley. Due to climate change, extreme weather has become more frequent in the U.S. in the 21st century, with three times the number of reported heat waves compared to the 1960s. Since the 1990s, droughts in the American Southwest have become more persistent and more severe. The regions most attractive to the population are also the most vulnerable. The U.S.
is one of 17 megadiverse countries containing large numbers of endemic species: about 17,000 species of vascular plants occur in the contiguous United States and Alaska, and over 1,800 species of flowering plants are found in Hawaii, few of which occur on the mainland. The United States is home to 428 mammal species, 784 birds, 311 reptiles, 295 amphibians, and around 91,000 insect species. There are 63 national parks, and hundreds of other federally managed monuments, forests, and wilderness areas, administered by the National Park Service and other agencies. About 28% of the country's land is publicly owned and federally managed, primarily in the Western States. Most of this land is protected, though some is leased for commercial use, and less than one percent is used for military purposes. Environmental issues in the United States include debates on non-renewable resources and nuclear energy, air and water pollution, biodiversity, logging and deforestation, and climate change. The U.S. Environmental Protection Agency (EPA) is the federal agency charged with addressing most environmental-related issues. The idea of wilderness has shaped the management of public lands since 1964, with the Wilderness Act. The Endangered Species Act of 1973 provides a way to protect threatened and endangered species and their habitats. The United States Fish and Wildlife Service implements and enforces the Act. In 2024, the U.S. ranked 35th among 180 countries in the Environmental Performance Index. Government and politics The United States is a federal republic of 50 states and a federal capital district, Washington, D.C. The U.S. asserts sovereignty over five unincorporated territories and several uninhabited island possessions. It is the world's oldest surviving federation, and its presidential system of federal government has been adopted, in whole or in part, by many newly independent states worldwide following their decolonization. 
The Constitution of the United States serves as the country's supreme legal document. Most scholars describe the United States as a liberal democracy.[r] Composed of three branches, all headquartered in Washington, D.C., the federal government is the national government of the United States. The U.S. Constitution establishes a separation of powers intended to provide a system of checks and balances to prevent any of the three branches from becoming supreme. The three-branch system is known as the presidential system, in contrast to the parliamentary system where the executive is part of the legislative body. Many countries around the world adopted this aspect of the 1789 Constitution of the United States, especially in the postcolonial Americas. In the U.S. federal system, sovereign powers are shared between three levels of government specified in the Constitution: the federal government, the states, and Indian tribes. The U.S. also asserts sovereignty over five permanently inhabited territories: American Samoa, Guam, the Northern Mariana Islands, Puerto Rico, and the U.S. Virgin Islands. Residents of the 50 states are governed by their elected state government, under state constitutions compatible with the national constitution, and by elected local governments that are administrative divisions of a state. States are subdivided into counties or county equivalents, and (except for Hawaii) further divided into municipalities, each administered by elected representatives. The District of Columbia is a federal district containing the U.S. capital, Washington, D.C. The federal district is an administrative division of the federal government. Indian country is made up of 574 federally recognized tribes and 326 Indian reservations. They hold a government-to-government relationship with the U.S. federal government in Washington and are legally defined as domestic dependent nations with inherent tribal sovereignty rights. In addition to the five major territories, the U.S. 
also asserts sovereignty over the United States Minor Outlying Islands in the Pacific Ocean and the Caribbean. The seven undisputed islands without permanent populations are Baker Island, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Atoll, and Palmyra Atoll. U.S. sovereignty over the unpopulated Bajo Nuevo Bank, Navassa Island, Serranilla Bank, and Wake Island is disputed.

The Constitution is silent on political parties; however, they developed independently in the 18th century with the Federalist and Anti-Federalist parties. Since then, the United States has operated as a de facto two-party system, though the parties themselves have changed over time. Since the mid-19th century, the two main national parties have been the Democratic Party and the Republican Party; the former is perceived as relatively liberal in its political platform, the latter as relatively conservative.

The United States has an established structure of foreign relations, with the world's second-largest diplomatic corps as of 2024. It is a permanent member of the United Nations Security Council and home to the United Nations headquarters. The United States is a member of the G7, G20, and OECD intergovernmental organizations. Almost all countries have embassies, and many have consulates (official representatives), in the country. Likewise, nearly all countries maintain formal diplomatic missions with the United States, the exceptions being Iran, North Korea, and Bhutan. Though Taiwan does not have formal diplomatic relations with the U.S., it maintains close unofficial relations, and the United States regularly supplies Taiwan with military equipment to deter potential Chinese aggression. U.S. geopolitical attention has also turned to the Indo-Pacific, where the United States joined the Quadrilateral Security Dialogue with Australia, India, and Japan.
The United States has a "Special Relationship" with the United Kingdom and strong ties with Canada, Australia, New Zealand, the Philippines, Japan, South Korea, Israel, and several European Union countries such as France, Italy, Germany, Spain, and Poland. The U.S. works closely with its NATO allies on military and national security issues, and with countries in the Americas through the Organization of American States and the United States–Mexico–Canada Agreement (USMCA). The U.S. exercises full international defense authority and responsibility for Micronesia, the Marshall Islands, and Palau through the Compact of Free Association. It has increasingly pursued strategic cooperation with India, while its ties with China have steadily deteriorated. Beginning in 2014, the U.S. became a key ally of Ukraine. After Donald Trump was elected U.S. president in 2024, he sought to negotiate an end to the Russo-Ukrainian War; he paused all military aid to Ukraine in March 2025, although the aid later resumed, and ended U.S. intelligence sharing with the country, though this too was eventually restored.

The president is the commander-in-chief of the United States Armed Forces and appoints its leaders, the secretary of defense and the Joint Chiefs of Staff. The Department of Defense, headquartered at the Pentagon near Washington, D.C., administers five of the six service branches: the U.S. Army, Marine Corps, Navy, Air Force, and Space Force. The Coast Guard is administered by the Department of Homeland Security in peacetime and can be transferred to the Department of the Navy in wartime. The total strength of the military is about 1.3 million active-duty personnel, with an additional 400,000 in reserve. The United States spent $997 billion on its military in 2024, by far the largest amount of any country, making up 37% of global military spending and accounting for 3.4% of the country's GDP. The U.S.
possesses 42% of the world's nuclear weapons—the second-largest stockpile after that of Russia. The U.S. military is widely regarded as the most powerful and advanced in the world. The United States has the third-largest combined armed forces in the world, behind the Chinese People's Liberation Army and the Indian Armed Forces. The U.S. military operates about 800 bases and facilities abroad and maintains deployments of more than 100 active-duty personnel in 25 foreign countries. The United States has engaged in over 400 military interventions since its founding in 1776, with over half of these occurring between 1950 and 2019 and 25% occurring in the post-Cold War era.

State defense forces (SDFs) are military units that operate under the sole authority of a state government. SDFs are authorized by state and federal law but are under the command of the state's governor. By contrast, the 54 U.S. National Guard organizations[t] fall under the dual control of state or territorial governments and the federal government; their units can also become federalized entities, but SDFs cannot be federalized. The National Guard personnel of a state or territory can be federalized by the president under the National Defense Act Amendments of 1933; this legislation created the Guard in its modern form and provides for the integration of Army National Guard and Air National Guard units and personnel into the U.S. Army and (since 1947) the U.S. Air Force. The total number of National Guard members is about 430,000, while the estimated combined strength of SDFs is less than 10,000.

There are about 18,000 police agencies, from the local to the national level, in the United States. Law in the United States is mainly enforced by local police departments and sheriff's departments in their municipal or county jurisdictions. State police departments have authority in their respective states, while federal agencies such as the Federal Bureau of Investigation (FBI) and the U.S.
Marshals Service have national jurisdiction and specialized duties, such as protecting civil rights and national security, enforcing U.S. federal courts' rulings and federal laws, and combating interstate criminal activity. State courts conduct almost all civil and criminal trials, while federal courts adjudicate the much smaller number of civil and criminal cases that relate to federal law.

There is no unified "criminal justice system" in the United States. The American prison system is highly heterogeneous, with thousands of relatively independent systems operating across the federal, state, local, and tribal levels. In 2025, "these systems hold nearly 2 million people in 1,566 state prisons, 98 federal prisons, 3,116 local jails, 1,277 juvenile correctional facilities, 133 immigration detention facilities, and 80 Indian country jails, as well as in military prisons, civil commitment centers, state psychiatric hospitals, and prisons in the U.S. territories." Despite these disparate systems of confinement, four main institutions dominate: federal prisons, state prisons, local jails, and juvenile correctional facilities. Federal prisons are run by the Federal Bureau of Prisons and hold pretrial detainees as well as people convicted of federal crimes. State prisons, run by each state's department of corrections, hold people sentenced to and serving prison time (usually longer than one year) for felony offenses. Local jails are county or municipal facilities that incarcerate defendants prior to trial; they also hold those serving short sentences (typically under a year). Juvenile correctional facilities are operated by local or state governments and serve as longer-term placements for minors adjudicated as delinquent and ordered by a judge to be confined.
In January 2023, the United States had the sixth-highest per capita incarceration rate in the world—531 people per 100,000 inhabitants—and the largest prison and jail population in the world, with more than 1.9 million people incarcerated. An analysis of the World Health Organization Mortality Database from 2010 showed U.S. homicide rates "were 7 times higher than in other high-income countries, driven by a gun homicide rate that was 25 times higher".

Economy

The U.S. has a highly developed mixed economy that has been the world's largest nominally since about 1890. Its 2024 gross domestic product (GDP)[e] of more than $29 trillion constituted over 25% of nominal global economic output, or 15% at purchasing power parity (PPP). From 1983 to 2008, U.S. real compounded annual GDP growth was 3.3%, compared to a 2.3% weighted average for the rest of the G7. The country ranks first in the world by nominal GDP, second when adjusted for purchasing power parity (PPP), and ninth by PPP-adjusted GDP per capita. In February 2024, the total U.S. federal government debt was $34.4 trillion. Of the world's 500 largest companies by revenue, 138 were headquartered in the U.S. in 2025, the highest number of any country.

The U.S. dollar is the currency most used in international transactions and the world's foremost reserve currency, backed by the country's dominant economy, its military, the petrodollar system, its large Treasury securities market, and the linked eurodollar market. Several countries use it as their official currency, and in others it is the de facto currency. The U.S. has free trade agreements with several countries and is party to the United States–Mexico–Canada Agreement (USMCA). Although the United States has reached a post-industrial level of economic development and is often described as having a service economy, it remains a major industrial power; in 2024, the U.S. manufacturing sector was the world's second-largest by value output after China's.
New York City is the world's principal financial center, and its metropolitan area is the world's largest metropolitan economy. The New York Stock Exchange and Nasdaq, both located in New York City, are the world's two largest stock exchanges by market capitalization and trade volume. The United States is at the forefront of technological advancement and innovation in many economic fields, especially artificial intelligence; electronics and computers; pharmaceuticals; and medical, aerospace, and military equipment. The country's economy is fueled by abundant natural resources, well-developed infrastructure, and high productivity. The largest trading partners of the United States are the European Union, Mexico, Canada, China, Japan, South Korea, the United Kingdom, Vietnam, India, and Taiwan. The United States is the world's largest importer and second-largest exporter.[u] It is by far the world's largest exporter of services.

Americans have the highest average household and employee income among OECD member states, and had the fourth-highest median household income in 2023, up from sixth-highest in 2013. With personal consumption expenditures of over $18.5 trillion in 2023, the U.S. has a heavily consumer-driven economy and is the world's largest consumer market. The U.S. ranked first in the number of dollar billionaires and millionaires in 2023, with 735 billionaires and nearly 22 million millionaires. Wealth in the United States is highly concentrated; in 2011, the richest 10% of the adult population owned 72% of the country's household wealth, while the bottom 50% owned just 2%. U.S. wealth inequality has increased substantially since the late 1980s, and income inequality in the U.S. reached a record high in 2019. In 2024, the country had some of the highest wealth and income inequality levels among OECD countries. Since the 1970s, U.S. wage gains have become decoupled from worker productivity.
In 2016, the top fifth of earners took home more than half of all income, giving the U.S. one of the widest income distributions among OECD countries. There were about 771,480 homeless persons in the U.S. in 2024. In 2022, 6.4 million children experienced food insecurity. Feeding America estimates that around one in five, or approximately 13 million, children experience hunger in the U.S. and do not know where or when they will get their next meal. Also in 2022, about 37.9 million people, or 11.5% of the U.S. population, were living in poverty.

The United States has a smaller welfare state and redistributes less income through government action than most other high-income countries. It is the only advanced economy that does not guarantee its workers paid vacation nationally and one of a few countries in the world without federal paid family leave as a legal right. The United States has a higher percentage of low-income workers than almost any other developed country, largely because of a weak collective bargaining system and a lack of government support for at-risk workers.

The United States has been a leader in technological innovation since the late 19th century and in scientific research since the mid-20th century. Methods for producing interchangeable parts and the establishment of a machine tool industry enabled the large-scale manufacturing of U.S. consumer products in the late 19th century. By the early 20th century, factory electrification, the introduction of the assembly line, and other labor-saving techniques created the system of mass production. In the 21st century, the United States continues to be one of the world's foremost scientific powers, though China has emerged as a major competitor in many fields. The U.S. has the highest research and development expenditures of any country, ranking ninth as a percentage of GDP. In 2022, the United States was the country with the second-highest number of published scientific papers, after China.
In 2021, the U.S. ranked second (also after China) in the number of patent applications, and third in trademark and industrial design applications (after China and Germany), according to World Intellectual Property Indicators. In 2025, the United States ranked third (after Switzerland and Sweden) in the Global Innovation Index. The United States is considered a world leader in the development of artificial intelligence technology, and in 2023, it was ranked the second most technologically advanced country in the world (after South Korea) by Global Finance magazine.

The United States has maintained a space program since the late 1950s, beginning with the establishment of the National Aeronautics and Space Administration (NASA) in 1958. NASA's Apollo program (1961–1972) achieved the first crewed Moon landing with the 1969 Apollo 11 mission, which remains one of the agency's most significant milestones. Other major NASA endeavors include the Space Shuttle program (1981–2011), the Voyager program (1972–present), the Hubble and James Webb space telescopes (launched in 1990 and 2021, respectively), and the multi-mission Mars Exploration Program (Spirit and Opportunity, Curiosity, and Perseverance). NASA is one of five agencies collaborating on the International Space Station (ISS); U.S. contributions to the ISS include several modules, among them Destiny (2001), Harmony (2007), and Tranquility (2010), as well as ongoing logistical and operational support.

The U.S. private sector dominates the global commercial spaceflight industry. Prominent American spaceflight contractors include Blue Origin, Boeing, Lockheed Martin, Northrop Grumman, and SpaceX. NASA programs such as the Commercial Crew Program, Commercial Resupply Services, Commercial Lunar Payload Services, and NextSTEP have facilitated growing private-sector involvement in American spaceflight.
In 2023, the United States received approximately 84% of its energy from fossil fuels; its largest source of energy was petroleum (38%), followed by natural gas (36%), renewable sources (9%), coal (9%), and nuclear power (9%). In 2022, the United States constituted about 4% of the world's population but consumed around 16% of the world's energy. The U.S. ranks as the second-highest emitter of greenhouse gases, behind China. The U.S. is the world's largest producer of nuclear power, generating around 30% of the world's nuclear electricity, and has the highest number of nuclear power reactors of any country. As of 2024, the U.S. plans to triple its nuclear power capacity by 2050.

The United States' 4 million miles (6.4 million kilometers) of road network, owned almost entirely by state and local governments, is the longest in the world. The extensive Interstate Highway System, which connects all major U.S. cities, is funded mostly by the federal government but maintained by state departments of transportation. The system is further extended by state highways and some private toll roads. The U.S. was among the top ten countries in vehicle ownership per capita (850 vehicles per 1,000 people) in 2022. A 2022 study found that 76% of U.S. commuters drive alone and 14% ride a bicycle, including bike owners and users of bike-sharing networks. About 11% use some form of public transportation. Public transportation in the United States is well developed in the largest urban areas, notably New York City, Washington, D.C., Boston, Philadelphia, Chicago, and San Francisco; otherwise, coverage is generally less extensive than in most other developed countries, and many U.S. localities are relatively car-dependent. Long-distance intercity travel is provided primarily by airlines, though travel by rail is more common along the Northeast Corridor, home to the only high-speed rail service in the U.S. that meets international standards.
Amtrak, the country's government-sponsored national passenger rail company, has a relatively sparse network compared to those of Western European countries. Service is concentrated in the Northeast, California, the Midwest, the Pacific Northwest, and Virginia and the Southeast.

The United States has an extensive air transportation network, and its civilian airlines are all privately owned. The three largest airlines in the world by total number of passengers carried are U.S.-based; American Airlines became the global leader after its 2013 merger with US Airways. Of the 50 busiest airports in the world, 16 are in the United States, including five of the top 10; the world's busiest airport by passenger volume is Hartsfield–Jackson Atlanta International in Atlanta, Georgia. In 2022, most of the 19,969 U.S. airports were owned and operated by local government authorities, though there are also some private airports. Some 5,193 are designated as "public use", including for general aviation. The Transportation Security Administration (TSA) has provided security at most major airports since 2001.

The country's rail transport network, the longest in the world at 182,412.3 mi (293,564.2 km), handles mostly freight, in contrast to the more passenger-centered rail networks of Europe. Because they are often privately owned operations, U.S. railroads lag behind those of the rest of the world in electrification. The country's inland waterways are the world's fifth-longest, totaling 25,482 mi (41,009 km); they are used extensively for freight, recreation, and a small amount of passenger traffic. Of the world's 50 busiest container ports, four are located in the United States, the busiest among them being the Port of Los Angeles.

Demographics

The U.S. Census Bureau reported 331,449,281 residents on April 1, 2020,[v] making the United States the third-most-populous country in the world, after India and China.
The Census Bureau's official 2025 population estimate was 341,784,857, an increase of 3.1% since the 2020 census. According to the Bureau's U.S. Population Clock, on July 1, 2024, the U.S. population had a net gain of one person every 16 seconds, or about 5,400 people per day. In 2023, 51% of Americans age 15 and over were married, 6% were widowed, 10% were divorced, and 34% had never been married. In 2023, the total fertility rate for the U.S. stood at 1.6 children per woman, and, at 23%, the country had the world's highest rate of children living in single-parent households in 2019. Most Americans live in the suburbs of major metropolitan areas.

The United States has a diverse population; 37 ancestry groups have more than one million members. White Americans with ancestry from Europe, the Middle East, or North Africa form the largest racial and ethnic group, at 57.8% of the United States population. Hispanic and Latino Americans form the second-largest group, at 18.7% of the population; African Americans constitute the third-largest, at 12.1%; and Asian Americans the fourth-largest, at 5.9%. The country's 3.7 million Native Americans account for about 1%, and some 574 native tribes are recognized by the federal government. In 2024, the median age of the United States population was 39.1 years.

While many languages and dialects are spoken in the United States, English is by far the most commonly spoken and written. English has long been the country's de facto official language, and in 2025, Executive Order 14224 declared English official. The U.S. has never had a de jure official language, however, as Congress has never passed a law designating English as official across all three federal branches. Some laws, such as U.S. naturalization requirements, nonetheless standardize English.
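The Population Clock figure above is simple rate arithmetic; a minimal sketch of the calculation, using only the 16-second net-gain interval reported in the text:

```python
# Derive the daily figure from the reported Population Clock rate:
# one net new resident every 16 seconds.
SECONDS_PER_DAY = 24 * 60 * 60      # 86,400 seconds in a day
SECONDS_PER_PERSON = 16             # reported net-gain interval

people_per_day = SECONDS_PER_DAY // SECONDS_PER_PERSON
print(people_per_day)               # 5400, matching "about 5,400 people per day"
```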
Twenty-eight states and the United States Virgin Islands have laws that designate English as the sole official language; 19 states and the District of Columbia have no official language. Three states and four U.S. territories have recognized local or indigenous languages in addition to English: Hawaii (Hawaiian), Alaska (twenty Native languages),[w] South Dakota (Sioux), American Samoa (Samoan), Puerto Rico (Spanish), Guam (Chamorro), and the Northern Mariana Islands (Carolinian and Chamorro). In total, 169 Native American languages are spoken in the United States. In Puerto Rico, Spanish is more widely spoken than English.

According to the American Community Survey (2020), some 245.4 million people in the U.S. age five and older spoke only English at home. About 41.2 million spoke Spanish at home, making it the second most commonly used language. Other languages spoken at home by one million people or more include Chinese (3.40 million), Tagalog (1.71 million), Vietnamese (1.52 million), Arabic (1.39 million), French (1.18 million), Korean (1.07 million), and Russian (1.04 million). German, spoken by 1 million people at home in 2010, fell to 857,000 total speakers in 2020.

America's immigrant population is by far the world's largest in absolute terms. In 2022, there were 87.7 million immigrants and U.S.-born children of immigrants in the United States, accounting for nearly 27% of the overall U.S. population. In 2017, out of the U.S. foreign-born population, some 45% (20.7 million) were naturalized citizens, 27% (12.3 million) were lawful permanent residents, 6% (2.2 million) were temporary lawful residents, and 23% (10.5 million) were unauthorized immigrants. In 2019, the top countries of origin for immigrants were Mexico (24% of immigrants), India (6%), China (5%), the Philippines (4.5%), and El Salvador (3%). In fiscal year 2022, over one million immigrants (most of whom entered through family reunification) were granted legal residence.
The undocumented immigrant population in the U.S. reached a record high of 14 million in 2023.

The First Amendment guarantees the free exercise of religion in the country and forbids Congress from passing laws respecting its establishment. Religious practice is widespread, among the most diverse in the world, and profoundly vibrant. The country has the world's largest Christian population, which includes the fourth-largest population of Catholics. Other notable faiths include Judaism, Buddhism, Hinduism, Islam, New Age, and Native American religions. Religious practice varies significantly by region, and "ceremonial deism" is common in American culture. The overwhelming majority of Americans believe in a higher power or spiritual force, engage in spiritual practices such as prayer, and consider themselves religious or spiritual. In the Southern United States' "Bible Belt", evangelical Protestantism plays a significant cultural role; New England and the Western United States tend to be more secular. Mormonism, a Restorationist movement founded in the U.S. in 1830, is the predominant religion in Utah and a major religion in Idaho.

About 82% of Americans live in metropolitan areas, particularly in suburbs; about half of those reside in cities with populations over 50,000. In 2022, 333 incorporated municipalities had populations over 100,000, nine cities had more than one million residents, and four cities—New York City, Los Angeles, Chicago, and Houston—had populations exceeding two million. Many U.S. metropolitan populations are growing rapidly, particularly in the South and West.

According to the Centers for Disease Control and Prevention (CDC), average U.S. life expectancy at birth reached 79.0 years in 2024, its highest recorded level and an increase of 0.6 years over 2023.
The CDC attributed the improvement to a significant fall in the number of fatal drug overdoses in the country, noting that "heart disease continues to be the leading cause of death in the United States, followed by cancer and unintentional injuries." In 2024, life expectancy at birth for American men rose to 76.5 years (+0.7 years compared to 2023), while life expectancy for women was 81.4 years (+0.3 years). Starting in 1998, life expectancy in the U.S. fell behind that of other wealthy industrialized countries, and Americans' "health disadvantage" gap has been widening ever since. The Commonwealth Fund reported in 2020 that the U.S. had the highest suicide rate among high-income countries. Approximately one-third of the U.S. adult population is obese and another third is overweight.

The U.S. healthcare system far outspends that of any other country, measured both in per capita spending and as a percentage of GDP, yet attains worse healthcare outcomes than peer countries for reasons that are debated. The United States is the only developed country without a system of universal healthcare, and a significant proportion of its population does not carry health insurance. Government-funded healthcare coverage for the poor (Medicaid) and for those age 65 and older (Medicare) is available to Americans who meet the programs' income or age qualifications. In 2010, President Barack Obama signed the Patient Protection and Affordable Care Act into law.[x] Abortion in the United States is not federally protected and is illegal or restricted in 17 states.

American primary and secondary education, known in the U.S. as K–12 ("kindergarten through 12th grade"), is decentralized. School systems are operated by state, territorial, and sometimes municipal governments and regulated by the U.S. Department of Education.
In general, children are required to attend school or an approved homeschool from the age of five or six (kindergarten or first grade) until they are 18 years old. This usually brings students through the 12th grade, the final year of a U.S. high school, though some states and territories allow them to leave school earlier, at age 16 or 17. The U.S. spends more on education per student than any other country, an average of $18,614 per public elementary and secondary school student in 2020–2021. Among Americans age 25 and older, 92.2% graduated from high school, 62.7% attended some college, 37.7% earned a bachelor's degree, and 14.2% earned a graduate degree. The U.S. literacy rate is near-universal, and the country has produced the most Nobel Prize winners of any country, with 411 laureates (who have won 413 awards).

U.S. tertiary education has earned a global reputation: many of the world's top universities, as listed by various ranking organizations, are in the United States, including 19 of the top 25. American higher education is dominated by state university systems, although the country's many private universities and colleges enroll about 20% of all American students. Local community colleges generally offer open admissions, lower tuition, and coursework leading to a two-year associate degree or a non-degree certificate. In public expenditure on higher education, the U.S. spends more per student than the OECD average, and Americans spend more than any other nation in combined public and private spending. Colleges and universities directly funded by the federal government, such as the U.S. service academies, the Naval Postgraduate School, and the military staff colleges, do not charge tuition and are limited to military personnel and government employees. Despite some student loan forgiveness programs, student loan debt increased by 102% between 2010 and 2020 and exceeded $1.7 trillion in 2022.
Culture and society

The United States is home to a wide variety of ethnic groups, traditions, and customs. The country has been described as embodying the values of individualism and personal autonomy, as well as a strong work ethic and competitiveness. Voluntary altruism toward others also plays a major role; according to a 2016 study by the Charities Aid Foundation, Americans donated 1.44% of total GDP to charity, the highest rate in the world by a large margin. Americans have traditionally been characterized by a unifying political belief in an "American Creed" emphasizing consent of the governed, liberty, equality under the law, democracy, social equality, property rights, and a preference for limited government. The U.S. has acquired significant hard and soft power through its diplomatic influence, economic power, military alliances, and cultural exports such as American movies, music, video games, sports, and food. The influence the United States exerts on other countries through soft power is referred to as Americanization.

Nearly all present-day Americans or their ancestors came from Europe, Africa, or Asia (the "Old World") within the past five centuries. Mainstream American culture is a Western culture largely derived from the traditions of European immigrants, with influences from many other sources, such as traditions brought by slaves from Africa. More recent immigration from Asia and especially Latin America has added to a cultural mix that has been described both as a homogenizing melting pot and as a heterogeneous salad bowl, with immigrants contributing to, and often assimilating into, mainstream American culture.

Under the First Amendment to the Constitution, the United States is considered to have the strongest protections of free speech of any country; flag desecration, hate speech, blasphemy, and lese majesty are all forms of protected expression.
A 2016 Pew Research Center poll found that Americans were the most supportive of free expression of any polity measured, and the "most supportive of freedom of the press and the right to use the Internet without government censorship". The U.S. is a socially progressive country with permissive attitudes surrounding human sexuality, and LGBTQ rights in the United States are among the most advanced by global standards. The American Dream, or the perception that Americans enjoy high levels of social mobility, plays a key role in attracting immigrants; whether this perception is accurate has been a topic of debate. While mainstream culture holds that the United States is a classless society, scholars identify significant differences between the country's social classes, affecting socialization, language, and values. Americans tend to greatly value socioeconomic achievement, but being ordinary or average is promoted by some as a noble condition as well.

The National Foundation on the Arts and the Humanities is an agency of the United States federal government established in 1965 to "develop and promote a broadly conceived national policy of support for the humanities and the arts in the United States, and for institutions which preserve the cultural heritage of the United States." It is composed of four sub-agencies, including the National Endowment for the Arts and the National Endowment for the Humanities.

Colonial American authors were influenced by John Locke and other Enlightenment philosophers. The American Revolutionary Period (1765–1783) is notable for the political writings of Benjamin Franklin, Alexander Hamilton, Thomas Paine, and Thomas Jefferson. Shortly before and after the Revolutionary War, the newspaper rose to prominence, filling a demand for anti-British national literature. An early novel is William Hill Brown's The Power of Sympathy, published in 1789.
Writer and critic John Neal in the early- to mid-19th century helped advance America toward a unique literature and culture by criticizing predecessors such as Washington Irving for imitating their British counterparts, and by influencing writers such as Edgar Allan Poe, who took American poetry and short fiction in new directions. Ralph Waldo Emerson and Margaret Fuller pioneered the influential Transcendentalism movement; Henry David Thoreau, author of Walden, was influenced by this movement. The conflict surrounding abolitionism inspired writers, like Harriet Beecher Stowe, and authors of slave narratives, such as Frederick Douglass. Nathaniel Hawthorne's The Scarlet Letter (1850) explored the dark side of American history, as did Herman Melville's Moby-Dick (1851). Major American poets of the 19th century American Renaissance include Walt Whitman, Melville, and Emily Dickinson. Mark Twain was the first major American writer to be born in the West. Henry James achieved international recognition with novels like The Portrait of a Lady (1881). As literacy rates rose, periodicals published more stories centered around industrial workers, women, and the rural poor. Naturalism, regionalism, and realism were the major literary movements of the period. While modernism generally took on an international character, modernist authors working within the United States more often rooted their work in specific regions, peoples, and cultures. Following the Great Migration to northern cities, African-American and black West Indian authors of the Harlem Renaissance developed an independent tradition of literature that rebuked a history of inequality and celebrated black culture. An important cultural export during the Jazz Age, these writings were a key influence on Négritude, a philosophy emerging in the 1930s among francophone writers of the African diaspora. 
In the 1950s, an ideal of homogeneity led many authors to attempt to write the Great American Novel, while the Beat Generation rejected this conformity, using styles that elevated the impact of the spoken word over mechanics to describe drug use, sexuality, and the failings of society. Contemporary literature is more pluralistic than in previous eras, with the closest thing to a unifying feature being a trend toward self-conscious experiments with language. Twelve American laureates have won the Nobel Prize in Literature. Media in the United States is broadly uncensored, with the First Amendment providing significant protections, as reiterated in New York Times Co. v. United States. The four major broadcasters in the U.S. are the National Broadcasting Company (NBC), Columbia Broadcasting System (CBS), American Broadcasting Company (ABC), and Fox Broadcasting Company (Fox). The four major broadcast television networks are all commercial entities. The U.S. cable television system offers hundreds of channels catering to a variety of niches. In 2021, about 83% of Americans over age 12 listened to broadcast radio, while about 40% listened to podcasts. In the prior year, there were 15,460 licensed full-power radio stations in the U.S. according to the Federal Communications Commission (FCC). Much of the public radio broadcasting is supplied by National Public Radio (NPR), incorporated in February 1970 under the Public Broadcasting Act of 1967. U.S. newspapers with a global reach and reputation include The Wall Street Journal, The New York Times, The Washington Post, and USA Today. About 800 publications are produced in Spanish. With few exceptions, newspapers are privately owned, either by large chains such as Gannett or McClatchy, which own dozens or even hundreds of newspapers; by small chains that own a handful of papers; or, in an increasingly rare situation, by individuals or families. 
Major cities often have alternative newspapers to complement the mainstream daily papers, such as The Village Voice in New York City and LA Weekly in Los Angeles. The five most-visited websites in the world are Google, YouTube, Facebook, Instagram, and ChatGPT—all of them American-owned. Other popular platforms include X (formerly Twitter) and Amazon. In 2025, the U.S. was the world's second-largest video game market by revenue (after China). In 2015, the U.S. video game industry consisted of 2,457 companies that supported around 220,000 jobs and generated $30.4 billion in revenue. There are 444 game publishers, developers, and hardware companies in California alone. According to the Game Developers Conference (GDC), the U.S. is the top location for video game development, with 58% of the world's game developers based there in 2025. The United States is well known for its theater. Mainstream theater in the United States derives from the old European theatrical tradition and has been heavily influenced by the British theater. By the middle of the 19th century, America had created new distinct dramatic forms in the Tom Shows, the showboat theater, and the minstrel show. The central hub of the American theater scene is the Theater District in Manhattan, with its divisions of Broadway, off-Broadway, and off-off-Broadway. Many movie and television celebrities have gotten their big break working in New York productions. Outside New York City, many cities have professional regional or resident theater companies that produce their own seasons. The biggest-budget theatrical productions are musicals. U.S. theater has an active community theater culture. The Tony Awards recognize excellence in live Broadway theater and are presented at an annual ceremony in Manhattan. The awards are given for Broadway productions and performances. One is also given for regional theater. 
Several discretionary non-competitive awards are given as well, including a Special Tony Award, the Tony Honors for Excellence in Theatre, and the Isabelle Stevenson Award. Folk art in colonial America grew out of artisanal craftsmanship in communities that allowed commonly trained people to individually express themselves. It was distinct from Europe's tradition of high art, which was less accessible and generally less relevant to early American settlers. Cultural movements in art and craftsmanship in colonial America generally lagged behind those of Western Europe. For example, the prevailing medieval style of woodworking and primitive sculpture became integral to early American folk art, despite the emergence of Renaissance styles in England in the late 16th and early 17th centuries. The new English styles would have been early enough to make a considerable impact on American folk art, but American styles and forms had already been firmly adopted. Not only did styles change slowly in early America, but there was a tendency for rural artisans there to continue their traditional forms longer than their urban counterparts did—and far longer than those in Western Europe. The Hudson River School was a mid-19th-century movement in the visual arts tradition of European naturalism. The 1913 Armory Show in New York City, an exhibition of European modernist art, shocked the public and transformed the U.S. art scene. American Realism and American Regionalism sought to reflect and give America new ways of looking at itself. Georgia O'Keeffe, Marsden Hartley, and others experimented with new and individualistic styles, which would become known as American modernism. Major artistic movements such as the abstract expressionism of Jackson Pollock and Willem de Kooning and the pop art of Andy Warhol and Roy Lichtenstein developed largely in the United States. 
Major photographers include Alfred Stieglitz, Edward Steichen, Dorothea Lange, Edward Weston, James Van Der Zee, Ansel Adams, and Gordon Parks. The tide of modernism and then postmodernism has brought global fame to American architects, including Frank Lloyd Wright, Philip Johnson, and Frank Gehry. The Metropolitan Museum of Art in Manhattan is the largest art museum in the United States and the fourth-largest in the world. American folk music encompasses numerous music genres, variously known as traditional music, traditional folk music, contemporary folk music, or roots music. Many traditional songs have been sung within the same family or folk group for generations, and sometimes trace back to such origins as the British Isles, mainland Europe, or Africa. The rhythmic and lyrical styles of African-American music in particular have influenced American music. Banjos were brought to America through the slave trade. Minstrel shows incorporating the instrument into their acts led to its increased popularity and widespread production in the 19th century. The electric guitar, first invented in the 1930s, and mass-produced by the 1940s, had an enormous influence on popular music, in particular due to the development of rock and roll. The synthesizer, turntablism, and electronic music were also largely developed in the U.S. Elements from folk idioms such as the blues and old-time music were adopted and transformed into popular genres with global audiences. Jazz grew from blues and ragtime in the early 20th century, developing from the innovations and recordings of composers such as W.C. Handy and Jelly Roll Morton. Louis Armstrong and Duke Ellington increased its popularity early in the 20th century. Country music developed in the 1920s, bluegrass and rhythm and blues in the 1940s, and rock and roll in the 1950s. In the 1960s, Bob Dylan emerged from the folk revival to become one of the country's most celebrated songwriters. 
The musical forms of punk and hip hop both originated in the United States in the 1970s. The United States has the world's largest music market, with a total retail value of $15.9 billion in 2022. Most of the world's major record companies are based in the U.S.; they are represented by the Recording Industry Association of America (RIAA). Mid-20th-century American pop stars, such as Frank Sinatra and Elvis Presley, became global celebrities and best-selling music artists, as have artists of the late 20th century, such as Michael Jackson, Madonna, Whitney Houston, and Mariah Carey, and of the early 21st century, such as Eminem, Britney Spears, Lady Gaga, Katy Perry, Taylor Swift and Beyoncé. The United States has the world's largest apparel market by revenue. Apart from professional business attire, American fashion is eclectic and predominantly informal. Americans' diverse cultural roots are reflected in their clothing; however, sneakers, jeans, T-shirts, and baseball caps are emblematic of American styles. New York, with its Fashion Week, is considered to be one of the "Big Four" global fashion capitals, along with Paris, Milan, and London. A study demonstrated that general proximity to Manhattan's Garment District has been synonymous with American fashion since its inception in the early 20th century. A number of well-known designer labels, among them Tommy Hilfiger, Ralph Lauren, Tom Ford and Calvin Klein, are headquartered in Manhattan. Labels cater to niche markets, such as preteens. New York Fashion Week is one of the most influential fashion shows in the world, and is held twice each year in Manhattan; the annual Met Gala, also in Manhattan, has been called the fashion world's "biggest night". The U.S. film industry has a worldwide influence and following. Hollywood, a district in central Los Angeles, the nation's second-most populous city, is also metonymous for the American filmmaking industry. 
The major film studios of the United States are the primary source of the most commercially successful and most ticket-selling movies in the world. Largely centered in the New York City region from its beginnings in the late 19th century through the first decades of the 20th century, the U.S. film industry has since been primarily based in and around Hollywood. Nonetheless, American film companies have been subject to the forces of globalization in the 21st century, and an increasing number of films are made elsewhere. The Academy Awards, popularly known as "the Oscars", have been held annually by the Academy of Motion Picture Arts and Sciences since 1929, and the Golden Globe Awards have been held annually since January 1944. The industry peaked in what is commonly referred to as the "Golden Age of Hollywood", from the early sound period until the early 1960s, with screen actors such as John Wayne and Marilyn Monroe becoming iconic figures. In the 1970s, "New Hollywood", or the "Hollywood Renaissance", was defined by grittier films influenced by French and Italian realist pictures of the post-war period. The 21st century has been marked by the rise of American streaming platforms, which came to rival traditional cinema. Early settlers were introduced by Native Americans to foods such as turkey, sweet potatoes, corn, squash, and maple syrup. Among the most enduring and pervasive examples are variations of the native dish called succotash. Early settlers and later immigrants combined these with foods they were familiar with, such as wheat flour, beef, and milk, to create a distinctive American cuisine. New World crops, especially pumpkin, corn, potatoes, and turkey as the main course, are part of a shared national menu on Thanksgiving, when many Americans prepare or purchase traditional dishes to celebrate the occasion. 
Characteristic American dishes such as apple pie, fried chicken, doughnuts, french fries, macaroni and cheese, ice cream, hamburgers, hot dogs, and American pizza derive from the recipes of various immigrant groups. Mexican dishes such as burritos and tacos preexisted the United States in areas later annexed from Mexico, and adaptations of Chinese cuisine as well as pasta dishes freely adapted from Italian sources are all widely consumed. American chefs have had a significant impact on society both domestically and internationally. In 1946, the Culinary Institute of America was founded by Katharine Angell and Frances Roth. This would become the United States' most prestigious culinary school, where many of the most talented American chefs would study prior to successful careers. The United States restaurant industry was projected at $899 billion in sales for 2020, and employed more than 15 million people, representing 10% of the nation's workforce directly. It is the country's second-largest private employer and the third-largest employer overall. The United States is home to over 220 Michelin star-rated restaurants, 70 of which are in New York City. Wine has been produced in what is now the United States since the 1500s, with the first widespread production beginning in what is now New Mexico in 1628. In the modern U.S., wine production is undertaken in all fifty states, with California producing 84 percent of all U.S. wine. With more than 1,100,000 acres (4,500 km2) under vine, the United States is the fourth-largest wine-producing country in the world, after Italy, Spain, and France. The classic American diner, a casual restaurant type originally intended for the working class, emerged during the 19th century from converted railroad dining cars made stationary. The diner soon evolved into purpose-built structures whose number expanded greatly in the 20th century. The American fast-food industry developed alongside the nation's car culture. 
American restaurants developed the drive-in format in the 1920s, which they began to replace with the drive-through format by the 1940s. American fast-food restaurant chains, such as McDonald's, Burger King, Chick-fil-A, Kentucky Fried Chicken, Dunkin' Donuts and many others, have numerous outlets around the world. The most popular spectator sports in the U.S. are American football, basketball, baseball, soccer, and ice hockey. Their premier leagues are, respectively, the National Football League, the National Basketball Association, Major League Baseball, Major League Soccer, and the National Hockey League. All these leagues enjoy wide-ranging domestic media coverage and, except for the MLS, all are considered the preeminent leagues in their respective sports in the world. While most major U.S. sports such as baseball and American football have evolved out of European practices, basketball, volleyball, skateboarding, and snowboarding are American inventions, many of which have become popular worldwide. Lacrosse and surfing arose from Native American and Native Hawaiian activities that predate European contact. The market for professional sports in the United States was approximately $69 billion in July 2013, roughly 50% larger than that of Europe, the Middle East, and Africa combined. American football is by several measures the most popular spectator sport in the United States. Although American football does not have a substantial following in other nations, the NFL does have the highest average attendance (67,254) of any professional sports league in the world. In 2024, the NFL generated over $23 billion, making it the most valuable professional sports league in the United States and the world. Baseball has been regarded as the U.S. "national sport" since the late 19th century. The most-watched individual sports in the U.S. are golf and auto racing, particularly NASCAR and IndyCar. 
On the collegiate level, earnings for the member institutions exceed $1 billion annually, and college football and basketball attract large audiences, as the NCAA March Madness tournament and the College Football Playoff are some of the most-watched national sporting events. In the U.S., the intercollegiate sports level serves as the main feeder system for professional and Olympic sports, with significant exceptions such as Minor League Baseball. This differs greatly from practices in nearly all other countries, where publicly and privately funded sports organizations serve this function. Eight Olympic Games have taken place in the United States. The 1904 Summer Olympics in St. Louis, Missouri, were the first-ever Olympic Games held outside of Europe. The Olympic Games will be held in the U.S. for a ninth time when Los Angeles hosts the 2028 Summer Olympics. U.S. athletes have won a total of 2,968 medals (1,179 gold) at the Olympic Games, the most of any country. In other international competition, the United States is home to a number of prestigious events, including the America's Cup, the World Baseball Classic, the U.S. Open, and the Masters Tournament. The U.S. men's national soccer team has qualified for eleven World Cups, while the women's national team has won the FIFA Women's World Cup four times and the Olympic soccer tournament five times. The 1999 FIFA Women's World Cup was hosted by the United States; its final match was attended by 90,185 spectators, setting the world record at the time for the largest crowd at a women's sporting event. The United States hosted the 1994 FIFA World Cup and will co-host, along with Canada and Mexico, the 2026 FIFA World Cup. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Science_divulgation] | [TOKENS: 6600] |
Science communication
Science communication encompasses a wide range of activities that connect science and society. Common goals of science communication include informing non-experts about scientific findings, raising public awareness of and interest in science, influencing people's attitudes and behaviors, informing public policy, and engaging with diverse communities to address societal problems. The term "science communication" generally refers to settings in which audiences are not experts on the scientific topic being discussed (outreach), though some authors categorize expert-to-expert communication ("inreach", such as publication in scientific journals) as a type of science communication. Examples of outreach include science journalism and health communication. Since science has political, moral, and legal implications, science communication can help bridge gaps between different stakeholders in public policy, industry, and civil society, with trust-building playing a central role in this process. Science communicators are a broad group of people: scientific experts, science journalists, science artists, medical professionals, nature center educators, science advisors for policymakers, and everyone else who communicates with the public about science. They often use entertainment and persuasion techniques, including humour, storytelling, and metaphors, to connect with their audience's values and interests. Science communication also exists as an interdisciplinary field of social science research on topics such as misinformation, public opinion of emerging technologies, and the politicization and polarization of science. For decades, science communication research has had only limited influence on science communication practice, and vice versa, but both communities are increasingly attempting to bridge research and practice. Historically, academic scientists were discouraged from spending time on public outreach, but that has begun to change. 
Research funders have raised their expectations for researchers to have broader impacts beyond publication in academic journals. An increasing number of scientists, especially younger scholars, are expressing interest in engaging the public through social media and in-person events, though they still perceive significant institutional barriers to doing so. Science communication is closely related to the fields of informal science education, citizen science, and public engagement with science, and there is no general agreement on whether or how to distinguish them. Like other aspects of society, science communication is influenced by systemic inequalities that impact both inreach and outreach.
Motivations
Writing in 1987, Geoffrey Thomas and John Durant advocated various reasons to increase public understanding of science, or scientific literacy. More trained engineers and scientists could allow a nation to be more competitive economically. Science can also benefit individuals. Science can simply have aesthetic appeal (e.g., popular science or science fiction). In an increasingly technological society, background scientific knowledge can help people navigate it. The science of happiness is an example of a field whose research can have direct and obvious implications for individuals. Governments and societies might also benefit from more scientific literacy, since an informed electorate promotes a more democratic society. Moreover, science can inform moral decision-making (e.g., answering questions about whether animals can feel pain, how human activity influences climate, or even a science of morality). In 1990, Steven Hilgartner, a scholar in science and technology studies, criticized some academic research in public understanding of science. Hilgartner argued that what he called "the dominant view" of science popularization tends to imply a tight boundary around those who can articulate true, reliable knowledge. 
By defining a "deficient public" as recipients of knowledge, the scientists get to emphasize their own identity as experts, according to Hilgartner. Understood in this way, science communication may explicitly exist to connect scientists with the rest of society, but science communication may reinforce the boundary between the public and the experts (according to work by Brian Wynne in 1992 and Massimiano Bucchi in 1998). In 2016, the scholarly journal Public Understanding of Science ran an essay competition on the "deficit model" or "deficit concept" of science communication and published a series of articles answering the question "In science communication, why does the idea of a public deficit always return?" in different ways; for example, Carina Cortassa's essay argued that the deficit model of science communication is just a special case of an omnipresent problem studied in social epistemology of testimony, the problem of "epistemic asymmetry", which arises whenever some people know more about some things than other people. Science communication is just one kind of attempt to reduce epistemic asymmetry between people who may know more and people who may know less about a certain subject. Biologist Randy Olson said in 2009 that anti-science groups can often be so motivated, and so well funded, that the impartiality of science organizations in politics can lead to crises of public understanding of science. He cited examples of denialism (for instance, climate change denial) to support this worry. Journalist Robert Krulwich likewise argued in 2008 that the stories scientists tell compete with the efforts of people such as Turkish creationist Adnan Oktar. Krulwich explained that attractive, easy to read, and cheap creationist textbooks were sold by the thousands to schools in Turkey (despite their strong secular tradition) due to the efforts of Oktar. 
Astrobiologist David Morrison has spoken of repeated disruption of his work by popular anti-scientific phenomena, having been called upon to assuage public fears of an impending cataclysm involving an unseen planetary object—first in 2008, and again in 2012 and 2017.
Methods
Science popularization figures such as Carl Sagan and Neil deGrasse Tyson are partly responsible for the view of science or a specific science discipline within the general public. However, the degree of knowledge and experience a science popularizer has can vary greatly. Because of this, some science communication can depend on sensationalism. As a Forbes contributor put it, "The main job of physics popularizers is the same as it is for any celebrity: get more famous." Another point in the controversy of popular science is the idea of how public debate can affect public opinion. A relevant and highly public example of this is climate change. A science communication study appearing in The New York Times found that "even a fractious minority wields enough power to skew a reader's perception of a [science news] story" and that even "firmly worded (but not uncivil) disagreements between commenters affected readers' perception of science." This causes some to worry about the popularizing of science in the public, questioning whether the further popularization of science will cause pressure towards generalization or sensationalism. Marine biologist and film-maker Randy Olson published Don't Be Such a Scientist: Talking Substance in an Age of Style. In the book he describes how there has been an unproductive negligence when it comes to teaching scientists to communicate. Don't Be Such a Scientist is written to his fellow scientists, and he says they need to "lighten up". He adds that scientists are ultimately the most responsible for promoting and explaining science to the public and media. 
This, Olson says, should be done according to a good grasp of social science; scientists must use persuasive and effective means like storytelling. Olson acknowledges that the stories told by scientists need not only be compelling but also accurate to modern science—and says this added challenge must simply be confronted. He points to figures like Carl Sagan as effective popularizers, partly because such figures actively cultivate a likeable image. In his commencement address to Caltech students, journalist Robert Krulwich delivered a speech entitled "Tell me a story". Krulwich says that scientists are actually given many opportunities to explain something interesting about science or their work, and that they must seize such opportunities. He says scientists must resist shunning the public, as Sir Isaac Newton did in his writing, and instead embrace metaphors the way Galileo did; Krulwich suggests that metaphors only become more important as the science gets more difficult to understand. He adds that telling stories of science in practice, of scientists' success stories and struggles, helps convey that scientists are real people. Finally, Krulwich advocates for the importance of scientific values in general, and helping the public to understand that scientific views are not mere opinions, but hard-won knowledge. Actor Alan Alda has helped scientists and PhD students become more comfortable with communication with the help of drama coaches (who use the acting techniques of Viola Spolin). Matthew Nisbet described the use of opinion leaders as intermediaries between scientists and the public as a way to reach the public via trained individuals who are more closely engaged with their communities, such as "teachers, business leaders, attorneys, policymakers, neighborhood leaders, students, and media professionals". 
Examples of initiatives that have taken this approach include Science & Engineering Ambassadors, sponsored by the National Academy of Sciences, and Science Booster Clubs, coordinated by the National Center for Science Education. Similar to how evidence-based medicine gained a foothold in medical communication decades ago, researchers Eric Jensen and Alexander Gerber have argued that science communication would benefit from evidence-based prescriptions, since the field faces related challenges. In particular, they argued that the lack of collaboration between researchers and practitioners is a problem: "Ironically, the challenges begin with communication about science communication evidence." The overall effectiveness of the science communication field is limited by the lack of effective transfer mechanisms for practitioners to apply research in their work and perhaps even investigate, together with researchers, communication strategies, Jensen and Gerber said. Closer collaboration could enrich the spectrum of science communication research and increase the existing methodological toolbox, including more longitudinal and experimental studies. Evidence-based science communication would combine the best available evidence from systematic research, underpinned by established theory, as well as practitioners' acquired skills and expertise, reducing the double disconnect between scholarship and practice. Neither side adequately takes into account the other's priorities, needs, and possible solutions, Jensen and Gerber argued; bridging the gap and fostering closer collaboration could allow for mutual learning, enhancing the overall advancement of science communication as a young field. In the preface of The Selfish Gene, Richard Dawkins wrote: "Three imaginary readers looked over my shoulder while I was writing, and I now dedicate the book to them. [...] First the general reader, the layman [...] second the expert [and] third the student". 
Many criticisms of the public understanding of science movement have emphasized that this thing they were calling the public was somewhat of an (unhelpful) black box. Approaches to the public changed with the move away from the public understanding of science. Science communication researchers and practitioners now often showcase their desire to listen to non-scientists, as well as acknowledging an awareness of the fluid and complex nature of (post/late) modern social identities. At the very least, people will use plurals: publics or audiences. As the editor of the scholarly journal Public Understanding of Science put it in a special issue on publics: "We have clearly moved from the old days of the deficit frame and thinking of publics as monolithic to viewing publics as active, knowledgeable, playing multiple roles, receiving as well as shaping science" (Einsiedel, 2007: 5). However, Einsiedel goes on to suggest that both views of the public are "monolithic" in their own way; they both choose to declare what something called the public is. Some promoters of public understanding of science might have ridiculed publics for their ignorance, but an alternative "public engagement with science and technology" romanticizes its publics for their participatory instincts, intrinsic morality, or simple collective wisdom. As Susanna Hornig Priest concluded in her 2009 introduction essay on science's contemporary audiences, the job of science communication might be to help non-scientists feel they are not excluded, as opposed to always included; that they can join in if they want, rather than that there is a necessity to spend their lives engaging. The process of quantifiably surveying public opinion of science is now largely associated with the public understanding of science movement (some would say unfairly). 
In the US, Jon Miller is the name most associated with such work, and he is well known for differentiating between identifiable "attentive" or "interested" publics (that is to say, science fans) and those who do not care much about science and technology. Miller's work questioned whether the American public possessed four attributes of scientific literacy. In some respects, John Durant's work surveying the British public applied similar ideas to Miller's. However, Durant and his colleagues were slightly more concerned with attitudes to science and technology, rather than just how much knowledge people had. They also looked at public confidence in their knowledge, considering issues such as the gender of those ticking "don't know" boxes. We can see aspects of this approach, as well as a more "public engagement with science and technology"-influenced one, reflected within the Eurobarometer studies of public opinion. These have been running since 1973 to monitor public opinion in the member states, with the aim of helping the preparation (and evaluation) of policy. They look at a host of topics, not just science and technology but also defense, the euro, enlargement of the European Union, and culture. Eurobarometer's 2008 study of Europeans' Attitudes to Climate Change is a good example. It focuses on respondents' "subjective level of information", asking "personally, do you think that you are well informed or not about...?" rather than checking what people knew. Science communication can be analyzed through frame analysis, a research method used to analyze how people understand situations and activities. People make an enormous number of decisions every day, and to approach all of them in a careful, methodical manner is impractical. They therefore often use mental shortcuts known as "heuristics" to quickly arrive at acceptable inferences. 
Tversky and Kahneman originally proposed three such heuristics (availability, representativeness, and anchoring-and-adjustment), although many others have been discussed in later research. The most effective science communication efforts take into account the role that heuristics play in everyday decision-making. Many outreach initiatives focus solely on increasing the public's knowledge, but studies have found little, if any, correlation between knowledge levels and attitudes towards scientific issues. Inclusive science communication seeks to build equity by prioritizing communication that is built with and for marginalized groups that are not reached through typical top-down science communication. Science communication is affected by the same implicit inequities embedded in the production of science research. It has traditionally centered Western science and communicated in Western languages. Māori researcher Linda Tuhiwai Smith details how scientific research is "inextricably linked to European imperialism and colonialism". The field's focus on Western science results in publicizing "discoveries" by Western scientists that have been known to Indigenous scientists and communities for generations, continuing the cycle of colonial exploitation of physical and intellectual resources. Collin Bjork notes that science communication is linked to oppression because European colonizers "employed both the English language and western science as tools for subjugating others". Today, English is still considered the international language of science, and 80% of science journals in Scopus are published in English. As a result, most science journalism also communicates in English or must use English sources, limiting the audience that science communication can reach. Just as science has historically excluded communities of Black, Indigenous and people of color, LGBTQ+ communities and communities of lower socioeconomic status or education, science communication has also failed to center these audiences. 
Science communication cannot be inclusive or effective if these communities are not involved in both the creation and dissemination of science information. One strategy to improve inclusivity in science communication is building philanthropic coalitions with marginalized communities. The 2018 article "The Civic Science Imperative" in the Stanford Social Innovation Review (SSIR) outlined how civic science could expand inclusion in science and science communication. Civic science fosters public engagement with science issues so that citizens can spur meaningful policy, societal or democratic change. The article outlined strategies for supporting effective science communication and engagement: building diverse coalitions, building in flexibility to meet changing goals, centering shared values, and using research and feedback loops to increase trust. However, the authors of the 2020 SSIR article "How Science Philanthropy Can Build Equity" warned that these approaches will not combat systemic barriers of racism, sexism, ableism, xenophobia or classism without the principles of diversity, equity and inclusion (DEI). DEI in science communication can take many forms, but will always: include marginalized groups in the goal setting, design and implementation of the science communication; use experts to determine the unique values, needs and communication style of the community being reached; test to determine the best way to reach each segment of a community; and include ways to mitigate harm or stress for community members who engage with this work. Efforts to make science communication more inclusive can focus on a global, national or local community. The Metcalf Institute for Marine & Environmental Reporting at the University of Rhode Island produced a survey of these practices in 2020. "How Science Philanthropy Can Build Equity" also lists several successful civic science projects and approaches. 
Complementary methods for including diverse voices include the use of poetry, participatory arts, film, and games, all of which have been used to engage various publics by monitoring, deliberating, and responding to their attitudes toward science and scientific discourse. Science in popular culture and the media While scientific study began to emerge as a popular discourse following the Renaissance and the Enlightenment, science was not widely funded or exposed to the public until the nineteenth century. Most science prior to this was funded by individuals under private patronage and was studied in exclusive groups, like the Royal Society. Public science emerged due to a gradual social change, resulting from the rise of the middle class in the nineteenth century. As scientific inventions, like the conveyor belt and the steam locomotive, entered and enhanced the lifestyle of people in the nineteenth century, scientific research began to be widely funded by universities and other public institutions. Since scientific achievements were beneficial to society, the pursuit of scientific knowledge resulted in science as a profession. Scientific institutions, like the National Academy of Sciences or the British Association for the Advancement of Science, are examples of leading platforms for the public discussion of science. David Brewster, founder of the British Association for the Advancement of Science, believed in regulated publications in order to effectively communicate discoveries, "so that scientific students may know where to begin their labours." As the communication of science reached a wider audience, due to the professionalization of science and its introduction to the public sphere, interest in the subject increased. There was a change in media production in the nineteenth century. The invention of the steam-powered printing press enabled more pages to be printed per hour, which resulted in cheaper texts. 
Book prices gradually dropped, which gave the working classes the ability to purchase them. No longer reserved for the elite, affordable and informative texts were made available to a mass audience. Historian Aileen Fyfe noted that, as the nineteenth century experienced a set of social reforms that sought to improve the lives of those in the working classes, the availability of public knowledge was valuable for intellectual growth. As a result, there were reform efforts to further the knowledge of the less educated. The Society for the Diffusion of Useful Knowledge, led by Henry Brougham, attempted to organize a system for widespread literacy for all classes. Additionally, weekly periodicals, like the Penny Magazine, aimed to educate the general public on scientific achievements in a comprehensive manner. As the audience for scientific texts expanded, so did the interest in public science. "Extension lectures" were introduced at some universities, like Oxford and Cambridge, which encouraged members of the public to attend lectures. In America, traveling lectures were a common occurrence in the nineteenth century and attracted hundreds of viewers. These public lectures were a part of the lyceum movement and demonstrated basic scientific experiments, which advanced scientific knowledge for both educated and uneducated viewers. Not only did the popularization of public science enlighten the general public through mass media, but it also enhanced communication within the scientific community. Although scientists had been communicating their discoveries and achievements through print for centuries, publications covering a variety of subjects decreased in popularity. Instead, publications in discipline-specific journals became crucial for a successful career in the sciences in the nineteenth century. 
As a result, scientific journals such as Nature or National Geographic possessed a large readership and received substantial funding by the end of the nineteenth century as the popularization of science continued. Science can be communicated to the public in many different ways. According to Karen Bultitude, a science communication lecturer at University College London, these can be broadly categorized into three groups: traditional journalism, live or face-to-face events, and online interaction. Traditional journalism (for example, newspapers, magazines, television and radio) has the advantage of reaching large audiences; in the past, this was the way most people regularly accessed information about science. Traditional media is also more likely to produce information that is high quality (well written or presented), as it will have been produced by professional journalists. Traditional journalism is often also responsible for setting agendas and having an impact on government policy. The traditional journalistic method of communication is one-way, so there can be no dialogue with the public, and science stories can often be reduced in scope so that there is a limited focus for a mainstream audience, who may not be able to comprehend the bigger picture from a scientific perspective. However, there is new research now available on the role of newspapers and television channels in constituting "scientific public spheres" which enable participation of a wide range of actors in public deliberations. Another disadvantage of traditional journalism is that, once a science story is taken up by mainstream media, the scientists involved no longer have any direct control over how their work is communicated, which may lead to misunderstanding or misinformation. Research in this area demonstrates how the relationship between journalists and scientists has been strained in some instances. 
On one hand, scientists have reported being frustrated with things like journalists oversimplifying or dramatizing their work, while on the other hand journalists find scientists difficult to work with and ill-equipped to communicate their work to a general audience. Despite this potential tension, a comparison of scientists from several countries has shown that many scientists are pleased with their media interactions and engage often. However, the use of traditional media sources, like newspapers and television, has steadily declined as primary sources for science information, while the internet has rapidly increased in prominence. In 2016, 55% of Americans reported using the internet as their primary source to learn about science and technology, compared to 24% reporting TV and 4% reporting newspapers as their primary sources. Additionally, traditional media outlets have dramatically decreased the number of, or in some cases eliminated, science journalists, as well as the amount of science-related content they publish. The second category is live or face-to-face events, such as public lectures in museums or universities, debates, science busking, "sci-art" exhibits, Science Cafés and science festivals. Citizen science or crowd-sourced science (scientific research conducted, in whole or in part, by amateur or nonprofessional scientists) can be done with a face-to-face approach, online, or as a combination of the two to engage in science communication. Research has shown that members of the public seek out science information that is entertaining but that also helps citizens to critically participate in risk regulation and S&T governance. Therefore, it is important to bear this aspect in mind when communicating scientific information to the public (for example, through events combining science communication and comedy, such as Festival of the Spoken Nerd, or during scientific controversies). 
The advantages of this approach are that it is more personal and allows scientists to interact with the public, allowing for two-way dialogue. Scientists are also better able to control content using this method. Disadvantages of this method include its limited reach; it can also be resource-intensive and costly, and it may attract only audiences with an existing interest in science. Another opportunity for budding science communicators is FameLab. This programme was created by Cheltenham Festivals in 2005 and is the largest science communication competition and training programme in the world. FameLab discovers, trains and promotes the best new voices in science (including social sciences), technology, engineering and maths. Participants have just three minutes to convey a scientific concept of their choice to an audience and an expert panel of judges. The winner is the speaker who best demonstrates FameLab's 3 C's: Content, Clarity and Charisma. The third category is online interaction; for example, websites, blogs, wikis and podcasts can be used for science communication, as can other social media or forms of artificial intelligence like AI chatbots. Online methods of communicating science have the potential to reach huge audiences, can allow direct interaction between scientists and the public, and the content is always accessible and can be somewhat controlled by the scientist. Additionally, online communication of science can help boost scientists' reputations through increased citations, better circulation of articles, and the establishment of new collaborations. Online communication also allows for both one-way and two-way communication, depending on the audience's and the author's preferences. However, there are disadvantages in that it is difficult to control how content is picked up by others, and regular attention and updating is needed. 
When considering whether or not to engage in science communication online, scientists should review what science communication research has shown to be the potential positive and negative outcomes. Online communication has given rise to movements like open science, which advocates for making science more accessible. However, when engaging in communication about science online, scientists should consider not publicizing or reporting findings from their research until the work has been peer-reviewed and published, as under the "Ingelfinger rule" journals may not accept work that has already been circulated. Other considerations revolve around how scientists will be perceived by other scientists for engaging in communication. For example, some scholars have criticized engaged, popular scholars using concepts like the Sagan effect or the Kardashian Index. Despite these criticisms, many scientists are taking to communicating their work on online platforms, a sign of potentially changing norms in the field. According to Lesen et al. (2016), art has been a tool increasingly used to attract the public to science. Whether in a formal or an informal context, an integration between artists and scientists could potentially raise the awareness of the general public about current topics in science, technology, engineering and mathematics (STEM). The arts have the power to create emotional links between the public and a research topic and to create a collaborative atmosphere that can "activate science" in a different way. Learning through the affective domain, in contrast to the cognitive domain, increases motivation, and using the arts to communicate scientific knowledge in this way could dramatically increase engagement. One example is Ed Hawkins's warming stripes graphics, which were included in Pirouette: Turning Points in Design, an exhibition of design icons at the Museum of Modern Art highlighting design "as an agent of change". 
By using Twitter, scientists and science communicators can discuss scientific topics with many types of audiences with various points of view. Studies published in 2012 by Gunther Eysenbach shed light on how Twitter not only communicates science to the public but also affects advances in the science community. However, as of 2024, engagement from academics on Twitter had declined. Alison Bert, editor in chief of Elsevier Connect, wrote a 2014 news article titled "How to use social media for science" that reported on a panel about social media at that year's AAAS meeting, in which panelists Maggie Koerth-Baker, Kim Cobb, and Danielle N. Lee noted some potential benefits and drawbacks to scientists of sharing their research on Twitter. Koerth-Baker, for example, commented on the importance of keeping public and private personas on social media separate in order to maintain professionalism online. Interviewed in 2014, Karen Peterson, director of Scientific Career Development at Fred Hutchinson Cancer Research Center, stressed the importance for scientists of using social networks such as Facebook and Twitter to establish an online presence. Kimberly Collins et al., writing in PLOS One in 2016, explained reasons why some scientists were hesitant to join Twitter, including a lack of knowledge of the platform and inexperience with how to make meaningful posts. Some scientists did not see the value in using Twitter as a platform to share their research, or did not have the time to add the information to the accounts themselves. In 2016, Elena Milani created the SciHashtag Project, a condensed collection of Twitter hashtags about science communication. In 2017, a study by the Pew Research Center found that about "a quarter of social media users (26%) follow science accounts" on social media. 
This group of users "places both more importance and comparatively more trust on science news that comes to them through social media". Scientists have also used other social media platforms, including Instagram and Reddit, to establish a connection with the public and discuss science. The public understanding of science movement "Public understanding of science", "public awareness of science" and "public engagement with science and technology" are all terms coined by a movement involving governments and societies in the late 20th century. During the late 19th century, science became a professional subject and came under governmental influence. Prior to this, public understanding of science was very low on the agenda. However, some well-known figures such as Michael Faraday ran lectures aimed at the non-expert public, his being the famous Christmas Lectures, which began in 1825. The 20th century saw groups founded on the basis that they could position science in a broader cultural context and allow scientists to communicate their knowledge in a way that could reach and be understood by the general public. In the UK, the Bodmer Report (or The Public Understanding of Science, as it is more formally known), published in 1985 by the Royal Society, changed the way scientists communicated their work to the public. The report was designed to "review the nature and extent of the public understanding of science in the United Kingdom and its adequacy for an advanced democracy". Chaired by the geneticist Sir Walter Bodmer alongside famous scientists as well as broadcaster Sir David Attenborough, the report took evidence from all of the major sectors concerned (scientists, politicians, journalists and industrialists) but not the general public. One of the main assumptions drawn from the report was that everybody should have some grasp of science, and that this should be introduced from a young age by teachers who are suitably qualified in the subject area. 
The report also asked for further media coverage of science, including via newspapers and television, which ultimately led to the establishment of platforms such as the Vega Science Trust. In both the UK and the United States, following the Second World War, public views of scientists shifted from great praise to resentment. The Bodmer Report therefore highlighted concerns from the scientific community that their withdrawal from society was weakening scientific research funding. Bodmer promoted the communication of science to a wider, more general public by expressing to British scientists that it was their responsibility to publicize their research. An upshot of the publication of the report was the creation of the Committee on the Public Understanding of Science (COPUS), a collaboration between the British Association for the Advancement of Science, the Royal Society and the Royal Institution. The engagement between these individual societies meant that the need for a public understanding of science movement was taken seriously. COPUS also awarded grants for specific outreach activities, allowing public understanding to come to the fore and ultimately leading to a cultural shift in the way scientists publicized their work to the wider non-expert community. Although COPUS no longer exists within the UK, the name has been adopted in the US by the Coalition on the Public Understanding of Science, an organization funded by the US National Academy of Sciences and the National Science Foundation that focuses on popular science projects such as science cafes, festivals, magazines and citizen science schemes. In the European Union, public views on publicly funded research and the role of governmental institutions in funding scientific activities were being questioned as the allocated budget increased. 
Therefore, the European Commission strongly encouraged, and later obligated, research organizations to communicate about their research activities and results widely to the general public. This is done by integrating into each research project a communication plan that increases the project's public visibility through accessible language and suitably adapted channels and materials. See also Notes and references Further reading |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Comparison_of_software_and_protocols_for_distributed_social_networking] | [TOKENS: 58] |
Contents Comparison of software and protocols for distributed social networking The following is a comparison of both software and protocols that are used for distributed social networking. Software [comparison table not preserved in this extract; its criteria included the sharing of personal data with "friends", the use of personal data for "personal applications", and licensing (e.g., AGPL)] Protocols See also References |
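The comparison's protocols are not enumerated in this extract. As an illustrative sketch (not drawn from the comparison itself), the snippet below shows one discovery step that several federated social networking protocols rely on: building a WebFinger (RFC 7033) lookup URL from a user handle, so a client on one server can locate a profile hosted on another. The function name and handle format are illustrative assumptions.

```python
# Illustrative sketch: WebFinger (RFC 7033) discovery URL construction,
# as used for account lookup in federated social networks.
from urllib.parse import quote


def webfinger_url(handle: str) -> str:
    """Build the WebFinger discovery URL for a handle like 'user@example.org'."""
    user, _, host = handle.partition("@")
    if not user or not host:
        raise ValueError("handle must look like user@host")
    # The resource is an 'acct:' URI, percent-encoded as a query parameter.
    resource = quote(f"acct:{user}@{host}", safe="")
    return f"https://{host}/.well-known/webfinger?resource={resource}"
```

A client would issue an HTTP GET to the returned URL and read the profile links from the JSON Resource Descriptor in the response; that step is omitted here to keep the sketch self-contained.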
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Meta_Platforms#cite_note-247] | [TOKENS: 8626] |
Contents Meta Platforms Meta Platforms, Inc. (doing business as Meta) is an American multinational technology company headquartered in Menlo Park, California. Meta owns and operates several prominent social media platforms and communication services, including Facebook, Instagram, WhatsApp, Messenger and Threads. The company also operates an advertising network for its own sites and third parties; as of 2023, advertising accounted for 97.8 percent of its total revenue. Meta has been described as part of Big Tech, which refers to the six largest tech companies in the United States: Alphabet (Google), Amazon, Apple, Meta (Facebook), Microsoft, and Nvidia, which are also among the largest companies in the world by market capitalization. The company was originally established in 2004 as TheFacebook, Inc., and was renamed Facebook, Inc. in 2005. In 2021, it rebranded as Meta Platforms, Inc. to reflect a strategic shift toward developing the metaverse, an interconnected digital ecosystem spanning virtual and augmented reality technologies. In 2023, Meta was ranked 31st on the Forbes Global 2000 list of the world's largest public companies. As of 2022, it was the world's third-largest spender on research and development, with R&D expenses totaling US$35.3 billion. History Facebook filed for an initial public offering (IPO) on February 1, 2012. The preliminary prospectus stated that the company sought to raise $5 billion, had 845 million monthly active users, and a website accruing 2.7 billion likes and comments daily. After the IPO, Zuckerberg would retain 22% of the total shares and 57% of the total voting power in Facebook. Underwriters valued the shares at $38 each, valuing the company at $104 billion, the largest valuation yet for a newly public company. On May 16, one day before the IPO, Facebook announced it would sell 25% more shares than originally planned due to high demand. 
The IPO raised $16 billion, making it the third-largest in US history (slightly ahead of AT&T Mobility and behind only General Motors and Visa). The stock price left the company with a higher market capitalization than all but a few U.S. corporations—surpassing heavyweights such as Amazon, McDonald's, Disney, and Kraft Foods—and made Zuckerberg's stock worth $19 billion. The New York Times stated that the offering overcame questions about Facebook's difficulties in attracting advertisers to transform the company into a "must-own stock". Jimmy Lee of JPMorgan Chase described it as "the next great blue-chip". Writers at TechCrunch, on the other hand, expressed skepticism, stating, "That's a big multiple to live up to, and Facebook will likely need to add bold new revenue streams to justify the mammoth valuation." Trading in the stock, which began on May 18, was delayed that day due to technical problems with the Nasdaq exchange. The stock struggled to stay above the IPO price for most of the day, forcing underwriters to buy back shares to support the price. At the closing bell, shares were valued at $38.23, only $0.23 above the IPO price and down $3.82 from the opening bell value. The opening was widely described by the financial press as a disappointment. The stock set a new record for trading volume of an IPO. On May 25, 2012, the stock ended its first full week of trading at $31.91, a 16.5% decline. On May 22, 2012, regulators from Wall Street's Financial Industry Regulatory Authority announced that they had begun to investigate whether banks underwriting Facebook had improperly shared information only with select clients rather than the general public. Massachusetts Secretary of State William F. Galvin subpoenaed Morgan Stanley over the same issue. The allegations sparked "fury" among some investors and led to the immediate filing of several lawsuits, one of them a class action suit claiming more than $2.5 billion in losses due to the IPO. 
Bloomberg estimated that retail investors may have lost approximately $630 million on Facebook stock since its debut. Standard & Poor's added Facebook to its S&P 500 index on December 21, 2013. On May 2, 2014, Zuckerberg announced that the company would be changing its internal motto from "Move fast and break things" to "Move fast with stable infrastructure". The earlier motto had been described as Zuckerberg's "prime directive to his developers and team" in a 2009 interview in Business Insider, in which he also said, "Unless you are breaking stuff, you are not moving fast enough." In November 2016, Facebook announced the Microsoft Windows client of gaming service Facebook Gameroom, formerly Facebook Games Arcade, at the Unity Technologies developers conference. The client allows Facebook users to play "native" games in addition to its web games. The service was closed in June 2021. Lasso was a short-video sharing app from Facebook similar to TikTok that was launched on iOS and Android in 2018 and was aimed at teenagers. On July 2, 2020, Facebook announced that Lasso would be shutting down on July 10. In 2018, the Oculus lead Jason Rubin sent his 50-page vision document titled "The Metaverse" to Facebook's leadership. In the document, Rubin acknowledged that Facebook's virtual reality business had not caught on as expected, despite the hundreds of millions of dollars spent on content for early adopters. He also urged the company to execute fast and invest heavily in the vision, to shut out HTC, Apple, Google and other competitors in the VR space. Regarding other players' participation in the metaverse vision, he called for the company to build the "metaverse" to prevent their competitors from "being in the VR business in a meaningful way at all". In May 2019, Facebook founded Libra Networks, reportedly to develop their own stablecoin cryptocurrency. 
Later, it was reported that Libra was being supported by financial companies such as Visa, Mastercard, PayPal and Uber. The consortium of companies was expected to contribute $10 million each to fund the launch of the cryptocurrency coin named Libra. Depending on when it would receive approval from the Swiss Financial Market Supervisory Authority to operate as a payments service, the Libra Association had planned to launch a limited format cryptocurrency in 2021. Libra was renamed Diem, before being shut down and sold in January 2022 after backlash from Swiss government regulators and the public. During the COVID-19 pandemic, the use of online services, including Facebook, grew globally. Zuckerberg predicted this would be a "permanent acceleration" that would continue after the pandemic. Facebook hired aggressively, growing from 48,268 employees in March 2020 to more than 87,000 by September 2022. Following a period of intense scrutiny and damaging whistleblower leaks, news started to emerge on October 21, 2021 about Facebook's plan to rebrand the company and change its name. In the Q3 2021 earnings call on October 25, Mark Zuckerberg discussed the ongoing criticism of the company's social services and the way it operates, and pointed to the pivoting efforts to building the metaverse, without mentioning the rebranding and the name change. The metaverse vision and the name change from Facebook, Inc. to Meta Platforms was introduced at Facebook Connect on October 28, 2021. Based on Facebook's PR campaign, the name change reflects the company's shifting long term focus of building the metaverse, a digital extension of the physical world by social media, virtual reality and augmented reality features. "Meta" had been registered as a trademark in the United States in 2018 (after an initial filing in 2015) for marketing, advertising, and computer services, by a Canadian company that provided big data analysis of scientific literature. 
This company was acquired in 2017 by the Chan Zuckerberg Initiative (CZI), a foundation established by Zuckerberg and his wife, Priscilla Chan, and became one of their projects. Following the rebranding announcement, CZI announced that it had already decided to deprioritize the earlier Meta project, thus it would be transferring its rights to the name to Meta Platforms, and the previous project would end in 2022. Soon after the rebranding, in early February 2022, Meta reported a greater-than-expected decline in profits in the fourth quarter of 2021. It reported no growth in monthly users, and indicated it expected revenue growth to stall. It also expected measures taken by Apple Inc. to protect user privacy to cost it some $10 billion in advertisement revenue, an amount equal to roughly 8% of its revenue for 2021. In meeting with Meta staff the day after earnings were reported, Zuckerberg blamed competition for user attention, particularly from video-based apps such as TikTok. The 27% reduction in the company's share price which occurred in reaction to the news eliminated some $230 billion of value from Meta's market capitalization. Bloomberg described the decline as "an epic rout that, in its sheer scale, is unlike anything Wall Street or Silicon Valley has ever seen". Zuckerberg's net worth fell by as much as $31 billion. Zuckerberg owns 13% of Meta, and the holding makes up the bulk of his wealth. According to published reports by Bloomberg on March 30, 2022, Meta turned over data such as phone numbers, physical addresses, and IP addresses to hackers posing as law enforcement officials using forged documents. The law enforcement requests sometimes included forged signatures of real or fictional officials. When asked about the allegations, a Meta representative said, "We review every data request for legal sufficiency and use advanced systems and processes to validate law enforcement requests and detect abuse." 
In June 2022, Sheryl Sandberg, the chief operating officer of 14 years, announced she would step down that year. Zuckerberg said that Javier Olivan would replace Sandberg, though in a "more traditional" role. In March 2022, Meta (except Meta-owned WhatsApp) and Instagram were banned in Russia and added to the Russian list of terrorist and extremist organizations for alleged Russophobia and hate speech (up to genocidal calls) amid the ongoing Russian invasion of Ukraine. Meta appealed against the ban, but it was upheld by a Moscow court in June of the same year. Also in March 2022, Meta and Italian eyewear giant Luxottica released Ray-Ban Stories, a series of smartglasses that could play music and take pictures. Meta and Luxottica parent company EssilorLuxottica declined to disclose sales figures for the product line as of September 2022, though Meta expressed satisfaction with its customer feedback. In July 2022, Meta saw its first year-on-year revenue decline when its total revenue slipped by 1% to $28.8bn. Analysts and journalists attributed the loss to its advertising business, which had been limited by Apple's App Tracking Transparency feature and the number of people who had opted not to be tracked by Meta apps. Zuckerberg also attributed the decline to increasing competition from TikTok. On October 27, 2022, Meta's market value dropped to $268 billion, a loss of around $700 billion compared to 2021, and its shares fell by 24%. It lost its spot among the top 20 US companies by market cap, despite having reached the top 5 in the previous year. In November 2022, Meta laid off 11,000 employees, 13% of its workforce. Zuckerberg said the decision to aggressively increase Meta's investments had been a mistake, as he had wrongly predicted that the surge in e-commerce would last beyond the COVID-19 pandemic. He also attributed the decline to increased competition, a global economic downturn and "ads signal loss".
Plans to lay off a further 10,000 employees began in April 2023. The layoffs were part of a general downturn in the technology industry, alongside layoffs by companies including Google, Amazon, Tesla, Snap, Twitter and Lyft. Starting in 2022, Meta scrambled to catch up to other tech companies in adopting specialized artificial intelligence hardware and software. It had been using less expensive CPUs instead of GPUs for AI work, but that approach turned out to be less efficient. The company gifted the Inter-university Consortium for Political and Social Research $1.3 million to finance the Social Media Archive's aim of making its data available for social science research. In 2023, Ireland's Data Protection Commissioner imposed a record €1.2 billion fine on Meta for transferring data from Europe to the United States without adequate protections for EU citizens. In March 2023, Meta announced a new round of layoffs that would cut 10,000 employees and close 5,000 open positions to make the company more efficient. Meta's revenue surpassed analyst expectations for the first quarter of 2023 after it announced that it was increasing its focus on AI. On July 6, Meta launched a new app, Threads, a competitor to Twitter. Meta announced its artificial intelligence model Llama 2 in July 2023, available for commercial use via partnerships with major cloud providers such as Microsoft. It was the first project to be unveiled by Meta's generative AI group after it was set up in February. Meta would not charge for access or usage but would instead operate on an open-source model, allowing it to ascertain what improvements needed to be made. Prior to this announcement, Meta had said it had no plans to release Llama 2 for commercial use. An earlier version of Llama had been released to academics.
In August 2023, Meta announced the permanent removal of news content from Facebook and Instagram in Canada due to the Online News Act, which requires platforms to compensate Canadian news outlets for content shared on them. The Online News Act was in effect by year-end, but Meta did not participate in the regulatory process. In October 2023, Zuckerberg said that AI would be Meta's biggest investment area in 2024. Meta finished 2023 as one of the best-performing technology stocks of the year, with its share price up 150 percent. Its stock reached an all-time high in January 2024, bringing Meta within 2% of a $1 trillion market capitalization. In November 2023, Meta Platforms launched an ad-free subscription service in Europe, allowing subscribers to opt out of having their personal data collected for targeted advertising. A group of 28 European organizations, including Max Schrems' advocacy group NOYB, the Irish Council for Civil Liberties, Wikimedia Europe, and the Electronic Privacy Information Center, signed a 2024 letter to the European Data Protection Board (EDPB) expressing concern that this subscription model would undermine privacy protections, specifically GDPR data protection standards. Meta removed the Facebook and Instagram accounts of Iran's Supreme Leader Ali Khamenei in February 2024, citing repeated violations of its Dangerous Organizations & Individuals policy. As of March 2024, Meta was under investigation by the FDA for the alleged use of its social media platforms to sell illegal drugs. On 16 May 2024, the European Commission began an investigation into Meta over concerns related to child safety. In May 2023, Iraqi social media influencer Esaa Ahmed-Adnan encountered a troubling issue when Instagram removed his posts, citing false copyright violations despite his content being original and free of copyrighted material.
He discovered that extortionists were behind these takedowns, offering to restore his content for $3,000 or provide ongoing protection for $1,000 per month. This scam, exploiting Meta's rights management tools, became widespread in the Middle East, revealing a gap in Meta's enforcement in developing regions. Aws al-Saadi, founder of the Iraqi nonprofit Tech4Peace, helped Ahmed-Adnan and others, but the restoration process was slow, leading to significant financial losses for many victims, including prominent figures like Ammar al-Hakim. The situation highlighted Meta's challenges in balancing global growth with effective content moderation and protection. On 16 September 2024, Meta announced it had banned Russian state media outlets from its platforms worldwide due to concerns about "foreign interference activity." The decision followed allegations that RT and its employees had funneled $10 million through shell companies to secretly fund influence campaigns on various social media channels. Meta's actions were part of a broader effort to counter Russian covert influence operations, which had intensified since the invasion of Ukraine. At its 2024 Connect conference, Meta presented Orion, its first pair of augmented reality glasses. Though Orion was originally intended to be sold to consumers, the manufacturing process turned out to be too complex and expensive. Instead, the company pivoted to producing a small number of the glasses for internal use. On 4 October 2024, Meta announced its new AI model, Movie Gen, capable of generating realistic video and audio clips from user prompts. Meta stated it would not release Movie Gen for open development, preferring to collaborate directly with content creators and integrate it into its products by the following year. The model was built using a combination of licensed and publicly available datasets.
On October 31, 2024, ProPublica published an investigation into deceptive political advertisement scams that sometimes use hundreds of hijacked profiles and Facebook pages run by organized networks of scammers. The authors cited spotty enforcement by Meta as a major reason for the extent of the issue. In November 2024, TechCrunch reported that Meta was considering building a $10bn global underwater cable spanning 25,000 miles. In the same month, Meta closed down 2 million accounts on Facebook and Instagram that were linked to scam centers in Myanmar, Laos, Cambodia, the Philippines, and the United Arab Emirates running pig-butchering scams. In December 2024, Meta announced that, beginning February 2025, it would require advertisers running financial-services ads in Australia to verify information about the beneficiary and the payer, in a bid to curb scams. On December 4, 2024, Meta announced it would invest US$10 billion in its largest AI data center, in northeast Louisiana, powered by natural gas facilities. On the 11th of that month, Meta experienced a global outage affecting accounts on all of its social media and messaging applications. Outage reports on DownDetector reached 70,000+ and 100,000+ within minutes for Instagram and Facebook, respectively. In January 2025, Meta announced plans to roll back its diversity, equity, and inclusion (DEI) initiatives, citing shifts in the "legal and policy landscape" in the United States following the 2024 presidential election. The decision followed reports that CEO Mark Zuckerberg sought to align the company more closely with the incoming Trump administration, including changes to content moderation policies and executive leadership. The new content moderation policies continued to bar insults about a person's intellect or mental illness, but made an exception allowing users to call LGBTQ people mentally ill because they are gay or transgender.
Later that month, Meta agreed to pay $25 million to settle a 2021 lawsuit brought by Donald Trump over the suspension of his social media accounts after the January 6 riots. Changes to Meta's moderation policies were controversial among its oversight board, with a significant divide in opinion between the board's US conservatives and its global members. In June 2025, Meta Platforms Inc. decided to make a multibillion-dollar investment in the artificial intelligence startup Scale AI. The financing could exceed $10 billion in value, which would make it one of the largest private company funding events of all time. In October 2025, it was announced that Meta would lay off 600 employees in its artificial intelligence unit in an effort to make the unit leaner and more efficient. The company referred to its AI unit as "bloated" and sought to trim down the department. The layoffs affected Meta's AI infrastructure units, its Fundamental Artificial Intelligence Research (FAIR) unit and other product-related positions. Mergers and acquisitions Meta has acquired multiple companies (often identified as talent acquisitions). One of its first major acquisitions came in April 2012, when it acquired Instagram for approximately US$1 billion in cash and stock. In October 2013, Facebook, Inc. acquired Onavo, an Israeli mobile web analytics company. In February 2014, Facebook, Inc. announced it would buy the mobile messaging company WhatsApp for US$19 billion in cash and stock. The acquisition was completed on October 6. Later that year, Facebook bought Oculus VR for $2.3 billion in cash and stock; Oculus released its first consumer virtual reality headset in 2016. In late November 2019, Facebook, Inc. announced the acquisition of the game developer Beat Games, responsible for developing one of that year's most popular VR games, Beat Saber. In late 2022, after Facebook, Inc. rebranded to Meta Platforms, Inc., Oculus was rebranded to Meta Quest. In May 2020, Facebook, Inc.
announced it had acquired Giphy for a reported cash price of $400 million, to be integrated with the Instagram team. However, in August 2021, the UK's Competition and Markets Authority (CMA) stated that Facebook, Inc. might have to sell Giphy, after an investigation found that the deal between the two companies would harm competition in the display advertising market. Facebook, Inc. was fined $70 million by the CMA for deliberately failing to report all information regarding the acquisition and the ongoing antitrust investigation. In October 2022, the CMA ruled for a second time that Meta be required to divest Giphy, stating that Meta already controlled half of the advertising in the UK. Meta agreed to the sale, though it stated that it disagreed with the decision itself. In May 2023, Giphy was divested to Shutterstock for $53 million. In November 2020, Facebook, Inc. announced that it planned to purchase the customer-service platform and chatbot specialist startup Kustomer to encourage companies to use its platform for business. Kustomer was reportedly valued at slightly over $1 billion. The deal closed in February 2022 after regulatory approval. In September 2022, Meta acquired Lofelt, a Berlin-based haptic tech startup. In December 2025, it was announced that Meta had acquired the AI-wearables startup Limitless. In the same month, it also acquired another AI startup, Manus AI, for $2 billion. Manus announced in December that its platform had reached $100 million in recurring revenue just 8 months after launch, and Meta said it would scale the platform to many other businesses. In January 2026, it was announced that Meta's proposed acquisition of Manus was undergoing preliminary scrutiny by Chinese regulators. The examination concerns the cross-border transfer of artificial intelligence technology developed in China. Lobbying In 2020, Facebook, Inc. spent $19.7 million on lobbying, hiring 79 lobbyists.
In 2019, it had spent $16.7 million on lobbying and had a team of 71 lobbyists, up from $12.6 million and 51 lobbyists in 2018. Facebook was the largest lobbying spender among the Big Tech companies in 2020. The lobbying team includes top congressional aide John Branscome, hired in September 2021 to help the company fend off threats from Democratic lawmakers and the Biden administration. In December 2024, Meta donated $1 million to the inauguration fund for then-President-elect Donald Trump. In 2025, Meta was listed among the donors funding the construction of the White House State Ballroom. Partnerships In February 2026, Meta announced a long-term partnership with Nvidia. Censorship In August 2024, Mark Zuckerberg sent a letter to Jim Jordan indicating that during the COVID-19 pandemic the Biden administration had repeatedly asked Meta to limit certain COVID-19 content, including humor and satire, on Facebook and Instagram. In 2016, Meta hired Jordana Cutler, formerly an employee at the Israeli Embassy to the United States, as its policy chief for Israel and the Jewish Diaspora. In this role, Cutler pushed for the censorship of accounts belonging to Students for Justice in Palestine chapters in the United States. Critics have said that Cutler's position gives the Israeli government undue influence over Meta policy, and that few countries have such high levels of contact with Meta policymakers. Following Donald Trump's return to office in 2025, various sources noted possible censorship related to the Democratic Party on Instagram and other Meta platforms. In February 2025, a Meta representative flagged journalist Gil Duran's article and other "critiques of tech industry figures" as spam or sensitive content, limiting their reach. In March 2025, Meta attempted to block former employee Sarah Wynn-Williams from promoting or further distributing her memoir, Careless People, which includes allegations of unaddressed sexual harassment in the workplace by senior executives.
The New York Times reported that the arbitration was among Meta's most forceful attempts to suppress a former employee's account of workplace dynamics. The publisher Macmillan reacted to the ruling by the Emergency International Arbitral Tribunal by stating that it would ignore its provisions. As of 15 March 2025, hardback and digital versions of Careless People were being offered for sale by major online retailers. From October 2025, Meta began removing and restricting access to accounts and pages related to LGBTQ issues, reproductive health and abortion information on its platforms. Martha Dimitratou, executive director of Repro Uncensored, called Meta's shadow-banning of these issues "one of the biggest waves of censorship we are seeing". Disinformation concerns Since its inception, Meta has been accused of hosting fake news and misinformation. In the wake of the 2016 United States presidential election, Zuckerberg began to take steps to reduce the prevalence of fake news, as the platform had been criticized for its potential influence on the outcome of the election. The company initially partnered with ABC News, the Associated Press, FactCheck.org, Snopes and PolitiFact for its fact-checking initiative; as of 2018, it had over 40 fact-checking partners across the world, including The Weekly Standard. A May 2017 review by The Guardian found that the platform's fact-checking initiatives of partnering with third-party fact-checkers and publicly flagging fake news were regularly ineffective, and appeared to be having minimal impact in some cases. In 2018, journalists working as fact-checkers for the company criticized the partnership, stating that it had produced minimal results and that the company had ignored their concerns. In 2024, Meta's decision to continue to disseminate a falsified video of US president Joe Biden, even after it had been proven fake, attracted criticism and concern.
In January 2025, Meta ended its use of third-party fact-checkers in favor of a user-run community notes system similar to the one used on X. While Zuckerberg supported these changes, saying that the amount of censorship on the platform had been excessive, the decision drew criticism from fact-checking organizations, which stated that the changes would make it more difficult for users to identify misinformation. Meta also faced criticism for weakening its policies on hate speech that were designed to protect minorities and LGBTQ+ individuals from bullying and discrimination. While moving its content review teams from California to Texas, Meta changed its hateful conduct policy to eliminate restrictions on anti-LGBT and anti-immigrant hate speech, as well as to explicitly allow users to accuse LGBT people of being mentally ill or abnormal based on their sexual orientation or gender identity. In January 2025, Meta also faced significant criticism for its role in removing LGBTQ+ content from its platforms. The removal of LGBTQ+ themes was noted as part of a wider crackdown on content deemed to violate its community guidelines. Meta's content moderation policies, which were designed to combat harmful speech and protect users from discrimination, inadvertently led to the removal or restriction of LGBTQ+ content, particularly posts highlighting LGBTQ+ identities, support, or political issues. According to reports, LGBTQ+ posts, including those that simply celebrated pride or advocated for LGBTQ+ rights, were flagged and removed for reasons that critics argued were vague or inconsistently applied. Many LGBTQ+ activists and users on Meta's platforms expressed concern that such actions stifled visibility and expression, potentially isolating LGBTQ+ individuals and communities, especially in spaces that were historically important for outreach and support.
Lawsuits Numerous lawsuits have been filed against the company, both when it was known as Facebook, Inc. and as Meta Platforms. In March 2020, the Office of the Australian Information Commissioner (OAIC) sued Facebook for serious and repeated breaches of privacy law in connection with the Cambridge Analytica scandal. Each violation of the Privacy Act carries a theoretical penalty of up to $1.7 million. The OAIC estimated that a total of 311,127 Australians had been exposed. On December 8, 2020, the U.S. Federal Trade Commission, 46 states (excluding Alabama, Georgia, South Carolina, and South Dakota), the District of Columbia and the territory of Guam launched Federal Trade Commission v. Facebook, an antitrust lawsuit against Facebook. The lawsuit concerns Facebook's acquisition of two competitors, Instagram and WhatsApp, and the ensuing monopolistic situation. The FTC alleged that Facebook held monopoly power in the U.S. social networking market and sought to force the company to divest Instagram and WhatsApp to break up the conglomerate. William Kovacic, a former chairman of the Federal Trade Commission, argued the case would be difficult to win, as it would require the government to construct a counterfactual of an internet in which the Facebook-WhatsApp-Instagram entity did not exist, and to prove that the combination harmed competition or consumers. In November 2025, it was ruled that Meta did not violate antitrust laws and holds no monopoly in the market. On December 24, 2021, a court in Russia fined Meta $27 million after the company declined to remove unspecified banned content. The fine was reportedly tied to the company's annual revenue in the country. In May 2022, a lawsuit was filed in Kenya against Meta and its local outsourcing company Sama, alleging poor working conditions for the Kenyan workers moderating Facebook posts. According to the lawsuit, 260 screeners were made redundant with unclear reasoning.
The lawsuit seeks financial compensation and an order that outsourced moderators be given the same health benefits and pay scale as Meta employees. In June 2022, eight lawsuits were filed across the U.S. alleging that excessive exposure to platforms including Facebook and Instagram had led to attempted or actual suicides, eating disorders and sleeplessness, among other issues. The litigation followed a former Facebook employee's testimony in Congress that the company refused to take responsibility. The company noted that it had developed tools for parents to keep track of their children's activity on Instagram and set time limits, in addition to Meta's "Take a break" reminders. The company is also providing resources specific to eating disorders and developing AI to prevent children under the age of 13 from signing up for Facebook or Instagram. In June 2022, Meta settled a lawsuit with the US Department of Justice. The lawsuit, filed in 2019, alleged that the company enabled housing discrimination through targeted advertising, as it allowed homeowners and landlords to run housing ads excluding people based on sex, race, religion, and other characteristics. The U.S. Department of Justice stated that this was in violation of the Fair Housing Act. Meta was handed a penalty of $115,054 and given until December 31, 2022, to stop using the discriminatory ad-targeting tool. In January 2023, Meta was fined €390 million for violations of the European Union General Data Protection Regulation. In May 2023, following a binding decision of the European Data Protection Board, Meta was fined a record €1.2 billion for breaching European Union data privacy laws by transferring personal data of Facebook users to servers in the U.S.
In July 2024, Meta agreed to pay the state of Texas US$1.4 billion to settle a lawsuit brought by Texas Attorney General Ken Paxton accusing the company of collecting users' biometric data without consent, setting a record for the largest privacy-related settlement ever obtained by a state attorney general. In October 2024, Meta Platforms faced lawsuits in Japan from 30 plaintiffs who claimed they were defrauded by fake investment ads on Facebook and Instagram featuring false celebrity endorsements. The plaintiffs are seeking approximately $2.8 million in damages. In April 2025, the Kenyan High Court ruled that a US$2.4 billion lawsuit, in which three plaintiffs claim that Facebook inflamed civil violence in Ethiopia in 2021, could proceed. Also in April 2025, Meta was fined €200 million ($230 million) for breaking the Digital Markets Act by imposing a "consent or pay" system that forces users either to allow their personal data to be used to target advertisements or to pay a subscription fee for advertising-free versions of Facebook and Instagram. In late April 2025, a case was filed against Meta in Ghana over the alleged psychological distress experienced by content moderators employed to take down disturbing social media content, including depictions of murders, extreme violence and child sexual abuse. Meta had moved the moderation service to the Ghanaian capital, Accra, after legal issues in the previous location, Kenya. The new moderation company is Teleperformance, a multinational corporation with a history of workers' rights violations. Reports suggest that conditions there are worse than in the previous Kenyan location, with many workers afraid to speak out for fear of being returned to conflict zones. Workers reported mental illness, suicide attempts and low pay.
On 26 January 2026, a case was filed in a New Mexico state court alleging that Mark Zuckerberg had approved allowing minors to access artificial intelligence chatbot companions that safety staffers warned were capable of sexual interactions. In 2020, the company UReputation, which had been involved in several cases concerning the management of digital armies, filed a lawsuit against Facebook, accusing it of unlawfully transmitting personal data to third parties. Legal actions were initiated in Tunisia, France, and the United States. In 2025, the United States District Court for the Northern District of Georgia approved a discovery procedure, allowing UReputation to access documents and evidence held by Meta. Structure As of October 2022, Meta had 83,553 employees worldwide. Meta Platforms is mainly owned by institutional investors, who hold around 80% of all shares, while insiders control the majority of voting shares. The three largest individual investors in 2024 were Mark Zuckerberg, Sheryl Sandberg and Christopher K. Cox. Roger McNamee, an early Facebook investor and Zuckerberg's former mentor, said Facebook had "the most centralized decision-making structure I have ever encountered in a large company". Facebook co-founder Chris Hughes has stated that chief executive officer Mark Zuckerberg has too much power, that the company is now a monopoly, and that, as a result, it should be split into multiple smaller companies. In an op-ed in The New York Times, Hughes said he was concerned that Zuckerberg had surrounded himself with a team that did not challenge him, and that it is the U.S. government's job to hold him accountable and curb his "unchecked power". He also said that "Mark's power is unprecedented and un-American." Several U.S. politicians agreed with Hughes.
European Union Commissioner for Competition Margrethe Vestager stated that splitting Facebook should be done only as "a remedy of the very last resort", and that it would not solve Facebook's underlying problems. Revenue Facebook ranked No. 34 in the 2020 Fortune 500 list of the largest United States corporations by revenue, with almost $86 billion in revenue, most of it coming from advertising. One analysis of 2017 data determined that the company earned US$20.21 per user from advertising. According to New York magazine, since its rebranding Meta has reportedly lost $500 billion as a result of new privacy measures put in place by companies such as Apple and Google, which prevent Meta from gathering users' data. In February 2015, Facebook announced it had reached two million active advertisers, with most of the gain coming from small businesses. An active advertiser was defined as an entity that had advertised on the Facebook platform in the last 28 days. In March 2016, Facebook announced it had reached three million active advertisers, with more than 70% from outside the United States. Prices for advertising follow a variable pricing model based on auctioning ad placements and the potential engagement levels of the advertisement itself. As with other online advertising platforms such as Google and Twitter, targeting of advertisements is one of the chief merits of digital advertising compared to traditional media. Marketing on Meta is employed through two methods based on the viewing habits, likes and shares, and purchasing data of the audience: targeted audiences and "lookalike" audiences. The U.S. IRS challenged the valuation Facebook used when it transferred IP from the U.S. to Facebook Ireland (now Meta Platforms Ireland) in 2010 (which Facebook Ireland then revalued higher before charging out), as it was building its double Irish tax structure. The case is ongoing and Meta faces a potential fine of $3–5bn. The U.S.
Tax Cuts and Jobs Act of 2017 changed Facebook's global tax calculations. Meta Platforms Ireland is subject to the U.S. GILTI tax of 10.5% on global intangible profits (i.e. Irish profits). On the basis that Meta Platforms Ireland Limited is paying some tax, the effective minimum US tax for Facebook Ireland will be circa 11%. In contrast, Meta Platforms Inc. would incur a special IP tax rate of 13.125% (the FDII rate) if its Irish business relocated to the U.S. Tax relief in the U.S. (21% vs. Irish at the GILTI rate) and accelerated capital expensing, would make this effective U.S. rate around 12%. The insignificance of the U.S./Irish tax difference was demonstrated when Facebook moved 1.5bn non-EU accounts to the U.S. to limit exposure to GDPR. Facilities Users outside of the U.S. and Canada contract with Meta's Irish subsidiary, Meta Platforms Ireland Limited (formerly Facebook Ireland Limited), allowing Meta to avoid US taxes for all users in Europe, Asia, Australia, Africa and South America. Meta is making use of the Double Irish arrangement which allows it to pay 2–3% corporation tax on all international revenue. In 2010, Facebook opened its fourth office, in Hyderabad, India, which houses online advertising and developer support teams and provides support to users and advertisers. In India, Meta is registered as Facebook India Online Services Pvt Ltd. It also has offices or planned sites in Chittagong, Bangladesh; Dublin, Ireland; and Austin, Texas, among other cities. Facebook opened its London headquarters in 2017 in Fitzrovia in central London. Facebook opened an office in Cambridge, Massachusetts in 2018. The offices were initially home to the "Connectivity Lab", a group focused on bringing Internet access to those who do not have access to the Internet. In April 2019, Facebook opened its Taiwan headquarters in Taipei. In March 2022, Meta opened new regional headquarters in Dubai. 
In September 2023, it was reported that Meta had paid £149m to British Land to break the lease on its Triton Square office in London. Meta reportedly had another 18 years left on its lease on the site. As of 2023, Facebook operated 21 data centers. It committed to purchasing 100% renewable energy and to reducing its greenhouse gas emissions by 75% by 2020. Its data center technologies include Fabric Aggregator, a distributed network system that accommodates larger regions and varied traffic patterns. Reception US Representative Alexandria Ocasio-Cortez responded in a tweet to Zuckerberg's announcement about Meta, saying: "Meta as in 'we are a cancer to democracy metastasizing into a global surveillance and propaganda machine for boosting authoritarian regimes and destroying civil society ... for profit!'" Frances Haugen, the ex-Facebook employee and whistleblower behind the Facebook Papers, responded to the rebranding efforts by expressing doubts about the company's ability to improve while led by Mark Zuckerberg, and urged the chief executive officer to resign. In November 2021, a video published by Inspired by Iceland went viral, in which a Zuckerberg look-alike promoted the Icelandverse, a place of "enhanced actual reality without silly looking headsets". In a December 2021 interview, SpaceX and Tesla chief executive officer Elon Musk said he could not see a compelling use case for the VR-driven metaverse, adding: "I don't see someone strapping a frigging screen to their face all day." In January 2022, Louise Eccles of The Sunday Times logged into the metaverse with the intention of making a video guide. She wrote: Initially, my experience with the Oculus went well. I attended work meetings as an avatar and tried an exercise class set in the streets of Paris. The headset enabled me to feel the thrill of carving down mountains on a snowboard and the adrenaline rush of climbing a mountain without ropes.
Yet switching to the social apps, where you mingle with strangers also using VR headsets, it was at times predatory and vile. Eccles described being sexually harassed by another user, as well as "accents from all over the world, American, Indian, English, Australian, using racist, sexist, homophobic and transphobic language". She also encountered users as young as 7 years old on the platform, despite Oculus headsets being intended for users over 13.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Sceptrum_et_Manus_Iustitiae] | [TOKENS: 200] |
Sceptrum et Manus Iustitiae Sceptrum et Manus Iustitiae (Latin for "scepter and hand of justice") was a constellation created by Augustin Royer in 1679 to honor King Louis XIV of France. It was formed from stars of what are today the constellation Lacerta and western Andromeda. Owing to its awkward name, the constellation was modified and renamed several times; for example, some old star maps show Sceptrum Imperiale, Stellio and Scettro, and Johannes Hevelius's star map divides the area between the new constellation Lacerta and the end of the chain fettering Andromeda. The connection with the later constellation Frederici Honores, which occupied the chain end of Andromeda, is unclear, except that both represent a regal scepter attributed to varying regents.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Chittagong] | [TOKENS: 10375] |
Chittagong Chittagong (/ˈtʃɪtəɡɒŋ/ CHIT-ə-gong), officially Chattogram (Bengali: চট্টগ্রাম, romanized: Côṭṭôgrām, IPA: [ˈt͡ʃɔʈːoɡram], traditionally Bengali: চাটগাঁও, romanized: Cāṭgão; Chittagonian: চিটাং/সিটাং, romanized: Sitang), is the second-largest city in Bangladesh. It is home to the Port of Chittagong, the busiest port in Bangladesh and on the Bay of Bengal. The city is also the business capital of Bangladesh. It is the administrative seat of an eponymous division and district. The city is located on the banks of the Karnaphuli River between the Chittagong Hill Tracts and the Bay of Bengal. In 2022, the Chittagong District had a population of approximately 9.2 million according to a census conducted by the government of Bangladesh. In 2022, the city area had a population of more than 5.6 million. The city is home to many large local businesses and plays an important role in the Bangladeshi economy. One of the world's oldest ports with a functional natural harbor for centuries, Chittagong appeared on ancient Greek and Roman maps, including on Ptolemy's world map. It was located on the southern branch of the Silk Road. In the 9th century, merchants from the Abbasid Caliphate established a trading post in Chittagong. The port fell to the Muslim conquest of Bengal during the 14th century. It was the site of a royal mint under the Delhi Sultanate, Bengal Sultanate and Mughal Empire. Between the 15th and 17th centuries, Chittagong was also a centre of administrative, literary, commercial and maritime activities in Arakan, a narrow strip of land along the eastern coast of the Bay of Bengal which was under strong Bengali influence for 350 years. During the 16th century, the port became a Portuguese trading post and João de Barros described it as "the most famous and wealthy city of the Kingdom of Bengal". The Mughal Empire expelled the Portuguese and Arakanese in 1666. The Nawab of Bengal ceded the port to the British East India Company in 1793.
The Port of Chittagong was re-organized in 1887 and its busiest shipping links were with British Burma. In 1928, Chittagong was declared a "Major Port" of British India. During World War II, Chittagong was a base for Allied Forces engaged in the Burma Campaign. The port city began to expand and industrialize during the 1940s, particularly after the Partition of British India. The city was the historic terminus of the Assam Bengal Railway and Pakistan Eastern Railway. During the Bangladesh Liberation War in 1971, Chittagong was the site of the Bangladeshi declaration of independence. The port city has benefited from the growth of heavy industry, logistics, and manufacturing in Bangladesh. Trade unionism was strong during the 1990s. Chittagong accounts for 12% of Bangladesh's GDP, including 40% of industrial output, 80% of international trade, and 50% of tax revenue. The port city is home to many of the oldest and largest companies in the country. The Port of Chittagong is one of the busiest ports in South Asia. The largest base of the Bangladesh Navy is located in Chittagong, along with an air base of the Bangladesh Air Force, garrisons of the Bangladesh Army and the main base of the Bangladesh Coast Guard. The eastern zone of the Bangladesh Railway is based in Chittagong. The Chittagong Stock Exchange is one of the twin stock markets of Bangladesh, with over 700 listed companies. The Chittagong Tea Auction is a commodity exchange dealing in Bangladeshi tea. The CEPZ and KEPZ are key industrial zones with foreign direct investments. The city is served by Shah Amanat International Airport for domestic and international flights. The Karnaphuli Tunnel, the first and only underwater road tunnel in South Asia, is located in Chittagong. The city is the hometown of prominent economists, a Nobel laureate, scientists, freedom fighters and entrepreneurs. Chittagong has a high degree of religious and ethnic diversity among Bangladeshi cities, despite having a large Muslim majority.
Minorities include Hindus, Christians, Chakmas, Marmas, Baruas, Tripuris, Garos and others. Etymology The etymology of Chittagong is uncertain. The port city has been known by various names in history, including Chatigaon, Chatigam, Chattagrama, Islamabad, Chattala, Chaityabhumi and Porto Grande De Bengala. The Bengali word for Chittagong, Chattogram (চট্টগ্রাম), has the suffix "-gram" (গ্রাম), meaning "village" in Standard Bengali. The earliest records, before Islam reached the region, state that it was a place of chaitya, or Buddhist monasteries. The city had a very large Buddhist population before Islam. The city was renamed Islamabad (City of Islam) during the Mughal era; the name continues to be used in the old city. In April 2018, the Cabinet Division of the Government of Bangladesh decided to change the city's name to Chattogram, based on its Bengali spelling and pronunciation; the move was criticized in the Bangladeshi media. One explanation credits the first Arab traders, who called the place Shatt Al Ghangh (Arabic: شط الغنغ), where shatt means "delta" and ghangh stood for the Ganges; from that term, Chattala evolved. The Arakanese chronicles record that a king named Tsu-la-taing Tsandaya (Sula Taing Chandra), after conquering Bengal, set up a stone pillar as a trophy and memorial at the place since called Tst-ta-gaung, as the limit of his conquest. History Stone Age fossils and tools unearthed in the region indicate that Chittagong has been inhabited since Neolithic times. It is an ancient port city, with a recorded history dating back to the 4th century BC. Its harbour was mentioned in Ptolemy's world map in the 2nd century as one of the most impressive ports in the East. The region was part of the ancient Bengali Samatata and Harikela kingdoms. The Chandra dynasty once dominated the area and was followed by the Varman dynasty and Deva dynasty. The Chinese traveller Xuanzang described the area as "a sleeping beauty rising from mist and water" in the 7th century.
Many Sufi missionaries settled in Chittagong and played an instrumental role in the spread of Islam. Sultan Fakhruddin Mubarak Shah of Sonargaon conquered Chittagong in 1340, making it a part of the Sultanate of Bengal. It was the principal maritime gateway to the kingdom, which was reputed to be one of the wealthiest states in the Indian subcontinent. Medieval Chittagong was a hub for maritime trade with China, Sumatra, the Maldives, Sri Lanka, the Middle East, and East Africa. It was notable for its medieval trade in pearls, silk, muslin, rice, bullion, horses, and gunpowder. The port was also a major shipbuilding hub. Ibn Battuta visited the port city in 1345. Niccolò de' Conti, from Venice, also visited around the same time as Battuta. The Chinese admiral Zheng He's treasure fleet anchored in Chittagong during imperial missions to the Sultanate of Bengal. Dhaniya Manikya conquered Chittagong in 1513. Hossain Shah sent his commander Gorai Mallik to attack Tripura, and Gorai Mallik recaptured the lost territories, but the following year Dhaniya Manikya conquered Chittagong again. Arakanese rule over Chittagong spanned from the late 16th century to 1666, marking a significant yet turbulent era in the region's history. The Kingdom of Mrauk U, centered on the west coast of present-day Myanmar, expanded into south-eastern Bengal, with Chittagong becoming a strategic part of its domain. The Arakanese maintained their power through alliances with the Portuguese, who were instrumental in fortifying their control. Chittagong evolved into a centre of trade and piracy during this time, with Portuguese and Arakanese forces frequently raiding Mughal territories. The blending of Bengali, Buddhist, and Portuguese influences made the region a unique cultural and administrative frontier. The decline of Arakanese rule was triggered by political conflicts, including their involvement in the Mughal succession struggle.
The assassination of the Mughal prince Shah Shuja in Arakan strained relations with the Mughal Empire, prompting a decisive campaign led by Subahdar Shaista Khan in 1666. The Mughals recaptured Chittagong, ending nearly a century of Arakanese dominance. This period left a lasting legacy on the region, highlighting the interplay of trade, politics, and cultural exchange between Bengal and Arakan. Chittagong featured prominently in the military history of the Bengal Sultanate, including during the Reconquest of Arakan and the Bengal Sultanate–Kingdom of Mrauk U War of 1512–1516. Between the 13th and 16th centuries, Arabs and Persians heavily colonized the port city of Chittagong, initially arriving for trade and to spread Islam. Most Arab settlers arrived via the trade route between Iraq and Chittagong and were perhaps the prime reason for the spread of Islam to Bangladesh. The first Persian settlers also arrived for trade and religious purposes, with the possible goal of Persianisation as well. Persians and other Iranic peoples deeply affected the history of the Bengal Sultanate, with Persian being one of the main languages of the Muslim state and an influence on the Chittagonian language and its writing scripts. Two decades after Vasco da Gama's landing in Calicut, the Bengal Sultanate permitted a Portuguese settlement to be established in Chittagong in 1528. It became the first European colonial enclave in Bengal. The Bengal Sultanate lost control of Chittagong in 1531 after Arakan declared independence and established the Kingdom of Mrauk U. This altered geopolitical landscape allowed the Portuguese unhindered control of Chittagong for over a century. Portuguese ships from Goa and Malacca began frequenting the port city in the 16th century. The cartaz system was introduced, requiring all ships in the area to purchase naval trading licenses from the Portuguese settlement. Slave trade and piracy flourished.
The nearby island of Sandwip was conquered in 1602. In 1615, the Portuguese Navy defeated a joint Dutch East India Company and Arakanese fleet near the coast of Chittagong. In 1666, the Mughal government of Bengal, led by viceroy Shaista Khan, moved to retake Chittagong from Portuguese and Arakanese control by launching the Mughal conquest of Chittagong. The Mughals attacked the Arakanese from the jungle with a 6,500-strong army, which was further supported by 288 Mughal naval ships blockading the Chittagong harbor. After three days of battle, the Arakanese surrendered. The Mughals expelled the Portuguese from Chittagong. Mughal rule ushered in a new era in the history of Chittagong, extending its territory to the southern bank of the Kashyapnadi (Kaladan River). The port city was renamed Islamabad. The Grand Trunk Road connected it with North India and Central Asia. Economic growth increased due to an efficient system of land grants for clearing hinterlands for cultivation. The Mughals also contributed to the architecture of the area, including the building of Fort Ander and many mosques. Chittagong was integrated into the prosperous Bengali economy, which also included Orissa and Bihar. Shipbuilding increased dramatically under Mughal rule, and the Ottoman sultans had many warships built in Chittagong during this period. In 1685, the British East India Company sent out an expedition under Admiral Nicholson with instructions to seize and fortify Chittagong on behalf of the English; however, the expedition proved abortive. Two years later, the company's Court of Directors decided to make Chittagong the headquarters of their Bengal trade and sent out a fleet of ten or eleven ships to seize it under Captain Heath. However, after reaching Chittagong in early 1689, the fleet found the city too strongly held and abandoned the attempt to capture it.
The city was possessed by the Nawab of Bengal until 1793, when the East India Company took complete control of the former Mughal province of Bengal. The First Anglo-Burmese War in 1823 threatened the British hold on Chittagong. There were several rebellions against British rule, notably during the Indian rebellion of 1857, when the 2nd, 3rd, and 4th companies of the 34th Bengal Infantry Regiment revolted and released all prisoners from the city's jail. In response, the rebels were suppressed by the Sylhet Light Infantry. Arakan was annexed in 1826 and incorporated into the Bengal Presidency. Agriculturalists from Chittagong played a key role in the development of the rice economy in Arakan. The economy of northern Arakan was integrated with the Chittagong economy. During this period, Arakan Division became one of the top rice exporters in the world. Bengalis from Chittagong were vital to the success of Arakan's rice industry. Railways were introduced in 1865, beginning with the Eastern Bengal Railway connecting Chittagong to Dacca and Calcutta. Chittagong became the main gateway to Eastern Bengal and Assam. In the 1890s, Chittagong became the terminus of the Assam Bengal Railway. The hinterland of Chittagong Port covered the tea- and jute-producing regions of Assam and Bengal, as well as Assam's oil industry. Chittagong was also linked to the crucial oil and gas industry in Burma. Chittagong was a major center of trade with British Burma. It hosted many prominent companies of the British Empire. The Chittagong armoury raid by Bengali revolutionaries in 1930 was a major event in British India's anti-colonial history. During World War II, Chittagong became a frontline city in the Southeast Asian Theater. It was a critical air, naval and military base for Allied Forces during the Burma Campaign against Japan. The Imperial Japanese Army Air Force carried out air raids on Chittagong in April and May 1942, in the run-up to the aborted Japanese invasion of Bengal.
After the Battle of Imphal, the tide turned in favour of the Allied Forces. Units of the United States Army Air Forces' 4th Combat Cargo Group were stationed at Chittagong Airfield in 1945. Commonwealth forces included troops from Britain, India, Australia, and New Zealand. The war had major negative impacts on the city, including the growth of refugees and the Great Famine of 1943. Many wealthy Chittagonians profited from wartime commerce. A total of 715 soldiers are buried at the Chittagong War Cemetery, which is maintained by the Commonwealth War Graves Commission. Allied soldiers constitute the bulk of burials in the cemetery. A few Japanese soldiers are also buried there. Remembrance Day services are held each year at the cemetery, with diplomats from Commonwealth countries like the UK, Bangladesh, Australia, India and Pakistan, as well as the United States and Japan, usually in attendance. The Partition of British India in 1947 made Chittagong the chief port of East Pakistan. By March 1948, the Chittagong harbour had become a bustling port for international shipping. The Chittagong Tea Auction was set up in 1949. The port city had branches of the Chartered Bank of India, Australia and China, Burmah Oil (known locally as Burmah Eastern), and the James Finlay shipping business. Wealthy Muslim families from British India and British Burma shifted their corporate headquarters to Chittagong. The Ispahani family shifted the head office of M. M. Ispahani Limited from Calcutta to Chittagong. The Ispahanis also relocated the Eastern Federal Insurance Company from Calcutta to Chittagong. The Ispahanis set up the Victory Jute Mills, the Chittagong Jute Manufacturing Company, and the Pahartali Textile Mills. The Africawala brothers set up the first steel re-rolling mills in Chittagong in 1952, which eventually became BSRM. Banks, shipping companies and insurance firms proliferated in the city. Many British-owned businesses in East Pakistan were based in Chittagong.
Britain's former flag carrier BOAC operated flights to the city. The Agrabad area emerged as the central business district in the 1950s and 1960s, with many corporate offices. The Ispahani Building and Jamuna Bhaban are some of the corporate buildings from this period. The Karnaphuli Paper Mills were built in 1959. The project to build the Eastern Refinery was started in 1963 and was partly funded by the last Shah of Iran. The Agrabad Chamber of Commerce was formed in 1963. It later became the Foreign Investors' Chamber of Commerce and Industry in Bangladesh. The Chittagong Development Authority (CDA) was created by the government to promote urban planning, while wealthy families like the Ispahanis contributed to social welfare by setting up schools and hospitals. The lawyer and industrialist A K Khan, who set up A K Khan & Company in the aftermath of World War II, represented Chittagong in the federal cabinet of Pakistan. However, East Pakistanis complained of a lack of investment in Chittagong in comparison to Karachi in West Pakistan, even though East Pakistan generated more exports and had a larger population. The Awami League demanded that the country's naval headquarters be shifted from Karachi to Chittagong. During the Bangladesh Liberation War in 1971, which was waged under the leadership of Sheikh Mujibur Rahman, Chittagong witnessed heavy fighting between rebel Bengali military regiments and the Pakistan Army. The city fell under Sector 1 in the Mukti Bahini chain of command, with Major Ziaur Rahman as the sector commander. The Bangladeshi Declaration of Independence was broadcast from Kalurghat Radio Station and transmitted internationally through foreign ships in Chittagong Port. Ziaur Rahman and M A Hannan announced the independence declaration from Chittagong. A K Khan drafted the English version of Zia's broadcast. These radio broadcasts began the journey of Swadhin Bangla Betar Kendra, which contributed heavily towards the liberation struggle.
The Pakistani military, and supporting Razakar militias, carried out widespread atrocities against civilians in the city. Mukti Bahini naval commandos sank several Pakistani warships during Operation Jackpot in August 1971. In December 1971, the Bangladesh Air Force and the Indian Air Force carried out heavy bombing of facilities occupied by the Pakistani military. A naval blockade was also enforced. After the war, the Soviet Union offered to clear mines in Chittagong Port free of cost, while Sweden offered to clear mines in Mongla port. Twenty-two vessels of the Soviet Pacific Fleet sailed from Vladivostok to Chittagong in May 1972. The process of clearing mines in the harbor took nearly a year and claimed the life of the Soviet marine Yuri V Redkin. Chittagong soon regained its status as a major port, with cargo tonnage surpassing pre-war levels in 1973. In the immediate aftermath of 1971, many industries were nationalized, but in Chittagong, factories and business properties were given back to their private owners. The Ispahani family had to write only one letter in order to get back all their properties from the Awami League government of Prime Minister Sheikh Mujibur Rahman. Under free-market reforms launched by President Ziaur Rahman in the late 1970s, the city became home to the first export processing zones in Bangladesh. Zia was assassinated during an attempted military coup in Chittagong in 1981. The 1991 Bangladesh cyclone inflicted heavy damage on the city. The Japanese government financed the construction of several heavy industries and an international airport in the 1980s and 1990s. Bangladeshi private sector investments have increased since 1991, especially with the formation of the Chittagong Stock Exchange in 1995. A new airport opened in 2000. The port city has been the pivot of Bangladesh's emerging economy in recent years, with the country's rising GDP growth rate.
Chittagong has seen several infrastructure projects taken up by the government of Prime Minister Sheikh Hasina, including the Chittagong Elevated Expressway, the first underwater tunnel in South Asia, the expansion of its port, and new parks, power plants and flyovers. Geography Chittagong lies at 22°20′06″N 91°49′57″E (22.33500°N, 91.83250°E). It straddles the coastal foothills of the Chittagong Hill Tracts in south-eastern Bangladesh. The Karnaphuli River runs along the southern banks of the city, including its central business district. The river enters the Bay of Bengal in an estuary located 12 kilometres (7.5 mi) west of downtown Chittagong. Mount Sitakunda is the highest peak in Chittagong District, with an elevation of 351 metres (1,152 ft). Within the city itself, the highest peak is Batali Hill at 85.3 metres (280 ft). Chittagong has many lakes that were created under Mughal rule. In 1924, an engineering team of the Assam Bengal Railway established Foy's Lake. Major sediment outflows from the Ganges (or Padma) and Brahmaputra rivers form tidal flats around the city. The Chittagong Division is known for its rich biodiversity. Over 2,000 of Bangladesh's 6,000 flowering plants grow in the region. Its hills and jungles are laden with waterfalls, fast-flowing streams and elephant reserves. St. Martin's Island, within the Chittagong Division, is the only coral island in the country. The fishing port of Cox's Bazar is home to one of the world's longest natural beaches. To the east are the three hill districts of Bandarban, Rangamati, and Khagrachari, home to the highest mountains in Bangladesh. The region has numerous protected areas, including the Teknaf Game Reserve and the Sitakunda Botanical Garden and Eco Park. Patenga beach is the main seafront of Chittagong, located 14 kilometres (8.7 mi) west of the city. Under the Köppen climate classification, Chittagong has a tropical monsoon climate (Am).
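The coordinate pair at the head of the Geography section is given in both degrees-minutes-seconds and decimal form; the conversion between the two is simple arithmetic. A minimal sketch (the function name is ours, for illustration, not from any particular geodesy library):

```python
def dms_to_decimal(degrees: int, minutes: int, seconds: float,
                   positive: bool = True) -> float:
    """Convert a degrees/minutes/seconds coordinate to decimal degrees.
    Southern latitudes and western longitudes take positive=False
    and come out negative."""
    value = degrees + minutes / 60 + seconds / 3600
    return value if positive else -value

# Chittagong's coordinates as given in the text:
lat = dms_to_decimal(22, 20, 6)    # 22°20′06″N → 22.33500
lon = dms_to_decimal(91, 49, 57)   # 91°49′57″E → 91.83250
```

This is why 22°20′06″N appears as 22.33500°N: 20 minutes contribute 20/60 ≈ 0.33333 of a degree, and 6 seconds contribute 6/3600 ≈ 0.00167.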
Chittagong is vulnerable to North Indian Ocean tropical cyclones. The deadliest tropical cyclone to strike Chittagong was the 1991 Bangladesh cyclone, which killed 138,000 people and left as many as 10 million homeless. Government The Chittagong City Corporation (CCC) is responsible for governing municipal areas in the Chittagong Metropolitan Area. It is headed by the mayor of Chittagong. The mayor and ward councillors are elected every five years. The mayor is Shahadat Hossain, as of December 2024. The city corporation's mandate is limited to basic civic services; however, the CCC is credited with keeping Chittagong one of the cleanest and most eco-friendly cities in Bangladesh. Its principal sources of revenue are municipal taxes and conservancy charges. The Chittagong Development Authority is responsible for implementing the city's urban planning. The deputy commissioner and district magistrate are the chiefs of local administration as part of the Government of Bangladesh. Law enforcement is provided by the Chittagong Metropolitan Police and the Rapid Action Battalion-7. The district and sessions judges are the heads of the local judiciary on behalf of the Supreme Court of Bangladesh. The Divisional Special Judge's Court is located in the colonial-era Chittagong Court Building. Chittagong is a strategically important military port on the Bay of Bengal. The Chittagong Naval Area is the principal base of the Bangladesh Navy and the home port of most Bangladeshi warships. The Bangladesh Naval Academy and the navy's elite special force, Special Warfare Diving and Salvage (SWADS), are also based in the city. The Bangladesh Army's 24th Infantry Division is based in Chittagong Cantonment, and the Bangladesh Air Force maintains the BAF Zahurul Haq Air Base in Chittagong. The city is also home to the Bangladesh Military Academy, the premier training institute for the country's armed forces.
In the 1860s, the American consulate-general in the Bengal Presidency included a consular agency in Chittagong. Today, Chittagong hosts an assistant high commission of India and a consulate general of Russia. The city also has honorary consulates of Turkey, Japan, Germany, South Korea, Malaysia, Italy, and the Philippines. Economy A substantial share of Bangladesh's national GDP is attributed to Chittagong. As of the early 2000s, the port city contributed 12% of the nation's economy. Chittagong generates 40% of Bangladesh's industrial output, 80% of its international trade and 50% of its governmental revenue. The Chittagong Stock Exchange has more than 700 listed companies, with a market capitalisation of US$32 billion in June 2015. The city is home to many of the country's oldest and largest corporations. The Port of Chittagong handled US$60 billion in annual trade in 2011, ranking third in South Asia after the Port of Mumbai and the Port of Colombo. The port is part of the Maritime Silk Road that runs from the Chinese coast via the Suez Canal to the Mediterranean and on to the Upper Adriatic region of Trieste, with rail connections to Central and Eastern Europe. The Agrabad area is the main central business district of the city. Major Bangladeshi conglomerates headquartered in Chittagong include M. M. Ispahani Limited, BSRM, A K Khan & Company, PHP Group, James Finlay Bangladesh, the Habib Group, the S. Alam Group of Industries, Seamark Group, KDS Group, Abul Khair Group and the T. K. Group of Industries. Major state-owned firms headquartered there include Pragati Industries, the Jamuna Oil Company, the Bangladesh Shipping Corporation, and the Padma Oil Company. The Chittagong Export Processing Zone was ranked by the UK-based magazine Foreign Direct Investment as one of the leading special economic zones in the world in 2010. Other SEZs include the Karnaphuli Export Processing Zone and the Korean EPZ.
The city's key industrial sectors include petroleum, steel, shipbuilding, chemicals, pharmaceuticals, textiles, jute, leather goods, vegetable oil refineries, glass manufacturing, electronics and motor vehicles. The Chittagong Tea Auction sets the price of Bangladeshi tea. The Eastern Refinery is Bangladesh's largest oil refinery. GlaxoSmithKline has had operations in Chittagong since 1967. Western Marine Shipyard is a leading Bangladeshi shipbuilder and exporter of medium-sized ocean-going vessels. In 2011–12, Chittagong exported approximately US$4.5 billion in ready-made garments. The Karnaphuli Paper Mills were established in 1953. International banks operating in Chittagong include HSBC, Standard Chartered and Citibank NA. Chittagong is often called Bangladesh's commercial capital due to its diversified industrial base and seaport. The port city has ambitions to develop as a global financial centre and regional transshipment hub, given its proximity to North East India, Burma, Nepal, Bhutan and Southwest China. By 2024, the Chittagong-based S Alam Group had emerged as one of Bangladesh's most powerful conglomerates, with interests in energy, commodities, infrastructure, economic zones, healthcare, textiles and fintech. S Alam's projects include a $640 million steel plant, a $2.6 billion power plant and a $3 billion renewable energy plant. It is investing 580 billion BDT in two industrial zones in Chittagong. S Alam also has substantial offshore assets, including a billion dollars' worth of real estate in Singapore. Its portfolio in Singapore includes the city-state's Hilton Garden Inn Serangoon hotel. The S Alam Group enjoys close ties with the ruling Awami League party in Bangladesh. The group has been subjected to intense media scrutiny.
The mosque was built in 1667 by Umed Khan, the son of Shaista Khan, after the Mughal conquest of Chittagong. The mosque is the only surviving part of a hilltop Mughal fort. A surviving remnant of the 17th century Portuguese presence is Darul Adalat in the premises of Government Hazi Mohammad Mohsin College, Chittagong. The Kadam Mubarak Mosque in Jamal Khan was built in 1723 by a faujdar during the reign of the Nawabs of Bengal. During British rule, colonial officials lived in hilltop bungalows, which would feature a spacious balcony or verandah, chimneys, fireplaces and big gardens. The Firingi Bazaar has many colonial houses which belonged to rich local residents. The well-known buildings from the British colonial period include the Battali Railway Station, Central Railway Building, Chittagong Circuit House and Chittagong Court Building. The old Circuit House was originally built in the style of Tudor Revival architecture. The Chittagong Court Building exhibits influence of Neoclassical architecture from the late 19th century. JM Sen Hall was a town hall built in 1920. One of the grand old mansions of Chittagong is the PK Sen Bhaban. The First Karnaphuli Bridge, which was a steel bridge, was built in 1930. The Kalurghat Bridge was completed in 1931. Stripped Classicism and elements of art deco can be seen in Agrabad. M. M. Ispahani Limited relocated its head office to Chittagong from Calcutta after the partition of India; the Ispahani building in Agrabad was influenced by the art deco style. Another building with 1930s classical and art deco elements is the headquarters of the Jamuna Oil Company. The building has a dome and modernist columns inspired by the style of the 1930s and 1940s. Culture An inhabitant of Chittagong is called Chittagonian in English. For centuries, the port city has been a melting pot for people from all over the world. Its historic trade networks have left a lasting impact on its language, culture, and cuisine. 
The Chittagonian language, although identified as a nonstandard dialect of Bengali, is considered a separate language by many linguists. The Chittagonian language has many Arabic, Persian, English and Portuguese loanwords. The popular traditional feast of Mezban features the serving of a hot beef dish with white rice. Another Chittagong dish, kala bhuna, made with traditional spices, mustard oil, and beef using a special cooking style, is also renowned all over Bangladesh. The cultivation of pink pearls is a historic activity in Chittagong. Its Mughal-era name, Islamabad (City of Islam), continues to be used in the old city. The name was given due to the port city's history as a gateway for early Islamic missionaries in Bengal. Notable Islamic architecture in Chittagong can be seen in the historic Bengal Sultanate-era Hammadyar Mosque and the Mughal Fort of Anderkilla. Chittagong is known as the Land of the Twelve Saints due to the prevalence of major Sufi Muslim shrines in the district. Historically, Sufism played an important role in the spread of Islam in the region. Prominent dargahs include the mausoleums of Shah Amanat, Badr Auliya, Miskin Shah and Garibullah Shah, and the shrine of Bayazid Bastami, among many others. The Bastami shrine hosts a pond of black softshell turtles, a critically endangered species of freshwater turtle. During the medieval period, many poets thrived in the region when it was part of the Bengal Sultanate and the Kingdom of Mrauk U. Under the patronage of Sultan Alauddin Husain Shah's governor in Chittagong, Kabindra Parameshvar wrote his Pandabbijay, a Bengali adaptation of the Mahabharata. Daulat Qazi lived in the region during the 17th-century reign of the Kingdom of Mrauk U. Chittagong is home to several important Hindu temples, including the Chandranath Temple on the outskirts of the city, which is dedicated to the Hindu goddess Sita. The city also hosts the country's largest Buddhist monastery and council of monks.
The Roman Catholic Diocese of Chittagong is the oldest Catholic mission in Bengal. Major cultural organizations in the city include the Theatre Institute Chittagong and the Chittagong Performing Arts Academy. The city has a vibrant contemporary art scene. As home to pioneering rock bands in the country such as Souls and LRB, Chittagong is regarded as the "birthplace of Bangladeshi rock music". There is also the Chattogram City Corporation Public Library. Demographics At the 2022 Census, Chittagong had a population of 3,230,507. By gender, the population was 50.89% male and 49.11% female, and the literacy rate in the city was approximately 84.49%. Muslims, numbering approximately 2,841,595, form the overwhelming majority of the city's population, with the rest comprising 329,566 Hindus, 53,181 Buddhists and 4,793 Christians. Chittagong was a melting pot of ethnicities during the Bengal Sultanate and Mughal Bengal periods. Muslim immigration started as early as the seventh century, and significant Muslim settlements occurred during the medieval period. Muslim traders, rulers, and preachers from Persia and Arabia were the early Muslim settlers, and their descendants form the majority of the city's current Muslim population. The city has a relatively wealthy and economically influential Shia Muslim community, including Ismailis and Twelver Shias. The city also has many ethnic minorities, especially members of indigenous groups from the frontier hills of Chittagong Division, including Chakmas, Rakhines and Tripuris, as well as Rohingya refugees. The Bengali-speaking Theravada Buddhists of the area, known as Baruas, are one of the oldest communities in Chittagong and one of the last remnants of Buddhism in Bangladesh. Descendants of Portuguese settlers, often known as Firingis, also live in Chittagong, as do Catholics, who largely live in the old Portuguese enclave of Paterghatta. 
There is also a small Urdu-speaking Bihari community living in the ethnic enclave known as Bihari Colony. Like other major urban centres in South Asia, Chittagong has experienced steady growth in its informal settlements as a result of the city's increasing economic activity and migration from rural areas. According to a poverty reduction publication of the International Monetary Fund, there were 1,814 slums within the city corporation area, inhabited by about 1.8 million slum dwellers, the second highest in the country after the capital, Dhaka. The slum dwellers often face eviction by the local authorities, who accuse them of illegally occupying government land. In the early 1990s, Chittagong had a population of just over 1.5 million, of which an estimated 66,676 were squatters living in 69 areas. Media and communications Various newspapers, including daily, opposition, and business newspapers, are based in Chittagong. Daily newspapers include Dainik Azadi, Peoples View, The Daily Suprobhat Bangladesh, Daily Purbokone, Life, Karnafuli, Jyoti, Rashtrobarta and Azan. There are also several weekly and monthly newspapers, including weeklies such as Chattala, Jyoti, Sultan and Chattagram Darpan, and monthlies such as Sanshodhani, Purobi, Mukulika, and Simanto. The only press club in Chittagong is the Chittagong Press Club. Government-owned Bangladesh Television, with its Chittagong station, and Bangladesh Betar have transmission centres in the city. Privately owned Ekushey Television formerly broadcast on VHF channel 9 in Chittagong during its existence on terrestrial television. Chittagong has been featured in all aspects of Bangladeshi popular culture, including television, movies, journals, music, and books. Nearly all television and radio channels in Bangladesh have coverage in Chittagong. 
Renowned Bollywood film director Ashutosh Gowariker directed Khelein Hum Jee Jaan Sey, a movie based on the 1930s Chittagong Uprising, in which Abhishek Bachchan played the lead role. Utilities The southern zone of the Bangladesh Power Development Board is responsible for supplying electricity to city dwellers. The fire services are provided by the Bangladesh Fire Service & Civil Defence department, under the Ministry of Home Affairs. Total electricity consumption is approximately 1,000 megawatts in the city proper, and roughly 1,300 megawatts across the wider Chittagong urban area. A 1,320-megawatt power plant, expected to enter production next year, is intended to establish Chittagong as an energy production hub of Bangladesh. The water supply and sewage systems are managed by the Chittagong Water Supply and Sewerage Authority (Chittagong WASA). Water is primarily drawn from the Karnaphuli River and then purified in the Mohra Purification Plant. Chittagong has extensive GSM and CDMA coverage, served by all the major mobile operators of the country, including Grameenphone, Banglalink, Citycell, Robi, TeleTalk and Airtel Bangladesh. Landline telephone services are provided by the state-owned Bangladesh Telegraph and Telephone Board (BTTB), as well as some private operators. BTTB also provides broadband Internet services, along with some private Internet service providers (ISPs), including the 4G service providers Banglalion and Qubee. Administrative area Chattogram is divided into 16 thanas: Akbarshah, Bakoliya, Bandar, Bayazid, Chandgaon, Double Mooring, Halishahar, Khulshi, Kotwali, Pahartali, Panchlaish, Patenga, Chawkbazar, Sadarghat, EPZ, and Karnaphuli. The thanas are subdivided into 41 wards and 211 mahallas. The 41 wards are governed by elected representatives under the Chattogram City Corporation. 
Education and research The education system of Chittagong is similar to that of the rest of Bangladesh, with four main forms of schooling. The general education system, conveyed in both Bangla and English versions, follows the curriculum prepared by the National Curriculum and Textbook Board, part of the Ministry of Education. Students are required to take two major board examinations, the Secondary School Certificate (SSC) and the Higher Secondary School Certificate (HSC), before moving on to higher education. The Board of Intermediate and Secondary Education, Chittagong is responsible for administering SSC and HSC examinations within the city. The Madrasah education system is primarily based on Islamic studies, though other subjects are also taught. Students are prepared for the Dakhil and Alim examinations, which are controlled by the Bangladesh Madrasah Education Board and are equivalent to the SSC and HSC examinations of the general education system respectively. There are also several private schools in the city, usually referred to as English medium schools, which follow the General Certificate of Education. The British Council supervises the O Level and A Level examinations, conducted twice a year, through the Cambridge International and Edexcel examination boards. The Technical and Vocational education system is governed by the Directorate of Technical Education (DTE) and follows the curriculum prepared by the Bangladesh Technical Education Board (BTEB). Chittagong College, established in 1869, is the earliest modern institution for higher education in the city. Chittagong Veterinary and Animal Sciences University is the only public university located in Chittagong city. Chittagong Medical College is the only government medical college in Chittagong. The University of Chittagong is located 22 kilometres (14 miles) north and Chittagong University of Engineering and Technology is located 25 kilometres (16 miles) north of the city. 
The University of Chittagong, established in 1966, is one of the largest universities in Bangladesh. Chittagong University of Engineering and Technology, established in 1968, is one of the five public engineering universities in Bangladesh and the only engineering university in the Chittagong Division. The city also hosts several other private universities and medical colleges. The BGC Trust University Bangladesh, Chittagong Independent University (CIU), Asian University for Women, Port City International University, East Delta University, International Islamic University, Premier University, Southern University, University of Information Technology and Sciences and the University of Science & Technology Chittagong are among them. Health The Chittagong Medical College Hospital is the largest state-owned hospital in Chittagong. The Chittagong General Hospital, established in 1901, is the oldest hospital in the city. The Bangladesh Institute of Tropical and Infectious Diseases (BITID) is based in the city. Other government-run medical centers in the city include the Family Welfare Centre, TB Hospital, Infectious Disease Hospital, Diabetic Hospital, Mother and Children Hospital, and the Police Hospital. Among the city's private hospitals are the Bangabandhu Memorial Hospital (BBMH), Chittagong Metropolitan Hospital, Chevron Clinic, Surgiscope Hospital, CSCR, Centre Point Hospital, Park View Hospital, Max Hospital & Diagnosis, Imperial Hospital Ltd., Evercare Hospital Ltd., National Hospital and Mount Hospital Ltd. Transport Transport in Chittagong is similar to that of the capital, Dhaka. Large avenues and roads are present throughout the metropolis. There are various bus systems and taxi services, as well as smaller 'baby' or 'CNG' taxis, which are tricycle-structured motor vehicles. Foreign and local ridesharing companies like Uber and Pathao operate in the city. There are also traditional manual rickshaws, which are very common. 
In the 2010s, the Chittagong Development Authority (CDA) undertook construction of numerous flyovers and road improvements aimed at easing the traffic congestion in Chittagong. The largest of these projects is the Chittagong Outer Ring Road, which runs along the coast for 15.7 kilometres (9.8 mi) from Patenga to Sagorika Industrial Area. The four-lane ring road is meant to ease gridlock in Chittagong city, and the 33-foot (10 m) embankment on which it is built is intended to protect coastal areas from natural disasters. When the project was approved in 2011, it was expected to be finished in 2014. Construction did not start until 2015, and is ongoing as of 2025. The original cost estimate has risen almost fourfold, to Tk 33.24 billion ($275M as of 2025). The authority also began the construction of a 9.3-kilometre (5.8 mi) underwater expressway tunnel under the Karnaphuli river to ensure better connectivity between the northern and southern parts of Chittagong. This tunnel will be the first of its kind in South Asia. The N1 (Dhaka-Chittagong Highway), a major arterial national highway, is the only way to access the city by motor vehicle from most other parts of the country. It is considered a crowded and dangerous highway. This highway is also part of the AH41 route of the Asian Highway Network. It has been upgraded to four lanes. The N106 (Chittagong-Rangamati Highway) is another major national highway; it connects the Chittagong Hill Tracts with Oxygen Square. Chittagong can also be accessed by rail. It has a station on the metre gauge, the eastern section of the Bangladesh Railway, whose headquarters are also located within the city. There are two main railway stations, on Station Road and in Pahartali Thana. Trains to Dhaka, Sylhet, Comilla, and Bhairab are available from Chittagong. The Chittagong Circular Railway was introduced in 2013 to ease traffic congestion and to ensure better public transport service for commuters within the city. 
The railway includes high-speed DEMU trains with a carrying capacity of 300 passengers. These DEMU trains also travel on the Chittagong-Laksham route, which connects the city with Comilla. The Shah Amanat International Airport (IATA: CGP, ICAO: VGEG), located at South Patenga, is Chittagong's only airport. It is the second busiest airport in Bangladesh. The airport is capable of handling 1.5 million passengers and 6,000 tonnes of cargo annually. Known as Chittagong Airfield during World War II, the airport was used as a supply point by the United States Army Air Forces' Tenth Air Force during the Burma Campaign 1944–45. It officially became a Bangladeshi airport in 1972 after Bangladesh's liberation war. International services fly to major cities of the Arabian Peninsula as well as to the Indian city of Kolkata. At present, Middle Eastern airlines such as Air Arabia, Flydubai, Jazeera Airways, Oman Air and SalamAir operate flights from the city to these destinations, along with airlines of Bangladesh. All Bangladeshi airlines operate regular domestic flights to Dhaka. The airport was formerly known as MA Hannan International Airport but was renamed after the famous Sufi saint Shah Amanat by the government on 2 April 2005. Sports Chittagong has produced numerous cricketers, footballers, and athletes who have performed at the national level. Tamim Iqbal, Akram Khan, Minhajul Abedin, Aftab Ahmed, Nafees Iqbal, Nazimuddin, Faisal Hossain, Tareq Aziz, Mominul Haque, Nayeem Hasan, Mamunul Islam, Ashish Bhadra and Shahidul Alam Sohel are some of the most prominent figures among them. Cricket is the most popular sport in Chittagong, while football, tennis and kabaddi are also popular. Several stadiums are located in Chittagong, with the main one being the multipurpose MA Aziz Stadium, which has a seating capacity of 20,000 and hosts football matches in addition to cricket. 
MA Aziz Stadium was the stadium where Bangladesh achieved its first-ever Test cricket victory, against Zimbabwe in 2005. The stadium now focuses only on football, and is currently the main football venue of the city. Zohur Ahmed Chowdhury Stadium is currently the main cricket venue of the city; it was awarded Test status in 2006 and hosts both domestic and international cricket matches. The city hosted two group matches of the 2011 ICC Cricket World Cup, both taking place in Zohur Ahmed Chowdhury Stadium. It also co-hosted the 2014 ICC World Twenty20 along with Dhaka and Sylhet; Zohur Ahmed Chowdhury Stadium hosted 15 group stage matches. Other stadiums in Chittagong include the Women's Complex Ground. Major sporting clubs such as Mohammedan Sporting Club and Abahani Chittagong are also located in the city. Chittagong is also home to the Bangladesh Premier League franchise, the Chattogram Challengers. Notable residents Sister cities See also Notes External links
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/The_Exodus] | [TOKENS: 7353] |
Contents The Exodus The Exodus (Hebrew: יציאת מצרים, romanized: Yəṣīʾat Mīṣrayīm, lit. 'Departure from Egypt'[a]) is the founding myth[b] of the Israelites whose narrative is spread over four of the five books of the Pentateuch (specifically, Exodus, Leviticus, Numbers, and Deuteronomy). The narrative of the Exodus describes a history of Egyptian bondage of the Israelites followed by their exodus from Egypt through a passage in the Red Sea, in pursuit of the Promised Land under the leadership of Moses. The story of the Exodus is central in Judaism. It is recounted daily in Jewish prayers and celebrated in festivals such as Passover. Early Christians saw the Exodus as a typological prefiguration of resurrection and salvation by Jesus. The Exodus is also recounted in the Quran as part of the extensive referencing of the life of Moses, a major prophet in Islam. The narrative has also resonated with various groups in more recent centuries, such as among African Americans striving for freedom and civil rights, and in liberation theology. The consensus of modern scholars on the historicity of the Exodus is that the Pentateuch does not give an accurate account of the origins of the Israelites, who appear instead to have formed as an entity in the central highlands of Canaan in the late second millennium BCE (around the time of the Late Bronze Age collapse) from the indigenous Canaanite culture. Most modern scholars believe that some elements in the story of the Exodus might have some historical basis, but that any such basis has little resemblance to the story told in the Pentateuch. While the majority of modern scholars date the composition of the Pentateuch to the period of the Achaemenid Empire (5th century BCE), some of the elements of this narrative are older, since allusions to the story are made by 8th-century BCE prophets such as Amos and Hosea. 
In the Bible The Exodus tells a story of the enslavement of the Israelites, the Plagues of Egypt, the departure of the Israelites from Egypt, the revelations at Mount Sinai, and the Israelite wanderings in the wilderness up to the borders of Canaan. Its message is that the Israelites were delivered from slavery by Yahweh their god, and therefore belong to him by covenant. The story of the Exodus is told in the first half of the book of Exodus, with the remainder recounting the first year in the wilderness, followed by a narrative of 39 more years in the books of Leviticus, Numbers, and Deuteronomy, the last four of the first five books of the Bible (also called the Torah or Pentateuch). In the first book of the Pentateuch, the Book of Genesis, the Israelites had come to live in Egypt in the Land of Goshen during a famine, under the protection of an Israelite, Joseph, who had become a high official in the court of the Egyptian pharaoh. Exodus begins with the death of Joseph and the ascension of a new pharaoh "who did not know Joseph" (Exodus 1:8). The pharaoh becomes concerned by the number and strength of the Israelites in Egypt and enslaves them, commanding them to build two "supply" or "store cities" called Pithom and Rameses (Exodus 1:11).[c] The pharaoh also orders the slaughter at birth of all male Hebrew children. One Hebrew child, however, is saved by being set adrift in a floating basket on the Nile. He is found and adopted by Pharaoh's daughter, who names him Moses. Grown to a young man, Moses kills an Egyptian he sees beating a Hebrew slave, and takes refuge in the land of Midian, where he marries Tzipporah, a daughter of the Midianite priest Jethro. The old pharaoh dies and a new one ascends the throne. According to Ezekiel 20:8-9, the enslaved Israelites also practised "abominations" and worshiped the gods of Egypt. This provoked Yahweh to destroy them, but he relented to avoid his name being "profaned". 
Meanwhile, Moses goes to Mount Horeb, where Yahweh appears in a burning bush and commands him to go to Egypt to free the Hebrew slaves and bring them to the Promised Land in Canaan. Yahweh also speaks to Moses's brother Aaron, and the two assemble the Israelites and perform miraculous signs to rouse their belief in Yahweh's promise. Moses and Aaron then go to Pharaoh and ask him to let the Israelites go into the desert for a religious festival, but he refuses and increases their workload, commanding them to make bricks without straw. Moses and Aaron return to Pharaoh and ask him to free the Israelites and let them depart. Pharaoh demands Moses to perform a miracle, and Aaron throws down Moses' staff, which turns into a tannin (sea monster or snake) (Exodus 7:8-13); however, Pharaoh's magicians[d] are also able to do this, though Moses' serpent devours the others. Pharaoh refuses to let the Israelites go. After this, Yahweh inflicts a series of Plagues on the Egyptians each time Moses repeats his demand and Pharaoh refuses to release the Israelites. Pharaoh's magicians are able to match the first plagues, in which Yahweh turns the Nile to blood and produces a plague of frogs, but they cannot match any plagues starting with the third, the plague of gnats. After each plague, Pharaoh asks the Israelites to worship Yahweh to remove the plague, then still refuses to free them. Moses is commanded to fix the first month of Aviv at the head of the Hebrew calendar. He instructs the Israelites to take a lamb on the 10th day, and on the 14th day to slaughter it and daub its blood on their doorposts and lintels, and to observe the Passover meal that night, the night of the full moon. In the final plague, Yahweh sends an angel to each house to kill the firstborn son and firstborn cattle, but the houses of the Israelites are spared by the blood on their doorposts. Yahweh commands the Israelites to commemorate this event in "a perpetual ordinance" (Exodus 12:14). 
Pharaoh finally casts the Israelites out of Egypt after his firstborn son is killed. Yahweh leads the Israelites in the form of a pillar of cloud in the day and a pillar of fire at night. However, once the Israelites have left, Yahweh "hardens" Pharaoh's heart to change his mind and pursue the Israelites to the shore of the Red Sea. Moses uses his staff to part the Red Sea, and the Israelites cross on dry ground, but the sea closes on the pursuing Egyptians, drowning them all. The Israelites begin to complain, and Yahweh miraculously provides them with water and food, eventually raining manna down for them to eat. The Amalekites attack at Rephidim, but are defeated. Jethro, the father-in-law of Moses, convinces him to appoint judges for the tribes of Israel. The Israelites reach the Sinai Desert and Yahweh calls Moses to Mount Sinai, where Yahweh reveals himself to his people and establishes the Ten Commandments and Mosaic covenant: the Israelites are to keep his torah (law, instruction), and Yahweh promises them the land of Canaan. Yahweh establishes the Aaronic priesthood and detailed rules for ritual worship, among other laws. However, in Moses's absence the Israelites sin against Yahweh by creating the idol of a golden calf. As punishment Yahweh has the Levites kill three thousand of the Israelites (Exodus 32:28), and Yahweh sends a plague on them. The Israelites now accept the covenant, which is reestablished; they build a tabernacle for Yahweh, and receive their laws. Yahweh commands Moses to take a census of the Israelites and establishes the duties of the Levites. Then the Israelites depart from Mount Sinai. Yahweh commands Moses to send twelve spies ahead to Canaan to scout the land. The spies discover that the Canaanites are formidable, and to dissuade the Israelites from invading, the spies falsely report that Canaan is full of giants (Numbers 13:30-33). 
The Israelites refuse to go to Canaan, and Yahweh declares that the generation that left Egypt will have to pass away before the Israelites can enter the promised land. The Israelites will have to remain in the wilderness for forty years, and Yahweh kills the spies through a plague except for the righteous Joshua and Caleb, who will be allowed to enter the promised land (Numbers 13:36-38). A group of Israelites led by Korah, son of Izhar, rebels against Moses, but Yahweh opens the earth and sends them living to Sheol (Numbers 16:1-33). The Israelites come to the oasis of Kadesh Barnea, where Miriam dies and the Israelites remain for nineteen years. To provide water, Yahweh commands Moses to get water from a rock by speaking to it, but Moses instead strikes the rock with his staff, for which Yahweh forbids him from entering the Promised Land. Moses sends a messenger to the king of Edom requesting passage through his land to Canaan, but the king refuses. The Israelites then go to Mount Hor, where Aaron dies. The Israelites try to go around Edom, but the Israelites complain about lack of bread and water, so Yahweh sends a plague of poisonous snakes to afflict them (Numbers 21:4-7). After Moses prays for deliverance, Yahweh has him create a brazen serpent, and the Israelites who look at it are cured (Numbers 21:8-9). The Israelites are soon in conflict with various other kingdoms, and king Balak of Moab asks the seer Balaam to curse the Israelites, but Balaam blesses them instead. Some Israelites begin having sexual relations with Moabite women and worshipping Moabite gods, so Yahweh orders Moses to impale the idolators and sends another plague. The full extent of Yahweh's wrath is averted when Phinehas impales an Israelite and a Midianite woman having intercourse (Numbers 25:7-9). Yahweh commands the Israelites to destroy the Midianites, and Moses and Phinehas take another census. 
Then they conquer the lands of Og and Sihon in Transjordan, settling the Gadites, Reubenites, and half the Tribe of Manasseh there. Moses then addresses the Israelites for a final time on the banks of the Jordan River, reviewing their travels and giving them further laws. Yahweh tells Moses to summon Joshua to lead the conquest of Canaan. Yahweh tells Moses to ascend Mount Nebo, from where he sees the Promised Land, and dies. The climax of the Exodus is the covenant (binding legal agreement) between God and the Israelites mediated by Moses at Sinai: Yahweh will protect the Israelites as his chosen people for all time, and the Israelites will keep Yahweh's laws and worship only him. The covenant is described in stages: at Exodus 24:3–8 the Israelites agree to abide by the "book of the covenant" that Moses has just read to them; shortly afterwards God writes the "words of the covenant" – the Ten Commandments – on stone tablets; and finally, as the people gather in Moab to cross into the promised land of Canaan, Moses reveals Yahweh's new covenant "beside the covenant he made with them at Horeb" (Deuteronomy 29:1). The laws are set out in a number of codes: Origins and historicity There are two main positions on the historicity of the Exodus in modern scholarship. The majority position is that the biblical Exodus narrative has some historical basis, although there is little of historical fact in it.[e] The other position, often associated with the school of Biblical minimalism, is that the biblical exodus traditions are the invention of the exilic and post-exilic Jewish community, with little to no historical basis. The biblical Exodus narrative is best understood as a founding myth of the Jewish people, providing an ideological foundation for their culture and institutions, not an accurate depiction of the history of the Israelites. 
The view that the biblical narrative is essentially correct unless it can explicitly be proved wrong (Biblical maximalism) is today held by "few, if any [...] in mainstream scholarship, only on the more fundamentalist fringes." There is no direct evidence for any of the people or events of Exodus in non-biblical ancient texts or in archaeological remains, and this has led most scholars to omit the Exodus events from comprehensive histories of Israel. Most mainstream scholars do not accept the biblical Exodus account as history for a number of reasons. Most agree that the Exodus stories were written centuries after the apparent setting of the stories. Scholars argue that the Book of Exodus itself attempts to ground the event firmly in history, reconstructing a date for the exodus as the 2666th year after creation (Exodus 12:40-41), the construction of the tabernacle to year 2667 (Exodus 40:1-2, 17), stating that the Israelites dwelled in Egypt for 430 years (Exodus 12:40-41), and specifying place names such as Goshen (Gen. 46:28), Pithom, and Ramesses (Exod. 1:11), as well as the count of 600,000 Israelite men (Exodus 12:37). The Book of Numbers further states that the number of Israelite males aged 20 years and older in the desert during the wandering was 603,550, which works out to a total population of 2.5-3 million including women and children—far more than could be supported by the Sinai Desert. The geography is vague with regions such as Goshen unidentified,[f] and there are internal problems with dating in the Pentateuch. No modern attempt to identify a historical Egyptian as a prototype for Moses has found wide acceptance, and no period in Egyptian history matches the biblical accounts of the Exodus. Some elements of the story are miraculous and defy rational explanation, such as the Plagues of Egypt and the Crossing of the Red Sea. 
The Bible does not mention the names of any of the pharaohs involved, further obscuring comparison of archaeologically recovered Egyptian history with the biblical narrative. While ancient Egyptian texts from the New Kingdom mention "Asiatics" living in Egypt as slaves and workers, these people cannot be securely connected to the Israelites, and no contemporary Egyptian text mentions a large-scale exodus of slaves like that described in the Bible. The earliest surviving historical mention of the Israelites, the Egyptian Merneptah Stele (c. 1207 BCE), appears to place them in or around Canaan and gives no indication of any exodus. Archaeologist Israel Finkelstein argues from his analysis of the itinerary lists in the books of Exodus, Numbers and Deuteronomy that the biblical account represents a long-term cultural memory, spanning the 16th to 10th centuries BCE, rather than a specific event: "The beginning is vague and now untraceable." Instead, modern archaeology suggests continuity between Canaanite and Israelite settlement, indicating a primarily Canaanite origin for Israel, with no suggestion that a group of foreigners from Egypt comprised early Israel. Despite the absence of any archaeological evidence, according to Avraham Faust, "most scholars agree that the narrative has a historical core" made up of a probable reconstruction of an Exodus based on similar collective memories, with biblical scholar Kenton Sparks referring to it as "mythologized history". Faust specifies that the result of his assessment is unlikely if it is solely based on either Egyptian presence in Late Bronze Age Canaan or the foreign Hyksos rulers of Egypt, and rules out Midian human activity "which cannot help in dating the Exodus" in identification of the proto-Israelites. 
Agreeing in treating the expulsion of the Hyksos "not as related to the flight of a group of slaves[,]" Manfred Bietak points out that the portrayal of the Hyksos as a ruling elite with a background in trade and seafaring conflicts with the biblical portrayal of the Israelites as oppressed in Egypt. Most scholars posit that a small group of Egyptian origin may have joined the early Israelites, and contributed their own Egyptian Exodus story to all of Israel.[g] William G. Dever cautiously identifies this group with the Tribe of Joseph, while Richard Elliott Friedman identifies it with the Tribe of Levi. Most scholars who accept Faust's definition of a historical core date possible Exodus group activity to the thirteenth century BCE at the time of Ramses II (19th dynasty), with some instead dating it to the twelfth century BCE under Ramses III (20th dynasty). Evidence in favor of historical traditions forming a background to the Exodus myth includes the documented movements of small groups of Ancient Semitic-speaking peoples into and out of Egypt during the 18th and 19th dynasties, some elements of Egyptian folklore and culture mentioned in the Exodus narrative, and the names Moses, Aaron and Phinehas, which seem to have an Egyptian origin. Scholarly estimates for how many could have been involved in such an exodus range from a few hundred to a few thousand people. Among those who have attempted to date the Exodus to a specific time, a common proposal is 1130 BCE, while a less common one is 1525 BCE, aligning with the expulsion of the Hyksos, the Ahmose I Tempest Stele and the Thera eruption, and the WS I bowl noted by William G. Dever as presented by Manfred Bietak. In 2023, Faust summarized the academic consensus on the number of people involved in the Exodus: "most scholars agree that it was in the range of a few thousands, or even perhaps only hundreds." Joel S. 
Baden noted the presence of Semitic-speaking slaves in Egypt who sometimes escaped in small numbers as potential inspirations for the Exodus. It is also possible that oppressive Egyptian rule of Canaan during the late second millennium BCE, during the 19th and especially the 20th dynasty, may have disposed some native Canaanites to adopt into their own mythology the exodus story of a small group of Egyptian refugees. Nadav Na'aman argues that oppressive Egyptian rule of Canaan may have inspired the Exodus narrative, forming a "collective memory" of Egyptian oppression that was transferred from Canaan to Egypt itself in the popular consciousness. The 17th dynasty expulsion of the Hyksos, a group of Semitic invaders, is also frequently discussed as a potential historical parallel or origin for the story. Many other scholars reject this view, and instead see the biblical exodus traditions as the invention of the exilic and post-exilic Jewish community, with little to no historical basis. Lester Grabbe, for instance, argues that "[t]here is no compelling reason that the exodus has to be rooted in history", and that the details of the story more closely fit the seventh through the fifth centuries BCE than the traditional dating to the second millennium BCE. Some scholars also hold that the Israelites originated in Canaan and from the Canaanites, although others disagree. Philip R. Davies suggests that the story may have been inspired by the return to Israel of Israelites and Judaeans who were placed in Egypt as garrison troops by the Assyrians in the fifth and sixth centuries BCE, during the exile. Development and final composition The earliest traces of the traditions behind the exodus appear in the northern prophets Amos and Hosea, both active in the 8th century BCE in northern Israel, but their southern contemporary Isaiah shows no knowledge of an exodus. 
Micah, who was active in the south around the same time, references the exodus once (Micah 6:4–5), but it is debated whether the passage is an addition by a later editor.[h] Jeremiah, active in the 7th century, mentions both Moses and the Exodus. The story may, therefore, have originated a few centuries earlier, perhaps in the 10th or 9th century BCE, and there are signs that it took different forms in Israel, in the Transjordan region, and in the southern Kingdom of Judah before being unified in the Persian era. The Exodus narrative was most likely further altered and expanded under the influence of the return from the Babylonian captivity in the sixth century BCE. Evidence from the Bible suggests that the Exodus from Egypt formed a "foundational mythology" or "state ideology" for the Northern Kingdom of Israel. The northern psalms 80 and 81 state that God "brought a vine out of Egypt" (Psalm 80:8) and record ritual observances of Israel's deliverance from Egypt as well as a version of part of the Ten Commandments (Psalm 81:10-11). The Books of Kings record the dedication of two golden calves in Bethel and Dan by the Israelite king Jeroboam I, who uses the words "Here are your gods, O Israel, which brought you up out of the land of Egypt" (1 Kings 12:28). Scholars relate Jeroboam's calves to the golden calf made by Aaron of Exodus 32. Both include a nearly identical dedication formula ("These are your gods, O Israel, who brought you up out of the land of Egypt", Exodus 32:8). This episode in Exodus is "widely regarded as a tendentious narrative against the Bethel calves". Egyptologist Jan Assmann suggests that this event, which would have taken place c. 931 BCE, may be partially historical due to its association with the historical pharaoh Sheshonq I (the biblical Shishak). Stephen Russell dates this tradition to "the eighth century BCE or earlier", and argues that it preserves a genuine Exodus tradition from the Northern Kingdom, but in a Judahite recension.
Russell and Frank Moore Cross argue that the Israelites of the Northern Kingdom may have believed that the calves at Bethel and Dan were made by Aaron. Russell suggests that the connection to Jeroboam may have been later, possibly coming from a Judahite redactor. Pauline Viviano, however, concludes that neither the references to Jeroboam's calves in Hosea (Hosea 8:6 and 10:5) nor the frequent prohibitions of idol worship in the seventh-century southern prophet Jeremiah show any knowledge of a tradition of a golden calf having been created in Sinai. Some of the earliest evidence for Judahite traditions of the exodus is found in Psalm 78, which portrays the Exodus as beginning a history culminating in the building of the temple at Jerusalem. Pamela Barmash argues that the psalm is a polemic against the Northern Kingdom; as it fails to mention that kingdom's destruction in 722 BCE, she concludes that it must have been written before then. The psalm's version of the Exodus contains some important differences from what is found in the Pentateuch: there is no mention of Moses, and the manna is described as "food of the mighty" rather than as bread in the wilderness. Nadav Na'aman argues for other signs that the Exodus was a tradition in Judah before the destruction of the northern kingdom, including the Song of the Sea and Psalm 114, as well as the great political importance that the narrative came to assume there.[i] A Judahite cultic object associated with the exodus was the brazen serpent or nehushtan: according to 2 Kings 18:4, the brazen serpent had been made by Moses and was worshiped in the temple in Jerusalem until the time of king Hezekiah of Judah, who destroyed it as part of a religious reform, possibly c. 727 BCE.[j] In the Pentateuch, Moses creates the brazen serpent in Numbers 21:4-9.
Meindert Dijkstra writes that while the historicity of the Mosaic origin of the Nehushtan is unlikely, its association with Moses appears genuine rather than the work of a later redactor. Mark Walter Bartusch notes that the nehushtan is not mentioned at any prior point in Kings, and suggests that the brazen serpent was brought to Jerusalem from the Northern Kingdom after its destruction in 722 BCE. The revelation of God on Sinai appears to have originally been a tradition unrelated to the Exodus. Joel S. Baden notes that "[t]he seams [between the Exodus and Wilderness traditions] still show: in the narrative of Israel's rescue from Egypt there is little hint that they will be brought anywhere other than Canaan – yet they find themselves heading first, unexpectedly, and in no obvious geographical order, to an obscure mountain." In addition, there is widespread agreement that the revelation of the law in Deuteronomy was originally separate from the Exodus: the original version of Deuteronomy is generally dated to the 7th century BCE. The contents of the books of Leviticus and Numbers are late additions to the narrative by priestly sources. Scholars broadly agree that the publication of the Torah (or of a proto-Pentateuch) took place in the mid-Persian period (the 5th century BCE), echoing a traditional Jewish view which gives Ezra, the leader of the Jewish community on its return from Babylon, a pivotal role in its promulgation. Many theories have been advanced to explain the composition of the first five books of the Bible, but two have been especially influential. The first of these, Persian Imperial authorisation, advanced by Peter Frei in 1985, is that the Persian authorities required the Jews of Jerusalem to present a single body of law as the price of local autonomy. Frei's theory was demolished at an interdisciplinary symposium held in 2000, but the relationship between the Persian authorities and Jerusalem remains a crucial question. 
The second theory, associated with Joel P. Weinberg and called the "Citizen-Temple Community", is that the Exodus story was composed to serve the needs of a post-exilic Jewish community organized around the Temple, which acted in effect as a bank for those who belonged to it. The books containing the Exodus story served as an "identity card" defining who belonged to this community (i.e., to Israel), thus reinforcing Israel's unity through its new institutions.

Hellenistic Egyptian parallel narratives

Writers in Greek and Latin during the Ptolemaic Kingdom (late 4th century BCE–late 1st century BCE) record several Egyptian tales of the expulsion of a group of foreigners connected to the Exodus. These tales often include elements of the Second Intermediate Period ("Hyksos period") and most are extremely anti-Jewish. The earliest non-biblical account is that of Hecataeus of Abdera (c. 320 BCE), as preserved by the first-century CE Jewish historian Josephus in Against Apion and in a variant version by the first-century BCE Greek historian Diodorus. Hecataeus tells how the Egyptians blamed a plague on foreigners and expelled them from the country, whereupon Moses, their leader, took them to Canaan. In this version, Moses is portrayed extremely positively. Manetho, also preserved in Josephus's Against Apion, tells how 80,000 lepers and other "impure people", led by a priest named Osarseph, join forces with the former Hyksos, now living in Jerusalem, to take over Egypt. They wreak havoc until the Pharaoh and his son chase them out to the borders of Syria, where Osarseph gives the lepers a law code and changes his name to Moses. The identification of Osarseph with Moses in Manetho's account may be an interpolation or may come from Manetho himself.
Other versions of the story are recorded by the first-century BCE Egyptian grammarian Lysimachus of Alexandria, who set the story in the time of Pharaoh Bakenranef (Bocchoris), the first-century CE Egyptian historian Chaeremon of Alexandria, and the first-century BCE Gallo-Roman historian Gnaeus Pompeius Trogus. The first-century CE Roman historian Tacitus included a version of the story that claims that the Hebrews worshipped a donkey as their god to ridicule Egyptian religion, whereas the Roman biographer Plutarch claimed that the Egyptian god Seth was expelled from Egypt and had two sons named Juda and Hierosolyma. The stories may represent a polemical Egyptian response to the Exodus narrative. Egyptologist Jan Assmann proposed that the story comes from oral sources that "must [...] predate the first possible acquaintance of an Egyptian writer with the Hebrew Bible." Assmann suggested that the story has no single origin but rather combines numerous historical experiences, notably the Amarna and Hyksos periods, into a folk memory. There is general agreement that the stories originally had nothing to do with the Jews. Erich S. Gruen suggested that it may have been the Jews themselves who inserted themselves into Manetho's narrative, in which various actions that are negative from the Egyptian point of view, such as desecrating temples, are interpreted positively.

Religious and cultural significance

Commemoration of the Exodus is central to Judaism and Jewish culture. In the Bible, the Exodus is frequently mentioned as the event that created the Israelite people and forged their bond with God, being described as such by the prophets Hosea, Jeremiah, and Ezekiel. The Exodus is invoked daily in Jewish prayers and celebrated each year during the Jewish holidays of Passover, Shavuot, and Sukkot.
The fringes worn at the corners of traditional Jewish prayer shawls are described as a physical reminder of the obligation to observe the laws given at the climax of Exodus: "Look at it and recall all the commandments of the Lord" (Numbers). The festivals associated with the Exodus began as agricultural and seasonal feasts but became completely subsumed into the Exodus narrative of Israel's deliverance from oppression at the hands of God. For Jews, the Passover celebrates the freedom of the Israelites from captivity in Egypt, the settling of Canaan by the Israelites, and the "passing over" of the angel of death during the death of the first-born. Passover involves a ritual meal called a Seder during which parts of the exodus narrative are retold. In the Haggadah of the Seder it is written that every generation is obliged to remember the Exodus and to identify itself with it. Thus the following words from the Pesaḥim (10:5) are recited: "In every generation a person is duty-bound to regard himself as if he personally has gone forth from Egypt."[k] Because the Israelites fled Egypt in haste without time for bread to rise, the unleavened bread matzoh is eaten on Passover, and homes must be cleansed of any items containing leavening agents, known as Chametz. Shavuot celebrates the granting of the Law to Moses on Mount Sinai; Jews are called to rededicate themselves to the covenant on this day. Some denominations follow Shavuot with The Three Weeks, during which the "two most heinous sins committed by the Jews in their relationship to God" are mourned: the Golden Calf and the doubting of God's promise by the Twelve Spies. A third Jewish festival, Sukkot, the Festival of Booths, is associated with the Israelites living in booths after they left their previous homes in Egypt. It celebrates how God provided for the Israelites while they wandered in the desert without food or shelter.
It is celebrated by building a sukkah, a temporary shelter also called a booth or tabernacle, in which the rituals of Sukkot are performed, recalling the impermanence of the Israelites' homes during the desert wanderings. The Christian ritual of the eucharist and the holiday of Easter draw directly on the imagery of the Passover and the Exodus. In the New Testament, Jesus is frequently associated with motifs of the Exodus. The Gospel of Mark has been suggested to be a midrash on the Exodus, though the scholar Larry J. Perkins thinks this unlikely. Mark suggests that the outpouring of Jesus' blood creates a new covenant (Mark 14:24) in the same way that Moses' sacrifice of bulls had created a covenant (Exodus 24:5). In the Gospel of Matthew, Jesus reverses the direction of the Exodus by escaping from the Massacre of the Innocents committed by Herod the Great before himself returning from Egypt (Matt 2:13-15). Other parallels in Matthew include Jesus' baptism in water (Matt 3:13-17) and his testing in the desert; unlike the Israelites, he is able to resist temptation (Matt 4:1-3). The Gospel of John repeatedly calls Jesus the Passover lamb (John 1:29, 13:1, 19:36), something also found in 1 Peter (1 Pet 1:18-20) and 1 Corinthians (1 Cor 5:7-8). Biblical scholar Michael Graves calls Paul's discussion of the exodus in 1 Corinthians 5:7-8 and his comparison of the early church in Corinth to the Israelites in the desert "[t]he two most significant NT passages touching on the exodus". John also refers to Jesus as manna (John 6:31-5), water flowing from a rock in the desert (John 7:37-9), and as a pillar of fire (John 8:12). Early Christians frequently interpreted actions taken in the Exodus, and sometimes the Exodus as a whole, typologically to prefigure Jesus or actions of Jesus. In Romans 9:17, Paul interprets the hardened heart of Pharaoh during the Plagues of Egypt as referring to the hardened hearts of the Jews who rejected Christ.
Early Christian authors such as Justin Martyr, Irenaeus, and Augustine all emphasized the supersession of the Old Covenant of Moses by the New Covenant of Christ, which was open to all people rather than limited to the Jews. The story of the Exodus is also recounted in the Quran, in which Moses is one of the most prominent prophets and messengers. He is mentioned 136 times, more than any other individual in the Quran, and his life is narrated and recounted more than that of any other prophet. A number of historical events and situations have been compared to the Exodus. Many early American settlers interpreted their flight from Europe to a new life in America as a new exodus. American "founding fathers" Thomas Jefferson and Benjamin Franklin recommended that the Great Seal of the United States depict Moses leading the Israelites across the Red Sea. African Americans suffering under slavery and racial oppression interpreted their situation in terms of the Exodus, making it a catalyst for social change. South American liberation theology also takes much inspiration from the Exodus.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Enterprise_social_networking]
Enterprise social networking

Enterprise social networking focuses on the use of online social networks or social relations among people who share business interests and/or activities. Enterprise social networking is often a facility of enterprise social software (regarded as a primary component of Enterprise 2.0), which is essentially social software used in "enterprise" (business/commercial) contexts. It encompasses modifications to corporate intranets (referred to as social intranets) and other classic software platforms used by large companies to organize their communication, collaboration and other aspects of their intranets. Enterprise social networking is also generally thought to include the use of a standard external social networking service to generate visibility for an enterprise.

History

Social networking sites started to form in the 1990s; an example of these websites is Theglobe.com, which began in 1995. As other websites such as GeoCities and Tripod.com started to form online communities, they encouraged their users to interact with each other via chat rooms and other tools. They also provided easy-to-use publishing tools along with free web space. Classmates.com's approach was to link people together via their email addresses; the website worked like a search engine for finding friends. Businesses eventually realized that social networking websites could provide a fast and efficient way of marketing, a place to reach their customers, and an environment in which to grow. In 2005, as social networking websites were becoming increasingly popular, Myspace had more page views than Google. Myspace was followed by Facebook, which started in February 2004. When Facebook began, users were limited to college students in the United States, who had to use a college email address with a .edu extension to join the network.
In September 2005 some high schools were allowed to join the network, but their students needed an invitation to join. On September 26, 2006, Facebook announced that anyone around the world older than 13 years old with a valid email address would be able to join Facebook's online community. In October 2007, Microsoft purchased a 1.6% share of Facebook, which gave it the right to place international ads on Facebook. In July 2010 it was reported that Facebook had more than 500 million active users, meaning that roughly one out of every fourteen people in the world was a Facebook user. Facebook's growth fueled a boom in the social networking space. Facebook became a huge corporation, with 1,400 employees and an estimated revenue of US$800 million in 2009. In 2010, it was reported that there were more than 200 social networking websites on the web.

Business impacts

Companies such as Jive Software and IBM have been doing research to see how social networking can impact enterprise networks. Various companies have embraced social networking and are creating their own internal social networking sites. IBM, for example, created the Beehive research project, based on its Lotus Connections product. Another example is Atos, which is deploying its in-house blueKiwi product across all 76,000 employees to achieve its Zero Email ambition by the end of 2013.[citation needed] Many companies are encouraging employees to use their social networks so they can connect with other employees, help people socialize when they take a break, or even help contribute to other work-related issues. Some companies are even joining typical social networking sites like Facebook or MySpace to gain more clients, communicate with their clients, or target individuals based on their likes. These companies want to gain the trust of their clients.

Applications

The year 2009 saw 92% of Inc.
500 companies using at least one social media channel, a dramatic increase over the 77% reported in 2008. Medpedia is one of about 70 medical wikis that allow physicians and researchers to share information. While the information is free and publicly available, contributions are limited to those provided by medical professionals. Intellipedia is a set of three wikis used by the US intelligence community to share information of varying classifications. Deloitte has been a pioneer in its use of corporate social networking applications for consulting. Boral Limited was one of the first major building and construction materials companies in Australia to adopt enterprise social networking, using the Yammer platform; engagement levels of over 25% were achieved. Other notable users include Lockheed Martin and Pfizer, as documented by the Queensland University of Technology (QUT). The adoption of social networking in sales organizations has recently been given a new name: S2.0, or Sales 2.0. Implementing a private sales social network provides a means to quickly disseminate company sales knowledge. Some companies are using social media tools to outsource their work to their customers. In addition to using social networking to market the enterprise, companies are involving customers in the design process. The Chicago company Threadless invites its customers to submit their T-shirt designs to the Threadless web site for review by other customers. Companies are also involving their customers in technical support using enterprise social networking; eBay was an early adopter of the practice of allowing users to assist each other online.

Issues

If training in the use of enterprise social networking tools is not provided to employees who have no experience using them, the tools are unlikely to be widely adopted.
The use of enterprise social networking must be championed at the highest levels of the enterprise to provide the resources needed and promote adoption throughout the organization.[citation needed] Gartner has said that only 10% of organisations see value in a social collaboration product, primarily due to the lack of change management provided during rollout. The lack of adoption is a complex issue, and many theories have been put forward: The sharing of information across the enterprise through social networking creates a transparency that may not be welcomed by all sectors of the organization. There is often an assumption that social networking will not work well in a particular industry, or that its use may be perceived as unprofessional. In addition, the ability to justify the use of enterprise social networking based on return on investment is not always readily apparent. Members of traditional or conservative organisations perceive social networking as a time-wasting tool. However, once employees see other employees using the tools, they often begin to see the value and start engaging and contributing content.[citation needed] Privacy issues associated with social networking sites can be seen from many perspectives. Some people argue that reduced privacy is just a part of globalization and the growth of technology, while others believe that privacy is a right of any citizen and will never change their views and perspectives about it. With the increase in popularity of social networking, many users have given up their personal privacy in order to join these networks.
Social media privacy issues didn't begin with social networking sites; they have historically been persistent issues with many other types of social media, such as text messaging, instant messaging and computer-supported cooperative work.[citation needed] Oscar Gandy claimed in 1993 that in the age of digital media, people probably do not have any privacy. He stated that "the panoptic sort is an antidemocratic system of control that cannot be transformed because it can serve no purpose other than that for which it was designed — the rationalization and control of human existence." Gandy called for the creation of an agency that would ensure the survival of privacy. From a policy perspective, according to Schement and Curtis (1994), privacy is seen "as security against intrusion by government". According to Garfinkel (2000), "Privacy isn't just about hiding things. It's about self-possession, autonomy, and integrity." Because social tools make many things that were normally private much more public, including all types of corporate data, many organizations would rather wait for best practices to emerge, or see what their peers are doing, before delving very far into social networking. Privacy can become a huge issue at the enterprise level, when customer and employee data are at stake. Security concerns must be addressed before embarking upon the creation of an enterprise social network. One of the significant areas of concern with the use of social networking internally within organisations is the impact and effect of behavioral issues. Because the interactions within a social network are only loosely coupled to business processes and structured information systems, the effect of individual personalities and human psychology becomes more pronounced within social networks. Such emerging concerns cover issues such as attention management, death by trivia, dominant personalities, behavioral adoption and influence strategies.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/United_States_courts_of_appeals]
United States courts of appeals

The United States courts of appeals are the intermediate appellate courts of the United States federal judiciary. They hear appeals of cases from the United States district courts and some U.S. administrative agencies, and their decisions can be appealed to the Supreme Court of the United States. The courts of appeals are divided into 13 "Circuits". Eleven of the circuits are numbered "First" through "Eleventh" and cover geographic areas of the United States and hear appeals from the U.S. district courts within their borders. The District of Columbia Circuit covers only Washington, DC. The Federal Circuit hears appeals from federal courts across the entire United States in cases involving certain specialized areas of law. The United States courts of appeals are considered the most powerful and influential courts in the United States after the Supreme Court. Because of their ability to set legal precedent in regions that cover millions of Americans, the United States courts of appeals have strong policy influence on U.S. law. Moreover, because the Supreme Court chooses to review fewer than 3% of the 7,000 to 8,000 cases filed with it annually, the U.S. courts of appeals as a practical matter serve as the final arbiter on the vast majority of federal cases. There are 179 judgeships on the U.S. courts of appeals authorized by Congress in 28 U.S.C. § 43 pursuant to Article III of the U.S. Constitution. Like other federal judges, they are nominated by the president of the United States and confirmed by the United States Senate. They have lifetime tenure, earning (as of 2023) an annual salary of $246,600. The actual number of judges in service varies, both because of vacancies and because senior judges who continue to hear cases are not counted against the number of authorized judgeships. Decisions of the U.S.
courts of appeals have been published by the private company West Publishing in the Federal Reporter series since the courts were established. Only decisions that the courts designate for publication are included. The "unpublished" opinions (of all but the Fifth and Eleventh Circuits) are published separately in West's Federal Appendix, and they are also available in online databases like LexisNexis or Westlaw. More recently, court decisions have also been made available electronically on official court websites. However, there are also a few federal court decisions that are classified for national security reasons. The circuit with the fewest appellate judges is the First Circuit, and the one with the most appellate judges is the geographically large and populous Ninth Circuit in the West. The number of judges that the U.S. Congress has authorized for each circuit is set forth by law in 28 U.S.C. § 44, while the places where those judges must regularly sit to hear appeals are prescribed in 28 U.S.C. § 48. Although the courts of appeals are frequently called "circuit courts", they should not be confused with the former United States circuit courts, which were active from 1789 through 1911, during the time when long-distance transportation was much less available, and which were primarily first-level federal trial courts that moved periodically from place to place in "circuits" in order to serve the dispersed population in towns and the smaller cities that existed then. The "courts of appeals" system was established in the Judiciary Act of 1891.

Procedure

Because the courts of appeals possess only appellate jurisdiction, they do not hold trials. Only courts with original jurisdiction hold trials and thus determine punishments (in criminal cases) and remedies (in civil cases).
Instead, appeals courts review decisions of trial courts for errors of law.[citation needed] Accordingly, an appeals court considers only the record (that is, the papers the parties filed and the transcripts and any exhibits from any trial) from the trial court, and the legal arguments of the parties.[citation needed] These arguments, which are presented in written form and can range in length from dozens to hundreds of pages, are known as briefs. Sometimes lawyers are permitted to supplement their written briefs with oral arguments before the appeals judges. At such hearings, only the parties' lawyers speak to the court.[citation needed] The rules that govern the procedure in the courts of appeals are the Federal Rules of Appellate Procedure. In a court of appeals, an appeal is almost always heard by a "panel" of three judges who are randomly selected from the available judges (including senior judges and judges temporarily assigned to the circuit). Some cases, however, receive an en banc hearing. Except in the Ninth Circuit, the en banc court consists of all of the circuit judges who are on active status, but it does not include the senior or assigned judges (except that under some circumstances, a senior judge who participated at an earlier stage of the same case may participate in its en banc hearing). Because of the large number of appellate judges in the Ninth Circuit (29), only ten judges, chosen at random, and the chief judge hear en banc cases. The strong emphasis on collective judicial participation distinguishes the en banc procedure in the United States courts of appeals from the use of extended judicial formations in other legal systems, where such bodies often serve different institutional purposes. In the past, certain classes of federal court cases carried the right of an automatic appeal to the Supreme Court of the United States.
That is, one of the parties in the case could appeal a decision of a court of appeals to the Supreme Court, and it had to accept the case.[citation needed] The right of automatic appeal for most types of decisions of a court of appeals was ended by the Judiciary Act of 1925. Passage of this law was urged by Chief Justice William Howard Taft. The current procedure is that a party in a case may apply to the Supreme Court to review a ruling of the circuit court. This is called petitioning for a writ of certiorari, and the Supreme Court may choose, in its sole discretion, to review any lower court ruling. In extremely rare cases, the Supreme Court may grant the writ of certiorari before the judgment is rendered by the court of appeals, thereby reviewing the lower court's ruling directly. Certiorari before judgment was granted in the Watergate scandal-related case, United States v. Nixon, and in the 2005 decision involving the Federal Sentencing Guidelines, United States v. Booker. A court of appeals may also pose questions to the Supreme Court for a ruling in the midst of reviewing a case. This procedure was formerly used somewhat commonly, but now it is quite rare. For example, while between 1937 and 1946 twenty 'certificate' cases were accepted, since 1947 the Supreme Court has accepted only four. The Second Circuit, sitting en banc, attempted to use this procedure in the case United States v. Penaranda, 375 F.3d 238 (2d Cir. 2004), as a result of the Supreme Court's decision in Blakely v. Washington, but the Supreme Court dismissed the question. The last instance of the Supreme Court accepting a set of questions and answering them was in 1982's City of Mesquite v. Aladdin's Castle, Inc. A court of appeals may convene a Bankruptcy Appellate Panel to hear appeals in bankruptcy cases directly from the bankruptcy court of its circuit. As of 2008, only the First, Sixth, Eighth, Ninth, and Tenth Circuits have established a Bankruptcy Appellate Panel.
Those circuits that do not have a Bankruptcy Appellate Panel have their bankruptcy appeals heard by the district court. Courts of appeals decisions, unlike those of the lower federal courts, establish binding precedents. Other federal courts in that circuit must, from that point forward, follow the appeals court's guidance in similar cases, regardless of whether the trial judge thinks that the case should be decided differently.[citation needed] Federal and state laws can and do change from time to time, depending on the actions of Congress and the state legislatures. Therefore, the law that exists at the time of the appeal might be different from the law that existed at the time of the events that are in controversy under civil or criminal law in the case at hand. A court of appeals applies the law as it exists at the time of the appeal; otherwise, it would be handing down decisions that would be instantly obsolete, and this would be a waste of time and resources, since such decisions could not be cited as precedent. "[A] court is to apply the law in effect at the time it renders its decision, unless doing so would result in manifest injustice, or there is statutory direction or some legislative history to the contrary." However, the above rule cannot apply in criminal cases if the effect of applying the newer law would be to create an ex post facto law to the detriment of the defendant.[citation needed] Decisions made by the circuit courts only apply to the districts within the court's oversight, though other courts may use the guidance issued by the circuit court in their own judgments. While a single case can only be heard by one circuit court, a core legal principle may be litigated through multiple cases in separate circuit courts, creating an inconsistency between different parts of the United States. This inconsistency is known as a circuit split.
Often, if there is a split between two or more circuits, and a related case is petitioned to the Supreme Court, the Supreme Court will take that case to resolve the split.[citation needed] Attorneys In order to serve as counsel in a case appealed to a circuit court, the attorney must first be admitted to the bar of that circuit. Admission to the bar of a circuit court is granted as a matter of course to any attorney who is admitted to practice law in any state of the United States. The attorney submits an application, pays a fee, and takes the oath of admission. Local practice varies as to whether the oath is given in writing or in open court before a judge of the circuit, and most courts of appeals allow the applicant to choose which method they prefer.[citation needed] Nomenclature When the courts of appeals were created in 1891, one was created for each of the nine circuits then existing, and each court was named the "United States Circuit Court of Appeals for the _____ Circuit". When a court of appeals was created for the District of Columbia in 1893, it was named the "Court of Appeals for the District of Columbia", and it was renamed the "United States Court of Appeals for the District of Columbia" in 1934. In 1948, Congress renamed all of the courts of appeals then existing to their current formal names: the court of appeals for each numbered circuit was named the "United States Court of Appeals for the _____ Circuit", and the "United States Court of Appeals for the District of Columbia" became the "United States Court of Appeals for the District of Columbia Circuit". The Tenth Circuit was created in 1929 by subdividing the existing Eighth Circuit, and the Eleventh Circuit was created in 1981 by subdividing the existing Fifth Circuit. 
The Federal Circuit was created in 1982 by the merger of the United States Court of Customs and Patent Appeals and the appellate division of the United States Court of Claims.[citation needed] Judicial councils Judicial councils are panels in each circuit that are charged with making "necessary and appropriate orders for the effective and expeditious administration of justice" within their circuits. Among their responsibilities are judicial discipline, the formulation of circuit policy, the implementation of policy directives received from the Judicial Conference of the United States, and the annual submission of a report to the Administrative Office of the United States Courts on the number and nature of orders entered during the year that relate to judicial misconduct. Judicial councils consist of the chief judge of the circuit and an equal number of circuit judges and district judges of the circuit. Circuit composition The courts of appeals, and the lower courts and specific other bodies over which they have appellate jurisdiction, are as follows:
First Circuit (Boston)
Second Circuit (New York City)
Third Circuit (Philadelphia)
Fourth Circuit (Richmond)
Fifth Circuit (New Orleans)
Sixth Circuit (Cincinnati)
Seventh Circuit (Chicago)
Eighth Circuit (St. Louis)
Ninth Circuit (San Francisco)
Tenth Circuit (Denver)
Eleventh Circuit (Atlanta)
District of Columbia Circuit (Washington)
Federal Circuit (Washington)
Circuit population Based on 2020 United States census figures, the population residing in each circuit is as follows. History The Judiciary Act of 1789 established three circuits, which were groups of judicial districts in which United States circuit courts were established. The original three circuits were given distinct names, rather than numbers: the Eastern, the Middle, and the Southern. 
Each circuit court consisted of two Supreme Court justices and the local district judge; the three circuits existed solely for the purpose of assigning the justices to a group of circuit courts. Some districts (generally the ones most difficult for an itinerant justice to reach) did not have a circuit court; in these districts the district court exercised the original jurisdiction of a circuit court. As new states were admitted to the Union, Congress often did not create circuit courts for them for a number of years.[citation needed] The number of circuits remained unchanged until 1801, when the Midnight Judges Act reorganized the districts into six numbered circuits and created circuit judgeships so that Supreme Court justices would no longer have to ride circuit. This Act, however, was repealed in March 1802, and Congress provided that the former circuit courts would be revived as of July 1 of that year. But it then passed the new Judiciary Act of 1802 in April, so that the revival of the old courts never took effect. The 1802 Act restored circuit riding, but with only one justice to a circuit; it therefore created six new circuits, but with slightly different compositions than the 1801 Act. These six circuits later were augmented by others. Until 1866, each new circuit (except the short-lived California Circuit) was accompanied by a newly created Supreme Court seat.[citation needed]
========================================
[SOURCE: https://en.wikipedia.org/wiki/Kharruba]
Kharruba Kharruba was a Palestinian Arab village in the Ramle Subdistrict of Mandatory Palestine, near Modi'in. It was located 8 km east of Ramla. It was depopulated on July 12, 1948, during the 1948 Arab–Israeli War. History The name Kharruba, in its current form, is an Arabic one: "a carob tree". It may be Kfar Hariba or Kfar Haruba mentioned in the Jerusalem Talmud as home of two brothers who fought the Romans during the Bar Kokhba revolt. In 1552, Kharruba was a cultivated place (mazra'a). Part of the tax revenues of Kharruba were endowed to the Haseki Sultan Imaret in Jerusalem, founded by Haseki Hürrem Sultan, the favourite wife of Suleiman the Magnificent. Administratively, Kharruba belonged to the Sub-district of Ramla in the District of Gaza. Kharruba appeared in Ottoman tax registers compiled in 1596 under the name of Harnuba, in the Nahiya of Ramla of the Gaza Sanjak. It was indicated as empty (hali), though 25% taxes were paid on agricultural products. These included wheat, barley, summer crops, vineyards, fruit trees, sesame, goats, and beehives, in addition to occasional revenues; a total of 4,000 akçe. In 1838, it was noted as a Muslim village, Khurrubeh, in the Ibn Humar area in the District of Er-Ramleh. In 1863, Victor Guérin described Kharruba as a hamlet of a few huts. He noticed the remains of a medieval fort and suggested it might be the Crusader castle Arnaldi. The following decade, the PEF's "Survey of Western Palestine" found only ruins. By the beginning of the 20th century, residents from Beit Iksa resettled the site, establishing it as a dependency – or satellite village – of their home village. During the British Mandate period, Kharruba was one of the key areas of lime production for the developing urban centers along Palestine's coastal plain. At the time of the 1931 census, Kharruba had 21 occupied houses and a population of 119 inhabitants, all Muslims. In the 1945 statistics, the village had a population of 170 Muslims. 
The total land area was 3,374 dunams. Of this, 1,620 dunams were used for cereals, 25 dunams were irrigated or used for orchards, and 3 dunams were classified as built-up public areas. It was depopulated during the 1948 Arab–Israeli War on July 12, 1948, by the Yiftach Brigade, which reported that it had blown up the houses and "cleared the village". In 1992 the village site was described: "The site is covered with the stone rubble of the destroyed houses, overgrown with vegetation. Many of the plants that grow on the site are the ones that Palestinians traditionally planted near their homes: cactuses, castor oil (ricinus) plants, and cypress, Christ's thorn, and olive trees. The surrounding land is used by the Israelis as grazing ground." Archaeology A site called Haruba is mentioned in the Copper Scroll, the only one of the Dead Sea Scrolls engraved on copper rather than written on parchment. Modern scholars do not believe Kharruba to be the site mentioned in the scroll. In 2012, five suspected antiquities robbers were caught at Kharruba, after damaging a mikveh (ritual bath) dating to the Second Temple period and trenches used as hiding places during the Bar Kokhba revolt.
========================================
[SOURCE: https://en.wikipedia.org/wiki/Middle_East#cite_ref-59]
Middle East The Middle East[b] is a geopolitical region encompassing the Arabian Peninsula, Egypt, Iran, Iraq, the Levant, and Turkey. The term came into widespread usage by Western European nations in the early 20th century as a replacement for the term Near East (both were in contrast to the Far East). The term "Middle East" has led to some confusion over its changing definitions. Since the late 20th century, it has been criticized as being too Eurocentric. The region includes the vast majority of the territories included in the closely associated definition of West Asia, but without the South Caucasus. It also includes all of Egypt (not just the Sinai region) and all of Turkey (including East Thrace). Most Middle Eastern countries (13 out of 18) are part of the Arab world. The three most populous countries in the region are Egypt, Iran, and Turkey, while Saudi Arabia is the largest Middle Eastern country by area. The history of the Middle East dates back to ancient times, and it was long considered the "cradle of civilization". The geopolitical importance of the region has been recognized and competed for over millennia. The Abrahamic religions (Judaism, Christianity, and Islam) have their origins in the Middle East. Arabs constitute the main ethnic group in the region, followed by Turks, Persians, Kurds, Jews, and Assyrians. The Middle East generally has a hot, arid climate, especially in the Arabian and Egyptian regions. Several major rivers provide irrigation to support agriculture in limited areas, such as the Nile Delta in Egypt, the Tigris and Euphrates watersheds of Mesopotamia, and the basin of the Jordan River that spans most of the Levant. These regions are collectively known as the Fertile Crescent, and comprise the core of what historians had long referred to as the cradle of civilization; multiple regions of the world have since been classified as also having developed independent, original civilizations. 
Conversely, the Levantine coast and most of Turkey have relatively temperate climates typical of the Mediterranean, with dry summers and cool, wet winters. Most of the countries that border the Persian Gulf have vast reserves of petroleum. Monarchs of the Arabian Peninsula in particular have benefitted economically from petroleum exports. Because of the arid climate and dependence on the fossil fuel industry, the Middle East is both a major contributor to climate change and a region that is expected to be severely adversely affected by it. Other concepts of the region exist, including the broader Middle East and North Africa (MENA), which includes states of the Maghreb and the Sudan. The term "Greater Middle East" also includes Afghanistan, Mauritania, Pakistan, as well as parts of East Africa, and sometimes Central Asia and the South Caucasus. Terminology The term "Middle East" may have originated in the 1850s in the British India Office. However, it became more widely known when United States naval strategist Alfred Thayer Mahan used the term in 1902 to "designate the area between Arabia and India". During this time the British and Russian empires were vying for influence in Central Asia, a rivalry that would become known as the Great Game. Mahan recognized not only the strategic importance of the region, but also that of its center, the Persian Gulf. He labeled the area surrounding the Persian Gulf as the Middle East. He said that, beyond Egypt's Suez Canal, the Gulf was the most important passage for Britain to control in order to keep the Russians from advancing towards British India. Mahan first used the term in his article "The Persian Gulf and International Relations", published in September 1902 in the National Review, a British journal. The Middle East, if I may adopt a term which I have not seen, will some day need its Malta, as well as its Gibraltar; it does not follow that either will be in the Persian Gulf. 
Naval force has the quality of mobility which carries with it the privilege of temporary absences; but it needs to find on every scene of operation established bases of refit, of supply, and in case of disaster, of security. The British Navy should have the facility to concentrate in force if occasion arise, about Aden, India, and the Persian Gulf. Mahan's article was reprinted in The Times and followed in October by a 20-article series entitled "The Middle Eastern Question", written by Sir Ignatius Valentine Chirol. During this series, Sir Ignatius expanded the definition of Middle East to include "those regions of Asia which extend to the borders of India or command the approaches to India." After the series ended in 1903, The Times removed quotation marks from subsequent uses of the term. Until World War II, it was customary to refer to areas centered on Turkey and the eastern shore of the Mediterranean as the "Near East", while the "Far East" centered on China, India and Japan. The Middle East was then defined as the area from Mesopotamia to Burma; namely, the area between the Near East and the Far East. This area broadly corresponds to South Asia. In the late 1930s, the British established the Middle East Command, which was based in Cairo, for its military forces in the region. After that time, the term "Middle East" gained broader usage in Europe and the United States. Following World War II, for example, the Middle East Institute was founded in Washington, D.C. in 1946. The corresponding adjective is Middle Eastern and the derived noun is Middle Easterner. While non-Eurocentric terms such as "Southwest Asia" or "Swasia" have seen sparse use, the inclusion of Egypt, an African country, in the Middle East challenges the usefulness of such terms. The description "Middle" has also led to some confusion over changing definitions. 
Before the First World War, "Near East" was used in English to refer to the Balkans and the Ottoman Empire, while "Middle East" referred to the Caucasus, Persia, and Arabian lands, and sometimes Afghanistan, India and others. In contrast, "Far East" referred to the countries of East Asia (e.g. China, Japan, and Korea). With the collapse of the Ottoman Empire in 1918, "Near East" largely fell out of common use in English, while "Middle East" came to be applied to the emerging independent countries of the Islamic world. However, the usage "Near East" was retained by a variety of academic disciplines, including archaeology and ancient history. In their usage, the term describes an area identical to the term Middle East, which is not used by these disciplines (see ancient Near East).[citation needed] The first official use of the term "Middle East" by the United States government was in the 1957 Eisenhower Doctrine, which pertained to the Suez Crisis. Secretary of State John Foster Dulles defined the Middle East as "the area lying between and including Libya on the west and Pakistan on the east, Syria and Iraq on the North and the Arabian peninsula to the south, plus the Sudan and Ethiopia." In 1958, the State Department explained that the terms "Near East" and "Middle East" were interchangeable, and defined the region as including only Egypt, Syria, Israel, Lebanon, Jordan, Iraq, Saudi Arabia, Kuwait, Bahrain, and Qatar. Since the late 20th century, scholars and journalists from the region, such as journalist Louay Khraish and historian Hassan Hanafi, have criticized the use of "Middle East" as a Eurocentric and colonialist term. The Associated Press Stylebook of 2004 says that Near East formerly referred to the farther west countries while Middle East referred to the eastern ones, but that now they are synonymous. It instructs: Use Middle East unless Near East is used by a source in a story. Mideast is also acceptable, but Middle East is preferred. 
European languages have adopted terms similar to Near East and Middle East. Since these are based on a relative description, the meanings depend on the country and are generally different from the English terms. In German the term Naher Osten (Near East) is still in common use (nowadays the term Mittlerer Osten is more and more common in press texts translated from English sources, albeit having a distinct meaning). In four Slavic languages (Russian Ближний Восток or Blizhniy Vostok, Bulgarian Близкия Изток, Polish Bliski Wschód, and Croatian Bliski istok), terms meaning Near East are the only appropriate ones for the region. However, some European languages do have "Middle East" equivalents, such as French Moyen-Orient, Swedish Mellanöstern, Spanish Oriente Medio or Medio Oriente, Greek Μέση Ανατολή (Mesi Anatoli), and Italian Medio Oriente.[c] Perhaps because of the political influence of the United States and Europe, and the prominence of Western press, the Arabic equivalent of Middle East (Arabic: الشرق الأوسط ash-Sharq al-Awsaṭ) has become standard usage in the mainstream Arabic press. It carries the same meaning as the term "Middle East" in North American and Western European usage. The designation Mashriq, also from the Arabic root for East, denotes a variously defined region around the Levant, the eastern part of the Arabic-speaking world (as opposed to the Maghreb, the western part). Even though the term originated in the West, countries of the Middle East that use languages other than Arabic also use that term in translation. For instance, the Persian equivalent for Middle East is خاورمیانه (Khāvar-e miyāneh), the Hebrew is המזרח התיכון (hamizrach hatikhon), and the Turkish is Orta Doğu. Countries and territory Traditionally included within the Middle East are Arabia, Asia Minor, East Thrace, Egypt, Iran, the Levant, Mesopotamia, and the Socotra Archipelago. The region includes 17 UN-recognized countries and one British Overseas Territory. 
Various concepts are often paralleled to the Middle East, most notably the Near East, Fertile Crescent, and Levant. These are geographical concepts, which refer to large sections of the modern-day Middle East, with the Near East being the closest to the Middle East in its geographical meaning. Because it is primarily Arabic-speaking, the Maghreb region of North Africa is sometimes included. "Greater Middle East" is a political term coined by the second Bush administration in the first decade of the 21st century to denote various countries, pertaining to the Muslim world, specifically Afghanistan, Iran, Pakistan, and Turkey. Various Central Asian countries are sometimes also included. History The Middle East lies at the juncture of Africa and Eurasia and of the Indian Ocean and the Mediterranean Sea (see also: Indo-Mediterranean). It is the birthplace and spiritual center of religions such as Christianity, Islam, Judaism, Manichaeism, Yazidism, the Druze faith, Yarsanism, and Mandaeism, and in Iran, Mithraism, Zoroastrianism, and the Baháʼí Faith. Throughout its history the Middle East has been a major center of world affairs; a strategically, economically, politically, culturally, and religiously sensitive area. The region is one of the regions where agriculture was independently discovered, and from the Middle East it spread, during the Neolithic, to different regions of the world such as Europe, the Indus Valley and Eastern Africa. Prior to the formation of civilizations, advanced cultures formed all over the Middle East during the Stone Age. The search for agricultural lands by agriculturalists, and pastoral lands by herdsmen, meant different migrations took place within the region and shaped its ethnic and demographic makeup. The Middle East is widely and most famously known as the cradle of civilization. 
The world's earliest civilizations, Mesopotamia (Sumer, Akkad, Assyria and Babylonia), ancient Egypt and Kish in the Levant, all originated in the Fertile Crescent and Nile Valley regions of the ancient Near East. These were followed by the Hittite, Greek, Hurrian and Urartian civilizations of Asia Minor; the Elamite, Persian and Median civilizations in Iran; as well as the civilizations of the Levant (such as Ebla, Mari, Nagar, Ugarit, Canaan, Aramea, Mitanni, Phoenicia and Israel) and the Arabian Peninsula (Magan, Sheba, Ubar). The Near East was first largely unified under the Neo-Assyrian Empire, then the Achaemenid Empire, followed later by the Macedonian Empire and, after this, to some degree by the Iranian empires (namely the Parthian and Sassanid Empires), the Roman Empire and the Byzantine Empire. The region served as the intellectual and economic center of the Roman Empire and played an exceptionally important role due to its position on the periphery of the Sassanid Empire. Thus, the Romans stationed up to five or six of their legions in the region for the sole purpose of defending it from Sassanid and Bedouin raids and invasions. From the 4th century CE onwards, the Middle East became the center of the two main powers at the time, the Byzantine Empire and the Sassanid Empire. However, it would be the later Islamic Caliphates of the Middle Ages, or Islamic Golden Age, which began with the Islamic conquest of the region in the 7th century AD, that would first unify the entire Middle East as a distinct region and create the dominant Islamic Arab ethnic identity that largely (but not exclusively) persists today. The four caliphates that dominated the Middle East for more than 600 years were the Rashidun, Umayyad, Abbasid and Fatimid caliphates. 
Additionally, the Mongols would come to dominate the region, the Kingdom of Armenia would incorporate parts of the region to their domain, the Seljuks would rule the region and spread Turko-Persian culture, and the Franks would found the Crusader states that would stand for roughly two centuries. Josiah Russell estimates the population of what he calls "Islamic territory" as roughly 12.5 million in 1000 – Anatolia 8 million, Syria 2 million, and Egypt 1.5 million. From the 16th century onward, the Middle East came to be dominated, once again, by two main powers: the Ottoman Empire and the Safavid dynasty. The modern Middle East began after World War I, when the Ottoman Empire, which was allied with the Central Powers, was defeated by the Allies and partitioned into a number of separate nations, initially under British and French Mandates. Other defining events in this transformation included the establishment of Israel in 1948 and the eventual departure of European powers, notably Britain and France by the end of the 1960s. They were supplanted in some part by the rising influence of the United States from the 1970s onwards. In the 20th century, the region's significant stocks of crude oil gave it new strategic and economic importance. Mass production of oil began around 1945, with Saudi Arabia, Iran, Kuwait, Iraq, and the United Arab Emirates having large quantities of oil. Estimated oil reserves, especially in Saudi Arabia and Iran, are some of the highest in the world, and the international oil cartel OPEC is dominated by Middle Eastern countries. During the Cold War, the Middle East was a theater of ideological struggle between the two superpowers and their allies: NATO and the United States on one side, and the Soviet Union and Warsaw Pact on the other, as they competed to influence regional allies. Besides the political reasons there was also the "ideological conflict" between the two systems. 
Moreover, as Louise Fawcett argues, among many important areas of contention, or perhaps more accurately of anxiety, were, first, the desires of the superpowers to gain strategic advantage in the region, second, the fact that the region contained some two-thirds of the world's oil reserves in a context where oil was becoming increasingly vital to the economy of the Western world [...] Within this contextual framework, the United States sought to divert the Arab world from Soviet influence. Throughout the 20th and 21st centuries, the region has experienced both periods of relative peace and tolerance and periods of conflict particularly between Sunnis and Shiites. Geography In 2018, the MENA region emitted 3.2 billion tonnes of carbon dioxide and produced 8.7% of global greenhouse gas emissions (GHG) despite making up only 6% of the global population. These emissions are mostly from the energy sector, an integral component of many Middle Eastern and North African economies due to the extensive oil and natural gas reserves that are found within the region. The Middle East region is one of the most vulnerable to climate change. The impacts include increase in drought conditions, aridity, heatwaves and sea level rise. Sharp global temperature and sea level changes, shifting precipitation patterns and increased frequency of extreme weather events are some of the main impacts of climate change as identified by the Intergovernmental Panel on Climate Change (IPCC). The MENA region is especially vulnerable to such impacts due to its arid and semi-arid environment, facing climatic challenges such as low rainfall, high temperatures and dry soil. The climatic conditions that foster such challenges for MENA are projected by the IPCC to worsen throughout the 21st century. If greenhouse gas emissions are not significantly reduced, part of the MENA region risks becoming uninhabitable before the year 2100. 
Climate change is expected to put significant strain on already scarce water and agricultural resources within the MENA region, threatening the national security and political stability of all included countries. Over 60 percent of the region's population lives in high and very high water-stressed areas, compared to the global average of 35 percent. This has prompted some MENA countries to engage with the issue of climate change on an international level through environmental accords such as the Paris Agreement. Law and policy are also being established on a national level amongst MENA countries, with a focus on the development of renewable energies. Economy Middle Eastern economies range from very poor (such as Gaza and Yemen) to extremely wealthy (such as Qatar and the UAE). According to the International Monetary Fund, the three largest Middle Eastern economies in nominal GDP in 2023 were Saudi Arabia ($1.06 trillion), Turkey ($1.03 trillion), and Israel ($0.54 trillion). For nominal GDP per person, the highest-ranking countries are Qatar ($83,891), Israel ($55,535), the United Arab Emirates ($49,451) and Cyprus ($33,807). Turkey ($3.6 trillion), Saudi Arabia ($2.3 trillion), and Iran ($1.7 trillion) had the largest economies in terms of GDP PPP. For GDP PPP per person, the highest-ranking countries are Qatar ($124,834), the United Arab Emirates ($88,221), Saudi Arabia ($64,836), Bahrain ($60,596) and Israel ($54,997). The lowest-ranking country in the Middle East, in terms of nominal GDP per capita, is Yemen ($573). The economic structures of Middle Eastern nations differ: while some are heavily dependent on the export of oil and oil-related products (Saudi Arabia, the UAE and Kuwait), others have a highly diverse economic base (such as Cyprus, Israel, Turkey and Egypt). 
Industries of the Middle Eastern region include oil and oil-related products, agriculture, cotton, cattle, dairy, textiles, leather products, surgical instruments, and defence equipment (guns, ammunition, tanks, submarines, fighter jets, UAVs, and missiles). Banking is an important sector, especially for the UAE and Bahrain. With the exception of Cyprus, Turkey, Egypt, Lebanon and Israel, tourism has been a relatively undeveloped area of the economy, in part because of the socially conservative nature of the region as well as political turmoil in certain regions. Since the end of the COVID-19 pandemic, however, countries such as the UAE, Bahrain, and Jordan have begun attracting greater numbers of tourists because of improving tourist facilities and the relaxing of tourism-related restrictive policies. Unemployment is high in the Middle East and North Africa region, particularly among people aged 15–29, a demographic representing 30% of the region's population. The total regional unemployment rate in 2025 is 10.8%, and among youth is as high as 28%. Demographics Arabs constitute the largest ethnic group in the Middle East, followed by various Iranian peoples and then by Turkic peoples (Turkish, Azeris, Syrian Turkmen, and Iraqi Turkmen). Native ethnic groups of the region include, in addition to Arabs, Arameans, Assyrians, Baloch, Berbers, Copts, Druze, Greek Cypriots, Jews, Kurds, Lurs, Mandaeans, Persians, Samaritans, Shabaks, Tats, and Zazas. European ethnic groups that form a diaspora in the region include Albanians, Bosniaks, Circassians (including Kabardians), Crimean Tatars, Greeks, Franco-Levantines, and Italo-Levantines. Among other migrant populations are Chinese, Filipinos, Indians, Indonesians, Pakistanis, Pashtuns, Romani, and Afro-Arabs. "Migration has always provided an important vent for labor market pressures in the Middle East. 
For the period between the 1970s and 1990s, the Arab states of the Persian Gulf in particular provided a rich source of employment for workers from Egypt, Yemen and the countries of the Levant, while Europe had attracted young workers from North African countries due both to proximity and the legacy of colonial ties between France and the majority of North African states." According to the International Organization for Migration, there are 13 million first-generation migrants from Arab nations in the world, of which 5.8 million reside in other Arab countries. Expatriates from Arab countries contribute to the circulation of financial and human capital in the region and thus significantly promote regional development. In 2009, Arab countries received a total of US$35.1 billion in remittance inflows, and remittances sent to Jordan, Egypt and Lebanon from other Arab countries are 40 to 190 per cent higher than trade revenues between these and other Arab countries. In Somalia, the Somali Civil War has greatly increased the size of the Somali diaspora, as many of the best educated Somalis left for Middle Eastern countries as well as Europe and North America. Non-Arab Middle Eastern countries such as Turkey, Israel and Iran are also subject to important migration dynamics. A fair proportion of those migrating from Arab nations are from ethnic and religious minorities facing persecution and are not necessarily ethnic Arabs, Iranians or Turks.[citation needed] Large numbers of Kurds, Jews, Assyrians, Greeks and Armenians as well as many Mandaeans have left nations such as Iraq, Iran, Syria and Turkey for these reasons during the last century. In Iran, many religious minorities such as Christians, Baháʼís, Jews and Zoroastrians have left since the Islamic Revolution of 1979. The Middle East is very diverse when it comes to religions, many of which originated there. 
Islam is the largest religion in the Middle East, but other faiths that originated there, such as Judaism and Christianity, are also well represented. Christian communities have played a vital role in the Middle East, and they represent 78% of the population of Cyprus and 40.5% of that of Lebanon, where the Lebanese president, half of the cabinet, and half of the parliament follow one of the various Lebanese Christian rites. There are also important minority religions like the Baháʼí Faith, Yarsanism, Yazidism, Zoroastrianism, Mandaeism, the Druze faith, and Shabakism, and in ancient times the region was home to Mesopotamian religions, Canaanite religions, Manichaeism, Mithraism and various monotheist gnostic sects. The top six languages, in terms of numbers of speakers, are Arabic, Persian, Turkish, Kurdish, Modern Hebrew and Greek. About 20 minority languages are also spoken in the Middle East. Arabic, with all its dialects, is the most widely spoken language in the Middle East, with Literary Arabic being official in all North African and in most West Asian countries. Arabic dialects are also spoken in some adjacent areas in neighbouring Middle Eastern non-Arab countries. It is a member of the Semitic branch of the Afro-Asiatic languages. Several Modern South Arabian languages such as Mehri and Soqotri are also spoken in Yemen and Oman. Another Semitic language is Aramaic, whose dialects are spoken mainly by Assyrians and Mandaeans, with Western Aramaic still spoken in two villages near Damascus, Syria. There is also an Oasis Berber-speaking community in Egypt, where the language is also known as Siwa. It is a non-Semitic Afro-Asiatic sister language. Persian is the second-most spoken language. While it is primarily spoken in Iran and some border areas in neighbouring countries, the country is one of the region's largest and most populous. It belongs to the Indo-Iranian branch of the family of Indo-European languages. 
Other Western Iranian languages spoken in the region include Achomi, Daylami, Kurdish dialects, Semnani and Lurish, amongst many others. Turkish, a close third by number of speakers, is largely confined to Turkey, which is also one of the region's largest and most populous countries, but it is present in areas of neighbouring countries. It is a member of the Turkic languages, which have their origins in East Asia. Another Turkic language, Azerbaijani, is spoken by Azerbaijanis in Iran. The fourth-most widely spoken language, Kurdish, is spoken in Iran, Iraq, Syria and Turkey; Sorani Kurdish is the second official language of Iraq (instated by the 2005 constitution), after Arabic. Hebrew is the official language of Israel, with Arabic given a special status after a 2018 Basic Law lowered it from the official-language status it had held before 2018. Hebrew is spoken and used by over 80% of Israel's population, the other 20% using Arabic. Modern Hebrew began to be spoken only in the 20th century, after being revived in the late 19th century by Eliezer Ben-Yehuda (Eliezer Perlman) and European Jewish settlers, with the first native Hebrew speaker being born in 1882. Greek is one of the two official languages of Cyprus, and the country's main language. Small communities of Greek speakers exist all around the Middle East; until the 20th century it was also widely spoken in Asia Minor (being the second most spoken language there, after Turkish) and Egypt. In antiquity, Ancient Greek was the lingua franca for many areas of the western Middle East, and until the Muslim expansion it was widely spoken there as well. Until the late 11th century, it was also the main spoken language in Asia Minor; after that it was gradually replaced by the Turkish language as the Anatolian Turks expanded and the local Greeks were assimilated, especially in the interior. English is one of the official languages of Akrotiri and Dhekelia. 
It is also commonly taught and used as a second language in countries such as Egypt, Jordan, Iran, Iraq, Qatar, Bahrain, the United Arab Emirates and Kuwait, and is a main language in some emirates of the United Arab Emirates. It is also spoken as a native language by Jewish immigrants from Anglophone countries (the UK, US and Australia) in Israel, and is widely understood as a second language there. French is taught and used in many government facilities and media in Lebanon, and is taught in some primary and secondary schools of Egypt and Syria; due to widespread immigration of French Jews to Israel, it is also the native language of approximately 200,000 Jews in Israel. Maltese, a Semitic language mainly spoken in Europe, is used by the Franco-Maltese diaspora in Egypt. Armenian speakers are also found in the region, and Georgian is spoken by the Georgian diaspora. Russian is spoken by a large portion of the Israeli population, because of emigration in the late 1990s; it is a popular unofficial language in Israel today, and news, radio and signboards in Russian can be found around the country, after Hebrew and Arabic. Circassian is also spoken by the diaspora in the region and by almost all Circassians in Israel, who speak Hebrew and English as well. The largest Romanian-speaking community in the Middle East is found in Israel, where as of 1995 Romanian was spoken by 5% of the population.[d] Bengali, Hindi and Urdu are widely spoken by migrant communities in many Middle Eastern countries, such as Saudi Arabia (where 20–25% of the population is South Asian), the United Arab Emirates (where 50–55% of the population is South Asian), and Qatar, which have large numbers of Pakistani, Bangladeshi and Indian immigrants. Culture The Middle East has recently become more prominent in hosting global sport events due to its wealth and desire to diversify its economy. The South Asian diaspora is a major backer of cricket in the region. 
See also Notes References Further reading External links 29°N 41°E |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/United_States_district_court] | [TOKENS: 3146] |
Contents United States district court The United States district courts are the trial courts of the U.S. federal judiciary. There is one district court for each federal judicial district. Each district covers one U.S. state or a portion of a state. There is at least one federal courthouse in each district, and many districts have more than one. District court decisions are appealed to the U.S. court of appeals for the circuit in which they reside, except for certain specialized cases that are appealed to the U.S. Court of Appeals for the Federal Circuit or directly to the U.S. Supreme Court. District courts are courts of law, equity, and admiralty, and can hear both civil and criminal cases. But unlike U.S. state courts, federal district courts are courts of limited jurisdiction, and can only hear cases that involve disputes between residents of different states, questions of federal law, or federal crimes. Legal basis Unlike the U.S. Supreme Court, which was expressly established by Article III of the Constitution, the district courts were established by Congress pursuant to authority delegated by Article III[note 1] through the enacting of a federal statute, the Judiciary Act of 1789. There is no constitutional requirement that district courts exist at all. During the drafting and ratification of the Constitution, some opponents of a strong federal judiciary argued that the federal courts ought to be limited to the Supreme Court, which would hear appeals only from state courts. In other words, the state courts would be treated as federal tribunals under Article I of the Constitution for the purpose of hearing disputes under federal law, but their judges would not become officers of the federal government. Edward Carrington advocated this position in a letter to James Madison, and it was also discussed by Alexander Hamilton in Federalist No. 81. 
However, this view did not prevail, and the first Congress created the district court system that is still in place today. Nonetheless, pursuant to the Constitution, state courts retain concurrent jurisdiction over most federal matters. When the Act was first passed, thirteen districts were created among the eleven states which had ratified the Constitution by that point. When North Carolina and Rhode Island voted to ratify, a district was created for each of them, bringing the number of districts to fifteen. The territories (insular areas) of Guam, the Northern Mariana Islands, and the United States Virgin Islands each have one territorial court; these courts are called "district courts" and exercise the same jurisdiction as district courts, but differ from them in that territorial courts are Article IV courts, with judges who serve ten-year terms rather than the lifetime tenure of judges of Article III courts, such as the district court judges. American Samoa has neither a district court nor a federal territorial court, so its federal cases are sent either to the District of Columbia or to Hawaii. The Philippines, previously part of the United States, were never part of the U.S. federal court system. Geography There are 89 districts in the 50 states, with a total of 94 districts including territories. There is at least one judicial district for each state, the District of Columbia, and Puerto Rico. Each state has between one and four districts. In states with multiple districts, the districts are named geographically. States with two districts all give them either Northern–Southern or Western–Eastern designations. Most states with three districts add a Middle District, with two exceptions: Illinois has a Central District instead of a Middle District, and Oklahoma has Northern, Western, and Eastern Districts. 
Of the three states with four districts, New York and Texas use all four directional designations, while California has a Central District and no Western District. Other federal trial courts There are other federal trial courts that have nationwide jurisdiction over certain types of cases, but the district court also has concurrent jurisdiction over many of those cases, and the district court is the only one with jurisdiction over civilian criminal cases. The United States Court of International Trade addresses cases involving international trade and customs issues. The United States Court of Federal Claims has exclusive jurisdiction over most claims for money damages against the United States, including disputes over federal contracts, unlawful takings of private property by the federal government, and suits for injury on federal property or by a federal employee. The United States Tax Court has jurisdiction over contested pre-assessment determinations of taxes. Judges A judge of a United States district court is officially titled a "United States District Judge". Other federal judges, including circuit judges and Supreme Court justices, can also sit in a district court upon assignment by the chief judge of the circuit or by the Chief Justice of the United States. The number of judges in each district court (and the structure of the judicial system generally) is set by Congress in the United States Code. The president appoints the federal judges for terms of good behavior (subject to the advice and consent of the Senate), so the nominees often share at least some of the president's convictions. In states represented by a senator of the president's party, the senator (or the more senior of them if both senators are of the president's party) has substantial input into the nominating process, and through a tradition known as senatorial courtesy can exercise an unofficial veto over a nominee unacceptable to the senator. 
Federal magistrate judges are appointed by each district court pursuant to statute. They are appointed for an eight-year term and may be reappointed for additional eight-year terms. A magistrate judge may be removed "for incompetency, misconduct, neglect of duty, or physical or mental disability". A magistrate judgeship may be a stepping stone to a district judgeship nomination. District judges usually concentrate on managing their court's overall caseload, supervising trials, and writing opinions in response to important motions like the motion for summary judgment. Since the 1960s, routine tasks like resolving discovery disputes can, in the district judge's discretion, be referred to magistrate judges. Magistrate judges can also be requested to prepare reports and recommendations on contested matters for the district judge's consideration or, with the consent of all parties, to assume complete jurisdiction over a case including conducting the trial. With the exception of the territorial courts (Guam, the Northern Mariana Islands, and the Virgin Islands), federal district judges are Article III judges appointed for life, and can be removed involuntarily only when they violate the standard of "good behavior". The sole method of involuntary removal of a judge is through impeachment by the United States House of Representatives followed by a trial in the United States Senate and a conviction by a two-thirds vote. Otherwise, a judge, even if convicted of a felony criminal offense by a jury, is entitled to hold office until retirement or death. In the history of the United States, twelve judges have been impeached by the House, and seven have been removed following conviction in the Senate. (For a table that includes the twelve impeached judges, see Impeachment in the United States.) A judge who has reached the age of 65 (or has become disabled) may retire or elect to go on senior status and keep working. 
Such senior judges are not counted in the quota of active judges for the district and do only whatever work they are assigned by the chief judge of the district, but they keep their offices (called "chambers") and staff, and many of them work full-time. As of 2010, there were 678 authorized district court judgeships. A federal judge is addressed in writing as "The Honorable John/Jane Doe" or "Hon. John/Jane Doe" and in speech as "Judge" or "Judge Doe" or, when presiding in court, "Your Honor". Clerks Each district court appoints a clerk, who is responsible for overseeing filings made with the court, maintaining the court's records, processing fees, fines, and restitution, and managing the non-judicial work of the court, including information technology, budget, procurement, human resources, and finance. Clerks may appoint deputies, clerical assistants, and employees to carry out the work of the court. The clerk of each district court must reside in the district for which the clerk is appointed, except that the clerk of the District of Columbia and the clerk of the Southern District of New York may reside within twenty miles of their respective districts. The Judiciary Act of 1789 authorized the Supreme Court and the judge of each U.S. District Court to appoint a clerk to assist with the administration of federal judicial business in those courts. The clerk for each district court was to also serve as clerk of the corresponding circuit court. The Judiciary Act required each clerk to issue the writs summoning jurors and "to record the decrees, judgments and determinations of the court of which he is clerk." The Judicial Code (28 U.S.C. § 751) provides that the clerk is appointed, and may be removed, by the court. The clerk's duties are prescribed by the statute, by the court's customs and practices, and by policy established by the Judicial Conference of the United States. The clerk is appointed by order of the court en banc to serve the entire court. 
The role of the clerk and deputies or assistants should not be confused with the judges' law clerks, who assist the judges by conducting research and preparing drafts of opinions. To be eligible to serve as a clerk, a person must have a minimum of 10 years of progressively responsible administrative experience in public service or business that provides a thorough understanding of organizational, procedural, and human aspects of managing an organization, and at least 3 of the 10 years must have been in a position of substantial management responsibility. An attorney may substitute the active practice of law on a year-for-year basis for the management or administrative experience requirement. Clerks do not have to be licensed attorneys, but some courts specify that a law degree is a preference for employment. Jurisdiction Unlike some state courts, the power of federal courts to hear cases and controversies is strictly limited. Federal courts may not decide every case that happens to come before them. In order for a district court to entertain a lawsuit, Congress must first grant the court subject matter jurisdiction over the type of dispute in question. The district courts exercise original jurisdiction over—that is, they are empowered to conduct trials in—the following types of cases: For most of these cases, the jurisdiction of the federal district courts is concurrent with that of the state courts. In other words, a plaintiff can choose to bring these cases in either a federal district court or a state court. Congress has established a procedure whereby a party, typically the defendant, can "remove" a case from state court to federal court, provided that the federal court also has original jurisdiction over the matter (meaning that the case could have been filed in federal court initially). 
If the party that initially filed the case in state court believes that removal was improper, that party can ask the district court to "remand" the case to the state court system. For certain matters, such as patent and copyright infringement disputes and prosecutions for federal crimes, the jurisdiction of the district courts is exclusive of that of the state courts, meaning that only federal courts can hear those cases.[note 2] In addition to their original jurisdiction, the district courts have appellate jurisdiction over a very limited class of judgments, orders, and decrees. Attorneys In order to represent a party in a case in a district court, a person must be an attorney at law and generally must be admitted to the bar of that particular court. The United States usually does not have a separate bar examination for federal practice (except with respect to patent practice before the United States Patent and Trademark Office). Admission to the bar of a district court is generally available to any attorney who is admitted to practice law in the state where the district court sits.[note 3] 56 districts (around 60% of all district courts) require an attorney to be admitted to practice in the state where the district court sits. The other 39 districts (around 40% of all district courts) extend admission to certain lawyers admitted in other states, although conditions vary from court to court. For example, the district courts in New York City (Southern District of New York and Eastern District of New York) extend admission to attorneys admitted to the bar in Connecticut or Vermont and to the district court in that state, but otherwise require attorneys to be admitted to the New York bar. Only 13 districts extend admission to attorneys admitted to any U.S. state bar. The attorney generally submits an application with a fee and takes the oath of admission. Local practice varies as to whether the oath is given in writing or in open court before a judge of the district. 
A "sponsor" admitted to the court's bar is often required. Several district courts require attorneys seeking admission to their bars to take an additional bar examination on federal law, including the following: the Southern District of Ohio, the Northern District of Florida, and the District of Puerto Rico. Pro hac vice admission is also available in most federal district courts on a case-by-case basis. Most district courts require pro hac vice attorneys to associate with an attorney admitted to practice before the court. Appeals Generally, a final ruling by a district court in either a civil or a criminal case can be appealed to the United States court of appeals in the federal judicial circuit in which the district court is located, except that some district court rulings involving patents and certain other specialized matters must be appealed instead to the United States Court of Appeals for the Federal Circuit, and in a very few cases the appeal may be taken directly to the United States Supreme Court. Largest and busiest district courts The Central District of California is the largest federal district by population; it includes all five counties that make up Greater Los Angeles. By contrast, New York City and the surrounding metropolitan area are divided between the Southern District of New York (which includes Manhattan, The Bronx and Westchester County) and the Eastern District of New York (which includes Brooklyn, Queens, Staten Island, Nassau County and Suffolk County). New York suburbs in Connecticut and New Jersey are covered by the District of Connecticut and District of New Jersey, respectively. The Southern District of New York and the Central District of California are the largest federal districts by number of judges, with 28 judges each. In 2007, the busiest district courts in terms of criminal federal felony filings were the District of New Mexico, Western District of Texas, Southern District of Texas, and the District of Arizona. 
These four districts all share the border with Mexico. A crackdown on illegal immigration resulted in 75 percent of the criminal cases filed in the 94 district courts in 2007 being filed in these four districts and the other district that borders Mexico, the Southern District of California. The busiest patent litigation court is the United States District Court for the Eastern District of Texas, with the most patent lawsuits filed there nearly every year. List of district courts Extinct district courts Most extinct district courts have disappeared by being divided into smaller districts. The following courts were subdivided out of existence: Alabama, Arkansas, California, Florida, Georgia, Illinois, Indiana, Iowa, Kentucky, Louisiana, Michigan, Mississippi, Missouri, New York, North Carolina, Ohio, Pennsylvania, Tennessee, Texas, Virginia, Washington, West Virginia, Wisconsin. On rare occasions, an extinct district court was extinguished by merging it with other district courts. In every case except one, this has restored a district court that had been subdivided: There are a few additional extinct district courts that fall into neither of the above two patterns. See also Notes External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Weekly_Famicom_Ts%C5%ABshin] | [TOKENS: 1456] |
Contents Famitsu Famitsu[a], formerly Famicom Tsūshin[b], is a line of Japanese video game magazines published by Kadokawa Game Linkage (previously known as Gzbrain), a subsidiary of Kadokawa. Famitsu is published in weekly and monthly formats, and in special issues devoted to particular themes. It was first published in 1986. Shūkan Famitsū,[c] the original publication, is considered the most widely read and respected video game news magazine in Japan. From October 28, 2011, the company began releasing the digital version on BookWalker weekly. The name Famitsu is a portmanteau abbreviation of Famicom Tsūshin; Famicom is an abbreviation of Family Computer, the dominant video game console in Japan when the magazine was first published. History LOGiN (ログイン), a computer game magazine, started in 1982 as an extra issue of ASCII and later became a periodic magazine. Famicom Tsūshin[d] was a column in LOGiN focused on the Famicom platform, and ran from the March 1985 issue to the December 1986 issue. The column was well received, so the publisher decided to launch a magazine dedicated to the subject. The first issue of Famitsu was published on June 6, 1986, as Famicom Tsūshin. It sold fewer than 200,000 copies, despite a print run of 700,000. Its major competitor was Family Computer Magazine, launched in July 1985 by Tokuma Shoten. Famitsu's editors found that many readers owned multiple game consoles, and decided the magazine would do better by covering various platforms. As its content and page count gradually increased, the magazine moved from semimonthly publication to three issues per month. On July 19, 1991 (issue #136) the magazine was renamed Shūkan Famicom Tsūshin[e] and issues were published weekly thereafter. Alongside the weekly magazine, a monthly version called Gekkan Famicom Tsūshin[f] was also published. Hirokazu Hamamura, editor-in-chief from 1992 to 2002, sensed the beginning of a new era when he saw a private demonstration of Final Fantasy VI in 1993. 
He thought the name Famicom Tsūshin should be refreshed. At the start of 1996 (with issue #369) the magazines underwent another name change, truncating their titles to Shūkan Famitsū[g] and Gekkan Famitsū[h]; the name Famitsu had already been in common use. The magazine was published by ASCII from its founding until March 2000, when it was sold to Enterbrain, which published it for 13 years until Enterbrain's parent company Kadokawa took over publication from 2013 to 2017. Since 2017, Kadokawa's subsidiary Gzbrain has published the magazine; in 2019 the company changed its name to Kadokawa Game Linkage. Shūkan Famitsū and Gekkan Famitsū Famicom Tsūshin initially focused on the Famicom platform, but later featured multi-platform coverage. Famicom Tsūshin was renamed Famitsu in 1996. Shūkan Famitsū is a weekly publication concentrating on video game news and reviews, published every Thursday with a circulation of 500,000 per issue. Gekkan Famitsū is published monthly. Famitsu covers alternate between pop idols or actresses on even-numbered issues and the magazine's mascot, Necky[i] the Fox, on odd-numbered issues. Year-end and special editions all feature Necky dressed as popular contemporary video game characters. Necky is the cartoon creation of artist Susumu Matsushita, and he takes the form of a costumed fox. The costumes worn by Necky reflect current popular video games. Necky's name was chosen according to a reader poll, and it derives from a complex Japanese pun: "Necky" is actually the reverse of the Japanese word for fox, キツネ,[j] and his original connection to Famicom Tsūshin is intended to evoke the bark of the fox, the Japanese onomatopoeia of which is コンコン[k]. Necky makes a cameo appearance in Super Mario Maker. Special-topic Famitsu publications Famitsu publishes other magazines dedicated to particular consoles. 
Currently in circulation are: Famitsu spin-offs that are no longer in circulation include: Scoring Video games are graded in Famitsu via a review system in which four critics each assign the game a score from 0 to 10, with 10 being the highest. The four scores are then added together, for a maximum of 40. As of 2024, thirty games have received perfect scores of 40 from Famitsu. The console with the highest number of perfect-scoring games is the PlayStation 3, with seven total. Four of the perfect-scoring games on PlayStation 3 were also released on the Xbox 360, which is tied with the Wii for the second-highest number of perfect scores at five total. Franchises with multiple perfect score winners include The Legend of Zelda with five titles, Metal Gear with three titles, and Final Fantasy with two titles. The most recent game to receive a perfect score is Like a Dragon: Infinite Wealth. As of 2023, all but three games with perfect scores are from Japanese companies, ten being published/developed by Nintendo, four by Square Enix, three by Sega, three by Konami and one by Capcom. As of 2023, the only three completely foreign games to achieve a perfect score are The Elder Scrolls V: Skyrim by Bethesda Softworks, Grand Theft Auto V by Rockstar Games, and Ghost of Tsushima by Sucker Punch Productions. Other foreign games that have achieved near-perfect scores are Grand Theft Auto IV, Red Dead Redemption, L.A. Noire, and Red Dead Redemption 2, all by Rockstar Games; Call of Duty: Modern Warfare 2, Call of Duty: Black Ops, and Call of Duty: Modern Warfare 3, all by Activision (but published by Square Enix in Japan); Gears of War 3 by Epic Games; and The Last of Us Part II and Uncharted 4: A Thief's End by Naughty Dog. Kingdom Hearts II, another game with a near-perfect score, was a joint effort between Japanese developer Square Enix and American developer Disney Interactive Studios. Awards Famitsu administers the Famitsu awards. 
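The review arithmetic described above is simple to state precisely. As an illustrative sketch (the function names here are hypothetical, not part of any Famitsu system), four marks of 0–10 are summed into a score out of 40:

```python
# Illustrative sketch of Famitsu's review arithmetic: four critics each
# assign a mark from 0 to 10, and the published score is their sum (max 40).

def famitsu_score(marks):
    """Sum four individual critic marks (each 0-10) into a /40 total."""
    if len(marks) != 4:
        raise ValueError("Famitsu reviews use exactly four critics")
    if any(not 0 <= m <= 10 for m in marks):
        raise ValueError("each critic's mark must be between 0 and 10")
    return sum(marks)

def is_perfect(marks):
    """A 'perfect' Famitsu score is 40/40, i.e. four marks of 10."""
    return famitsu_score(marks) == 40

print(famitsu_score([10, 9, 10, 9]))  # 38
print(is_perfect([10, 10, 10, 10]))   # True
```

Under this scheme, a "near-perfect" score such as 39 means three critics awarded a 10 and one awarded a 9.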
Video games receive a number of different awards in categories like Innovation, Biggest Hit, Rookie Award, Highest Quality, etc. One or two "Game of the Year" awards are granted as the top prize. Top prize winners are determined by a combination of critical and fan review scores as well as sales figures. Relationship with other magazines UK trade magazine MCV and Famitsu have an exclusive partnership which sees news and content from each magazine appear in the other. See also Notes References External links |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Black_hole#cite_note-183] | [TOKENS: 13839] |
Contents Black hole A black hole is an astronomical body so compact that its gravity prevents anything, including light, from escaping. Albert Einstein's theory of general relativity predicts that a sufficiently compact mass will form a black hole. The boundary of no escape is called the event horizon. In general relativity, a black hole's event horizon seals an object's fate but produces no locally detectable change when crossed. General relativity also predicts that every black hole should have a central singularity, where the curvature of spacetime is infinite. In many ways, a black hole acts like an ideal black body, as it reflects no light. Quantum field theory in curved spacetime predicts that event horizons emit Hawking radiation, with the same spectrum as a black body of a temperature inversely proportional to its mass. This temperature is of the order of billionths of a kelvin for stellar black holes, making it essentially impossible to observe directly. Objects whose gravitational fields are too strong for light to escape were first considered in the 18th century by John Michell and Pierre-Simon Laplace. In 1916, Karl Schwarzschild found the first modern solution of general relativity that would characterise a black hole. Due to his influential research, the Schwarzschild metric is named after him. David Finkelstein, in 1958, first interpreted Schwarzschild's model as a region of space from which nothing can escape. Black holes were long considered a mathematical curiosity; it was not until the 1960s that theoretical work showed they were a generic prediction of general relativity. The first known black hole was Cygnus X-1, identified by several researchers independently in 1971. Black holes typically form when massive stars collapse at the end of their life cycle. After a black hole has formed, it can grow by absorbing mass from its surroundings. 
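The inverse mass dependence of the Hawking temperature mentioned above is given by the standard formula for a Schwarzschild black hole (quoted here as a consistency check, not from the source text):

```latex
T_{\mathrm{H}} \;=\; \frac{\hbar c^{3}}{8 \pi G M k_{\mathrm{B}}}
\;\approx\; 6.2\times 10^{-8}\,\mathrm{K}\,\left(\frac{M_{\odot}}{M}\right)
```

so a stellar black hole of about ten solar masses radiates at roughly 6 × 10⁻⁹ K, the "billionths of a kelvin" quoted above, far colder than the 2.7 K cosmic microwave background and hence effectively unobservable.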
Supermassive black holes of millions of solar masses may form by absorbing other stars and merging with other black holes, or via direct collapse of gas clouds. There is consensus that supermassive black holes exist in the centres of most galaxies. The presence of a black hole can be inferred through its interaction with other matter and with electromagnetic radiation such as visible light. Matter falling toward a black hole can form an accretion disk of infalling plasma, heated by friction and emitting light. In extreme cases, this creates a quasar, among the brightest objects in the universe. Merging black holes can also be detected by observation of the gravitational waves they emit. If other stars are orbiting a black hole, their orbits can be used to determine the black hole's mass and location. Such observations can be used to exclude possible alternatives such as neutron stars. In this way, astronomers have identified numerous stellar black hole candidates in binary systems and established that the radio source known as Sagittarius A*, at the core of the Milky Way galaxy, contains a supermassive black hole of about 4.3 million solar masses. History The idea of a body so massive that even light could not escape was first proposed in the late 18th century by English astronomer and clergyman John Michell and independently by French scientist Pierre-Simon Laplace. Both scholars proposed very large stars, in contrast to the modern concept of an extremely dense object. In a short part of a letter published in 1784, Michell calculated that a star with the same density as the sun but 500 times its radius would not let any emitted light escape; the surface escape velocity would exceed the speed of light. Michell correctly hypothesized that such supermassive but non-radiating bodies might be detectable through their gravitational effects on nearby visible bodies. 
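Michell's figure of 500 solar radii can be recovered from the Newtonian escape velocity. For a body of uniform density ρ, the mass grows as the cube of the radius, so the escape velocity grows linearly with radius:

```latex
v_{\mathrm{esc}} = \sqrt{\frac{2GM}{r}}, \qquad
M = \tfrac{4}{3}\pi \rho r^{3}
\;\;\Longrightarrow\;\;
v_{\mathrm{esc}} = r\,\sqrt{\tfrac{8}{3}\pi G \rho} \;\propto\; r
```

Since the Sun's surface escape velocity is about 618 km/s, a body of solar density reaches the speed of light at r ≈ (299,792 / 618) r⊙ ≈ 485 r⊙, close to Michell's round figure of 500 solar radii.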
In 1796, Laplace mentioned that a star could be invisible if it were sufficiently large while speculating on the origin of the Solar System in his book Exposition du Système du Monde. Franz Xaver von Zach asked Laplace for a mathematical analysis, which Laplace provided and published in a journal edited by von Zach. In 1905, Albert Einstein showed that the laws of electromagnetism would be invariant under a Lorentz transformation: they would be identical for observers travelling at different velocities relative to each other. This discovery became known as the principle of special relativity. Although the laws of mechanics had already been shown to be invariant, gravity had yet to be included. In 1907, Einstein published a paper proposing his equivalence principle, the hypothesis that inertial mass and gravitational mass have a common cause. Using the principle, Einstein predicted the redshift and half of the lensing effect of gravity on light; the full prediction of gravitational lensing required the development of general relativity. By 1915, Einstein had refined these ideas into his general theory of relativity, which explained how matter affects spacetime, which in turn affects the motion of other matter. This formed the basis for black hole physics. Only a few months after Einstein published the field equations describing general relativity, astrophysicist Karl Schwarzschild set out to apply the idea to stars. He assumed spherical symmetry with no spin and found a solution to Einstein's equations. A few months after Schwarzschild, Johannes Droste, a student of Hendrik Lorentz, independently gave the same solution. At a certain radius from the center of the mass, the Schwarzschild solution became singular, meaning that some of the terms in the Einstein equations became infinite. The nature of this radius, which later became known as the Schwarzschild radius, was not understood at the time. 
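The critical radius at which the Schwarzschild solution becomes singular can be written explicitly; for a mass M it is the standard expression

```latex
r_{\mathrm{s}} = \frac{2GM}{c^{2}}
\;\approx\; 2.95\,\mathrm{km}\,\left(\frac{M}{M_{\odot}}\right)
```

so the Sun would have to be compressed to a radius of about 3 km to fit inside its own event horizon, which conveys how extreme the densities involved are.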
Many physicists of the early 20th century were skeptical of the existence of black holes. In a 1926 popular science book, Arthur Eddington critiqued the idea of a star with mass compressed to its Schwarzschild radius as a flaw in the then-poorly-understood theory of general relativity.: 134 In 1939, Einstein himself used his theory of general relativity in an attempt to prove that black holes were impossible. His work relied on increasing pressure or increasing centrifugal force balancing the force of gravity so that the object would not collapse beyond its Schwarzschild radius. He missed the possibility that implosion would drive the system below this critical value.: 135 By the 1920s, astronomers had classified a number of white dwarf stars as too cool and dense to be explained by the gradual cooling of ordinary stars. In 1926, Ralph Fowler showed that quantum-mechanical degeneracy pressure was larger than thermal pressure at these densities.: 145 In 1931, Subrahmanyan Chandrasekhar calculated that a non-rotating body of electron-degenerate matter below a certain limiting mass is stable, and by 1934 he showed that this explained the catalog of white dwarf stars.: 151 When Chandrasekhar announced his results, Eddington pointed out that stars above this limit would radiate until they were sufficiently dense to prevent light from exiting, a conclusion he considered absurd. Eddington and, later, Lev Landau argued that some yet unknown mechanism would stop the collapse. In the 1930s, Fritz Zwicky and Walter Baade studied stellar novae, focusing on exceptionally bright ones they called supernovae. Zwicky promoted the idea that supernovae produced stars with the density of atomic nuclei—neutron stars—but this idea was largely ignored.: 171 In 1939, based on Chandrasekhar's reasoning, J. 
Robert Oppenheimer and George Volkoff predicted that neutron stars below a certain mass limit, later called the Tolman–Oppenheimer–Volkoff limit, would be stable due to neutron degeneracy pressure. Above that limit, they reasoned that either their model would not apply or that gravitational contraction would not stop.: 380 John Archibald Wheeler and two of his students resolved questions about the model behind the Tolman–Oppenheimer–Volkoff (TOV) limit. Harrison and Wheeler developed the equations of state relating density to pressure for cold matter all the way through electron degeneracy and neutron degeneracy. Masami Wakano and Wheeler then used the equations to compute the equilibrium curve for stars, relating mass to circumference. They found no additional features that would invalidate the TOV limit. This meant that the only thing that could prevent black holes from forming was a dynamic process ejecting sufficient mass from a star as it cooled.: 205 The modern concept of black holes was formulated by Robert Oppenheimer and his student Hartland Snyder in 1939.: 80 In the paper, Oppenheimer and Snyder solved Einstein's equations of general relativity for an idealized imploding star, in a model later called the Oppenheimer–Snyder model, then described the results from far outside the star. The implosion starts as one might expect: the star material rapidly collapses inward. However, as the density of the star increases, gravitational time dilation increases and the collapse, viewed from afar, seems to slow down further and further until the star reaches its Schwarzschild radius, where it appears frozen in time.: 217 In 1958, David Finkelstein identified the Schwarzschild surface as an event horizon, calling it "a perfect unidirectional membrane: causal influences can cross it in only one direction". In this sense, events that occur inside of the black hole cannot affect events that occur outside of the black hole. 
Finkelstein created a new reference frame to include the point of view of infalling observers.: 103 Finkelstein's new frame of reference allowed events at the surface of an imploding star to be related to events far away. By 1962 the two points of view were reconciled, convincing many skeptics that implosion into a black hole made physical sense.: 226 The era from the mid-1960s to the mid-1970s was the "golden age of black hole research", when general relativity and black holes became mainstream subjects of research.: 258 In this period, more general black hole solutions were found. In 1963, Roy Kerr found the exact solution for a rotating black hole. Two years later, Ezra Newman found the axisymmetric solution for a black hole that is both rotating and electrically charged. In 1967, Werner Israel found that the Schwarzschild solution was the only possible solution for a nonspinning, uncharged black hole, meaning that a Schwarzschild black hole would be defined by its mass alone. Similar uniqueness results were later found for Reissner–Nordström and Kerr black holes, defined only by their mass and their charge or spin respectively. Together, these findings became known as the no-hair theorem, which states that a stationary black hole is completely described by the three parameters of the Kerr–Newman metric: mass, angular momentum, and electric charge. At first, it was suspected that the strange mathematical singularities found in each of the black hole solutions only appeared due to the assumption that a black hole would be perfectly spherically symmetric, and therefore the singularities would not appear in generic situations where black holes would not necessarily be symmetric. This view was held in particular by Vladimir Belinski, Isaak Khalatnikov, and Evgeny Lifshitz, who tried to prove that no singularities appear in generic solutions, although they would later reverse their positions.
However, in 1965, Roger Penrose proved that general relativity without quantum mechanics requires that singularities appear in all black holes. Astronomical observations also made great strides during this era. In 1967, Antony Hewish and Jocelyn Bell Burnell discovered pulsars and by 1969, these were shown to be rapidly rotating neutron stars. Until that time, neutron stars, like black holes, were regarded as just theoretical curiosities, but the discovery of pulsars showed their physical relevance and spurred a further interest in all types of compact objects that might be formed by gravitational collapse. Based on observations in Greenwich and Toronto in the early 1970s, Cygnus X-1, a galactic X-ray source discovered in 1964, became the first astronomical object commonly accepted to be a black hole. Work by James Bardeen, Jacob Bekenstein, Carter, and Hawking in the early 1970s led to the formulation of black hole thermodynamics. These laws describe the behaviour of a black hole in close analogy to the laws of thermodynamics by relating mass to energy, area to entropy, and surface gravity to temperature. The analogy was completed: 442 when Hawking, in 1974, showed that quantum field theory implies that black holes should radiate like a black body with a temperature proportional to the surface gravity of the black hole, predicting the effect now known as Hawking radiation. While Cygnus X-1, a stellar-mass black hole, was generally accepted by the scientific community as a black hole by the end of 1973, it would be decades before a supermassive black hole would gain the same broad recognition. Although, as early as the 1960s, physicists such as Donald Lynden-Bell and Martin Rees had suggested that powerful quasars in the center of galaxies were powered by accreting supermassive black holes, little observational proof existed at the time. 
However, the Hubble Space Telescope, launched decades later, found that supermassive black holes were not only present in these active galactic nuclei, but were ubiquitous in the centers of galaxies: almost every galaxy had a supermassive black hole at its center, many of which were quiescent. In 1999, David Merritt proposed the M–sigma relation, which related the dispersion of the velocity of matter in the central bulge of a galaxy to the mass of the supermassive black hole at its core. Subsequent studies confirmed this correlation. Around the same time, based on telescope observations of the velocities of stars at the center of the Milky Way galaxy, independent groups led by Andrea Ghez and Reinhard Genzel concluded that the compact radio source in the center of the galaxy, Sagittarius A*, was likely a supermassive black hole. On 11 February 2016, the LIGO Scientific Collaboration and Virgo Collaboration announced the first direct detection of gravitational waves, named GW150914, representing the first observation of a black hole merger. At the time of the merger, the black holes were approximately 1.4 billion light-years away from Earth and had masses of 30 and 35 solar masses.: 6 In 2017, Rainer Weiss, Kip Thorne, and Barry Barish, who had spearheaded the project, were awarded the Nobel Prize in Physics for their work. Since the initial discovery in 2015, hundreds more gravitational-wave events have been observed by LIGO and another interferometer, Virgo. On 10 April 2019, the first direct image of a black hole and its vicinity was published, following observations made by the Event Horizon Telescope (EHT) in 2017 of the supermassive black hole in Messier 87's galactic centre. In 2022, the Event Horizon Telescope collaboration released an image of the black hole in the center of the Milky Way galaxy, Sagittarius A*; the data had been collected in 2017. In 2020, the Nobel Prize in Physics was awarded for work on black holes.
Andrea Ghez and Reinhard Genzel shared one-half for their discovery that Sagittarius A* is a supermassive black hole. Penrose received the other half for his work showing that the mathematics of general relativity requires the formation of black holes. Cosmologists lamented that Hawking's extensive theoretical work on black holes would not be honored, since he died in 2018. In December 1967, a student reportedly suggested the phrase "black hole" at a lecture by John Wheeler; Wheeler adopted the term for its brevity and "advertising value", and Wheeler's stature in the field ensured it quickly caught on, leading some to credit Wheeler with coining the phrase. However, the term was used by others around that time. Science writer Marcia Bartusiak traces the term "black hole" to physicist Robert H. Dicke, who in the early 1960s reportedly compared the phenomenon to the Black Hole of Calcutta, notorious as a prison where people entered but never left alive. The term was used in print by Life and Science News magazines in 1963, and by science journalist Ann Ewing in her article "'Black Holes' in Space", dated 18 January 1964, which was a report on a meeting of the American Association for the Advancement of Science held in Cleveland, Ohio.

Definition

A black hole is generally defined as a region of spacetime from which no information-carrying signals or objects can escape. However, verifying that an object is a black hole by this definition would require waiting for an infinite time, at an infinite distance from the black hole, to confirm that nothing has escaped; the definition therefore cannot be used to identify a physical black hole. Broadly, physicists do not have a precisely agreed-upon definition of a black hole. Among astrophysicists, a black hole is a compact object with a mass larger than about four solar masses. A black hole may also be defined as a reservoir of information: 142 or a region where space is falling inwards faster than the speed of light.
Properties

The no-hair theorem postulates that, once it achieves a stable condition after formation, a black hole has only three independent physical properties: mass, electric charge, and angular momentum; the black hole is otherwise featureless. If the conjecture is true, any two black holes that share the same values for these properties, or parameters, are indistinguishable from one another. The degree to which the conjecture is true for real black holes is currently an unsolved problem. The simplest static black holes have mass but neither electric charge nor angular momentum. According to Birkhoff's theorem, these Schwarzschild black holes are the only vacuum solution that is spherically symmetric. Solutions describing more general black holes also exist. Non-rotating charged black holes are described by the Reissner–Nordström metric, while the Kerr metric describes a non-charged rotating black hole. The most general stationary black hole solution known is the Kerr–Newman metric, which describes a black hole with both charge and angular momentum. Contrary to the popular notion of a black hole "sucking in everything" in its surroundings, from far away, the external gravitational field of a black hole is identical to that of any other body of the same mass. While a black hole can theoretically have any positive mass, the charge and angular momentum are constrained by the mass. The total electric charge Q and the total angular momentum J are expected to satisfy the inequality

Q²/(4πε₀) + c²J²/(GM²) ≤ GM²

for a black hole of mass M. Black holes with the maximum possible charge or spin satisfying this inequality are called extremal black holes. Solutions of Einstein's equations that violate this inequality exist, but they do not possess an event horizon.
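The inequality above can be evaluated numerically. The following is a minimal sketch (SI units; the function name and the ten-solar-mass example are illustrative, not from the source), showing that for an uncharged hole the bound reduces to the Kerr spin limit J ≤ GM²/c:

```python
import math

# Physical constants (SI units, rounded)
G = 6.674e-11      # gravitational constant
c = 2.998e8        # speed of light, m/s
EPS0 = 8.854e-12   # vacuum permittivity
M_SUN = 1.989e30   # solar mass, kg

def is_sub_extremal(mass, charge=0.0, ang_momentum=0.0):
    """True if Q^2/(4*pi*eps0) + c^2*J^2/(G*M^2) <= G*M^2 holds."""
    lhs = charge**2 / (4 * math.pi * EPS0) + (c * ang_momentum)**2 / (G * mass**2)
    return lhs <= G * mass**2

# An uncharged hole spinning at 90% of the Kerr limit has a horizon;
# one 10% over the limit would be a naked singularity instead.
M = 10 * M_SUN
J_max = G * M**2 / c
print(is_sub_extremal(M, ang_momentum=0.9 * J_max))  # within the bound
print(is_sub_extremal(M, ang_momentum=1.1 * J_max))  # violates it: no horizon
```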
These are so-called naked singularities that can be observed from the outside. Because these singularities would make the universe inherently unpredictable, many physicists believe they cannot exist. The weak cosmic censorship hypothesis, proposed by Roger Penrose, rules out the formation of such singularities when they are created through the gravitational collapse of realistic matter. However, this hypothesis has not yet been proven, and some physicists believe that naked singularities could exist. It is also unknown whether black holes could even become extremal, forming naked singularities, since natural processes counteract increasing spin and charge when a black hole becomes near-extremal. The total mass of a black hole can be estimated by analyzing the motion of objects near the black hole, such as stars or gas. All black holes spin, often rapidly: one stellar-mass black hole, GRS 1915+105, has been estimated to spin at over 1,000 revolutions per second. The Milky Way's central black hole, Sagittarius A*, rotates at about 90% of the maximum rate. The spin rate can be inferred from measurements of atomic spectral lines in the X-ray range. As gas near the black hole plunges inward, high-energy X-ray emission from electron-positron pairs illuminates the gas further out, appearing red-shifted due to relativistic effects. Depending on the spin of the black hole, this plunge happens at different radii from the hole, with different degrees of redshift. Astronomers can use the gap between the X-ray emission of the outer disk and the redshifted emission from plunging material to determine the spin of the black hole. A newer way to estimate spin is based on the temperature of gases accreting onto the black hole. The method requires an independent measurement of the black hole mass and inclination angle of the accretion disk, followed by computer modeling.
Gravitational waves from coalescing binary black holes can also provide the spin of both progenitor black holes and the merged hole, but such events are rare. A spinning black hole has angular momentum. The supermassive black hole in the center of the Messier 87 (M87) galaxy appears to have an angular momentum very close to the maximum theoretical value. That uncharged limit is

J ≤ GM²/c,

allowing definition of a dimensionless spin magnitude such that

0 ≤ cJ/(GM²) ≤ 1.

Most black holes are believed to have an approximately neutral charge. For example, Michal Zajaček, Arman Tursunov, Andreas Eckart, and Silke Britzen found the electric charge of Sagittarius A* to be at least ten orders of magnitude below the theoretical maximum. A charged black hole repels other like charges just like any other charged object. If a black hole were to become charged, particles with an opposite sign of charge would be pulled in by the extra electromagnetic force, while particles with the same sign of charge would be repelled, neutralizing the black hole. This effect may not be as strong if the black hole is also spinning. The presence of charge can reduce the diameter of the black hole by up to 38%. The charge Q for a nonspinning black hole is bounded by

Q ≤ √G M,

where G is the gravitational constant and M is the black hole's mass.

Classification

Black holes can have a wide range of masses. The minimum mass of a black hole formed by stellar gravitational collapse is governed by the maximum mass of a neutron star and is believed to be approximately two to four solar masses. However, theoretical primordial black holes, believed to have formed soon after the Big Bang, could be far smaller, with masses as little as 10⁻⁵ grams at formation. These very small black holes are sometimes called micro black holes.
Black holes formed by stellar collapse are called stellar black holes. Estimates of their maximum mass at formation vary, but generally range from 10 to 100 solar masses, with higher estimates for black holes formed from low-metallicity progenitor stars. The mass of a black hole formed via a supernova has a lower bound: if the progenitor star is too small, the collapse may be stopped by the degeneracy pressure of the star's constituents, allowing the condensation of matter into an exotic denser state. Degeneracy pressure arises from the Pauli exclusion principle: identical fermions resist being forced into the same quantum state. Smaller progenitor stars, with masses less than about 8 M☉, will be held together by the degeneracy pressure of electrons and will become white dwarfs. For more massive progenitor stars, electron degeneracy pressure is no longer strong enough to resist the force of gravity, and the star will instead be held together by neutron degeneracy pressure, which can occur at much higher densities, forming a neutron star. If the star is still too massive, even neutron degeneracy pressure will not be able to resist the force of gravity, and the star will collapse into a black hole.: 5.8 Stellar black holes can also gain mass via accretion of nearby matter, often from a companion object such as a star. Black holes that are larger than stellar black holes but smaller than supermassive black holes are called intermediate-mass black holes, with masses of approximately 10² to 10⁵ solar masses. These black holes seem to be rarer than their stellar and supermassive counterparts, with relatively few candidates having been observed. Physicists have speculated that such black holes may form from collisions in globular and star clusters or at the center of low-mass galaxies. They may also form as the result of mergers of smaller black holes, with several LIGO observations finding merged black holes within the 110–350 solar mass range.
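The approximate mass ranges above lend themselves to a simple classifier. This is an illustrative sketch only: the boundary values (roughly 100 M☉ between stellar and intermediate-mass, roughly 10⁵ M☉ at the top of the intermediate range) are the rough figures quoted in this section, and the real categories overlap in the literature.

```python
M_SUN = 1.989e30  # solar mass, kg

def classify_black_hole(mass_kg):
    """Rough mass-based class, using the approximate boundaries quoted above."""
    m = mass_kg / M_SUN
    if m < 1e2:
        return "stellar"
    elif m < 1e5:
        return "intermediate-mass"
    else:
        return "supermassive"

print(classify_black_hole(30 * M_SUN))     # stellar
print(classify_black_hole(2e3 * M_SUN))    # intermediate-mass
print(classify_black_hole(4.3e6 * M_SUN))  # supermassive, e.g. Sagittarius A*
```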
The black holes with the largest masses are called supermassive black holes, with masses more than 10⁶ times that of the Sun. These black holes are believed to exist at the centers of almost every large galaxy, including the Milky Way. Some scientists have proposed a subcategory of even larger black holes, called ultramassive black holes, with masses greater than 10⁹–10¹⁰ solar masses. Theoretical models predict that the accretion disc that feeds black holes will become unstable once a black hole reaches 50–100 billion times the mass of the Sun, setting a rough upper limit to black hole mass.

Structure

While black holes are conceptually invisible sinks of all matter and light, in astronomical settings their enormous gravity alters the motion of surrounding objects and pulls nearby gas inwards at near-light speed, making the regions around some black holes among the brightest objects in the universe. Some black holes have relativistic jets: thin streams of plasma travelling away from the black hole at more than one-tenth of the speed of light. A small fraction of the matter falling towards the black hole is accelerated away along the hole's rotation axis. These jets can extend as far as millions of parsecs from the black hole itself. Black holes of any mass can have jets. However, they are typically observed around spinning black holes with strongly magnetized accretion disks. Relativistic jets were more common in the early universe, when galaxies and their corresponding supermassive black holes were rapidly gaining mass. All black holes with jets also have an accretion disk, but the jets are usually brighter than the disk. Quasars, typically found in other galaxies, are believed to be supermassive black holes with jets; microquasars are believed to be stellar-mass objects with jets, typically observed in the Milky Way. The mechanism of formation of jets is not yet known, but several options have been proposed.
One method proposed to fuel these jets is the Blandford–Znajek process, which suggests that the dragging of magnetic field lines by a black hole's rotation could launch jets of matter into space. The Penrose process, which involves extraction of a black hole's rotational energy, has also been proposed as a potential mechanism of jet propulsion. Due to conservation of angular momentum, gas falling into the gravitational well created by a massive object will typically form a disk-like structure around the object.: 242 As the disk's angular momentum is transferred outward by internal processes, its matter falls farther inward, converting its gravitational energy into heat and releasing a large flux of X-rays. The temperature of these disks can range from thousands to millions of kelvin, and temperatures can differ throughout a single accretion disk. Accretion disks can also emit in other parts of the electromagnetic spectrum, depending on the disk's turbulence and magnetization and the black hole's mass and angular momentum. Accretion disks can be defined as geometrically thin or geometrically thick. Geometrically thin disks are mostly confined to the black hole's equatorial plane and have a well-defined edge at the innermost stable circular orbit (ISCO), while geometrically thick disks are supported by internal pressure and temperature and can extend inside the ISCO. Disks with high rates of electron scattering and absorption, appearing bright and opaque, are called optically thick; optically thin disks are more translucent and produce fainter images when viewed from afar. Accretion disks of black holes accreting beyond the Eddington limit are often referred to as Polish doughnuts, due to their thick, toroidal shape resembling that of a doughnut. Quasar accretion disks are expected to usually appear blue in color. The disk for a stellar black hole, on the other hand, would likely look orange, yellow, or red, with its inner regions being the brightest.
Theoretical research suggests that the hotter a disk is, the bluer it should be, although this is not always supported by observations of real astronomical objects. Accretion disk colors may also be altered by the Doppler effect, with the part of the disk travelling towards an observer appearing bluer and brighter and the part of the disk travelling away from the observer appearing redder and dimmer. In Newtonian gravity, test particles can stably orbit at arbitrary distances from a central object. In general relativity, however, there exists a smallest possible radius at which a massive particle can orbit stably. Any infinitesimal inward perturbation to this orbit will lead to the particle spiraling into the black hole, and any outward perturbation will, depending on the energy, cause the particle to spiral in, move to a stable orbit further from the black hole, or escape to infinity. This orbit is called the innermost stable circular orbit, or ISCO. The location of the ISCO depends on the spin of the black hole and the spin of the particle itself. In the case of a Schwarzschild black hole (spin zero) and a particle without spin, the location of the ISCO is

r_ISCO = 3 r_s = 6GM/c²,

where r_ISCO is the radius of the ISCO, r_s is the Schwarzschild radius of the black hole, G is the gravitational constant, and c is the speed of light. The radius of this orbit changes slightly based on particle spin. For charged black holes, the ISCO moves inwards. For spinning black holes, the ISCO is moved inwards for particles orbiting in the same direction that the black hole is spinning (prograde) and outwards for particles orbiting in the opposite direction (retrograde).
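The Schwarzschild-case formula r_ISCO = 6GM/c² gives a quick sense of scale. A small Python sketch (SI constants; the Sagittarius A* mass is the ~4.3 million M☉ figure quoted earlier in the article):

```python
G = 6.674e-11      # gravitational constant, SI
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def isco_radius(mass):
    """ISCO radius for a non-spinning black hole: r_ISCO = 3 r_s = 6GM/c^2 (metres)."""
    return 6 * G * mass / c**2

# For Sagittarius A* the ISCO works out to roughly a quarter of an
# astronomical unit, despite the enormous mass.
m_sgr_a_star = 4.3e6 * M_SUN
r = isco_radius(m_sgr_a_star)
print(f"Sgr A* ISCO: {r / 1e9:.0f} million km")
```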
For example, the ISCO for a particle orbiting retrograde around a maximally spinning black hole can be as far out as 9GM/c² (4.5 r_s), while the ISCO for a particle orbiting prograde can be as close as the event horizon itself. The photon sphere is a spherical boundary on which photons moving tangentially to the sphere are bent completely around the black hole, possibly orbiting multiple times. Light rays with impact parameters less than the radius of the photon sphere enter the black hole. For Schwarzschild black holes, the photon sphere has a radius 1.5 times the Schwarzschild radius; for non-Schwarzschild black holes, the radius of the photon sphere is at least 1.5 times the radius of the event horizon. When viewed from a great distance, the photon sphere creates an observable black hole shadow. Since no light emerges from within the black hole, this shadow is the limit for possible observations.: 152 The shadow of colliding black holes should have characteristic warped shapes, allowing scientists to detect black holes that are about to merge. While light can still escape from the photon sphere, any light that crosses the photon sphere on an inbound trajectory will be captured by the black hole. Therefore, any light that reaches an outside observer from the photon sphere must have been emitted by objects between the photon sphere and the event horizon. Light emitted towards the photon sphere may also curve around the black hole and return to the emitter. For a rotating, uncharged black hole, the radius of the photon sphere depends on the spin parameter and on whether the photon orbits prograde or retrograde: a prograde photon orbit lies between 1 and 3 gravitational radii (GM/c²) from the center of the black hole, while a retrograde photon orbit lies between 3 and 4 gravitational radii. The exact location of the photon sphere depends on the magnitude of the black hole's rotation.
For a charged, nonrotating black hole, there will only be one photon sphere, and the radius of the photon sphere will decrease with increasing black hole charge. For non-extremal, charged, rotating black holes, there will always be two photon spheres, with the exact radii depending on the parameters of the black hole. Near a rotating black hole, spacetime rotates like a vortex. The rotating spacetime will drag any matter and light into rotation around the spinning black hole. This effect of general relativity, called frame dragging, gets stronger closer to the spinning mass. The region of spacetime in which it is impossible to stay still is called the ergosphere. The ergosphere of a black hole is a volume bounded by the black hole's event horizon and the ergosurface, which coincides with the event horizon at the poles but bulges out from it around the equator. Matter and radiation can escape from the ergosphere. Through the Penrose process, objects can emerge from the ergosphere with more energy than they entered with. The extra energy is taken from the rotational energy of the black hole, slowing down the rotation of the black hole.: 268 A variation of the Penrose process in the presence of strong magnetic fields, the Blandford–Znajek process, is considered a likely mechanism for the enormous luminosity and relativistic jets of quasars and other active galactic nuclei. The observable region of spacetime around a black hole closest to its event horizon is called the plunging region. In this area it is no longer possible for free-falling matter to follow circular orbits or halt its final descent into the black hole. Instead, it will rapidly plunge toward the black hole at close to the speed of light, growing increasingly hot and producing a characteristic, detectable thermal emission. However, light and radiation emitted from this region can still escape from the black hole's gravitational pull.
For a nonspinning, uncharged black hole, the radius of the event horizon, or Schwarzschild radius, is proportional to the mass M through

r_s = 2GM/c² ≈ 2.95 (M/M☉) km,

where r_s is the Schwarzschild radius and M☉ is the mass of the Sun.: 124 For a black hole with nonzero spin or electric charge, the radius is smaller,[Note 1] until an extremal black hole could have an event horizon close to

r₊ = GM/c²,

half the radius of a nonspinning, uncharged black hole of the same mass. Since the volume within the Schwarzschild radius increases with the cube of the radius, the average density of a black hole inside its Schwarzschild radius is inversely proportional to the square of its mass: supermassive black holes are much less dense than stellar black holes. The average density of a 10⁸ M☉ black hole is comparable to that of water. The defining feature of a black hole is the existence of an event horizon, a boundary in spacetime through which matter and light can pass only inward, towards the center of the black hole. Nothing, not even light, can escape from inside the event horizon. The event horizon is referred to as such because if an event occurs within the boundary, information from that event cannot reach or affect an outside observer, making it impossible to determine whether such an event occurred.: 179 For non-rotating black holes, the geometry of the event horizon is precisely spherical, while for rotating black holes, the event horizon is oblate.
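The two scalings in this passage (the ~2.95 km-per-solar-mass horizon radius, and the inverse-square dependence of mean density on mass) can be checked directly. A Python sketch with SI constants; "density" here means mass over the Euclidean volume inside r_s, as in the text:

```python
import math

G = 6.674e-11      # gravitational constant, SI
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def schwarzschild_radius(mass):
    """r_s = 2GM/c^2, in metres."""
    return 2 * G * mass / c**2

def mean_density(mass):
    """Mass over the Euclidean volume inside r_s; scales as 1/M^2."""
    r = schwarzschild_radius(mass)
    return mass / (4.0 / 3.0 * math.pi * r**3)

print(f"r_s(1 M_sun) = {schwarzschild_radius(M_SUN) / 1000:.2f} km")
# A 10^8 solar-mass hole has a mean density within a factor of ~2 of water:
print(f"rho(1e8 M_sun) = {mean_density(1e8 * M_SUN):.0f} kg/m^3")
```

Because r_s grows linearly with mass while volume grows as the cube, doubling the mass cuts the mean density by a factor of four.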
To a distant observer, a clock near a black hole would appear to tick more slowly than one further from the black hole.: 217 This effect, known as gravitational time dilation, would also cause an object falling into a black hole to appear to slow as it approached the event horizon, never quite reaching the horizon from the perspective of an outside observer.: 218 All processes on this object would appear to slow down, and any light emitted by the object to appear redder and dimmer, an effect known as gravitational redshift. An object falling from half of a Schwarzschild radius above the event horizon would fade away until it could no longer be seen, disappearing from view within one hundredth of a second. It would also appear to flatten onto the black hole, joining all other material that had ever fallen into the hole. On the other hand, an observer falling into a black hole would not notice any of these effects as they cross the event horizon. Their own clocks appear to them to tick normally, and they cross the event horizon after a finite time without noting any singular behaviour. In general relativity, it is impossible to determine the location of the event horizon from local observations, due to Einstein's equivalence principle.: 222 Black holes that are rotating and/or charged have an inner horizon, often called the Cauchy horizon, inside of the black hole. The inner horizon is divided up into two segments: an ingoing section and an outgoing section. At the ingoing section of the Cauchy horizon, radiation and matter that fall into the black hole would build up at the horizon, causing the curvature of spacetime to go to infinity. This would cause an observer falling in to experience tidal forces. This phenomenon is often called mass inflation, since it is associated with a parameter dictating the black hole's internal mass growing exponentially, and the buildup of tidal forces is called the mass-inflation singularity or Cauchy horizon singularity. 
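The slowing of clocks described at the start of this passage follows, for a static observer outside a non-rotating black hole, the standard Schwarzschild dilation factor √(1 − r_s/r). This formula is not quoted in the text above but is the usual textbook form; the sketch below evaluates it, including at the text's example height of half a Schwarzschild radius above the horizon (r = 1.5 r_s):

```python
import math

def clock_rate(r_over_rs):
    """Ticking rate of a static clock at radius r (in units of r_s),
    relative to a clock far from a Schwarzschild black hole."""
    return math.sqrt(1.0 - 1.0 / r_over_rs)

# Far away the rate approaches 1; at the horizon (r -> r_s) it approaches 0.
for r in (100, 2, 1.5, 1.01):
    print(f"r = {r:>6} r_s  rate = {clock_rate(r):.3f}")
```

At r = 1.5 r_s the static clock runs at √(1/3) ≈ 58% of the far-away rate, and the rate falls steeply as the horizon is approached.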
Some physicists have argued that in realistic black holes, accretion and Hawking radiation would stop mass inflation from occurring. At the outgoing section of the inner horizon, infalling radiation would backscatter off of the black hole's spacetime curvature and travel outward, building up at the outgoing Cauchy horizon. This would cause an infalling observer to experience a gravitational shock wave and tidal forces as the spacetime curvature at the horizon grew to infinity. This buildup of tidal forces is called the shock singularity. Both of these singularities are weak, meaning that an object crossing them would only be deformed a finite amount by tidal forces, even though the spacetime curvature would still be infinite at the singularity. This contrasts with a strong singularity, where an object hitting the singularity would be stretched and squeezed by an infinite amount. They are also null singularities, meaning that a photon could travel parallel to them without ever being intercepted. Ignoring quantum effects, every black hole has a singularity inside, points where the curvature of spacetime becomes infinite, and geodesics terminate within a finite proper time.: 205 For a non-rotating black hole, this region takes the shape of a single point; for a rotating black hole it is smeared out to form a ring singularity that lies in the plane of rotation.: 264 In both cases, the singular region has zero volume. All of the mass of the black hole ends up in the singularity.: 252 Since the singularity has nonzero mass in an infinitely small space, it can be thought of as having infinite density. Observers falling into a Schwarzschild black hole (i.e., non-rotating and not charged) cannot avoid being carried into the singularity once they cross the event horizon. As they fall further into the black hole, they will be torn apart by the growing tidal forces in a process sometimes referred to as spaghettification or the noodle effect. 
Eventually, they will reach the singularity and be crushed into an infinitely small point.: 182 However, any perturbations, such as those caused by matter or radiation falling in, would cause space to oscillate chaotically near the singularity. Any matter falling in would experience intense tidal forces rapidly changing in direction, all while being compressed into an increasingly small volume. Alternative forms of general relativity, including the addition of some quantum effects, can lead to regular, or nonsingular, black holes without singularities. For example, the fuzzball model, based on string theory, states that black holes are actually made up of quantum microstates and need not have a singularity or an event horizon. The theory of loop quantum gravity proposes that the curvature and density at the center of a black hole is large, but not infinite. Formation Black holes are formed by gravitational collapse of massive stars, either by direct collapse or during a supernova explosion in a process called fallback. Black holes can result from the merger of two neutron stars or a neutron star and a black hole. Other more speculative mechanisms include primordial black holes created from density fluctuations in the early universe, the collapse of dark stars, a hypothetical object powered by annihilation of dark matter, or from hypothetical self-interacting dark matter. Gravitational collapse occurs when an object's internal pressure is insufficient to resist the object's own gravity. At the end of a star's life, it will run out of hydrogen to fuse, and will start fusing more and more massive elements, until it gets to iron. Since the fusion of elements heavier than iron would require more energy than it would release, nuclear fusion ceases. If the iron core of the star is too massive, the star will no longer be able to support itself and will undergo gravitational collapse. 
While most of the energy released during gravitational collapse is emitted very quickly, an outside observer does not actually see the end of this process. Even though the collapse takes a finite amount of time from the reference frame of infalling matter, a distant observer would see the infalling material slow and halt just above the event horizon, due to gravitational time dilation. Light from the collapsing material takes longer and longer to reach the observer, with the delay growing to infinity as the emitting material reaches the event horizon. Thus the external observer never sees the formation of the event horizon; instead, the collapsing material seems to become dimmer and increasingly red-shifted, eventually fading away. Observations of quasars at redshift z ∼ 7, less than a billion years after the Big Bang, have led to investigations of other ways to form black holes. The accretion process to build supermassive black holes has a limiting rate of mass accumulation, and a billion years is not enough time to reach quasar status. One suggestion is direct collapse of nearly pure hydrogen gas (low metallicity) clouds characteristic of the young universe, forming a supermassive star which collapses into a black hole. It has been suggested that seed black holes with typical masses of ~10⁵ M☉ could have formed in this way, which then could grow to ~10⁹ M☉. However, the very large amount of gas required for direct collapse is not typically stable to fragmentation to form multiple stars. Thus another approach suggests massive star formation followed by collisions that seed massive black holes which ultimately merge to create a quasar.: 85 A neutron star in a common envelope with a regular star can accrete sufficient material to collapse to a black hole, or two neutron stars can merge. These avenues for the formation of black holes are considered relatively rare. 
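The growth-rate limit can be made concrete. If a black hole accretes at the Eddington limit with radiative efficiency ε, its mass grows exponentially with an e-folding (Salpeter) time of ε/(1−ε) · σ_T c/(4πG m_p). The sketch below uses standard constants and assumes ε = 0.1 (an illustrative value, not from the article); it shows why a ~10⁵ M☉ direct-collapse seed can reach ~10⁹ M☉ within a billion years while a stellar-mass seed barely can:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_P = 1.673e-27      # proton mass, kg
SIGMA_T = 6.652e-29  # Thomson cross-section, m^2
YEAR = 3.156e7       # seconds per year

def salpeter_time_yr(epsilon=0.1):
    """e-folding time for Eddington-limited growth, in years."""
    return (epsilon / (1 - epsilon)) * SIGMA_T * c / (4 * math.pi * G * M_P) / YEAR

def growth_time_yr(m_seed, m_final, epsilon=0.1):
    """Time to grow from m_seed to m_final (any consistent mass units)."""
    return salpeter_time_yr(epsilon) * math.log(m_final / m_seed)

# A 1e5-solar-mass direct-collapse seed reaches 1e9 Msun comfortably
# inside a billion years; a 10-solar-mass stellar seed barely makes it.
print(growth_time_yr(1e5, 1e9) / 1e6)  # ~460 Myr
print(growth_time_yr(10, 1e9) / 1e6)   # ~920 Myr
```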
In the current epoch of the universe, conditions needed to form black holes are rare and are mostly only found in stars. However, in the early universe, conditions may have allowed for black hole formation via other means. Fluctuations of spacetime soon after the Big Bang may have formed areas that were denser than their surroundings. Initially, these regions would not have been compact enough to form a black hole, but eventually, the curvature of spacetime in the regions would become large enough to cause them to collapse into a black hole. Different models for the early universe vary widely in their predictions of the scale of these fluctuations. Various models predict the creation of primordial black holes ranging from a Planck mass (~2.2×10⁻⁸ kg) to hundreds of thousands of solar masses. Primordial black holes with masses less than 10¹⁵ g would have evaporated by now due to Hawking radiation. Despite the early universe being extremely dense, it did not re-collapse into a black hole during the Big Bang, since the universe was expanding rapidly and did not have the gravitational differential necessary for black hole formation. Models for the gravitational collapse of objects of relatively constant size, such as stars, do not necessarily apply in the same way to rapidly expanding space such as the Big Bang. In principle, black holes could be formed in high-energy particle collisions that achieve sufficient density, although no such events have been detected. These hypothetical micro black holes, which could form from the collision of cosmic rays and Earth's atmosphere or in particle accelerators like the Large Hadron Collider, would not be able to aggregate additional mass. Instead, they would evaporate in about 10⁻²⁵ seconds, posing no threat to the Earth. Evolution Black holes can also merge with other objects such as stars or even other black holes. 
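The steep mass dependence of evaporation explains why the survival threshold is so sharp. In the photon-only approximation the Hawking lifetime is t = 5120π G²M³/(ħc⁴), scaling as M³ (real lifetimes are somewhat shorter, since black holes also emit other particle species). A sketch with standard constants:

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8       # speed of light, m/s
HBAR = 1.055e-34  # reduced Planck constant, J s
YEAR = 3.156e7    # seconds per year

def hawking_lifetime_yr(mass_kg):
    """Photon-only evaporation time, 5120*pi*G^2*M^3 / (hbar*c^4), in years."""
    return 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * c**4) / YEAR

# The M^3 scaling spans an enormous range: a solar-mass hole would last
# ~2e67 years, while a Planck-mass one evaporates almost instantly.
print(hawking_lifetime_yr(1.989e30))  # ~2e67 yr
print(hawking_lifetime_yr(2.2e-8))    # ~1e-47 yr
```

Doubling the mass multiplies the lifetime by eight, which is why the boundary between long-lived and already-evaporated primordial black holes is so narrow in mass.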
This is thought to have been important, especially in the early growth of supermassive black holes, which could have formed from the aggregation of many smaller objects. The process has also been proposed as the origin of some intermediate-mass black holes. Mergers of supermassive black holes may take a long time: as two supermassive black holes in a binary approach each other, most nearby stars are ejected, leaving little for the remaining black holes to gravitationally interact with that would allow them to get closer to each other. This phenomenon has been called the final parsec problem, as the distance at which this happens is usually around one parsec. When a black hole accretes matter, the gas in the inner accretion disk orbits at very high speeds because of its proximity to the black hole. The resulting friction heats the inner disk to temperatures at which it emits vast amounts of electromagnetic radiation (mainly X-rays) detectable by telescopes. By the time the matter of the disk reaches the innermost stable circular orbit (ISCO), between 5.7% and 42% of its mass will have been converted to energy, depending on the black hole's spin. About 90% of this energy is released within about 20 black hole radii. In many cases, accretion disks are accompanied by relativistic jets that are emitted along the black hole's poles, which carry away much of the energy. The mechanism for the creation of these jets is currently not well understood, in part due to insufficient data. Many of the universe's most energetic phenomena have been attributed to the accretion of matter onto black holes. Active galactic nuclei and quasars are believed to be the accretion disks of supermassive black holes. X-ray binaries are generally accepted to be binary systems in which one of the two objects is a compact object accreting matter from its companion. Ultraluminous X-ray sources may be the accretion disks of intermediate-mass black holes. 
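The 5.7%–42% range follows from the binding energy at the ISCO: the specific energy of matter there is √(8/9) of its rest-mass energy for a nonrotating (Schwarzschild) hole, and 1/√3 for a maximally spinning (extremal Kerr) hole with a prograde disk. These are standard general-relativistic results, checked numerically here:

```python
import math

# Fraction of rest-mass energy released by matter spiraling in to the
# innermost stable circular orbit: efficiency = 1 - E_ISCO / (m c^2).
eta_schwarzschild = 1 - math.sqrt(8 / 9)   # nonrotating hole
eta_extremal_kerr = 1 - 1 / math.sqrt(3)   # maximal prograde spin

print(round(100 * eta_schwarzschild, 1))  # 5.7 (percent)
print(round(100 * eta_extremal_kerr, 1))  # 42.3 (percent)
```

For comparison, hydrogen fusion releases only about 0.7% of rest-mass energy, which is why accretion onto black holes powers the most luminous objects known.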
At a certain rate of accretion, the outward radiation pressure will become as strong as the inward gravitational force, and the black hole should be unable to accrete any faster. This limit is called the Eddington limit. However, many black holes accrete beyond this rate due to their non-spherical geometry or instabilities in the accretion disk. Accretion beyond the limit is called super-Eddington accretion and may have been commonplace in the early universe. Stars have been observed to get torn apart by tidal forces in the immediate vicinity of supermassive black holes in galaxy nuclei, in what is known as a tidal disruption event (TDE). Some of the material from the disrupted star forms an accretion disk around the black hole, which emits observable electromagnetic radiation. The correlation between the masses of supermassive black holes in the centres of galaxies with the velocity dispersion and mass of stars in their host bulges suggests that the formation of galaxies and the formation of their central black holes are related. Black hole winds from rapid accretion, particularly when the galaxy itself is still accreting matter, can compress gas nearby, accelerating star formation. However, if the winds become too strong, the black hole may blow nearly all of the gas out of the galaxy, quenching star formation. Black hole jets may also energize nearby cavities of plasma and eject low-entropy gas out of the galactic core, causing gas in galactic centers to be hotter than expected. If Hawking's theory of black hole radiation is correct, then black holes are expected to shrink and evaporate over time as they lose mass by the emission of photons and other particles. The temperature of this thermal spectrum (Hawking temperature) is proportional to the surface gravity of the black hole, which is inversely proportional to the mass. Hence, large black holes emit less radiation than small black holes. 
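The balance point is the Eddington luminosity, L_Edd = 4πGM m_p c/σ_T, at which radiation pressure on the disk's electrons (via Thomson scattering) equals gravity on the associated protons. A quick check with standard constants (a sketch, not from the article):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_P = 1.673e-27      # proton mass, kg
SIGMA_T = 6.652e-29  # Thomson cross-section, m^2
M_SUN = 1.989e30     # solar mass, kg
L_SUN = 3.828e26     # solar luminosity, W

def eddington_luminosity(mass_kg):
    """L_Edd = 4*pi*G*M*m_p*c / sigma_T, in watts."""
    return 4 * math.pi * G * mass_kg * M_P * c / SIGMA_T

# ~1.26e31 W, i.e. roughly 3e4 solar luminosities, per solar mass:
print(eddington_luminosity(M_SUN))
print(eddington_luminosity(M_SUN) / L_SUN)
```

Because L_Edd scales linearly with mass, a 10⁸ M☉ quasar can shine at ~10¹² L☉ without exceeding the limit.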
A stellar black hole of 1 M☉ has a Hawking temperature of 62 nanokelvins. This is far less than the 2.7 K temperature of the cosmic microwave background radiation. Stellar-mass or larger black holes receive more mass from the cosmic microwave background than they emit through Hawking radiation and thus will grow instead of shrinking. To have a Hawking temperature larger than 2.7 K (and be able to evaporate), a black hole would need a mass less than that of the Moon. Such a black hole would have a diameter of less than a tenth of a millimetre. The Hawking radiation for an astrophysical black hole is predicted to be very weak and would thus be exceedingly difficult to detect from Earth. A possible exception is the burst of gamma rays emitted in the last stage of the evaporation of primordial black holes. Searches for such flashes have proven unsuccessful and provide stringent limits on the possibility of existence of low mass primordial black holes, with modern research predicting that primordial black holes must make up less than a fraction of 10⁻⁷ of the universe's total mass. NASA's Fermi Gamma-ray Space Telescope, launched in 2008, has searched for these flashes, but has not yet found any. The properties of a black hole are constrained and interrelated by the theories that predict these properties. When based on general relativity, these relationships are called the laws of black hole mechanics. For a black hole that is not still forming or accreting matter, the zeroth law of black hole mechanics states that the black hole's surface gravity is constant across the event horizon. The first law relates changes in the black hole's surface area, angular momentum, and charge to changes in its energy. The second law says the surface area of a black hole never decreases on its own. Finally, the third law says that the surface gravity of a black hole is never zero. These laws are mathematical analogs of the laws of thermodynamics. 
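Both figures follow from the Hawking temperature, T = ħc³/(8πGMk_B), which is inversely proportional to mass. A sketch with standard constants:

```python
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
HBAR = 1.055e-34    # reduced Planck constant, J s
K_B = 1.381e-23     # Boltzmann constant, J/K
M_SUN = 1.989e30    # solar mass, kg
M_MOON = 7.35e22    # lunar mass, kg

def hawking_temperature(mass_kg):
    """T = hbar*c^3 / (8*pi*G*M*k_B), in kelvins."""
    return HBAR * c**3 / (8 * math.pi * G * mass_kg * K_B)

def mass_for_temperature(temp_k):
    """Invert the relation: the mass whose Hawking temperature is temp_k."""
    return HBAR * c**3 / (8 * math.pi * G * temp_k * K_B)

print(hawking_temperature(M_SUN) * 1e9)  # ~62 nanokelvins for 1 Msun
m_crit = mass_for_temperature(2.7)       # hotter than the CMB below this mass
print(m_crit, m_crit < M_MOON)           # ~4.5e22 kg, less than the Moon
```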
They are not equivalent, however, because, according to general relativity without quantum mechanics, a black hole can never emit radiation, and thus its temperature must always be zero.: 11 Quantum mechanics predicts that a black hole will continuously emit thermal Hawking radiation, and therefore must always have a nonzero temperature. It also predicts that all black holes have entropy which scales with their surface area. When quantum mechanics is accounted for, the laws of black hole mechanics become equivalent to the classical laws of thermodynamics. However, these conclusions are derived without a complete theory of quantum gravity, although many potential theories do predict black holes having entropy and temperature. Thus, the true quantum nature of black hole thermodynamics continues to be debated.: 29 Observational evidence Millions of black holes of around 30 solar masses, formed by stellar collapse, are expected to exist in the Milky Way. Even a dwarf galaxy like Draco should have hundreds. Only a few of these have been detected. By nature, black holes do not themselves emit any electromagnetic radiation other than the hypothetical Hawking radiation, so astrophysicists searching for black holes must generally rely on indirect observations. The defining characteristic of a black hole is its event horizon. The horizon itself cannot be imaged, so all other possible explanations for these indirect observations must be considered and eliminated before concluding that a black hole has been observed.: 11 The Event Horizon Telescope (EHT) is a global system of radio telescopes capable of directly observing a black hole shadow. The angular resolution of a telescope depends on its aperture and the wavelength it observes. Because the angular diameters of Sagittarius A* and Messier 87* in the sky are very small, a single telescope would need to be about the size of the Earth to clearly distinguish their horizons using radio wavelengths. 
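The diffraction limit θ ≈ λ/D makes the size requirement concrete. At the EHT's observing wavelength of about 1.3 mm, an aperture the diameter of the Earth resolves roughly 20 microarcseconds, just under the published shadow sizes of M87* (~42 μas) and Sagittarius A* (~52 μas). The calculation below is illustrative:

```python
import math

WAVELENGTH = 1.3e-3       # m, EHT observing wavelength
EARTH_DIAMETER = 1.27e7   # m, effective maximum baseline
RAD_TO_UAS = 180 / math.pi * 3600 * 1e6  # radians -> microarcseconds

def resolution_uas(wavelength_m, aperture_m):
    """Diffraction-limited angular resolution, theta ~ lambda / D, in uas."""
    return wavelength_m / aperture_m * RAD_TO_UAS

# An Earth-sized baseline at 1.3 mm resolves ~21 uas -- small enough to
# image the ~42 uas (M87*) and ~52 uas (Sgr A*) ring diameters.
print(resolution_uas(WAVELENGTH, EARTH_DIAMETER))
```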
By combining data from several different radio telescopes around the world, the Event Horizon Telescope creates an effective aperture with the diameter of the Earth. The EHT team used imaging algorithms to compute the most probable image from the data in its observations of Sagittarius A* and M87*. Gravitational-wave interferometry can be used to detect merging black holes and other compact objects. In this method, a laser beam is split down two long arms of a tunnel. The laser beams reflect off of mirrors in the tunnels and converge at the intersection of the arms, cancelling each other out. However, when a gravitational wave passes, it warps spacetime, changing the lengths of the arms themselves. Since each laser beam is now travelling a slightly different distance, they do not cancel out and produce a recognizable signal. Analysis of the signal can give scientists information about what caused the gravitational waves. Since gravitational waves are very weak, gravitational-wave observatories such as LIGO must have arms several kilometers long and carefully control for noise from Earth to be able to detect these gravitational waves. Since the first measurements in 2016, multiple gravitational waves from black holes have been detected and analyzed. The proper motions of stars near the centre of the Milky Way provide strong observational evidence that these stars are orbiting a supermassive black hole. Since 1995, astronomers have tracked the motions of 90 stars orbiting an invisible object coincident with the radio source Sagittarius A*. In 1998, by fitting the motions of the stars to Keplerian orbits, the astronomers were able to infer that Sagittarius A* must be a 2.6×10⁶ M☉ object contained within a radius of 0.02 light-years. Since then, one of the stars—called S2—has completed a full orbit. From the orbital data, astronomers were able to refine the calculations of the mass of Sagittarius A* to 4.3×10⁶ M☉, with a radius of less than 0.002 light-years. 
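The mass inference is essentially Kepler's third law, M = 4π²a³/(GT²), which in units of AU, years, and solar masses reduces to M = a³/T². Using approximate published orbital elements for S2 (semi-major axis ≈ 1030 AU, period ≈ 16.05 yr; the exact values here are assumptions for illustration):

```python
# Kepler's third law in solar units: a in AU, T in years, M in solar masses.
def central_mass_msun(a_au, period_yr):
    return a_au**3 / period_yr**2

# S2's orbit around Sagittarius A* (approximate published elements):
print(central_mass_msun(1030, 16.05))  # ~4.2e6 solar masses
```

The result lands close to the refined 4.3×10⁶ M☉ figure quoted above; the small difference reflects the rounded orbital elements used here.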
This upper limit radius is larger than the Schwarzschild radius for the estimated mass, so the combination does not prove Sagittarius A* is a black hole. Nevertheless, these observations strongly suggest that the central object is a supermassive black hole, as there are no other plausible scenarios for confining so much invisible mass into such a small volume. Additionally, there is some observational evidence that this object might possess an event horizon, a feature unique to black holes. The Event Horizon Telescope image of Sagittarius A*, released in 2022, provided further confirmation that it is indeed a black hole. X-ray binaries are binary systems that emit a majority of their radiation in the X-ray part of the electromagnetic spectrum. These X-ray emissions result when a compact object accretes matter from an ordinary star. The presence of an ordinary star in such a system provides an opportunity for studying the central object and for determining whether it might be a black hole. By measuring the orbital period of the binary, the distance to the binary from Earth, and the mass of the companion star, scientists can estimate the mass of the compact object. The Tolman-Oppenheimer-Volkoff limit (TOV limit) sets the largest mass a nonrotating neutron star can have, and is estimated to be about two solar masses. While a rotating neutron star can be slightly more massive, if the compact object is much more massive than the TOV limit, it cannot be a neutron star and is generally expected to be a black hole. The first strong candidate for a black hole, Cygnus X-1, was discovered in this way by Charles Thomas Bolton, Louise Webster, and Paul Murdin in 1972. Observations of rotational broadening of the optical star reported in 1986 led to a compact object mass estimate of 16 solar masses, with 7 solar masses as the lower bound. In 2011, this estimate was updated to 14.1±1.0 M☉ for the black hole and 19.2±1.9 M☉ for the optical stellar companion. 
X-ray binaries can be categorized as either low-mass or high-mass; this classification is based on the mass of the companion star, not the compact object itself. In a class of X-ray binaries called soft X-ray transients, the companion star is of relatively low mass, allowing for more accurate estimates of the black hole mass. These systems actively emit X-rays for only several months once every 10–50 years. During the period of low X-ray emission, called quiescence, the accretion disk is extremely faint, allowing detailed observation of the companion star. Numerous black hole candidates have been measured by this method. Black holes are also sometimes found in binaries with other compact objects, such as white dwarfs, neutron stars, and other black holes. The centre of nearly every galaxy contains a supermassive black hole. The close observational correlation between the mass of this hole and the velocity dispersion of the host galaxy's bulge, known as the M–sigma relation, strongly suggests a connection between the formation of the black hole and that of the galaxy itself. Astronomers use the term active galaxy to describe galaxies with unusual characteristics, such as unusual spectral line emission and very strong radio emission. Theoretical and observational studies have shown that the high levels of activity in the centers of these galaxies, regions called active galactic nuclei (AGN), may be explained by accretion onto supermassive black holes. These AGN consist of a central black hole that may be millions or billions of times more massive than the Sun, a disk of interstellar gas and dust called an accretion disk, and two jets perpendicular to the accretion disk. Although supermassive black holes are expected to be found in most AGN, only some galaxies' nuclei have been more carefully studied in attempts to both identify and measure the actual masses of the central supermassive black hole candidates. 
Some of the most notable galaxies with supermassive black hole candidates include the Andromeda Galaxy, Messier 32, Messier 87, the Sombrero Galaxy, and the Milky Way itself. Another way black holes can be detected is through observation of effects caused by their strong gravitational field. One such effect is gravitational lensing: the deformation of spacetime around a massive object causes light rays to be deflected, making objects behind it appear distorted. When the lensing object is a black hole, this effect can be strong enough to create multiple images of a star or other luminous source. However, the distance between the lensed images may be too small for contemporary telescopes to resolve—this phenomenon is called microlensing. Instead of seeing two images of a lensed star, astronomers see the star brighten slightly as the black hole moves towards the line of sight between the star and Earth and then return to its normal luminosity as the black hole moves away. The turn of the millennium saw the first three candidate detections of black holes in this way, and in January 2022, astronomers reported the first confirmed detection of a microlensing event from an isolated black hole. This was also the first determination of an isolated black hole mass, 7.1±1.3 M☉. Alternatives While there is a strong case for supermassive black holes, the model for stellar-mass black holes assumes an upper limit for the mass of a neutron star: objects observed to have more mass are assumed to be black holes. However, the properties of extremely dense matter are poorly understood. New exotic phases of matter could allow other kinds of massive objects. Quark stars would be made up of quark matter and supported by quark degeneracy pressure, a form of degeneracy pressure even stronger than neutron degeneracy pressure. This would halt gravitational collapse at a higher mass than for a neutron star. 
Still denser hypothetical stars, called electroweak stars, would convert quarks in their cores into leptons, providing additional pressure to stop the star from collapsing. If, as some extensions of the Standard Model posit, quarks and leptons are made up of the even-smaller fundamental particles called preons, a very compact star could be supported by preon degeneracy pressure. While none of these hypothetical models can explain all of the observations of stellar black hole candidates, a Q star is the only alternative which could significantly exceed the mass limit for neutron stars and thus provide an alternative for supermassive black holes.: 12 A few theoretical objects have been conjectured to match observations of astronomical black hole candidates identically or near-identically, but which function via a different mechanism. A dark energy star would convert infalling matter into vacuum energy; this vacuum energy would be much larger than the vacuum energy of outside space, exerting outwards pressure and preventing a singularity from forming. A black star would be gravitationally collapsing slowly enough that quantum effects would keep it just on the cusp of fully collapsing into a black hole. A gravastar would consist of a very thin shell and a dark-energy interior providing outward pressure to stop the collapse into a black hole or formation of a singularity; it could even have another gravastar inside, called a 'nestar'. Open questions According to the no-hair theorem, a black hole is defined by only three parameters: its mass, charge, and angular momentum. This seems to mean that all other information about the matter that went into forming the black hole is lost, as there is no way to determine anything about the black hole from outside other than those three parameters. When black holes were thought to persist forever, this information loss was not problematic, as the information can be thought of as existing inside the black hole. 
However, black holes slowly evaporate by emitting Hawking radiation. This radiation does not appear to carry any additional information about the matter that formed the black hole, meaning that this information is seemingly gone forever. This is called the black hole information paradox. Theoretical studies analyzing the paradox have led to both further paradoxes and new ideas about the intersection of quantum mechanics and general relativity. While there is no consensus on the resolution of the paradox, work on the problem is expected to be important for a theory of quantum gravity.: 126 Observations of faraway galaxies have found that ultraluminous quasars, powered by supermassive black holes, existed in the early universe as far back as redshift z ≥ 7. These black holes have been assumed to be the products of the gravitational collapse of large population III stars. However, these stellar remnants were not massive enough to produce the quasars observed at early times without accreting beyond the Eddington limit, the theoretical maximum rate of black hole accretion. Physicists have suggested a variety of different mechanisms by which these supermassive black holes may have formed. It has been proposed that smaller black holes may have also undergone mergers to produce the observed supermassive black holes. It is also possible that they were seeded by direct-collapse black holes, in which a large cloud of hot gas avoids fragmentation that would lead to multiple stars, due to low angular momentum or heating from a nearby galaxy. Given the right circumstances, a single supermassive star forms and collapses directly into a black hole without undergoing typical stellar evolution. Additionally, these supermassive black holes in the early universe may be high-mass primordial black holes, which could have accreted further matter in the centers of galaxies. 
Finally, certain mechanisms allow black holes to grow faster than the theoretical Eddington limit, such as dense gas in the accretion disk limiting the outward radiation pressure that would otherwise throttle accretion. However, the formation of bipolar jets may prevent super-Eddington rates. In fiction Black holes have been portrayed in science fiction in a variety of ways. Even before the advent of the term itself, objects with characteristics of black holes appeared in stories such as the 1928 novel The Skylark of Space with its "black Sun" and the "hole in space" in the 1935 short story Starship Invincible. As black holes grew to public recognition in the 1960s and 1970s, they began to be featured in films as well as novels, such as Disney's The Black Hole. Black holes have also been used in works of the 21st century, such as Christopher Nolan's science fiction epic Interstellar. Authors and screenwriters have exploited the relativistic effects of black holes, particularly gravitational time dilation. For example, Interstellar features a black hole planet with a time dilation factor of over 60,000:1, while the 1977 novel Gateway depicts a spaceship approaching but never crossing the event horizon of a black hole from the perspective of an outside observer due to time dilation effects. Black holes have also been appropriated as wormholes or other methods of faster-than-light travel, such as in the 1974 novel The Forever War, where a network of black holes is used for interstellar travel. Additionally, black holes can feature as hazards to spacefarers and planets: a black hole threatens a deep-space outpost in the 1978 short story The Black Hole Passes, and a binary black hole dangerously alters the orbit of a planet in the 2018 Netflix reboot of Lost in Space. |
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/OpenAI#cite_ref-bloom1AIMicro_223-0] | [TOKENS: 8773] |
Contents OpenAI OpenAI is an American artificial intelligence research organization comprising both a non-profit foundation and a controlled for-profit public benefit corporation (PBC), headquartered in San Francisco. It aims to develop "safe and beneficial" artificial general intelligence (AGI), which it defines as "highly autonomous systems that outperform humans at most economically valuable work". OpenAI is widely recognized for its development of the GPT family of large language models, the DALL-E series of text-to-image models, and the Sora series of text-to-video models, which have influenced industry research and commercial applications. Its release of ChatGPT in November 2022 has been credited with catalyzing widespread interest in generative AI. The organization was founded in 2015 in Delaware but evolved a complex corporate structure. As of October 2025, following restructuring approved by California and Delaware regulators, the non-profit OpenAI Foundation holds 26% of the for-profit OpenAI Group PBC, with Microsoft holding 27% and employees/other investors holding 47%. Under its governance arrangements, the OpenAI Foundation holds the authority to appoint the board of the for-profit OpenAI Group PBC, a mechanism designed to align the entity’s strategic direction with the Foundation’s charter. Microsoft previously invested over $13 billion into OpenAI, and provides Azure cloud computing resources. In October 2025, OpenAI conducted a $6.6 billion share sale that valued the company at $500 billion. In 2023 and 2024, OpenAI faced multiple lawsuits for alleged copyright infringement against authors and media companies whose work was used to train some of OpenAI's products. In November 2023, OpenAI's board removed Sam Altman as CEO, citing a lack of confidence in him, but reinstated him five days later following a reconstruction of the board. 
Throughout 2024, roughly half of then-employed AI safety researchers left OpenAI, citing the company's prominent role in an industry-wide problem. Founding In December 2015, OpenAI was founded as a not-for-profit organization by Sam Altman, Elon Musk, Ilya Sutskever, Greg Brockman, Trevor Blackwell, Vicki Cheung, Andrej Karpathy, Durk Kingma, John Schulman, Pamela Vagata, and Wojciech Zaremba, with Sam Altman and Elon Musk as the co-chairs. A total of $1 billion in capital was pledged by Sam Altman, Greg Brockman, Elon Musk, Reid Hoffman, Jessica Livingston, Peter Thiel, Amazon Web Services (AWS), and Infosys. However, the actual capital collected significantly lagged pledges. According to company disclosures, only $130 million had been received by 2019. In its founding charter, OpenAI stated an intention to collaborate openly with other institutions by making certain patents and research publicly available, but later restricted access to its most capable models, citing competitive and safety concerns. OpenAI was initially run from Brockman's living room. It was later headquartered at the Pioneer Building in the Mission District, San Francisco. According to OpenAI's charter, its founding mission is "to ensure that artificial general intelligence (AGI)—by which we mean highly autonomous systems that outperform humans at most economically valuable work—benefits all of humanity." Musk and Altman stated in 2015 that they were partly motivated by concerns about AI safety and existential risk from artificial general intelligence. OpenAI stated that "it's hard to fathom how much human-level AI could benefit society", and that it is equally difficult to comprehend "how much it could damage society if built or used incorrectly". 
The startup also wrote that AI "should be an extension of individual human wills and, in the spirit of liberty, as broadly and evenly distributed as possible", and that "because of AI's surprising history, it's hard to predict when human-level AI might come within reach. When it does, it'll be important to have a leading research institution which can prioritize a good outcome for all over its own self-interest." Co-chair Sam Altman expected the project to take decades and to eventually surpass human intelligence. Brockman met with Yoshua Bengio, one of the "founding fathers" of deep learning, and drew up a list of leading AI researchers; he was able to hire nine of them as the first employees in December 2015. OpenAI did not pay AI researchers salaries comparable to those of Facebook or Google, nor did it offer the stock options that AI researchers typically receive. Nevertheless, OpenAI spent $7 million on its first 52 employees in 2016. OpenAI's potential and mission drew these researchers to the firm; a Google employee said he was willing to leave Google for OpenAI "partly because of the very strong group of people and, to a very large extent, because of its mission." OpenAI co-founder Wojciech Zaremba stated that he turned down "borderline crazy" offers of two to three times his market value to join OpenAI instead. In April 2016, OpenAI released a public beta of "OpenAI Gym", its platform for reinforcement learning research. Nvidia gifted its first DGX-1 supercomputer to OpenAI in August 2016 to help it train larger and more complex AI models, reducing processing time from six days to two hours. In December 2016, OpenAI released "Universe", a software platform for measuring and training an AI's general intelligence across the world's supply of games, websites, and other applications. Corporate structure In 2019, OpenAI transitioned from non-profit to "capped" for-profit, with returns capped at 100 times any investment.
According to OpenAI, the capped-profit model allows OpenAI Global, LLC to legally attract investment from venture funds and, in addition, to grant employees stakes in the company. Many top researchers worked for Google Brain, DeepMind, or Facebook, which offered equity that a nonprofit would be unable to match. Before the transition, OpenAI was legally required to publicly disclose the compensation of its top employees. The company then distributed equity to its employees and partnered with Microsoft, announcing an investment package of $1 billion into the company. Since then, OpenAI systems have run on an Azure-based supercomputing platform from Microsoft. OpenAI Global, LLC then announced its intention to commercially license its technologies. It planned to spend the $1 billion "within five years, and possibly much faster". Altman stated that even a billion dollars may turn out to be insufficient, and that the lab may ultimately need "more capital than any non-profit has ever raised" to achieve artificial general intelligence. The nonprofit, OpenAI, Inc., is the sole controlling shareholder of OpenAI Global, LLC, which, despite being a for-profit company, retains a formal fiduciary responsibility to OpenAI, Inc.'s nonprofit charter. A majority of OpenAI, Inc.'s board is barred from having financial stakes in OpenAI Global, LLC. In addition, minority members with a stake in OpenAI Global, LLC are barred from certain votes due to conflict of interest. Some researchers have argued that OpenAI Global, LLC's switch to for-profit status is inconsistent with OpenAI's claims to be "democratizing" AI. On February 29, 2024, Elon Musk filed a lawsuit against OpenAI and CEO Sam Altman, accusing them of shifting focus from public benefit to profit maximization—a case OpenAI dismissed as "incoherent" and "frivolous," though Musk later revived legal action against Altman and others in August 2024.
On April 9, 2024, OpenAI countersued Musk in federal court, alleging that he had engaged in "bad-faith tactics" to slow the company's progress and seize its innovations for his personal benefit. OpenAI also argued that Musk had previously supported the creation of a for-profit structure and had expressed interest in controlling OpenAI himself. The countersuit seeks damages and legal measures to prevent further alleged interference. On February 10, 2025, a consortium of investors led by Elon Musk submitted a $97.4 billion unsolicited bid to buy the nonprofit that controls OpenAI, declaring willingness to match or exceed any better offer. The offer was rejected on February 14, 2025, with OpenAI stating that it was not for sale, but the bid complicated Altman's restructuring plan by suggesting a floor for how much the nonprofit's controlling stake should be valued. OpenAI, Inc. was originally designed as a nonprofit in order to ensure that AGI "benefits all of humanity" rather than "the private gain of any person". In 2019, it created OpenAI Global, LLC, a capped-profit subsidiary controlled by the nonprofit. In December 2024, OpenAI proposed a restructuring plan to convert the capped-profit into a Delaware-based public benefit corporation (PBC), and to release it from the control of the nonprofit. The nonprofit would sell its control and other assets, receiving equity in return, which it would use to fund and pursue separate charitable projects, including in science and education. OpenAI's leadership described the change as necessary to secure additional investments, and claimed that the nonprofit's founding mission to ensure AGI "benefits all of humanity" would be better fulfilled. The plan was criticized by former employees. A legal letter named "Not For Private Gain" asked the attorneys general of California and Delaware to intervene, arguing that the restructuring was illegal and would strip governance safeguards from the nonprofit and remove the attorneys general's oversight authority.
The letter argues that OpenAI's complex structure was deliberately designed to keep the organization accountable to its mission, without the conflicting pressure of maximizing profits. It contends that the nonprofit is best positioned to advance its mission of ensuring AGI benefits all of humanity by continuing to control OpenAI Global, LLC, whatever the amount of equity it could get in exchange. PBCs can choose how they balance their mission with profit-making, and controlling shareholders have a large influence on how closely a PBC sticks to its mission. On October 28, 2025, OpenAI announced that it had adopted the new PBC corporate structure after receiving approval from the attorneys general of California and Delaware. Under the new structure, OpenAI's for-profit branch became a public benefit corporation known as OpenAI Group PBC, while the non-profit was renamed the OpenAI Foundation. The OpenAI Foundation holds a 26% stake in the PBC, while Microsoft holds a 27% stake and the remaining 47% is owned by employees and other investors. All members of the OpenAI Group PBC board of directors will be appointed by the OpenAI Foundation, which can remove them at any time. Members of the Foundation's board will also serve on the for-profit board. The new structure allows the for-profit PBC to raise investor funds like most traditional tech companies, including through an initial public offering, which Altman claimed was the most likely path forward. In January 2023, OpenAI Global, LLC was in talks for funding that would value the company at $29 billion, double its 2021 value. On January 23, 2023, Microsoft announced a new US$10 billion investment in OpenAI Global, LLC over multiple years, provided in part as access to Microsoft's cloud-computing service Azure. From September to December 2023, Microsoft rebranded all variants of its Copilot to Microsoft Copilot, added Copilot to many installations of Windows, and released Microsoft Copilot mobile apps.
Following OpenAI's 2025 restructuring, Microsoft owns a 27% stake in the for-profit OpenAI Group PBC, valued at $135 billion. In a deal announced the same day, OpenAI agreed to purchase $250 billion of Azure services, with Microsoft ceding its right of first refusal over OpenAI's future cloud computing purchases. As part of the deal, OpenAI will continue to share 20% of its revenue with Microsoft until it achieves AGI, a milestone that must now be verified by an independent panel of experts. The deal also loosened restrictions on both companies working with third parties, allowing Microsoft to pursue AGI independently and allowing OpenAI to develop products with other companies. In 2017, OpenAI spent $7.9 million, a quarter of its functional expenses, on cloud computing alone. In comparison, DeepMind's total expenses in 2017 were $442 million. In the summer of 2018, training OpenAI's Dota 2 bots required renting 128,000 CPUs and 256 GPUs from Google for multiple weeks. In October 2024, OpenAI completed a $6.6 billion capital raise at a $157 billion valuation, with investments from Microsoft, Nvidia, and SoftBank. On January 21, 2025, Donald Trump announced The Stargate Project, a joint venture between OpenAI, Oracle, SoftBank and MGX to build an AI infrastructure system in conjunction with the US government. The project takes its name from OpenAI's existing "Stargate" supercomputer project and is estimated to cost $500 billion, which the partners planned to fund over the next four years. In July 2025, the United States Department of Defense announced that OpenAI had received a $200 million contract for AI in the military, along with Anthropic, Google, and xAI. In the same month, the company made a deal with the UK Government to use ChatGPT and other AI tools in public services. OpenAI subsequently began a $50 million fund to support nonprofit and community organizations.
In April 2025, OpenAI raised $40 billion at a $300 billion post-money valuation, the highest-value private technology deal in history. The financing round was led by SoftBank, with other participants including Microsoft, Coatue, Altimeter and Thrive. In July 2025, the company reported annualized revenue of $12 billion, up from $3.7 billion in 2024. The growth was driven by ChatGPT subscriptions, which reached 20 million paid subscribers by April 2025, up from 15.5 million at the end of 2024, alongside a rapidly expanding enterprise customer base that grew to five million business users. The company's cash burn remains high because of the intensive computational costs required to train and operate large language models; it projects an $8 billion operating loss in 2025. OpenAI reports revised long-term spending projections totaling approximately $115 billion through 2029, with annual expenditures projected to escalate significantly, reaching $17 billion in 2026, $35 billion in 2027, and $45 billion in 2028. These expenditures are primarily allocated toward expanding compute infrastructure, developing proprietary AI chips, constructing data centers, and funding intensive model training programs, with more than half of the spending through the end of the decade expected to support research-intensive compute for model training and development. The company's financial strategy prioritizes market expansion and technological advancement over near-term profitability: OpenAI targets cash-flow-positive operations by 2029 and projects revenue of approximately $200 billion by 2030. The spending trajectory reflects both the enormous capital requirements of scaling cutting-edge AI technology and OpenAI's intent to maintain a leading position in the artificial intelligence industry. In October 2025, OpenAI completed an employee share sale of up to $10 billion to existing investors, which valued the company at $500 billion.
The sale made OpenAI the world's most valuable privately owned company, surpassing SpaceX. On November 17, 2023, Sam Altman was removed as CEO when the board of directors (composed of Helen Toner, Ilya Sutskever, Adam D'Angelo and Tasha McCauley) cited a lack of confidence in him. Chief Technology Officer Mira Murati took over as interim CEO. Greg Brockman, the president of OpenAI, was also removed as chairman of the board and resigned from the company's presidency shortly thereafter. Three senior OpenAI researchers subsequently resigned: director of research and GPT-4 lead Jakub Pachocki, head of AI risk Aleksander Mądry, and researcher Szymon Sidor. On November 18, 2023, there were reportedly talks of Altman returning as CEO amid pressure placed upon the board by investors such as Microsoft and Thrive Capital, who objected to Altman's departure. Although Altman himself spoke in favor of returning to OpenAI, he has since stated that he considered starting a new company and bringing former OpenAI employees with him if talks to reinstate him did not work out. The board members agreed "in principle" to resign if Altman returned. On November 19, 2023, negotiations with Altman to return failed and Murati was replaced by Emmett Shear as interim CEO. The board initially contacted Anthropic CEO Dario Amodei (a former OpenAI executive) about replacing Altman, and proposed a merger of the two companies, but both offers were declined. On November 20, 2023, Microsoft CEO Satya Nadella announced Altman and Brockman would be joining Microsoft to lead a new advanced AI research team, but added that they were still committed to OpenAI despite recent events. Before the partnership with Microsoft was finalized, Altman gave the board another opportunity to negotiate with him.
About 738 of OpenAI's 770 employees, including Murati and Sutskever, signed an open letter stating they would quit their jobs and join Microsoft if the board did not rehire Altman and then resign. This prompted OpenAI investors to consider legal action against the board as well. In response, OpenAI management sent an internal memo to employees stating that negotiations with Altman and the board had resumed and would take some time. On November 21, 2023, after continued negotiations, Altman and Brockman returned to the company in their prior roles along with a reconstructed board made up of new members Bret Taylor (as chairman) and Lawrence Summers, with D'Angelo remaining. According to subsequent reporting, shortly before Altman's firing, some employees raised concerns to the board about how he had handled the safety implications of a recent internal AI capability discovery. On November 29, 2023, OpenAI announced that an initially unnamed Microsoft representative had joined the board as a non-voting observer of the company's operations; Microsoft relinquished the observer seat in July 2024. In February 2024, the Securities and Exchange Commission subpoenaed OpenAI's internal communications to determine if Altman's alleged lack of candor had misled investors. In 2024, following the temporary removal of Sam Altman and his return, many employees gradually left OpenAI, including most of the original leadership team and a significant number of AI safety researchers. In August 2023, it was announced that OpenAI had acquired the New York-based start-up Global Illumination, a company that deploys AI to develop digital infrastructure and creative tools. In June 2024, OpenAI acquired Multi, a startup focused on remote collaboration. In March 2025, OpenAI reached a deal with CoreWeave to acquire $350 million worth of CoreWeave shares and access to AI infrastructure, in return for $11.9 billion paid over five years. Microsoft was already CoreWeave's biggest customer in 2024.
Alongside their other business dealings, OpenAI and Microsoft were renegotiating the terms of their partnership to facilitate a potential future initial public offering by OpenAI, while ensuring Microsoft's continued access to advanced AI models. On May 21, 2025, OpenAI announced the $6.5 billion acquisition of io, an AI hardware start-up founded by former Apple designer Jony Ive in 2024. In September 2025, OpenAI agreed to acquire the product testing startup Statsig for $1.1 billion in an all-stock deal and appointed Statsig's founding CEO Vijaye Raji as OpenAI's chief technology officer of applications. The company also announced development of an AI-driven hiring service designed to rival LinkedIn. OpenAI acquired personal finance app Roi in October 2025. In the same month, OpenAI acquired Software Applications Incorporated, the developer of Sky, a macOS-based natural language interface designed to operate across desktop applications; the Sky team joined OpenAI, and the company announced plans to integrate Sky's capabilities into ChatGPT. In December 2025, OpenAI agreed to acquire Neptune, an AI tooling startup that helps companies track and manage model training, for an undisclosed amount. In January 2026, OpenAI acquired healthcare technology startup Torch for approximately $60 million. The acquisition followed the launch of OpenAI's ChatGPT Health product and was intended to strengthen the company's medical data and healthcare artificial intelligence capabilities. OpenAI has been criticized for outsourcing the annotation of data sets to Sama, a company based in San Francisco that employed workers in Kenya. These annotations were used to train an AI model to detect toxicity, which could then be used to moderate toxic content, notably from ChatGPT's training data and outputs. However, the texts to be annotated often contained detailed descriptions of various types of violence, including sexual violence.
A Time investigation found that OpenAI began sending snippets of data to Sama as early as November 2021. The four Sama employees interviewed by Time described themselves as mentally scarred. OpenAI paid Sama $12.50 per hour of work, of which Sama passed on the equivalent of between $1.32 and $2.00 per hour after tax to its annotators. Sama's spokesperson said that the $12.50 also covered other implicit costs, among them infrastructure expenses, quality assurance and management. In 2024, OpenAI began collaborating with Broadcom to design a custom AI chip capable of both training and inference, targeted for mass production in 2026 and to be manufactured by TSMC on a 3 nm process node. This initiative is intended to reduce OpenAI's dependence on Nvidia GPUs, which are costly and face high demand in the market. In January 2024, Arizona State University purchased ChatGPT Enterprise in OpenAI's first deal with a university. In June 2024, Apple Inc. signed a contract with OpenAI to integrate ChatGPT features into its products as part of its new Apple Intelligence initiative. In June 2025, OpenAI began renting Google Cloud's Tensor Processing Units (TPUs) to support ChatGPT and related services, marking its first meaningful use of non-Nvidia AI chips. In September 2025, it was revealed that OpenAI had signed a contract with Oracle to purchase $300 billion in computing power over the next five years. That same month, OpenAI and NVIDIA announced a memorandum of understanding that included a potential deployment of at least 10 gigawatts of NVIDIA systems and a $100 billion investment from NVIDIA in OpenAI. OpenAI expected the negotiations to be completed within weeks; as of January 2026, the deal has not been finalized, and the two sides are rethinking the future of their partnership. In October 2025, OpenAI announced a multi-billion dollar deal with AMD under which OpenAI committed to purchasing six gigawatts worth of AMD chips, starting with the MI450.
OpenAI will have the option to buy up to 160 million shares of AMD, about 10% of the company, depending on development, performance and share price targets. In December 2025, Disney said it would make a $1 billion investment in OpenAI and signed a three-year licensing deal that will let users generate videos using Sora, OpenAI's short-form AI video platform. More than 200 Disney, Marvel, Star Wars and Pixar characters will be available to OpenAI users. In early 2026, Amazon entered advanced discussions to invest up to $50 billion in OpenAI as part of a potential artificial intelligence partnership. Under the proposed agreement, OpenAI's models could be integrated into Amazon's digital assistant Alexa and other internal projects. OpenAI provides LLMs to the Artificial Intelligence Cyber Challenge and to the Advanced Research Projects Agency for Health. In October 2024, The Intercept revealed that OpenAI's tools are considered "essential" for AFRICOM's mission and included in an "Exception to Fair Opportunity" contractual agreement between the United States Department of Defense and Microsoft. In December 2024, OpenAI said it would partner with defense-tech company Anduril to build drone defense technologies for the United States and its allies. In 2025, OpenAI's Chief Product Officer, Kevin Weil, was commissioned as a lieutenant colonel in the U.S. Army to join Detachment 201 as a senior advisor. In June 2025, the U.S. Department of Defense awarded OpenAI a $200 million one-year contract to develop AI tools for military and national security applications, and OpenAI announced a new program, OpenAI for Government, to give federal, state, and local governments access to its models, including ChatGPT. Services In February 2019, GPT-2 was announced, which gained attention for its ability to generate human-like text. In 2020, OpenAI announced GPT-3, a language model trained on large internet datasets.
GPT-3 is aimed primarily at answering questions in natural language, but it can also translate between languages and coherently generate improvised text. OpenAI also announced that an associated API, named simply "the API", would form the heart of its first commercial product. Eleven employees left OpenAI, mostly between December 2020 and January 2021, to establish Anthropic. In 2021, OpenAI introduced DALL-E, a specialized deep learning model adept at generating complex digital images from textual descriptions, utilizing a variant of the GPT-3 architecture. In December 2022, OpenAI received widespread media coverage after launching a free preview of ChatGPT, its new AI chatbot based on GPT-3.5. According to OpenAI, the preview received over a million signups within the first five days. According to anonymous sources cited by Reuters in December 2022, OpenAI Global, LLC was projecting $200 million of revenue in 2023 and $1 billion in revenue in 2024. After ChatGPT was launched, Google announced a similar chatbot, Bard, amid internal concerns that ChatGPT could threaten Google's position as a primary source of online information. On February 7, 2023, Microsoft announced that it was building AI technology based on the same foundation as ChatGPT into Microsoft Bing, Edge, Microsoft 365 and other products. On March 14, 2023, OpenAI released GPT-4, both as an API (with a waitlist) and as a feature of ChatGPT Plus. On November 6, 2023, OpenAI launched GPTs, allowing individuals to create customized versions of ChatGPT for specific purposes, further expanding the possibilities of AI applications across various industries. On November 14, 2023, OpenAI announced it had temporarily suspended new sign-ups for ChatGPT Plus due to high demand. Access for new subscribers re-opened a month later, on December 13. In December 2024, the company launched the Sora model. It also launched OpenAI o1, an early reasoning model internally codenamed "Strawberry".
Additionally, ChatGPT Pro—a $200/month subscription service offering unlimited o1 access and enhanced voice features—was introduced, and preliminary benchmark results for the upcoming OpenAI o3 models were shared. On January 23, 2025, OpenAI released Operator, an AI agent and web automation tool for accessing websites to execute goals defined by users; the feature was only available to Pro users in the United States. Nine days later, OpenAI released its deep research agent, which scored 27% accuracy on the benchmark Humanity's Last Exam (HLE). Altman later stated that GPT-4.5 would be the last model without full chain-of-thought reasoning. In July 2025, reports indicated that AI models by both OpenAI and Google DeepMind solved mathematics problems at the level of top-performing students in the International Mathematical Olympiad. OpenAI's large language model achieved gold medal-level performance, reflecting significant progress in AI's reasoning abilities. On October 6, 2025, OpenAI unveiled its Agent Builder platform during the company's DevDay event. The platform includes a visual drag-and-drop interface that lets developers and businesses design, test, and deploy agentic workflows with limited coding. On October 21, 2025, OpenAI introduced ChatGPT Atlas, a browser integrating the ChatGPT assistant directly into web navigation, to compete with existing browsers such as Google Chrome and Apple Safari. On December 11, 2025, OpenAI announced GPT-5.2, which it said is better at creating spreadsheets, building presentations, perceiving images, writing code and understanding long context. On January 27, 2026, OpenAI introduced Prism, a LaTeX-native workspace intended to assist scientists with research and writing. The platform uses GPT-5.2 as a backend to automate the drafting of scientific papers, including features for managing citations, formatting complex equations, and real-time collaborative editing.
In March 2023, the company was criticized for disclosing particularly few technical details about products like GPT-4, contradicting its initial commitment to openness and making it harder for independent researchers to replicate its work and develop safeguards. OpenAI cited competitiveness and safety concerns to justify this shift away from openness. OpenAI's former chief scientist Ilya Sutskever argued in 2023 that open-sourcing increasingly capable models was increasingly risky, and that the safety reasons for not open-sourcing the most potent AI models would become "obvious" in a few years. In September 2025, OpenAI published a study on how people use ChatGPT for everyday tasks. The study found that "non-work tasks" (according to an LLM-based classifier) account for more than 72 percent of all ChatGPT usage, with only a minority of overall usage related to business productivity. In July 2023, OpenAI launched the superalignment project, aiming to determine within four years how to align future superintelligent systems. OpenAI promised to dedicate 20% of its computing resources to the project, although the team later said it had received nothing close to that share. OpenAI ended the project in May 2024 after its co-leaders Ilya Sutskever and Jan Leike left the company. In August 2025, OpenAI was criticized after thousands of private ChatGPT conversations were inadvertently exposed to public search engines like Google through an experimental "share with search engines" feature. The opt-in toggle, intended to allow users to make specific chats discoverable, resulted in some discussions, including personal details such as names, locations, and intimate topics, appearing in search results when users accidentally enabled it while sharing links. OpenAI announced the feature's permanent removal on August 1, 2025, and the company began coordinating with search providers to remove the exposed content, emphasizing that it was not a security breach but a design flaw that heightened privacy risks.
CEO Sam Altman acknowledged the issue in a podcast, noting that users often treat ChatGPT as a confidant for deeply personal matters, which amplified concerns about AI handling sensitive data. Management In 2018, Musk resigned from his board seat, citing "a potential future conflict [of interest]" with his role as CEO of Tesla due to Tesla's AI development for self-driving cars. OpenAI stated that Musk's financial contributions were below $45 million. On March 3, 2023, Reid Hoffman resigned from his board seat, citing a desire to avoid conflicts of interest with his investments in AI companies via Greylock Partners, and his co-founding of the AI startup Inflection AI. Hoffman remained on the board of Microsoft, a major investor in OpenAI. In May 2024, Chief Scientist Ilya Sutskever resigned and was succeeded by Jakub Pachocki. Superalignment co-leader Jan Leike also departed, citing concerns over safety and trust. OpenAI subsequently signed content deals with Reddit, News Corp, Axios, and Vox Media, and Paul Nakasone joined the board of OpenAI. In August 2024, cofounder John Schulman left OpenAI to join Anthropic, and OpenAI's president Greg Brockman took extended leave until November. In September 2024, CTO Mira Murati left the company. In November 2025, Lawrence Summers resigned from the board of directors. Governance and legal issues In May 2023, Sam Altman, Greg Brockman and Ilya Sutskever posted recommendations for the governance of superintelligence. They stated that superintelligence could happen within the next 10 years, allowing a "dramatically more prosperous future", and that "given the possibility of existential risk, we can't just be reactive". They proposed creating an international watchdog organization similar to the IAEA to oversee AI systems above a certain capability threshold, suggesting that relatively weak AI systems below that threshold should not be overly regulated.
They also called for more technical safety research for superintelligences, and asked for more coordination, for example through governments launching a joint project which "many current efforts become part of". In July 2023, the FTC issued a civil investigative demand to OpenAI to investigate whether the company's data security and privacy practices in developing ChatGPT were unfair or harmed consumers (including by reputational harm) in violation of Section 5 of the Federal Trade Commission Act of 1914. Such demands are typically preliminary investigative matters and nonpublic, but the FTC's document was leaked. The investigation covered allegations that the company had scraped public data and published false and defamatory information; the agency asked OpenAI for comprehensive information about its technology and privacy safeguards, as well as any steps taken to prevent the recurrence of situations in which its chatbot generated false and derogatory content about people. The agency also raised concerns about "circular" spending arrangements, for example Microsoft extending Azure credits to OpenAI while both companies shared engineering talent, and warned that such structures could negatively affect the public. In September 2024, OpenAI's global affairs chief endorsed the UK's "smart" AI regulation during testimony to a House of Lords committee. In February 2025, OpenAI CEO Sam Altman stated that the company was interested in collaborating with the People's Republic of China, despite regulatory restrictions imposed by the U.S. government. This shift came in response to the growing influence of the Chinese artificial intelligence company DeepSeek, which has disrupted the AI market with open models, including DeepSeek V3 and DeepSeek R1. Following DeepSeek's market emergence, OpenAI enhanced security protocols to protect proprietary development techniques from industrial espionage.
Some industry observers noted similarities between DeepSeek's model distillation approach and OpenAI's methodology, though no formal intellectual property claim was filed. According to Oliver Roberts, as of March 2025 the United States had 781 state AI bills or laws. OpenAI advocated for preempting state AI laws with federal legislation. According to Scott Kohler, OpenAI has opposed California's AI legislation and argued that the state bill encroaches on matters better handled by the federal government. Public Citizen opposed federal preemption on AI and pointed to OpenAI's growth and valuation as evidence that existing state laws have not hampered innovation. Before May 2024, OpenAI required departing employees to sign a lifelong non-disparagement agreement forbidding them from criticizing OpenAI and from acknowledging the existence of the agreement. Daniel Kokotajlo, a former employee, publicly stated that he forfeited his vested equity in OpenAI in order to leave without signing the agreement. Sam Altman stated that he was unaware of the equity cancellation provision, and that OpenAI never enforced it to cancel any employee's vested equity; however, leaked documents and emails contradicted this claim. On May 23, 2024, OpenAI sent a memo releasing former employees from the agreement. OpenAI was sued for copyright infringement by authors Sarah Silverman, Matthew Butterick, Paul Tremblay and Mona Awad in July 2023. In September 2023, 17 authors, including George R. R. Martin, John Grisham, Jodi Picoult and Jonathan Franzen, joined the Authors Guild in filing a class action lawsuit against OpenAI, alleging that the company's technology was illegally using their copyrighted work. The New York Times also sued the company in late December 2023. In May 2024 it was revealed that OpenAI had destroyed its Books1 and Books2 training datasets, which were used in the training of GPT-3, and which the Authors Guild believed to have contained over 100,000 copyrighted books.
In 2021, OpenAI developed a speech recognition tool called Whisper. OpenAI used it to transcribe more than one million hours of YouTube videos into text for training GPT-4. The automated transcription of YouTube videos raised concerns among OpenAI employees regarding potential violations of YouTube's terms of service, which prohibit the use of videos for applications independent of the platform, as well as any type of automated access to its videos. Despite these concerns, the project proceeded with notable involvement from OpenAI's president, Greg Brockman. The resulting dataset proved instrumental in training GPT-4. In February 2024, The Intercept, as well as Raw Story and AlterNet Media, filed lawsuits against OpenAI on copyright grounds. The lawsuits were said to chart a new legal strategy for digital-only publishers to sue OpenAI. On April 30, 2024, eight newspapers filed a lawsuit in the Southern District of New York against OpenAI and Microsoft, claiming illegal harvesting of their copyrighted articles. The suing publications included The Mercury News, The Denver Post, The Orange County Register, St. Paul Pioneer Press, Chicago Tribune, Orlando Sentinel, Sun Sentinel, and New York Daily News. In June 2023, a lawsuit claimed that OpenAI had scraped 300 billion words online without consent and without registering as a data broker. It was filed in San Francisco, California, by sixteen anonymous plaintiffs, who also claimed that OpenAI and its partner and customer Microsoft continued to unlawfully collect and use personal data from millions of consumers worldwide to train artificial intelligence models. On May 22, 2024, OpenAI entered into an agreement with News Corp to integrate news content from The Wall Street Journal, the New York Post, The Times, and The Sunday Times into its AI platform.
Meanwhile, other publications like The New York Times chose to sue OpenAI and Microsoft for copyright infringement over the use of their content to train AI models. In November 2024, a coalition of Canadian news outlets, including the Toronto Star, Metroland Media, Postmedia, The Globe and Mail, The Canadian Press and CBC, sued OpenAI for using their news articles to train its software without permission. In October 2024, during a New York Times interview, Suchir Balaji accused OpenAI of violating copyright law in developing the commercial LLMs he had helped engineer. He was a likely witness in a major copyright trial against the AI company, and was one of several of its current or former employees named in court filings as potentially having documents relevant to the case. On November 26, 2024, Balaji died by suicide. His death prompted the circulation of conspiracy theories alleging that he had been deliberately silenced; California Congressman Ro Khanna endorsed calls for an investigation. On April 24, 2025, Ziff Davis sued OpenAI in Delaware federal court for copyright infringement. Ziff Davis is known for publications such as ZDNet, PCMag, CNET, IGN and Lifehacker. In April 2023, the EU's European Data Protection Board (EDPB) formed a dedicated task force on ChatGPT "to foster cooperation and to exchange information on possible enforcement actions conducted by data protection authorities", based on the "enforcement action undertaken by the Italian data protection authority against OpenAI about the ChatGPT service". In late April 2024, NOYB filed a complaint with the Austrian Datenschutzbehörde against OpenAI for violating the EU General Data Protection Regulation (GDPR): a text created with ChatGPT gave a false date of birth for a living person without giving the individual the option to see the personal data used in the process, and a request to correct the mistake was denied.
Additionally, OpenAI claimed that neither the recipients of ChatGPT's output nor the sources used could be made available. OpenAI was criticized for lifting its ban on using ChatGPT for "military and warfare". Until January 10, 2024, its "usage policies" included a ban on "activity that has high risk of physical harm, including", specifically, "weapons development" and "military and warfare". Its new policies prohibit using its services to "harm yourself or others" or to "develop or use weapons". In August 2025, the parents of a 16-year-old boy who died by suicide filed a wrongful death lawsuit against OpenAI (and CEO Sam Altman), alleging that months of conversations with ChatGPT about mental health and methods of self-harm contributed to their son's death and that safeguards were inadequate for minors. OpenAI expressed condolences and said it was strengthening protections, including updated crisis response behavior and parental controls. Coverage described it as a first-of-its-kind wrongful death case targeting the company's chatbot. The complaint was filed in California state court in San Francisco. In November 2025, the Social Media Victims Law Center and Tech Justice Law Project filed seven lawsuits against OpenAI, four of which alleged wrongful death. The suits were filed on behalf of Zane Shamblin, 23, of Texas; Amaurie Lacey, 17, of Georgia; Joshua Enneking, 26, of Florida; and Joe Ceccanti, 48, of Oregon, each of whom died by suicide after prolonged ChatGPT use. In December 2025, the estate of Suzanne Adams sued OpenAI over her alleged murder by her son, Stein-Erik Soelberg, who was 56 years old at the time. In the months before the killing, Soelberg, who was paranoid and delusional, had often discussed his ideas with ChatGPT. The suit claimed that the company shared responsibility due to the risk of so-called chatbot psychosis, although "chatbot psychosis" is not a recognized medical diagnosis. OpenAI responded that it would make ChatGPT safer for users who are disconnected from reality.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/%C3%86] | [TOKENS: 1388] |
Æ Æ (lowercase: æ) is a character formed from the letters a and e, originally a ligature representing the Latin diphthong ae. It has been promoted to the status of a letter in some languages, including Danish, Norwegian, Icelandic, and Faroese. It was also used in both Old Swedish, before being replaced by ä, and Old English, where it was eventually dropped entirely in favour of a. The modern International Phonetic Alphabet uses it to represent the near-open front unrounded vowel (the sound represented by the 'a' in English words such as cat). Diacritic variants include Ǣ/ǣ, Ǽ/ǽ, Æ̀/æ̀, Æ̂/æ̂ and Æ̃/æ̃. As a letter of the Old English Latin alphabet, it was called æsc, "ash tree", after the Anglo-Saxon futhorc rune ᚫ which it transliterated; its traditional name in English is still ash, or æsh (Old English: æsċ) if the ligature is included. Languages In English, use of the ligature varies between different places and contexts, but it is fairly rare. In modern typography, if technological limitations make the use of æ difficult (such as in use of typewriters, telegraphs, or ASCII), the digraph ae is often used instead. In Old English, æ represented a sound between a and e (/æ/), very much like the short a of cat in many dialects of Modern English. If long vowels are distinguished from short vowels, the long version /æː/ is marked with a macron (ǣ) or, less commonly, an acute (ǽ). In the United States, the issue of the ligature is sidestepped in many cases by use of a simplified spelling with "e", as happened with œ as well; thus medieval is more commonly used than mediaeval in American English. Usage of the ae digraph, however, may vary. For example, medieval is now more common than mediaeval (and the now old-fashioned mediæval), even in the United Kingdom. In the modern French alphabet, æ (called e-dans-l'a, 'e in the a') is used to spell Latin and Greek borrowings like curriculum vitæ, et cætera, ex æquo, tænia, and the first name Lætitia.
It is mentioned in the name of Serge Gainsbourg's song Elaeudanla Téïtéïa, a reading of the French spelling of the name Lætitia: "L, A, E dans l'A, T, I, T, I, A." In Classical Latin, the combination AE denotes the diphthong [aj], which had a value similar to the long i in fine as pronounced in most dialects of Modern English. Both classical and present practice is to write the letters separately, but the ligature was used in medieval and early modern writings, in part because æ was reduced to the simple vowel [ɛ] during the Roman Empire. In some medieval scripts, the ligature was simplified to ę, an e with ogonek, called the e caudata (Latin for "tailed e"). That was further simplified into a plain e, which may have influenced or been influenced by the pronunciation change. However the ligature is still relatively common in liturgical books and musical scores. Old Norse In Old Norse, æ represents the long vowel /ɛː/. The short version of the same vowel, /ɛ/, if it is distinguished from /e/, is written as ę. Icelandic In Icelandic, æ represents the diphthong [ai], which can be long or short. Faroese In most varieties of Faroese, æ is pronounced as follows: One of its etymological origins is Old Norse é (the other is Old Norse æ), which is particularly evident in the dialects of Suðuroy, where Æ is [eː] or [ɛ]: German and Swedish The equivalent letter in German and Swedish is ä. In German this letter is after 'z' and in Swedish it is the second-to-last letter (between å and ö). In the normalized spelling of Middle High German, æ represents a long vowel [ɛː]. The actual spelling in the manuscripts varies, however. Danish and Norwegian In Danish and Norwegian, æ is a separate letter of the alphabet and represents a monophthong. It follows z and precedes ø and å. 
In Norwegian there are four ways of pronouncing the letter: In many northern, western and southwestern Norwegian dialects such as Trøndersk and in the western Danish dialects of Thy and Southern Jutland, the word "I" (Standard Danish: jeg, Bokmål Norwegian: jeg, Nynorsk Norwegian: eg) is pronounced /æː/. Thus, when this word is written as it is pronounced in these dialects (rather than the standard), it is often spelled with the letter "æ". In western and southern Jutish dialects of Danish, æ is also the proclitic definite article: æ hus (the house), as opposed to Standard Danish and all other Nordic varieties which have enclitic definite articles (Danish, Swedish, Norwegian: huset; Icelandic, Faroese: húsið [the house]). Ossetian – which previously and later used a Cyrillic alphabet with an identical-looking letter (Ӕ and ӕ) – was written using the Latin script from 1923 to 1938, and included this character. It is pronounced as a near-open central vowel [ɐ]. The letter Æ is used in the official orthography of the Kawésqar language, spoken in Chile and also in that of the Fuegian language Yaghan. In the orthographies of both languages, the letter represents /æ/. In Mochica, the exact sound value æ was used for is unknown, but is thought to be [ɨ]. International Phonetic Alphabet The symbol [æ] is also used in the International Phonetic Alphabet (IPA) to denote a near-open front unrounded vowel such as in the word cat in many dialects of Modern English, which is the sound that was most likely represented by the Old English letter. In the IPA it is always in lowercase. U+10783 𐞃 MODIFIER LETTER SMALL AE is a superscript IPA letter. Uralic Phonetic Alphabet The Uralic Phonetic Alphabet (UPA) uses four additional æ-related symbols, see Unicode table below.
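For readers working with these characters programmatically, the code points involved can be inspected in a few lines of Python (a minimal sketch; only the characters discussed above are used):

```python
import unicodedata

# The ligature has its own code points, distinct from the digraph "ae".
for ch in "Ææ":
    print(f"U+{ord(ch):04X}  {unicodedata.name(ch)}")
# U+00C6  LATIN CAPITAL LETTER AE
# U+00E6  LATIN SMALL LETTER AE

# The superscript IPA modifier letter mentioned above lies outside the
# Basic Multilingual Plane:
print(f"U+{ord(chr(0x10783)):05X}")  # U+10783
```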
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Kilowatt_hour] | [TOKENS: 1896] |
Kilowatt-hour A kilowatt-hour (unit symbol: kW⋅h or kW h; commonly written as kWh) is a non-SI unit of energy equal to 3.6 megajoules (MJ) in SI units, which is the energy delivered by one kilowatt of power for one hour. Kilowatt-hours are a common billing unit for electrical energy supplied by electric utilities. Metric prefixes are used for multiples and submultiples of the basic unit, the watt-hour (3.6 kJ). Definition The kilowatt-hour is a composite unit of energy equal to one kilowatt (kW) multiplied by (i.e., sustained for) one hour. The International System of Units (SI) unit of energy is the joule (symbol J). Because a watt is by definition one joule per second, and because there are 3,600 seconds in an hour, one kWh equals 3,600 kilojoules or 3.6 MJ. Unit representations A widely used representation of the kilowatt-hour is kWh, derived from its component units, kilowatt and hour. It is commonly used in billing for delivered energy to consumers by electric utility companies, in commercial, educational, and scientific publications, and in the media. It is also the usual unit representation in electrical power engineering. This common representation, however, does not comply with the style guide of the International System of Units (SI). Other representations of the unit may be encountered. The hour is a unit of time listed among the non-SI units accepted by the International Bureau of Weights and Measures for use with the SI. An electric heater consuming 1,000 watts (1 kilowatt) operating for one hour uses one kilowatt-hour of energy, as does a television consuming 100 watts operating continuously for 10 hours or a 40-watt electric appliance operating continuously for 25 hours. Electricity sales Electrical energy is typically sold to consumers in kilowatt-hours.
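The definitional arithmetic above is easy to verify in code. A short Python sketch (the function name is ours) converts kilowatt-hours to joules:

```python
def kwh_to_joules(kwh: float) -> float:
    """1 kWh = 1,000 J/s sustained for 3,600 s = 3.6 MJ."""
    return kwh * 1000 * 3600

print(kwh_to_joules(1.0))        # 3600000.0 J, i.e. 3.6 MJ
# Each appliance example above uses exactly one kilowatt-hour:
print(kwh_to_joules(1.0 * 1))    # 1 kW heater for 1 h
print(kwh_to_joules(0.1 * 10))   # 100 W television for 10 h
print(kwh_to_joules(0.04 * 25))  # 40 W appliance for 25 h
```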
The cost of running an electrical device with constant power consumption is calculated by multiplying the device's power consumption in kilowatts by the operating time in hours, and by the price per kilowatt-hour (numerical integration is needed when the power consumption is not constant over the time period). The unit price of electricity charged by utility companies may depend on the customer's consumption profile over time. Prices vary considerably by locality. In the United States, prices in different states can vary by a factor of three. While smaller customer loads are usually billed only for energy, transmission services, and the rated capacity, larger consumers also pay for peak power consumption, the greatest power recorded in a fairly short time, such as 15 minutes. This compensates the power company for maintaining the infrastructure needed to provide peak power. These charges are billed as demand charges. Industrial users may also have extra charges according to the power factor of their load. Major energy production or consumption is often expressed as terawatt-hours (TWh) for a given period, often a calendar year or financial year. A 365-day year equals 8,760 hours, so one gigawatt sustained over a year corresponds to 8.76 terawatt-hours of energy. Conversely, one terawatt-hour is equal to the sustained power of about 114 megawatts for a period of one year. Examples In 2020, the average household in the United States consumed 893 kWh per month. Raising the temperature of 1 litre of water from room temperature to the boiling point with an electric kettle takes about 0.1 kWh. A 12-watt LED lamp lit constantly uses about 0.3 kWh per 24 hours and about 9 kWh per month. In terms of human power, a healthy adult male manual laborer performs work equal to about half a kilowatt-hour over an eight-hour day.
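The billing rule just described (kilowatts times hours times price per kWh) can be sketched in Python; the tariff below is an illustrative assumption, not a figure from the text:

```python
def energy_cost(power_kw: float, hours: float, price_per_kwh: float) -> float:
    """Cost of a constant load: energy (kW * h) times the unit price."""
    return power_kw * hours * price_per_kwh

PRICE = 0.15  # assumed tariff in $/kWh, for illustration only

# 12 W LED lamp left on for a 30-day month (about 9 kWh, as stated above)
print(round(0.012 * 24 * 30, 2), "kWh")              # 8.64 kWh
print(round(energy_cost(0.012, 24 * 30, PRICE), 2))  # 1.3
```

For loads whose power draw varies over time, the same product becomes an integral of power over time, which is what the text means by numerical integration.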
Conversions To convert a quantity measured in a unit in the left column to the units in the top row, multiply by the factor in the cell where the row and column intersect. Watt-hour multiples SI prefixes are commonly applied to the watt-hour: a kilowatt-hour (kWh) is 1,000 Wh; a megawatt-hour (MWh) is 1 million Wh, and so on. The kilowatt-hour is commonly used by electrical energy providers for purposes of billing, since the monthly energy consumption of a typical residential customer ranges from a few hundred to a few thousand kilowatt-hours. Megawatt-hours (MWh), gigawatt-hours (GWh), and terawatt-hours (TWh) are often used for metering larger amounts of electrical energy to industrial customers and in power generation. The terawatt-hour and petawatt-hour (PWh) units are large enough to conveniently express the annual electricity generation for whole countries and the world energy consumption. Distinction between kWh (energy) and kW (power) A kilowatt is a unit of power (the rate of flow of energy per unit of time); a kilowatt-hour is a unit of energy. Kilowatt per hour would be a rate of change of power flow with time. Work is the amount of energy transferred to a system; power is the rate of delivery of energy. Energy is measured in joules, or watt-seconds. Power is measured in watts, or joules per second. For example, a battery stores energy. When the battery delivers its energy, it does so at a certain power, that is, the rate of delivery of the energy. The higher the power, the quicker the battery's stored energy is delivered. A higher power output will cause the battery's stored energy to be depleted in a shorter time period. Electric energy production and consumption are sometimes reported on a yearly basis, in units such as megawatt-hours per year (MWh/yr), gigawatt-hours per year (GWh/yr), or terawatt-hours per year (TWh/yr). These units have dimensions of energy divided by time and thus are units of power.
They can be converted to SI power units by dividing by the number of hours in a year, about 8,760 h/yr. Thus, 1 GWh/yr = 1 GWh/8,760 h ≈ 114 kW. Many compound units for various kinds of rates explicitly mention units of time to indicate a change over time. For example: miles per hour, kilometres per hour, dollars per hour. Power units, such as kW, already measure the rate of energy per unit time (1 kW = 1 kJ/s). Kilowatt-hours are a product of power and time, not a rate of change of power with time. Watts per hour (W/h) is a unit of change of power per hour, i.e. an acceleration in the delivery of energy. It is used to measure the daily variation of demand (e.g. the slope of the duck curve) or the ramp-up behavior of power plants. For example, a power plant that reaches a power output of 1 MW from 0 MW in 15 minutes has a ramp-up rate of 4 MW/h. Other uses of terms such as watts per hour are likely to be errors. Other related energy units Several other units related to the kilowatt-hour are commonly used to indicate power or energy capacity or use in specific application areas. Average annual energy production or consumption can be expressed in kilowatt-hours per year. This is used with loads or output that vary during the year but whose annual totals are similar from one year to the next. For example, it is useful for comparing the energy efficiency of household appliances whose power consumption varies with time or the season of the year. Another use is to measure the energy produced by a distributed power source. One kilowatt-hour per year equals about 114 milliwatts applied constantly during one year. The energy content of a battery is usually expressed indirectly by its capacity in ampere-hours; to convert ampere-hours (Ah) to watt-hours (Wh), the ampere-hour value must be multiplied by the voltage of the power source.
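The annual-energy-to-average-power conversion described above (divide by the hours in a year) can be sketched as follows; the function name is ours:

```python
HOURS_PER_YEAR = 8760  # 365-day year, as used in the text

def avg_power_kw(gwh_per_year: float) -> float:
    """Average power in kW equivalent to an annual energy total in GWh/yr."""
    kwh_per_year = gwh_per_year * 1e6  # 1 GWh = 1,000,000 kWh
    return kwh_per_year / HOURS_PER_YEAR

print(round(avg_power_kw(1.0)))  # about 114 kW

# Ramp rate, by contrast, really is a change of power over time:
# 1 MW reached from 0 MW in 15 minutes is 1 MW / 0.25 h = 4 MW/h.
print(1.0 / (15 / 60))  # 4.0
```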
This value is approximate, since the battery voltage is not constant during its discharge, and because higher discharge rates reduce the total amount of energy that the battery can provide. In the case of devices that output a different voltage than the battery, it is the battery voltage (typically 3.7 V for Li-ion) that must be used to calculate the stored energy, rather than the device's output voltage (for example, usually 5.0 V for USB portable chargers). This results in a 500 mA USB device running for about 3.7 hours on a 2,500 mAh battery, not five hours. The Board of Trade unit (B.T.U.) is an obsolete UK synonym for the kilowatt-hour. The term derives from the name of the Board of Trade, which regulated the electricity industry until 1942, when the Ministry of Power took over. It is distinct from the British thermal unit (BTU), which is about 1,055 J. In India, the kilowatt-hour is often simply called a unit of energy. A million units, designated MU, is a gigawatt-hour, and a BU (billion units) is a terawatt-hour.
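The battery worked example above can be checked in Python. This is a sketch of the ideal calculation; as the text notes, real batteries deliver somewhat less at high discharge rates:

```python
def battery_energy_wh(capacity_mah: float, battery_volts: float) -> float:
    """Nominal stored energy: ampere-hours times battery voltage."""
    return capacity_mah / 1000 * battery_volts

def runtime_hours(capacity_mah: float, battery_volts: float,
                  load_amps: float, load_volts: float) -> float:
    # Stored energy uses the battery's voltage; the load's own voltage
    # only determines its power draw, as explained above.
    return battery_energy_wh(capacity_mah, battery_volts) / (load_amps * load_volts)

# 2,500 mAh Li-ion cell (3.7 V) powering a 500 mA device at 5.0 V:
print(round(runtime_hours(2500, 3.7, 0.5, 5.0), 2))  # 3.7, not 5 hours
```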
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Meta_Platforms#cite_note-251] | [TOKENS: 8626] |
Meta Platforms Meta Platforms, Inc. (doing business as Meta) is an American multinational technology company headquartered in Menlo Park, California. Meta owns and operates several prominent social media platforms and communication services, including Facebook, Instagram, WhatsApp, Messenger and Threads. The company also operates an advertising network for its own sites and third parties; as of 2023, advertising accounted for 97.8 percent of its total revenue. Meta has been described as part of Big Tech, the six largest tech companies in the United States: Alphabet (Google), Amazon, Apple, Meta (Facebook), Microsoft, and Nvidia, which are also among the largest companies in the world by market capitalization. The company was originally established in 2004 as TheFacebook, Inc., and was renamed Facebook, Inc. in 2005. In 2021, it rebranded as Meta Platforms, Inc. to reflect a strategic shift toward developing the metaverse, an interconnected digital ecosystem spanning virtual and augmented reality technologies. In 2023, Meta was ranked 31st on the Forbes Global 2000 list of the world's largest public companies. As of 2022, it was the world's third-largest spender on research and development, with R&D expenses totaling US$35.3 billion. History Facebook filed for an initial public offering (IPO) on February 1, 2012. The preliminary prospectus stated that the company sought to raise $5 billion, had 845 million monthly active users, and a website accruing 2.7 billion likes and comments daily. After the IPO, Zuckerberg would retain 22% of the total shares and 57% of the total voting power in Facebook. Underwriters valued the shares at $38 each, valuing the company at $104 billion, the largest valuation yet for a newly public company. On May 16, one day before the IPO, Facebook announced it would sell 25% more shares than originally planned due to high demand.
The IPO raised $16 billion, making it the third-largest in US history (slightly ahead of AT&T Mobility and behind only General Motors and Visa). The stock price left the company with a higher market capitalization than all but a few U.S. corporations—surpassing heavyweights such as Amazon, McDonald's, Disney, and Kraft Foods—and made Zuckerberg's stock worth $19 billion. The New York Times stated that the offering overcame questions about Facebook's difficulties in attracting advertisers to transform the company into a "must-own stock". Jimmy Lee of JPMorgan Chase described it as "the next great blue-chip". Writers at TechCrunch, on the other hand, expressed skepticism, stating, "That's a big multiple to live up to, and Facebook will likely need to add bold new revenue streams to justify the mammoth valuation." Trading in the stock, which began on May 18, was delayed that day due to technical problems with the Nasdaq exchange. The stock struggled to stay above the IPO price for most of the day, forcing underwriters to buy back shares to support the price. At the closing bell, shares were valued at $38.23, only $0.23 above the IPO price and down $3.82 from the opening bell value. The opening was widely described by the financial press as a disappointment. The stock set a new record for trading volume of an IPO. On May 25, 2012, the stock ended its first full week of trading at $31.91, a 16.5% decline. On May 22, 2012, regulators from Wall Street's Financial Industry Regulatory Authority announced that they had begun to investigate whether banks underwriting Facebook had improperly shared information only with select clients rather than the general public. Massachusetts Secretary of State William F. Galvin subpoenaed Morgan Stanley over the same issue. The allegations sparked "fury" among some investors and led to the immediate filing of several lawsuits, one of them a class action suit claiming more than $2.5 billion in losses due to the IPO. 
Bloomberg estimated that retail investors may have lost approximately $630 million on Facebook stock since its debut. Standard & Poor's added Facebook to its S&P 500 index on December 21, 2013. On May 2, 2014, Zuckerberg announced that the company would be changing its internal motto from "Move fast and break things" to "Move fast with stable infrastructure". The earlier motto had been described as Zuckerberg's "prime directive to his developers and team" in a 2009 interview in Business Insider, in which he also said, "Unless you are breaking stuff, you are not moving fast enough." In November 2016, Facebook announced Facebook Gameroom (formerly Facebook Games Arcade), a Microsoft Windows client for its gaming service, at the Unity Technologies developers conference. The client allows Facebook users to play "native" games in addition to its web games. The service was closed in June 2021. Lasso was a short-video sharing app from Facebook similar to TikTok that was launched on iOS and Android in 2018 and was aimed at teenagers. On July 2, 2020, Facebook announced that Lasso would be shutting down on July 10. In 2018, the Oculus lead Jason Rubin sent his 50-page vision document titled "The Metaverse" to Facebook's leadership. In the document, Rubin acknowledged that Facebook's virtual reality business had not caught on as expected, despite the hundreds of millions of dollars spent on content for early adopters. He also urged the company to execute fast and invest heavily in the vision, to shut out HTC, Apple, Google and other competitors in the VR space. Regarding other players' participation in the metaverse vision, he called for the company to build the "metaverse" to prevent their competitors from "being in the VR business in a meaningful way at all". In May 2019, Facebook founded Libra Networks, reportedly to develop its own stablecoin cryptocurrency.
Later, it was reported that Libra was being supported by financial companies such as Visa, Mastercard, PayPal and Uber. The consortium of companies was expected to contribute $10 million each to fund the launch of the cryptocurrency, named Libra. Depending on when it received approval from the Swiss Financial Market Supervisory Authority (FINMA) to operate as a payments service, the Libra Association had planned to launch a limited-format cryptocurrency in 2021. Libra was renamed Diem, before being shut down and sold in January 2022 after backlash from Swiss government regulators and the public. During the COVID-19 pandemic, the use of online services, including Facebook, grew globally. Zuckerberg predicted this would be a "permanent acceleration" that would continue after the pandemic. Facebook hired aggressively, growing from 48,268 employees in March 2020 to more than 87,000 by September 2022. Following a period of intense scrutiny and damaging whistleblower leaks, news started to emerge on October 21, 2021 about Facebook's plan to rebrand the company and change its name. In the Q3 2021 earnings call on October 25, Mark Zuckerberg discussed the ongoing criticism of the company's social services and the way it operates, and pointed to the pivot toward building the metaverse, without mentioning the rebranding and the name change. The metaverse vision and the name change from Facebook, Inc. to Meta Platforms was introduced at Facebook Connect on October 28, 2021. Based on Facebook's PR campaign, the name change reflects the company's shifting long-term focus of building the metaverse, a digital extension of the physical world by social media, virtual reality and augmented reality features. "Meta" had been registered as a trademark in the United States in 2018 (after an initial filing in 2015) for marketing, advertising, and computer services, by a Canadian company that provided big data analysis of scientific literature.
This company was acquired in 2017 by the Chan Zuckerberg Initiative (CZI), a foundation established by Zuckerberg and his wife, Priscilla Chan, and became one of their projects. Following the rebranding announcement, CZI announced that it had already decided to deprioritize the earlier Meta project, thus it would be transferring its rights to the name to Meta Platforms, and the previous project would end in 2022. Soon after the rebranding, in early February 2022, Meta reported a greater-than-expected decline in profits in the fourth quarter of 2021. It reported no growth in monthly users, and indicated it expected revenue growth to stall. It also expected measures taken by Apple Inc. to protect user privacy to cost it some $10 billion in advertisement revenue, an amount equal to roughly 8% of its revenue for 2021. In meeting with Meta staff the day after earnings were reported, Zuckerberg blamed competition for user attention, particularly from video-based apps such as TikTok. The 27% reduction in the company's share price which occurred in reaction to the news eliminated some $230 billion of value from Meta's market capitalization. Bloomberg described the decline as "an epic rout that, in its sheer scale, is unlike anything Wall Street or Silicon Valley has ever seen". Zuckerberg's net worth fell by as much as $31 billion. Zuckerberg owns 13% of Meta, and the holding makes up the bulk of his wealth. According to published reports by Bloomberg on March 30, 2022, Meta turned over data such as phone numbers, physical addresses, and IP addresses to hackers posing as law enforcement officials using forged documents. The law enforcement requests sometimes included forged signatures of real or fictional officials. When asked about the allegations, a Meta representative said, "We review every data request for legal sufficiency and use advanced systems and processes to validate law enforcement requests and detect abuse." 
In June 2022, Sheryl Sandberg, the chief operating officer of 14 years, announced she would step down that year. Zuckerberg said that Javier Olivan would replace Sandberg, though in a "more traditional" role. In March 2022, Meta's services (except Meta-owned WhatsApp), including Instagram, were banned in Russia, and the company was added to the Russian list of terrorist and extremist organizations for alleged Russophobia and hate speech (including calls for genocide) amid the ongoing Russian invasion of Ukraine. Meta appealed against the ban, but it was upheld by a Moscow court in June of the same year. Also in March 2022, Meta and Italian eyewear giant Luxottica released Ray-Ban Stories, a series of smartglasses which could play music and take pictures. Meta and Luxottica parent company EssilorLuxottica declined to disclose sales of the product line as of September 2022, though Meta has expressed satisfaction with its customer feedback. In July 2022, Meta saw its first year-on-year revenue decline, with total revenue slipping by 1% to $28.8bn. Analysts and journalists attributed the loss to its advertising business, which has been limited by Apple's App Tracking Transparency feature and the number of people who have opted not to be tracked by Meta apps. Zuckerberg also attributed the decline to increasing competition from TikTok. On October 27, 2022, Meta's market value dropped to $268 billion, a loss of around $700 billion compared to 2021, and its shares fell by 24%. It lost its spot among the top 20 US companies by market cap, despite reaching the top 5 in the previous year. In November 2022, Meta laid off 11,000 employees, 13% of its workforce. Zuckerberg said the decision to aggressively increase Meta's investments had been a mistake, as he had wrongly predicted that the surge in e-commerce would last beyond the COVID-19 pandemic. He also attributed the decline to increased competition, a global economic downturn and "ads signal loss".
In March 2023, Meta announced plans to lay off a further 10,000 employees and close 5,000 open positions to make the company more efficient; the layoffs began in April 2023. They were part of a general downturn in the technology industry, alongside layoffs by companies including Google, Amazon, Tesla, Snap, Twitter and Lyft. Starting from 2022, Meta scrambled to catch up to other tech companies in adopting specialized artificial intelligence hardware and software. It had been using less expensive CPUs instead of GPUs for AI work, but that approach turned out to be less efficient. The company gifted the Inter-university Consortium for Political and Social Research $1.3 million to finance the Social Media Archive's aim to make its data available to social science research. In 2023, Ireland's Data Protection Commissioner imposed a record EUR 1.2 billion fine on Meta for transferring data from Europe to the United States without adequate protections for EU citizens. Meta's revenue surpassed analyst expectations for the first quarter of 2023 after the company announced that it was increasing its focus on AI. On July 6, Meta launched a new app, Threads, a competitor to Twitter. Meta announced its artificial intelligence model Llama 2 in July 2023, available for commercial use via partnerships with major cloud providers like Microsoft. It was the first project to be unveiled out of Meta's generative AI group after it was set up in February. Meta would not charge for access or usage but instead operate with an open-source model, allowing it to ascertain what improvements need to be made. Prior to this announcement, Meta had said it had no plans to release Llama 2 for commercial use. An earlier version of Llama was released to academics.
In August 2023, Meta announced the permanent removal of news content from Facebook and Instagram in Canada in response to the Online News Act, which requires Canadian news outlets to be compensated for content shared on its platforms. The Online News Act came into effect by year-end, but Meta did not participate in the regulatory process. In October 2023, Zuckerberg said that AI would be Meta's biggest investment area in 2024. Meta finished 2023 as one of the best-performing technology stocks of the year, with its share price up 150 percent. Its stock reached an all-time high in January 2024, bringing Meta within 2% of a $1 trillion market capitalization. In November 2023, Meta Platforms launched an ad-free subscription service in Europe, allowing subscribers to opt out of having their personal data collected for targeted advertising. A group of 28 European organizations, including Max Schrems' advocacy group NOYB, the Irish Council for Civil Liberties, Wikimedia Europe, and the Electronic Privacy Information Center, signed a 2024 letter to the European Data Protection Board (EDPB) expressing concern that this subscription model would undermine privacy protections, specifically GDPR data protection standards. Meta removed the Facebook and Instagram accounts of Iran's Supreme Leader Ali Khamenei in February 2024, citing repeated violations of its Dangerous Organizations & Individuals policy. As of March 2024, Meta was under investigation by the FDA over the alleged use of its social media platforms to sell illegal drugs. On 16 May 2024, the European Commission began an investigation into Meta over concerns related to child safety. In May 2023, Iraqi social media influencer Esaa Ahmed-Adnan found that Instagram had removed his posts, citing copyright violations, even though his content was original and free of copyrighted material.
He discovered that extortionists were behind these takedowns, offering to restore his content for $3,000 or provide ongoing protection for $1,000 per month. This scam, which exploited Meta's rights management tools, became widespread in the Middle East, revealing a gap in Meta's enforcement in developing regions. Aws al-Saadi, founder of the Iraqi nonprofit Tech4Peace, helped Ahmed-Adnan and others, but the restoration process was slow, leading to significant financial losses for many victims, including prominent figures such as Ammar al-Hakim. The situation highlighted Meta's challenges in balancing global growth with effective content moderation and protection. On 16 September 2024, Meta announced it had banned Russian state media outlets from its platforms worldwide due to concerns about "foreign interference activity." The decision followed allegations that RT and its employees had funneled $10 million through shell companies to secretly fund influence campaigns on various social media channels. Meta's actions were part of a broader effort to counter Russian covert influence operations, which had intensified since the invasion of Ukraine. At its 2024 Connect conference, Meta presented Orion, its first pair of augmented reality glasses. Though Orion was originally intended to be sold to consumers, the manufacturing process turned out to be too complex and expensive; instead, the company pivoted to producing a small number of the glasses for internal use. On 4 October 2024, Meta announced a new AI model called Movie Gen, capable of generating realistic video and audio clips from user prompts. Meta stated it would not release Movie Gen for open development, preferring to collaborate directly with content creators and integrate it into its products the following year. The model was built using a combination of licensed and publicly available datasets.
On October 31, 2024, ProPublica published an investigation into deceptive political advertisement scams that sometimes use hundreds of hijacked profiles and Facebook pages run by organized networks of scammers. The authors cited spotty enforcement by Meta as a major reason for the extent of the issue. In November 2024, TechCrunch reported that Meta was considering building a $10bn global underwater cable spanning 25,000 miles. In the same month, Meta closed down two million accounts on Facebook and Instagram linked to scam centers in Myanmar, Laos, Cambodia, the Philippines, and the United Arab Emirates that were running pig butchering scams. In December 2024, Meta announced that, beginning February 2025, it would require advertisers running financial-services ads in Australia to verify the identities of the beneficiary and the payer, in a bid to curb scams. On December 4, 2024, Meta announced it would invest US$10 billion in its largest AI data center, in northeast Louisiana, powered by natural gas facilities. On the 11th of that month, Meta experienced a global outage affecting accounts on all of its social media and messaging applications. Outage reports on DownDetector reached 70,000+ and 100,000+ within minutes for Instagram and Facebook, respectively. In January 2025, Meta announced plans to roll back its diversity, equity, and inclusion (DEI) initiatives, citing shifts in the "legal and policy landscape" in the United States following the 2024 presidential election. The decision followed reports that CEO Mark Zuckerberg sought to align the company more closely with the incoming Trump administration, including changes to content moderation policies and executive leadership. The new content moderation policies continued to bar insults about a person's intellect or mental illness, but made an exception allowing users to call LGBTQ people mentally ill on the basis of their sexual orientation or gender identity.
Later that month, Meta agreed to pay $25 million to settle a 2021 lawsuit brought by Donald Trump over the suspension of his social media accounts after the January 6 riots. Changes to Meta's moderation policies were controversial among its oversight board, with a significant divide in opinion between the board's US conservatives and its global members. In June 2025, Meta Platforms Inc. decided to make a multibillion-dollar investment in the artificial intelligence startup Scale AI. The financing could exceed $10 billion in value, which would make it one of the largest private company funding events of all time. In October 2025, it was announced that Meta would lay off 600 employees in its artificial intelligence unit, which executives described as "bloated" and sought to streamline. The layoffs were to affect Meta's AI infrastructure units, its Fundamental Artificial Intelligence Research (FAIR) unit and other product-related positions. Mergers and acquisitions Meta has acquired multiple companies (often identified as talent acquisitions). One of its first major acquisitions came in April 2012, when it acquired Instagram for approximately US$1 billion in cash and stock. In October 2013, Facebook, Inc. acquired Onavo, an Israeli mobile web analytics company. In February 2014, Facebook, Inc. announced it would buy the mobile messaging company WhatsApp for US$19 billion in cash and stock. The acquisition was completed on October 6. Later that year, Facebook bought Oculus VR for $2.3 billion in cash and stock; Oculus released its first consumer virtual reality headset in 2016 and, in late 2022, after Facebook, Inc. rebranded to Meta Platforms, Inc., was rebranded as Meta Quest. In late November 2019, Facebook, Inc. announced the acquisition of the game developer Beat Games, responsible for developing one of that year's most popular VR games, Beat Saber. In May 2020, Facebook, Inc.
announced it had acquired Giphy for a reported cash price of $400 million, to be integrated with the Instagram team. However, in August 2021, the UK's Competition and Markets Authority (CMA) stated that Facebook, Inc. might have to sell Giphy, after an investigation found that the deal between the two companies would harm competition in the display advertising market. Facebook, Inc. was fined $70 million by the CMA for deliberately failing to report all information regarding the acquisition and the ongoing antitrust investigation. In October 2022, the CMA ruled for a second time that Meta be required to divest Giphy, stating that Meta already controlled half of the display advertising market in the UK. Meta agreed to the sale, though it said it disagreed with the decision itself. In May 2023, Giphy was divested to Shutterstock for $53 million. In November 2020, Facebook, Inc. announced that it planned to purchase the customer-service platform and chatbot startup Kustomer to encourage companies to use its platform for business. Kustomer was reportedly valued at slightly over $1 billion. The deal was closed in February 2022 after regulatory approval. In September 2022, Meta acquired Lofelt, a Berlin-based haptic tech startup. In December 2025, it was announced that Meta had acquired the AI-wearables startup Limitless. In the same month, it also acquired another AI startup, Manus AI, for $2 billion. Manus announced in December that its platform had reached $100 million in recurring revenue just eight months after its launch, and Meta said it would scale the platform to many other businesses. In January 2026, it was announced that Meta's proposed acquisition of Manus was undergoing preliminary scrutiny by Chinese regulators. The examination concerns the cross-border transfer of artificial intelligence technology developed in China. Lobbying In 2020, Facebook, Inc. spent $19.7 million on lobbying, hiring 79 lobbyists.
In 2019, it had spent $16.7 million on lobbying with a team of 71 lobbyists, up from $12.6 million and 51 lobbyists in 2018. Facebook was the largest lobbying spender among the Big Tech companies in 2020. The lobbying team includes top congressional aide John Branscome, hired in September 2021 to help the company fend off threats from Democratic lawmakers and the Biden administration. In December 2024, Meta donated $1 million to the inauguration fund of then-President-elect Donald Trump. In 2025, Meta was listed among the donors funding the construction of the White House State Ballroom. Partnerships In February 2026, Meta announced a long-term partnership with Nvidia. Censorship In August 2024, Mark Zuckerberg sent a letter to Jim Jordan indicating that during the COVID-19 pandemic the Biden administration had repeatedly asked Meta to limit certain COVID-19 content, including humor and satire, on Facebook and Instagram. In 2016, Meta hired Jordana Cutler, formerly an employee at the Israeli Embassy to the United States, as its policy chief for Israel and the Jewish Diaspora. In this role, Cutler pushed for the censorship of accounts belonging to Students for Justice in Palestine chapters in the United States. Critics have said that Cutler's position gives the Israeli government undue influence over Meta policy, and that few countries have such high levels of contact with Meta policymakers. Following Donald Trump's return to office in 2025, various sources noted possible censorship related to the Democratic Party on Instagram and other Meta platforms. In February 2025, a Meta representative flagged journalist Gil Duran's article and other "critiques of tech industry figures" as spam or sensitive content, limiting their reach. In March 2025, Meta attempted to block former employee Sarah Wynn-Williams from promoting or further distributing her memoir, Careless People, which includes allegations of unaddressed sexual harassment in the workplace by senior executives.
The New York Times reported that the arbitration was among Meta's most forceful attempts to suppress a former employee's account of workplace dynamics. Publisher Macmillan responded to the ruling by the Emergency International Arbitral Tribunal by stating that it would ignore its provisions. As of 15 March 2025, hardback and digital versions of Careless People were being offered for sale by major online retailers. From October 2025, Meta began removing and restricting access to accounts and pages related to LGBTQ issues, reproductive health and abortion information on its platforms. Martha Dimitratou, executive director of Repro Uncensored, called Meta's shadow-banning of these issues "one of the biggest waves of censorship we are seeing". Disinformation concerns Since its inception, Meta has been accused of hosting fake news and misinformation. In the wake of the 2016 United States presidential election, Zuckerberg began taking steps to reduce the prevalence of fake news, as the platform had been criticized for its potential influence on the outcome of the election. The company initially partnered with ABC News, the Associated Press, FactCheck.org, Snopes and PolitiFact for its fact-checking initiative; as of 2018, it had over 40 fact-checking partners around the world, including The Weekly Standard. A May 2017 review by The Guardian found that the platform's fact-checking initiatives of partnering with third-party fact-checkers and publicly flagging fake news were regularly ineffective, and appeared to be having minimal impact in some cases. In 2018, journalists working as fact-checkers for the company criticized the partnership, stating that it had produced minimal results and that the company had ignored their concerns. In 2024, Meta's decision to continue disseminating a falsified video of US president Joe Biden, even after it had been proven fake, attracted criticism and concern.
In January 2025, Meta ended its use of third-party fact-checkers in favor of a user-run community notes system similar to the one used on X. While Zuckerberg supported these changes, saying that the amount of censorship on the platform had been excessive, the decision drew criticism from fact-checking organizations, which argued that the changes would make it more difficult for users to identify misinformation. Meta also faced criticism for weakening hate speech policies that had been designed to protect minorities and LGBTQ+ individuals from bullying and discrimination. While moving its content review teams from California to Texas, Meta changed its hateful conduct policy to eliminate restrictions on anti-LGBT and anti-immigrant hate speech, explicitly allowing users to accuse LGBT people of being mentally ill or abnormal on the basis of their sexual orientation or gender identity. That same month, Meta faced significant criticism for removing LGBTQ+ content from its platforms. The removal of LGBTQ+ themes was noted as part of a wider crackdown on content deemed to violate its community guidelines. Meta's content moderation policies, which were designed to combat harmful speech and protect users from discrimination, inadvertently led to the removal or restriction of LGBTQ+ content, particularly posts highlighting LGBTQ+ identities, support, or political issues. According to reports, LGBTQ+ posts, including those that simply celebrated pride or advocated for LGBTQ+ rights, were flagged and removed for reasons that critics argued were vague or inconsistently applied. Many LGBTQ+ activists and users on Meta's platforms expressed concern that such actions stifled visibility and expression, potentially isolating LGBTQ+ individuals and communities, especially in spaces that had historically been important for outreach and support.
Lawsuits Numerous lawsuits have been filed against the company, both when it was known as Facebook, Inc., and as Meta Platforms. In March 2020, the Office of the Australian Information Commissioner (OAIC) sued Facebook for serious and repeated breaches of privacy law in connection with the Cambridge Analytica scandal. Each violation of the Privacy Act carries a theoretical maximum penalty of $1.7 million. The OAIC estimated that a total of 311,127 Australians had been exposed. On December 8, 2020, the U.S. Federal Trade Commission, 46 states (all except Alabama, Georgia, South Carolina, and South Dakota), the District of Columbia and the territory of Guam launched Federal Trade Commission v. Facebook, an antitrust lawsuit against Facebook. The lawsuit concerns Facebook's acquisition of two competitors, Instagram and WhatsApp, and the ensuing monopolistic situation. The FTC alleged that Facebook held monopolistic power in the U.S. social networking market and sought to force the company to divest Instagram and WhatsApp to break up the conglomerate. William Kovacic, a former chairman of the Federal Trade Commission, argued the case would be difficult to win, as it would require the government to construct a counterfactual of an internet in which the Facebook-WhatsApp-Instagram entity did not exist, and prove that the combination harmed competition or consumers. In November 2025, a court ruled that Meta had not violated antitrust laws and held no monopoly in the market. On December 24, 2021, a court in Russia fined Meta $27 million after the company declined to remove unspecified banned content. The fine was reportedly tied to the company's annual revenue in the country. In May 2022, a lawsuit was filed in Kenya against Meta and its local outsourcing company Sama, alleging poor working conditions in Kenya for workers moderating Facebook posts. According to the lawsuit, 260 screeners were declared redundant with unclear reasoning.
The lawsuit seeks financial compensation and an order that outsourced moderators be given the same health benefits and pay scale as Meta employees. In June 2022, eight lawsuits were filed across the U.S. alleging that excessive exposure to platforms including Facebook and Instagram had led to attempted or actual suicides, eating disorders and sleeplessness, among other issues. The litigation followed a former Facebook employee's testimony in Congress that the company refused to take responsibility. The company noted that it had developed tools for parents to keep track of their children's activity on Instagram and set time limits, in addition to Meta's "Take a break" reminders. The company also said it was providing resources specific to eating disorders and developing AI to prevent children under the age of 13 from signing up for Facebook or Instagram. In June 2022, Meta settled a lawsuit with the US Department of Justice. The lawsuit, filed in 2019, alleged that the company enabled housing discrimination through targeted advertising, as it allowed homeowners and landlords to run housing ads excluding people based on sex, race, religion, and other characteristics. The U.S. Department of Justice stated that this was in violation of the Fair Housing Act. Meta was handed a penalty of $115,054 and given until December 31, 2022, to stop using the advertising tool in question. In January 2023, Meta was fined €390 million for violations of the European Union's General Data Protection Regulation. In May 2023, following a binding decision of the European Data Protection Board, Meta was fined a record €1.2 billion for breaching European Union data privacy laws by transferring the personal data of Facebook users to servers in the U.S.
In July 2024, Meta agreed to pay the state of Texas US$1.4 billion to settle a lawsuit brought by Texas Attorney General Ken Paxton accusing the company of collecting users' biometric data without consent, setting a record for the largest privacy-related settlement ever obtained by a state attorney general. In October 2024, Meta Platforms faced lawsuits in Japan from 30 plaintiffs who claimed they were defrauded by fake investment ads on Facebook and Instagram featuring false celebrity endorsements. The plaintiffs are seeking approximately $2.8 million in damages. In April 2025, the Kenyan High Court ruled that a US$2.4 billion lawsuit, in which three plaintiffs claim that Facebook inflamed civil violence in Ethiopia in 2021, could proceed. That same month, Meta was fined €200 million ($230 million) for breaking the Digital Markets Act by imposing a "consent or pay" system that forced users either to allow their personal data to be used to target advertisements or to pay a subscription fee for advertising-free versions of Facebook and Instagram. In late April 2025, a case was filed against Meta in Ghana over the alleged psychological distress experienced by content moderators employed to take down disturbing social media content, including depictions of murders, extreme violence and child sexual abuse. Meta had moved the moderation service to the Ghanaian capital of Accra after legal issues in the previous location, Kenya. The new moderation contractor is Teleperformance, a multinational corporation with a history of workers' rights violations. Reports suggest that conditions there are worse than at the previous Kenyan location, with many workers afraid to speak out for fear of being returned to conflict zones. Workers reported developing mental illnesses, attempted suicides, and low pay.
On 26 January 2026, a case was filed in a New Mexico state court alleging that Mark Zuckerberg had approved allowing minors to access artificial intelligence chatbot companions that safety staffers warned were capable of sexual interactions. In 2020, the company UReputation, which had been involved in several cases concerning the management of digital armies, filed a lawsuit against Facebook, accusing it of unlawfully transmitting personal data to third parties. Legal actions were initiated in Tunisia, France, and the United States. In 2025, the United States District Court for the Northern District of Georgia approved a discovery procedure, allowing UReputation to access documents and evidence held by Meta. Structure As of October 2022, Meta had 83,553 employees worldwide. Meta Platforms is mainly owned by institutional investors, who hold around 80% of all shares, while insiders control the majority of voting shares. The three largest individual investors in 2024 were Mark Zuckerberg, Sheryl Sandberg and Christopher K. Cox. Roger McNamee, an early Facebook investor and Zuckerberg's former mentor, said Facebook had "the most centralized decision-making structure I have ever encountered in a large company". Facebook co-founder Chris Hughes has stated that chief executive officer Mark Zuckerberg has too much power, that the company is now a monopoly, and that, as a result, it should be split into multiple smaller companies. In an op-ed in The New York Times, Hughes said he was concerned that Zuckerberg had surrounded himself with a team that did not challenge him, and that it is the U.S. government's job to hold him accountable and curb his "unchecked power". He also said that "Mark's power is unprecedented and un-American." Several U.S. politicians agreed with Hughes.
European Union Commissioner for Competition Margrethe Vestager stated that splitting Facebook should be done only as "a remedy of the very last resort", and that it would not solve Facebook's underlying problems. Revenue Facebook ranked No. 34 in the 2020 Fortune 500 list of the largest United States corporations by revenue, with almost $86 billion in revenue, most of it coming from advertising. One analysis of 2017 data determined that the company earned US$20.21 per user from advertising. According to New York, since its rebranding, Meta has reportedly lost $500 billion as a result of new privacy measures put in place by companies such as Apple and Google, which prevent Meta from gathering users' data. In February 2015, Facebook announced it had reached two million active advertisers, with most of the gain coming from small businesses. An active advertiser was defined as an entity that had advertised on the Facebook platform in the previous 28 days. In March 2016, Facebook announced it had reached three million active advertisers, more than 70% of them from outside the United States. Prices for advertising follow a variable pricing model based on the auctioning of ad placements and the potential engagement levels of the advertisement itself. As with other online advertising platforms such as Google and Twitter, the targeting of advertisements is one of the chief merits of digital advertising compared to traditional media. Marketing on Meta platforms draws on the viewing habits, likes and shares, and purchasing data of the audience, and uses two audience types: targeted audiences and "lookalike" audiences. The U.S. IRS challenged the valuation Facebook used when it transferred IP from the U.S. to Facebook Ireland (now Meta Platforms Ireland) in 2010 (which Facebook Ireland then revalued higher before charging out), as it was building its double Irish tax structure. The case is ongoing, and Meta faces a potential fine of $3–5bn. The U.S.
Tax Cuts and Jobs Act of 2017 changed Facebook's global tax calculations. Meta Platforms Ireland is subject to the U.S. GILTI tax of 10.5% on global intangible profits (i.e. Irish profits). On the basis that Meta Platforms Ireland Limited pays some Irish tax, the effective minimum US tax for Facebook Ireland will be circa 11%. In contrast, Meta Platforms Inc. would incur a special IP tax rate of 13.125% (the FDII rate) if its Irish business relocated to the U.S. Tax relief in the U.S. (21% vs. Irish profits at the GILTI rate) and accelerated capital expensing would make this effective U.S. rate around 12%. The insignificance of the U.S./Irish tax difference was demonstrated when Facebook moved 1.5bn non-EU accounts to the U.S. to limit exposure to GDPR. Facilities Users outside of the U.S. and Canada contract with Meta's Irish subsidiary, Meta Platforms Ireland Limited (formerly Facebook Ireland Limited), allowing Meta to avoid US taxes for all users in Europe, Asia, Australia, Africa and South America. Meta makes use of the Double Irish arrangement, which allows it to pay 2–3% corporation tax on all international revenue. In 2010, Facebook opened its fourth office, in Hyderabad, India, which houses online advertising and developer support teams and provides support to users and advertisers. In India, Meta is registered as Facebook India Online Services Pvt Ltd. It also has offices or planned sites in Chittagong, Bangladesh; Dublin, Ireland; and Austin, Texas, among other cities. Facebook opened its London headquarters in 2017 in Fitzrovia in central London. Facebook opened an office in Cambridge, Massachusetts, in 2018. The offices were initially home to the "Connectivity Lab", a group focused on bringing Internet access to those who do not have it. In April 2019, Facebook opened its Taiwan headquarters in Taipei. In March 2022, Meta opened new regional headquarters in Dubai.
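The rate comparison above can be sketched numerically. The statutory rates (10.5% GILTI, 13.125% FDII) are taken from the text; the 80% foreign-tax-credit factor under GILTI and the sample Irish tax rate are assumptions used only to illustrate how paying "some" Irish tax pushes the combined burden from 10.5% toward the "circa 11%" figure quoted, and are not figures from Meta's actual filings.

```python
# Illustrative sketch of the GILTI vs. FDII arithmetic described above.
# Statutory rates come from the text; the 80% foreign tax credit and the
# sample Irish rate are simplifying assumptions for illustration only.

GILTI_RATE = 0.105    # U.S. tax on global intangible low-taxed income
FDII_RATE = 0.13125   # U.S. rate on foreign-derived intangible income

def combined_gilti_burden(irish_rate: float) -> float:
    """Rough combined Irish + residual U.S. GILTI rate.

    GILTI generally credits 80% of foreign tax paid, so the residual
    U.S. charge is the GILTI rate less 80% of the Irish rate (floored
    at zero), paid on top of the Irish tax itself.
    """
    residual_us = max(GILTI_RATE - 0.8 * irish_rate, 0.0)
    return irish_rate + residual_us

# With no Irish tax the floor is the bare 10.5% GILTI rate; a small
# amount of Irish tax nudges the combined burden toward ~11%.
print(f"{combined_gilti_burden(0.0):.4f}")     # 0.1050
print(f"{combined_gilti_burden(0.0125):.4f}")  # 0.1075
print(f"FDII rate if relocated to the U.S.: {FDII_RATE:.5f}")
```

The point of the sketch is only that the gap between the combined Irish-resident burden and the 13.125% FDII rate is small, which is the "insignificance of the U.S./Irish tax difference" the paragraph refers to.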
In September 2023, it was reported that Meta had paid £149m to British Land to break the lease on its Triton Square office in London, with some 18 years reportedly left on the lease. As of 2023, Facebook operated 21 data centers. It committed to purchasing 100% renewable energy and to reducing its greenhouse gas emissions by 75% by 2020. Its data center technologies include Fabric Aggregator, a distributed network system that accommodates larger regions and varied traffic patterns. Reception US Representative Alexandria Ocasio-Cortez responded in a tweet to Zuckerberg's announcement about Meta, saying: "Meta as in 'we are a cancer to democracy metastasizing into a global surveillance and propaganda machine for boosting authoritarian regimes and destroying civil society ... for profit!'" Former Facebook employee Frances Haugen, the whistleblower behind the Facebook Papers, responded to the rebranding by expressing doubts about the company's ability to improve while led by Mark Zuckerberg, and urged the chief executive officer to resign. In November 2021, a video published by Inspired by Iceland went viral, in which a Zuckerberg look-alike promoted the Icelandverse, a place of "enhanced actual reality without silly looking headsets". In a December 2021 interview, SpaceX and Tesla chief executive officer Elon Musk said he could not see a compelling use-case for the VR-driven metaverse, adding: "I don't see someone strapping a frigging screen to their face all day." In January 2022, Louise Eccles of The Sunday Times logged into the metaverse with the intention of making a video guide. She wrote: Initially, my experience with the Oculus went well. I attended work meetings as an avatar and tried an exercise class set in the streets of Paris. The headset enabled me to feel the thrill of carving down mountains on a snowboard and the adrenaline rush of climbing a mountain without ropes.
Yet switching to the social apps, where you mingle with strangers also using VR headsets, it was at times predatory and vile. Eccles described being sexually harassed by another user, as well as hearing "accents from all over the world, American, Indian, English, Australian, using racist, sexist, homophobic and transphobic language". She also encountered users as young as seven years old on the platform, despite Oculus headsets being intended for users over 13.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Elon_Musk#cite_note-461] | [TOKENS: 10515] |
Contents Elon Musk Elon Reeve Musk (/ˈiːlɒn/ EE-lon; born June 28, 1971) is a businessman and entrepreneur known for his leadership of Tesla, SpaceX, Twitter, and xAI. Musk has been the wealthiest person in the world since 2025; as of February 2026, Forbes estimates his net worth to be around US$852 billion. Born into a wealthy family in Pretoria, South Africa, Musk emigrated to Canada in 1989; he holds Canadian citizenship through his Canadian-born mother. He received bachelor's degrees in 1997 from the University of Pennsylvania before moving to California to pursue business ventures. In 1995, Musk co-founded the software company Zip2. Following its sale in 1999, he co-founded X.com, an online payment company that later merged to form PayPal, which was acquired by eBay in 2002. Musk became an American citizen that same year. In 2002, Musk founded the space technology company SpaceX, becoming its CEO and chief engineer; the company has since led innovations in reusable rockets and commercial spaceflight. Musk joined the automaker Tesla as an early investor in 2004 and became its CEO and product architect in 2008; it has since become a leader in electric vehicles. In 2015, he co-founded OpenAI to advance artificial intelligence (AI) research, but later left; growing discontent with the organization's direction and its leadership in the AI boom of the 2020s led him to establish xAI, which became a subsidiary of SpaceX in 2026. In 2022, he acquired the social network Twitter, implementing significant changes and rebranding it as X in 2023. His other businesses include the neurotechnology company Neuralink, which he co-founded in 2016, and the tunneling company the Boring Company, which he founded in 2017. In November 2025, a Tesla pay package worth $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Musk was the largest donor in the 2024 U.S. presidential election, where he supported Donald Trump.
After Trump was inaugurated as president in early 2025, Musk served as Senior Advisor to the President and as the de facto head of the Department of Government Efficiency (DOGE). After a public feud with Trump, Musk left the Trump administration and returned to managing his companies. Musk is a supporter of global far-right figures, causes, and political parties. His political activities, views, and statements have made him a polarizing figure. Musk has been criticized for COVID-19 misinformation, promoting conspiracy theories, and affirming antisemitic, racist, and transphobic comments. His acquisition of Twitter was controversial due to a subsequent increase in hate speech and the spread of misinformation on the service, following his pledge to decrease censorship. His role in the second Trump administration attracted public backlash, particularly in response to DOGE. The emails he sent to Jeffrey Epstein are included in the Epstein files, which were published in 2025 and 2026 and became a topic of worldwide debate. Early life Elon Reeve Musk was born on June 28, 1971, in Pretoria, South Africa's administrative capital. He is of British and Pennsylvania Dutch ancestry. His mother, Maye (née Haldeman), is a model and dietitian born in Saskatchewan, Canada, and raised in South Africa. Musk therefore holds both South African and Canadian citizenship from birth. His father, Errol Musk, is a South African electromechanical engineer, pilot, sailor, consultant, emerald dealer, and property developer, who partly owned a rental lodge at Timbavati Private Nature Reserve. His maternal grandfather, Joshua N. Haldeman, who died in a plane crash when Elon was a toddler, was an American-born Canadian chiropractor, aviator and political activist in the technocracy movement who moved to South Africa in 1950. Elon has a younger brother, Kimbal, a younger sister, Tosca, and four paternal half-siblings. Musk was baptized as a child in the Anglican Church of Southern Africa.
Although both Elon and Errol had previously stated that Errol was a part owner of a Zambian emerald mine, in 2023 Errol recounted that the deal he made was to receive "a portion of the emeralds produced at three small mines". Errol was elected to the Pretoria City Council as a representative of the anti-apartheid Progressive Party and has said that his children shared their father's dislike of apartheid. After his parents divorced in 1979, Elon, aged around 9, chose to live with his father because Errol had an Encyclopædia Britannica and a computer. Elon later regretted his decision and became estranged from his father. Elon has recounted trips to a wilderness school that he described as a "paramilitary Lord of the Flies" where "bullying was a virtue" and children were encouraged to fight over rations. In one incident, after an altercation with a fellow pupil, Elon was thrown down concrete steps and beaten so severely that he was hospitalized for his injuries. Elon described his father berating him after he was discharged from the hospital. Errol denied berating Elon and claimed, "The [other] boy had just lost his father to suicide, and Elon had called him stupid. Elon had a tendency to call people stupid. How could I possibly blame that child?" Elon was an enthusiastic reader of books and has attributed his success in part to having read The Lord of the Rings, the Foundation series, and The Hitchhiker's Guide to the Galaxy. At age ten, he developed an interest in computing and video games, teaching himself how to program from the VIC-20 user manual. At age twelve, Elon sold his BASIC-based game Blastar to PC and Office Technology magazine for approximately $500 (equivalent to $1,600 in 2025). Musk attended Waterkloof House Preparatory School, Bryanston High School, and then Pretoria Boys High School, where he graduated. Musk was a decent but unexceptional student, earning a 61/100 in Afrikaans and a B on his senior math certification.
Musk applied for a Canadian passport through his Canadian-born mother to avoid South Africa's mandatory military service, which would have forced him to participate in the apartheid regime, and to ease his path to immigration to the United States. While waiting for his application to be processed, he attended the University of Pretoria for five months. Musk arrived in Canada in June 1989, connected with a second cousin in Saskatchewan, and worked odd jobs, including at a farm and a lumber mill. In 1990, he entered Queen's University in Kingston, Ontario. Two years later, he transferred to the University of Pennsylvania, where he studied until 1995. Although Musk has said that he earned his degrees in 1995, the University of Pennsylvania did not award them until 1997 – a Bachelor of Arts in physics and a Bachelor of Science in economics from the university's Wharton School. He reportedly hosted large, ticketed house parties to help pay for tuition, and wrote a business plan for an electronic book-scanning service similar to Google Books. In 1994, Musk held two internships in Silicon Valley: one at energy storage startup Pinnacle Research Institute, which investigated electrolytic supercapacitors for energy storage, and another at Palo Alto–based startup Rocket Science Games. In 1995, he was accepted to a graduate program in materials science at Stanford University, but did not enroll. Musk decided to join the Internet boom of the 1990s, applying for a job at Netscape but reportedly never receiving a response. The Washington Post reported that Musk lacked legal authorization to remain and work in the United States after failing to enroll at Stanford. In response, Musk said he was allowed to work at that time and that his student visa transitioned to an H-1B. According to numerous former business associates and shareholders, however, Musk said at the time that he was on a student visa.
Business career In 1995, Musk, his brother Kimbal, and Greg Kouri founded the web software company Zip2 with funding from a group of angel investors. They housed the venture at a small rented office in Palo Alto. Speaking to Rolling Stone, Musk rejected the notion that they had started the company with funds borrowed from Errol Musk, though in a tweet he acknowledged that his father contributed 10% of a later funding round. The company developed and marketed an Internet city guide for the newspaper publishing industry, with maps, directions, and yellow pages. According to Musk, "The website was up during the day and I was coding it at night, seven days a week, all the time." To impress investors, Musk built a large plastic structure around a standard computer to create the impression that Zip2 was powered by a small supercomputer. The Musk brothers obtained contracts with The New York Times and the Chicago Tribune, and persuaded the board of directors to abandon plans for a merger with CitySearch. Musk's attempts to become CEO were thwarted by the board. Compaq acquired Zip2 for $307 million in cash in February 1999 (equivalent to $590,000,000 in 2025), and Musk received $22 million (equivalent to $43,000,000 in 2025) for his 7-percent share. In 1999, Musk co-founded X.com, an online financial services and e-mail payment company. The startup was one of the first federally insured online banks, and, in its initial months of operation, over 200,000 customers joined the service. The company's investors regarded Musk as inexperienced and replaced him with Intuit CEO Bill Harris by the end of the year. The following year, X.com merged with online bank Confinity to avoid competition. Founded by Max Levchin and Peter Thiel, Confinity had its own money-transfer service, PayPal, which was more popular than X.com's service. Within the merged company, Musk returned as CEO. Musk's preference for Microsoft software over Unix created a rift in the company and caused Thiel to resign.
Due to resulting technological issues and the lack of a cohesive business model, the board ousted Musk and replaced him with Thiel in 2000. Under Thiel, the company focused on the PayPal service and was renamed PayPal in 2001. In 2002, PayPal was acquired by eBay for $1.5 billion (equivalent to $2,700,000,000 in 2025) in stock, of which Musk, the largest shareholder with 11.72% of shares, received $175.8 million (equivalent to $320,000,000 in 2025). In 2017, Musk purchased the domain X.com from PayPal for an undisclosed amount, stating that it had sentimental value. In 2001, Musk became involved with the nonprofit Mars Society and discussed funding plans to place a growth chamber for plants on Mars. Seeking a way to launch the greenhouse payloads into space, Musk made two unsuccessful trips to Moscow to purchase intercontinental ballistic missiles (ICBMs) from the Russian companies NPO Lavochkin and Kosmotras. Musk instead decided to start a company to build affordable rockets. With $100 million of his early fortune (equivalent to $180,000,000 in 2025), Musk founded SpaceX in May 2002 and became the company's CEO and chief engineer. SpaceX attempted its first launch of the Falcon 1 rocket in 2006. Although the rocket failed to reach Earth orbit, SpaceX was awarded a Commercial Orbital Transportation Services program contract from NASA, then led by Mike Griffin. After two more failed attempts that nearly caused Musk to go bankrupt, SpaceX succeeded in launching the Falcon 1 into orbit in 2008. Later that year, SpaceX received a $1.6 billion NASA contract (equivalent to $2,400,000,000 in 2025) for Falcon 9-launched Dragon spacecraft flights to the International Space Station (ISS), replacing the Space Shuttle after its 2011 retirement. In 2012, the Dragon vehicle docked with the ISS, a first for a commercial spacecraft. Working towards its goal of reusable rockets, in 2015 SpaceX successfully landed the first stage of a Falcon 9 on a land platform.
Later landings were achieved on autonomous spaceport drone ships, an ocean-based recovery platform. In 2018, SpaceX launched the Falcon Heavy; the inaugural mission carried Musk's personal Tesla Roadster as a dummy payload. Since 2019, SpaceX has been developing Starship, a reusable, super heavy-lift launch vehicle intended to replace the Falcon 9 and Falcon Heavy. In 2020, SpaceX launched its first crewed flight, Demo-2, becoming the first private company to place astronauts into orbit and dock a crewed spacecraft with the ISS. In 2024, NASA awarded SpaceX an $843 million (equivalent to $865,000,000 in 2025) contract to build a spacecraft that NASA will use to deorbit the ISS at the end of its lifespan. In 2015, SpaceX began development of the Starlink constellation of low Earth orbit satellites to provide satellite Internet access. After the launch of prototype satellites in 2018, the first large constellation was deployed in May 2019. As of May 2025, over 7,600 Starlink satellites are operational, comprising 65% of all operational Earth satellites. The total cost of the decade-long project to design, build, and deploy the constellation was estimated by SpaceX in 2020 to be $10 billion (equivalent to $12,000,000,000 in 2025). During the Russian invasion of Ukraine, Musk provided free Starlink service to Ukraine, permitting Internet access and communication at a yearly cost to SpaceX of $400 million (equivalent to $440,000,000 in 2025). However, Musk refused to block Russian state media on Starlink. In 2023, Musk denied Ukraine's request to activate Starlink over Crimea to aid an attack against the Russian navy, citing fears of a nuclear response. Tesla, Inc., originally Tesla Motors, was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning. Both men played active roles in the company's early development prior to Musk's involvement.
Musk led the Series A round of investment in February 2004; he invested $6.35 million (equivalent to $11,000,000 in 2025), became the majority shareholder, and joined Tesla's board of directors as chairman. Musk took an active role within the company and oversaw Roadster product design, but was not deeply involved in day-to-day business operations. Following a series of escalating conflicts in 2007 and the 2008 financial crisis, Eberhard was ousted from the firm. Musk assumed leadership of the company as CEO and product architect in 2008. A 2009 lawsuit settlement with Eberhard designated Musk as a Tesla co-founder, along with Tarpenning and two others. Tesla began delivery of the Roadster, an electric sports car, in 2008. With sales of about 2,500 vehicles, it was the first mass-produced all-electric car to use lithium-ion battery cells. Under Musk, Tesla has since launched several strong-selling electric vehicles, including the four-door sedan Model S (2012), the crossover Model X (2015), the mass-market sedan Model 3 (2017), the crossover Model Y (2020), and the pickup truck Cybertruck (2023). In 2018, Musk resigned as chairman of the board as part of the settlement of a lawsuit from the SEC over his tweet that funding had been "secured" for potentially taking Tesla private. The company has also constructed multiple lithium-ion battery and electric vehicle factories, called Gigafactories. Since its initial public offering in 2010, Tesla stock has risen significantly; it became the most valuable carmaker in summer 2020, and it entered the S&P 500 later that year. In October 2021, it reached a market capitalization of $1 trillion (equivalent to $1,200,000,000,000 in 2025), the sixth company in U.S. history to do so. Musk provided the initial concept and financial capital for SolarCity, which his cousins Lyndon and Peter Rive founded in 2006. By 2013, SolarCity was the second largest provider of solar power systems in the United States.
In 2014, Musk promoted the idea of SolarCity building an advanced production facility in Buffalo, New York, triple the size of the largest solar plant in the United States. Construction of the factory started in 2014 and was completed in 2017. It operated as a joint venture with Panasonic until early 2020. Tesla acquired SolarCity for $2 billion in 2016 (equivalent to $2,700,000,000 in 2025) and merged it with its battery unit to create Tesla Energy. The deal's announcement resulted in a more than 10% drop in Tesla's stock price; at the time, SolarCity was facing liquidity issues. Multiple shareholder groups filed a lawsuit against Musk and Tesla's directors, stating that the purchase of SolarCity was done solely to benefit Musk and came at the expense of Tesla and its shareholders. Tesla directors settled the lawsuit in January 2020, leaving Musk the sole remaining defendant. Two years later, the court ruled in Musk's favor. In 2016, Musk co-founded Neuralink, a neurotechnology startup, with an investment of $100 million. Neuralink aims to integrate the human brain with artificial intelligence (AI) by creating devices that are embedded in the brain. Such technology could enhance memory or allow the devices to communicate with software. The company also hopes to develop devices to treat neurological conditions like spinal cord injuries. In 2022, Neuralink announced that clinical trials would begin by the end of the year. In September 2023, the Food and Drug Administration approved Neuralink to initiate six-year human trials. Neuralink has conducted animal testing on macaques at the University of California, Davis. In 2021, the company released a video in which a macaque played the video game Pong via a Neuralink implant. The company's animal trials—which have caused the deaths of some monkeys—have led to claims of animal cruelty. The Physicians Committee for Responsible Medicine has alleged that Neuralink violated the Animal Welfare Act. 
Employees have complained that pressure from Musk to accelerate development has led to botched experiments and unnecessary animal deaths. In 2022, a federal probe was launched into possible animal welfare violations by Neuralink. In 2017, Musk founded the Boring Company to construct tunnels; he also revealed plans for specialized, underground, high-occupancy vehicles that could travel up to 150 miles per hour (240 km/h) and thus circumvent above-ground traffic in major cities. Early in 2017, the company began discussions with regulatory bodies and initiated construction of a 30-foot (9.1 m) wide, 50-foot (15 m) long, and 15-foot (4.6 m) deep "test trench" on the premises of SpaceX's offices, as that required no permits. The Los Angeles tunnel, less than two miles (3.2 km) in length, debuted to journalists in 2018. It used Tesla Model Xs and was reported to be a rough ride while traveling at suboptimal speeds. Two tunnel projects announced in 2018, in Chicago and West Los Angeles, have been canceled. A tunnel beneath the Las Vegas Convention Center was completed in early 2021. Local officials have approved further expansions of the tunnel system. In early 2017, Musk expressed interest in buying Twitter and had questioned the platform's commitment to freedom of speech. By 2022, Musk had acquired a 9.2% stake in the company, making him the largest shareholder. Musk later agreed to a deal that would appoint him to Twitter's board of directors and prohibit him from acquiring more than 14.9% of the company. Days later, Musk made a $43 billion offer to buy Twitter. By the end of April, Musk had successfully concluded his bid for approximately $44 billion, which included approximately $12.5 billion in loans and $21 billion in equity financing. After attempting to back out of the deal, Musk completed the purchase on October 27, 2022.
Immediately after the acquisition, Musk fired several top Twitter executives, including CEO Parag Agrawal, and became CEO himself. Under Musk, Twitter instituted monthly subscriptions for a "blue check" and laid off a significant portion of the company's staff. Musk reduced content moderation, and hate speech increased on the platform after his takeover. In late 2022, Musk released internal documents relating to Twitter's moderation of the Hunter Biden laptop controversy in the lead-up to the 2020 presidential election. Musk also promised to step down as CEO after a Twitter poll, and five months later he did so, transitioning to the roles of executive chairman and chief technology officer (CTO). Despite Musk stepping down as CEO, X continues to struggle with challenges such as viral misinformation, hate speech, and antisemitism controversies. Musk has been accused of trying to silence critics such as Twitch streamer Asmongold, who criticized him during one of his streams, by removing their accounts' blue checkmarks, which hinders visibility and is considered a form of shadow banning, or by suspending their accounts without justification. Other activities In August 2013, Musk announced plans for a version of a vactrain and assigned engineers from SpaceX and Tesla to design a transport system between Greater Los Angeles and the San Francisco Bay Area, at an estimated cost of $6 billion. Later that year, Musk unveiled the concept, dubbed the Hyperloop, intended to make travel cheaper than any other mode of transport for such long distances. In December 2015, Musk co-founded OpenAI, a not-for-profit artificial intelligence (AI) research company aiming to develop artificial general intelligence intended to be safe and beneficial to humanity. Musk pledged $1 billion of funding to the company, and initially gave $50 million. In 2018, Musk left the OpenAI board.
Since 2018, OpenAI has made significant advances in machine learning. In July 2023, Musk launched the artificial intelligence company xAI, which aims to develop a generative AI program that competes with existing offerings like OpenAI's ChatGPT. Musk obtained funding from investors in SpaceX and Tesla, and xAI hired engineers from Google and OpenAI. Musk uses a private jet owned by Falcon Landing LLC, a SpaceX-linked company, and acquired a second jet in August 2020. His heavy use of the jets and the consequent fossil fuel usage have received criticism. Musk's flight usage is tracked on social media through ElonJet. In December 2022, Musk banned the ElonJet account on Twitter and temporarily banned the accounts of journalists who posted stories about the incident, including Donie O'Sullivan, Keith Olbermann, and journalists from The New York Times, The Washington Post, CNN, and The Intercept. In October 2025, Musk's company xAI launched Grokipedia, an AI-generated online encyclopedia that he promoted as an alternative to Wikipedia. Articles on Grokipedia are generated and reviewed by xAI's Grok chatbot. Media coverage and academic analysis described Grokipedia as frequently reusing Wikipedia content while framing contested political and social topics in line with Musk's own views and right-wing narratives. A study by Cornell University researchers and NBC News stated that Grokipedia cites sources that are blacklisted or considered "generally unreliable" on Wikipedia, for example the conspiracy site Infowars and the neo-Nazi forum Stormfront. Wired, The Guardian, and Time criticized Grokipedia for factual errors and for presenting Musk himself in unusually positive terms while downplaying controversies. Politics Musk is an outlier among business leaders, who typically avoid partisan political advocacy. Musk was a registered independent voter when he lived in California.
Historically, he has donated to both Democrats and Republicans, many of whom serve in states in which he has a vested interest. Since 2022, his political contributions have mostly supported Republicans, with his first vote for a Republican going to Mayra Flores in the 2022 Texas's 34th congressional district special election. In 2024, he began supporting international far-right political parties, activists, and causes, and has shared misinformation and numerous conspiracy theories. Since 2024, his views have been generally described as right-wing. Musk supported Barack Obama in 2008 and 2012, Hillary Clinton in 2016, Joe Biden in 2020, and Donald Trump in 2024. In the 2020 Democratic Party presidential primaries, Musk endorsed candidate Andrew Yang and expressed support for Yang's proposed universal basic income; he also endorsed Kanye West's 2020 presidential campaign. In 2021, Musk publicly expressed opposition to the Build Back Better Act, a $3.5 trillion legislative package endorsed by Joe Biden that ultimately failed to pass due to unanimous opposition from congressional Republicans and several Democrats. In 2022, he gave over $50 million to Citizens for Sanity, a conservative political action committee. In 2023, he supported Republican Ron DeSantis for the 2024 U.S. presidential election, giving $10 million to his campaign, and hosted DeSantis's campaign announcement in a Twitter Spaces event. From June 2023 to January 2024, Musk hosted a bipartisan set of X Spaces with Republican and Democratic candidates, including Robert F. Kennedy Jr., Vivek Ramaswamy, and Dean Phillips. In October 2025, former vice president Kamala Harris commented that it was a mistake on the Democratic side not to invite Musk to a White House electric vehicle event organized in August 2021 and featuring executives from General Motors, Ford, and Stellantis, despite Tesla being "the major American manufacturer of extraordinary innovation in this space."
Fortune remarked that this was a nod to the United Auto Workers and organized labor. Harris said presidents should put aside political loyalties when it came to recognizing innovation, and guessed that the non-invitation affected Musk's perspective. Fortune noted that, at the time, Musk said, "Yeah, seems odd that Tesla wasn't invited." A month later, he criticized Biden as "not the friendliest administration." Jacob Silverman, author of the book Gilded Rage: Elon Musk and the Radicalization of Silicon Valley, said that the tech industry represented by Musk, Thiel, Andreessen, and other capitalists actually flourished under Biden, but that its leaders chose Trump for their common ground on cultural issues. By early 2024, Musk had become a vocal and financial supporter of Donald Trump. In July 2024, minutes after the attempted assassination of Donald Trump, Musk endorsed him for president, saying: "I fully endorse President Trump and hope for his rapid recovery." During the presidential campaign, Musk joined Trump on stage at a campaign rally and promoted conspiracy theories and falsehoods about Democrats, election fraud, and immigration in support of Trump. Musk was the largest individual donor of the 2024 election. In 2025, Musk contributed $19 million to the Wisconsin Supreme Court race, hoping to influence the state's future redistricting efforts and its regulations governing car manufacturers and dealers. In 2023, Musk said he shunned the World Economic Forum because it was boring; the organization commented that it had not invited him since 2015. He has, however, participated in Dialog, an event dubbed "Tech Bilderberg" and organized by Peter Thiel and Auren Hoffman. Musk's international political actions and comments have come under increasing scrutiny and criticism, especially from the governments and leaders of France, Germany, Norway, Spain, and the United Kingdom, particularly due to his position in the U.S. government as well as his ownership of X.
An NBC News analysis found he had boosted far-right political movements seeking to cut immigration and curtail regulation of business in at least 18 countries on six continents since 2023. During his speech after the second inauguration of Donald Trump, Musk twice made a gesture interpreted by many as a Nazi or fascist Roman salute. He thumped his right hand over his heart, fingers spread wide, and then extended his right arm out, emphatically, at an upward angle, palm down and fingers together. He then repeated the gesture to the crowd behind him. As he finished the gestures, he said to the crowd, "My heart goes out to you. It is thanks to you that the future of civilization is assured." It was widely condemned as an intentional Nazi salute in Germany, where making such gestures is illegal. The Anti-Defamation League said it was not a Nazi salute, but other Jewish organizations disagreed and condemned the salute. American public opinion was divided on partisan lines as to whether it was a fascist salute. Musk dismissed the accusations of Nazi sympathies, deriding them as "dirty tricks" and a "tired" attack. Neo-Nazi and white supremacist groups celebrated it as a Nazi salute. Multiple European political parties demanded that Musk be banned from entering their countries. The concept of DOGE emerged in a discussion between Musk and Donald Trump, and in August 2024, Trump committed to giving Musk an advisory role, which Musk accepted. In November and December 2024, Musk suggested that the organization could help to cut the U.S. federal budget, consolidate the number of federal agencies, and eliminate the Consumer Financial Protection Bureau, and that its final stage would be "deleting itself". In January 2025, the organization was created by executive order, and Musk was designated a "special government employee". Musk led the organization and was a senior advisor to the president, although his official role was never clearly defined.
In a sworn statement during a lawsuit, the director of the White House Office of Administration stated that Musk "is not an employee of the U.S. DOGE Service or U.S. DOGE Service Temporary Organization", "is not the U.S. DOGE Service administrator", and has "no actual or formal authority to make government decisions himself". Trump said two days later that he had put Musk in charge of DOGE, and a federal judge has ruled that Musk acted as the de facto leader of DOGE. Musk's role in the second Trump administration, particularly in response to DOGE, has attracted public backlash. He was criticized for his treatment of federal government employees, including his influence over the mass layoffs of the federal workforce. He has prioritized secrecy within the organization and has accused others of violating privacy laws. A Senate report alleged that Musk could avoid up to $2 billion in legal liability as a result of DOGE's actions. In May 2025, Bill Gates accused Musk of "killing the world's poorest children" through his cuts to USAID, which modeling by Boston University estimated had resulted in 300,000 deaths by that time, most of them children. By November 2025, the estimated death toll had increased to 400,000 children and 200,000 adults. Musk announced on May 28, 2025, that he would depart from the Trump administration as planned when his 130-day term as a special government employee expired, with a White House official confirming that Musk's offboarding was already underway. His departure was officially confirmed during a joint Oval Office press conference with Trump on May 30, 2025. After leaving office, Musk criticized the Trump administration's Big Beautiful Bill, calling it a "disgusting abomination" due to its provisions increasing the deficit. On June 5, 2025, Musk posted on X: "@realDonaldTrump is in the Epstein files. That is the real reason they have not been made public."
A feud began between Musk and Trump, its most notable event being Musk's June 5, 2025 allegation on X (formerly Twitter) that Trump had ties to sex offender Jeffrey Epstein. Trump responded on Truth Social, stating that Musk went "CRAZY" after the "EV Mandate" was purportedly taken away, and threatened to cut Musk's government contracts. Musk then called for a third Trump impeachment. The next day, Trump stated that he did not wish to reconcile with Musk and added that Musk would face "very serious consequences" if he funded Democratic candidates. On June 11, Musk publicly apologized for the tweets against Trump, saying they "went too far". Views Rejecting the conservative label, Musk has described himself as a political moderate, even as his views have become more right-wing over time. His views have been characterized as libertarian and far-right, and after his involvement in European politics, they have drawn criticism from world leaders such as Emmanuel Macron and Olaf Scholz. Within the context of American politics, Musk supported Democratic candidates until 2022, at which point he voted for a Republican for the first time. He has stated support for universal basic income, gun rights, freedom of speech, a tax on carbon emissions, and H-1B visas. Musk has expressed concern about issues such as artificial intelligence (AI) and climate change, and has been a critic of wealth taxes, short-selling, and government subsidies. An immigrant himself, Musk has been accused of being anti-immigration, and regularly blames immigration policies for illegal immigration. He is also a pronatalist who believes population decline is the biggest threat to civilization, and identifies as a cultural Christian. Musk has long been an advocate for space colonization, especially of Mars; he has repeatedly pushed for humanity to colonize Mars in order to become an interplanetary species and lower the risk of human extinction.
Musk has promoted conspiracy theories and made controversial statements that have led to accusations of racism, sexism, antisemitism, transphobia, disseminating disinformation, and support of white pride. While describing himself as a "pro-Semite", his comments regarding George Soros and Jewish communities have been condemned by the Anti-Defamation League and the Biden White House. Musk was criticized during the COVID-19 pandemic for making unfounded epidemiological claims, defying COVID-19 lockdown restrictions, and supporting the Canada convoy protest against vaccine mandates. He has amplified false claims of white genocide in South Africa. Musk has been critical of Israel's actions in the Gaza Strip during the Gaza war, praised China's economic and climate goals, suggested that Taiwan and China should resolve cross-strait relations, and has been described as having a close relationship with the Chinese government. In Europe, Musk expressed support for Ukraine in 2022 during the Russian invasion, recommended referendums and peace deals on the annexed Russia-occupied territories, and supported the far-right Alternative for Germany political party in 2024. Regarding British politics, Musk blamed the 2024 UK riots on mass migration and open borders, criticized Prime Minister Keir Starmer for what he described as a "two-tier" policing system, and was subsequently attacked as being responsible for spreading misinformation and amplifying the far-right. He has also voiced his support for far-right activist Tommy Robinson and pledged electoral support for Reform UK. In February 2026, Musk described Spanish Prime Minister Pedro Sánchez as a "tyrant" following Sánchez's proposal to prohibit minors under the age of 16 from accessing social media platforms. Legal affairs In 2018, Musk was sued by the U.S.
Securities and Exchange Commission (SEC) for a tweet stating that funding had been secured for potentially taking Tesla private. The securities fraud lawsuit characterized the tweet as false, misleading, and damaging to investors, and sought to bar Musk from serving as CEO of publicly traded companies. Two days later, Musk settled with the SEC, without admitting or denying the SEC's allegations. As a result, Musk and Tesla were fined $20 million each, and Musk was forced to step down as Tesla chairman for three years but was able to remain CEO. Shareholders filed a lawsuit over the tweet, and in February 2023, a jury found Musk and Tesla not liable. Musk has stated in interviews that he does not regret posting the tweet that triggered the SEC investigation. In 2019, Musk stated in a tweet that Tesla would build half a million cars that year. The SEC reacted by asking a court to hold him in contempt for violating the terms of the 2018 settlement agreement. A joint agreement between Musk and the SEC eventually clarified the previous agreement's details, including a list of topics about which Musk needed preclearance. In 2020, a judge blocked a lawsuit that claimed a tweet by Musk regarding Tesla's stock price ("too high imo") violated the agreement. Records released under the Freedom of Information Act (FOIA) showed that the SEC concluded Musk had subsequently violated the agreement twice by tweeting about "Tesla's solar roof production volumes and its stock price". In October 2023, the SEC sued Musk over his refusal to testify a third time in an investigation into whether he violated federal law by purchasing Twitter stock in 2022. In February 2024, Judge Laurel Beeler ruled that Musk must testify again. In January 2025, the SEC filed a lawsuit against Musk for securities violations related to his purchase of Twitter. In January 2024, Delaware judge Kathaleen McCormick ruled in a 2018 lawsuit that Musk's $55 billion pay package from Tesla be rescinded.
McCormick called the compensation granted by the company's board "an unfathomable sum" that was unfair to shareholders. The Delaware Supreme Court overturned McCormick's decision in December 2025, restoring Musk's compensation package and awarding $1 in nominal damages. Personal life Musk became a U.S. citizen in 2002. From the early 2000s until late 2020, Musk resided in California, where both Tesla and SpaceX were founded. He then relocated to Cameron County, Texas, saying that California had become "complacent" about its economic success. While hosting Saturday Night Live in 2021, Musk stated that he has Asperger syndrome (an outdated term for autism spectrum disorder). When asked about his experience growing up with Asperger's syndrome at the TED2022 conference in Vancouver, Musk stated that "the social cues were not intuitive ... I would just tend to take things very literally ... but then that turned out to be wrong — [people were not] simply saying exactly what they mean, there's all sorts of other things that are meant, and [it] took me a while to figure that out." Musk suffers from back pain and has undergone several spine-related surgeries, including a disc replacement. In 2000, he contracted a severe case of malaria while on vacation in South Africa. Musk has stated he uses doctor-prescribed ketamine for occasional depression and that he doses "a small amount once every other week or something like that"; since January 2024, some media outlets have reported that he takes ketamine, marijuana, LSD, ecstasy, mushrooms, cocaine, and other drugs. Musk at first refused to comment on his alleged drug use, before responding that he had not tested positive for drugs and that, if drugs somehow improved his productivity, "I would definitely take them!".
The New York Times' investigations revealed Musk's overuse of ketamine and numerous other drugs, as well as strained family relationships and concerns from close associates who have become troubled by his public behavior as he became more involved in political activities and government work. According to The Washington Post, President Trump described Musk as "a big-time drug addict". Through his own label, Emo G Records, Musk released a rap track, "RIP Harambe", on SoundCloud in March 2019. The following year, he released an EDM track, "Don't Doubt Ur Vibe", featuring his own lyrics and vocals. Musk plays video games, which he has said have a "restoring effect" that helps his "mental calibration". Some games he plays include Quake, Diablo IV, Elden Ring, and Polytopia. Musk once claimed to be one of the world's top video game players but has since admitted to "account boosting", or cheating by hiring outside services to achieve top player rankings. Musk has justified the boosting by claiming that all top accounts do it, so he must as well to remain competitive. In 2024 and 2025, Musk criticized the video game Assassin's Creed Shadows and its creator Ubisoft for "woke" content. Musk posted to X that "DEI kills art" and singled out the inclusion of the historical figure Yasuke in the game as offensive; he also called the game "terrible". Ubisoft responded by saying that Musk's comments were "just feeding hatred" and that they were focused on producing a game, not pushing politics. Musk has fathered at least 14 children, one of whom died as an infant. The Wall Street Journal reported in 2025 that sources close to Musk suggest that the "true number of Musk's children is much higher than publicly known". He had six children with his first wife, Canadian author Justine Wilson, whom he met while attending Queen's University in Ontario, Canada; they married in 2000.
In 2002, their first child, Nevada Musk, died of sudden infant death syndrome at the age of 10 weeks. After Nevada's death, the couple used in vitro fertilization (IVF) to continue their family; they had twins in 2004, followed by triplets in 2006. The couple divorced in 2008 and have shared custody of their children. The elder twin he had with Wilson came out as a trans woman and, in 2022, officially changed her name to Vivian Jenna Wilson, adopting her mother's surname because she no longer wished to be associated with Musk. Musk began dating English actress Talulah Riley in 2008. They married two years later at Dornoch Cathedral in Scotland. In 2012, the couple divorced, then remarried the following year. After briefly filing for divorce in 2014, Musk finalized a second divorce from Riley in 2016. Musk then dated the American actress Amber Heard for several months in 2017; he had reportedly been "pursuing" her since 2012. In 2018, Musk and Canadian musician Grimes confirmed they were dating. Grimes and Musk have three children, born in 2020, 2021, and 2022.[g] Musk and Grimes originally gave their eldest child the name "X Æ A-12", which would have violated California regulations as it contained characters that are not in the modern English alphabet; the names registered on the birth certificate are "X" as a first name, "Æ A-Xii" as a middle name, and "Musk" as a last name. They received criticism for choosing a name perceived to be impractical and difficult to pronounce; Musk has said the intended pronunciation is "X Ash A Twelve". Their second child was born via surrogacy. Despite the ongoing surrogate pregnancy, Musk confirmed reports that the couple were "semi-separated" in September 2021; in an interview with Time in December 2021, he said he was single. In October 2023, Grimes sued Musk over parental rights and custody of X Æ A-Xii. Musk has taken X Æ A-Xii to multiple official events in Washington, D.C., during Trump's second term in office.
In July 2022, The Wall Street Journal reported that Musk allegedly had an affair with Nicole Shanahan, the wife of Google co-founder Sergey Brin, in 2021, leading to their divorce the following year. Musk denied the report. Musk also had a relationship with Australian actress Natasha Bassett, who has been described as "an occasional girlfriend". In October 2024, The New York Times reported that Musk had bought a Texas compound for his children and their mothers, though Musk denied having done so. Musk also has four children with Shivon Zilis, director of operations and special projects at Neuralink: twins born via IVF in 2021, a child born in 2024 via surrogacy, and a child born in 2025.[h] On February 14, 2025, Ashley St. Clair, an influencer and author, posted on X claiming to have given birth to Musk's son Romulus five months earlier, which media outlets reported as Musk's supposed thirteenth child.[i] On February 22, 2025, it was reported that St. Clair had filed for sole custody of her five-month-old son and for Musk to be recognized as the child's father. On March 31, 2025, Musk wrote that, while he was unsure if he was the father of St. Clair's child, he had paid St. Clair $2.5 million and would continue paying her $500,000 per year.[j] The Wall Street Journal later reported that $1 million of these payments to St. Clair was structured as a loan. In 2014, Musk and Ghislaine Maxwell appeared together in a photograph taken at an Academy Awards after-party, which Musk later described as a "photobomb". The January 2026 Epstein files contain emails between Musk and Epstein from 2012 to 2013, after Epstein's first conviction. Emails released on January 30, 2026, indicated that Epstein invited Musk to visit his private island on multiple occasions. The correspondence showed that while Epstein repeatedly encouraged Musk to attend, Musk did not visit the island.
In one instance, Musk discussed the possibility of attending a party with his then-wife Talulah Riley and asked which day would be the "wildest party"; according to the emails, the visit did not take place, as Epstein later cancelled the plans.[k] On Christmas Day in 2012, Musk emailed Epstein asking, "Do you have any parties planned? I’ve been working to the edge of sanity this year and so, once my kids head home after Christmas, I really want to hit the party scene in St Barts or elsewhere and let loose. The invitation is much appreciated, but a peaceful island experience is the opposite of what I’m looking for". Epstein replied that the "ratio on my island" might make Musk's wife uncomfortable, to which Musk responded, "Ratio is not a problem for Talulah". On September 11, 2013, Epstein sent an email asking Musk if he had any plans to come to New York for the opening of the United Nations General Assembly, where many "interesting people" would be coming to his house, to which Musk responded that "Flying to NY to see UN diplomats do nothing would be an unwise use of time". Epstein responded, "Do you think i am retarded. Just kidding, there is no one over 25 and all very cute." Musk has denied any close relationship with Epstein and described him as a "creep" who attempted to ingratiate himself with influential people. When Musk was asked in 2019 if he had introduced Epstein to Mark Zuckerberg, Musk responded: "I don’t recall introducing Epstein to anyone, as I don’t know the guy well enough to do so." The released emails nonetheless showed cordial exchanges on a range of topics, including Musk's inquiry about parties on the island. The correspondence also indicated that Musk suggested hosting Epstein at SpaceX, while Epstein separately discussed plans to tour SpaceX and bring "the girls", though there is no evidence that such a visit occurred.
Musk has described the release of the files as a "distraction", later accusing the second Trump administration of suppressing them to protect powerful individuals, including Trump himself.[l] Wealth Elon Musk is the wealthiest person in the world, with an estimated net worth of US$690 billion as of January 2026, according to the Bloomberg Billionaires Index, and $852 billion according to Forbes, primarily from his ownership stakes in SpaceX and Tesla. Musk was first listed on the Forbes Billionaires List in 2012; in November 2020, around 75% of his wealth was derived from Tesla stock, although he described himself as "cash poor". According to Forbes, he became the first person in the world to achieve a net worth of $300 billion in 2021; $400 billion in December 2024; $500 billion in October 2025; $600 billion in mid-December 2025; $700 billion later that month; and $800 billion in February 2026. In November 2025, a Tesla pay package worth potentially $1 trillion for Musk was approved, which he is to receive over 10 years if he meets specific goals. Public image Although his ventures have been highly influential within their separate industries since the 2000s, Musk only became a public figure in the early 2010s. He has been described as an eccentric who makes spontaneous and impactful decisions and often makes controversial statements, unlike other billionaires, who prefer reclusiveness to protect their businesses. Musk's actions and his expressed views have made him a polarizing figure. Biographer Ashlee Vance described people's opinions of Musk as polarized due to his "part philosopher, part troll" persona on Twitter. He has drawn condemnation for using his platform to mock the self-selection of personal pronouns, while also receiving praise for bringing international attention to matters such as British survivors of grooming gangs.
Musk has been described as an American oligarch due to his extensive influence over public discourse, social media, industry, politics, and government policy. After Trump's re-election, Musk's influence and actions during the transition period and the second presidency of Donald Trump led some to call him "President Musk", the "actual president-elect", "shadow president", or "co-president". Awards for his contributions to the development of the Falcon rockets include the American Institute of Aeronautics and Astronautics George Low Transportation Award in 2008, the Fédération Aéronautique Internationale Gold Space Medal in 2010, and the Royal Aeronautical Society Gold Medal in 2012. In 2015, he received an honorary doctorate in engineering and technology from Yale University and an Institute of Electrical and Electronics Engineers Honorary Membership. Musk was elected a Fellow of the Royal Society (FRS) in 2018.[m] In 2022, Musk was elected to the National Academy of Engineering. Time has listed Musk as one of the most influential people in the world in 2010, 2013, 2018, and 2021. Musk was selected as Time's "Person of the Year" for 2021. Time's then editor-in-chief, Edward Felsenthal, wrote that "Person of the Year is a marker of influence, and few individuals have had more influence than Musk on life on Earth, and potentially life off Earth too."